Class LBFGSOptimizer<T, TInput, TOutput>

Namespace
AiDotNet.Optimizers
Assembly
AiDotNet.dll

Implements the Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimization algorithm.

public class LBFGSOptimizer<T, TInput, TOutput> : GradientBasedOptimizerBase<T, TInput, TOutput>, IGradientBasedOptimizer<T, TInput, TOutput>, IOptimizer<T, TInput, TOutput>, IModelSerializer

Type Parameters

T

The numeric type used for calculations, typically float or double.

TInput

The type of the input data consumed by the model (for example, a matrix of features).

TOutput

The type of the output produced by the model (for example, a vector of predictions).
Inheritance
OptimizerBase<T, TInput, TOutput>
GradientBasedOptimizerBase<T, TInput, TOutput>
LBFGSOptimizer<T, TInput, TOutput>
Implements
IGradientBasedOptimizer<T, TInput, TOutput>
IOptimizer<T, TInput, TOutput>
IModelSerializer

Remarks

L-BFGS is a quasi-Newton method for solving unconstrained nonlinear optimization problems. It approximates the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm using a limited amount of computer memory, making it suitable for optimization problems with many variables.

For Beginners: L-BFGS is an advanced optimization algorithm that efficiently finds the minimum of a function, especially useful for problems with many variables. It uses information from previous iterations to make intelligent decisions about where to search next, while keeping memory usage low.
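To see why the limited-memory approximation matters, compare the storage costs. Full BFGS keeps a dense n-by-n inverse-Hessian approximation, while L-BFGS keeps only m pairs of n-vectors (typical m is 5 to 20). A quick back-of-the-envelope calculation (plain arithmetic, not AiDotNet code):

```python
# Memory needed to store curvature information for n parameters (8-byte doubles).
n = 1_000_000   # number of parameters
m = 10          # L-BFGS history size (number of (s, y) vector pairs kept)

full_bfgs_bytes = n * n * 8   # BFGS: dense n-by-n inverse-Hessian approximation
lbfgs_bytes = 2 * m * n * 8   # L-BFGS: m pairs of n-vectors (s_k, y_k)

print(full_bfgs_bytes / 1e12, "TB for BFGS vs", lbfgs_bytes / 1e6, "MB for L-BFGS")
```

For a million parameters, full BFGS would need about 8 TB while L-BFGS needs roughly 160 MB, which is why L-BFGS is the practical choice for high-dimensional problems.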

Constructors

LBFGSOptimizer(IFullModel<T, TInput, TOutput>, LBFGSOptimizerOptions<T, TInput, TOutput>?, IEngine?)

Initializes a new instance of the LBFGSOptimizer class.

public LBFGSOptimizer(IFullModel<T, TInput, TOutput> model, LBFGSOptimizerOptions<T, TInput, TOutput>? options = null, IEngine? engine = null)

Parameters

model IFullModel<T, TInput, TOutput>

The model to optimize.

options LBFGSOptimizerOptions<T, TInput, TOutput>

Options for the L-BFGS optimizer. If null, default options are used.

engine IEngine

The computation engine (CPU or GPU) for vectorized operations.

Methods

Deserialize(byte[])

Deserializes a byte array to restore the optimizer's state.

public override void Deserialize(byte[] data)

Parameters

data byte[]

The byte array containing the serialized optimizer state.

Remarks

This method takes a byte array (previously created by the Serialize method) and uses it to restore the optimizer's state, including its options and internal memory.

For Beginners: This is like loading a saved snapshot of the optimizer's state. It rebuilds the optimizer's memory and settings from the saved data, allowing it to continue from where it left off.

Exceptions

InvalidOperationException

Thrown when deserialization of optimizer options fails.

GetOptions()

Retrieves the current options of the optimizer.

public override OptimizationAlgorithmOptions<T, TInput, TOutput> GetOptions()

Returns

OptimizationAlgorithmOptions<T, TInput, TOutput>

The current options of the optimizer.

Remarks

This method returns the current configuration options of the L-BFGS optimizer.

For Beginners: This lets you see what settings the optimizer is currently using.

InitializeAdaptiveParameters()

Initializes or resets the adaptive parameters used in the optimization process.

protected override void InitializeAdaptiveParameters()

Optimize(OptimizationInputData<T, TInput, TOutput>)

Performs the main optimization process using the L-BFGS algorithm.

public override OptimizationResult<T, TInput, TOutput> Optimize(OptimizationInputData<T, TInput, TOutput> inputData)

Parameters

inputData OptimizationInputData<T, TInput, TOutput>

The input data for the optimization process.

Returns

OptimizationResult<T, TInput, TOutput>

The result of the optimization process.

Remarks

DataLoader Integration: This method uses the DataLoader API for epoch management. L-BFGS typically operates on the full dataset because it maintains a history of gradient and position differences that require consistent gradients between iterations. The method notifies the sampler of epoch starts using NotifyEpochStart(int) for compatibility with curriculum learning and sampling strategies.

Serialize()

Serializes the optimizer's state into a byte array.

public override byte[] Serialize()

Returns

byte[]

A byte array representing the serialized state of the optimizer.

Remarks

This method converts the current state of the optimizer, including its options and internal memory, into a byte array. This allows the optimizer's state to be saved or transmitted.

For Beginners: This is like taking a snapshot of the optimizer's current state so it can be saved or sent somewhere else. It includes all the important information about what the optimizer has learned so far.

UpdateAdaptiveParameters(OptimizationStepData<T, TInput, TOutput>, OptimizationStepData<T, TInput, TOutput>)

Updates the adaptive parameters of the optimizer based on the current and previous optimization steps.

protected override void UpdateAdaptiveParameters(OptimizationStepData<T, TInput, TOutput> currentStepData, OptimizationStepData<T, TInput, TOutput> previousStepData)

Parameters

currentStepData OptimizationStepData<T, TInput, TOutput>

Data from the current optimization step.

previousStepData OptimizationStepData<T, TInput, TOutput>

Data from the previous optimization step.

Remarks

This method adjusts the learning rate based on the performance of the current step compared to the previous step. If the adaptive learning rate option is enabled, it increases or decreases the learning rate accordingly.

For Beginners: This method helps the optimizer learn more efficiently by adjusting how big its steps are. If the current step improved the solution, it takes slightly bigger steps. If not, it takes smaller steps to be more careful.
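The adjustment rule described above can be sketched as follows. The multipliers and bounds here are illustrative placeholders, not AiDotNet's actual option values:

```python
def adapt_learning_rate(lr, current_loss, previous_loss,
                        increase=1.05, decrease=0.5,
                        min_lr=1e-6, max_lr=1.0):
    """Illustrative rule: grow the step after an improvement, shrink it otherwise."""
    if current_loss < previous_loss:       # the step helped: be slightly bolder
        lr *= increase
    else:                                  # the step hurt: be more careful
        lr *= decrease
    return min(max(lr, min_lr), max_lr)    # keep the rate within sane bounds

lr = 0.1
lr = adapt_learning_rate(lr, current_loss=0.8, previous_loss=1.0)  # improved: lr grows
lr = adapt_learning_rate(lr, current_loss=0.9, previous_loss=0.8)  # worsened: lr shrinks
```

Clamping to a [min, max] range keeps one bad or lucky step from driving the rate to a useless extreme.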

UpdateOptions(OptimizationAlgorithmOptions<T, TInput, TOutput>)

Updates the optimizer's options with new settings.

protected override void UpdateOptions(OptimizationAlgorithmOptions<T, TInput, TOutput> options)

Parameters

options OptimizationAlgorithmOptions<T, TInput, TOutput>

The new options to apply to the optimizer.

Remarks

This method updates the optimizer's configuration with new options. It ensures that only valid LBFGSOptimizerOptions are applied to this optimizer.

For Beginners: This is like changing the settings on the optimizer. It makes sure you're using the right kind of settings for this specific type of optimizer.

Exceptions

ArgumentException

Thrown when the provided options are not of type LBFGSOptimizerOptions.

UpdateParameters(Vector<T>, Vector<T>)

Updates parameters using the L-BFGS algorithm.

public override Vector<T> UpdateParameters(Vector<T> parameters, Vector<T> gradient)

Parameters

parameters Vector<T>

The current parameter vector to update.

gradient Vector<T>

The gradient of the loss function with respect to the parameters.

Returns

Vector<T>

The updated parameter vector.

Remarks

This method implements the L-BFGS two-loop recursion algorithm for computing the search direction. It maintains internal state (previous parameters and gradients) to build up the L-BFGS memory across successive calls.

For Beginners: Unlike simple gradient descent that just follows the steepest direction, L-BFGS uses information from previous steps to approximate the curvature of the function being optimized. This typically leads to faster convergence, especially for problems with many variables.
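The two-loop recursion mentioned above can be sketched in a few lines. This is the standard textbook algorithm operating on stored parameter differences s_k and gradient differences y_k, not AiDotNet's internal implementation:

```python
import numpy as np

def lbfgs_direction(grad, s_history, y_history):
    """Two-loop recursion: compute d ~ -H_inv @ grad from stored (s, y) pairs,
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k."""
    q = grad.copy()
    stack = []
    # First loop: walk the history from newest to oldest.
    for s, y in zip(reversed(s_history), reversed(y_history)):
        rho = 1.0 / np.dot(y, s)
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        stack.append((rho, alpha, s, y))
    # Initial Hessian guess: identity scaled by the most recent curvature pair.
    s, y = s_history[-1], y_history[-1]
    q *= np.dot(s, y) / np.dot(y, y)
    # Second loop: walk the history from oldest to newest.
    for rho, alpha, s, y in reversed(stack):
        beta = rho * np.dot(y, q)
        q += (alpha - beta) * s
    return -q  # the search direction

# Demo on an ill-conditioned quadratic f(x) = 0.5 * x.T @ A @ x, where grad = A @ x.
A = np.diag([1.0, 10.0, 100.0])
xs = [np.array([1.0, 1.0, 1.0]),
      np.array([0.9, 0.5, 0.2]),
      np.array([0.5, 0.2, 0.05])]
s_hist = [xs[i + 1] - xs[i] for i in range(2)]
y_hist = [A @ s for s in s_hist]   # for a quadratic, y = A @ s exactly
g = A @ xs[-1]
d = lbfgs_direction(g, s_hist, y_hist)
```

Because each stored pair satisfies the curvature condition y·s > 0 here, the implied Hessian approximation is positive definite and the returned direction is guaranteed to be a descent direction (d·g < 0), unlike a raw negative gradient it also rescales each coordinate to account for curvature.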

UpdateParametersGpu(IGpuBuffer, IGpuBuffer, int, IDirectGpuBackend)

Updates parameters using GPU-accelerated L-BFGS.

public override void UpdateParametersGpu(IGpuBuffer parameters, IGpuBuffer gradients, int parameterCount, IDirectGpuBackend backend)

Parameters

parameters IGpuBuffer

The GPU buffer holding the current parameter values.

gradients IGpuBuffer

The GPU buffer holding the gradients of the loss with respect to the parameters.

parameterCount int

The number of parameters in the buffers.

backend IDirectGpuBackend

The GPU backend used to execute the update.

Remarks

L-BFGS is a limited-memory quasi-Newton method that maintains a history of past gradient and parameter differences. A GPU implementation is not yet available because of the complexity of the two-loop recursion and of managing this history in GPU memory.