Table of Contents

Namespace AiDotNet.Optimizers

Classes

ADMMOptimizer<T, TInput, TOutput>

Implements the Alternating Direction Method of Multipliers (ADMM) optimization algorithm.

AMSGradOptimizer<T, TInput, TOutput>

Implements the AMSGrad optimization algorithm, an improved variant of the Adam optimizer.

AdaDeltaOptimizer<T, TInput, TOutput>

Implements the AdaDelta optimization algorithm for training neural networks and other machine learning models.

AdaMaxOptimizer<T, TInput, TOutput>

Represents an AdaMax optimizer, an extension of Adam that uses the infinity norm.

AdagradOptimizer<T, TInput, TOutput>

Represents an Adagrad (Adaptive Gradient) optimizer for gradient-based optimization.

AdamOptimizer<T, TInput, TOutput>

Implements the Adam (Adaptive Moment Estimation) optimization algorithm for gradient-based optimization.

AdamWOptimizer<T, TInput, TOutput>

Implements the AdamW (Adam with decoupled weight decay) optimization algorithm.

AntColonyOptimizer<T, TInput, TOutput>

Implements the Ant Colony Optimization algorithm for solving optimization problems.

BFGSOptimizer<T, TInput, TOutput>

Implements the Broyden-Fletcher-Goldfarb-Shanno (BFGS) optimization algorithm.

BayesianOptimizer<T, TInput, TOutput>

Represents a Bayesian optimizer for black-box optimization problems.

CMAESOptimizer<T, TInput, TOutput>

Implements the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) optimization algorithm.

ConjugateGradientOptimizer<T, TInput, TOutput>

Implements the Conjugate Gradient optimization algorithm for numerical optimization problems.

CoordinateDescentOptimizer<T, TInput, TOutput>

Implements the Coordinate Descent optimization algorithm for numerical optimization problems.

DFPOptimizer<T, TInput, TOutput>

Implements the Davidon-Fletcher-Powell (DFP) optimization algorithm for numerical optimization problems.

DifferentialEvolutionOptimizer<T, TInput, TOutput>

Implements the Differential Evolution optimization algorithm for numerical optimization problems.

FTRLOptimizer<T, TInput, TOutput>

Represents a Follow The Regularized Leader (FTRL) optimizer for machine learning models.

GeneticAlgorithmOptimizer<T, TInput, TOutput>

Represents a Genetic Algorithm optimizer for machine learning models.

GradientBasedOptimizerBase<T, TInput, TOutput>

Represents a base class for gradient-based optimization algorithms.

GradientDescentOptimizer<T, TInput, TOutput>

Represents a Gradient Descent optimizer for machine learning models.

LAMBOptimizer<T, TInput, TOutput>

Implements the LAMB (Layer-wise Adaptive Moments for Batch training) optimization algorithm.

LARSOptimizer<T, TInput, TOutput>

Implements the LARS (Layer-wise Adaptive Rate Scaling) optimization algorithm.

LBFGSOptimizer<T, TInput, TOutput>

Implements the Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimization algorithm.

LevenbergMarquardtOptimizer<T, TInput, TOutput>

Implements the Levenberg-Marquardt optimization algorithm for non-linear least squares problems.

LionOptimizer<T, TInput, TOutput>

Implements the Lion (Evolved Sign Momentum) optimization algorithm for gradient-based optimization.

MiniBatchGradientDescentOptimizer<T, TInput, TOutput>

Implements the Mini-Batch Gradient Descent optimization algorithm.

ModifiedGradientDescentOptimizer<T>

Modified Gradient Descent optimizer for the Hope architecture, based on Equations 27-29 of the "Nested Learning" paper.

Traditional GD: W_{t+1} = W_t - η * ∇L(W_t; x_t) ⊗ x_t

Modified GD: W_{t+1} = W_t * (I - x_t * x_t^T) - η * ∇L(W_t; x_t) ⊗ x_t

This formulation uses an L2 regression objective instead of dot-product similarity, which yields better handling of data dependencies in token space (a minimal sketch of the update follows below).
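To make the update concrete, here is a minimal, self-contained sketch of one modified-GD step on a plain weight matrix. It uses raw double[,] arrays rather than the library's tensor types, and the names (ModifiedGdStep, eta) are illustrative, not part of AiDotNet's API; it assumes the L2 regression objective named above, so ∇L(W; x) ⊗ x reduces to the outer product of the residual (W x - y) with x.

```csharp
using System;

class ModifiedGdSketch
{
    // Hypothetical helper, not AiDotNet's API: one modified-GD step on a
    // weight matrix W (rows x cols), input x (cols), target y (rows),
    // assuming the L2 objective L = ½‖W x − y‖², whose gradient w.r.t. W
    // is the outer product (W x − y) ⊗ x.
    static void ModifiedGdStep(double[,] W, double[] x, double[] y, double eta)
    {
        int rows = W.GetLength(0), cols = W.GetLength(1);

        // Forward pass: wx = W x, residual r = W x − y.
        var wx = new double[rows];
        var r = new double[rows];
        for (int i = 0; i < rows; i++)
        {
            double s = 0.0;
            for (int j = 0; j < cols; j++) s += W[i, j] * x[j];
            wx[i] = s;
            r[i] = s - y[i];
        }

        // Update: W ← W (I − x xᵀ) − η r xᵀ
        //           = W − (W x) xᵀ − η (W x − y) xᵀ.
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                W[i, j] -= (wx[i] + eta * r[i]) * x[j];
    }

    static void Main()
    {
        var W = new double[,] { { 0.5, -0.2 }, { 0.1, 0.3 } };
        var x = new double[] { 1.0, 2.0 };
        var y = new double[] { 0.0, 1.0 };
        ModifiedGdStep(W, x, y, eta: 0.01);
        Console.WriteLine($"W[0,0] after one step: {W[0, 0]}");
    }
}
```

Note how the extra projection term W (I − x xᵀ) distinguishes this from the traditional update, which would apply only the − η r xᵀ correction.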

MomentumOptimizer<T, TInput, TOutput>

Implements the Momentum optimization algorithm for gradient-based optimization.

NadamOptimizer<T, TInput, TOutput>

Implements the Nesterov-accelerated Adaptive Moment Estimation (Nadam) optimization algorithm.

NelderMeadOptimizer<T, TInput, TOutput>

Implements the Nelder-Mead optimization algorithm, also known as the downhill simplex method.

NesterovAcceleratedGradientOptimizer<T, TInput, TOutput>

Implements the Nesterov Accelerated Gradient optimization algorithm.

NewtonMethodOptimizer<T, TInput, TOutput>

Implements Newton's method for optimization.

NormalOptimizer<T, TInput, TOutput>

Implements a standard, general-purpose optimization algorithm with adaptive parameters.

OptimizationDataBatcherExtensions

Extension methods for optimization data batching.

OptimizationDataBatcher<T, TInput, TOutput>

Provides batch iteration utilities for optimization input data.

OptimizerBase<T, TInput, TOutput>

Represents the base class for all optimization algorithms, providing common functionality and interfaces.

ParticleSwarmOptimizer<T, TInput, TOutput>

Implements a Particle Swarm Optimization algorithm for finding optimal solutions.

ProximalGradientDescentOptimizer<T, TInput, TOutput>

Implements a Proximal Gradient Descent optimization algorithm which combines gradient descent with regularization.

RootMeanSquarePropagationOptimizer<T, TInput, TOutput>

Implements the Root Mean Square Propagation (RMSProp) optimization algorithm, an adaptive learning rate method.

StochasticGradientDescentOptimizer<T, TInput, TOutput>

Represents a Stochastic Gradient Descent (SGD) optimizer for machine learning models.

TabuSearchOptimizer<T, TInput, TOutput>

Represents a Tabu Search optimizer for machine learning models.

TrustRegionOptimizer<T, TInput, TOutput>

Implements the Trust Region optimization algorithm for machine learning models.