
Class PerParameterOptimizer<T, TInput, TOutput>

Namespace
AiDotNet.MetaLearning.Algorithms
Assembly
AiDotNet.dll

Per-parameter optimizer for Meta-SGD that learns individual optimization coefficients.

public class PerParameterOptimizer<T, TInput, TOutput>

Type Parameters

T

The numeric type.

TInput

The input data type.

TOutput

The output data type.

Inheritance
object
PerParameterOptimizer<T, TInput, TOutput>

Remarks

This optimizer maintains a set of learned coefficients for each parameter:

- Learning rate: α_i for each parameter
- Momentum: β_i for each parameter (optional)
- Direction: d_i for each parameter (optional)
- Adam parameters: β1, β2, ε (if using Adam)

For Beginners: In a standard optimizer, every weight shares the same settings, such as one global learning rate. Here, each weight in the network gets its own set of optimization settings, and those settings themselves are learned during meta-training.
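Examples

A minimal plain-C# sketch of the idea (no library calls): standard SGD moves every weight with one shared learning rate, whereas here weight i steps with its own learned rate alpha[i].

double[] theta = { 0.5, -1.2, 0.3 };    // model parameters
double[] grad  = { 0.1,  0.4, -0.2 };   // gradients from one task
double[] alpha = { 0.01, 0.05, 0.001 }; // learned per-parameter rates

for (int i = 0; i < theta.Length; i++)
{
    theta[i] -= alpha[i] * grad[i];     // each weight steps at its own rate
}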

Constructors

PerParameterOptimizer(int, MetaSGDOptions<T, TInput, TOutput>)

Initializes a new instance of the PerParameterOptimizer.

public PerParameterOptimizer(int numParameters, MetaSGDOptions<T, TInput, TOutput> options)

Parameters

numParameters int

Number of model parameters.

options MetaSGDOptions<T, TInput, TOutput>

Meta-SGD options.
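Examples

A hypothetical construction sketch. It assumes MetaSGDOptions<T, TInput, TOutput> has a parameterless constructor with usable defaults, and the double[] type arguments are purely illustrative:

using AiDotNet.MetaLearning.Algorithms;

// Assumption: MetaSGDOptions has a parameterless constructor with
// sensible defaults (an additional using may be required for it).
var options = new MetaSGDOptions<double, double[], double[]>();

// One learned coefficient set per model parameter.
var optimizer = new PerParameterOptimizer<double, double[], double[]>(
    numParameters: 1000,
    options: options);

Console.WriteLine(optimizer.NumParameters); // 1000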

Properties

NumParameters

Gets the number of model parameters this optimizer manages.

public int NumParameters { get; }

Property Value

int

Methods

Clone()

Creates a deep copy of this per-parameter optimizer.

public PerParameterOptimizer<T, TInput, TOutput> Clone()

Returns

PerParameterOptimizer<T, TInput, TOutput>

A new PerParameterOptimizer with copied state.
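Examples

A sketch of the pattern cloning enables, continuing the construction example above: copy the optimizer before task-specific adaptation so inner-loop updates never touch the shared meta-state.

// Adapt a copy per task; `optimizer` keeps the original meta-state.
var taskOptimizer = optimizer.Clone();
// ... run inner-loop updates through taskOptimizer here ...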

GetLearningRate(int)

Gets the learning rate for a specific parameter.

public T GetLearningRate(int parameterIndex)

Parameters

parameterIndex int

Index of the parameter.

Returns

T

The learned learning rate for the specified parameter.

GetMetaParameterCount()

Gets the total number of meta-parameters being learned.

public int GetMetaParameterCount()

Returns

int

Count of learned meta-parameters.
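Examples

A sketch reading the count, continuing the example above. Which coefficients contribute presumably depends on the options (learning rates always; momentum, direction, and Adam terms when enabled), so the relationship in the comment is an assumption:

int metaCount = optimizer.GetMetaParameterCount();
// Assumption: with only per-parameter learning rates enabled this
// equals optimizer.NumParameters, and each additional enabled
// coefficient (momentum, direction, ...) adds NumParameters more.
Console.WriteLine(metaCount);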

SetAdamBeta1(int, T)

Sets Adam beta1 for a specific parameter.

public void SetAdamBeta1(int parameterIndex, T beta1)

Parameters

parameterIndex int
beta1 T

SetAdamBeta2(int, T)

Sets Adam beta2 for a specific parameter.

public void SetAdamBeta2(int parameterIndex, T beta2)

Parameters

parameterIndex int
beta2 T

SetAdamEpsilon(int, T)

Sets Adam epsilon for a specific parameter.

public void SetAdamEpsilon(int parameterIndex, T epsilon)

Parameters

parameterIndex int
epsilon T
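Examples

A sketch seeding every parameter's Adam coefficients with the conventional Adam defaults before meta-training refines them (whether the options already apply these defaults is an assumption):

for (int i = 0; i < optimizer.NumParameters; i++)
{
    optimizer.SetAdamBeta1(i, 0.9);    // conventional Adam default
    optimizer.SetAdamBeta2(i, 0.999);  // conventional Adam default
    optimizer.SetAdamEpsilon(i, 1e-8); // conventional Adam default
}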

SetDirection(int, T)

Sets the direction for a specific parameter.

public void SetDirection(int parameterIndex, T direction)

Parameters

parameterIndex int
direction T

SetLearningRate(int, T)

Sets the learning rate for a specific parameter.

public void SetLearningRate(int parameterIndex, T learningRate)

Parameters

parameterIndex int
learningRate T
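Examples

A sketch initializing every per-parameter learning rate to one illustrative starting value and reading one back with GetLearningRate:

for (int i = 0; i < optimizer.NumParameters; i++)
{
    optimizer.SetLearningRate(i, 0.01); // 0.01 is arbitrary
}
double alpha0 = optimizer.GetLearningRate(0); // 0.01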

SetMomentum(int, T)

Sets the momentum for a specific parameter.

public void SetMomentum(int parameterIndex, T momentum)

Parameters

parameterIndex int
momentum T

UpdateMetaParameters(Vector<T>)

Updates the meta-parameters (learned coefficients) of the optimizer.

public void UpdateMetaParameters(Vector<T> metaGradients)

Parameters

metaGradients Vector<T>

Gradients for the meta-parameters.

Remarks

Updates learning rates, momentum, direction, and Adam parameters based on the computed meta-gradients. Also applies regularization and clipping.
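Examples

A sketch of where this call sits in an outer loop. It assumes Vector<T> exposes an int-length constructor (not documented on this page) and that the vector's length must equal GetMetaParameterCount(); computing the meta-gradients is the meta-training loop's responsibility:

// Assumption: Vector<double> can be constructed from a length.
var metaGrads = new Vector<double>(optimizer.GetMetaParameterCount());
// ... fill metaGrads from the outer-loop loss (not shown) ...
optimizer.UpdateMetaParameters(metaGrads);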

UpdateParameter(int, T, T)

Updates a single parameter using its learned optimization coefficients.

public T UpdateParameter(int parameterIndex, T parameter, T gradient)

Parameters

parameterIndex int

Index of the parameter to update.

parameter T

Current parameter value.

gradient T

Gradient for this parameter.

Returns

T

Updated parameter value.

Remarks

Applies the learned per-parameter update rule:

- SGD: θ ← θ − α_i · d_i · g
- SGD with momentum: v ← β_i · v + α_i · d_i · g, then θ ← θ − v
- Adam: the full Adam update with learned β1, β2, and ε
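Examples

A worked sketch of the plain-SGD branch, continuing the example above. With α_0 = 0.05, d_0 = 1, θ = 0.5, and g = 0.2, the rule θ ← θ − α_i · d_i · g gives 0.5 − 0.05 × 1 × 0.2 = 0.49. The assumption is that the options select plain SGD with no momentum:

optimizer.SetLearningRate(0, 0.05);
optimizer.SetDirection(0, 1.0);
double updated = optimizer.UpdateParameter(
    parameterIndex: 0, parameter: 0.5, gradient: 0.2);
// updated == 0.49 under the SGD rule (assumption: SGD mode)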