Enum MetaSGDUpdateRuleType

Namespace
AiDotNet.MetaLearning.Options
Assembly
AiDotNet.dll

Update rule types for Meta-SGD per-parameter optimization.

public enum MetaSGDUpdateRuleType

Fields

AdaDelta = 5

AdaDelta optimizer with learned decay per parameter.

AdaGrad = 4

AdaGrad optimizer with learned accumulation per parameter.

Adam = 2

Adam optimizer with optionally learned per-parameter beta coefficients.

RMSprop = 3

RMSprop optimizer with learned decay rates per parameter.

SGD = 0

Standard Stochastic Gradient Descent with a learned learning rate α_i for each parameter. Update: θ_i = θ_i - α_i × (∇L)_i

SGDWithMomentum = 1

SGD with learned per-parameter momentum coefficients β_i. Update: v_i = β_i × v_i + α_i × (∇L)_i; θ_i = θ_i - v_i
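The two SGD-based update rules above can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the AiDotNet implementation: the arrays `alpha` and `beta` stand in for the per-parameter learning rates and momentum coefficients that Meta-SGD would learn.

```python
import numpy as np

def sgd_step(theta, grad, alpha):
    """Per-parameter SGD: theta_i = theta_i - alpha_i * grad_i."""
    return theta - alpha * grad

def sgd_momentum_step(theta, v, grad, alpha, beta):
    """Per-parameter momentum: v_i = beta_i * v_i + alpha_i * grad_i,
    then theta_i = theta_i - v_i. Returns updated (theta, v)."""
    v = beta * v + alpha * grad
    return theta - v, v

theta = np.array([1.0, 2.0])
grad = np.array([0.5, -0.5])
alpha = np.array([0.1, 0.2])   # learned per-parameter learning rates (illustrative)
beta = np.array([0.9, 0.8])    # learned per-parameter momentum (illustrative)

print(sgd_step(theta, grad, alpha))                       # element-wise update
print(sgd_momentum_step(theta, np.zeros(2), grad, alpha, beta)[0])
```

Note that every quantity is an array the same shape as the parameters, so each parameter gets its own step size rather than one global scalar.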

Remarks

These values select the base optimization algorithm whose hyperparameters (learning rates and, where applicable, momentum or decay coefficients) Meta-SGD learns on a per-parameter basis.
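The "learning to configure" idea can be made concrete with a toy example. The sketch below, which is an assumption-laden illustration rather than the AiDotNet algorithm, learns a per-parameter learning rate `alpha` for the SGD rule by gradient descent on the post-update (meta) loss, using simple quadratic task losses L(θ) = (θ - t)² so the meta-gradient has a closed form:

```python
import numpy as np

def inner_loss_grad(theta, target):
    # Gradient of the quadratic task loss L = (theta - target)^2
    return 2.0 * (theta - target)

def meta_sgd_train_alpha(alpha, tasks, theta0, meta_lr=0.01, steps=100):
    """Learn per-parameter learning rates by descending the loss
    measured AFTER one inner SGD step (the Meta-SGD objective)."""
    for _ in range(steps):
        meta_grad = np.zeros_like(alpha)
        for target in tasks:
            g = inner_loss_grad(theta0, target)      # inner gradient at theta0
            theta1 = theta0 - alpha * g              # one learned-rate SGD step
            # Chain rule: d(post-update loss)/d(alpha) = L'(theta1) * (-g)
            meta_grad += inner_loss_grad(theta1, target) * (-g)
        alpha = alpha - meta_lr * meta_grad / len(tasks)
    return alpha

alpha = meta_sgd_train_alpha(np.array([0.05]),
                             tasks=[1.0, -1.0, 2.0],
                             theta0=np.array([0.0]))
print(alpha)
```

For these quadratics the one-step-optimal rate is α = 0.5 (a single step then lands exactly on each task's minimum), and the learned `alpha` converges toward that value; with more parameters, each α_i adapts to its own coordinate in the same way.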