Enum MetaSGDUpdateRuleType
Namespace: AiDotNet.MetaLearning.Options
Assembly: AiDotNet.dll
Update rule types for Meta-SGD per-parameter optimization.
public enum MetaSGDUpdateRuleType
Fields
AdaDelta = 5
AdaDelta optimizer with learned decay per parameter.
AdaGrad = 4
AdaGrad optimizer with learned accumulation per parameter.
Adam = 2
Adam optimizer with optionally learned beta parameters per parameter.
RMSprop = 3
RMSprop optimizer with learned decay rates per parameter.
SGD = 0
Standard Stochastic Gradient Descent with learned per-parameter learning rates. Update: θ_i = θ_i - α_i × ∇L
SGDWithMomentum = 1
SGD with learned per-parameter momentum coefficients. Update: v_i = β_i × v_i + α_i × ∇L; θ_i = θ_i - v_i
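The SGD and SGDWithMomentum rules follow directly from the update formulas above. The sketch below is illustrative only, not the AiDotNet implementation; the class, method, and parameter names are assumptions, and the learned per-parameter rates α_i and momentum coefficients β_i are simply passed in as arrays.

```csharp
// Illustrative sketch of the two SGD-style per-parameter update rules described above.
// Not AiDotNet code: type and member names are assumptions for illustration.
public static class PerParameterUpdates
{
    // SGD = 0: theta_i = theta_i - alpha_i * grad_i, with a learned alpha_i per parameter.
    public static void Sgd(double[] theta, double[] grad, double[] alpha)
    {
        for (int i = 0; i < theta.Length; i++)
            theta[i] -= alpha[i] * grad[i];
    }

    // SGDWithMomentum = 1: v_i = beta_i * v_i + alpha_i * grad_i; theta_i = theta_i - v_i,
    // with learned alpha_i and beta_i per parameter.
    public static void SgdWithMomentum(
        double[] theta, double[] grad, double[] velocity, double[] alpha, double[] beta)
    {
        for (int i = 0; i < theta.Length; i++)
        {
            velocity[i] = beta[i] * velocity[i] + alpha[i] * grad[i];
            theta[i] -= velocity[i];
        }
    }
}
```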
Remarks
These values select the base optimization algorithm whose hyperparameters (learning rates, momentum coefficients, decay rates, and so on) Meta-SGD learns on a per-parameter basis.
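A minimal usage sketch, assuming only the enum documented on this page; the helper below (mapping each rule to the extra per-parameter state its base optimizer maintains) is illustrative and not part of the library API.

```csharp
using AiDotNet.MetaLearning.Options;

public static class UpdateRuleInfo
{
    // Illustrative only; not part of AiDotNet. Counts the extra per-parameter state
    // buffers the underlying optimizer maintains for each update rule.
    public static int ExtraStateBuffers(MetaSGDUpdateRuleType rule) => rule switch
    {
        MetaSGDUpdateRuleType.SGD => 0,             // only the learned alpha_i, no extra state
        MetaSGDUpdateRuleType.SGDWithMomentum => 1, // velocity v_i
        MetaSGDUpdateRuleType.RMSprop => 1,         // squared-gradient moving average
        MetaSGDUpdateRuleType.AdaGrad => 1,         // squared-gradient accumulator
        MetaSGDUpdateRuleType.Adam => 2,            // first and second moment estimates
        MetaSGDUpdateRuleType.AdaDelta => 2,        // averages of squared gradients and squared updates
        _ => 0
    };
}
```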