Interface IMetaLearnerOptions<T>
Namespace: AiDotNet.Interfaces
Assembly: AiDotNet.dll
Configuration options interface for meta-learning algorithms.
public interface IMetaLearnerOptions<T>
Type Parameters
T
The numeric type (e.g., float, double).
Remarks
Meta-learning algorithms use a two-loop optimization structure:
- Inner loop: Fast adaptation to a specific task using the support set
- Outer loop: Meta-optimization to improve adaptation across all tasks
For Beginners: Think of meta-learning like learning to study effectively:
- Inner loop: How you study for a specific exam (practice problems, examples)
- Outer loop: Learning better study techniques by reflecting across many exams
The configuration controls both loops:
- InnerLearningRate: How aggressively to adapt to each task
- OuterLearningRate: How much to update the meta-parameters
- AdaptationSteps: How many gradient updates per task
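The toy sketch below is not AiDotNet code: the scalar model, the quadratic per-task loss, and the FOMAML-style first-order meta-gradient are illustrative assumptions. It only shows where each option value plugs into the two loops.
using System;

class TwoLoopSketch
{
    static void Main()
    {
        // Option values mirror the defaults documented below (illustrative only).
        var rng = new Random(42);          // cf. RandomSeed
        double innerLearningRate = 0.01;   // cf. InnerLearningRate
        double outerLearningRate = 0.001;  // cf. OuterLearningRate
        int adaptationSteps = 5;           // cf. AdaptationSteps
        int metaBatchSize = 4;             // cf. MetaBatchSize
        int numMetaIterations = 1000;      // cf. NumMetaIterations

        double theta = 0.0;                // the meta-parameter (shared initialization)

        for (int iteration = 0; iteration < numMetaIterations; iteration++)
        {
            double metaGradient = 0.0;

            for (int t = 0; t < metaBatchSize; t++)
            {
                // Task: minimize (theta - target)^2 for a task-specific target.
                double target = 3.0 + rng.NextDouble();

                // Inner loop: adapt a copy of the meta-parameter on this task's support set.
                double adapted = theta;
                for (int s = 0; s < adaptationSteps; s++)
                    adapted -= innerLearningRate * 2.0 * (adapted - target);

                // First-order meta-gradient: query-set gradient evaluated at the adapted parameter.
                metaGradient += 2.0 * (adapted - target);
            }

            // Outer loop: update the meta-parameter with the gradient averaged over the meta-batch.
            theta -= outerLearningRate * metaGradient / metaBatchSize;
        }

        Console.WriteLine($"Learned initialization: {theta:F3}");
    }
}
With the values shown, theta drifts from 0 toward the centre of the sampled targets, i.e. toward an initialization from which every task can be reached in a few inner steps.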
Properties
AdaptationSteps
Gets the number of gradient descent steps for inner loop adaptation.
int AdaptationSteps { get; }
Property Value
- int
How many times to update parameters on each task's support set. Typical values: 1 to 10. Default: 5
CheckpointFrequency
Gets the checkpoint save frequency in meta-iterations.
int CheckpointFrequency { get; }
Property Value
- int
Save checkpoint every N meta-iterations. Only used if EnableCheckpointing is true. Default: 500
EnableCheckpointing
Gets whether to save checkpoints during training.
bool EnableCheckpointing { get; }
Property Value
- bool
True to save model checkpoints periodically.
EvaluationFrequency
Gets the evaluation frequency in meta-iterations.
int EvaluationFrequency { get; }
Property Value
- int
Run evaluation every N meta-iterations. Set to 0 to disable periodic evaluation. Default: 100
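As a hedged illustration of how a consumer of these options might interpret EvaluationFrequency, CheckpointFrequency, and EnableCheckpointing inside the meta-training loop (the helper class and method names are hypothetical, not AiDotNet API):
using AiDotNet.Interfaces;

static class MetaTrainingSchedule
{
    // Hypothetical helpers: evaluate / checkpoint every N meta-iterations, as the options describe.
    public static bool ShouldEvaluate<T>(IMetaLearnerOptions<T> options, int iteration) =>
        options.EvaluationFrequency > 0 && (iteration + 1) % options.EvaluationFrequency == 0;

    public static bool ShouldCheckpoint<T>(IMetaLearnerOptions<T> options, int iteration) =>
        options.EnableCheckpointing
        && options.CheckpointFrequency > 0
        && (iteration + 1) % options.CheckpointFrequency == 0;
}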
EvaluationTasks
Gets the number of evaluation tasks for periodic validation.
int EvaluationTasks { get; }
Property Value
- int
How many tasks to use when evaluating model performance. Typical values: 100 to 1000. Default: 100
GradientClipThreshold
Gets the gradient clipping threshold to prevent exploding gradients.
double? GradientClipThreshold { get; }
Property Value
- double?
Maximum gradient norm. Set to null or 0 to disable clipping. Typical values: 1.0 to 10.0. Default: 10.0
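A minimal sketch of global-norm clipping as this option describes it, assuming a flat double[] gradient; this is illustrative, not the library's internal implementation.
using System;
using System.Linq;

static class GradientClipping
{
    // Scale the gradient so its L2 norm never exceeds the threshold; null or 0 disables clipping.
    public static double[] ClipByNorm(double[] gradient, double? threshold)
    {
        if (threshold is null || threshold.Value <= 0) return gradient;

        double norm = Math.Sqrt(gradient.Sum(g => g * g));
        if (norm <= threshold.Value) return gradient;

        double scale = threshold.Value / norm;
        return gradient.Select(g => g * scale).ToArray();
    }
}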
InnerLearningRate
Gets the inner loop learning rate for task-specific adaptation.
double InnerLearningRate { get; }
Property Value
- double
The learning rate used during task adaptation on support sets. Typical values: 0.001 to 0.1. Default: 0.01
MetaBatchSize
Gets the number of tasks to sample per meta-update (meta-batch size).
int MetaBatchSize { get; }
Property Value
- int
How many tasks to average over for each outer loop update. Typical values: 1 (online) to 32 (batch). Default: 4
Remarks
For Beginners: This controls how many tasks you learn from before updating your meta-parameters:
- MetaBatchSize = 1: Update after every task (noisier, faster iteration)
- MetaBatchSize = 16: Update after 16 tasks (more stable, slower iteration)
NumMetaIterations
Gets the number of meta-training iterations to perform.
int NumMetaIterations { get; }
Property Value
- int
How many times to perform the outer loop meta-update. Typical values: 100 to 10,000. Default: 1000
OuterLearningRate
Gets the outer loop learning rate for meta-optimization.
double OuterLearningRate { get; }
Property Value
- double
The learning rate for updating meta-parameters. Typically 10x smaller than InnerLearningRate. Typical values: 0.0001 to 0.01. Default: 0.001
RandomSeed
Gets the random seed for reproducible task sampling and initialization.
int? RandomSeed { get; }
Property Value
- int?
Random seed value. Set to null for non-deterministic behavior.
UseFirstOrder
Gets whether to use first-order approximation (e.g., FOMAML, Reptile).
bool UseFirstOrder { get; }
Property Value
- bool
True to ignore second-order gradients, which is faster but may be less accurate. Default: false for MAML-based algorithms.
Remarks
For Beginners: First-order approximation:
- True: Faster training, simpler gradients, works well in practice
- False: More accurate gradients, slower, may be unstable with many inner steps
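Concretely, for a single inner step θ′ = θ − α·∇L_support(θ), the exact meta-gradient is ∇_θ L_query(θ′) = (I − α·∇²L_support(θ)) · ∇L_query(θ′); the first-order approximation drops the Hessian factor and uses ∇L_query(θ′) directly, so no second-order derivatives are needed.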
Methods
Clone()
Creates a deep copy of this options instance.
IMetaLearnerOptions<T> Clone()
Returns
- IMetaLearnerOptions<T>
A new options instance with the same values.
IsValid()
Checks whether the configuration is valid and sensible.
bool IsValid()
Returns
- bool
True if the configuration is valid; false otherwise.
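Both methods are easy to see in a small hypothetical implementation. The class name and the specific IsValid rules below are assumptions, not AiDotNet code; the property defaults mirror the values documented above (EnableCheckpointing has no documented default, so false is assumed).
using AiDotNet.Interfaces;

// Hypothetical options class; not part of AiDotNet.
public sealed class SimpleMetaLearnerOptions<T> : IMetaLearnerOptions<T>
{
    // Defaults mirror the documented values where given.
    public double InnerLearningRate { get; init; } = 0.01;
    public double OuterLearningRate { get; init; } = 0.001;
    public int AdaptationSteps { get; init; } = 5;
    public int MetaBatchSize { get; init; } = 4;
    public int NumMetaIterations { get; init; } = 1000;
    public int EvaluationFrequency { get; init; } = 100;
    public int EvaluationTasks { get; init; } = 100;
    public bool EnableCheckpointing { get; init; } = false;   // default not documented; false assumed
    public int CheckpointFrequency { get; init; } = 500;
    public double? GradientClipThreshold { get; init; } = 10.0;
    public bool UseFirstOrder { get; init; } = false;
    public int? RandomSeed { get; init; } = null;

    // Deep copy: every member is a value type, so a member-wise copy is already "deep".
    public IMetaLearnerOptions<T> Clone() => (SimpleMetaLearnerOptions<T>)MemberwiseClone();

    // Example sanity checks (assumed; the library may enforce different rules).
    public bool IsValid() =>
        InnerLearningRate > 0
        && OuterLearningRate > 0
        && AdaptationSteps >= 1
        && MetaBatchSize >= 1
        && NumMetaIterations >= 1
        && EvaluationFrequency >= 0
        && EvaluationTasks >= 1
        && (!EnableCheckpointing || CheckpointFrequency >= 1);
}
A Clone based on MemberwiseClone suffices here only because every member is a value type; an implementation holding reference-type members would need to copy those explicitly.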