Class MAMLAlgorithm<T, TInput, TOutput>
- Namespace
- AiDotNet.MetaLearning.Algorithms
- Assembly
- AiDotNet.dll
Implementation of the MAML (Model-Agnostic Meta-Learning) algorithm.
public class MAMLAlgorithm<T, TInput, TOutput> : MetaLearnerBase<T, TInput, TOutput>, IMetaLearner<T, TInput, TOutput>
Type Parameters
T: The numeric type used for calculations (e.g., double, float).
TInput: The input data type (e.g., Matrix<T>, Tensor<T>).
TOutput: The output data type (e.g., Vector<T>, Tensor<T>).
- Inheritance
- MetaLearnerBase<T, TInput, TOutput> → MAMLAlgorithm<T, TInput, TOutput>
- Implements
- IMetaLearner<T, TInput, TOutput>
- Inherited Members
Remarks
MAML (Model-Agnostic Meta-Learning) is a meta-learning algorithm that trains models to be easily fine-tunable. It learns initial parameters such that a small number of gradient steps on a new task will lead to good performance.
Key features:
- Model-agnostic: works with any model trainable with gradient descent
- Learns a good initialization rather than a fixed feature extractor
- Enables few-shot learning with just 1-5 examples per class
For Beginners: MAML is like teaching someone how to learn quickly.
Normal machine learning: train a model for one specific task.
MAML: train a model to be easily trainable for many different tasks.
It's like learning how to learn - by practicing on many tasks, the model learns what kind of parameters make it easy to adapt to new tasks quickly.
Reference: Finn, C., Abbeel, P., & Levine, S. (2017). Model-agnostic meta-learning for fast adaptation of deep networks. In Proceedings of the 34th International Conference on Machine Learning (ICML).
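The bi-level optimization described above can be sketched as follows. This is a conceptual outline only, not the library's actual internals: `ComputeGradient`, `taskBatches`, and the plain `double[]` parameter vector are hypothetical stand-ins.

```csharp
// Sketch of MAML's bi-level optimization over a generic parameter vector.
// ComputeGradient, taskBatches, and the hyperparameters are hypothetical.
double[] theta = InitializeParameters();            // meta-parameters
foreach (var batch in taskBatches)                  // outer loop over task batches
{
    var metaGradient = new double[theta.Length];
    foreach (var task in batch)
    {
        // Inner loop: adapt a copy of theta on the task's support set.
        var adapted = (double[])theta.Clone();
        for (int k = 0; k < adaptationSteps; k++)
        {
            var g = ComputeGradient(adapted, task.SupportSet);
            for (int i = 0; i < adapted.Length; i++)
                adapted[i] -= innerLearningRate * g[i];
        }

        // Evaluate the adapted parameters on the query set and accumulate
        // the (first-order) meta-gradient, averaged over the batch.
        var q = ComputeGradient(adapted, task.QuerySet);
        for (int i = 0; i < metaGradient.Length; i++)
            metaGradient[i] += q[i] / batch.Count;
    }

    // Outer update: move the meta-parameters toward an initialization
    // that adapts well across tasks.
    for (int i = 0; i < theta.Length; i++)
        theta[i] -= outerLearningRate * metaGradient[i];
}
```

The sketch shows the first-order approximation (gradients at the adapted parameters applied directly to the meta-parameters); full MAML backpropagates through the inner-loop updates themselves.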
Constructors
MAMLAlgorithm(MAMLOptions<T, TInput, TOutput>)
Initializes a new instance of the MAMLAlgorithm class.
public MAMLAlgorithm(MAMLOptions<T, TInput, TOutput> options)
Parameters
options (MAMLOptions<T, TInput, TOutput>): MAML configuration options containing the model and all hyperparameters.
Examples
// Create MAML with minimal configuration (uses all defaults)
var options = new MAMLOptions<double, Tensor<double>, Tensor<double>>(myNeuralNetwork);
var maml = new MAMLAlgorithm<double, Tensor<double>, Tensor<double>>(options);

// Create MAML with custom configuration
var customOptions = new MAMLOptions<double, Tensor<double>, Tensor<double>>(myNeuralNetwork)
{
    LossFunction = new CrossEntropyLoss<double>(),
    InnerLearningRate = 0.01,
    OuterLearningRate = 0.001,
    AdaptationSteps = 5,
    UseFirstOrderApproximation = true
};
var customMaml = new MAMLAlgorithm<double, Tensor<double>, Tensor<double>>(customOptions);
Exceptions
- ArgumentNullException
Thrown when options is null.
- InvalidOperationException
Thrown when required components are not set in options.
Properties
AlgorithmType
Gets the algorithm type identifier for this meta-learner.
public override MetaLearningAlgorithmType AlgorithmType { get; }
Property Value
- MetaLearningAlgorithmType
Returns MAML.
Remarks
This property identifies the algorithm as MAML (Model-Agnostic Meta-Learning), which is useful for serialization, logging, and algorithm-specific handling.
Methods
Adapt(IMetaLearningTask<T, TInput, TOutput>)
Adapts the meta-learned model to a new task using MAML's inner loop optimization.
public override IModel<TInput, TOutput, ModelMetadata<T>> Adapt(IMetaLearningTask<T, TInput, TOutput> task)
Parameters
task (IMetaLearningTask<T, TInput, TOutput>): The new task containing support set examples for adaptation.
Returns
- IModel<TInput, TOutput, ModelMetadata<T>>
A new model instance that has been fine-tuned to the given task.
Remarks
This is where MAML's "learning to learn" capability shines. The meta-learned initialization allows the model to quickly adapt to new tasks with just a few gradient steps on a small support set.
Adaptation Process:
1. Clone the meta-model to preserve the learned initialization.
2. Perform K gradient descent steps on the task's support set.
3. Return the adapted model, ready for inference on new examples.
For Beginners: After meta-training, call this method when you have a new task with a few labeled examples. The returned model will be specialized for that task and ready to make predictions.
The number of gradient steps is controlled by AdaptationSteps. Typically 1-10 steps are sufficient thanks to the good initialization learned during meta-training.
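A typical adaptation call might look like the sketch below, where `newTask` and `queryInput` are hypothetical variables and the returned model is assumed to expose a prediction method:

```csharp
// Adapt the meta-trained model to a new task's few labeled examples
// (newTask carries the support set), then use the returned
// task-specific model for inference on unseen inputs.
var adaptedModel = maml.Adapt(newTask);   // runs AdaptationSteps gradient steps
var prediction = adaptedModel.Predict(queryInput);
```

The meta-model itself is left unchanged by Adapt; each call produces a fresh task-specific copy, so one meta-trained instance can serve many tasks.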
Exceptions
- ArgumentNullException
Thrown when task is null.
MetaTrain(TaskBatch<T, TInput, TOutput>)
Performs one meta-training step using MAML's bi-level optimization.
public override T MetaTrain(TaskBatch<T, TInput, TOutput> taskBatch)
Parameters
taskBatch (TaskBatch<T, TInput, TOutput>): A batch of tasks to meta-train on, each containing support and query sets.
Returns
- T
The average meta-loss across all tasks in the batch.
Remarks
MAML meta-training consists of two nested optimization loops:
Inner Loop (Task Adaptation): For each task in the batch:
1. Clone the meta-model with current meta-parameters.
2. Perform K gradient descent steps on the task's support set.
3. Evaluate the adapted model on the task's query set.
Outer Loop (Meta-Update):
1. Compute meta-gradients based on query set performance.
2. Average meta-gradients across all tasks in the batch.
3. Apply gradient clipping if configured.
4. Update meta-parameters using the averaged meta-gradients.
For Beginners: Each call to this method makes the model slightly better at learning new tasks quickly. The returned loss value should decrease over time.
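A minimal meta-training loop could look like this sketch, where `taskSampler` is a hypothetical source of TaskBatch instances (not part of this class):

```csharp
// Meta-training loop: each MetaTrain call performs one bi-level update
// and returns the average meta-loss over the batch.
for (int step = 0; step < 10000; step++)
{
    TaskBatch<double, Tensor<double>, Tensor<double>> batch = taskSampler.NextBatch();
    double metaLoss = maml.MetaTrain(batch);

    if (step % 100 == 0)
        Console.WriteLine($"step {step}: meta-loss = {metaLoss}");
}
```

The logged meta-loss should trend downward as the initialization improves; a plateau is the usual signal to stop meta-training and move on to Adapt.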
Exceptions
- ArgumentException
Thrown when the task batch is null or empty.
- InvalidOperationException
Thrown when meta-gradient computation fails.