Class ReptileAlgorithm<T, TInput, TOutput>
Namespace: AiDotNet.MetaLearning.Algorithms
Assembly: AiDotNet.dll
Implementation of the Reptile meta-learning algorithm.
public class ReptileAlgorithm<T, TInput, TOutput> : MetaLearnerBase<T, TInput, TOutput>, IMetaLearner<T, TInput, TOutput>
Type Parameters
T — The numeric type used for calculations (e.g., double, float).
TInput — The input data type (e.g., Matrix<T>, Tensor<T>).
TOutput — The output data type (e.g., Vector<T>, Tensor<T>).
- Inheritance
MetaLearnerBase<T, TInput, TOutput> → ReptileAlgorithm<T, TInput, TOutput>
- Implements
IMetaLearner<T, TInput, TOutput>
Remarks
Reptile is a simple and scalable meta-learning algorithm. Unlike MAML, it doesn't require computing gradients through the adaptation process, making it more efficient and easier to implement while achieving competitive performance.
Algorithm:
1. Sample a task (or batch of tasks).
2. Perform SGD on the task, starting from the current meta-parameters.
3. Update the meta-parameters by interpolating toward the adapted parameters.
4. Repeat.
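The interpolation in step 3 can be sketched in plain C# on a raw parameter vector. This is illustrative only — `SampleTasks` and `TrainOnTask` are hypothetical stand-ins for task sampling and K steps of SGD, not AiDotNet APIs:

```csharp
// Illustrative Reptile outer loop over a flat parameter array.
double epsilon = 0.1;                  // outer learning rate
double[] theta = new double[paramCount];

foreach (var task in SampleTasks(numIterations))
{
    // Adapt a copy of the current meta-parameters with K SGD steps on this task.
    double[] adapted = TrainOnTask((double[])theta.Clone(), task);

    // Move the meta-parameters a fraction of the way toward the adapted parameters.
    for (int i = 0; i < theta.Length; i++)
        theta[i] += epsilon * (adapted[i] - theta[i]);
}
```

Note that no second-order gradients appear anywhere: the meta-update is just a weighted average of where task-specific training ended up.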
For Beginners: Reptile is like learning by averaging your experiences.
Imagine learning to cook:
- You start with basic knowledge (initial parameters).
- You make a specific dish and learn specific techniques.
- Instead of just remembering that one dish, you update your basic knowledge to include some of what you learned.
- After cooking many dishes, your basic knowledge becomes really good for learning any new recipe quickly.
Reptile is simpler than MAML because it just moves toward adapted parameters instead of computing complex gradients through the adaptation process. The key insight is that this simple approach achieves similar performance to more complex methods like MAML.
Reference: Nichol, A., Achiam, J., & Schulman, J. (2018). On first-order meta-learning algorithms. arXiv:1803.02999.
Constructors
ReptileAlgorithm(ReptileOptions<T, TInput, TOutput>)
Initializes a new instance of the ReptileAlgorithm class.
public ReptileAlgorithm(ReptileOptions<T, TInput, TOutput> options)
Parameters
options IMetaLearningTask — ReptileOptions<T, TInput, TOutput>
Reptile configuration options containing the model and all hyperparameters.
Examples
// Create Reptile with minimal configuration
var options = new ReptileOptions<double, Tensor<double>, Tensor<double>>(myNeuralNetwork);
var reptile = new ReptileAlgorithm<double, Tensor<double>, Tensor<double>>(options);

// Create Reptile with custom configuration
var customOptions = new ReptileOptions<double, Tensor<double>, Tensor<double>>(myNeuralNetwork)
{
    AdaptationSteps = 10,
    InnerBatches = 2,
    Interpolation = 0.5,
    OuterLearningRate = 0.1
};
var customReptile = new ReptileAlgorithm<double, Tensor<double>, Tensor<double>>(customOptions);
Exceptions
- ArgumentNullException
Thrown when options is null.
- InvalidOperationException
Thrown when required components are not set in options.
Properties
AlgorithmType
Gets the algorithm type identifier for this meta-learner.
public override MetaLearningAlgorithmType AlgorithmType { get; }
Property Value
- MetaLearningAlgorithmType
Returns MetaLearningAlgorithmType.Reptile.
Remarks
This property identifies the algorithm as Reptile, a first-order meta-learning algorithm that uses parameter interpolation instead of gradient-based meta-updates.
Methods
Adapt(IMetaLearningTask<T, TInput, TOutput>)
Adapts the meta-learned model to a new task using gradient descent.
public override IModel<TInput, TOutput, ModelMetadata<T>> Adapt(IMetaLearningTask<T, TInput, TOutput> task)
Parameters
task — IMetaLearningTask<T, TInput, TOutput>
The new task containing support set examples for adaptation.
Returns
- IModel<TInput, TOutput, ModelMetadata<T>>
A new model instance that has been fine-tuned to the given task.
Remarks
Reptile adaptation is straightforward: perform SGD on the support set for K steps. The meta-learned initialization enables fast adaptation.
For Beginners: When adapting to a new task, Reptile works just like regular training - take gradient steps on the examples. The magic is in the initialization, which was learned during meta-training to be a great starting point for any task.
Note: Reptile can use many adaptation steps since there is no need to backpropagate through them. More steps often lead to better task performance, especially on harder tasks.
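A hypothetical usage sketch, assuming `reptile` was constructed as in the examples above and `newTask` implements IMetaLearningTask with a populated support set (the `Predict` call on the returned model is an assumption about the IModel interface, not confirmed here):

```csharp
// Fine-tune the meta-learned initialization on a new few-shot task.
var adaptedModel = reptile.Adapt(newTask);

// The returned model is a separate instance; the meta-parameters are unchanged,
// so Adapt can be called repeatedly for different tasks.
var predictions = adaptedModel.Predict(querySetInputs);
```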
Exceptions
- ArgumentNullException
Thrown when task is null.
MetaTrain(TaskBatch<T, TInput, TOutput>)
Performs one meta-training step using Reptile's parameter interpolation approach.
public override T MetaTrain(TaskBatch<T, TInput, TOutput> taskBatch)
Parameters
taskBatch — TaskBatch<T, TInput, TOutput>
A batch of tasks to meta-train on, each containing support and query sets.
Returns
- T
The average loss across all tasks in the batch (evaluated on query sets).
Remarks
Reptile meta-training is simpler than MAML:
For each task:
1. Clone the meta-model with the current meta-parameters.
2. Perform K gradient descent steps on the task's support set.
3. Compute the adaptation direction: (adapted_params - initial_params).
Meta-update:
1. Average the adaptation directions across all tasks.
2. Move the meta-parameters in that direction: theta_new = theta_old + epsilon * direction.
For Beginners: Reptile simply says: "I adapted to these tasks and ended up at these new parameters. Let me move my starting point a little bit in that direction, so next time I'm closer to where I need to be."
The beauty of Reptile is that this simple interpolation, when done across many tasks, converges to a good initialization for few-shot learning.
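A hypothetical meta-training loop, assuming `reptile` was constructed as in the examples above and a `taskSampler` (not part of this class) that yields TaskBatch instances:

```csharp
// Repeatedly sample task batches and apply Reptile's interpolation update.
for (int iteration = 0; iteration < 10000; iteration++)
{
    TaskBatch<double, Tensor<double>, Tensor<double>> batch = taskSampler.SampleBatch();

    // Returns the average query-set loss across the batch.
    double loss = reptile.MetaTrain(batch);

    if (iteration % 100 == 0)
        Console.WriteLine($"Iteration {iteration}: query loss = {loss}");
}
```

The returned query-set loss is a useful convergence signal: it measures how well the current initialization adapts to held-out examples, not just how well it fits the support sets.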
Exceptions
- ArgumentException
Thrown when the task batch is null or empty.
- InvalidOperationException
Thrown when parameter update computation fails.