Class GNNMetaAlgorithm<T, TInput, TOutput>

Namespace
AiDotNet.MetaLearning.Algorithms
Assembly
AiDotNet.dll

Implementation of Graph Neural Network-based Meta-learning.

public class GNNMetaAlgorithm<T, TInput, TOutput> : MetaLearnerBase<T, TInput, TOutput>, IMetaLearner<T, TInput, TOutput>

Type Parameters

T

The numeric type used for calculations (e.g., double, float).

TInput

The input data type (e.g., Matrix<T>, Tensor<T>).

TOutput

The output data type (e.g., Vector<T>, Tensor<T>).

Inheritance
MetaLearnerBase<T, TInput, TOutput>
GNNMetaAlgorithm<T, TInput, TOutput>
Implements
IMetaLearner<T, TInput, TOutput>

Remarks

GNN-based meta-learning models tasks and examples as nodes in a graph, with edges representing relationships between them. The graph neural network learns to propagate information across the task structure to improve learning.

Key Innovation: Instead of treating tasks independently, GNN Meta-learning:

1. Builds a graph where nodes represent tasks or examples
2. Connects similar or related tasks with edges
3. Propagates useful information between tasks via message passing
4. Uses the aggregated graph information to guide adaptation
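The edge-building step can be sketched in plain C#. This is an illustrative example only, not AiDotNet's implementation: the cosine metric and the 0.5 threshold are assumptions (the library exposes configurable similarity metrics via TaskSimilarityMetric).

```csharp
using System;

// Illustrative sketch: build a dense adjacency matrix from cosine similarity
// between task embeddings. Not AiDotNet's actual graph-construction code.
static double Cosine(double[] a, double[] b)
{
    double dot = 0, na = 0, nb = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
    }
    return dot / (Math.Sqrt(na) * Math.Sqrt(nb) + 1e-12);
}

static double[][] BuildTaskGraph(double[][] taskEmbeddings, double threshold = 0.5)
{
    int n = taskEmbeddings.Length;
    var adjacency = new double[n][];
    for (int i = 0; i < n; i++)
    {
        adjacency[i] = new double[n];
        for (int j = 0; j < n; j++)
        {
            if (i == j) continue; // no self-loops
            double sim = Cosine(taskEmbeddings[i], taskEmbeddings[j]);
            // Keep an edge only between sufficiently similar tasks.
            adjacency[i][j] = sim >= threshold ? sim : 0.0;
        }
    }
    return adjacency;
}
```

The resulting weighted adjacency matrix is what the message-passing rounds operate over.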

For Beginners: GNN Meta-learning is like studying with a study group:

- MAML: Each student learns alone but starts with good study habits
- GNN Meta: Students share notes and help each other learn faster

When learning a new subject (task), you can benefit from what others who studied similar subjects (similar tasks) have learned. The graph network learns which tasks are helpful for each other.

Architecture:

- Node Embeddings: Each task gets a vector representation
- Edge Weights: Learned weights showing task relationships
- Message Passing: Information flows between related tasks
- Graph Aggregation: Combines all node information for prediction
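The message-passing step can be sketched as a single mean-aggregation round. This is illustrative only (AiDotNet also supports other schemes such as attention via GNNAggregationType), and the equal 0.5/0.5 mixing of self and neighbor information is an assumption:

```csharp
// Illustrative mean-aggregation message-passing round, not AiDotNet's code.
// Each node's new embedding mixes its own state with a weighted average of
// its neighbors' states, using the adjacency weights as edge strengths.
static double[][] MessagePassingRound(double[][] embeddings, double[][] adjacency)
{
    int n = embeddings.Length, d = embeddings[0].Length;
    var updated = new double[n][];
    for (int i = 0; i < n; i++)
    {
        var neighborSum = new double[d];
        double totalWeight = 0;
        for (int j = 0; j < n; j++)
        {
            if (adjacency[i][j] == 0) continue;
            for (int k = 0; k < d; k++)
                neighborSum[k] += adjacency[i][j] * embeddings[j][k];
            totalWeight += adjacency[i][j];
        }
        updated[i] = new double[d];
        for (int k = 0; k < d; k++)
        {
            double neighborMean = totalWeight > 0 ? neighborSum[k] / totalWeight : 0.0;
            updated[i][k] = 0.5 * embeddings[i][k] + 0.5 * neighborMean;
        }
    }
    return updated;
}
```

Running K such rounds lets each task embedding absorb information from tasks up to K hops away in the graph.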

Algorithm:

For each task batch:
  1. Build task graph from current batch
  2. Compute node embeddings from task data
  3. Perform K rounds of message passing
  4. Aggregate graph information for each task
  5. Use graph context to guide adaptation
  6. Compute meta-gradients and update all components

Reference: Garcia, V., & Bruna, J. (2018). Few-shot learning with graph neural networks. ICLR 2018.

Constructors

GNNMetaAlgorithm(GNNMetaOptions<T, TInput, TOutput>)

Initializes a new instance of the GNNMetaAlgorithm class.

public GNNMetaAlgorithm(GNNMetaOptions<T, TInput, TOutput> options)

Parameters

options GNNMetaOptions<T, TInput, TOutput>

GNN Meta configuration options containing the model and all hyperparameters.

Examples

// Create GNN Meta with minimal configuration
var options = new GNNMetaOptions<double, Tensor<double>, Tensor<double>>(myNeuralNetwork);
var gnnMeta = new GNNMetaAlgorithm<double, Tensor<double>, Tensor<double>>(options);

// Create GNN Meta with custom configuration
var customOptions = new GNNMetaOptions<double, Tensor<double>, Tensor<double>>(myNeuralNetwork)
{
    NumMessagePassingLayers = 5,
    NodeEmbeddingDimension = 256,
    AggregationType = GNNAggregationType.Attention,
    SimilarityMetric = TaskSimilarityMetric.GradientSimilarity
};
var customGnnMeta = new GNNMetaAlgorithm<double, Tensor<double>, Tensor<double>>(customOptions);

Exceptions

ArgumentNullException

Thrown when options is null.

InvalidOperationException

Thrown when required components are not set in options.

Properties

AlgorithmType

Gets the algorithm type identifier for this meta-learner.

public override MetaLearningAlgorithmType AlgorithmType { get; }

Property Value

MetaLearningAlgorithmType

Returns GNNMeta.

Remarks

This property identifies the algorithm as GNN Meta-learning, which uses graph neural networks to model task relationships and propagate information between related tasks.

Methods

Adapt(IMetaLearningTask<T, TInput, TOutput>)

Adapts the meta-learned model to a new task using graph-informed adaptation.

public override IModel<TInput, TOutput, ModelMetadata<T>> Adapt(IMetaLearningTask<T, TInput, TOutput> task)

Parameters

task IMetaLearningTask<T, TInput, TOutput>

The new task containing support set examples for adaptation.

Returns

IModel<TInput, TOutput, ModelMetadata<T>>

A new model instance that has been adapted to the given task.

Remarks

GNN adaptation uses learned graph structure for context:

Adaptation Process:

1. Compute a task embedding from the support set
2. If previous tasks exist, use the learned relationships
3. Apply graph context to guide adaptation
4. Perform gradient-based fine-tuning

For Beginners: When adapting to a new task, GNN Meta uses what it learned about task relationships. If the new task is similar to tasks it's seen before, it can leverage that knowledge.

It's like starting a new subject - if you remember that it's similar to something you studied before, you can apply relevant techniques.
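A hedged usage sketch of Adapt, assuming a gnnMeta instance constructed as in the examples above; newTask, queryInput, and the Predict call on the returned model are assumptions for illustration (task construction is not covered on this page):

```csharp
// newTask is assumed to implement
// IMetaLearningTask<double, Tensor<double>, Tensor<double>>
// and to carry the support set for the new task.
var adaptedModel = gnnMeta.Adapt(newTask);

// The returned model is a separate instance specialized to newTask.
// Predict is assumed here as the inference entry point of IModel.
var prediction = adaptedModel.Predict(queryInput);
```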

Exceptions

ArgumentNullException

Thrown when task is null.

MetaTrain(TaskBatch<T, TInput, TOutput>)

Performs one meta-training step using GNN-based task relationship modeling.

public override T MetaTrain(TaskBatch<T, TInput, TOutput> taskBatch)

Parameters

taskBatch TaskBatch<T, TInput, TOutput>

A batch of tasks to meta-train on, each containing support and query sets.

Returns

T

The average meta-loss across all tasks in the batch (evaluated on query sets).

Remarks

GNN Meta-training leverages task relationships through graph structure:

Graph Construction:

1. Each task in the batch becomes a node in the graph
2. Edges are created based on task similarity (configurable metric)
3. Edge weights are either fixed or learned during training

Message Passing:

1. Node embeddings are initialized from task representations
2. At each layer, nodes aggregate information from their neighbors
3. Updated embeddings capture multi-hop task relationships

Adaptation with Graph Context:

1. Graph embeddings provide context for each task
2. Context guides the adaptation process
3. Similar tasks benefit from shared information

For Beginners: During training, the GNN learns:

- Which tasks are similar and should share information
- How to combine information from related tasks
- How to use this combined information for better adaptation

It's like learning which study partners are most helpful and how to best combine their notes with your own understanding.
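A hedged sketch of an outer meta-training loop around MetaTrain. Only MetaTrain's signature comes from this page; SampleTaskBatch, the step count, and the logging interval are illustrative assumptions:

```csharp
// gnnMeta is assumed to be constructed as in the constructor examples.
for (int step = 0; step < 1000; step++)
{
    // SampleTaskBatch() is a hypothetical helper that assembles a
    // TaskBatch<double, Tensor<double>, Tensor<double>> of support/query sets.
    var batch = SampleTaskBatch();

    // One meta-training step: builds the task graph, runs message passing,
    // adapts with graph context, and updates all components.
    double metaLoss = gnnMeta.MetaTrain(batch);

    if (step % 100 == 0)
        Console.WriteLine($"Step {step}: meta-loss = {metaLoss}");
}
```

In practice the meta-loss on query sets is the quantity to monitor for convergence across training steps.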

Exceptions

ArgumentException

Thrown when the task batch is null or empty.

InvalidOperationException

Thrown when meta-gradient computation fails.