Class FederatedMetaLearningOptions
Configuration options for federated meta-learning.
public sealed class FederatedMetaLearningOptions
- Inheritance: object → FederatedMetaLearningOptions
Remarks
For Beginners: Meta-learning in federated settings aims to learn a "good starting point" (initial model) that can adapt quickly to each client's local data with a small amount of fine-tuning.
In this library, federated meta-learning is implemented as an alternative server update rule that uses client adaptation results (post-local training) to update the global initialization.
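The following is a minimal sketch of what such a server update rule can look like, assuming a Reptile-style first-order step over flat double[] parameter vectors. The ReptileSketch class, the MetaUpdate helper, and the parameter representation are illustrative assumptions, not the library's actual API.
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only: a Reptile-style server meta-update that moves the
// global initialization toward the average of the client-adapted parameters.
public static class ReptileSketch
{
    public static double[] MetaUpdate(
        double[] globalInit,
        IReadOnlyList<double[]> clientAdaptedParams,
        double metaLearningRate)
    {
        var newInit = (double[])globalInit.Clone();

        for (int i = 0; i < newInit.Length; i++)
        {
            // Average adaptation delta across clients: mean(adapted[i] - init[i]).
            double avgDelta = clientAdaptedParams.Average(p => p[i] - globalInit[i]);

            // Move the global initialization toward the client-adapted models.
            newInit[i] += metaLearningRate * avgDelta;
        }

        return newInit;
    }
}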
Properties
Enabled
Gets or sets whether federated meta-learning is enabled.
public bool Enabled { get; set; }
Property Value
bool
InnerEpochs
Gets or sets the number of local adaptation epochs used for the inner loop.
public int InnerEpochs { get; set; }
Property Value
int
Remarks
If not set (or <= 0), the trainer falls back to the federated LocalEpochs value.
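A sketch of that fallback, where federatedOptions is a hypothetical object exposing the federated LocalEpochs value:
// Hypothetical fallback: a positive InnerEpochs overrides the federated
// LocalEpochs setting; otherwise LocalEpochs is used.
int innerEpochs = options.InnerEpochs > 0
    ? options.InnerEpochs
    : federatedOptions.LocalEpochs;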
MetaLearningRate
Gets or sets the server meta learning rate applied to the average adaptation delta.
public double MetaLearningRate { get; set; }
Property Value
double
Remarks
For Beginners: This controls how strongly the server moves the global initialization toward the client-adapted models each round. A value of 1.0 means "move fully to the average adapted model" (similar to FedAvg when inner epochs match).
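Viewed per parameter, the step is an interpolation between the current initialization and the average adapted model. The names init, avgAdapted, and options below are placeholders for illustration, not library code:
// Sketch of the per-parameter server step:
// newInit = init + MetaLearningRate * (avgAdapted - init)
// With MetaLearningRate = 1.0, newInit == avgAdapted (the average adapted model).
double newInit = init + options.MetaLearningRate * (avgAdapted - init);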
Strategy
Gets or sets the federated meta-learning strategy name.
public string Strategy { get; set; }
Property Value
string
Remarks
Supported built-ins:
- "None"
- "Reptile" (first-order meta-update based on post-adaptation parameters)
- "PerFedAvg" (treated as a Reptile-style first-order update in v1)
- "FedMAML" (first-order approximation in v1; full second-order requires explicit gradient support)