
Class InMemoryFederatedTrainer<T, TInput, TOutput>

Namespace
AiDotNet.FederatedLearning.Trainers
Assembly
AiDotNet.dll

In-memory federated learning trainer for local simulation and tests.

public sealed class InMemoryFederatedTrainer<T, TInput, TOutput> : FederatedTrainerBase<IFullModel<T, TInput, TOutput>, FederatedClientDataset<TInput, TOutput>, FederatedLearningMetadata, T>, IFederatedTrainer<IFullModel<T, TInput, TOutput>, FederatedClientDataset<TInput, TOutput>, FederatedLearningMetadata>

Type Parameters

T
The numeric type used for model parameters and computations.
TInput
The type of input the model accepts.
TOutput
The type of output the model produces.
Inheritance
object
FederatedTrainerBase<IFullModel<T, TInput, TOutput>, FederatedClientDataset<TInput, TOutput>, FederatedLearningMetadata, T>
InMemoryFederatedTrainer<T, TInput, TOutput>
Implements
IFederatedTrainer<IFullModel<T, TInput, TOutput>, FederatedClientDataset<TInput, TOutput>, FederatedLearningMetadata>
Remarks

This trainer runs federated learning rounds in-process by creating per-client model/optimizer instances, running local optimization on each client's data, and aggregating client models into a global model.
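The aggregation step can be pictured as a sample-weighted average of client models (FedAvg-style). The sketch below is a conceptual illustration only, using plain arrays in place of the library's model and optimizer types; it is not the trainer's actual implementation.

using System.Collections.Generic;
using System.Linq;

static class FedAvgSketch
{
    // Weighted average of client weight vectors: each client's contribution
    // is scaled by its share of the total training samples.
    public static double[] AggregateRound(
        IReadOnlyList<double[]> clientWeights,
        IReadOnlyList<int> clientSampleCounts)
    {
        int totalSamples = clientSampleCounts.Sum();
        var aggregated = new double[clientWeights[0].Length];

        for (int c = 0; c < clientWeights.Count; c++)
        {
            double share = (double)clientSampleCounts[c] / totalSamples;
            for (int i = 0; i < aggregated.Length; i++)
                aggregated[i] += share * clientWeights[c][i];
        }

        return aggregated;
    }
}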

Constructors

InMemoryFederatedTrainer(IOptimizer<T, TInput, TOutput>, double?, int?, double, int, FederatedLearningOptions?, IPrivacyMechanism<Vector<T>>?, IPrivacyAccountant?, IClientSelectionStrategy?, IFederatedServerOptimizer<T>?, IFederatedHeterogeneityCorrection<T>?, IHomomorphicEncryptionProvider<T>?)

public InMemoryFederatedTrainer(IOptimizer<T, TInput, TOutput> optimizerPrototype, double? learningRateOverride = null, int? randomSeed = null, double convergenceThreshold = 0.001, int minRoundsBeforeConvergence = 10, FederatedLearningOptions? federatedLearningOptions = null, IPrivacyMechanism<Vector<T>>? differentialPrivacyMechanism = null, IPrivacyAccountant? privacyAccountant = null, IClientSelectionStrategy? clientSelectionStrategy = null, IFederatedServerOptimizer<T>? serverOptimizer = null, IFederatedHeterogeneityCorrection<T>? heterogeneityCorrection = null, IHomomorphicEncryptionProvider<T>? homomorphicEncryptionProvider = null)

Parameters

optimizerPrototype IOptimizer<T, TInput, TOutput>
learningRateOverride double?
randomSeed int?
convergenceThreshold double
minRoundsBeforeConvergence int
federatedLearningOptions FederatedLearningOptions
differentialPrivacyMechanism IPrivacyMechanism<Vector<T>>
privacyAccountant IPrivacyAccountant
clientSelectionStrategy IClientSelectionStrategy
serverOptimizer IFederatedServerOptimizer<T>
heterogeneityCorrection IFederatedHeterogeneityCorrection<T>
homomorphicEncryptionProvider IHomomorphicEncryptionProvider<T>
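
Examples

The following shows one way to construct the trainer. The GradientDescentOptimizer type and the Matrix<double>/Vector<double> generic arguments are illustrative assumptions; substitute the optimizer and input/output types used in your project.

// NOTE: the optimizer type and generic arguments below are assumptions
// for illustration, not a prescribed configuration.
var optimizer = new GradientDescentOptimizer<double, Matrix<double>, Vector<double>>();

var trainer = new InMemoryFederatedTrainer<double, Matrix<double>, Vector<double>>(
    optimizerPrototype: optimizer,
    learningRateOverride: 0.01,      // optional: overrides the prototype's learning rate
    randomSeed: 42,                  // optional: makes runs reproducible
    convergenceThreshold: 0.001,
    minRoundsBeforeConvergence: 10); // optional services (privacy, selection, etc.) keep their null defaults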

Methods

Train(Dictionary<int, FederatedClientDataset<TInput, TOutput>>, int, double, int)

Executes multiple rounds of federated learning until convergence or the maximum number of rounds is reached.

public override FederatedLearningMetadata Train(Dictionary<int, FederatedClientDataset<TInput, TOutput>> clientData, int rounds, double clientSelectionFraction = 1, int localEpochs = 1)

Parameters

clientData Dictionary<int, FederatedClientDataset<TInput, TOutput>>

Dictionary mapping client IDs to their local training data.

rounds int

Maximum number of federated learning rounds to execute.

clientSelectionFraction double

Fraction of clients to select per round (0.0 to 1.0). For example, with 100 clients and a fraction of 0.1, 10 clients participate in each round.

localEpochs int

Number of training epochs each client performs per round.

Returns

FederatedLearningMetadata

Aggregated metadata across all training rounds.

Remarks

This method orchestrates the entire federated learning process by:

  • Running multiple training rounds
  • Monitoring convergence (when the model stops improving significantly)
  • Tracking performance metrics across rounds
  • Applying privacy mechanisms if configured

For Beginners: This is the complete federated learning process from start to finish. It's like running an entire semester of study group sessions, where you continue meeting until everyone has mastered the material or you've run out of time.
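
Examples

A minimal end-to-end call might look like the following. Construction of the FederatedClientDataset instances is not shown; clientZeroDataset and clientOneDataset are placeholders, and the generic arguments are the same illustrative assumptions as in the constructor example.

// Map each client ID to its local dataset (dataset construction not shown).
var clientData = new Dictionary<int, FederatedClientDataset<Matrix<double>, Vector<double>>>
{
    [0] = clientZeroDataset,
    [1] = clientOneDataset
};

// Run up to 50 rounds, selecting 80% of clients per round,
// with 3 local epochs per selected client.
FederatedLearningMetadata metadata = trainer.Train(
    clientData,
    rounds: 50,
    clientSelectionFraction: 0.8,
    localEpochs: 3);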

TrainRound(Dictionary<int, FederatedClientDataset<TInput, TOutput>>, double, int)

Executes one round of federated learning where clients train locally and updates are aggregated.

public override FederatedLearningMetadata TrainRound(Dictionary<int, FederatedClientDataset<TInput, TOutput>> clientData, double clientSelectionFraction = 1, int localEpochs = 1)

Parameters

clientData Dictionary<int, FederatedClientDataset<TInput, TOutput>>

Dictionary mapping client IDs to their local training data.

clientSelectionFraction double

Fraction of clients to select for this round (0.0 to 1.0).

localEpochs int

Number of training epochs each client should perform locally.

Returns

FederatedLearningMetadata

Metadata about the training round including accuracy, loss, and convergence metrics.

Remarks

A federated learning round consists of several steps:

  1. The global model is sent to selected clients
  2. Each client trains the model on their local data
  3. Clients send their model updates back to the server
  4. The server aggregates these updates using an aggregation strategy
  5. The global model is updated with the aggregated result

For Beginners: Think of this as one iteration in a collaborative learning cycle. Everyone gets the current version of the shared knowledge, studies independently, then contributes their improvements. These improvements are combined to create an even better version for the next round.

For example:

  • Round 1: Clients receive initial model, train for 5 epochs, send updates
  • Server aggregates updates and improves global model
  • Round 2: Clients receive improved model, train again, send updates
  • This continues until the model reaches the desired accuracy or the round limit is reached
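
Examples

Callers that want control between rounds (custom logging, checkpointing, or a custom stopping rule) can drive TrainRound manually instead of calling Train. This sketch reuses the trainer and clientData variables from the earlier examples.

// One round per iteration; inspect the returned metadata between rounds.
for (int round = 1; round <= 50; round++)
{
    FederatedLearningMetadata roundMetadata = trainer.TrainRound(
        clientData,
        clientSelectionFraction: 0.8,
        localEpochs: 5);

    // roundMetadata carries the round's accuracy, loss, and convergence
    // metrics; break early here if your own stopping criterion is met.
}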