Class FedAvgMServerOptimizer<T>
Namespace: AiDotNet.FederatedLearning.ServerOptimizers
Assembly: AiDotNet.dll
FedAvgM server optimizer, which applies server-side momentum to the aggregated client update each round.
public sealed class FedAvgMServerOptimizer<T> : FederatedServerOptimizerBase<T>, IFederatedServerOptimizer<T>
Type Parameters
T
The numeric element type of the parameter vectors.
- Inheritance
FederatedServerOptimizerBase<T> → FedAvgMServerOptimizer<T>
- Implements
IFederatedServerOptimizer<T>
Remarks
For Beginners: Momentum helps smooth updates across rounds. Instead of applying only the current round's update, the server maintains a running "velocity" that accumulates updates.
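The per-round update can be sketched as follows. This is an illustrative reimplementation with plain double arrays rather than the library's Vector<T>; the class name FedAvgMSketch and its fields are placeholders, not AiDotNet API.

```csharp
// Illustrative sketch of the FedAvgM server rule (not the library's internal code).
public sealed class FedAvgMSketch
{
    private readonly double _learningRate;
    private readonly double _momentum;
    private double[]? _velocity; // running velocity, persisted across rounds

    public FedAvgMSketch(double learningRate = 1.0, double momentum = 0.9)
    {
        _learningRate = learningRate;
        _momentum = momentum;
    }

    public double[] Step(double[] current, double[] aggregatedTarget)
    {
        _velocity ??= new double[current.Length];
        var updated = new double[current.Length];
        for (int i = 0; i < current.Length; i++)
        {
            // Pseudo-gradient: how far the aggregated target moved away from the current model.
            double delta = aggregatedTarget[i] - current[i];

            // Blend the new update into the decayed velocity, then step along it.
            _velocity[i] = _momentum * _velocity[i] + delta;
            updated[i] = current[i] + _learningRate * _velocity[i];
        }
        return updated;
    }
}
```

With momentum set to 0 this reduces to plain FedAvg; larger momentum values carry more of the previous rounds' direction forward.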
Constructors
FedAvgMServerOptimizer(double, double)
public FedAvgMServerOptimizer(double learningRate = 1, double momentum = 0.9)
Parameters
learningRate (double): The server learning rate applied to the accumulated velocity. Defaults to 1.
momentum (double): The momentum coefficient controlling how much of the previous rounds' velocity is retained. Defaults to 0.9.
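For example, constructing the optimizer with its documented defaults (assuming double is used for T) might look like this:

```csharp
// learningRate = 1 applies the full momentum-smoothed update each round;
// momentum = 0.9 keeps 90% of the previous velocity.
var optimizer = new FedAvgMServerOptimizer<double>(learningRate: 1.0, momentum: 0.9);
```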
Methods
GetOptimizerName()
Gets the name of the server optimizer.
public override string GetOptimizerName()
Returns
string: The name of the server optimizer.
Step(Vector<T>, Vector<T>)
Updates global parameters given the current parameters and an aggregated target.
public override Vector<T> Step(Vector<T> currentGlobalParameters, Vector<T> aggregatedTargetParameters)
Parameters
currentGlobalParameters (Vector<T>): The current global parameter vector.
aggregatedTargetParameters (Vector<T>): The aggregated target parameter vector (e.g., the FedAvg output).
Returns
Vector<T>: The updated global parameter vector.
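A per-round usage sketch is shown below. The aggregateClients delegate stands in for whichever aggregation step (e.g., a weighted FedAvg of client models) produces the target vector; it and RunRounds are hypothetical helpers, and the usual System and AiDotNet using directives are assumed.

```csharp
// Hypothetical training loop: aggregate client models each round, then let the
// optimizer blend the result into the global model with server momentum.
Vector<double> RunRounds(
    FedAvgMServerOptimizer<double> optimizer,
    Vector<double> initialGlobal,
    Func<Vector<double>, Vector<double>> aggregateClients, // e.g., weighted FedAvg of client models
    int rounds)
{
    var global = initialGlobal;
    for (int round = 0; round < rounds; round++)
    {
        var target = aggregateClients(global);   // clients train from `global`; server aggregates the results
        global = optimizer.Step(global, target); // momentum step toward the aggregated target
    }
    return global;
}
```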