Namespace AiDotNet.LearningRateSchedulers
Classes
- ConstantLRScheduler
Maintains a constant learning rate throughout training.
- CosineAnnealingLRScheduler
Sets the learning rate using a cosine annealing schedule.
- CosineAnnealingWarmRestartsScheduler
Sets the learning rate using cosine annealing with warm restarts.
- CyclicLRScheduler
Implements the cyclical learning rate (CLR) policy.
- ExponentialLRScheduler
Decays the learning rate exponentially every step.
- LambdaLRScheduler
Sets the learning rate using a user-defined lambda function.
- LearningRateSchedulerBase
Base class for learning rate schedulers providing common functionality.
- LearningRateSchedulerFactory
Factory for creating learning rate schedulers with common configurations.
- LinearWarmupScheduler
Implements linear learning rate warmup followed by a constant or decaying schedule.
- MultiStepLRScheduler
Decays the learning rate by gamma at each milestone step.
- OneCycleLRScheduler
Implements the 1cycle learning rate policy.
- PolynomialLRScheduler
Decays the learning rate using a polynomial function.
- ReduceOnPlateauScheduler
Reduces the learning rate when a monitored metric has stopped improving.
- SequentialLRScheduler
Chains multiple learning rate schedulers together in sequence.
- StepLRScheduler
Decays the learning rate by a factor (gamma) every specified number of steps (see the schedule sketch after this list).
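The step-based and cosine schedulers listed above are driven by closed-form schedule formulas. The snippet below is a minimal, self-contained sketch of that math only; the helper names are hypothetical and are not the AiDotNet scheduler API.

```csharp
using System;

// Illustrative only: standalone helpers for two common schedule formulas.
// These names are hypothetical, not AiDotNet APIs.
static class ScheduleMathSketch
{
    // Step decay: lr = baseLr * gamma^floor(step / stepSize).
    static double StepDecay(double baseLr, double gamma, int stepSize, int step)
        => baseLr * Math.Pow(gamma, step / stepSize); // integer division acts as floor

    // Cosine annealing: anneals from baseLr down to minLr over tMax steps.
    static double CosineAnnealing(double baseLr, double minLr, int tMax, int step)
        => minLr + 0.5 * (baseLr - minLr) * (1 + Math.Cos(Math.PI * step / tMax));

    static void Main()
    {
        for (int step = 0; step <= 100; step += 25)
        {
            Console.WriteLine(
                $"step {step,3}: step-decay={StepDecay(0.1, 0.5, 30, step):F4}  " +
                $"cosine={CosineAnnealing(0.1, 0.001, 100, step):F4}");
        }
    }
}
```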
Interfaces
- ILearningRateScheduler
Interface for learning rate schedulers that adjust the learning rate during training.
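As an illustration of the contract such an interface typically expresses, here is a hypothetical constant-rate scheduler. The member names (Step, CurrentLearningRate) are assumptions for the sketch, not the actual ILearningRateScheduler signature.

```csharp
// Hypothetical sketch of a scheduler contract; the real ILearningRateScheduler
// members may differ. Assumed members: Step() advances the schedule and
// CurrentLearningRate exposes the rate for the current step.
public interface ISchedulerSketch
{
    double CurrentLearningRate { get; }
    void Step();
}

// The constant scheduler is the simplest implementation: the learning rate
// never changes, no matter how many times Step() is called.
public sealed class ConstantSchedulerSketch : ISchedulerSketch
{
    public ConstantSchedulerSketch(double learningRate) => CurrentLearningRate = learningRate;

    public double CurrentLearningRate { get; }

    public void Step() { /* nothing to update for a constant schedule */ }
}
```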
Enums
- CyclicLRScheduler.CyclicMode
Mode for the cyclic learning rate policy.
- LearningRateSchedulerType
Enumeration of available learning rate scheduler types.
- LinearWarmupScheduler.DecayMode
Decay mode applied after the warmup phase.
- OneCycleLRScheduler.AnnealingStrategy
Annealing strategy for the decay phase.
- ReduceOnPlateauScheduler.Mode
Optimization mode, i.e., whether the monitored metric should be minimized or maximized.
- ReduceOnPlateauScheduler.ThresholdMode
Threshold comparison mode.
- SchedulerStepMode
Specifies when the learning rate scheduler should be stepped during training.
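The difference between stepping granularities can be seen in a training loop. The sketch below illustrates per-batch versus per-epoch stepping; all names are illustrative placeholders, not AiDotNet APIs.

```csharp
using System;

// Hypothetical illustration of the two stepping granularities suggested by
// SchedulerStepMode: once per batch versus once per epoch.
static class StepModeSketch
{
    static void Train(Action stepScheduler, bool stepPerBatch, int epochs, int batchesPerEpoch)
    {
        for (int epoch = 0; epoch < epochs; epoch++)
        {
            for (int batch = 0; batch < batchesPerEpoch; batch++)
            {
                // ... forward pass, loss, backward pass, optimizer update ...
                if (stepPerBatch)
                    stepScheduler();   // fine-grained schedules (e.g., OneCycle) typically step here
            }
            if (!stepPerBatch)
                stepScheduler();       // coarse schedules (e.g., StepLR) typically step per epoch
        }
    }

    static void Main()
    {
        int calls = 0;
        Train(() => calls++, stepPerBatch: true, epochs: 2, batchesPerEpoch: 5);
        Console.WriteLine($"Scheduler stepped {calls} times"); // 10 with per-batch stepping
    }
}
```

Which granularity is appropriate depends on the schedule: curves defined over individual batches (such as 1cycle or warmup) are usually stepped per batch, while coarse milestone-based decays are usually stepped per epoch.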