Class AutoMLMultiFidelityOptions
Namespace: AiDotNet.Configuration
Assembly: AiDotNet.dll
Configuration options for multi-fidelity/ASHA AutoML search.
public sealed class AutoMLMultiFidelityOptions
- Inheritance
- object → AutoMLMultiFidelityOptions
Remarks
Multi-fidelity search tries many configurations quickly with a smaller "budget" (for example, a subset of the training data) and then promotes only the most promising trials to larger budgets.
ASHA (Asynchronous Successive Halving Algorithm) extends this with parallel trial execution and early stopping of underperforming trials, often providing a 5-10x speedup over grid or random search.
For Beginners: Instead of fully training every trial (which is slow), multi-fidelity search works like this:
- Train many candidates on a small amount of data.
- Keep only the best candidates.
- Train those candidates on more data.
- Repeat until you reach full training.
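A minimal configuration sketch is shown below. The property names and their documented semantics come from this page; how the options object is passed to an AutoML runner is not shown here and would depend on the rest of the AiDotNet API.

```csharp
using AiDotNet.Configuration;

// Sketch: property names are from this page; the chosen values are illustrative.
var options = new AutoMLMultiFidelityOptions
{
    TrainingFractions = new[] { 0.1, 0.25, 0.5, 1.0 }, // fidelity levels (fractions of training data)
    ReductionFactor = 3.0,         // keep ~1/3 of trials at each promotion
    EnableAsyncExecution = true,   // ASHA-style parallel trial execution
    MaxParallelism = 0,            // 0 or negative => use Environment.ProcessorCount
    EarlyStoppingPatience = 5,     // stop a trial after 5 checkpoints without improvement
    EarlyStoppingMinDelta = 0.001, // minimum improvement that resets the patience counter
    GracePeriod = 3                // never stop a trial before its 3rd checkpoint
};
```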
Properties
EarlyStoppingMinDelta
Gets or sets the minimum improvement threshold for early stopping.
public double EarlyStoppingMinDelta { get; set; }
Property Value
- double
Remarks
A trial must improve by at least this amount to reset the patience counter.
For Beginners: Smaller values are more lenient; 0.001 is a good default.
EarlyStoppingPatience
Gets or sets the early stopping patience for individual trials within a rung.
public int EarlyStoppingPatience { get; set; }
Property Value
- int
Remarks
If a trial's score doesn't improve for this many checkpoints, it is stopped early. A value of 0 or negative disables per-trial early stopping.
For Beginners: Higher values give each trial more time to improve, but make the overall search slower.
EnableAsyncExecution
Gets or sets whether to enable ASHA-style async parallel trial execution.
public bool EnableAsyncExecution { get; set; }
Property Value
- bool
Remarks
When enabled, trials at each rung are executed in parallel up to MaxParallelism, and underperforming trials are stopped early based on EarlyStoppingPatience.
For Beginners: Enable this for faster search on multi-core systems.
EnableHyperBandBrackets
Gets or sets whether to use aggressive bracket halving (HyperBand-style).
public bool EnableHyperBandBrackets { get; set; }
Property Value
- bool
Remarks
When enabled, multiple brackets with different starting fidelities are explored in parallel. This trades off exploration (many trials at low fidelity) vs exploitation (fewer trials at high fidelity).
For Beginners: Enable for broader hyperparameter exploration.
GracePeriod
Gets or sets the grace period (minimum checkpoints) before early stopping can trigger.
public int GracePeriod { get; set; }
Property Value
- int
Remarks
Trials are allowed at least this many checkpoints before being considered for early stopping. This prevents killing trials too early before they have a chance to converge.
For Beginners: Higher values give all trials more time to "warm up".
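Taken together, EarlyStoppingPatience, EarlyStoppingMinDelta, and GracePeriod define a per-trial stopping rule. The sketch below illustrates that rule as this page describes it; the method and variable names are hypothetical, not the library's actual implementation.

```csharp
using System;
using System.Collections.Generic;
using AiDotNet.Configuration;

// Illustrative stopping rule; ShouldStopTrial is a hypothetical helper.
static bool ShouldStopTrial(IReadOnlyList<double> checkpointScores, AutoMLMultiFidelityOptions o)
{
    if (o.EarlyStoppingPatience <= 0) return false;            // per-trial early stopping disabled
    if (checkpointScores.Count < o.GracePeriod) return false;  // still within the grace period

    double best = double.NegativeInfinity;
    int sinceImprovement = 0;
    foreach (double score in checkpointScores)
    {
        if (score > best + o.EarlyStoppingMinDelta)
        {
            best = score;
            sinceImprovement = 0;  // a meaningful improvement resets the patience counter
        }
        else
        {
            sinceImprovement++;
        }
    }
    return sinceImprovement >= o.EarlyStoppingPatience;
}
```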
HyperBandBrackets
Gets or sets the number of HyperBand brackets to use when EnableHyperBandBrackets is true.
public int HyperBandBrackets { get; set; }
Property Value
- int
Remarks
Each bracket has a different starting fidelity and reduction schedule. More brackets increase exploration diversity but also increase compute cost.
MaxParallelism
Gets or sets the maximum number of trials to run in parallel at each fidelity rung.
public int MaxParallelism { get; set; }
Property Value
- int
Remarks
Only applies when EnableAsyncExecution is true.
A value of 0 or negative means use Environment.ProcessorCount.
For Beginners: Set to the number of CPU cores you want to use.
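The "0 or negative means Environment.ProcessorCount" convention can be resolved with a one-liner (illustrative; `effectiveParallelism` is a hypothetical local, not part of the API):

```csharp
using System;

int effectiveParallelism = options.MaxParallelism > 0
    ? options.MaxParallelism
    : Environment.ProcessorCount; // documented fallback for 0 or negative values
```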
ReductionFactor
Gets or sets the reduction factor used when promoting trials between fidelity levels.
public double ReductionFactor { get; set; }
Property Value
- double
Remarks
A value of 3 means "keep about 1/3 of the trials" when moving to the next fidelity level.
For Beginners: Higher values are more aggressive (fewer promotions).
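For example, with ReductionFactor = 3 and 27 starting trials, each promotion keeps roughly a third of the survivors. The arithmetic can be sketched as follows (illustrative only, not library code):

```csharp
using System;

int trials = 27;
const double reductionFactor = 3.0;

// Each promotion keeps about 1/reductionFactor of the surviving trials.
for (int rung = 0; trials > 1; rung++)
{
    Console.WriteLine($"rung {rung}: {trials} trials");
    trials = Math.Max(1, (int)(trials / reductionFactor));
}
Console.WriteLine($"final rung: {trials} trial");
// 27 trials -> 9 -> 3 -> 1 across successive rungs.
```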
TrainingFractions
Gets or sets the ordered list of training-data fractions to use as fidelity levels.
public double[] TrainingFractions { get; set; }
Property Value
- double[]
Remarks
Values must be in (0, 1]. The final level should be 1.0 to represent full-fidelity training. If the list does not include 1.0, multi-fidelity will append it automatically.
For Beginners: 0.25 means "train on 25% of the training data".
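Per the remarks above, every fraction must lie in (0, 1] and a list that omits 1.0 is extended automatically. That normalization can be sketched as below; `NormalizeFractions` is a hypothetical helper, not the library's API.

```csharp
using System;
using System.Linq;

// Hypothetical helper illustrating the documented validation and append-1.0 behavior.
static double[] NormalizeFractions(double[] fractions)
{
    foreach (double f in fractions)
        if (f <= 0.0 || f > 1.0)
            throw new ArgumentOutOfRangeException(nameof(fractions), "Fractions must be in (0, 1].");

    var ordered = fractions.OrderBy(f => f).ToList();
    if (ordered[^1] < 1.0)
        ordered.Add(1.0); // ensure the final level is full-fidelity training
    return ordered.ToArray();
}
// NormalizeFractions(new[] { 0.1, 0.25, 0.5 }) yields { 0.1, 0.25, 0.5, 1.0 }.
```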