Enum AutoMLSearchStrategy

Namespace
AiDotNet.Enums
Assembly
AiDotNet.dll

Defines the search strategy used to explore AutoML candidate configurations.

public enum AutoMLSearchStrategy

Fields

BayesianOptimization = 1

Bayesian optimization (typically Gaussian-process or TPE style).

DARTS = 5

DARTS (Differentiable Architecture Search) - gradient-based NAS. Jointly learns architecture and weights through continuous relaxation. Best for: Fast search, moderate compute budgets.

Evolutionary = 2

Evolutionary / genetic search.

GDAS = 6

GDAS (Gradient-based search using a Differentiable Architecture Sampler) - improved differentiable NAS. Uses Gumbel-softmax sampling for better architecture discretization. Best for: When DARTS produces weak architectures due to the discretization gap.

MultiFidelity = 3

Multi-fidelity search (e.g., HyperBand/ASHA-style scheduling).

NeuralArchitectureSearch = 4

Neural Architecture Search with automatic algorithm selection. Chooses the best NAS algorithm based on task characteristics and constraints.

OnceForAll = 7

Once-for-All (OFA) Networks - train once, specialize anywhere. Trains a supernet supporting elastic depth, width, and kernel sizes. Best for: Multi-hardware deployment, mobile/edge devices.

RandomSearch = 0

Random search baseline.

Remarks

AutoML can use different strategies to decide which candidate model configurations to try next. The best choice depends on budget, search-space shape (continuous vs categorical), and how expensive each trial is.

For Beginners: This is how AutoML decides what to try next:

  • RandomSearch tries random settings (simple and surprisingly strong).
  • BayesianOptimization learns which settings work best and focuses trials on them.
  • Evolutionary evolves good settings over time (useful for discrete/conditional knobs).
  • MultiFidelity uses short, cheap runs first and only gives more budget to promising trials.
  • The NAS strategies (NeuralArchitectureSearch, DARTS, GDAS, OnceForAll) search over network architectures themselves, not just hyperparameter settings.
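As a minimal sketch of how these trade-offs might be encoded, the heuristic below maps a trial budget and per-trial cost to a strategy value. Only the AutoMLSearchStrategy enum comes from AiDotNet; the ChooseStrategy helper and its thresholds are hypothetical and purely illustrative.

```csharp
using AiDotNet.Enums;

public static class SearchStrategyExample
{
    // Hypothetical helper: picks a strategy from the trial budget and
    // per-trial cost, following the rules of thumb in the remarks above.
    public static AutoMLSearchStrategy ChooseStrategy(int trialBudget, bool trialsAreExpensive)
    {
        if (trialsAreExpensive && trialBudget < 50)
            return AutoMLSearchStrategy.BayesianOptimization; // sample-efficient under tight budgets

        if (trialBudget >= 200)
            return AutoMLSearchStrategy.MultiFidelity; // cheap partial runs prune weak trials early

        return AutoMLSearchStrategy.RandomSearch; // strong, assumption-free baseline
    }
}
```

Because the value is an ordinary enum, it can also be switched on or serialized wherever a search configuration is persisted.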