Interface IHyperparameterOptimizer<T, TInput, TOutput>

Namespace
AiDotNet.Interfaces
Assembly
AiDotNet.dll

Defines the contract for hyperparameter optimization algorithms.

public interface IHyperparameterOptimizer<T, TInput, TOutput>

Type Parameters

T

The numeric data type used for calculations (e.g., float, double).

TInput

The type of input data consumed by the model being tuned (for example, a feature matrix).

TOutput

The type of output data produced by the model being tuned (for example, a vector of predictions).

Remarks

A hyperparameter optimizer automatically searches for the best hyperparameters for a machine learning model by trying different combinations and evaluating their performance.

For Beginners: Think of hyperparameters as the "settings" for your machine learning algorithm (like learning rate, number of layers, etc.). A hyperparameter optimizer is like an automatic tuner that tries different settings to find the combination that works best for your data. One such settings configuration is shown in the sketch after the lists below.

Common optimization strategies include:

  • Grid Search: Tries every possible combination in a predefined grid
  • Random Search: Randomly samples combinations
  • Bayesian Optimization: Uses past results to intelligently choose what to try next
  • Hyperband: Efficiently allocates resources to promising configurations

Why hyperparameter optimization matters:

  • Manual tuning is time-consuming and error-prone
  • Good hyperparameters can dramatically improve model performance
  • Systematic search ensures you don't miss good configurations
  • Enables reproducible model selection
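
To make "hyperparameters as settings" concrete, the sketch below shows one candidate configuration expressed as the Dictionary<string, object> that the Optimize objective function receives. The key names and values are purely illustrative and are not required by the interface.

using System.Collections.Generic;

// One candidate configuration in the Dictionary<string, object> shape consumed by Optimize.
// The key names and values below are illustrative, not required by the interface.
var candidate = new Dictionary<string, object>
{
    ["learning_rate"] = 0.01, // how aggressively the model updates its weights
    ["num_layers"] = 3,       // depth of the network
    ["batch_size"] = 64       // samples processed per update
};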

Methods

GetAllTrials()

Gets all trials performed during optimization.

List<HyperparameterTrial<T>> GetAllTrials()

Returns

List<HyperparameterTrial<T>>

List of all trials.

GetBestTrial()

Gets the best trial from the optimization.

HyperparameterTrial<T> GetBestTrial()

Returns

HyperparameterTrial<T>

The trial with the best objective value.

GetTrials(Func<HyperparameterTrial<T>, bool>)

Gets the trials that satisfy a given filter.

List<HyperparameterTrial<T>> GetTrials(Func<HyperparameterTrial<T>, bool> filter)

Parameters

filter Func<HyperparameterTrial<T>, bool>

Filter function to select trials.

Returns

List<HyperparameterTrial<T>>

Filtered list of trials.
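
As a brief illustration, the call below keeps only trials whose score exceeded a threshold. The ObjectiveValue property and the optimizer variable are assumptions made for this sketch; the interface does not define the members of HyperparameterTrial<T>.

// Hypothetical: assumes HyperparameterTrial<T> exposes an ObjectiveValue property
// and that optimizer is an IHyperparameterOptimizer<double, TInput, TOutput>.
List<HyperparameterTrial<double>> goodTrials =
    optimizer.GetTrials(trial => trial.ObjectiveValue > 0.9);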

Optimize(Func<Dictionary<string, object>, T>, HyperparameterSearchSpace, int)

Searches for the best hyperparameter configuration.

HyperparameterOptimizationResult<T> Optimize(Func<Dictionary<string, object>, T> objectiveFunction, HyperparameterSearchSpace searchSpace, int nTrials)

Parameters

objectiveFunction Func<Dictionary<string, object>, T>

The objective function to optimize; it receives a hyperparameter configuration and returns a score (typically a model performance metric).

searchSpace HyperparameterSearchSpace

The space of possible hyperparameter values to search.

nTrials int

Number of trials to run.

Returns

HyperparameterOptimizationResult<T>

The optimization result containing the best hyperparameter configuration found.

Remarks

For Beginners: This is the main method that performs the hyperparameter search. It tries different combinations of hyperparameters and returns the best one found.
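
A minimal usage sketch follows. The Optimize call and GetBestTrial come from this interface; everything else (the RandomSearchOptimizer implementation, the AddUniform/AddInt search-space builders, and the synthetic objective) is assumed for illustration and may not match the actual AiDotNet types.

using System;
using System.Collections.Generic;
using AiDotNet.Interfaces;

// Hypothetical concrete optimizer; any IHyperparameterOptimizer implementation will do.
IHyperparameterOptimizer<double, double[][], double[]> optimizer =
    new RandomSearchOptimizer<double, double[][], double[]>();

// Hypothetical search-space construction; the real HyperparameterSearchSpace API may differ.
var searchSpace = new HyperparameterSearchSpace();
searchSpace.AddUniform("learning_rate", 1e-4, 1e-1);
searchSpace.AddInt("num_trees", 50, 500);

// Objective function: receives one candidate configuration and returns a score to maximize.
// A synthetic score stands in here; in practice you would train a model with these settings
// and return a validation metric such as accuracy.
Func<Dictionary<string, object>, double> objective = config =>
{
    double learningRate = (double)config["learning_rate"];
    int numTrees = (int)config["num_trees"];
    return -Math.Pow(Math.Log10(learningRate) + 2.0, 2.0) + numTrees / 500.0;
};

// Run the search and inspect the best configuration found.
HyperparameterOptimizationResult<double> result = optimizer.Optimize(objective, searchSpace, nTrials: 50);
HyperparameterTrial<double> best = optimizer.GetBestTrial();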

OptimizeModel<TMetadata>(IModel<TInput, TOutput, TMetadata>, (TInput X, TOutput Y), (TInput X, TOutput Y), HyperparameterSearchSpace, int)

Searches for the best hyperparameters for a specific model.

HyperparameterOptimizationResult<T> OptimizeModel<TMetadata>(IModel<TInput, TOutput, TMetadata> model, (TInput X, TOutput Y) trainingData, (TInput X, TOutput Y) validationData, HyperparameterSearchSpace searchSpace, int nTrials) where TMetadata : class

Parameters

model IModel<TInput, TOutput, TMetadata>

The model to optimize hyperparameters for.

trainingData (TInput X, TOutput Y)

The training data to use for evaluation.

validationData (TInput X, TOutput Y)

The validation data to use for evaluation.

searchSpace HyperparameterSearchSpace

The space of possible hyperparameter values.

nTrials int

Number of trials to run.

Returns

HyperparameterOptimizationResult<T>

The optimization result containing the best hyperparameters found for the model.

Type Parameters

TMetadata

The type of metadata associated with the model; must be a reference type.
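
A short call sketch follows. The model instance, the data tuples, and the search space are placeholders the caller is assumed to have prepared already, and the metadata type shown is hypothetical.

// Hypothetical setup: model implements IModel<double[][], double[], MyMetadata>,
// and trainX/trainY/validX/validY plus searchSpace were prepared earlier.
var trainingData = (X: trainX, Y: trainY);
var validationData = (X: validX, Y: validY);

HyperparameterOptimizationResult<double> result =
    optimizer.OptimizeModel(model, trainingData, validationData, searchSpace, nTrials: 30);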

ReportTrial(HyperparameterTrial<T>, T)

Reports the result of a trial.

void ReportTrial(HyperparameterTrial<T> trial, T objectiveValue)

Parameters

trial HyperparameterTrial<T>

The trial to report results for.

objectiveValue T

The objective value achieved.

ShouldPrune(HyperparameterTrial<T>, int, T)

Determines if a trial should be pruned (stopped early) to save resources.

bool ShouldPrune(HyperparameterTrial<T> trial, int step, T intermediateValue)

Parameters

trial HyperparameterTrial<T>

The trial to check.

step int

The current training step.

intermediateValue T

The current performance value.

Returns

bool

True if the trial should be stopped early.

Remarks

For Beginners: Pruning means stopping a trial early when it is clearly not performing well, which saves time and computational resources. It's like abandoning a recipe halfway through once you can tell it isn't going to turn out well.
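
The sketch below shows the intended interaction during a single trial: report intermediate values while training and stop as soon as ShouldPrune returns true. The trial object, the training and evaluation helpers, and maxSteps are placeholders supplied by the caller; only ShouldPrune and ReportTrial come from this interface.

// Train one configuration step by step, asking the optimizer whether to give up early.
for (int step = 0; step < maxSteps; step++)
{
    TrainOneEpoch(model);                               // placeholder training step
    double intermediate = ComputeValidationLoss(model); // placeholder metric

    // Compare this trial's progress against earlier trials; true means "not worth continuing".
    if (optimizer.ShouldPrune(trial, step, intermediate))
    {
        break; // abandon this configuration to save compute
    }
}

// Report the final objective value so the optimizer can learn from this trial.
optimizer.ReportTrial(trial, ComputeValidationLoss(model));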

SuggestNext(HyperparameterTrial<T>)

Suggests the next hyperparameter configuration to try based on past trials.

Dictionary<string, object> SuggestNext(HyperparameterTrial<T> trial)

Parameters

trial HyperparameterTrial<T>

The trial to populate with suggestions.

Returns

Dictionary<string, object>

Dictionary of suggested hyperparameter values.

Remarks

For Beginners: Advanced optimizers (like Bayesian optimization) learn from previous trials to intelligently choose what to try next. This method provides that suggestion.
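
For callers who drive the search loop themselves (an ask-and-tell pattern), the sketch below pairs SuggestNext with ReportTrial. How a fresh HyperparameterTrial<T> is constructed is not defined by this interface, so the parameterless constructor and the EvaluateConfiguration helper are assumptions.

for (int i = 0; i < nTrials; i++)
{
    // Hypothetical: creating a new trial is implementation-specific.
    var trial = new HyperparameterTrial<double>();

    // Ask the optimizer which configuration to try next, informed by earlier results.
    Dictionary<string, object> config = optimizer.SuggestNext(trial);

    // Evaluate the configuration (placeholder) and tell the optimizer how it did.
    double score = EvaluateConfiguration(config);
    optimizer.ReportTrial(trial, score);
}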