Class AdaBoostClassifierOptions<T>

Namespace
AiDotNet.Models.Options
Assembly
AiDotNet.dll

Configuration options for the AdaBoost classifier.

public class AdaBoostClassifierOptions<T> : ClassifierOptions<T>

Type Parameters

T

The data type used for calculations.

Inheritance
object
ClassifierOptions<T>
AdaBoostClassifierOptions<T>

Remarks

AdaBoost (Adaptive Boosting) is a meta-algorithm that combines multiple weak classifiers into a strong classifier. Each subsequent classifier focuses more on the samples that were misclassified by previous classifiers.

For Beginners: AdaBoost is like a team of experts that learns from mistakes!

Imagine you have a series of simple decision makers:

  1. The first one makes some mistakes
  2. The second one focuses on fixing those mistakes
  3. The third one focuses on fixing the remaining mistakes
  4. And so on...

Each decision maker gets a "vote weight" based on how accurate it is. The final prediction combines all their votes.

AdaBoost is great because:

  • It automatically focuses on hard-to-classify samples
  • It combines many simple rules into a complex decision boundary
  • It's resistant to overfitting (in most cases)
  • It provides a natural confidence measure
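
As an illustration, these options can be configured with a C# object initializer. This is a sketch using only the class and properties documented on this page; any property left unset keeps its documented default (Algorithm stays SAMME.R, RandomState stays null):

```csharp
using AiDotNet.Models.Options;

// Configure an AdaBoost ensemble with the documented defaults made explicit.
var options = new AdaBoostClassifierOptions<double>
{
    NEstimators = 50,   // number of boosting stages (default is 50)
    LearningRate = 1.0  // each weak learner contributes fully (default is 1.0)
};
```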

Properties

Algorithm

Gets or sets the algorithm variant to use.

public AdaBoostAlgorithm Algorithm { get; set; }

Property Value

AdaBoostAlgorithm

The algorithm: SAMME or SAMME.R. Default is SAMME.R.

Remarks

SAMME.R (Real) uses probability estimates and typically performs better. SAMME uses class predictions only and is the original algorithm.

LearningRate

Gets or sets the learning rate (shrinkage).

public double LearningRate { get; set; }

Property Value

double

The learning rate. Default is 1.0.

Remarks

The learning rate shrinks the contribution of each classifier. Lower values require more estimators but can give better results.

For Beginners: The learning rate controls how much each weak learner contributes.

  • LearningRate = 1.0: Each learner contributes fully (default, faster)
  • LearningRate = 0.1: Each learner contributes only 10% (needs more estimators, often better)

A common strategy is to use a smaller learning rate with more estimators.
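
The shrinkage strategy above can be expressed directly with these options. The specific values are illustrative, not recommendations from this page:

```csharp
using AiDotNet.Models.Options;

// Trade a smaller per-learner contribution for more boosting stages.
var options = new AdaBoostClassifierOptions<double>
{
    LearningRate = 0.1, // shrink each learner's vote to 10%
    NEstimators = 500   // compensate with roughly 10x more estimators
};
```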

NEstimators

Gets or sets the maximum number of estimators (weak learners).

public int NEstimators { get; set; }

Property Value

int

The number of boosting stages. Default is 50.

Remarks

More estimators can improve accuracy but increase training time. AdaBoost is less prone to overfitting than many other ensemble methods, so adding estimators usually helps until accuracy plateaus.

RandomState

Gets or sets the random state for reproducibility.

public int? RandomState { get; set; }

Property Value

int?

The random seed, or null for non-deterministic behavior. Default is null.
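
For reproducible experiments, the seed can be pinned; the value 42 below is an arbitrary choice:

```csharp
using AiDotNet.Models.Options;

// Fixing the seed makes repeated training runs produce identical ensembles.
var options = new AdaBoostClassifierOptions<double>
{
    RandomState = 42
};
// Leaving RandomState as null (the default) gives non-deterministic training.
```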