Class GradientBoostingClassifier<T>

Namespace
AiDotNet.Classification.Ensemble
Assembly
AiDotNet.dll

Gradient Boosting classifier that builds trees sequentially, with each new tree trained to correct the errors of the trees before it.

public class GradientBoostingClassifier<T> : EnsembleClassifierBase<T>, ITreeBasedClassifier<T>, IProbabilisticClassifier<T>, IClassifier<T>, IFullModel<T, Matrix<T>, Vector<T>>, IModel<Matrix<T>, Vector<T>, ModelMetadata<T>>, IModelSerializer, ICheckpointableModel, IParameterizable<T, Matrix<T>, Vector<T>>, IFeatureAware, IFeatureImportance<T>, ICloneable<IFullModel<T, Matrix<T>, Vector<T>>>, IGradientComputable<T, Matrix<T>, Vector<T>>, IJitCompilable<T>

Type Parameters

T

The numeric data type used for calculations (e.g., float, double).

Inheritance
EnsembleClassifierBase<T>
GradientBoostingClassifier<T>
Implements
IFullModel<T, Matrix<T>, Vector<T>>
IModel<Matrix<T>, Vector<T>, ModelMetadata<T>>
IParameterizable<T, Matrix<T>, Vector<T>>
ICloneable<IFullModel<T, Matrix<T>, Vector<T>>>
IGradientComputable<T, Matrix<T>, Vector<T>>

Remarks

Gradient Boosting builds an additive model in a forward stage-wise fashion. At each stage, a regression tree is fit to the negative gradient of the loss function. For classification, the loss is the log loss (deviance) or the exponential loss.
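In the standard formulation (stated here for context rather than quoted from the library's documentation), the model after stage m is

F_m(x) = F_{m-1}(x) + \eta \, h_m(x)

where h_m is the regression tree fit to the negative gradient -\partial L(y, F(x)) / \partial F(x) evaluated at F = F_{m-1}, and \eta is the learning rate.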

For Beginners: Gradient Boosting is one of the most powerful machine learning algorithms.

How it works:

  1. Start with an initial prediction
  2. Calculate how wrong we are
  3. Train a tree to predict our mistakes
  4. Add a fraction of this tree's predictions
  5. Repeat, each time correcting remaining errors

Key insight: Each tree fixes what previous trees got wrong!
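
The loop above can be sketched in plain code. This is a conceptual illustration only, using squared-error residuals as the "mistakes" for simplicity; it is not the implementation inside GradientBoostingClassifier<T>, and the fitRegressionTree delegate stands in for whatever tree learner is used.

using System;
using System.Linq;

// Conceptual sketch of the boosting loop; names and helpers are illustrative.
static double[] BoostedPredictions(
    double[][] features, double[] target,
    int numberOfTrees, double learningRate,
    Func<double[][], double[], Func<double[], double>> fitRegressionTree)
{
    int n = target.Length;
    double[] prediction = new double[n];
    double initial = target.Average();                      // 1. start with an initial prediction
    for (int i = 0; i < n; i++) prediction[i] = initial;

    for (int m = 0; m < numberOfTrees; m++)
    {
        double[] residuals = new double[n];                 // 2. calculate how wrong we are
        for (int i = 0; i < n; i++) residuals[i] = target[i] - prediction[i];

        var tree = fitRegressionTree(features, residuals);  // 3. train a tree to predict the mistakes

        for (int i = 0; i < n; i++)                         // 4. add a fraction of the tree's output
            prediction[i] += learningRate * tree(features[i]);
    }                                                       // 5. repeat until all trees are built
    return prediction;
}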

Tips for best results:

  • Use a lower learning_rate together with more n_estimators
  • Keep max_depth small (3-5), unlike in Random Forest
  • Consider setting subsample below 1.0 for regularization

Constructors

GradientBoostingClassifier(GradientBoostingClassifierOptions<T>?, IRegularization<T, Matrix<T>, Vector<T>>?)

Initializes a new instance of the GradientBoostingClassifier class.

public GradientBoostingClassifier(GradientBoostingClassifierOptions<T>? options = null, IRegularization<T, Matrix<T>, Vector<T>>? regularization = null)

Parameters

options GradientBoostingClassifierOptions<T>

Configuration options for Gradient Boosting.

regularization IRegularization<T, Matrix<T>, Vector<T>>

Optional regularization strategy.
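
For example, the classifier can be created with all defaults or with an explicit options object. The parameterless construction of GradientBoostingClassifierOptions<T> below is an assumption; check that type's documentation for its actual constructors and settings.

using AiDotNet.Classification.Ensemble;
// (the namespace containing GradientBoostingClassifierOptions<T> is assumed to be imported)

// Default options and no regularization.
var model = new GradientBoostingClassifier<double>();

// Or pass a pre-configured options object (assumed parameterless constructor)
// and, optionally, a regularization strategy as the second argument.
var options = new GradientBoostingClassifierOptions<double>();
var tuned = new GradientBoostingClassifier<double>(options);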

Properties

LeafCount

Gets the number of leaf nodes in the tree.

public int LeafCount { get; }

Property Value

int

The count of terminal nodes (leaves) in the trained tree. Returns 0 if the model has not been trained.

MaxDepth

Gets the maximum depth of the tree.

public int MaxDepth { get; }

Property Value

int

The maximum depth reached during training, or the configured maximum depth.

NodeCount

Gets the number of internal (decision) nodes in the tree.

public int NodeCount { get; }

Property Value

int

The count of non-terminal nodes that make decisions. Returns 0 if the model has not been trained.
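
A small usage sketch covering the three tree-statistics properties above (LeafCount, NodeCount, and MaxDepth), which are plain int properties of the model:

using System;

// Print tree structure statistics; LeafCount and NodeCount return 0 before training.
static void PrintTreeStats(GradientBoostingClassifier<double> model)
{
    Console.WriteLine($"Leaves: {model.LeafCount}, decision nodes: {model.NodeCount}, max depth: {model.MaxDepth}");
}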

Options

Gets the Gradient Boosting specific options.

protected GradientBoostingClassifierOptions<T> Options { get; }

Property Value

GradientBoostingClassifierOptions<T>

Methods

Clone()

Creates a clone of the classifier model.

public override IFullModel<T, Matrix<T>, Vector<T>> Clone()

Returns

IFullModel<T, Matrix<T>, Vector<T>>

A new instance of the model with the same parameters and options.
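
A brief usage sketch; the return type matches the signature above:

// Duplicate a classifier; the copy has the same parameters and options as the original.
static IFullModel<double, Matrix<double>, Vector<double>> Duplicate(GradientBoostingClassifier<double> model)
{
    return model.Clone();
}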

CreateNewInstance()

Creates a new instance of the same type as this classifier.

protected override IFullModel<T, Matrix<T>, Vector<T>> CreateNewInstance()

Returns

IFullModel<T, Matrix<T>, Vector<T>>

A new instance of the same classifier type.

GetModelMetadata()

Gets metadata about the model.

public override ModelMetadata<T> GetModelMetadata()

Returns

ModelMetadata<T>

A ModelMetadata object containing information about the model.

Remarks

This method returns metadata about the model, including its type, feature count, complexity, description, and additional information specific to classification.

For Beginners: Model metadata provides information about the model itself, rather than the predictions it makes. This includes details about the model's structure (like how many features it uses) and characteristics (like how many classes it can predict). This information can be useful for understanding and comparing different models.

GetModelType()

Returns the model type identifier for this classifier.

protected override ModelType GetModelType()

Returns

ModelType

Predict(Matrix<T>)

Predicts class labels for the given input data by taking the argmax of probabilities.

public override Vector<T> Predict(Matrix<T> input)

Parameters

input Matrix<T>

The input features matrix where each row is an example and each column is a feature.

Returns

Vector<T>

A vector of predicted class indices for each input example.

Remarks

This implementation uses the argmax of the probability distribution to determine the predicted class. For binary classification with a custom decision threshold, you may want to use PredictProbabilities() directly and apply your own threshold.

For Beginners: This method picks the class with the highest probability for each sample.

For example, if the probabilities are [0.1, 0.7, 0.2] for classes [A, B, C], this method returns class B (index 1) because it has the highest probability (0.7).
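
A usage sketch combining this method with the PredictProbabilities(Matrix<T>) method documented below; how Matrix<T> and Vector<T> are constructed is library-specific and not shown here.

// Get both the hard class assignments (argmax) and the class probabilities for the same rows.
static (Vector<double> Labels, Matrix<double> Probabilities) Score(
    GradientBoostingClassifier<double> model, Matrix<double> inputs)
{
    Vector<double> labels = model.Predict(inputs);                      // one class index per input row
    Matrix<double> probabilities = model.PredictProbabilities(inputs);  // class probabilities for the same rows
    // For a custom binary threshold, inspect the positive-class column of `probabilities`
    // instead of using Predict (element access depends on Matrix<T>'s API).
    return (labels, probabilities);
}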

PredictProbabilities(Matrix<T>)

Aggregates predictions from all estimators in the ensemble.

public override Matrix<T> PredictProbabilities(Matrix<T> input)

Parameters

input Matrix<T>

The input features matrix.

Returns

Matrix<T>

A matrix of aggregated class probabilities.

Remarks

The default implementation averages the probability predictions from all estimators; derived classes may override this to use a different aggregation strategy.

Train(Matrix<T>, Vector<T>)

Trains the Gradient Boosting classifier on the provided data.

public override void Train(Matrix<T> x, Vector<T> y)

Parameters

x Matrix<T>

The training features matrix where each row is an example and each column is a feature.

y Vector<T>

The target class labels for each training example.
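
An end-to-end sketch, assuming the feature and label containers have already been built (Matrix<T>/Vector<T> construction and their using directives are omitted here because they are library-specific):

// Train on labeled data, then predict class indices for new rows.
static Vector<double> TrainAndPredict(
    Matrix<double> trainX, Vector<double> trainY, Matrix<double> testX)
{
    var model = new GradientBoostingClassifier<double>();  // default options, no regularization
    model.Train(trainX, trainY);                           // fit the boosted ensemble of trees
    return model.Predict(testX);                           // predicted class index for each test row
}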