Table of Contents

Namespace AiDotNet.Regression

Classes

AdaBoostR2Regression<T>

Implements the AdaBoost.R2 algorithm for regression problems, an ensemble learning method that combines multiple decision tree regressors to improve prediction accuracy.

AsyncDecisionTreeRegressionBase<T>

Represents an abstract base class for asynchronous decision tree regression models.

BayesianRegression<T>

Implements Bayesian Linear Regression with support for various kernels and uncertainty estimation.

ConditionalInferenceTreeRegression<T>

Represents a conditional inference tree regression model that builds decision trees based on statistical tests.

DecisionTreeRegressionBase<T>

Provides a base implementation for decision tree regression models that predict continuous values.

DecisionTreeRegression<T>

Represents a decision tree regression model that predicts continuous values based on input features.

ElasticNetRegression<T>

Implements Elastic Net Regression (combined L1 and L2 regularized linear regression), which extends ordinary least squares by adding both L1 (Lasso) and L2 (Ridge) penalty terms.

ExtremelyRandomizedTreesRegression<T>

Implements an Extremely Randomized Trees regression model, which is an ensemble method that uses multiple decision trees with additional randomization for improved prediction accuracy and reduced overfitting.

GaussianProcessRegression<T>

Implements a Gaussian Process Regression model, which is a non-parametric, probabilistic approach to regression that provides uncertainty estimates along with predictions.

GeneralizedAdditiveModel<T>

Implements a Generalized Additive Model (GAM) for regression, which models the target as a sum of smooth functions of individual features, allowing for flexible nonlinear relationships while maintaining interpretability.

GeneticAlgorithmRegression<T>

Implements a regression model that uses genetic algorithms to optimize model parameters, mimicking the process of natural selection to find the best solution.

GradientBoostingRegression<T>

Implements a Gradient Boosting Regression model, which combines multiple decision trees sequentially to create a powerful ensemble that learns from the errors of previous trees.
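The sequential error-correcting idea can be sketched in a few lines of plain Python (this illustrates the general algorithm, not the AiDotNet API; all names here are invented for the sketch). One-dimensional decision stumps serve as the weak learners, and each boosting round fits a stump to the residuals of the ensemble so far:

```python
def fit_stump(xs, residuals):
    # Best single-split regressor: threshold t with left/right means.
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, rounds=50, lr=0.5):
    # Each round fits a stump to the residuals of the current ensemble,
    # shrinking its contribution by the learning rate.
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        resid = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, resid)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)
```

With a step-shaped target, the residuals shrink geometrically, so the ensemble converges to the true step even though each individual stump is weak.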

IsotonicRegression<T>

Implements an Isotonic Regression model, which fits a free-form line to data with the constraint that the fitted values must be non-decreasing (monotone).
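The standard way to compute an isotonic fit is the Pool Adjacent Violators Algorithm (PAVA); a minimal plain-Python sketch of that algorithm (illustrative only, not the AiDotNet API):

```python
def isotonic_fit(y):
    # Pool Adjacent Violators Algorithm (PAVA):
    # keep blocks as [sum, count]; merge adjacent blocks while their
    # means violate the non-decreasing constraint.
    stack = []
    for v in y:
        stack.append([float(v), 1])
        # mean(prev) > mean(last) compared via cross-multiplication
        while len(stack) > 1 and stack[-2][0] * stack[-1][1] > stack[-1][0] * stack[-2][1]:
            s, c = stack.pop()
            stack[-1][0] += s
            stack[-1][1] += c
    fitted = []
    for s, c in stack:
        fitted.extend([s / c] * c)
    return fitted
```

For example, the violating pair (3, 2) in [1, 3, 2, 4] is pooled into its mean, giving the non-decreasing sequence [1, 2.5, 2.5, 4].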

KNearestNeighborsRegression<T>

Implements K-Nearest Neighbors algorithm for regression, which predicts target values by averaging the values of the K closest training examples.
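The prediction rule is simple enough to sketch directly for one-dimensional inputs (plain Python for illustration, not the AiDotNet API):

```python
def knn_predict(x_train, y_train, query, k=3):
    # Sort training indices by distance to the query point,
    # then average the targets of the k nearest examples.
    nearest = sorted(range(len(x_train)), key=lambda i: abs(x_train[i] - query))[:k]
    return sum(y_train[i] for i in nearest) / k
```

Because the prediction is a local average, distant points (here 10.0) have no influence once they fall outside the k nearest neighbors.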

KernelRidgeRegression<T>

Implements Kernel Ridge Regression, a powerful nonlinear regression technique that combines ridge regression with the kernel trick to capture complex nonlinear relationships.

LassoRegression<T>

Implements Lasso Regression (L1 regularized linear regression), which extends ordinary least squares by adding a penalty term proportional to the absolute magnitude of the coefficients.

LocallyWeightedRegression<T>

Implements Locally Weighted Regression, a non-parametric approach that creates a different model for each prediction point based on the weighted influence of nearby training examples.

LogisticRegression<T>

Represents a logistic regression model for binary classification problems.

M5ModelTree<T>

Represents an M5 model tree for regression problems, combining decision tree structure with linear models at the leaves.

MultilayerPerceptronRegression<T>

Represents a multilayer perceptron (neural network) for regression problems.

MultinomialLogisticRegression<T>

Represents a multinomial logistic regression model for multi-class classification problems.

MultipleRegression<T>

Represents a multiple linear regression model that predicts a target value based on multiple input features.

MultivariateRegression<T>

Represents a multivariate linear regression model that predicts multiple target values from multiple input features.

NegativeBinomialRegression<T>

Represents a negative binomial regression model for count data that may exhibit overdispersion.

NeuralNetworkRegression<T>

A neural network regression model that can learn complex non-linear relationships in data.

NonLinearRegressionBase<T>

Base class for non-linear regression algorithms that provides common functionality for training and prediction.

OrthogonalRegression<T>

Implements orthogonal regression (also known as total least squares), which minimizes the perpendicular distance from data points to the fitted line or hyperplane.

PartialLeastSquaresRegression<T>

Implements Partial Least Squares Regression (PLS), a technique that combines features from principal component analysis and multiple linear regression to handle situations with many correlated predictors.

PoissonRegression<T>

Implements Poisson regression, a generalized linear model used for modeling count data and contingency tables.

PolynomialRegression<T>

Implements polynomial regression, which extends linear regression by fitting a polynomial equation to the data.
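The technique amounts to expanding the input into polynomial terms and solving an ordinary least-squares problem. A minimal sketch, assuming NumPy is available (this uses `numpy.polyfit` for illustration, not the AiDotNet API):

```python
import numpy as np

# Data generated from a known quadratic: y = 3x^2 - x + 1
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = 3 * x**2 - x + 1

# Least-squares fit of a degree-2 polynomial;
# coefficients are returned highest degree first.
coeffs = np.polyfit(x, y, deg=2)
```

Since the data are exactly quadratic, the recovered coefficients match [3, -1, 1] up to numerical precision.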

PrincipalComponentRegression<T>

Implements Principal Component Regression (PCR), a technique that combines principal component analysis (PCA) with linear regression to handle multicollinearity in the predictor variables.

QuantileRegressionForests<T>

Implements Quantile Regression Forests, an extension of Random Forests that can predict conditional quantiles of the target variable, not just the conditional mean.

QuantileRegression<T>

Implements Quantile Regression, a technique that estimates the conditional quantiles of a response variable distribution in the linear model, providing a more complete view of the relationship between variables.
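Quantile regression replaces the squared-error loss with the asymmetric pinball loss, whose minimizer is the desired conditional quantile. A sketch of the loss and its simplest (intercept-only) use, in plain Python for illustration (not the AiDotNet API):

```python
def pinball_loss(ys, pred, tau):
    # Pinball (quantile) loss: under-predictions are weighted by tau,
    # over-predictions by (1 - tau).
    return sum(tau * (y - pred) if y >= pred else (1 - tau) * (pred - y)
               for y in ys) / len(ys)

def fit_constant_quantile(ys, tau):
    # Intercept-only quantile regression: the minimizer over the data
    # points is the empirical tau-quantile.
    return min(ys, key=lambda c: pinball_loss(ys, c, tau))
```

With tau = 0.5 this recovers the median, which is why quantile regression is robust to outliers such as the 100 below; higher tau values track the upper tail instead.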

RadialBasisFunctionRegression<T>

Implements Radial Basis Function (RBF) Regression, a technique that uses radial basis functions as the basis for approximating complex nonlinear relationships between inputs and outputs.

RandomForestRegression<T>

Implements Random Forest Regression, an ensemble learning method that operates by constructing multiple decision trees during training and outputting the average prediction of the individual trees.

RegressionBase<T>

Provides a base implementation for regression algorithms that model the relationship between a dependent variable and one or more independent variables.

RidgeRegression<T>

Implements Ridge Regression (L2 regularized linear regression), which extends ordinary least squares by adding a penalty term proportional to the squared magnitude of the coefficients.
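For a single feature with no intercept, the ridge solution has a one-line closed form that makes the shrinkage effect of the penalty visible (plain Python sketch, not the AiDotNet API):

```python
def ridge_slope(xs, ys, alpha):
    # Closed form for y ≈ w*x with an L2 penalty:
    # w = Σxy / (Σx² + α); larger α shrinks w toward zero.
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + alpha)
```

With alpha = 0 this reduces to ordinary least squares; increasing alpha shrinks the coefficient, trading a little bias for lower variance.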

RobustRegression<T>

Represents a robust regression model that is resistant to outliers in the data.

SimpleRegression<T>

Implements simple linear regression, which predicts a single output value from a single input feature. This is the most basic form of regression, finding the best-fitting straight line through a set of points.
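The best-fitting line has a well-known closed form in terms of the feature and target means; a plain-Python sketch (illustrative only, not the AiDotNet API):

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a + b*x:
    # b = Σ(x - x̄)(y - ȳ) / Σ(x - x̄)², a = ȳ - b*x̄
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b
```

Fitting the exactly linear data below recovers intercept 1 and slope 2.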

SplineRegression<T>

Implements spline regression, which models nonlinear relationships by fitting piecewise polynomial functions joined at knots, allowing the model to change its behavior across different regions of the data.

StepwiseRegression<T>

Implements stepwise regression, which automatically selects the most relevant features for the model. This approach builds a model by adding or removing features based on their statistical significance.

SupportVectorRegression<T>

Implements Support Vector Regression (SVR), which creates a regression model by finding a hyperplane that lies within a specified margin (epsilon) of the training data. This approach is effective for both linear and nonlinear regression problems.

SymbolicRegression<T>

Implements symbolic regression, which discovers mathematical expressions that best describe the relationship between input features and target values. Unlike traditional regression methods, symbolic regression can discover both the form of the equation and its parameters.

TimeSeriesRegression<T>

Represents a time series regression model that incorporates temporal dependencies, trends, and seasonality.

WeightedRegression<T>

Implements weighted regression, a variation of linear regression where each data point has a different level of importance (weight) in determining the model parameters.
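Weighted least squares generalizes the ordinary closed form by weighting every sum; a plain-Python sketch for a single feature (illustrative only, not the AiDotNet API):

```python
def weighted_fit_line(xs, ys, ws):
    # Weighted least squares for y = a + b*x: each point contributes
    # to the means and to the slope in proportion to its weight w.
    W = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / W
    my = sum(w * y for w, y in zip(ws, ys)) / W
    b = (sum(w * (x - mx) * (y - my) for w, x, y in zip(ws, xs, ys))
         / sum(w * (x - mx) ** 2 for w, x in zip(ws, xs)))
    return my - b * mx, b
```

Giving a suspect point zero weight removes its influence entirely: below, the outlier (3, 100) is ignored and the line through the remaining points is recovered.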