Class ElasticNetRegressionOptions<T>
Configuration options for Elastic Net Regression (combined L1 and L2 regularization).
public class ElasticNetRegressionOptions<T> : RegressionOptions<T>
Type Parameters
- T: The data type used for calculations.
Inheritance
- RegressionOptions<T> → ElasticNetRegressionOptions<T>
Remarks
Elastic Net combines the penalties of Ridge (L2) and Lasso (L1) regression, providing a balance between feature selection (from L1) and handling correlated features (from L2).
The objective function minimized is: (1/(2n)) * ||y - Xw||_2^2 + alpha * l1_ratio * ||w||_1 + alpha * (1 - l1_ratio) / 2 * ||w||_2^2
For Beginners: Elastic Net gives you the best of both Ridge and Lasso.
Lasso (L1) is great for feature selection but has a limitation: when features are highly correlated, it tends to arbitrarily pick one and zero out the others.
Ridge (L2) handles correlated features well but doesn't do feature selection - all features keep non-zero coefficients.
Elastic Net combines both:
- It can still set coefficients to zero (like Lasso) for feature selection
- It groups correlated features together (like Ridge) instead of picking one arbitrarily
When to use Elastic Net:
- When you have correlated features and want feature selection
- When Lasso's behavior on correlated features is problematic
- When you're not sure whether Ridge or Lasso is better
The l1_ratio parameter controls the mix:
- l1_ratio = 1.0: Pure Lasso (L1 only)
- l1_ratio = 0.0: Pure Ridge (L2 only)
- l1_ratio = 0.5: Equal mix of L1 and L2 (default)
Note: If your features are on different scales, consider normalizing your data before training using INormalizer implementations like ZScoreNormalizer or MinMaxNormalizer.
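The objective function above can be translated into code almost term by term. The following Python sketch is purely illustrative (the function and variable names are assumptions, not this library's API):

```python
def elastic_net_objective(y, X, w, alpha, l1_ratio):
    """Elastic Net loss:
    (1/(2n)) * ||y - Xw||^2
      + alpha * l1_ratio * ||w||_1
      + alpha * (1 - l1_ratio) / 2 * ||w||^2
    """
    n = len(y)
    residuals = [yi - sum(xij * wj for xij, wj in zip(xi, w))
                 for yi, xi in zip(y, X)]
    mse_term = sum(r * r for r in residuals) / (2 * n)
    l1_term = alpha * l1_ratio * sum(abs(wj) for wj in w)
    l2_term = alpha * (1 - l1_ratio) / 2 * sum(wj * wj for wj in w)
    return mse_term + l1_term + l2_term
```

With a perfect fit the residual term vanishes and only the two penalty terms remain, which is why regularization shrinks coefficients even on noise-free data.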
Properties
Alpha
Gets or sets the overall regularization strength. Must be a non-negative value.
public double Alpha { get; set; }
Property Value
- double
The regularization parameter, defaulting to 1.0.
Remarks
This parameter controls the overall strength of regularization. It multiplies both the L1 and L2 penalties. Larger values result in stronger regularization.
For Beginners: Alpha controls the overall regularization strength.
Think of it as a volume knob for regularization:
- Alpha = 0.0: No regularization (ordinary least squares)
- Alpha = 1.0: Moderate regularization (default)
- Alpha = 10.0: Strong regularization
Higher alpha means more shrinkage and potentially more features set to zero. Use cross-validation to find the optimal value for your data.
Exceptions
- ArgumentOutOfRangeException
Thrown when value is negative.
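The mechanism by which a larger Alpha zeroes out coefficients is soft-thresholding: the L1 part of the penalty shrinks each coefficient toward zero by a fixed amount and snaps small ones to exactly zero. A minimal illustration (names are assumptions, not this library's code):

```python
def soft_threshold(z, t):
    """Shrink z toward zero by t; values with |z| <= t become exactly 0.
    This is how the L1 penalty produces sparse coefficients."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

# Larger alpha -> larger threshold -> more coefficients forced to zero.
raw = [0.9, -0.4, 0.05]
weak = [soft_threshold(z, 0.1) for z in raw]    # small alpha: one zero
strong = [soft_threshold(z, 0.5) for z in raw]  # large alpha: two zeros
```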
L1Ratio
Gets or sets the ratio of L1 penalty in the combined penalty. Must be between 0 and 1.
public double L1Ratio { get; set; }
Property Value
- double
The L1 ratio, defaulting to 0.5.
Remarks
This parameter controls the mix between L1 (Lasso) and L2 (Ridge) penalties:
- 1.0 = Pure Lasso (L1 only)
- 0.0 = Pure Ridge (L2 only)
- 0.5 = Equal mix (default)
For Beginners: L1Ratio controls the balance between feature selection and stability.
The effects of different values:
- L1Ratio = 1.0: Pure Lasso - maximum sparsity, may have issues with correlated features
- L1Ratio = 0.5: Balanced - good default for most problems
- L1Ratio = 0.1: Mostly Ridge - keeps more features, better for correlated features
- L1Ratio = 0.0: Pure Ridge - no feature selection, all features kept
Tips:
- Start with 0.5 and adjust based on results
- If you need feature selection but have correlated features, try 0.3-0.7
- Use cross-validation to find the optimal value
Exceptions
- ArgumentOutOfRangeException
Thrown when value is outside [0, 1] range.
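How L1Ratio splits the overall penalty between the two terms can be seen directly from the objective. An illustrative sketch (names assumed, not this library's API):

```python
def penalty(w, alpha, l1_ratio):
    """Split the Elastic Net penalty into its L1 and L2 components."""
    l1 = alpha * l1_ratio * sum(abs(x) for x in w)
    l2 = alpha * (1 - l1_ratio) / 2 * sum(x * x for x in w)
    return l1, l2

w = [1.0, -2.0]
lasso_only = penalty(w, 1.0, 1.0)  # L1Ratio = 1.0: the L2 part vanishes
ridge_only = penalty(w, 1.0, 0.0)  # L1Ratio = 0.0: the L1 part vanishes
balanced = penalty(w, 1.0, 0.5)    # L1Ratio = 0.5: both terms active
```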
MaxIterations
Gets or sets the maximum number of iterations for the coordinate descent algorithm.
public int MaxIterations { get; set; }
Property Value
- int
The maximum number of iterations, defaulting to 1000.
Remarks
Elastic Net is fit with coordinate descent. This parameter caps the number of coordinate descent passes so that training always terminates, even when the coefficients have not fully converged.
For Beginners: This sets how many times the algorithm can try to improve. The default of 1000 is usually sufficient. Increase if you see convergence warnings.
Exceptions
- ArgumentOutOfRangeException
Thrown when value is not positive.
Tolerance
Gets or sets the convergence tolerance for the optimization algorithm.
public double Tolerance { get; set; }
Property Value
- double
The convergence tolerance, defaulting to 1e-4.
Remarks
The algorithm stops when the maximum change in coefficients falls below this threshold.
For Beginners: This determines how precise the solution needs to be. The default is good for most applications.
Exceptions
- ArgumentOutOfRangeException
Thrown when value is not positive.
WarmStart
Gets or sets whether to use warm starting for cross-validation.
public bool WarmStart { get; set; }
Property Value
- bool
True to use warm starting; false otherwise. Default is true.
Remarks
When enabled, the previous solution is used as the starting point for retraining, which can significantly speed up cross-validation.
For Beginners: Keep this enabled (default) for faster training when trying different parameter values.
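To see how MaxIterations, Tolerance, and WarmStart fit together, here is a minimal Python sketch of an Elastic Net coordinate descent loop. It is illustrative only, not this library's implementation: the function names, the absence of an intercept, and the `w0` warm-start parameter are all assumptions.

```python
def soft_threshold(z, t):
    # L1 shrinkage step: pull z toward zero by t, snapping to 0 if |z| <= t.
    return max(z - t, 0.0) if z > 0 else min(z + t, 0.0)

def elastic_net_cd(X, y, alpha, l1_ratio,
                   max_iterations=1000, tolerance=1e-4, w0=None):
    """Coordinate descent for Elastic Net (no intercept).
    w0 seeds the loop with a previous solution (warm start)."""
    n, p = len(X), len(X[0])
    w = list(w0) if w0 is not None else [0.0] * p
    for _ in range(max_iterations):
        max_change = 0.0
        for j in range(p):
            # Correlation of feature j with the partial residual
            # (the residual ignoring feature j's own contribution).
            rho = sum(
                X[i][j] * (y[i] - sum(X[i][k] * w[k]
                                      for k in range(p) if k != j))
                for i in range(n)) / n
            denom = (sum(X[i][j] ** 2 for i in range(n)) / n
                     + alpha * (1 - l1_ratio))
            new_wj = soft_threshold(rho, alpha * l1_ratio) / denom
            max_change = max(max_change, abs(new_wj - w[j]))
            w[j] = new_wj
        if max_change < tolerance:  # converged: stop before max_iterations
            break
    return w
```

Passing the coefficients from one fit as `w0` for the next (e.g. while scanning a grid of Alpha values) is the warm-start idea: the loop starts near the solution and typically converges in far fewer passes.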