Overview
The LinearRegression class implements ordinary least squares (OLS) and, optionally, L2-regularized (ridge) linear regression. It selects a solver automatically based on the problem's geometry and condition number.
Namespace: mlpp::regression
Template parameter:
Scalar - Floating-point type (float, double, or long double). Defaults to double.
Mathematical formulation
Solves the optimization problem

    min_β  ‖Xβ − y‖₂² + λ‖β‖₂²

and selects a solver from the problem geometry:
- n >> d: Normal equations via Cholesky decomposition (fast, O(nd² + d³))
- d >> n: Thin SVD with closed-form solution (numerically stable)
- Ill-conditioned: Falls back to JacobiSVD with full pivoting
Constructor
fit_intercept - Whether to fit a bias term
L2 penalty λ ≥ 0. Default 0 gives pure OLS regression
Linear solver strategy:
SolveMethod::Auto - Chosen automatically (recommended)
SolveMethod::Cholesky - Normal equations via LDLT
SolveMethod::SVD - Thin BDCSVD
SolveMethod::JacobiSVD - Full-pivoting JacobiSVD (slowest, most stable)
Methods
fit
Feature matrix with shape (n_samples, n_features), row-major order
Target vector with length n_samples
predict
Feature matrix with shape (n_samples, n_features)
Predicted values with length n_samples
score
Feature matrix
True target values
R² score in the range (-∞, 1]. Higher values indicate better fit
residuals
Feature matrix
True target values
Residual vector with length n_samples. Throws if model is not fitted
gradient
Feature matrix
Target values
Gradient vector evaluated at current coefficients
coefficients
Coefficient vector in original (unscaled) feature space with length n_features
intercept
Intercept (bias) term. Returns 0 if fit_intercept is false
is_fitted
True after a successful call to fit()
condition_number
Effective condition number of the design matrix. Only available after an SVD solve