LinearRegression
Ordinary Least Squares Linear Regression.
Fits a linear model with coefficients w = (w1, …, wp) to minimize the residual sum of squares between the observed targets and the targets predicted by the linear approximation.
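The least-squares objective above has a closed-form solution. As a minimal illustration (not the library's implementation), here is the single-feature case solved via the normal equations, where the slope is cov(x, y) / var(x):

```typescript
// Illustrative only: ordinary least squares for one feature plus an
// intercept, using the closed-form normal equations.
function olsFit(x: number[], y: number[]): { slope: number; intercept: number } {
  const n = x.length;
  const mx = x.reduce((a, b) => a + b, 0) / n; // mean of x
  const my = y.reduce((a, b) => a + b, 0) / n; // mean of y
  let sxy = 0; // Σ (x - mx)(y - my)
  let sxx = 0; // Σ (x - mx)²
  for (let i = 0; i < n; i++) {
    sxy += (x[i] - mx) * (y[i] - my);
    sxx += (x[i] - mx) ** 2;
  }
  const slope = sxy / sxx;
  return { slope, intercept: my - slope * mx };
}

// Perfectly linear data y = 2x + 1 is recovered exactly.
const { slope, intercept } = olsFit([1, 2, 3, 4], [3, 5, 7, 9]);
// slope ≈ 2, intercept ≈ 1
```

`olsFit` is a local helper for exposition; the class below handles the general multi-feature case.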
Constructor
new LinearRegression(options?: {
fitIntercept?: boolean;
normalize?: boolean;
copyX?: boolean;
})
fitIntercept: Whether to calculate the intercept for this model.
normalize: Whether to normalize features before regression. If true, the regressors X will be normalized before regression by subtracting the mean and dividing by the l2-norm.
copyX: Whether to copy X or overwrite it (if false, X may be overwritten).
Methods
fit
fit(X: Tensor, y: Tensor): this
Fit linear model using Ordinary Least Squares.
X: Training data of shape (n_samples, n_features)
y: Target values of shape (n_samples,)
Returns: The fitted estimator
Throws: ShapeError, DataValidationError
predict
predict(X: Tensor): Tensor
Predict using the linear model.
X: Samples of shape (n_samples, n_features)
Returns: Predicted values of shape (n_samples,)
score
score(X: Tensor, y: Tensor): number
Return the coefficient of determination R² of the prediction.
R² = 1 - (SS_res / SS_tot), where SS_res is the residual sum of squares and SS_tot is the total sum of squares. Best possible score is 1.0.
Returns: R² score
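The R² formula above can be checked by hand. The following sketch (a local helper, not part of the deepbox API) computes it directly from SS_res and SS_tot:

```typescript
// Hand-computed coefficient of determination:
// R² = 1 - SS_res / SS_tot
function r2Score(yTrue: number[], yPred: number[]): number {
  const mean = yTrue.reduce((a, b) => a + b, 0) / yTrue.length;
  let ssRes = 0; // residual sum of squares
  let ssTot = 0; // total sum of squares
  for (let i = 0; i < yTrue.length; i++) {
    ssRes += (yTrue[i] - yPred[i]) ** 2;
    ssTot += (yTrue[i] - mean) ** 2;
  }
  return 1 - ssRes / ssTot;
}

// A perfect fit scores 1.0; predicting the mean everywhere scores 0.0.
r2Score([1, 2, 3], [1, 2, 3]); // 1
r2Score([1, 2, 3], [2, 2, 2]); // 0
```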
Properties
Model coefficients (weights) of shape (n_features,)
Independent term (bias/intercept) in the linear model
Example
import { LinearRegression } from 'deepbox/ml';
import { tensor } from 'deepbox/ndarray';
const X = tensor([[1, 1], [1, 2], [2, 2], [2, 3]]);
const y = tensor([1, 2, 2, 3]);
const model = new LinearRegression({ fitIntercept: true });
model.fit(X, y);
const X_test = tensor([[3, 5]]);
const predictions = model.predict(X_test);
const score = model.score(X, y);
Ridge
Ridge Regression (L2 Regularized Linear Regression).
Ridge regression addresses multicollinearity by adding a penalty term (L2 regularization) to the loss function.
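The shrinkage effect of the L2 penalty is easiest to see in one dimension. For a single centered feature with no intercept, the ridge solution is w = Σxᵢyᵢ / (Σxᵢ² + α): larger α pulls the weight toward zero. This is a sketch under those simplifying assumptions, not the library's solver:

```typescript
// One-dimensional ridge solution for a centered feature, no intercept:
// w = Σ xᵢyᵢ / (Σ xᵢ² + α)
function ridgeWeight(x: number[], y: number[], alpha: number): number {
  let sxy = 0; // Σ xᵢyᵢ
  let sxx = 0; // Σ xᵢ²
  for (let i = 0; i < x.length; i++) {
    sxy += x[i] * y[i];
    sxx += x[i] * x[i];
  }
  return sxy / (sxx + alpha);
}

const xs = [-1, 0, 1];
const ys = [-2, 0, 2];
ridgeWeight(xs, ys, 0); // 2 (α = 0 reduces to plain OLS)
ridgeWeight(xs, ys, 2); // 1 (the penalty shrinks the weight toward zero)
```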
Constructor
new Ridge(options?: {
alpha?: number;
fitIntercept?: boolean;
normalize?: boolean;
solver?: "auto" | "svd" | "cholesky" | "lsqr" | "sag";
maxIter?: number;
tol?: number;
})
alpha: Regularization strength. Must be >= 0. Larger values specify stronger regularization.
fitIntercept: Whether to calculate the intercept.
normalize: Whether to normalize features before regression.
solver: Solver to use: 'auto', 'svd', 'cholesky', 'lsqr', or 'sag'.
maxIter: Maximum number of iterations for iterative solvers.
tol: Tolerance for stopping criterion.
Methods
fit
fit(X: Tensor, y: Tensor): this
Fit Ridge regression model.
Time Complexity: O(np² + p³) for direct solvers (forming XᵀX and solving the normal equations), where n = samples, p = features
predict
predict(X: Tensor): Tensor
Predict using the Ridge regression model.
score
score(X: Tensor, y: Tensor): number
Return the R² score.
Properties
Model coefficients of shape (n_features,)
Number of iterations run by the solver (for iterative solvers)
Example
import { Ridge } from 'deepbox/ml';
import { tensor } from 'deepbox/ndarray';
const model = new Ridge({ alpha: 0.5 });
model.fit(X_train, y_train);
const predictions = model.predict(X_test);
Lasso
Lasso Regression (L1 Regularized Linear Regression).
Lasso performs both regularization and feature selection by adding an L1 penalty that can drive coefficients exactly to zero.
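The mechanism behind those exact zeros is the soft-thresholding operator used in each coordinate-descent update: S(z, λ) = sign(z) · max(|z| − λ, 0). Any coordinate whose unpenalized value falls inside [−λ, λ] collapses to exactly zero. A minimal sketch (not the library's internals):

```typescript
// Soft-thresholding operator at the heart of Lasso coordinate descent:
// S(z, λ) = sign(z) · max(|z| − λ, 0)
function softThreshold(z: number, lambda: number): number {
  return Math.sign(z) * Math.max(Math.abs(z) - lambda, 0);
}

softThreshold(3, 1);   // 2
softThreshold(-3, 1);  // -2
softThreshold(0.5, 1); // 0  (driven exactly to zero: this is the sparsity)
```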
Constructor
new Lasso(options?: {
alpha?: number;
fitIntercept?: boolean;
normalize?: boolean;
maxIter?: number;
tol?: number;
warmStart?: boolean;
positive?: boolean;
selection?: "cyclic" | "random";
randomState?: number;
})
alpha: Regularization strength. Must be >= 0. Controls the sparsity of the solution.
fitIntercept: Whether to calculate the intercept.
normalize: Whether to normalize features before regression.
maxIter: Maximum iterations for coordinate descent.
tol: Tolerance for convergence. Smaller values give a more precise solution at the cost of more iterations.
warmStart: Whether to reuse the previous solution as initialization.
positive: Whether to force coefficients to be positive.
selection: Coordinate selection strategy: 'cyclic' or 'random'.
randomState: Random seed for reproducibility.
Methods
fit
fit(X: Tensor, y: Tensor): this
Fit Lasso regression model using Coordinate Descent.
Time Complexity: O(k * n * p) where k = iterations, n = samples, p = features
predict
predict(X: Tensor): Tensor
Predict using the Lasso regression model.
score
score(X: Tensor, y: Tensor): number
Return the R² score.
Properties
coef: Model coefficients. Many will be exactly zero due to L1 regularization (sparsity).
Number of iterations until convergence
Example
import { Lasso } from 'deepbox/ml';
import { tensor } from 'deepbox/ndarray';
const model = new Lasso({ alpha: 0.1, maxIter: 1000 });
model.fit(X_train, y_train);
// Many coefficients will be exactly 0
console.log(model.coef);
const predictions = model.predict(X_test);
LogisticRegression
Logistic Regression (Binary and Multiclass Classification).
Logistic regression uses the logistic (sigmoid) function to model the probability of class membership.
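The sigmoid link maps any real-valued score z to a probability in (0, 1) via σ(z) = 1 / (1 + exp(−z)); class 1 is predicted when σ(z) ≥ 0.5, i.e. when z ≥ 0. A standalone sketch of the function itself (not the library's internals):

```typescript
// The logistic (sigmoid) function: σ(z) = 1 / (1 + e^(−z))
function sigmoid(z: number): number {
  return 1 / (1 + Math.exp(-z));
}

sigmoid(0);  // 0.5 (the decision boundary)
sigmoid(4);  // ≈ 0.982
sigmoid(-4); // ≈ 0.018 (symmetric: σ(−z) = 1 − σ(z))
```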
Constructor
new LogisticRegression(options?: {
penalty?: "l2" | "none";
C?: number;
tol?: number;
maxIter?: number;
fitIntercept?: boolean;
learningRate?: number;
multiClass?: "ovr" | "auto";
})
penalty: Regularization type: 'l2' or 'none'.
C: Inverse regularization strength. Must be > 0. Smaller values specify stronger regularization.
tol: Tolerance for stopping criterion.
maxIter: Maximum number of iterations.
fitIntercept: Whether to fit the intercept.
learningRate: Learning rate for gradient descent.
multiClass: Multiclass strategy: 'ovr' (One-vs-Rest) or 'auto'.
Methods
fit
fit(X: Tensor, y: Tensor): this
Fit logistic regression model. Supports binary and multiclass (One-vs-Rest) classification.
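One-vs-Rest in brief: fit one binary classifier per class, then predict the class whose classifier reports the highest probability. The combining step can be sketched as below; the probabilities are made-up values for illustration, not deepbox output:

```typescript
// Given per-sample, per-class probabilities from K one-vs-rest
// classifiers, predict the class index with the highest probability.
function ovrPredict(classProbs: number[][]): number[] {
  return classProbs.map(row =>
    row.indexOf(Math.max(...row)) // index of the winning class
  );
}

ovrPredict([
  [0.1, 0.7, 0.2], // sample 0 -> class 1
  [0.6, 0.3, 0.1], // sample 1 -> class 0
]); // [1, 0]
```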
predict
predict(X: Tensor): Tensor
Predict class labels for samples.
predictProba
predictProba(X: Tensor): Tensor
Predict class probabilities for samples.
Returns: Probabilities of shape (n_samples, n_classes)
score
score(X: Tensor, y: Tensor): number
Return the mean accuracy on the given test data.
Properties
Model coefficients (weights). Shape (n_features,) for binary, (n_classes, n_features) for multiclass.
Intercept term. Scalar for binary, array for multiclass.
Unique class labels discovered during fitting.
Example
import { LogisticRegression } from 'deepbox/ml';
import { tensor } from 'deepbox/ndarray';
// Binary classification
const model = new LogisticRegression({ C: 1.0, maxIter: 100 });
model.fit(X_train, y_train);
const predictions = model.predict(X_test);
const probabilities = model.predictProba(X_test);
// Multiclass classification
const multiModel = new LogisticRegression({
multiClass: 'ovr'
});
multiModel.fit(X_train_multi, y_train_multi);