KNeighborsClassifier

K-Nearest Neighbors Classifier. Classifies each sample by majority vote among its k nearest training samples. Instance-based (lazy) learning algorithm:
  1. Store all training data
  2. For each test sample, find k nearest training samples
  3. Predict class by majority vote (or weighted vote)
Time Complexity:
  • Training: O(1) (just stores data)
  • Prediction: O(n * d) per sample where n=training samples, d=features
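The prediction steps above can be sketched in plain TypeScript. This is an illustrative brute-force version on plain arrays (the hypothetical helper `knnPredict` is not deepbox's actual implementation); the distance pass over all n training rows of d features is what gives the O(n * d) per-sample cost:

```typescript
// Brute-force k-NN classification sketch (uniform weights, Euclidean metric).
function knnPredict(
  trainX: number[][],
  trainY: number[],
  sample: number[],
  k: number
): number {
  // 1. Compute the distance to every stored training sample: O(n * d)
  const dists = trainX.map((row, i) => ({
    label: trainY[i],
    dist: Math.sqrt(row.reduce((s, v, j) => s + (v - sample[j]) ** 2, 0)),
  }));
  // 2. Sort by distance and keep the k nearest
  dists.sort((a, b) => a.dist - b.dist);
  const nearest = dists.slice(0, k);
  // 3. Majority vote among the k neighbor labels
  const votes = new Map<number, number>();
  for (const { label } of nearest) {
    votes.set(label, (votes.get(label) ?? 0) + 1);
  }
  let best = nearest[0].label;
  let bestCount = -1;
  for (const [label, count] of votes) {
    if (count > bestCount) {
      best = label;
      bestCount = count;
    }
  }
  return best;
}
```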

Constructor

new KNeighborsClassifier(options?: {
  nNeighbors?: number;
  weights?: "uniform" | "distance";
  metric?: "euclidean" | "manhattan";
})
options.nNeighbors
number
default:"5"
Number of neighbors to use for prediction.
options.weights
string
default:"uniform"
Weight function: 'uniform' (all neighbors weighted equally) or 'distance' (closer neighbors weighted more).
options.metric
string
default:"euclidean"
Distance metric: 'euclidean' or 'manhattan'.

Methods

fit

fit(X: Tensor, y: Tensor): this
Fit the k-nearest neighbors classifier from the training set.
X
Tensor
required
Training data of shape (n_samples, n_features)
y
Tensor
required
Target class labels of shape (n_samples,)
Returns: The fitted estimator
Throws: InvalidParameterError if nNeighbors > n_samples

predict

predict(X: Tensor): Tensor
Predict class labels for samples in X.
Returns: Predicted class labels of shape (n_samples,)

predictProba

predictProba(X: Tensor): Tensor
Predict class probabilities for samples in X.
Returns: Class probability matrix of shape (n_samples, n_classes)
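Assuming uniform weights behave consistently with the majority vote, each class probability is the fraction of the k neighbors carrying that class (distance weighting would normalize per-neighbor weights instead). A minimal sketch of that assumption with a hypothetical `voteProba` helper, not deepbox's internal code:

```typescript
// Turn the k neighbor labels into class probabilities (uniform weights):
// probability of class c = (neighbors with label c) / k.
function voteProba(neighborLabels: number[], nClasses: number): number[] {
  const counts = new Array<number>(nClasses).fill(0);
  for (const label of neighborLabels) counts[label] += 1;
  return counts.map((c) => c / neighborLabels.length);
}
```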

score

score(X: Tensor, y: Tensor): number
Return the mean accuracy on the given test data and labels.
Returns: Accuracy score in range [0, 1]
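Mean accuracy here is simply the fraction of samples whose predicted label matches the true label. A sketch with a hypothetical `accuracy` helper (not the library's code):

```typescript
// Fraction of positions where prediction equals the true label.
function accuracy(yTrue: number[], yPred: number[]): number {
  let correct = 0;
  for (let i = 0; i < yTrue.length; i++) {
    if (yTrue[i] === yPred[i]) correct++;
  }
  return correct / yTrue.length;
}
```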

Example

import { KNeighborsClassifier } from 'deepbox/ml';
import { tensor } from 'deepbox/ndarray';

const X = tensor([[0, 0], [1, 1], [2, 2], [3, 3]]);
const y = tensor([0, 0, 1, 1]);

const knn = new KNeighborsClassifier({ nNeighbors: 3 });
knn.fit(X, y);

const predictions = knn.predict(tensor([[1.5, 1.5]]));

KNeighborsRegressor

K-Nearest Neighbors Regressor. Predicts value as mean (or weighted mean) of k nearest training samples.
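The mean vs. weighted-mean rule can be sketched directly. This illustrative `knnRegress` helper is an assumption about how the two `weights` modes behave (uniform mean vs. inverse-distance weighted mean), not deepbox's internals:

```typescript
// Predict a target from the k nearest neighbors' target values.
function knnRegress(
  neighborTargets: number[],
  neighborDists: number[],
  weighted = false
): number {
  if (!weighted) {
    // 'uniform': plain mean of the k neighbor targets
    return neighborTargets.reduce((s, v) => s + v, 0) / neighborTargets.length;
  }
  // 'distance': inverse-distance weighted mean; a small epsilon guards
  // against division by zero when a neighbor coincides with the query.
  const w = neighborDists.map((d) => 1 / (d + 1e-12));
  const wSum = w.reduce((s, v) => s + v, 0);
  return neighborTargets.reduce((s, v, i) => s + v * w[i], 0) / wSum;
}
```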

Constructor

new KNeighborsRegressor(options?: {
  nNeighbors?: number;
  weights?: "uniform" | "distance";
  metric?: "euclidean" | "manhattan";
})
options.nNeighbors
number
default:"5"
Number of neighbors to use.
options.weights
string
default:"uniform"
Weight function: 'uniform' or 'distance'.
options.metric
string
default:"euclidean"
Distance metric: 'euclidean' or 'manhattan'.

Methods

fit

fit(X: Tensor, y: Tensor): this
Fit the k-nearest neighbors regressor from the training set.
X
Tensor
required
Training data of shape (n_samples, n_features)
y
Tensor
required
Target values of shape (n_samples,)
Returns: The fitted estimator

predict

predict(X: Tensor): Tensor
Predict target values for samples in X.
Returns: Predicted values of shape (n_samples,)

score

score(X: Tensor, y: Tensor): number
Return the R² score on the given test data and target values.
Returns: R² score (best possible is 1.0, can be negative)
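R² is the standard coefficient of determination, R² = 1 − SS_res / SS_tot: it is 1.0 for perfect predictions, 0.0 for a model no better than predicting the mean of y, and negative for worse. A minimal sketch with a hypothetical `r2Score` helper, not deepbox's code:

```typescript
// Coefficient of determination: 1 - (residual sum of squares) /
// (total sum of squares around the mean of yTrue).
function r2Score(yTrue: number[], yPred: number[]): number {
  const mean = yTrue.reduce((s, v) => s + v, 0) / yTrue.length;
  const ssRes = yTrue.reduce((s, v, i) => s + (v - yPred[i]) ** 2, 0);
  const ssTot = yTrue.reduce((s, v) => s + (v - mean) ** 2, 0);
  return 1 - ssRes / ssTot;
}
```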

Example

import { KNeighborsRegressor } from 'deepbox/ml';
import { tensor } from 'deepbox/ndarray';

const X = tensor([[0], [1], [2], [3]]);
const y = tensor([0, 1, 4, 9]);

const knn = new KNeighborsRegressor({ nNeighbors: 2 });
knn.fit(X, y);

const predictions = knn.predict(tensor([[1.5]]));
