Overview
The QDA class implements Quadratic Discriminant Analysis, a generative classifier that models each class as its own multivariate Gaussian distribution. Unlike LDA, QDA allows each class to have a different covariance matrix, producing quadratic decision boundaries.
Namespace: mlpp::classifiers
Template parameters:
Scalar: Numeric type for computations (e.g., float, double)
LabelIndex: Integer type for class labels (default: int)
Model assumptions:
- Each class c has mean vector μ_c and covariance Σ_c
- P(x|c) ~ N(μ_c, Σ_c)
- Class priors P(c) are estimated from frequency counts
Prediction:
For each sample x, QDA computes the log-likelihood:
log p(x|c) = −1/2 * [(x−μ_c)ᵀ Σ_c⁻¹ (x−μ_c) + log|Σ_c| + D log(2π)]
log posterior = log p(x|c) + log P(c)
The predicted label is the class with the largest posterior.
Constructor
QDA
Construct a QDA classifier.
template<typename Scalar, typename LabelIndex = int>
QDA();
Default constructor that initializes an empty QDA model.
#include "QDA.h"
using namespace mlpp::classifiers;
QDA<double> qda;
Methods
fit
Fit the QDA model to training data.
void fit(const Matrix& X, const Labels& labels);
X: Training data matrix of shape (n_samples × n_features)
labels: Integer class label vector of shape (n_samples)
This method computes:
- Mean vector for each class
- Covariance matrix for each class
- Inverse covariance matrices
- Log-determinants of covariance matrices
- Class prior probabilities
Eigen::MatrixXd X(150, 4); // 150 samples, 4 features
Eigen::VectorXi y(150); // Class labels
// ... populate X and y ...
QDA<double, int> qda;
qda.fit(X, y);
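The per-class statistics listed above can be sketched for the one-dimensional case. This is an illustrative standalone function, not the class's actual internals (which operate on full covariance matrices); the maximum-likelihood variance estimate here is an assumption — the library may normalize differently:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct ClassStats {
    double mean, var, log_prior;
};

// Per-class mean, variance, and log prior from labeled 1-D data —
// a scalar analogue of what fit computes per class.
std::vector<ClassStats> fit_1d(const std::vector<double>& x,
                               const std::vector<int>& y,
                               int num_classes) {
    std::vector<ClassStats> stats(num_classes, ClassStats{0.0, 0.0, 0.0});
    std::vector<int> counts(num_classes, 0);
    for (std::size_t i = 0; i < x.size(); ++i) {
        stats[y[i]].mean += x[i];
        ++counts[y[i]];
    }
    for (int c = 0; c < num_classes; ++c) stats[c].mean /= counts[c];
    for (std::size_t i = 0; i < x.size(); ++i) {
        const double d = x[i] - stats[y[i]].mean;
        stats[y[i]].var += d * d;
    }
    for (int c = 0; c < num_classes; ++c) {
        stats[c].var /= counts[c];  // ML estimate; n - 1 is another choice
        stats[c].log_prior = std::log(double(counts[c]) / double(x.size()));
    }
    return stats;
}
```

In the matrix case, the same pass also produces Σ_c⁻¹ and log|Σ_c| per class, so prediction never has to factorize a covariance matrix.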
predict
Predict class labels for new samples.
Labels predict(const Matrix& X) const;
X: Data matrix to classify, of shape (n_samples × n_features)
Returns: Predicted class label vector of shape (n_samples)
Returns the class with the highest posterior probability for each sample.
Eigen::MatrixXd X_test(30, 4); // 30 test samples
// ... populate X_test ...
Eigen::VectorXi predictions = qda.predict(X_test);
predict_log_likelihood
Compute per-class log posteriors (log-likelihood plus log prior) for each sample.
Matrix predict_log_likelihood(const Matrix& X) const;
X: Data matrix of shape (n_samples × n_features)
Returns: Matrix of shape (n_samples × num_classes)
Each entry (i, k) is the unnormalized log posterior of class k for sample i. Useful for uncertainty quantification or threshold-based classification.
Eigen::MatrixXd X_test(10, 4);
// ... populate X_test ...
Eigen::MatrixXd log_probs = qda.predict_log_likelihood(X_test);
// log_probs(i, k) = log posterior probability of class k for sample i
// Get prediction with confidence
for (int i = 0; i < X_test.rows(); ++i) {
int predicted_class;
double max_log_prob = log_probs.row(i).maxCoeff(&predicted_class);
std::cout << "Sample " << i << ": class " << predicted_class
<< " (log prob: " << max_log_prob << ")" << std::endl;
}
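Since the returned values are unnormalized log posteriors, they can be converted to probabilities with the numerically stable log-sum-exp trick. This helper is a standalone sketch, not part of the QDA API:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Convert one sample's unnormalized log posteriors into probabilities.
// Shifting by the row maximum before exponentiating avoids overflow.
std::vector<double> softmax_from_logs(const std::vector<double>& log_post) {
    const double m = *std::max_element(log_post.begin(), log_post.end());
    std::vector<double> p(log_post.size());
    double sum = 0.0;
    for (std::size_t k = 0; k < p.size(); ++k) {
        p[k] = std::exp(log_post[k] - m);
        sum += p[k];
    }
    for (double& v : p) v /= sum;  // normalize so the row sums to 1
    return p;
}
```

Applied to each row of the matrix returned by predict_log_likelihood, this yields per-class probabilities suitable for confidence thresholds.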
num_classes
Get the number of classes.
int num_classes() const;
Returns: Number of classes in the training data
int k = qda.num_classes();
class_means
Get the mean vectors for all classes.
const std::vector<Vector>& class_means() const;
Returns: a const reference to a vector containing the mean vector for each class
auto means = qda.class_means();
for (size_t c = 0; c < means.size(); ++c) {
std::cout << "Class " << c << " mean: "
<< means[c].transpose() << std::endl;
}
class_covariances
Get the covariance matrices for all classes.
const std::vector<Matrix>& class_covariances() const;
Returns: a const reference to a vector containing the covariance matrix for each class
auto covs = qda.class_covariances();
for (size_t c = 0; c < covs.size(); ++c) {
std::cout << "Class " << c << " covariance:\n"
<< covs[c] << std::endl;
}
Type aliases
using Matrix = Eigen::Matrix<Scalar, Eigen::Dynamic, Eigen::Dynamic>;
using Vector = Eigen::Matrix<Scalar, Eigen::Dynamic, 1>;
using Labels = Eigen::Matrix<LabelIndex, Eigen::Dynamic, 1>;
Example usage
#include "QDA.h"
#include <Eigen/Dense>
#include <iostream>
using namespace mlpp::classifiers;
int main() {
// Generate synthetic 3-class data with different covariances
const int n_samples = 150;
const int n_features = 2;
Eigen::MatrixXd X(n_samples, n_features);
Eigen::VectorXi y(n_samples);
// Generate three clusters with different spreads
for (int i = 0; i < 50; ++i) {
X.row(i) = Eigen::RowVector2d::Random() * 0.5; // Class 0: tight cluster
y(i) = 0;
}
for (int i = 50; i < 100; ++i) {
X.row(i) = Eigen::RowVector2d::Random() * 1.5 + Eigen::RowVector2d(3, 0); // Class 1
y(i) = 1;
}
for (int i = 100; i < 150; ++i) {
X.row(i) = Eigen::RowVector2d::Random() * 2.0 + Eigen::RowVector2d(0, 3); // Class 2
y(i) = 2;
}
// Train QDA
QDA<double, int> qda;
qda.fit(X, y);
std::cout << "Number of classes: " << qda.num_classes() << std::endl;
// Make predictions
Eigen::MatrixXd X_test(3, n_features);
X_test << 0.0, 0.0,
3.0, 0.0,
0.0, 3.0;
Eigen::VectorXi predictions = qda.predict(X_test);
std::cout << "Predictions: " << predictions.transpose() << std::endl;
// Get log-likelihoods
Eigen::MatrixXd log_probs = qda.predict_log_likelihood(X_test);
std::cout << "Log-likelihoods:\n" << log_probs << std::endl;
// Inspect class statistics
auto means = qda.class_means();
auto covs = qda.class_covariances();
for (int c = 0; c < qda.num_classes(); ++c) {
std::cout << "\nClass " << c << ":" << std::endl;
std::cout << " Mean: " << means[c].transpose() << std::endl;
std::cout << " Covariance:\n" << covs[c] << std::endl;
}
return 0;
}