
Overview

ORB-SLAM3 implements sophisticated visual-inertial fusion, integrating IMU (Inertial Measurement Unit) data with visual features to achieve:

Metric Scale

Recovers absolute (metric) scale, eliminating the scale ambiguity inherent to monocular vision.

Robustness

Handles fast motion, temporary occlusions, and texture-poor environments.

Gravity Estimation

Estimates gravity direction and magnitude for consistent world frame.

Velocity Estimation

Provides camera velocity for motion prediction and feature tracking.
The visual-inertial system is based on the papers “Inertial-Only Optimization for Visual-Inertial Initialization” (ICRA 2020) and “Visual-Inertial Monocular SLAM with Map Reuse” (RA-L 2017).

IMU Data Structures

All IMU-related types are defined in include/ImuTypes.h:40-262.

IMU Measurement Point

Represents a single IMU measurement with accelerometer, gyroscope, and timestamp:
class IMU::Point
{
public:
    Point(const float &acc_x, const float &acc_y, const float &acc_z,
          const float &ang_vel_x, const float &ang_vel_y, const float &ang_vel_z,
          const double &timestamp);
    
    Point(const cv::Point3f Acc, const cv::Point3f Gyro, const double &timestamp);
    
    Eigen::Vector3f a;  // Accelerometer (m/s²)
    Eigen::Vector3f w;  // Gyroscope (rad/s)
    double t;           // Timestamp (seconds)
};
// Constructor with individual components
IMU::Point imu1(ax, ay, az, wx, wy, wz, timestamp);

// Constructor with cv::Point3f
cv::Point3f acc(ax, ay, az);
cv::Point3f gyro(wx, wy, wz);
IMU::Point imu2(acc, gyro, timestamp);

// Vector of measurements between frames
std::vector<IMU::Point> vImuMeas;
vImuMeas.push_back(imu1);
vImuMeas.push_back(imu2);
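In practice the vector should contain every IMU sample whose timestamp lies between the previous and current camera frames. As a standalone sketch (plain C++, not the ORB-SLAM3 API; `ImuSample` and `SamplesBetween` are illustrative names), that selection might look like:

```cpp
#include <vector>

// Standalone sketch: select the IMU samples that fall between two
// consecutive camera frames by timestamp.
struct ImuSample { double t; /* the real type also carries accel and gyro */ };

std::vector<ImuSample> SamplesBetween(const std::vector<ImuSample>& all,
                                      double tPrevFrame, double tCurFrame)
{
    std::vector<ImuSample> out;
    for (const ImuSample& s : all)
        if (s.t > tPrevFrame && s.t <= tCurFrame)  // half-open interval avoids duplicates
            out.push_back(s);
    return out;
}
```

The half-open interval ensures a sample landing exactly on a frame timestamp is assigned to only one frame pair.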

IMU Bias

Accelerometer and gyroscope biases that are estimated and refined over time:
class IMU::Bias
{
public:
    Bias();
    Bias(const float &b_acc_x, const float &b_acc_y, const float &b_acc_z,
         const float &b_ang_vel_x, const float &b_ang_vel_y, const float &b_ang_vel_z);
    
    float bax, bay, baz;  // Accelerometer bias (m/s²)
    float bwx, bwy, bwz;  // Gyroscope bias (rad/s)
};
Biases are automatically estimated during IMU initialization and continuously refined. You don’t need to provide them manually.

IMU Calibration

Stores IMU-camera extrinsic calibration and noise parameters:
class IMU::Calib
{
public:
    Calib(const Sophus::SE3<float> &Tbc, 
          const float &ng,   // Gyroscope noise
          const float &na,   // Accelerometer noise  
          const float &ngw,  // Gyroscope random walk
          const float &naw); // Accelerometer random walk
    
    Sophus::SE3<float> mTcb;  // Transform: Body (IMU) to Camera
    Sophus::SE3<float> mTbc;  // Transform: Camera to Body (IMU)
    Eigen::DiagonalMatrix<float,6> Cov;      // Measurement covariance
    Eigen::DiagonalMatrix<float,6> CovWalk;  // Random walk covariance
};
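To make the extrinsic convention concrete, here is a standalone sketch (plain arrays instead of Sophus; `CamToBody` is an illustrative name), assuming the convention that Tbc = (Rbc, tbc) maps camera-frame points into the body/IMU frame:

```cpp
#include <array>

using Mat3 = std::array<std::array<double,3>,3>;
using Vec3 = std::array<double,3>;

// p_b = Rbc * p_c + tbc : express a camera-frame point in the body frame
Vec3 CamToBody(const Mat3& Rbc, const Vec3& tbc, const Vec3& pc) {
    Vec3 pb{};
    for (int i = 0; i < 3; ++i) {
        pb[i] = tbc[i];                     // translation part
        for (int j = 0; j < 3; ++j)
            pb[i] += Rbc[i][j] * pc[j];     // rotation part
    }
    return pb;
}
```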

IMU Preintegration

ORB-SLAM3 uses IMU preintegration to efficiently integrate high-frequency IMU data between keyframes.

Preintegrated Class

The core preintegration class defined in include/ImuTypes.h:143-251:
class IMU::Preintegrated
{
public:
    Preintegrated(const Bias &b_, const Calib &calib);
    
    // Integrate new measurement
    void IntegrateNewMeasurement(const Eigen::Vector3f &acceleration, 
                                  const Eigen::Vector3f &angVel, 
                                  const float &dt);
    
    // Recompute integration with updated bias
    void Reintegrate();
    
    // Get preintegrated values
    Eigen::Matrix3f GetDeltaRotation(const Bias &b_);
    Eigen::Vector3f GetDeltaVelocity(const Bias &b_);
    Eigen::Vector3f GetDeltaPosition(const Bias &b_);
    
    float dT;  // Total integration time
    
    // Preintegrated measurements (relative to initial bias)
    Eigen::Matrix3f dR;   // Rotation delta
    Eigen::Vector3f dV;   // Velocity delta  
    Eigen::Vector3f dP;   // Position delta
    
    // Jacobians for bias correction
    Eigen::Matrix3f JRg, JVg, JVa, JPg, JPa;
    
    // Covariance and information matrix
    Eigen::Matrix<float,15,15> C;     // Covariance
    Eigen::Matrix<float,15,15> Info;  // Information matrix
};
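The rotation update at the heart of IntegrateNewMeasurement can be illustrated with a standalone sketch (plain C++ arrays instead of Eigen; `ExpSO3` and `PreintegrateRotation` are illustrative names): each bias-corrected angular-velocity sample is mapped through the SO(3) exponential and right-multiplied onto the accumulated delta.

```cpp
#include <array>
#include <cmath>
#include <vector>

using Mat3 = std::array<std::array<double,3>,3>;
using Vec3 = std::array<double,3>;

// Rodrigues' formula: exponential map of so(3), R = Exp(v)
Mat3 ExpSO3(double x, double y, double z) {
    const double th2 = x*x + y*y + z*z;
    const double th  = std::sqrt(th2);
    const double a = (th < 1e-8) ? 1.0 : std::sin(th) / th;
    const double b = (th < 1e-8) ? 0.5 : (1.0 - std::cos(th)) / th2;
    const Mat3 W = {{{0,-z,y},{z,0,-x},{-y,x,0}}};   // skew-symmetric [v]x
    Mat3 R{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            double ww = 0.0;                          // ( [v]x² )_ij
            for (int k = 0; k < 3; ++k) ww += W[i][k] * W[k][j];
            R[i][j] = (i == j ? 1.0 : 0.0) + a * W[i][j] + b * ww;
        }
    return R;
}

Mat3 MatMul(const Mat3& A, const Mat3& B) {
    Mat3 C{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k) C[i][j] += A[i][k] * B[k][j];
    return C;
}

// dR <- dR * Exp((w - bias_g) * dt), accumulated over all gyro samples
Mat3 PreintegrateRotation(const std::vector<Vec3>& gyro,
                          const Vec3& bg, double dt) {
    Mat3 dR = {{{1,0,0},{0,1,0},{0,0,1}}};
    for (const Vec3& w : gyro)
        dR = MatMul(dR, ExpSO3((w[0]-bg[0])*dt, (w[1]-bg[1])*dt, (w[2]-bg[2])*dt));
    return dR;
}
```

The real class additionally accumulates dV, dP, the bias Jacobians, and the covariance C at each step; this sketch shows only the rotational chain.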

Preintegration Workflow

Step 1: Initialize Preintegration

Create preintegration object with initial bias and calibration:
IMU::Bias b0;  // Initial bias (zero or from previous)
IMU::Calib calib(Tbc, ng, na, ngw, naw);
IMU::Preintegrated* pImuPreint = new IMU::Preintegrated(b0, calib);

Step 2: Integrate Measurements

For each IMU measurement between frames:
double tPrev = tLastFrame;  // timestamp of the previous frame (assumed available)
for(auto &imu : vImuMeas) {
    float dt = imu.t - tPrev;  // elapsed time since the previous sample
    pImuPreint->IntegrateNewMeasurement(imu.a, imu.w, dt);
    tPrev = imu.t;
}

Step 3: Use in Optimization

Preintegrated values provide constraints in bundle adjustment:
Eigen::Matrix3f dR = pImuPreint->GetDeltaRotation(currentBias);
Eigen::Vector3f dV = pImuPreint->GetDeltaVelocity(currentBias);
Eigen::Vector3f dP = pImuPreint->GetDeltaPosition(currentBias);

Step 4: Update Bias

When bias estimate changes, efficiently update preintegration:
pImuPreint->SetNewBias(updatedBias);
// Uses Jacobians for fast recomputation
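The idea behind the Jacobian-based update can be shown in one dimension (an illustrative sketch, not the library's types): instead of re-integrating all raw measurements, the corrected delta is the stored delta plus the Jacobians times the bias change.

```cpp
// One-dimensional sketch of the first-order bias correction behind
// GetDeltaVelocity():  dV(b) ≈ dV(b0) + JVg*(bg - bg0) + JVa*(ba - ba0)
struct Preint1D {
    double dV;          // delta velocity, preintegrated at bias (bg0, ba0)
    double JVg, JVa;    // Jacobians w.r.t. gyro and accel bias
    double bg0, ba0;    // linearization bias

    double CorrectedDeltaVelocity(double bg, double ba) const {
        return dV + JVg * (bg - bg0) + JVa * (ba - ba0);
    }
};
```

Reintegrate() is the expensive fallback when the bias change grows too large for this first-order approximation to stay accurate.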

IMU Initialization

Visual-inertial initialization is a critical phase that estimates scale, gravity, and initial velocities.

Initialization Sequence

During the first few frames, the system will:
  1. Initialize visual SLAM (map with scale ambiguity)
  2. Accumulate IMU measurements
  3. Create initial keyframes
During this phase, the system operates in visual-only mode until sufficient motion and IMU data are available.

Checking Initialization Status

// Check if IMU is initialized
if(mpAtlas->isImuInitialized()) {
    // Scale, gravity, and velocity are available
    // Full visual-inertial tracking active
} else {
    // Still initializing
    // System may be running visual-only tracking
}

Gravity Constant

The expected gravity magnitude is defined in include/ImuTypes.h:43:
const float GRAVITY_VALUE = 9.81;  // m/s²
This value is used as a prior during initialization. The actual gravity magnitude and direction are estimated from IMU measurements.
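To illustrate the role of this prior: while the sensor is near-static, the accelerometer measures the reaction to gravity, so averaging samples and normalizing yields a gravity-direction estimate. A standalone sketch (illustrative, not the actual initialization code, which also handles moving sensors):

```cpp
#include <array>
#include <cmath>
#include <vector>

// Average static accelerometer readings and normalize to get a
// unit gravity-direction vector in the body frame.
std::array<double,3> GravityDirection(const std::vector<std::array<double,3>>& acc) {
    std::array<double,3> mean{0.0, 0.0, 0.0};
    for (const auto& a : acc)
        for (int i = 0; i < 3; ++i) mean[i] += a[i];
    double norm = 0.0;
    for (int i = 0; i < 3; ++i) {
        mean[i] /= static_cast<double>(acc.size());
        norm += mean[i] * mean[i];
    }
    norm = std::sqrt(norm);
    for (int i = 0; i < 3; ++i) mean[i] /= norm;  // unit vector
    return mean;
}
```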

IMU Configuration

IMU parameters are specified in the YAML settings file:
# IMU sensor
IMU.NoiseGyro: 1.6968e-04    # rad/s/√Hz
IMU.NoiseAcc: 2.0000e-3       # m/s²/√Hz  
IMU.GyroWalk: 1.9393e-05      # rad/s²/√Hz
IMU.AccWalk: 3.0000e-03       # m/s³/√Hz
IMU.Frequency: 200            # Hz
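Note that these are continuous-time densities; per-sample standard deviations are obtained by scaling with the square root of the sampling rate (the standard conversion: noise densities scale by √f, random walks by 1/√f). A sketch with illustrative names:

```cpp
#include <cmath>

// Convert continuous-time IMU noise parameters to per-sample values
// at sampling rate f: densities scale by sqrt(f), walks by 1/sqrt(f).
struct DiscreteImuNoise { double sigmaG, sigmaA, sigmaGW, sigmaAW; };

DiscreteImuNoise Discretize(double ng, double na, double ngw, double naw,
                            double freqHz) {
    const double sf = std::sqrt(freqHz);
    return { ng * sf, na * sf, ngw / sf, naw / sf };
}
```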

Noise Parameters

NoiseGyro
float
required
Gyroscope measurement noise density (rad/s/√Hz). Obtain from IMU datasheet or calibration.
NoiseAcc
float
required
Accelerometer measurement noise density (m/s²/√Hz).
GyroWalk
float
required
Gyroscope random walk (rad/s²/√Hz). Models bias drift over time.
AccWalk
float
required
Accelerometer random walk (m/s³/√Hz).
Finding noise parameters: Check your IMU datasheet or use calibration tools like Allan variance analysis to determine accurate noise values.

Integration with Visual Tracking

Motion Prediction

IMU provides motion prediction for the next frame:
// Illustrative pseudocode (the function names below are not the actual
// API; in ORB-SLAM3 this is handled inside Tracking::PredictStateIMU()):
Sophus::SE3f Tcw_predicted = PredictNavStateFromIMU();

// Use the prediction to narrow the feature search window
SearchFeaturesInPredictedRegion(Tcw_predicted);

Visual-Inertial Optimization

The system jointly optimizes visual and inertial terms:
E_total = E_visual + E_inertial

E_visual = Σ_i ρ(|| z_i - π(T_cw · X_i) ||²)
E_inertial = Σ_(i,j) || r_imu(T_i, T_j, v_i, v_j, b) ||²_Σij
Where:
  • E_visual: Reprojection error from visual features
  • E_inertial: Preintegrated IMU residuals
  • ρ: Robust cost function (Huber)
  • π: Camera projection function
  • r_imu: IMU preintegration residual
  • || · ||²_Σij: Squared Mahalanobis norm, weighted by the preintegration covariance
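The robust cost ρ is typically the Huber function: quadratic for small residuals, linear beyond a threshold δ, so outlier matches are down-weighted instead of dominating E_visual. A minimal sketch on the squared residual (illustrative, not the g2o kernel implementation):

```cpp
#include <cmath>

// Huber cost on a squared residual r²: quadratic inside δ, linear beyond.
double HuberCost(double r2, double delta) {
    if (r2 <= delta * delta)
        return r2;                                       // inlier region
    return 2.0 * delta * std::sqrt(r2) - delta * delta;  // linear in |r|
}
```

The two branches meet continuously at |r| = δ, so the cost stays smooth enough for gradient-based optimization.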

Integrated Rotation

Helper class for integrating single gyroscope measurements:
class IMU::IntegratedRotation
{
public:
    IntegratedRotation(const Eigen::Vector3f &angVel, 
                       const Bias &imuBias, 
                       const float &time);
    
    float deltaT;               // Integration time
    Eigen::Matrix3f deltaR;     // Rotation delta
    Eigen::Matrix3f rightJ;     // Right Jacobian
};
Used internally during preintegration for rotation updates.

Lie Algebra Functions

Utility functions for SO(3) operations defined in include/ImuTypes.h:254-260:
// Right Jacobian of SO(3)
Eigen::Matrix3f RightJacobianSO3(const Eigen::Vector3f &v);
Eigen::Matrix3f RightJacobianSO3(const float &x, const float &y, const float &z);

// Inverse Right Jacobian
Eigen::Matrix3f InverseRightJacobianSO3(const Eigen::Vector3f &v);
Eigen::Matrix3f InverseRightJacobianSO3(const float &x, const float &y, const float &z);

// Normalize rotation matrix
Eigen::Matrix3f NormalizeRotation(const Eigen::Matrix3f &R);
These functions are used for manifold operations during optimization on the SO(3) rotation group.
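As a standalone reference sketch (not the ORB-SLAM3 function itself), the right Jacobian can be written directly from its closed form, Jr(v) = I − (1 − cos θ)/θ² [v]x + (θ − sin θ)/θ³ [v]x² with θ = ‖v‖:

```cpp
#include <array>
#include <cmath>

using Mat3 = std::array<std::array<double,3>,3>;

Mat3 RightJacobian(double x, double y, double z) {
    const double th2 = x*x + y*y + z*z;
    const double th  = std::sqrt(th2);
    // Small-angle limits: (1-cosθ)/θ² → 1/2, (θ-sinθ)/θ³ → 1/6
    const double a = (th < 1e-8) ? 0.5       : (1.0 - std::cos(th)) / th2;
    const double b = (th < 1e-8) ? 1.0 / 6.0 : (th - std::sin(th)) / (th2 * th);
    const Mat3 W = {{{0,-z,y},{z,0,-x},{-y,x,0}}};  // skew-symmetric [v]x
    Mat3 J{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            double ww = 0.0;                         // ( [v]x² )_ij
            for (int k = 0; k < 3; ++k) ww += W[i][k] * W[k][j];
            J[i][j] = (i == j ? 1.0 : 0.0) - a * W[i][j] + b * ww;
        }
    return J;
}
```

The small-angle branch avoids division by near-zero θ, mirroring the guard the library needs for the same limit.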

Best Practices

  • High frequency: Use IMU at 100-200 Hz (much higher than camera frame rate)
  • All measurements: Provide ALL IMU samples between consecutive frames
  • Synchronization: Ensure accurate time synchronization between camera and IMU
  • Coordinate frames: Ensure IMU measurements are in the correct body frame
  • Extrinsic calibration: Accurately calibrate IMU-camera transformation (Tbc)
  • Temporal calibration: Estimate time offset between sensors
  • Noise parameters: Use accurate noise values from datasheet or Allan variance
  • Verify calibration: Test with known motions to validate setup
  • Initial motion: Perform slow, smooth motion for first 2-3 seconds
  • Sufficient excitation: Ensure rotation and translation for observability
  • Wait for completion: Check isImuInitialized() before relying on scale
  • Avoid aggressive motion: During initialization, avoid fast rotations
  • Bias drift: Long-term bias drift is automatically tracked and corrected
  • Temperature effects: Some IMUs drift with temperature changes
  • Magnetic interference: Keep IMU away from magnetic sources
  • Vibrations: Mount IMU rigidly to camera to avoid vibration noise
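Several of the points above (synchronization, providing all samples) can be guarded with a simple runtime check. An illustrative sketch (not an ORB-SLAM3 function):

```cpp
#include <vector>

// Sanity check: IMU timestamps between two frames should be non-empty,
// strictly increasing, and lie within the frame interval.
bool ImuStreamLooksValid(const std::vector<double>& t,
                         double tPrevFrame, double tCurFrame)
{
    if (t.empty()) return false;
    for (std::size_t i = 1; i < t.size(); ++i)
        if (t[i] <= t[i-1]) return false;   // out of order or duplicated
    return t.front() >= tPrevFrame && t.back() <= tCurFrame;
}
```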

Coordinate Frames

Camera Frame (C)

  • Origin at camera optical center
  • Z-axis pointing forward (viewing direction)
  • X-axis pointing right, Y-axis pointing down

Body Frame (B/IMU)

  • Origin at IMU center
  • Axes follow IMU orientation
  • Related to camera by Tbc transformation

World Frame (W)

  • Aligned with gravity (Z-axis up or down)
  • Arbitrary horizontal orientation
  • Established during initialization

Navigation Frame (N)

  • East-North-Up or similar
  • Used for expressing poses and velocities
  • Related to world frame by rotation

Performance Benefits

Fast Motion

IMU prediction enables tracking during rapid camera motion (>10 m/s linear, >200°/s angular).

Occlusions

Maintains pose estimate during temporary visual feature loss using inertial integration.

Scale Accuracy

Achieves less than 1% scale error in monocular mode, comparable to stereo systems.

Sensor Modes

Visual-inertial sensor mode configurations

Calibration

IMU-camera calibration procedures

Configuration

YAML settings for IMU parameters

EuRoC Examples

Running visual-inertial examples

Troubleshooting

  • Insufficient motion: Ensure camera moves with rotation and translation
  • Poor calibration: Verify Tbc extrinsic calibration
  • Wrong noise values: Check IMU noise parameters in config
  • Too fast motion: Slow down during first 2-3 seconds
  • Poor IMU quality: Low-cost IMUs may have significant bias drift
  • Long sequences: Scale drift accumulates over time without loops
  • Temperature changes: Some IMUs are temperature sensitive
  • Insufficient visual constraints: Ensure good feature tracking
  • Calibration errors: Large Tbc errors cause divergence
  • Time synchronization: Ensure accurate timestamps
  • Coordinate frame mismatch: Verify IMU axes orientation
  • IMU saturation: Check for gyro/accelerometer saturation

Advanced Topics

  • Bias estimation: Continuous bias refinement during operation
  • Scale observability: Conditions for scale estimation in monocular-inertial
  • Covariance propagation: Uncertainty tracking through preintegration
  • Manifold optimization: SO(3) optimization for rotation states
