ORB-SLAM3 provides a comprehensive set of example programs for testing and running SLAM with various sensor configurations and datasets. All examples are built automatically when you compile the library.

Available Example Programs

The library includes examples for all supported sensor configurations:

Monocular

Single camera SLAM
  • mono_euroc
  • mono_tum_vi
  • mono_kitti
  • mono_realsense_D435i
  • mono_realsense_t265

Stereo

Dual camera SLAM
  • stereo_euroc
  • stereo_tum_vi
  • stereo_kitti
  • stereo_realsense_D435i
  • stereo_realsense_t265

RGB-D

Depth camera SLAM
  • rgbd_tum
  • rgbd_realsense_D435i

Visual-Inertial

Camera + IMU fusion
  • mono_inertial_euroc
  • mono_inertial_tum_vi
  • stereo_inertial_euroc
  • stereo_inertial_tum_vi
  • mono_inertial_realsense_D435i
  • mono_inertial_realsense_t265
  • stereo_inertial_realsense_D435i
  • stereo_inertial_realsense_t265

Supported Datasets

ORB-SLAM3 has been extensively tested on several public datasets:

EuRoC MAV Dataset

Recorded with two pinhole cameras and an IMU. Contains 11 sequences with ground truth trajectories.
  • Visual (monocular/stereo)
  • Visual-Inertial (monocular/stereo + IMU)
  • Pinhole camera model
Learn more about EuRoC examples →

TUM-VI Dataset

Recorded with two fisheye cameras and an IMU. Includes indoor and outdoor sequences.
  • Visual (monocular/stereo)
  • Visual-Inertial (monocular/stereo + IMU)
  • Fisheye camera model
Learn more about TUM-VI examples →

KITTI Dataset

Autonomous driving dataset with stereo cameras.
  • Monocular sequences
  • Stereo sequences
  • Large-scale outdoor environments
Learn more about KITTI examples →

Live Camera Support

Run SLAM in real-time with Intel RealSense cameras.
  • RealSense D435i (stereo infrared + IMU)
  • RealSense T265 (fisheye stereo + IMU)
Learn more about RealSense examples →

Running Examples

1. Build the library

First, ensure ORB-SLAM3 is compiled:
cd ORB_SLAM3
./build.sh
This creates executables in the Examples/ folder.
2. Download or prepare data

For dataset examples, download the sequence data. For live cameras, connect your device via USB.
3. Run the example

Execute the appropriate program with required arguments:
./Examples/<Type>/<executable> Vocabulary/ORBvoc.txt <settings.yaml> <data_path>
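As a concrete illustration, a monocular EuRoC run typically looks like the command below (the dataset location under `~/Datasets` is an assumption; note that the EuRoC examples also take a timestamps file and an optional output name in addition to the generic arguments shown above):

```shell
# Monocular SLAM on EuRoC sequence MH01.
# ~/Datasets/EuRoC/MH01 is an assumed download location.
./Examples/Monocular/mono_euroc \
    Vocabulary/ORBvoc.txt \
    Examples/Monocular/EuRoC.yaml \
    ~/Datasets/EuRoC/MH01 \
    Examples/Monocular/EuRoC_TimeStamps/MH01.txt \
    dataset-MH01_mono
```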

Expected Outputs

When running examples, ORB-SLAM3 produces several kinds of output.
A Pangolin viewer window displays:
  • Current frame with tracked features
  • 3D map points
  • Camera trajectory
  • Keyframes
Saved trajectories in TUM or EuRoC format:
  • CameraTrajectory.txt - All frame poses
  • KeyFrameTrajectory.txt - Keyframe poses only
Format (TUM):
timestamp tx ty tz qx qy qz qw
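Each line holds a timestamp, a translation (tx, ty, tz), and an orientation quaternion with the scalar component last (qx, qy, qz, qw). A minimal Python sketch for reading one such line (the helper name and the sample pose values are illustrative, not taken from a real run):

```python
def parse_tum_line(line):
    """Split one TUM-format trajectory line into its three parts:
    timestamp, translation [tx, ty, tz], quaternion [qx, qy, qz, qw]."""
    fields = [float(x) for x in line.split()]
    timestamp = fields[0]
    translation = fields[1:4]   # tx, ty, tz (metres)
    quaternion = fields[4:8]    # qx, qy, qz, qw (scalar last)
    return timestamp, translation, quaternion

# Example with a fabricated pose line:
ts, t, q = parse_tum_line("1403636579.76 4.688 -1.786 0.783 0.534 -0.153 -0.827 0.082")
```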
Terminal displays:
  • Tracking status (SLAM, Tracking Only, Lost)
  • Frame processing times
  • Number of tracked features
  • Map statistics
When compiled with REGISTER_TIMES:
  • ExecTimeMean.txt - Execution time statistics
  • Per-module timing breakdowns

Evaluation Tools

ORB-SLAM3 includes scripts for trajectory evaluation:

EuRoC Evaluation

./euroc_eval_examples
Processes all EuRoC sequences and computes RMS ATE (Absolute Trajectory Error) against ground truth.

TUM-VI Evaluation

./tum_vi_eval_examples
Evaluates drift at sequence end points where ground truth is available.

Python Evaluation Tools

The library includes Python scripts that align an estimated trajectory to the ground truth and compute the error:
python evaluation/evaluate_ate_scale.py <groundtruth.txt> <estimated.txt>
Evaluation scripts require Python with NumPy installed.
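The error metric these scripts report can be sketched as follows. This is a simplified sketch: it assumes the two trajectories are already time-associated and aligned, whereas `evaluate_ate_scale.py` also estimates a similarity transform (rotation, translation, and scale) before computing the error. The function name `rms_ate` is illustrative.

```python
import numpy as np

def rms_ate(gt, est):
    """RMS Absolute Trajectory Error between ground-truth and estimated
    positions (both N x 3 arrays of x, y, z in metres), assuming the
    poses are already time-associated and aligned."""
    errors = np.linalg.norm(gt - est, axis=1)   # per-pose translation error
    return float(np.sqrt(np.mean(errors ** 2)))

# Toy example: estimate offset by a constant 0.1 m in z -> RMS ATE of ~0.1 m.
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
est = np.array([[0.0, 0.0, 0.1], [1.0, 0.0, 0.1], [2.0, 0.0, 0.1]])
ate = rms_ate(gt, est)
```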

Common Arguments

Most examples follow this pattern:
./<executable> <vocabulary> <settings> <sequence_path> [output_name]
  • vocabulary - Path to ORB vocabulary file (usually Vocabulary/ORBvoc.txt)
  • settings - YAML configuration file with camera calibration and parameters
  • sequence_path - Path to dataset sequence folder
  • output_name - (Optional) Prefix for output trajectory files

Performance Tips

Hardware

  • Use a powerful CPU (i7 or better recommended)
  • Run on native Linux for best performance
  • Close unnecessary applications

Configuration

  • Adjust ORBextractor.nFeatures for speed/accuracy trade-off
  • Use image downsampling for faster processing
  • Disable viewer for maximum speed
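For instance, the ORB extractor block of a settings YAML looks roughly like this (the values shown are typical of the shipped EuRoC configuration, but treat them as illustrative rather than tuned recommendations):

```yaml
# ORB extractor parameters (illustrative values)
ORBextractor.nFeatures: 1000    # fewer features -> faster, but less robust tracking
ORBextractor.scaleFactor: 1.2   # scale between image pyramid levels
ORBextractor.nLevels: 8         # number of pyramid levels
ORBextractor.iniThFAST: 20      # initial FAST corner threshold
ORBextractor.minThFAST: 7       # fallback threshold in low-texture regions
```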

Next Steps

EuRoC Dataset

Run examples with pinhole cameras and IMU

TUM-VI Dataset

Run examples with fisheye cameras

KITTI Dataset

Process autonomous driving sequences

RealSense Cameras

Real-time SLAM with live cameras
