Overview
This guide will help you run your first ORB-SLAM3 example. We’ll use the monocular SLAM example with the TUM RGB-D dataset, one of the simplest configurations to get started with.

Before proceeding, make sure you have installed ORB-SLAM3 and verified that the compilation was successful.
Quick Start with TUM Dataset
Download a Dataset
Download a sample sequence from the TUM RGB-D benchmark — the examples below use rgbd_dataset_freiburg1_xyz — and extract it, for example under ~/Datasets. This sequence contains 798 frames of a camera moving along the x, y, and z axes.
Run Monocular SLAM
Execute the monocular SLAM example from the ORB-SLAM3 root directory (the invocation below follows the standard mono_tum usage of vocabulary, settings, then sequence path):

`./Examples/Monocular/mono_tum Vocabulary/ORBvoc.txt Examples/Monocular/TUM1.yaml ~/Datasets/rgbd_dataset_freiburg1_xyz`

Command breakdown:
- Vocabulary/ORBvoc.txt: the ORB vocabulary used for place recognition
- Examples/Monocular/TUM1.yaml: camera calibration and SLAM parameters
- ~/Datasets/rgbd_dataset_freiburg1_xyz: path to the dataset
View the Results
You should see two windows:
- ORB-SLAM3 Viewer: 3D visualization showing:
  - Green points: current map points
  - Red boxes: keyframes
  - Blue line: current camera pose
  - Camera trajectory in real-time
- Frame Viewer: shows the current frame with:
  - Tracked ORB features
  - Current tracking state
  - Number of tracked features
The system needs a few frames to initialize. Look for the message “System initialized” in the terminal.
Understanding the Output
During execution, ORB-SLAM3 prints status information to the terminal, such as the current tracking state and map events. When the sequence finishes, the keyframe trajectory is saved to KeyFrameTrajectory.txt in TUM format.
Understanding the System API
Here’s how the monocular example initializes and uses ORB-SLAM3 (from Examples/Monocular/mono_tum.cc:50):
System Constructor
The System constructor (from include/System.h:105) takes the path to the ORB vocabulary file, the path to the YAML settings file, the sensor type, and a flag that enables or disables the viewer.
Sensor Types
Available sensor types (from include/System.h:87-94):
Tracking Methods
Depending on your sensor configuration, use the appropriate tracking method: TrackMonocular, TrackStereo, or TrackRGBD. Each takes the current image(s) and a timestamp and returns the estimated camera pose as a Sophus::SE3f transformation (empty if tracking fails).
Other Examples
Stereo SLAM (EuRoC Dataset)
The stereo examples (e.g. Examples/Stereo/stereo_euroc) follow the same pattern as the monocular one, taking the vocabulary, a EuRoC settings file, and the sequence path.
RGB-D SLAM
RGB-D examples require association files that pair RGB images with depth images; these are provided in Examples/RGB-D/associations/.

Using Your Own Camera
To use ORB-SLAM3 with your own camera:

Calibrate Your Camera
Follow the instructions in Calibration_Tutorial.pdf to calibrate your camera and create a YAML configuration file.

Modify an Example
Copy and modify one of the provided examples (e.g., Examples/Monocular/mono_tum.cc) to read from your camera source instead of from files.

Performance Tips
Optimize for Real-Time Performance
- Use a powerful computer (Intel i7 or equivalent)
- Reduce image resolution if needed (configure in YAML file)
- Disable the viewer for maximum performance: set the 4th parameter to false in the System constructor
- Compile in Release mode (default in build.sh)
Improve Tracking Robustness
- Ensure good lighting conditions
- Avoid rapid camera movements during initialization
- Include visual texture in the scene (avoid blank walls)
- For IMU configurations, keep camera and IMU timestamps synchronized
Debug Tracking Issues
- Check the Frame Viewer for the number of tracked features (should be > 30)
- Look for “Tracking Lost” messages in the terminal
- Verify camera calibration is correct
- Enable verbose output by setting the verbosity level
Saving and Loading Results
ORB-SLAM3 provides several methods to save trajectory data (from include/System.h:143-168), including SaveTrajectoryTUM, SaveKeyFrameTrajectoryTUM, SaveTrajectoryEuRoC, and SaveTrajectoryKITTI. Call them after Shutdown(), as mono_tum.cc does.
Next Steps
Now that you’ve run your first example, explore more advanced features:

API Reference
Learn about the complete System class API
Configuration
Understand the YAML configuration files
Examples
Explore examples for all sensor configurations
Datasets
Download and use popular SLAM datasets