This guide will help you set up rLLM on your system.

Prerequisites

rLLM requires Python >= 3.10 (Python 3.11 is required if using the tinker backend). Starting with v0.2.1, rLLM’s recommended dependency manager is uv. To install uv, run:
curl -LsSf https://astral.sh/uv/install.sh | sh

Install Python 3.11

Ensure that your system has a suitable installation of Python:
uv python install 3.11

Installation methods

Choose one of the following installation methods based on your needs.

Quick install with uv

Install rLLM directly from GitHub with a single command:
uv pip install "rllm[verl] @ git+https://github.com/rllm-org/rllm.git"
To use the tinker backend instead, replace verl with tinker:
uv pip install "rllm[tinker] @ git+https://github.com/rllm-org/rllm.git"

Optional dependencies

rLLM provides additional optional dependencies for specific agent domains and framework integrations:
  • sdk: LiteLLM proxy integration
  • smolagents: Hugging Face SmolAgents integration
  • strands: Strands agents framework
  • web: Web agents (BrowserGym, Selenium, Firecrawl)
  • code-tools: Sandboxed code execution (E2B, Together)
  • swe: Software engineering tools (Docker, Kubernetes, SWEBench)
  • verifiers: Verifiers integration for validation
  • dev: Development tools (pytest, ruff, mypy, mkdocs)
  • ui: UI components (httpx, python-multipart)
  • opentelemetry: OpenTelemetry SDK for observability
Install optional dependencies by adding them to the install command. For example, from a local clone of the repository (quote the extras so your shell does not expand the brackets):
# Install with the web and code-tools extras
uv pip install -e ".[verl,web,code-tools]"
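After installing extras, you can sanity-check that their packages resolved. This is a minimal sketch: the extra-to-package mapping below (selenium, httpx, pytest, docker) is an assumption based on the extras listed above, not the authoritative list — consult rLLM's pyproject.toml for the real dependency names.

```python
import importlib.util

# Hypothetical mapping from an extra to one representative package it pulls in;
# verify these names against rLLM's pyproject.toml before relying on them.
EXTRA_PACKAGES = {
    "web": "selenium",
    "ui": "httpx",
    "dev": "pytest",
    "swe": "docker",
}

def check_extras(mapping=EXTRA_PACKAGES):
    """Return {extra: bool} indicating whether its representative package is importable."""
    return {extra: importlib.util.find_spec(pkg) is not None
            for extra, pkg in mapping.items()}

if __name__ == "__main__":
    for extra, ok in check_extras().items():
        print(f"{extra}: {'installed' if ok else 'missing'}")
```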

Advanced: Editable verl installation

If you wish to make changes to the verl backend itself:
git clone https://github.com/volcengine/verl.git
cd verl
git checkout v0.6.1
uv pip install -e .
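To confirm that the editable checkout is the copy being resolved, you can query the installed distribution's metadata. A quick sketch, assuming the distribution name is verl (check with `uv pip list` if unsure):

```python
from importlib import metadata

def installed_version(dist_name="verl"):
    """Return the installed version string, or None if the distribution is absent."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None

if __name__ == "__main__":
    v = installed_version()
    print(f"verl: {v if v else 'not installed'}")
```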

Verify installation

Verify that rLLM is installed correctly:
python -c "import rllm; print(rllm.__version__)"
You should see the version number printed (e.g., 0.2.1).

Next steps

  • Quick start: Build your first math reasoning agent in 10 minutes
  • Core concepts: Learn about the key components of rLLM

Troubleshooting

If you encounter issues during installation:
  1. Check Python version: Ensure you’re using Python >= 3.10 (3.11 for tinker)
  2. GPU compatibility: Verify CUDA version matches PyTorch requirements
  3. Dependency conflicts: Use uv instead of pip for better dependency resolution
  4. GitHub issues: Search the project's GitHub issues page for known problems
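The version requirement in step 1 can also be checked programmatically — a minimal sketch:

```python
import sys

# rLLM requires Python >= 3.10; the tinker backend requires >= 3.11.
major, minor = sys.version_info[:2]
meets_base = (major, minor) >= (3, 10)
meets_tinker = (major, minor) >= (3, 11)

print(f"Python {major}.{minor}: "
      f"base requirement {'met' if meets_base else 'NOT met'}, "
      f"tinker requirement {'met' if meets_tinker else 'NOT met'}")
```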
For additional help, join our Discord community.
