
Choose your installation method

Codex-LB supports multiple installation methods depending on your use case.

  • Docker: recommended for production and quick testing
  • uvx: easiest for personal use and experimentation
  • Local Dev: required for contributing or customizing

Docker Installation

The recommended method for most users. Provides a fully isolated environment with all dependencies.

Basic Setup

# Create a volume for persistent data
docker volume create codex-lb-data

# Run Codex-LB
docker run -d --name codex-lb \
  -p 2455:2455 -p 1455:1455 \
  -v codex-lb-data:/var/lib/codex-lb \
  ghcr.io/soju06/codex-lb:latest
Data persistence: All data (accounts, API keys, usage stats) is stored in the codex-lb-data volume at /var/lib/codex-lb.
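To confirm the container came up cleanly, a quick sanity check with standard Docker commands:

```shell
# The container should be listed as "Up"
docker ps --filter name=codex-lb

# Inspect the first lines of startup output
docker logs codex-lb --tail 20
```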

With Environment Variables

Customize Codex-LB with environment variables:
docker run -d --name codex-lb \
  -p 2455:2455 -p 1455:1455 \
  -v codex-lb-data:/var/lib/codex-lb \
  -e CODEX_LB_DATABASE_URL="postgresql+asyncpg://user:pass@host:5432/db" \
  -e CODEX_LB_USAGE_REFRESH_INTERVAL_SECONDS=120 \
  ghcr.io/soju06/codex-lb:latest
See Environment Variables for all available options.

Using Docker Compose

For more complex setups, use Docker Compose:
docker-compose.yml
services:
  codex-lb:
    image: ghcr.io/soju06/codex-lb:latest
    ports:
      - "2455:2455"
      - "1455:1455"
    volumes:
      - codex-lb-data:/var/lib/codex-lb
    environment:
      - CODEX_LB_DATABASE_MIGRATE_ON_STARTUP=true
      - CODEX_LB_USAGE_REFRESH_ENABLED=true
    restart: unless-stopped

volumes:
  codex-lb-data:
Run it:
docker compose up -d

With PostgreSQL

For production deployments, use PostgreSQL instead of SQLite:
docker-compose.yml
services:
  codex-lb:
    image: ghcr.io/soju06/codex-lb:latest
    ports:
      - "2455:2455"
      - "1455:1455"
    volumes:
      - codex-lb-data:/var/lib/codex-lb
    environment:
      - CODEX_LB_DATABASE_URL=postgresql+asyncpg://codex_lb:codex_lb@postgres:5432/codex_lb
    depends_on:
      postgres:
        condition: service_healthy
    restart: unless-stopped

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: codex_lb
      POSTGRES_PASSWORD: codex_lb
      POSTGRES_DB: codex_lb
    volumes:
      - postgres-data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U codex_lb -d codex_lb"]
      interval: 10s
      timeout: 5s
      retries: 5
    restart: unless-stopped

volumes:
  codex-lb-data:
  postgres-data:
PostgreSQL is recommended for production deployments with high request volumes or multiple instances.

Container Management

Follow the container logs in real time:
docker logs codex-lb -f
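Beyond tailing logs, the usual Docker lifecycle commands apply. The upgrade sequence below assumes all state lives in the codex-lb-data volume (as in the setup above), so removing the container is safe:

```shell
# Stop, start, or restart the container
docker stop codex-lb
docker start codex-lb
docker restart codex-lb

# Upgrade to the latest image; data persists in the volume
docker pull ghcr.io/soju06/codex-lb:latest
docker stop codex-lb && docker rm codex-lb
docker run -d --name codex-lb \
  -p 2455:2455 -p 1455:1455 \
  -v codex-lb-data:/var/lib/codex-lb \
  ghcr.io/soju06/codex-lb:latest
```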

uvx Installation

The quickest way to run Codex-LB without Docker. Requires Python 3.13+.

Install and Run

uvx codex-lb
That’s it! Codex-LB will start on ports 2455 and 1455.
Data location: ~/.codex-lb/ — includes database, encryption keys, and logs.
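By default uvx resolves the latest published release on each run. To pin a release or interpreter, the standard uvx selectors work (the version number below is illustrative, not a known release):

```shell
# Run a specific release (example version)
uvx codex-lb@0.1.0

# Force a particular Python interpreter
uvx --python 3.13 codex-lb
```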

With Custom Settings

Pass environment variables when running:
CODEX_LB_DATABASE_URL="postgresql+asyncpg://..." uvx codex-lb
Or create a .env.local file in your working directory:
.env.local
CODEX_LB_DATABASE_URL=sqlite+aiosqlite:///~/.codex-lb/store.db
CODEX_LB_USAGE_REFRESH_INTERVAL_SECONDS=60
Then run:
uvx codex-lb

Stopping uvx

Press Ctrl+C in the terminal running Codex-LB.

Local Development

For contributors or users who want to customize Codex-LB.

Prerequisites

  • Python 3.13+
  • uv (Python package manager)
  • Bun (for frontend development)
  • Git

Clone and Install

1. Clone the repository

git clone https://github.com/Soju06/codex-lb.git
cd codex-lb

2. Install Python dependencies

uv sync

This creates a virtual environment at .venv/ and installs all dependencies.

3. Install frontend dependencies

cd frontend
bun install
cd ..

4. Build the frontend

cd frontend
bun run build
cd ..

This creates production assets in app/static/.

Development Servers

Run the backend and frontend separately for hot reloading:
uv run fastapi run app/main.py --reload
In development mode, the frontend runs on port 5173 and proxies API requests to the backend on port 2455.
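A sketch of running both dev servers side by side; `bun run dev` is assumed to be the Vite dev script defined in frontend/package.json (port 5173 is Vite's default, matching the note above):

```shell
# Terminal 1: backend with auto-reload on port 2455
uv run fastapi run app/main.py --reload

# Terminal 2: frontend dev server (assumed Vite script) on port 5173
cd frontend
bun run dev
```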

Docker Development

Alternatively, use Docker Compose for local development with hot reload:
docker compose watch
This watches for file changes and automatically syncs them to the running containers.

Environment Configuration

Copy the example environment file:
cp .env.example .env.local
Edit .env.local to customize settings. See Environment Variables for all options.

Database Migrations

Run migrations manually:
uv run codex-lb-db upgrade head
Create a new migration:
uv run alembic revision --autogenerate -m "Description of changes"
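Since migrations are managed with Alembic, its standard inspection and rollback commands also apply, for example:

```shell
# List the migration history
uv run alembic history

# Show the revision currently applied to the database
uv run alembic current

# Roll back the most recent migration
uv run alembic downgrade -1
```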

Running Tests

# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=app

# Run specific test file
uv run pytest tests/test_proxy.py

Data Storage

Codex-LB stores data in different locations depending on the installation method:
Method        Location             Contents
Docker        /var/lib/codex-lb/   Database, encryption keys, logs
uvx / Local   ~/.codex-lb/         Database, encryption keys, logs
Back up your data! The data directory contains:
  • Account tokens (encrypted)
  • API keys
  • Usage history
  • Configuration settings
Losing this directory means losing all accounts and settings.

Backing Up

# Stop the container
docker stop codex-lb

# Backup the volume
docker run --rm -v codex-lb-data:/data -v $(pwd):/backup alpine tar czf /backup/codex-lb-backup.tar.gz -C /data .

# Restart the container
docker start codex-lb
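For uvx or local installs there is no volume to export; archiving the data directory directly achieves the same thing. A minimal sketch (the mkdir is only a guard so the command succeeds even on a fresh install):

```shell
# Stop Codex-LB first (Ctrl+C in its terminal), then archive the data directory
mkdir -p "$HOME/.codex-lb"
tar czf codex-lb-backup.tar.gz -C "$HOME/.codex-lb" .
```

To restore, extract the archive back into ~/.codex-lb/ before starting Codex-LB again.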

Restoring

# Stop and remove existing container
docker stop codex-lb
docker rm codex-lb

# Remove old volume
docker volume rm codex-lb-data

# Create new volume
docker volume create codex-lb-data

# Restore backup
docker run --rm -v codex-lb-data:/data -v $(pwd):/backup alpine tar xzf /backup/codex-lb-backup.tar.gz -C /data

# Start container with restored data
docker run -d --name codex-lb \
  -p 2455:2455 -p 1455:1455 \
  -v codex-lb-data:/var/lib/codex-lb \
  ghcr.io/soju06/codex-lb:latest

Next Steps

Add accounts

Link your ChatGPT accounts via OAuth

Configure clients

Set up Codex CLI, OpenCode, or other clients

Environment variables

Customize timeouts, OAuth, database, and more

Production deployment

Best practices for running Codex-LB in production
