Welcome to Codex-LB

Codex-LB is a load balancer and proxy for ChatGPT accounts. Pool multiple accounts, track usage, manage API keys, and monitor everything through a beautiful dashboard.

[Codex-LB Dashboard screenshot]

Why Codex-LB?

Codex-LB sits between your applications and ChatGPT accounts, providing enterprise-grade account management and usage controls without requiring direct OpenAI API access.

Account Pooling

Load balance requests across multiple ChatGPT accounts automatically

Usage Tracking

Track tokens, costs, and 28-day trends per account with real-time updates

API Key Management

Create API keys with per-key rate limits by token, cost, time window, and model

Dashboard Auth

Secure your dashboard with password authentication and optional TOTP

OpenAI Compatible

Works with Codex CLI, OpenCode, and any OpenAI-compatible client

Auto Model Sync

Available models are automatically fetched from upstream ChatGPT

How It Works

  1. Deploy — Run Codex-LB via Docker or uvx
  2. Add Accounts — Link your ChatGPT accounts through OAuth
  3. Configure Clients — Point any OpenAI-compatible client at Codex-LB
  4. Monitor — Track usage and manage everything from the dashboard
Codex-LB proxies requests to your ChatGPT accounts, not to OpenAI’s API. You’ll need active ChatGPT accounts to use this service.
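Step 3 above amounts to sending standard OpenAI-style requests to Codex-LB's address instead of OpenAI's. A minimal sketch using only Python's standard library, assuming Codex-LB listens on http://localhost:8000 and exposes the usual /v1/chat/completions path (both are assumptions; adjust to your deployment):

```python
import json
import urllib.request

# Assumed address of a local Codex-LB deployment (hypothetical; adjust to yours).
CODEX_LB_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat completion request aimed at Codex-LB."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        CODEX_LB_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # A Codex-LB API key travels in the same Authorization header
            # an OpenAI client would use.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending is left out here; urllib.request.urlopen(req) would perform the call.
req = build_chat_request("sk-example", "your-model-id", "Hello!")
```

Because the request shape is plain OpenAI chat-completions JSON, anything that can speak that format can sit on the client side.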

Key Features

Account Management

Add multiple ChatGPT accounts and let Codex-LB distribute requests intelligently. Monitor each account’s token usage, cost, and health status from a unified dashboard.

Usage Analytics

Real-time tracking of:
  • Token consumption per account
  • Cost calculations with 28-day trends
  • Request logs with model and endpoint details
  • Per-key usage breakdowns

Rate Limiting

Create API keys with fine-grained controls:
  • Token limits per day/week/month
  • Cost limits with currency tracking
  • Model restrictions
  • Automatic key expiration
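Conceptually, each of these limits is a counter checked against a cap before a request is forwarded. A simplified fixed-window sketch of the idea (not Codex-LB's implementation; names are illustrative):

```python
class WindowLimit:
    """Fixed-window counter: allow at most `cap` units per window."""

    def __init__(self, cap, window_seconds):
        self.cap = cap
        self.window = window_seconds
        self.window_start = 0.0
        self.used = 0

    def allow(self, amount, now):
        # Reset the counter once the current window has elapsed.
        if now - self.window_start >= self.window:
            self.window_start = now
            self.used = 0
        if self.used + amount > self.cap:
            return False  # over the per-window cap: reject the request
        self.used += amount
        return True

# Example: a 1000-tokens-per-day limit for one API key.
daily_tokens = WindowLimit(cap=1000, window_seconds=86400)
```

The same shape works for cost caps (units are currency instead of tokens), and model restrictions are just a membership check before the counter is consulted.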

Security

  • Optional API key authentication for all proxy endpoints
  • Dashboard authentication with password + TOTP
  • IP-based firewall rules
  • Encrypted token storage

Supported Clients

Codex-LB works with any OpenAI-compatible client:

Codex CLI

Configure your Codex CLI to use Codex-LB as the backend

OpenCode

Set up OpenCode with Codex-LB as a custom provider

OpenClaw

Add Codex-LB as a model provider in OpenClaw

OpenAI SDK

Use the official OpenAI Python/Node SDK with Codex-LB
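With the official SDK, the setup comes down to overriding the base URL. A sketch for the Python SDK, assuming Codex-LB listens on http://localhost:8000 with the standard /v1 prefix (adjust both to your deployment); the import is guarded so the sketch degrades gracefully when the SDK isn't installed:

```python
try:
    from openai import OpenAI
except ImportError:  # SDK not installed; the sketch still loads
    OpenAI = None

# Assumed local Codex-LB address (hypothetical; adjust to yours).
CODEX_LB_BASE_URL = "http://localhost:8000/v1"

def make_client(api_key: str):
    """Return an OpenAI client routed through Codex-LB, or None if the SDK is absent."""
    if OpenAI is None:
        return None
    return OpenAI(base_url=CODEX_LB_BASE_URL, api_key=api_key)

# Usage would then be the normal SDK call shape, e.g.:
#   client = make_client("<your Codex-LB API key>")
#   client.chat.completions.create(model="...", messages=[...])
```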

Quick Start

Get up and running in under 2 minutes:

Quickstart Guide

Follow our quickstart to install Codex-LB and make your first request

Next Steps

Installation

Detailed installation instructions for Docker, uvx, and local development

Add Accounts

Learn how to add and manage ChatGPT accounts

Client Setup

Configure your favorite OpenAI-compatible client

Configuration

Explore environment variables and advanced settings