Codex-LB provides OpenAI-compatible endpoints that work with any client that supports the OpenAI API. This includes popular CLI tools, IDEs, SDKs, and custom applications.

Supported Endpoints

Codex-LB exposes two main endpoint types:

OpenAI v1

Standard OpenAI-compatible /v1 endpoints: http://127.0.0.1:2455/v1

Codex Backend API

Codex-specific /backend-api/codex endpoints: http://127.0.0.1:2455/backend-api/codex

Supported Clients

Codex-LB works seamlessly with these popular clients:
| Client | Endpoint | Documentation |
| --- | --- | --- |
| Codex CLI | http://127.0.0.1:2455/backend-api/codex | Setup Guide |
| OpenCode | http://127.0.0.1:2455/v1 | Setup Guide |
| OpenClaw | http://127.0.0.1:2455/v1 | Setup Guide |
| OpenAI Python SDK | http://127.0.0.1:2455/v1 | Setup Guide |
| Any OpenAI-compatible client | http://127.0.0.1:2455/v1 | API Reference |
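The table above reduces to two base URLs on the same host. As an illustrative sketch (client names and the endpoint mapping are taken directly from the table; this is not an official client API):

```python
from urllib.parse import urljoin

BASE_URL = "http://127.0.0.1:2455"

# Endpoint each client from the table above should target.
CLIENT_ENDPOINTS = {
    "Codex CLI": urljoin(BASE_URL, "/backend-api/codex"),
    "OpenCode": urljoin(BASE_URL, "/v1"),
    "OpenClaw": urljoin(BASE_URL, "/v1"),
    "OpenAI Python SDK": urljoin(BASE_URL, "/v1"),
}

for client, endpoint in CLIENT_ENDPOINTS.items():
    print(f"{client}: {endpoint}")
```

Everything except the Codex CLI speaks the standard OpenAI-compatible /v1 surface.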

Authentication

API key authentication is disabled by default. When disabled, any client can connect without credentials.
To enable API key authentication, go to Settings → API Key Auth in the dashboard. See API Key Authentication for details.
When API key auth is enabled:
  1. Create an API key in the dashboard (API Keys → Create)
  2. Configure your client to pass the key as a Bearer token:
    Authorization: Bearer sk-clb-...
    
  3. The key is shown only once at creation — save it securely
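However your client is configured, the key ultimately travels in the `Authorization` header of each request. A minimal sketch, assuming the key is stored in a hypothetical `CODEX_LB_API_KEY` environment variable:

```python
import os

# Hypothetical environment variable name; the fallback is a
# placeholder, not a real key.
api_key = os.environ.get("CODEX_LB_API_KEY", "sk-clb-placeholder")

# Codex-LB expects the key as a standard Bearer token.
headers = {"Authorization": f"Bearer {api_key}"}
print(headers["Authorization"])
```

Reading the key from the environment keeps it out of config files and shell history.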

Configuration Pattern

All clients follow a similar setup pattern:
1. Set the base URL

   Point the client at Codex-LB’s endpoint:
     • /v1 for OpenAI-compatible clients
     • /backend-api/codex for Codex CLI

2. Configure authentication (optional)

   If API key auth is enabled, provide your key:
     • Via an environment variable (recommended)
     • Directly in config (less secure)

3. Select your model

   Use any model available in your Codex-LB instance:
     • gpt-5.3-codex
     • gpt-5.3-codex-spark
     • Or any custom models you’ve configured
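Put together, the three steps map onto a single client configuration. A sketch (the `CODEX_LB_API_KEY` variable name is an assumption; the base URL and model names come from the steps above):

```python
import os

config = {
    # Step 1: base URL (/v1 for OpenAI-compatible clients)
    "base_url": "http://127.0.0.1:2455/v1",
    # Step 2: optional API key; None when auth is disabled
    "api_key": os.environ.get("CODEX_LB_API_KEY"),
    # Step 3: any model available in your Codex-LB instance
    "model": "gpt-5.3-codex",
}
print(config["base_url"], config["model"])
```

Most clients accept exactly these three values, differing only in where they read them from (flags, config file, or environment).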

Next Steps

Codex CLI

Configure the Codex CLI and IDE extensions

OpenCode

Set up OpenCode AI coding assistant

OpenClaw

Configure OpenClaw agent framework

Python SDK

Use the OpenAI Python SDK with Codex-LB

Troubleshooting

Ensure Codex-LB is running and accessible:
```shell
# Check if the service is running
curl http://127.0.0.1:2455/v1/models
```

If using Docker, verify port mappings:

```shell
docker ps | grep codex-lb
```
If you see 401 Unauthorized:
  1. Check if API key auth is enabled in Settings
  2. Verify your API key is valid and not expired
  3. Ensure the key is passed correctly as a Bearer token
  4. Check that the key has permission for the requested model
If a model isn’t available:
  1. Verify the model exists in your dashboard
  2. Check that at least one account supports the model
  3. If using API keys with model restrictions, ensure the key allows that model
  4. Run model sync: Settings → Sync Models
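To see exactly which models your instance exposes, parse the /v1/models response from the health check above. The payload below follows the OpenAI list format and is illustrative, not captured from a live instance:

```python
import json

# Illustrative /v1/models response (OpenAI list format).
raw = """
{"object": "list", "data": [
  {"id": "gpt-5.3-codex", "object": "model"},
  {"id": "gpt-5.3-codex-spark", "object": "model"}
]}
"""

# Collect the model IDs your instance currently serves.
available = [model["id"] for model in json.loads(raw)["data"]]
print(available)
```

If the model you requested is missing from this list, work through the checklist above before changing client settings.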
If you hit rate limits:
  1. Check your API key’s rate limits in the dashboard
  2. Review account limits and quotas
  3. Consider adding more accounts to the pool
  4. Adjust rate limits in API Keys → Edit
Need help? Check the Troubleshooting Guide for more common issues and solutions.
