
Welcome to CLI Proxy API

CLI Proxy API is a proxy server that bridges CLI-based AI subscriptions (Gemini CLI, Claude Code, OpenAI Codex, and others) and standard AI API interfaces. It exposes OpenAI-, Gemini-, Claude-, and Codex-compatible endpoints, so you can use your existing CLI subscriptions with any tool or library built for those APIs.

Quick Start

Get up and running in 5 minutes with our quick start guide

Installation

Install and configure CLI Proxy API on your system

API Reference

Explore the complete API documentation

Go SDK

Embed the proxy in your Go applications

Key Features

Multi-Provider OAuth

Authenticate with Gemini, Claude, OpenAI Codex, Qwen, iFlow, and Antigravity using OAuth flows

Load Balancing

Distribute requests across multiple accounts with round-robin or fill-first strategies

Model Aliasing

Create custom model names and map unavailable models to alternatives

Streaming Support

Full support for streaming and non-streaming responses with function calling

Management API

Runtime configuration, quota monitoring, and log access via REST endpoints

Go SDK

Reusable library for embedding the proxy in your own applications

Use Cases

CLI Subscriptions with Any Client

Use your Gemini CLI, Claude Code, or OpenAI Codex subscriptions with tools like Cursor, Cline, Continue, or any OpenAI-compatible client.
# Start the proxy
./cliproxyapi

# Use with any OpenAI-compatible client
curl http://localhost:8317/v1/chat/completions \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-2.5-pro",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
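The same request can be issued from Python with only the standard library. This is a sketch, not an official client; the port 8317 and the placeholder API key mirror the curl example above:

```python
import json
import urllib.request

# Adjust to match your proxy configuration.
BASE_URL = "http://localhost:8317/v1/chat/completions"
API_KEY = "your-api-key"

payload = {
    "model": "gemini-2.5-pro",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    BASE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# With the proxy running, urllib.request.urlopen(req) returns an
# OpenAI-style chat completion response.
print(req.get_full_url())
```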

Multi-Account Management

Automatically distribute load across multiple accounts to maximize throughput and avoid rate limits.
# config.yaml - Round-robin across three Gemini accounts
routing:
  strategy: "round-robin"

# Authentication files managed in ~/.cli-proxy-api/
# - gemini_account1.json
# - gemini_account2.json  
# - gemini_account3.json
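For illustration, the difference between the two strategies can be sketched in Python. This mimics the observable behavior only; it is not the proxy's actual scheduler:

```python
from itertools import cycle

# Hypothetical account pool mirroring the auth files above.
accounts = ["gemini_account1", "gemini_account2", "gemini_account3"]

# Round-robin: each request goes to the next account in rotation.
rr = cycle(accounts)
round_robin_picks = [next(rr) for _ in range(4)]
print(round_robin_picks)  # wraps back to gemini_account1 on the 4th request

# Fill-first: keep using the first usable account, advancing only when it
# is exhausted (e.g. rate-limited or out of quota).
def fill_first(pool, exhausted):
    for account in pool:
        if account not in exhausted:
            return account
    return None  # every account is exhausted

print(fill_first(accounts, {"gemini_account1"}))
```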

Custom Model Routing

Map model names to different providers or create aliases for your workflow.
# Route unavailable models to alternatives
ampcode:
  model-mappings:
    - from: "claude-opus-4-5-20251101"
      to: "gemini-2.5-pro"
    - from: "gpt-5"
      to: "claude-sonnet-4"
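The effect of these mappings can be sketched as a simple lookup (illustrative only; the real resolution happens inside the proxy):

```python
# The same mappings as in the config fragment above.
MODEL_MAPPINGS = {
    "claude-opus-4-5-20251101": "gemini-2.5-pro",
    "gpt-5": "claude-sonnet-4",
}

def resolve_model(requested: str) -> str:
    """Return the mapped model, or the requested name unchanged."""
    return MODEL_MAPPINGS.get(requested, requested)

print(resolve_model("gpt-5"))           # mapped to an alternative
print(resolve_model("gemini-2.5-pro"))  # passes through untouched
```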

Architecture

CLI Proxy API acts as a translation layer between standard AI API formats (OpenAI, Claude, Gemini) and CLI-based authentication:
┌─────────────────┐
│  Your Client    │  (Cursor, Cline, custom app)
└────────┬────────┘
         │ OpenAI-compatible API
         ▼
┌─────────────────┐
│ CLI Proxy API   │  Translation & Routing
└────────┬────────┘
         │
    ┌────┴────┬────────┬────────┐
    ▼         ▼        ▼        ▼
  Gemini   Claude   Codex    Qwen
   (CLI)   (OAuth)  (OAuth)  (OAuth)
The proxy handles:
  • Authentication: OAuth flows, token refresh, multi-account management
  • Translation: Request/response format conversion between API standards
  • Routing: Load balancing, model aliasing, automatic failover
  • Management: Configuration hot-reload, quota tracking, logging
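As a rough sketch of the translation step (illustrative, not the proxy's actual converter), an OpenAI-style message list maps onto Gemini's `contents` format along these lines:

```python
def openai_to_gemini(request: dict) -> dict:
    """Convert an OpenAI-style chat request body to Gemini's shape.

    Gemini uses a "contents" list of {"role", "parts"} entries, with the
    assistant role renamed to "model" and text wrapped in parts.
    """
    contents = []
    for msg in request["messages"]:
        role = "model" if msg["role"] == "assistant" else "user"
        contents.append({"role": role, "parts": [{"text": msg["content"]}]})
    return {"contents": contents}

openai_req = {
    "model": "gemini-2.5-pro",
    "messages": [
        {"role": "user", "content": "Hello!"},
        {"role": "assistant", "content": "Hi there."},
    ],
}
print(openai_to_gemini(openai_req))
```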

Supported Providers

  • Gemini: Gemini CLI (via OAuth), AI Studio API keys, Vertex AI (service accounts and API keys)
  • Antigravity (OAuth)
  • Claude: Claude Code (via OAuth), official Claude API keys, custom Claude-compatible endpoints
  • OpenAI: Codex (via OAuth), device code flow, custom OpenAI-compatible endpoints
  • Qwen Code (via OAuth)
  • iFlow (via OAuth and Cookie)
  • Kimi (via OAuth)
  • Any OpenAI-compatible provider (OpenRouter, etc.)

Next Steps

Quick Start Guide

Follow our step-by-step tutorial to get started in minutes

Core Concepts

Understand authentication, providers, and routing strategies

Configuration

Learn how to configure the proxy for your needs

OAuth Setup

Authenticate with your provider accounts

Community & Support

CLI Proxy API is open source and licensed under MIT. Join our growing community:
  • GitHub: router-for-me/CLIProxyAPI
  • Issues: Report bugs or request features on GitHub
  • Stars: 15,500+ on GitHub
CLI Proxy API is actively maintained and regularly updated with support for new providers and features. Check the GitHub repository for the latest releases and updates.
