The Drako proxy is an HTTP reverse proxy that sits between your agents and their LLM providers. It intercepts every request and response to apply your .drako.yaml governance policy in real time — without modifying agent source code.
The proxy requires the proxy extra: pip install 'drako[proxy]'

Subcommands

Command              Description
drako proxy start    Start the proxy server
drako proxy stop     Stop the background daemon
drako proxy status   Check whether the proxy is running

drako proxy start

Start the enforcement proxy server.
drako proxy start [OPTIONS]
--port (integer, default: 8990)
Port to listen on.
--host (string, default: 0.0.0.0)
Host address to bind to.
--config (string, default: .drako.yaml)
Path to the Drako configuration file. Run drako init first if this file does not exist.
--daemon (flag)
Fork the proxy to the background as a daemon process. The PID is saved to .drako/.proxy.pid. Use drako proxy stop to terminate it.
Daemon mode is not supported on Windows; there the proxy runs in the foreground instead.

drako proxy stop

Send SIGTERM to the background proxy daemon and remove the PID file.
drako proxy stop
No flags. Reads the PID from .drako/.proxy.pid.
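The stop logic amounts to a few lines: read the PID, signal it, clean up. A sketch of that behavior, assuming the PID file layout described above (this is an illustration, not drako's actual implementation; the kill function is injectable so the logic can be exercised without a live daemon):

```python
import os
import signal
from pathlib import Path

def stop_daemon(pid_file: str = ".drako/.proxy.pid", kill=os.kill) -> int:
    """Send SIGTERM to the PID recorded in pid_file, then remove the file.

    Approximates what `drako proxy stop` does; the real CLI may differ.
    Raises FileNotFoundError if no daemon was started with --daemon.
    """
    path = Path(pid_file)
    pid = int(path.read_text().strip())
    kill(pid, signal.SIGTERM)   # ask the background daemon to shut down
    path.unlink()               # remove the now-stale PID file
    return pid
```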

drako proxy status

Check whether the proxy is running by calling its /health endpoint.
drako proxy status [OPTIONS]
--port (integer, default: 8990)
Port to query. Must match the port the proxy was started on.
When the proxy is healthy, the command prints governance level, configured targets, and total audit log entries.
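Since the status command is just an HTTP GET against /health, the same check is easy to script. A minimal equivalent using only the standard library (the JSON response shape is an assumption; only the endpoint path comes from this page):

```python
import json
import urllib.request

def health_url(port: int = 8990, host: str = "localhost") -> str:
    """Build the /health URL for a proxy listening on the given host/port."""
    return f"http://{host}:{port}/health"

def check_status(port: int = 8990) -> dict:
    """GET the proxy's health endpoint and decode the JSON body.

    Raises urllib.error.URLError if the proxy is not running.
    """
    with urllib.request.urlopen(health_url(port), timeout=5) as resp:
        return json.load(resp)

# check_status()  # e.g. in CI, to gate agent startup on a healthy proxy
```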

Environment variable configuration

Route your agents through the proxy by overriding the base URL for each provider:
# OpenAI
export OPENAI_BASE_URL=http://localhost:8990/openai/v1

# Anthropic
export ANTHROPIC_BASE_URL=http://localhost:8990/anthropic/v1
No other code changes are needed. The proxy forwards requests to the real provider and applies governance inline.
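The path prefix is what tells the proxy which upstream provider to forward to. A small helper that mirrors the base-URL convention above (a sketch; only the /openai/v1 and /anthropic/v1 prefixes shown on this page are assumed):

```python
def proxied_base_url(provider: str, host: str = "localhost", port: int = 8990) -> str:
    """Return the base URL that routes a provider's traffic through the proxy."""
    prefixes = {
        "openai": "/openai/v1",       # replaces https://api.openai.com/v1
        "anthropic": "/anthropic/v1", # replaces the Anthropic API base URL
    }
    return f"http://{host}:{port}{prefixes[provider]}"
```

For example, `proxied_base_url("openai")` yields the value to export as OPENAI_BASE_URL when the proxy runs locally on the default port.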
Variable             Default                     Description
OPENAI_BASE_URL      https://api.openai.com/v1   OpenAI-compatible endpoint
ANTHROPIC_BASE_URL   https://api.anthropic.com   Anthropic endpoint
DRAKO_API_KEY        (none)                      API key read by the proxy from the config or environment
DRAKO_ENDPOINT       https://api.getdrako.com    Drako backend for policy sync

Usage with OpenAI

# 1. Start the proxy (foreground)
drako proxy start

# 2. In a separate terminal, run your agent
export OPENAI_BASE_URL=http://localhost:8990/openai/v1
python my_agent.py
Or start as a daemon:
drako proxy start --daemon
export OPENAI_BASE_URL=http://localhost:8990/openai/v1
python my_agent.py
drako proxy stop
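The same flow can be scripted from Python when the agent is launched as a subprocess: set the override in the child's environment, nothing else changes. A sketch (the agent command is illustrative):

```python
import os
import subprocess

def run_agent_through_proxy(agent_cmd, port: int = 8990):
    """Run an agent with its OpenAI traffic routed through the local proxy.

    The agent's code is untouched; only OPENAI_BASE_URL in its
    environment differs from a direct-to-provider run.
    """
    env = dict(os.environ)
    env["OPENAI_BASE_URL"] = f"http://localhost:{port}/openai/v1"
    return subprocess.run(agent_cmd, env=env, check=True)

# run_agent_through_proxy(["python", "my_agent.py"])
```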

Docker deployment

FROM python:3.11-slim

RUN pip install 'drako[proxy]'

COPY .drako.yaml /app/.drako.yaml
WORKDIR /app

EXPOSE 8990

CMD ["drako", "proxy", "start", "--host", "0.0.0.0", "--port", "8990"]
# docker-compose.yml
services:
  drako-proxy:
    build: .
    ports:
      - "8990:8990"
    environment:
      - DRAKO_API_KEY=${DRAKO_API_KEY}
    volumes:
      - ./.drako.yaml:/app/.drako.yaml:ro

  my-agent:
    build: ./agent
    environment:
      - OPENAI_BASE_URL=http://drako-proxy:8990/openai/v1
    depends_on:
      - drako-proxy
Use drako proxy status after starting the proxy to confirm it loaded your config correctly before running your agents.
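In Compose, the same readiness check can be automated so the agent waits for a healthy proxy rather than merely a started container. A possible addition to the service definitions above, polling the /health endpoint (a sketch; the timing values are arbitrary, and python is used for the probe because python:3.11-slim ships no curl):

```yaml
services:
  drako-proxy:
    healthcheck:
      test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8990/health')"]
      interval: 15s
      timeout: 5s
      retries: 3

  my-agent:
    depends_on:
      drako-proxy:
        condition: service_healthy
```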
