
Overview

LiteLLM can be configured entirely through environment variables, covering provider API keys, endpoints, and runtime settings.
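As a minimal sketch (the key value is a placeholder), setting a variable in-process is equivalent to an `export` in the shell, since LiteLLM reads credentials from the environment at call time:

```python
import os

# LiteLLM reads provider credentials from the environment when a call is
# made, so setting a variable in-process is equivalent to `export`.
os.environ["OPENAI_API_KEY"] = "sk-placeholder"

# litellm.completion(model="gpt-4", messages=[...]) would now authenticate
# with this key; the call itself is omitted to keep the sketch offline.
```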

Provider API Keys

OpenAI

OPENAI_API_KEY
string
OpenAI API key.
export OPENAI_API_KEY="sk-..."
OPENAI_ORGANIZATION
string
OpenAI organization ID.

Azure OpenAI

AZURE_API_KEY
string
Azure OpenAI API key.
AZURE_API_BASE
string
Azure OpenAI endpoint.
export AZURE_API_BASE="https://your-endpoint.openai.azure.com/"
AZURE_API_VERSION
string
Azure OpenAI API version.
export AZURE_API_VERSION="2024-02-01"

Anthropic

ANTHROPIC_API_KEY
string
Anthropic API key for Claude models.
export ANTHROPIC_API_KEY="sk-ant-..."

Google (Vertex AI / AI Studio)

GOOGLE_APPLICATION_CREDENTIALS
string
Path to Google Cloud service account JSON file.
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
VERTEXAI_PROJECT
string
Google Cloud project ID.
VERTEXAI_LOCATION
string
Vertex AI location (e.g., us-central1).
GEMINI_API_KEY
string
Google AI Studio API key.

AWS Bedrock

AWS_ACCESS_KEY_ID
string
AWS access key ID.
AWS_SECRET_ACCESS_KEY
string
AWS secret access key.
AWS_REGION_NAME
string
AWS region (e.g., us-east-1).
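Unlike single-key providers, Bedrock needs all three variables together. A minimal in-process sketch (credential values are placeholders; profile-based auth is covered under Provider-Specific Variables below):

```python
import os

# Standard AWS credentials plus the region variable LiteLLM expects.
os.environ["AWS_ACCESS_KEY_ID"] = "AKIA-placeholder"
os.environ["AWS_SECRET_ACCESS_KEY"] = "secret-placeholder"
os.environ["AWS_REGION_NAME"] = "us-east-1"
```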

Cohere

COHERE_API_KEY
string
Cohere API key.
export COHERE_API_KEY="..."

Hugging Face

HUGGINGFACE_API_KEY
string
Hugging Face API token.

Replicate

REPLICATE_API_KEY
string
Replicate API token.

Together AI

TOGETHERAI_API_KEY
string
Together AI API key.

Groq

GROQ_API_KEY
string
Groq API key.

Mistral

MISTRAL_API_KEY
string
Mistral AI API key.

Perplexity

PERPLEXITYAI_API_KEY
string
Perplexity AI API key.

Anyscale

ANYSCALE_API_KEY
string
Anyscale API key.

Voyage AI

VOYAGE_API_KEY
string
Voyage AI API key (embeddings).

Databricks

DATABRICKS_API_KEY
string
Databricks API key.
DATABRICKS_API_BASE
string
Databricks workspace URL.

Cloudflare

CLOUDFLARE_API_KEY
string
Cloudflare Workers AI API key.
CLOUDFLARE_ACCOUNT_ID
string
Cloudflare account ID.

LiteLLM Settings

Core Settings

LITELLM_LOG
string
default:"INFO"
Log level. Options: DEBUG, INFO, WARNING, ERROR
export LITELLM_LOG="DEBUG"
LITELLM_LOCAL_MODEL_COST_MAP
string
default:"true"
Use local cost map for pricing.
LITELLM_DROP_PARAMS
string
default:"false"
Drop unsupported parameters instead of raising errors.
export LITELLM_DROP_PARAMS="true"

Caching

REDIS_HOST
string
Redis hostname for caching.
REDIS_PORT
string
default:"6379"
Redis port.
REDIS_PASSWORD
string
Redis password.
REDIS_URL
string
Complete Redis connection URL.
export REDIS_URL="redis://:password@localhost:6379"

Proxy Server Settings

Authentication

LITELLM_MASTER_KEY
string
Master admin key for proxy.
export LITELLM_MASTER_KEY="sk-1234"
LITELLM_SALT_KEY
string
Salt key for hashing API keys.
UI_USERNAME
string
Admin UI username.
UI_PASSWORD
string
Admin UI password.

Database

DATABASE_URL
string
PostgreSQL connection string.
export DATABASE_URL="postgresql://user:pass@localhost:5432/litellm"
STORE_MODEL_IN_DB
string
default:"false"
Store model configurations in database.
export STORE_MODEL_IN_DB="true"

Server Configuration

PORT
string
default:"4000"
Proxy server port.
export PORT="8000"
HOST
string
default:"0.0.0.0"
Proxy server host.
NUM_WORKERS
string
default:"1"
Number of Uvicorn workers.
export NUM_WORKERS="4"
SSL_KEYFILE
string
Path to SSL key file.
SSL_CERTFILE
string
Path to SSL certificate file.

Observability

LANGFUSE_PUBLIC_KEY
string
Langfuse public key for logging.
LANGFUSE_SECRET_KEY
string
Langfuse secret key.
LANGFUSE_HOST
string
Langfuse host URL.
LUNARY_PUBLIC_KEY
string
Lunary public key.
HELICONE_API_KEY
string
Helicone API key.
SENTRY_DSN
string
Sentry DSN for error tracking.
SLACK_WEBHOOK_URL
string
Slack webhook for alerts.

Budget & Rate Limiting

LITELLM_MAX_BUDGET
string
Global budget limit in USD.
export LITELLM_MAX_BUDGET="1000.0"
LITELLM_BUDGET_DURATION
string
Budget reset period.
export LITELLM_BUDGET_DURATION="30d"

Usage Examples

Basic Setup

# OpenAI
export OPENAI_API_KEY="sk-..."

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# Start LiteLLM
python your_app.py

Azure OpenAI Setup

export AZURE_API_KEY="your-key"
export AZURE_API_BASE="https://your-endpoint.openai.azure.com/"
export AZURE_API_VERSION="2024-02-01"

python your_app.py

Proxy with Database

export LITELLM_MASTER_KEY="sk-1234"
export DATABASE_URL="postgresql://user:pass@localhost:5432/litellm"
export STORE_MODEL_IN_DB="true"

litellm --config config.yaml

Full Production Setup

# API Keys
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export AZURE_API_KEY="your-azure-key"
export AZURE_API_BASE="https://your-endpoint.openai.azure.com/"

# Proxy Settings
export LITELLM_MASTER_KEY="sk-1234"
export DATABASE_URL="postgresql://user:pass@localhost:5432/litellm"
export PORT="4000"
export NUM_WORKERS="4"

# Caching
export REDIS_URL="redis://:password@localhost:6379"

# Observability
export LANGFUSE_PUBLIC_KEY="pk-..."
export LANGFUSE_SECRET_KEY="sk-..."
export SENTRY_DSN="https://[email protected]/..."

# Budget
export LITELLM_MAX_BUDGET="10000.0"
export LITELLM_BUDGET_DURATION="30d"

# Logging
export LITELLM_LOG="INFO"

# Start proxy
litellm --config config.yaml

Docker Environment

docker-compose.yml

version: '3.8'

services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    ports:
      - "4000:4000"
    environment:
      # API Keys
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      
      # Proxy Settings
      - LITELLM_MASTER_KEY=${LITELLM_MASTER_KEY}
      - DATABASE_URL=postgresql://litellm:password@postgres:5432/litellm
      
      # Redis
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      
      # Observability
      - LANGFUSE_PUBLIC_KEY=${LANGFUSE_PUBLIC_KEY}
      - LANGFUSE_SECRET_KEY=${LANGFUSE_SECRET_KEY}
      
    volumes:
      - ./config.yaml:/app/config.yaml
    command: ["--config", "/app/config.yaml", "--port", "4000"]
    depends_on:
      - postgres
      - redis
  
  postgres:
    image: postgres:15
    environment:
      - POSTGRES_DB=litellm
      - POSTGRES_USER=litellm
      - POSTGRES_PASSWORD=password
    volumes:
      - postgres_data:/var/lib/postgresql/data
  
  redis:
    image: redis:7
    volumes:
      - redis_data:/data

volumes:
  postgres_data:
  redis_data:

.env file

# .env
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
LITELLM_MASTER_KEY=sk-1234
LANGFUSE_PUBLIC_KEY=pk-...
LANGFUSE_SECRET_KEY=sk-...

Kubernetes ConfigMap

apiVersion: v1
kind: ConfigMap
metadata:
  name: litellm-config
data:
  LITELLM_LOG: "INFO"
  PORT: "4000"
  HOST: "0.0.0.0"
  NUM_WORKERS: "4"
  STORE_MODEL_IN_DB: "true"
---
apiVersion: v1
kind: Secret
metadata:
  name: litellm-secrets
type: Opaque
stringData:
  OPENAI_API_KEY: "sk-..."
  ANTHROPIC_API_KEY: "sk-ant-..."
  LITELLM_MASTER_KEY: "sk-1234"
  DATABASE_URL: "postgresql://..."
  REDIS_URL: "redis://..."

Loading from .env File

from dotenv import load_dotenv
import litellm

# Load environment variables from .env file
load_dotenv()

# Now use LiteLLM
response = litellm.completion(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}]
)

Provider-Specific Variables

Some providers require additional environment variables:

AWS Bedrock with Profile

export AWS_PROFILE="my-profile"
export AWS_REGION_NAME="us-east-1"

Vertex AI with ADC

# Use Application Default Credentials
gcloud auth application-default login

export VERTEXAI_PROJECT="my-project"
export VERTEXAI_LOCATION="us-central1"

Azure with Managed Identity

export AZURE_USE_MANAGED_IDENTITY="true"
export AZURE_API_BASE="https://your-endpoint.openai.azure.com/"

Security Best Practices

  1. Never commit .env files to version control
  2. Use secret managers in production (AWS Secrets Manager, Kubernetes Secrets, etc.)
  3. Rotate keys regularly
  4. Use least-privilege access for cloud IAM roles
  5. Encrypt sensitive environment variables in CI/CD
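A small fail-fast startup check complements these practices: surface missing secrets at boot instead of at the first authenticated request. This is a stdlib-only sketch; the variable names are examples to adjust to your deployment:

```python
import os

# Fail fast if required secrets are absent from the environment,
# rather than discovering the gap mid-request.
def check_required_env(required):
    missing = [name for name in required if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"missing environment variables: {missing}")

os.environ.setdefault("LITELLM_MASTER_KEY", "sk-placeholder")
check_required_env(["LITELLM_MASTER_KEY"])  # passes once the key is set
```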
