OpenCode uses environment variables for configuration, especially for API keys and provider credentials. This page documents all environment variables that OpenCode recognizes.

API provider credentials

These environment variables configure authentication for various AI providers.

Anthropic (Claude)

ANTHROPIC_API_KEY
string
API key for Anthropic’s Claude models.
Provider: Anthropic
Models: Claude 4, Claude 3.5, Claude 3.7, and all Claude model variants
Get your API key from the Anthropic Console.
export ANTHROPIC_API_KEY="sk-ant-..."
If this key is present and valid (and no higher-priority provider is configured), OpenCode defaults to Claude 4 Sonnet for all agents.

OpenAI

OPENAI_API_KEY
string
API key for OpenAI models.
Provider: OpenAI
Models: GPT-4.1, GPT-4.5, GPT-4o, O1, O3, O4, and all OpenAI variants
Get your API key from the OpenAI Platform.
export OPENAI_API_KEY="sk-..."
OpenCode prioritizes providers in this order: Copilot > Anthropic > OpenAI. If multiple keys are present, the highest-priority provider with a valid key is used by default (see Provider priority below).

Google Gemini

GEMINI_API_KEY
string
API key for Google Gemini models via Google AI Studio.
Provider: Google Gemini
Models: Gemini 2.5, Gemini 2.5 Flash, Gemini 2.0 Flash, Gemini 2.0 Flash Lite
Get your API key from Google AI Studio.
export GEMINI_API_KEY="AI..."
This is different from Google Cloud VertexAI. See VERTEXAI_PROJECT for VertexAI configuration.

GitHub Copilot

GITHUB_TOKEN
string
GitHub token with Copilot permissions for accessing GitHub Copilot models.
Provider: GitHub Copilot (Experimental)
Models: GPT-4o, GPT-4.1, Claude 3.5/3.7/4 Sonnet, O1, O3-mini, O4-mini, Gemini 2.0/2.5
Requirements:
  • Copilot Chat in the IDE enabled in your GitHub settings
  • One of:
    • VSCode GitHub Copilot chat extension
    • GitHub gh CLI
    • Neovim GitHub Copilot plugin (copilot.vim or copilot.lua)
    • GitHub token with copilot permissions
If using one of the above tools, authenticate the tool with your GitHub account. This creates a token at one of:
  • ~/.config/github-copilot/hosts.json
  • ~/.config/github-copilot/apps.json
  • $XDG_CONFIG_HOME/github-copilot/hosts.json
Alternatively, set the token explicitly:
export GITHUB_TOKEN="ghp_..."
Copilot support is currently experimental. Some features may not work as expected.
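If you are unsure whether one of these tools has already created a token, you can check the paths listed above directly. This is a sketch that simply looks for the files OpenCode reads; the XDG default of ~/.config is assumed when XDG_CONFIG_HOME is unset:

```shell
# find_copilot_token: print the first existing Copilot token file, if any
find_copilot_token() {
  for f in "$HOME/.config/github-copilot/hosts.json" \
           "$HOME/.config/github-copilot/apps.json" \
           "${XDG_CONFIG_HOME:-$HOME/.config}/github-copilot/hosts.json"; do
    if [ -f "$f" ]; then
      echo "$f"
      return 0
    fi
  done
  echo "no Copilot token file found"
  return 1
}

find_copilot_token
```

If no file is found, authenticate with one of the tools listed above or set GITHUB_TOKEN explicitly.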

Groq

GROQ_API_KEY
string
API key for Groq cloud inference.
Provider: Groq
Models: Llama 4, Qwen QwQ-32B, DeepSeek R1 Distill Llama 70B, Llama 3.3 70B
Get your API key from Groq Console.
export GROQ_API_KEY="gsk_..."

OpenRouter

OPENROUTER_API_KEY
string
API key for OpenRouter, which provides access to multiple AI models through a unified API.
Provider: OpenRouter
Models: Various models, referenced with the openrouter. prefix
Get your API key from OpenRouter.
export OPENROUTER_API_KEY="sk-or-..."

xAI (Grok)

XAI_API_KEY
string
API key for xAI’s Grok models.
Provider: xAI
Models: Grok-3 Beta, Grok-3 Mini, Grok-3 Fast variants
Get your API key from xAI Console.
export XAI_API_KEY="xai-..."

Cloud provider credentials

These environment variables configure access to cloud-based AI services.

AWS Bedrock

AWS Bedrock provides access to Claude models through Amazon’s infrastructure.
AWS_ACCESS_KEY_ID
string
AWS access key ID for authenticating with AWS services.
export AWS_ACCESS_KEY_ID="AKIA..."
AWS_SECRET_ACCESS_KEY
string
AWS secret access key corresponding to the access key ID.
export AWS_SECRET_ACCESS_KEY="..."
AWS_REGION
string
AWS region for Bedrock API calls.
export AWS_REGION="us-east-1"
If not set, OpenCode will check AWS_DEFAULT_REGION.
AWS_DEFAULT_REGION
string
Fallback AWS region if AWS_REGION is not set.
export AWS_DEFAULT_REGION="us-west-2"
AWS_PROFILE
string
AWS profile name to use from ~/.aws/credentials and ~/.aws/config.
export AWS_PROFILE="my-profile"
AWS_DEFAULT_PROFILE
string
Fallback AWS profile if AWS_PROFILE is not set.
export AWS_DEFAULT_PROFILE="default"
AWS_CONTAINER_CREDENTIALS_RELATIVE_URI
string
Used when running on ECS with task IAM roles. Set automatically by AWS.
This is typically set automatically when running in AWS ECS. You don’t need to set it manually.
AWS_CONTAINER_CREDENTIALS_FULL_URI
string
Alternative credentials endpoint for ECS tasks. Set automatically by AWS.
This is typically set automatically when running in AWS ECS. You don’t need to set it manually.
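The region fallback described above (AWS_REGION first, then AWS_DEFAULT_REGION) can be reproduced in the shell. This is a sketch mirroring the documented fallback; the region values are examples:

```shell
# AWS_REGION takes precedence; AWS_DEFAULT_REGION is the fallback
AWS_DEFAULT_REGION="us-west-2"
AWS_REGION="us-east-1"

# Resolve the effective Bedrock region the same way the docs describe
effective_region="${AWS_REGION:-${AWS_DEFAULT_REGION:-}}"
echo "effective region: $effective_region"
```

With both variables set as above, the effective region is us-east-1; unset AWS_REGION and it falls back to us-west-2.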

Azure OpenAI

Azure OpenAI Service provides OpenAI models through Microsoft Azure.
AZURE_OPENAI_ENDPOINT
string
Azure OpenAI service endpoint URL.
Format: https://<resource-name>.openai.azure.com
export AZURE_OPENAI_ENDPOINT="https://my-resource.openai.azure.com"
This variable must be set for Azure OpenAI to be enabled, even when using Entra ID authentication.
AZURE_OPENAI_API_KEY
string
API key for Azure OpenAI service. Optional when using Entra ID authentication.
export AZURE_OPENAI_API_KEY="..."
This can be omitted when using Azure Entra ID (formerly Azure AD) credentials for authentication.
AZURE_OPENAI_API_VERSION
string
Azure OpenAI API version to use.
Format: YYYY-MM-DD-preview or YYYY-MM-DD
export AZURE_OPENAI_API_VERSION="2025-04-01-preview"
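Putting the three Azure variables together, a minimal setup looks like this. The values are placeholders, and the API-key line can be omitted when authenticating with Entra ID:

```shell
# Minimal Azure OpenAI setup; the endpoint is required in all cases
export AZURE_OPENAI_ENDPOINT="https://my-resource.openai.azure.com"
export AZURE_OPENAI_API_KEY="..."                     # omit with Entra ID auth
export AZURE_OPENAI_API_VERSION="2025-04-01-preview"

# Sanity check: Azure OpenAI is only enabled when the endpoint is set
if [ -n "$AZURE_OPENAI_ENDPOINT" ]; then
  echo "Azure OpenAI enabled"
fi
```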

Google Cloud VertexAI

VertexAI provides access to Gemini models through Google Cloud Platform.
VERTEXAI_PROJECT
string
Google Cloud project ID for VertexAI.
export VERTEXAI_PROJECT="my-project-id"
Both VERTEXAI_PROJECT and VERTEXAI_LOCATION must be set for VertexAI to be enabled.
VERTEXAI_LOCATION
string
Google Cloud location/region for VertexAI API calls.
Common values: us-central1, us-east1, europe-west1, asia-northeast1
export VERTEXAI_LOCATION="us-central1"
GOOGLE_CLOUD_PROJECT
string
Alternative to VERTEXAI_PROJECT. Google Cloud project ID.
export GOOGLE_CLOUD_PROJECT="my-project-id"
GOOGLE_CLOUD_REGION
string
Alternative to VERTEXAI_LOCATION. Google Cloud region.
export GOOGLE_CLOUD_REGION="us-central1"
GOOGLE_CLOUD_LOCATION
string
Alternative to VERTEXAI_LOCATION and GOOGLE_CLOUD_REGION.
export GOOGLE_CLOUD_LOCATION="us-central1"
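Since several variables can supply the project and location, resolving them with shell fallbacks is one way to see which values take effect. This is a sketch; the exact precedence OpenCode applies among the alternatives is an assumption here, and the values are examples:

```shell
# Example values; in practice these come from your environment
VERTEXAI_PROJECT="my-project-id"
GOOGLE_CLOUD_REGION="us-central1"

# Assumed precedence: VERTEXAI_* first, then the GOOGLE_CLOUD_* alternatives
project="${VERTEXAI_PROJECT:-${GOOGLE_CLOUD_PROJECT:-}}"
location="${VERTEXAI_LOCATION:-${GOOGLE_CLOUD_REGION:-${GOOGLE_CLOUD_LOCATION:-}}}"
echo "project=$project location=$location"
```

Remember that both a project and a location must resolve to non-empty values for VertexAI to be enabled.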

Self-hosted models

LOCAL_ENDPOINT
string
Endpoint URL for self-hosted OpenAI-compatible model servers.
Use this to connect to locally running models or custom inference servers that implement the OpenAI API specification.
Compatible servers:
  • llama.cpp server
  • Ollama (with OpenAI compatibility)
  • vLLM
  • Text Generation Inference (TGI)
  • LocalAI
  • Any OpenAI-compatible API
export LOCAL_ENDPOINT="http://localhost:1235/v1"
Example with Ollama:
# Start the Ollama server (serves an OpenAI-compatible API on port 11434)
ollama serve
export LOCAL_ENDPOINT="http://localhost:11434/v1"
Example configuration:
{
  "agents": {
    "coder": {
      "model": "local.granite-3.3-2b-instruct@q8_0",
      "reasoningEffort": "high"
    }
  }
}

System configuration

SHELL
string
Default shell to use for the bash tool. If not specified, OpenCode defaults to /bin/bash.
This variable is typically set automatically by your system but can be overridden.
export SHELL="/bin/zsh"
You can also configure the shell in .opencode.json:
{
  "shell": {
    "path": "/bin/zsh",
    "args": ["-l"]
  }
}
OPENCODE_DEV_DEBUG
string
Enable development debug mode. When set to "true", OpenCode writes debug logs to a file instead of using the TUI logger.
Values: "true" or unset
export OPENCODE_DEV_DEBUG="true"
When enabled:
  • Logs written to <data-directory>/debug.log
  • Messages saved to <data-directory>/messages/
  • Useful for debugging OpenCode itself
This is primarily for OpenCode development and debugging. Regular users should use the --debug flag instead.

Configuration precedence

OpenCode loads configuration from multiple sources in this order (highest to lowest priority):
  1. Command-line flags (e.g., --debug)
  2. Environment variables (documented on this page)
  3. Local config file (.opencode.json in working directory)
  4. Global config file:
    • $HOME/.opencode.json
    • $XDG_CONFIG_HOME/opencode/.opencode.json
    • $HOME/.config/opencode/.opencode.json
  5. Default values

Provider priority

When multiple API keys are present, OpenCode selects the default provider in this order:
  1. GitHub Copilot (GITHUB_TOKEN)
  2. Anthropic (ANTHROPIC_API_KEY)
  3. OpenAI (OPENAI_API_KEY)
  4. Google Gemini (GEMINI_API_KEY)
  5. Groq (GROQ_API_KEY)
  6. OpenRouter (OPENROUTER_API_KEY)
  7. xAI (XAI_API_KEY)
  8. AWS Bedrock (AWS credentials)
  9. Azure OpenAI (AZURE_OPENAI_ENDPOINT)
  10. Google Cloud VertexAI (VERTEXAI_PROJECT + VERTEXAI_LOCATION)
You can override this default by explicitly configuring models in .opencode.json.
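The selection above can be sketched as a loop that reports the first provider whose key is present. This is a simplification: the cloud providers need multiple variables, and a real check would also validate the key, but it illustrates the priority order:

```shell
# Print the first provider (in priority order) whose env var is set
default_provider() {
  for pair in \
      "copilot:GITHUB_TOKEN" \
      "anthropic:ANTHROPIC_API_KEY" \
      "openai:OPENAI_API_KEY" \
      "gemini:GEMINI_API_KEY" \
      "groq:GROQ_API_KEY" \
      "openrouter:OPENROUTER_API_KEY" \
      "xai:XAI_API_KEY"; do
    name="${pair%%:*}"
    var="${pair#*:}"
    eval "val=\${$var:-}"
    if [ -n "$val" ]; then
      echo "$name"
      return 0
    fi
  done
  echo "none"
  return 1
}

default_provider
```

For example, with both ANTHROPIC_API_KEY and OPENAI_API_KEY set (and no GITHUB_TOKEN), this prints anthropic.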

Security best practices

Never commit API keys to version control. Always use environment variables or secure secret management systems.
  1. Use environment variables:
    # Add to ~/.bashrc, ~/.zshrc, or ~/.profile
    export ANTHROPIC_API_KEY="sk-ant-..."
    export OPENAI_API_KEY="sk-..."
    
  2. Use a secrets manager:
    # Example with 1Password CLI
    export OPENAI_API_KEY=$(op read "op://Private/OpenAI/api_key")
    
  3. Use project-specific env files (not committed):
    # .env (add to .gitignore)
    ANTHROPIC_API_KEY=sk-ant-...
    OPENAI_API_KEY=sk-...
    
    Then source before running. Use set -a so the plain KEY=value lines are exported
    (a bare source .env sets the variables but does not export them to OpenCode):
    set -a; source .env; set +a
    opencode
    
  4. Use shell-specific secure storage:
    # macOS Keychain
    security find-generic-password -s "openai-key" -w
    
  5. Restrict file permissions:
    chmod 600 ~/.opencode.json
    

Verification

To verify which provider OpenCode will use:
# Run with debug flag to see provider selection
opencode -d
The debug output will show:
  • Which API keys were found
  • Which provider was selected as default
  • Any configuration issues

Troubleshooting

No valid provider available

If you see “no valid provider available”, ensure:
  1. At least one API key environment variable is set
  2. The API key is valid and has not expired
  3. For cloud providers (AWS, Azure, VertexAI), all required variables are set

Provider not working as expected

Check:
  1. API key format is correct
  2. No typos in environment variable names
  3. Environment variables are exported (use export)
  4. Variables are set in the same shell session
# Verify environment variables are set
env | grep -E '(ANTHROPIC|OPENAI|GEMINI|GITHUB_TOKEN|GROQ|AWS|AZURE|VERTEXAI|LOCAL_ENDPOINT)'
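A variable that is set but not exported will not reach OpenCode's process. One way to tell the difference is to check what a child shell sees, since child processes only inherit exported variables. MY_TEST_KEY is a throwaway example name:

```shell
# Start clean so no inherited export attribute skews the result
unset MY_TEST_KEY

# A plain assignment is visible in this shell but not in child processes
MY_TEST_KEY="abc"
before=$(sh -c 'echo "${MY_TEST_KEY:-unset}"')
echo "before export: $before"

# After export, child processes (like OpenCode) can see it
export MY_TEST_KEY
after=$(sh -c 'echo "${MY_TEST_KEY:-unset}"')
echo "after export: $after"
```

The first check prints unset and the second prints abc, which is why the troubleshooting list above insists on export.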

Verifying cloud provider credentials

For AWS Bedrock:
# Test AWS credentials
aws sts get-caller-identity
For Azure OpenAI:
# Verify all required variables
echo $AZURE_OPENAI_ENDPOINT
echo $AZURE_OPENAI_API_KEY
echo $AZURE_OPENAI_API_VERSION
For Google Cloud VertexAI:
# Check authentication
gcloud auth list
gcloud config get-value project
