API provider credentials
These environment variables configure authentication for various AI providers.

Anthropic (Claude)
ANTHROPIC_API_KEY
API key for Anthropic’s Claude models.
Provider: Anthropic
Models: Claude 4, Claude 3.5, Claude 3.7, and all Claude model variants
Get your API key from the Anthropic Console.
If this key is present and valid, OpenCode will default to using Claude 4 Sonnet for all agents.
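For example, the key can be exported in your shell before launching OpenCode (the value below is a placeholder, not a real key):

```shell
# Make the Anthropic key available to OpenCode and other child processes.
# "sk-ant-example" is a placeholder value.
export ANTHROPIC_API_KEY="sk-ant-example"
```

Note that `export` matters: a variable assigned without it is not visible to child processes such as OpenCode.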
OpenAI
OPENAI_API_KEY
API key for OpenAI models.
Provider: OpenAI
Models: GPT-4.1, GPT-4.5, GPT-4o, O1, O3, O4, and all OpenAI variants
Get your API key from the OpenAI Platform.
OpenCode prioritizes providers in this order: Copilot > Anthropic > OpenAI. If multiple keys are present, the highest-priority provider is used by default.
Google Gemini
GEMINI_API_KEY
API key for Google Gemini models via Google AI Studio.
Provider: Google Gemini
Models: Gemini 2.5, Gemini 2.5 Flash, Gemini 2.0 Flash, Gemini 2.0 Flash Lite
Get your API key from Google AI Studio.
This is different from Google Cloud VertexAI. See VERTEXAI_PROJECT for VertexAI configuration.

GitHub Copilot
GITHUB_TOKEN
GitHub token with Copilot permissions for accessing GitHub Copilot models.
Provider: GitHub Copilot (Experimental)
Models: GPT-4o, GPT-4.1, Claude 3.5/3.7/4 Sonnet, O1, O3-mini, O4-mini, Gemini 2.0/2.5
Requirements:
- Copilot chat in the IDE enabled in your GitHub settings
- One of:
  - VSCode GitHub Copilot chat extension
  - GitHub gh CLI
  - Neovim GitHub Copilot plugin (copilot.vim or copilot.lua)
  - GitHub token with Copilot permissions
Token file locations:
- ~/.config/github-copilot/hosts.json
- ~/.config/github-copilot/apps.json
- $XDG_CONFIG_HOME/github-copilot/hosts.json
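A small POSIX-shell sketch for checking those token file locations; the search order here simply mirrors the list above and is illustrative, not OpenCode’s exact lookup logic:

```shell
# Return the first existing Copilot config file from the standard locations.
first_copilot_config() {
  for f in "$HOME/.config/github-copilot/hosts.json" \
           "$HOME/.config/github-copilot/apps.json" \
           "${XDG_CONFIG_HOME:-$HOME/.config}/github-copilot/hosts.json"; do
    if [ -f "$f" ]; then
      printf '%s\n' "$f"
      return 0
    fi
  done
  return 1
}

first_copilot_config || echo "no cached Copilot token found"
```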
Groq
GROQ_API_KEY
API key for Groq cloud inference.
Provider: Groq
Models: Llama 4, QWEN QWQ-32b, Deepseek R1 distill Llama 70b, Llama 3.3 70b
Get your API key from Groq Console.
OpenRouter
OPENROUTER_API_KEY
API key for OpenRouter, which provides access to multiple AI models through a unified API.
Provider: OpenRouter
Models: Various models with the openrouter. prefix
Get your API key from OpenRouter.

xAI (Grok)
XAI_API_KEY
API key for xAI’s Grok models.
Provider: xAI
Models: Grok-3 Beta, Grok-3 Mini, Grok-3 Fast variants
Get your API key from xAI Console.
Cloud provider credentials
These environment variables configure access to cloud-based AI services.

AWS Bedrock
AWS Bedrock provides access to Claude models through Amazon’s infrastructure.

AWS_ACCESS_KEY_ID
AWS access key ID for authenticating with AWS services.

AWS_SECRET_ACCESS_KEY
AWS secret access key corresponding to the access key ID.

AWS_REGION
AWS region for Bedrock API calls. If not set, OpenCode will check AWS_DEFAULT_REGION.

AWS_DEFAULT_REGION
Fallback AWS region if AWS_REGION is not set.

AWS_PROFILE
AWS profile name to use from ~/.aws/credentials and ~/.aws/config.

AWS_DEFAULT_PROFILE
Fallback AWS profile if AWS_PROFILE is not set.

AWS_CONTAINER_CREDENTIALS_RELATIVE_URI
Used when running on ECS with task IAM roles. This is typically set automatically by AWS ECS; you don’t need to set it manually.

AWS_CONTAINER_CREDENTIALS_FULL_URI
Alternative credentials endpoint for ECS tasks. This is typically set automatically by AWS ECS; you don’t need to set it manually.
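A minimal Bedrock setup sketch, using static credentials plus an explicit region (all values below are placeholders):

```shell
# Static AWS credentials for Bedrock; replace the placeholder values.
export AWS_ACCESS_KEY_ID="AKIAEXAMPLE"
export AWS_SECRET_ACCESS_KEY="example-secret"
export AWS_REGION="us-east-1"

# Alternatively, reuse a named profile from ~/.aws/credentials:
# export AWS_PROFILE="work"
```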
Azure OpenAI
Azure OpenAI Service provides OpenAI models through Microsoft Azure.

AZURE_OPENAI_ENDPOINT
Azure OpenAI service endpoint URL.
Format: https://<resource-name>.openai.azure.com
This variable must be set for Azure OpenAI to be enabled, even when using Entra ID authentication.

AZURE_OPENAI_API_KEY
API key for the Azure OpenAI service. This can be omitted when using Azure Entra ID (formerly Azure AD) credentials for authentication.

AZURE_OPENAI_API_VERSION
Azure OpenAI API version to use.
Format: YYYY-MM-DD-preview or YYYY-MM-DD

Google Cloud VertexAI
VertexAI provides access to Gemini models through Google Cloud Platform.

VERTEXAI_PROJECT
Google Cloud project ID for VertexAI. Both VERTEXAI_PROJECT and VERTEXAI_LOCATION must be set for VertexAI to be enabled.

VERTEXAI_LOCATION
Google Cloud location/region for VertexAI API calls.
Common values: us-central1, us-east1, europe-west1, asia-northeast1

GOOGLE_CLOUD_PROJECT
Alternative to VERTEXAI_PROJECT. Google Cloud project ID.

GOOGLE_CLOUD_REGION
Alternative to VERTEXAI_LOCATION. Google Cloud region.

GOOGLE_CLOUD_LOCATION
Alternative to VERTEXAI_LOCATION and GOOGLE_CLOUD_REGION.

Self-hosted models
Endpoint URL for self-hosted OpenAI-compatible model servers. Use this to connect to locally running models or custom inference servers that implement the OpenAI API specification.
Compatible servers:
- llama.cpp server
- Ollama (with OpenAI compatibility)
- vLLM
- Text Generation Inference (TGI)
- LocalAI
- Any OpenAI-compatible API
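As a hypothetical sketch, pointing OpenCode at a local Ollama server might look like the following. The variable name LOCAL_ENDPOINT is an assumption here; check your OpenCode version’s documentation for the exact name. Ollama serves its OpenAI-compatible API on port 11434 by default.

```shell
# Hypothetical: point OpenCode at Ollama's OpenAI-compatible endpoint.
# LOCAL_ENDPOINT is an assumed variable name, not confirmed by this page.
export LOCAL_ENDPOINT="http://localhost:11434/v1"
```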
System configuration
SHELL
Default shell to use for the bash tool. If not specified, OpenCode defaults to /bin/bash. This variable is typically set automatically by your system but can be overridden. You can also configure the shell in .opencode.json.

OPENCODE_DEV_DEBUG
Enable development debug mode. When set to "true", OpenCode writes debug logs to a file instead of using the TUI logger.
Values: "true" or unset
When enabled:
- Logs written to <data-directory>/debug.log
- Messages saved to <data-directory>/messages/
- Useful for debugging OpenCode itself
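As a sketch, enabling development debug mode for a session can look like this (OPENCODE_DEV_DEBUG is the variable name this description refers to; verify it against your version if in doubt):

```shell
# Turn on development debug mode; logs then go to <data-directory>/debug.log
# instead of the TUI logger.
export OPENCODE_DEV_DEBUG=true
```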
Configuration precedence
OpenCode loads configuration from multiple sources in this order (highest to lowest priority):
1. Command-line flags (e.g., --debug)
2. Environment variables (documented on this page)
3. Local config file (.opencode.json in the working directory)
4. Global config file, searched in:
   - $HOME/.opencode.json
   - $XDG_CONFIG_HOME/opencode/.opencode.json
   - $HOME/.config/opencode/.opencode.json
5. Default values
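The global-config candidates can be probed with a short helper. The assumption that the first existing file wins is mine, based on the order listed; treat it as a sketch rather than OpenCode’s exact resolution logic:

```shell
# Resolve a global config path from the candidate locations, in order.
# Assumption: the first file that exists is the one used.
find_global_config() {
  for f in "$HOME/.opencode.json" \
           "${XDG_CONFIG_HOME:-$HOME/.config}/opencode/.opencode.json" \
           "$HOME/.config/opencode/.opencode.json"; do
    if [ -f "$f" ]; then
      printf '%s\n' "$f"
      return 0
    fi
  done
  return 1
}

find_global_config || echo "no global config found"
```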
Provider priority
When multiple API keys are present, OpenCode selects the default provider in this order:
1. GitHub Copilot (GITHUB_TOKEN)
2. Anthropic (ANTHROPIC_API_KEY)
3. OpenAI (OPENAI_API_KEY)
4. Google Gemini (GEMINI_API_KEY)
5. Groq (GROQ_API_KEY)
6. OpenRouter (OPENROUTER_API_KEY)
7. xAI (XAI_API_KEY)
8. AWS Bedrock (AWS credentials)
9. Azure OpenAI (AZURE_OPENAI_ENDPOINT)
10. Google Cloud VertexAI (VERTEXAI_PROJECT + VERTEXAI_LOCATION)
The automatically selected provider can be overridden in .opencode.json.
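To see which of these keys are present in your environment, in the same priority order, a small helper can enumerate them:

```shell
# Print which provider API keys are set, in OpenCode's priority order.
list_provider_keys() {
  for var in GITHUB_TOKEN ANTHROPIC_API_KEY OPENAI_API_KEY GEMINI_API_KEY \
             GROQ_API_KEY OPENROUTER_API_KEY XAI_API_KEY; do
    if [ -n "$(printenv "$var")" ]; then
      printf 'set: %s\n' "$var"
    fi
  done
}

list_provider_keys
```

The first provider printed is the one OpenCode would pick by default, per the list above.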
Security best practices
Recommended practices:
- Use environment variables: keep API keys out of source code and committed config files.
- Use a secrets manager: load credentials from a dedicated secret store rather than storing them in plain text.
- Use project-specific env files (not committed): keep keys in a local env file excluded from version control, then source it before running.
- Use shell-specific secure storage: keep secrets out of your shell history and dotfiles.
- Restrict file permissions: make any file containing credentials readable only by its owner.
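A sketch of the env-file approach: keep keys in an uncommitted file with owner-only permissions and source it before running OpenCode. The file name and key value below are examples, not conventions this page prescribes:

```shell
# Write provider keys to an uncommitted env file (add it to .gitignore).
cat > .env.opencode <<'EOF'
export ANTHROPIC_API_KEY="sk-ant-placeholder"
EOF

chmod 600 .env.opencode   # readable and writable by the owner only
. ./.env.opencode         # source it, then run opencode in the same shell
```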
Verification
To verify which provider OpenCode will use, run it with debugging enabled (e.g., the --debug flag) and inspect the output, which reports:
- Which API keys were found
- Which provider was selected as default
- Any configuration issues
Troubleshooting
No valid provider available
If you see “no valid provider available”, ensure:
- At least one API key environment variable is set
- The API key is valid and has not expired
- For cloud providers (AWS, Azure, VertexAI), all required variables are set

Provider not working as expected
Check:
- API key format is correct
- No typos in environment variable names
- Environment variables are exported (use export)
- Variables are set in the same shell session
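A couple of quick shell checks for the items above (illustrative, using ANTHROPIC_API_KEY as the example key):

```shell
# Is the key exported, i.e. visible to child processes like OpenCode?
if sh -c 'test -n "$ANTHROPIC_API_KEY"'; then
  echo "ANTHROPIC_API_KEY is exported"
else
  echo "ANTHROPIC_API_KEY is missing or not exported"
fi

# Spot typos: list every variable name that looks like a provider key.
env | grep -E '_(API_)?KEY=|^GITHUB_TOKEN=' | cut -d= -f1 | sort
```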