Overview
OpenCode uses a JSON configuration file to customize behavior, AI models, providers, integrations, and more. The configuration system supports both global and local (project-specific) settings with automatic merging.

Configuration locations
OpenCode searches for configuration files in the following order:

1. Global configuration
2. Local configuration
macOS/Linux:
- `~/.opencode.json` (primary location)
- `$XDG_CONFIG_HOME/opencode/.opencode.json`
- `~/.config/opencode/.opencode.json`

Windows:
- `%USERPROFILE%\.opencode.json`
- `%LOCALAPPDATA%\opencode\.opencode.json`
Configuration structure
The options are described individually below; for complete example files, see Example configurations at the end of this page.

Configuration options
Data storage
Directory where OpenCode stores session data, history, and databases.
Working directory
Override the default working directory for OpenCode operations.
Agents
Agents are specialized AI assistants for different tasks. OpenCode supports three agent types: `coder`, `task`, and `title`.
coder
Main coding agent for writing, editing, and debugging code.
- Model ID for the coder agent (e.g., `claude-4-sonnet`, `gpt-4.1`)
- Maximum tokens for coder agent responses
- Reasoning effort level for models that support it (OpenAI o-series, Anthropic extended thinking). Options: `low`, `medium`, `high`
task
Task agent for code search, analysis, and exploration.
- Model ID for the task agent (typically a faster/cheaper model)
- Maximum tokens for task agent responses
- Reasoning effort level (if supported by the model)
title
Title agent for generating conversation summaries (auto-configured, maxTokens locked to 80).
- Model ID for the title agent
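Putting the three agents together, a configuration sketch might look like the following. The key names (`agents`, `model`, `maxTokens`, `reasoningEffort`) and model IDs are assumptions based on the descriptions above; check the actual schema before relying on them:

```json
{
  "agents": {
    "coder": {
      "model": "claude-4-sonnet",
      "maxTokens": 8000,
      "reasoningEffort": "medium"
    },
    "task": {
      "model": "claude-3.5-haiku",
      "maxTokens": 4000
    },
    "title": {
      "model": "claude-3.5-haiku"
    }
  }
}
```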
Providers
Configure API credentials for AI providers. See AI Models for supported providers.

- API key for the provider (can also be set via environment variables)
- Disable this provider even if credentials are available

Supported provider IDs:
- `anthropic` - Anthropic API (Claude models)
- `openai` - OpenAI API (GPT, o-series models)
- `gemini` - Google Gemini API
- `groq` - Groq API (Llama, Qwen models)
- `azure` - Azure OpenAI Service
- `bedrock` - AWS Bedrock
- `vertexai` - Google Cloud Vertex AI
- `openrouter` - OpenRouter (multi-provider proxy)
- `copilot` - GitHub Copilot (uses GitHub token)
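A provider block might be sketched as follows; the field names (`providers`, `apiKey`, `disabled`) are assumed from the option descriptions above, not confirmed against the schema:

```json
{
  "providers": {
    "anthropic": {
      "apiKey": "sk-ant-your-key-here"
    },
    "openai": {
      "disabled": true
    }
  }
}
```

Prefer environment variables over hard-coded keys for anything committed to version control.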
MCP servers
Model Context Protocol (MCP) servers extend OpenCode with additional capabilities.

- Command to execute the MCP server
- Command-line arguments for the MCP server
- Environment variables for the MCP server (format: `KEY=value`)
- MCP server type. Options: `stdio`, `sse`
- URL for SSE-type MCP servers
- HTTP headers for SSE-type MCP servers
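A hedged sketch of one `stdio` server and one `sse` server; the section key (`mcpServers`), server names, and the example filesystem server package are illustrative assumptions:

```json
{
  "mcpServers": {
    "filesystem": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"],
      "env": ["LOG_LEVEL=info"]
    },
    "remote-tools": {
      "type": "sse",
      "url": "https://example.com/mcp/sse",
      "headers": { "Authorization": "Bearer your-token" }
    }
  }
}
```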
LSP (Language Server Protocol)
Configure language servers for enhanced code intelligence.

- Command to start the language server
- Command-line arguments for the language server
- Disable this language server
- Additional LSP server options
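As a sketch, a Go language server entry could look like this; the section key (`lsp`), language key, and field names are assumptions based on the option list above:

```json
{
  "lsp": {
    "go": {
      "command": "gopls",
      "args": ["serve"],
      "disabled": false
    }
  }
}
```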
Context paths
Files and directories to automatically include as context for the AI.

Default paths:
- `.github/copilot-instructions.md`
- `.cursorrules`
- `.cursor/rules/`
- `CLAUDE.md`, `CLAUDE.local.md`
- `opencode.md`, `opencode.local.md`
- `OpenCode.md`, `OpenCode.local.md`
- `OPENCODE.md`, `OPENCODE.local.md`
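Overriding the defaults might look like the sketch below; the key name `contextPaths` is an assumption inferred from this section's title:

```json
{
  "contextPaths": [
    "CLAUDE.md",
    "docs/conventions.md"
  ]
}
```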
Terminal UI (TUI)
Visual theme for the terminal interface.

Available themes: `opencode` (default), `catppuccin`, `dracula`, `flexoki`, `gruvbox`, `monokai`, `onedark`, `tokyonight`, `tron`
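Selecting a theme might be sketched as follows; the `tui`/`theme` key names are assumptions:

```json
{
  "tui": {
    "theme": "catppuccin"
  }
}
```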
Shell configuration
- Path to the shell used by the bash tool
- Arguments passed to the shell
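A sketch of pointing the bash tool at zsh as a login shell; the `shell`, `path`, and `args` key names are assumed from the option descriptions:

```json
{
  "shell": {
    "path": "/bin/zsh",
    "args": ["-l"]
  }
}
```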
Debug options
- Enable debug mode for verbose logging
- Enable LSP debug mode for language server diagnostics
- Automatically compact conversation history to manage context size
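These flags might be set as in the sketch below; the exact key names (`debug`, `debugLSP`, `autoCompact`) are guesses from the descriptions above:

```json
{
  "debug": true,
  "debugLSP": false,
  "autoCompact": true
}
```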
Environment variables
OpenCode supports environment variables for sensitive data:

- Provider API keys (e.g., `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`)
- Debug options
Configuration precedence
OpenCode merges configuration from multiple sources in this order (later sources override earlier ones):

1. Built-in defaults - hardcoded defaults in the application
2. Global config file - `~/.opencode.json` or the XDG config directory
3. Environment variables - API keys and debug flags
4. Local config file - `.opencode.json` in the project directory
5. Command-line flags - runtime flags (e.g., `--debug`)
Auto-configuration
OpenCode automatically configures default models based on available providers, checking in this order:

1. GitHub Copilot (if GitHub token found)
2. Anthropic (if `ANTHROPIC_API_KEY` set)
3. OpenAI (if `OPENAI_API_KEY` set)
4. Google Gemini (if `GEMINI_API_KEY` set)
5. Groq (if `GROQ_API_KEY` set)
6. OpenRouter (if `OPENROUTER_API_KEY` set)
7. AWS Bedrock (if AWS credentials available)
8. Azure OpenAI (if `AZURE_OPENAI_ENDPOINT` set)
9. Google Cloud Vertex AI (if `VERTEXAI_PROJECT` and `VERTEXAI_LOCATION` set)
Validation
OpenCode validates your configuration on startup:

- Model IDs - ensures configured models are supported
- Provider credentials - checks that required API keys are present
- Token limits - validates `maxTokens` doesn’t exceed model context windows
- LSP servers - verifies LSP command paths exist
- MCP servers - validates MCP server configurations
Example configurations
Minimal configuration (using environment variables)
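With API keys supplied via environment variables, the file can stay small and only pin the models. A sketch under the key-name assumptions used throughout this page:

```json
{
  "agents": {
    "coder": { "model": "claude-4-sonnet" },
    "task": { "model": "claude-3.5-haiku" }
  }
}
```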
Multi-provider setup
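A sketch combining two providers, routing the coder agent to one and the cheaper task agent to the other (key names and model IDs are assumptions, not verified against the schema):

```json
{
  "providers": {
    "anthropic": { "apiKey": "sk-ant-your-key-here" },
    "groq": { "apiKey": "gsk_your-key-here" }
  },
  "agents": {
    "coder": { "model": "claude-4-sonnet" },
    "task": { "model": "llama-3.3-70b-versatile" }
  }
}
```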
Full featured setup with MCP and LSP
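A fuller sketch tying together agents, an MCP server, an LSP server, the TUI theme, and debug flags. All key names follow the assumptions made in the sections above and should be checked against the actual schema:

```json
{
  "agents": {
    "coder": { "model": "claude-4-sonnet", "maxTokens": 8000 },
    "task": { "model": "claude-3.5-haiku" }
  },
  "providers": {
    "anthropic": { "apiKey": "sk-ant-your-key-here" }
  },
  "mcpServers": {
    "filesystem": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    }
  },
  "lsp": {
    "go": { "command": "gopls" }
  },
  "tui": { "theme": "gruvbox" },
  "debug": false,
  "autoCompact": true
}
```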
Related topics
- AI Models - Complete list of supported models and providers
- Architecture - How OpenCode’s configuration system works internally