OpenCode uses a JSON configuration file (.opencode.json) to customize its behavior. This page documents the complete schema and all available configuration options.
Configuration file locations
OpenCode searches for configuration files in the following order:
$HOME/.opencode.json - Global configuration
$XDG_CONFIG_HOME/opencode/.opencode.json - XDG-compliant location
./.opencode.json - Local directory (merged with global config)
Local configuration files are merged with global configuration. Local settings take precedence over global settings.
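For example, a global `$HOME/.opencode.json` might set a default theme (this pair of files is illustrative; the `tui.theme` key is taken from the example configuration later on this page):

```json
{
  "tui": { "theme": "gruvbox" }
}
```

while a project-local `./.opencode.json` overrides it:

```json
{
  "tui": { "theme": "tokyonight" }
}
```

With both files present, sessions in that project use `tokyonight`, and all other global settings still apply.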
Root properties
data - Storage configuration for OpenCode’s database and files.
data.directory (string, default: ".opencode") - Directory where application data is stored (database, session history, etc.)
Working directory - Working directory for the application. If not specified, the current directory is used.
debug - Enable debug mode with verbose logging. When true, the log level is set to debug automatically.
LSP debug - Enable Language Server Protocol debug mode for troubleshooting LSP integration issues.
autoCompact - Automatically summarize conversations when approaching the model’s context window limit. When enabled, OpenCode monitors token usage and creates a new session with a summary at 95% context usage.
List of file paths to automatically include in the AI context. These files are added to every conversation. Default paths:
.github/copilot-instructions.md
.cursorrules
.cursor/rules/
CLAUDE.md, CLAUDE.local.md
opencode.md, opencode.local.md
OpenCode.md, OpenCode.local.md
OPENCODE.md, OPENCODE.local.md
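Context paths can also be set explicitly, as in this sketch; the key name `contextPaths` used here is an assumption and should be verified against the JSON schema:

```json
{
  "contextPaths": [
    "CLAUDE.md",
    ".cursorrules"
  ]
}
```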
Agents configuration
Configuration for the different AI agents used by OpenCode. Each agent can use a different model and token limit.
coder - Main coding agent for interactive sessions. Handles code editing, file operations, and complex tasks.
coder.maxTokens - Maximum tokens for agent responses (minimum: 1).
coder.reasoningEffort - Reasoning effort level for models that support extended thinking (OpenAI o-series, Anthropic extended thinking). Options: low, medium, high. Only applicable to models with CanReason: true.
task - Task agent for executing background operations like file search, grep, and analysis.
task.model - Model ID to use for the task agent.
task.maxTokens - Maximum tokens for task agent responses.
task.reasoningEffort - Reasoning effort level (low, medium, high).
title - Title agent for generating session titles from first messages.
title.model - Model ID to use for the title agent.
title.maxTokens - Maximum tokens for title generation (automatically set to 80).
Providers configuration
LLM provider configurations. Each provider can be enabled or disabled and configured with an API key.
Anthropic (Claude) - API key; can also be set via the ANTHROPIC_API_KEY environment variable.
OpenAI - API key; can also be set via the OPENAI_API_KEY environment variable.
Google Gemini - API key; can also be set via the GEMINI_API_KEY environment variable.
Groq - API key; can also be set via the GROQ_API_KEY environment variable.
OpenRouter - API key; can also be set via the OPENROUTER_API_KEY environment variable.
AWS Bedrock - Uses AWS credentials from the environment.
Azure OpenAI - API key; optional when using Entra ID authentication.
Google Cloud VertexAI - VertexAI provider configuration.
GitHub Copilot - GitHub token; auto-detected from the GitHub Copilot configuration or the GITHUB_TOKEN environment variable.
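A minimal providers block, using the keys shown in the example configuration on this page; the disabled flag here is an assumption inferred from the enable/disable description above:

```json
{
  "providers": {
    "anthropic": {
      "apiKey": "sk-ant-..."
    },
    "openai": {
      "disabled": true
    }
  }
}
```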
MCP servers configuration
Model Context Protocol server configurations. Define custom MCP servers to extend OpenCode’s capabilities. Each key is the server name, and the value is a server configuration object.
Type of MCP server. Options: stdio, sse
Command to execute for the MCP server.
Command arguments for the MCP server.
Environment variables for the MCP server (array of strings).
URL for SSE type MCP servers (required when type: "sse").
HTTP headers for SSE type MCP servers (key-value pairs).
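A sketch of an SSE server entry assembled from the fields above; the url and headers key names are assumptions based on the descriptions, so check them against the schema:

```json
{
  "mcpServers": {
    "remote-example": {
      "type": "sse",
      "url": "https://example.com/mcp/sse",
      "headers": {
        "Authorization": "Bearer <token>"
      }
    }
  }
}
```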
LSP configuration
Language Server Protocol configurations. Define LSP servers for different programming languages. Each key is a language identifier, and the value is an LSP configuration object.
Command to execute for the LSP server (e.g., gopls, typescript-language-server).
Command arguments for the LSP server.
Whether the LSP is disabled.
Additional options for the LSP server (provider-specific).
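For instance, a Python entry might look like the following; pyright-langserver is an illustrative choice of server, and the shape of options is server-specific:

```json
{
  "lsp": {
    "python": {
      "command": "pyright-langserver",
      "args": ["--stdio"],
      "disabled": false,
      "options": {}
    }
  }
}
```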
Shell configuration
Shell configuration for the bash tool.
path (string, default: "$SHELL or /bin/bash") - Path to the shell executable.
args - Arguments to pass to the shell.
TUI configuration
Terminal User Interface configuration.
theme - TUI theme name. Available themes:
opencode - Default OpenCode theme
catppuccin - Catppuccin color scheme
dracula - Dracula theme
flexoki - Flexoki theme
gruvbox - Gruvbox theme
monokai - Monokai theme
onedark - One Dark theme
tokyonight - Tokyo Night theme
tron - Tron theme
Example configuration
```json
{
  "data": {
    "directory": ".opencode"
  },
  "debug": false,
  "autoCompact": true,
  "providers": {
    "anthropic": {
      "apiKey": "sk-ant-..."
    },
    "openai": {
      "apiKey": "sk-..."
    }
  },
  "agents": {
    "coder": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 5000,
      "reasoningEffort": "medium"
    },
    "task": {
      "model": "gpt-4.1-mini",
      "maxTokens": 5000
    },
    "title": {
      "model": "gpt-4.1-mini"
    }
  },
  "shell": {
    "path": "/bin/zsh",
    "args": ["-l"]
  },
  "lsp": {
    "go": {
      "command": "gopls",
      "disabled": false
    },
    "typescript": {
      "command": "typescript-language-server",
      "args": ["--stdio"]
    }
  },
  "mcpServers": {
    "example": {
      "type": "stdio",
      "command": "uvx",
      "args": ["mcp-server-git"]
    }
  },
  "tui": {
    "theme": "tokyonight"
  }
}
```
JSON schema
The complete JSON schema is available in the OpenCode repository at opencode-schema.json. You can use it for validation and autocompletion in your editor:
```json
{
  "$schema": "https://raw.githubusercontent.com/opencode-ai/opencode/main/opencode-schema.json"
}
```