
Overview

OpenCode uses a JSON configuration file to customize behavior, AI models, providers, integrations, and more. The configuration system supports both global and local (project-specific) settings with automatic merging.

Configuration locations

OpenCode searches for configuration files in the following order:
macOS/Linux:
  • ~/.opencode.json (primary location)
  • $XDG_CONFIG_HOME/opencode/.opencode.json
  • ~/.config/opencode/.opencode.json
Windows:
  • %USERPROFILE%\.opencode.json
  • %LOCALAPPDATA%\opencode\.opencode.json

Configuration structure

Here’s a complete example configuration file:
{
  "data": {
    "directory": ".opencode"
  },
  "wd": "/path/to/working/directory",
  "agents": {
    "coder": {
      "model": "claude-4-sonnet",
      "maxTokens": 50000,
      "reasoningEffort": "medium"
    },
    "task": {
      "model": "gpt-4.1-mini",
      "maxTokens": 5000
    },
    "title": {
      "model": "gpt-4o-mini",
      "maxTokens": 80
    }
  },
  "providers": {
    "anthropic": {
      "apiKey": "sk-ant-...",
      "disabled": false
    },
    "openai": {
      "apiKey": "sk-...",
      "disabled": false
    }
  },
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/files"],
      "type": "stdio"
    }
  },
  "lsp": {
    "typescript": {
      "command": "typescript-language-server",
      "args": ["--stdio"],
      "disabled": false
    }
  },
  "contextPaths": [
    ".github/copilot-instructions.md",
    ".cursorrules",
    "opencode.md"
  ],
  "tui": {
    "theme": "opencode"
  },
  "shell": {
    "path": "/bin/bash",
    "args": ["-l"]
  },
  "debug": false,
  "debugLSP": false,
  "autoCompact": true
}

Configuration options

Data storage

data.directory
string
default:".opencode"
Directory where OpenCode stores session data, history, and databases.

Working directory

wd
string
Override the default working directory for OpenCode operations.

Agents

Agents are specialized AI assistants for different tasks. OpenCode supports three agent types:
Coder: the main coding agent for writing, editing, and debugging code.
agents.coder.model
string
required
Model ID for the coder agent (e.g., claude-4-sonnet, gpt-4.1)
agents.coder.maxTokens
integer
default:"5000"
Maximum tokens for coder agent responses
agents.coder.reasoningEffort
string
Reasoning effort level for models that support it (OpenAI o-series, Anthropic extended thinking).
Options: low, medium, high
Task: agent for code search, analysis, and exploration.
agents.task.model
string
required
Model ID for the task agent (typically a faster/cheaper model)
agents.task.maxTokens
integer
default:"5000"
Maximum tokens for task agent responses
agents.task.reasoningEffort
string
Reasoning effort level (if supported by model)
Title: agent for generating conversation summaries (auto-configured; maxTokens locked to 80).
agents.title.model
string
required
Model ID for the title agent

Providers

Configure API credentials for AI providers. See AI Models for supported providers.
providers.{providerName}.apiKey
string
API key for the provider. Can also be set via environment variables.
providers.{providerName}.disabled
boolean
default:"false"
Disable this provider even if credentials are available.
Supported providers:
  • anthropic - Anthropic API (Claude models)
  • openai - OpenAI API (GPT, o-series models)
  • gemini - Google Gemini API
  • groq - Groq API (Llama, Qwen models)
  • azure - Azure OpenAI Service
  • bedrock - AWS Bedrock
  • vertexai - Google Cloud Vertex AI
  • openrouter - OpenRouter (multi-provider proxy)
  • copilot - GitHub Copilot (uses GitHub token)
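
For example, to route around one provider temporarily while keeping another active, you can disable it without removing its entry (key values here are placeholders, as elsewhere in this page):

```json
{
  "providers": {
    "openai": {
      "disabled": true
    },
    "anthropic": {
      "apiKey": "sk-ant-..."
    }
  }
}
```

Because disabled defaults to false, providers only need the flag when you want them off.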

MCP servers

Model Context Protocol servers extend OpenCode with additional capabilities.
mcpServers.{name}.command
string
required
Command to execute the MCP server
mcpServers.{name}.args
array
Command-line arguments for the MCP server
mcpServers.{name}.env
array
Environment variables for the MCP server (format: KEY=value)
mcpServers.{name}.type
string
default:"stdio"
MCP server type.
Options: stdio, sse
mcpServers.{name}.url
string
URL for SSE-type MCP servers
mcpServers.{name}.headers
object
HTTP headers for SSE-type MCP servers
Example MCP server configurations:
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/files"],
      "type": "stdio"
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": ["GITHUB_PERSONAL_ACCESS_TOKEN=ghp_..."],
      "type": "stdio"
    },
    "sse-server": {
      "command": "curl",
      "type": "sse",
      "url": "https://api.example.com/mcp",
      "headers": {
        "Authorization": "Bearer token"
      }
    }
  }
}

LSP (Language Server Protocol)

Configure language servers for enhanced code intelligence.
lsp.{language}.command
string
required
Command to start the language server
lsp.{language}.args
array
Command-line arguments for the language server
lsp.{language}.disabled
boolean
default:"false"
Disable this language server
lsp.{language}.options
object
Additional LSP server options
Example LSP configurations:
{
  "lsp": {
    "typescript": {
      "command": "typescript-language-server",
      "args": ["--stdio"]
    },
    "python": {
      "command": "pylsp",
      "args": []
    },
    "go": {
      "command": "gopls",
      "args": ["serve"]
    },
    "rust": {
      "command": "rust-analyzer",
      "args": []
    }
  }
}

Context paths

contextPaths
array
Files and directories to automatically include as context for the AI.
Default paths:
  • .github/copilot-instructions.md
  • .cursorrules
  • .cursor/rules/
  • CLAUDE.md, CLAUDE.local.md
  • opencode.md, opencode.local.md
  • OpenCode.md, OpenCode.local.md
  • OPENCODE.md, OPENCODE.local.md
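
A project that keeps its AI guidance outside the default locations can, for example, point contextPaths at those files explicitly (the docs/ path below is illustrative, not a default):

```json
{
  "contextPaths": [
    "opencode.md",
    "docs/ai-guidelines.md"
  ]
}
```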

Terminal UI (TUI)

tui.theme
string
default:"opencode"
Visual theme for the terminal interface.
Available themes:
  • opencode (default)
  • catppuccin
  • dracula
  • flexoki
  • gruvbox
  • monokai
  • onedark
  • tokyonight
  • tron

Shell configuration

shell.path
string
default:"$SHELL or /bin/bash"
Path to the shell used by the bash tool
shell.args
array
default:"[\"-l\"]"
Arguments passed to the shell
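
For instance, to have the bash tool run zsh as a login shell instead of the default (the path below assumes a typical macOS/Linux install):

```json
{
  "shell": {
    "path": "/bin/zsh",
    "args": ["-l"]
  }
}
```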

Debug options

debug
boolean
default:"false"
Enable debug mode for verbose logging
debugLSP
boolean
default:"false"
Enable LSP debug mode for language server diagnostics
autoCompact
boolean
default:"true"
Automatically compact conversation history to manage context size
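
A typical troubleshooting setup turns on both debug flags while leaving auto-compaction at its default:

```json
{
  "debug": true,
  "debugLSP": true,
  "autoCompact": true
}
```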

Environment variables

OpenCode supports environment variables for sensitive data:

Provider API keys

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# OpenAI
export OPENAI_API_KEY="sk-..."

# Google Gemini
export GEMINI_API_KEY="..."

# Groq
export GROQ_API_KEY="gsk_..."

# OpenRouter
export OPENROUTER_API_KEY="sk-or-..."

# xAI (Grok)
export XAI_API_KEY="..."

# Azure OpenAI
export AZURE_OPENAI_ENDPOINT="https://....openai.azure.com/"
export AZURE_OPENAI_API_KEY="..."  # Optional with Entra ID

# AWS Bedrock (uses standard AWS credentials)
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION="us-east-1"

# Google Cloud Vertex AI
export VERTEXAI_PROJECT="my-project"
export VERTEXAI_LOCATION="us-central1"
# Or use GOOGLE_CLOUD_PROJECT and GOOGLE_CLOUD_REGION

# GitHub Copilot (auto-detected from GitHub CLI or environment)
export GITHUB_TOKEN="ghp_..."

Debug options

# Enable development debug mode (writes to .opencode/debug.log)
export OPENCODE_DEV_DEBUG="true"

Configuration precedence

OpenCode merges configuration from multiple sources in this order (later sources override earlier ones):
  1. Built-in defaults - Hardcoded defaults in the application
  2. Global config file - ~/.opencode.json or XDG config directory
  3. Environment variables - API keys and debug flags
  4. Local config file - .opencode.json in project directory
  5. Command-line flags - Runtime flags (e.g., --debug)
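
As a concrete illustration of step 2 versus step 4, suppose both config files set the coder model (model IDs here are examples from this page):

~/.opencode.json (global):

```json
{
  "agents": {
    "coder": { "model": "gpt-4.1" }
  }
}
```

.opencode.json in the project (local):

```json
{
  "agents": {
    "coder": { "model": "claude-4-sonnet" }
  }
}
```

With both files present, the local value wins and the coder agent uses claude-4-sonnet; options the local file does not set still fall through to the global file and built-in defaults.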

Auto-configuration

OpenCode automatically configures default models based on available providers, checking in this order:
  1. GitHub Copilot (if GitHub token found)
  2. Anthropic (if ANTHROPIC_API_KEY set)
  3. OpenAI (if OPENAI_API_KEY set)
  4. Google Gemini (if GEMINI_API_KEY set)
  5. Groq (if GROQ_API_KEY set)
  6. OpenRouter (if OPENROUTER_API_KEY set)
  7. AWS Bedrock (if AWS credentials available)
  8. Azure OpenAI (if AZURE_OPENAI_ENDPOINT set)
  9. Google Cloud Vertex AI (if VERTEXAI_PROJECT and VERTEXAI_LOCATION set)
If no providers are configured, OpenCode will prompt you to set up credentials.

Validation

OpenCode validates your configuration on startup:
  • Model IDs - Ensures configured models are supported
  • Provider credentials - Checks that required API keys are present
  • Token limits - Validates maxTokens doesn’t exceed model context windows
  • LSP servers - Verifies LSP command paths exist
  • MCP servers - Validates MCP server configurations
Invalid configurations are automatically corrected with warnings logged to help you fix issues.

Example configurations

Minimal configuration (credentials supplied via environment variables):
{
  "agents": {
    "coder": {
      "model": "claude-4-sonnet"
    }
  }
}
Set your API key via environment:
export ANTHROPIC_API_KEY="sk-ant-..."
Advanced configuration with multiple providers:
{
  "agents": {
    "coder": {
      "model": "claude-4-sonnet",
      "maxTokens": 50000,
      "reasoningEffort": "high"
    },
    "task": {
      "model": "gpt-4.1-mini",
      "maxTokens": 5000
    }
  },
  "providers": {
    "anthropic": {
      "apiKey": "sk-ant-..."
    },
    "openai": {
      "apiKey": "sk-..."
    }
  }
}
See also

  • AI Models - Complete list of supported models and providers
  • Architecture - How OpenCode’s configuration system works internally
