Installation
Install OpenCode from the official source (see the Install Guide below).

Quick Setup
Configuration Only
Use Specific Model
OpenCode requires a large context window (at least 64k tokens). See Context Length for configuration.
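For models served locally, one way to meet the 64k floor is the server-wide context variable (recent Ollama releases honor OLLAMA_CONTEXT_LENGTH; verify against your installed version):

```shell
# Set a 64k default context for models served by Ollama (value in tokens).
export OLLAMA_CONTEXT_LENGTH=65536
# Restart the server afterwards so the setting takes effect:
#   ollama serve
```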
Features
Multiple Models
Select and switch between multiple configured models
Recent History
Quick access to recently used models
Favorites
Bookmark your preferred models
Cloud Support
Automatic configuration for cloud models
Recommended Models
Cloud Models
glm-4.7:cloud
Recommended model for OpenCode (200k context)
qwen3-coder:480b-cloud
Advanced code generation (260k context)
minimax-m2.5:cloud
Fast, efficient coding (200k context)
deepseek-v3.1:671b-cloud
Massive reasoning (160k context)
Local Models
qwen3-coder
Efficient code generation (~11GB VRAM)
glm-4.7
Reasoning and coding (~25GB VRAM)
deepseek-coder
Specialized code model (~20GB VRAM)
Manual Setup
Manual configuration
Add a configuration block to ~/.config/opencode/opencode.json (the file described under Configuration Files below).

Cloud Models Configuration
Cloud model setup
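A sketch of a cloud model entry with explicit limits (the limit keys mirror OpenCode's model schema; the context value comes from the model list above, the output value is an example):

```json
{
  "provider": {
    "ollama": {
      "models": {
        "glm-4.7:cloud": {
          "name": "GLM 4.7 (cloud)",
          "limit": {
            "context": 200000,
            "output": 32000
          }
        }
      }
    }
  }
}
```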
For cloud models, context/output limits belong in the model entry; Ollama adds them automatically when you use ollama launch opencode.

Configuration Files
OpenCode uses two configuration files:

Main Config
~/.config/opencode/opencode.json — Provider and model configuration
State File
~/.local/state/opencode/model.json — Recent and favorite models
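For reference, a minimal main config might look like this (the provider block follows OpenCode's published JSON schema; the npm package, baseURL, and model entry are assumptions to verify against the Install Guide):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "qwen3-coder": {
          "name": "Qwen 3 Coder"
        }
      }
    }
  }
}
```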
Example state file

Ollama creates and updates this file when you use ollama launch opencode.
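The keys below are an illustration only (inspect your own ~/.local/state/opencode/model.json for the authoritative shape); conceptually the file tracks recent and favorite models:

```json
{
  "recent": ["glm-4.7:cloud", "qwen3-coder"],
  "favorites": ["glm-4.7:cloud"]
}
```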
Multiple Models
OpenCode supports multiple models simultaneously. Use ollama launch opencode to configure several models:
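The resulting provider section simply lists several model entries side by side; a sketch using IDs from the recommendations above:

```json
{
  "provider": {
    "ollama": {
      "models": {
        "glm-4.7:cloud": { "name": "GLM 4.7 (cloud)" },
        "qwen3-coder": { "name": "Qwen 3 Coder" },
        "deepseek-v3.1:671b-cloud": { "name": "DeepSeek V3.1 (cloud)" }
      }
    }
  }
}
```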
Connecting to ollama.com
To use cloud models hosted on ollama.com:

Create an API key
Go to ollama.com/settings/keys
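Then expose the key through the environment variable listed under Environment Variables below (the value here is a placeholder):

```shell
# Make the ollama.com key visible to OpenCode (placeholder value).
export OLLAMA_API_KEY="<paste-your-key-here>"
```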
Usage Examples
Start in a Project
Switch Models
Use the model picker in the OpenCode UI (usually Ctrl+M or Cmd+M).
Access Recent Models
Your most recently used models appear at the top of the picker.

Troubleshooting
Model Not Available
Ensure the model is pulled with ollama pull.

Configuration Not Loading
Restart OpenCode to pick up config changes.

Context Window Issues
For local models, increase the context window (see Context Length above).

Ollama Connection Failed
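A minimal reachability probe, assuming the default local port; a healthy server responds with "Ollama is running":

```shell
# Probe the default Ollama endpoint; print a note if it is unreachable.
curl -fsS http://localhost:11434 || echo "Ollama server not reachable"
```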
Verify that Ollama is running and reachable.

Advanced Configuration
Custom Provider Name
Per-Model Settings
Environment Variables
OpenCode respects:

OLLAMA_HOST — Override Ollama server URL
OLLAMA_API_KEY — API key for ollama.com
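A typical override for a remote server (the address is an example, not a default):

```shell
# Point OpenCode's Ollama provider at a non-default server.
export OLLAMA_HOST="http://192.168.1.50:11434"
```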
Backup Configuration
When using ollama launch opencode, Ollama creates backups in ~/.ollama/backups/ before modifying your configuration.
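To see what has been backed up, list the directory (the command prints a note when no backups exist yet):

```shell
# Newest backups first; fall back to a message if the directory is absent.
ls -lt ~/.ollama/backups/ 2>/dev/null || echo "no backups yet"
```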
Learn More
OpenCode Website
Official OpenCode website
Install Guide
Detailed installation instructions
OpenAI API
Ollama’s OpenAI-compatible API
Context Length
Configure model context windows