OpenCode is an open-source AI coding assistant that runs in your terminal. Built by Anomaly, it provides an intuitive interface for AI-powered code generation and editing.

Installation

Install OpenCode from the official source:
curl -fsSL https://opencode.ai/install | bash
Learn more at opencode.ai.
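Once the script finishes, opencode should be on your PATH; a quick check:
# Prints the install location if the binary is found
command -v opencode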

Quick Setup

ollama launch opencode
Ollama automatically:
  1. Selects models: interactive multi-select picker for models
  2. Updates configuration: writes to ~/.config/opencode/opencode.json
  3. Updates recents: adds models to the recent list in ~/.local/state/opencode/model.json
  4. Launches OpenCode: starts with the configured models available

Configuration Only

ollama launch opencode --config

Use Specific Model

ollama launch opencode --model qwen3-coder:480b-cloud
OpenCode requires a large context window (at least 64k tokens). See Context Length for configuration.
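If the model is not already available locally, pull it before launching (same example model as above):
# Download the model if it is not listed by `ollama list`
ollama pull qwen3-coder:480b-cloud
ollama launch opencode --model qwen3-coder:480b-cloud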

Features

  • Multiple Models: select and switch between multiple configured models
  • Recent History: quick access to recently used models
  • Favorites: bookmark your preferred models
  • Cloud Support: automatic configuration for cloud models

Cloud Models

glm-4.7:cloud is the recommended model for OpenCode (200k context).
Other excellent options:
  • qwen3-coder:480b-cloud — Advanced code generation (260k context)
  • minimax-m2.5:cloud — Fast, efficient coding (200k context)
  • deepseek-v3.1:671b-cloud — Massive reasoning (160k context)
Explore more at ollama.com/search?c=cloud.
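To start OpenCode with the recommended cloud model in a single step:
ollama launch opencode --model glm-4.7:cloud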

Local Models

  • qwen3-coder — Efficient code generation (~11GB VRAM)
  • glm-4.7 — Reasoning and coding (~25GB VRAM)
  • deepseek-coder — Specialized code model (~20GB VRAM)
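A typical local workflow is to pull the weights and confirm the download before launching (the VRAM figures above are approximate):
# Download a local coding model and verify it is installed
ollama pull qwen3-coder
ollama list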

Manual Setup

Add a configuration block to ~/.config/opencode/opencode.json:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "qwen3-coder": {
          "name": "qwen3-coder"
        }
      }
    }
  }
}
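Before starting OpenCode, you can confirm that the baseURL in this block is reachable by querying Ollama's OpenAI-compatible API (default local port assumed):
# Should return a JSON list of locally available models
curl http://localhost:11434/v1/models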

Cloud Models Configuration

For cloud models with context/output limits:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "glm-4.7:cloud": {
          "name": "glm-4.7:cloud",
          "limit": {
            "context": 202752,
            "output": 131072
          }
        }
      }
    }
  }
}
Ollama automatically adds these limits when you use ollama launch opencode.

Configuration Files

OpenCode uses two configuration files:

  • Main config: ~/.config/opencode/opencode.json (provider and model configuration)
  • State file: ~/.local/state/opencode/model.json (recent and favorite models)

An example state file:
{
  "recent": [
    {
      "providerID": "ollama",
      "modelID": "glm-4.7:cloud"
    },
    {
      "providerID": "ollama",
      "modelID": "qwen3-coder"
    }
  ],
  "favorite": [
    {
      "providerID": "ollama",
      "modelID": "glm-4.7:cloud"
    }
  ],
  "variant": {}
}
Ollama automatically manages both files when you use ollama launch opencode.

Multiple Models

OpenCode supports multiple models simultaneously. Use ollama launch opencode to configure several models:
ollama launch opencode
# → Select multiple models in the picker
# → All models become available in OpenCode
Switch between models in the OpenCode UI.

Connecting to ollama.com

To use cloud models hosted on ollama.com:
1. Create an API key on ollama.com.
2. Export the key:
export OLLAMA_API_KEY=your-key-here
3. Update your config by editing ~/.config/opencode/opencode.json:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama Cloud",
      "options": {
        "baseURL": "https://ollama.com/v1",
        "headers": {
          "Authorization": "Bearer ${OLLAMA_API_KEY}"
        }
      },
      "models": {
        "glm-4.7:cloud": {
          "name": "glm-4.7:cloud"
        }
      }
    }
  }
}
4. Restart OpenCode:
opencode
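As a quick sanity check that the key is accepted, you can query the models endpoint, assuming ollama.com mirrors the local OpenAI-compatible /v1/models route:
# Should return a JSON list of cloud models if the key is valid
curl -H "Authorization: Bearer $OLLAMA_API_KEY" https://ollama.com/v1/models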

Usage Examples

Start in a Project

cd ~/projects/my-app
opencode
OpenCode opens in your terminal with the configured models available.

Switch Models

Use the model picker in the OpenCode UI (usually Ctrl+M or Cmd+M).

Access Recent Models

Your most recently used models appear at the top of the picker.

Troubleshooting

Model Not Available

Ensure the model is pulled:
ollama pull glm-4.7:cloud
ollama list

Configuration Not Loading

Restart OpenCode to pick up config changes:
# Kill existing OpenCode instances
pkill opencode

# Start fresh
opencode

Context Window Issues

For local models, increase context:
ollama run qwen3-coder
# Then, inside the interactive session:
/set parameter num_ctx 65536
See Context Length for details.
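To make the larger context persistent rather than per-session, one option is to bake num_ctx into a derived model via a Modelfile; the name qwen3-coder-64k below is just an illustrative choice. Save the following as Modelfile:
FROM qwen3-coder
PARAMETER num_ctx 65536
Then create the derived model and reference it in opencode.json:
ollama create qwen3-coder-64k -f Modelfile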

Ollama Connection Failed

Verify Ollama is running:
ollama list
Check the base URL in your config matches your Ollama host.
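You can also query the server directly to confirm it is listening where your config points (default local port shown):
# Returns the running Ollama version if the server is up
curl http://localhost:11434/api/version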

Advanced Configuration

Custom Provider Name

{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My Local Ollama",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      }
    }
  }
}

Per-Model Settings

{
  "provider": {
    "ollama": {
      "models": {
        "qwen3-coder": {
          "name": "Qwen3 Coder (Local)",
          "limit": {
            "context": 32768,
            "output": 8192
          }
        }
      }
    }
  }
}

Environment Variables

OpenCode respects:
  • OLLAMA_HOST — Override Ollama server URL
  • OLLAMA_API_KEY — API key for ollama.com
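For example, to point the tools at a remote Ollama server (the address below is a placeholder), export the host and use the matching baseURL in opencode.json:
# Placeholder address; replace with your server
export OLLAMA_HOST=http://192.168.1.50:11434
# Then set "baseURL": "http://192.168.1.50:11434/v1" in ~/.config/opencode/opencode.json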

Backup Configuration

When using ollama launch opencode, Ollama creates backups in ~/.ollama/backups/ before modifying your configuration.
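To roll back, copy a backup over the live config; the filename below is a placeholder, so use one from the listing:
ls ~/.ollama/backups/
# Hypothetical filename; pick an actual backup from the listing above
cp ~/.ollama/backups/opencode.json.bak ~/.config/opencode/opencode.json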

Learn More

  • OpenCode Website: the official OpenCode website
  • Install Guide: detailed installation instructions
  • OpenAI API: Ollama's OpenAI-compatible API
  • Context Length: configure model context windows
