
Prerequisites

You must have OpenCode 1.0.150 or higher installed before proceeding. Check your version:
opencode --version
If OpenCode isn’t installed, visit https://opencode.ai/docs for installation instructions.

Installation methods

Method 1: LLM agent installation (recommended)

The easiest and most reliable way:
1. Copy this prompt to your LLM agent

Paste this into Claude Code, AmpCode, Cursor, or any agent:
Install and configure oh-my-opencode by following the instructions here:
https://raw.githubusercontent.com/code-yeongyu/oh-my-opencode/refs/heads/dev/docs/guide/installation.md
The agent will:
  • Ask about your AI subscriptions
  • Run the installer with correct flags
  • Configure agent models automatically
  • Guide you through provider authentication
  • Verify the setup is working
2. Answer subscription questions

The agent will ask about:
  • Claude Pro/Max subscription (max20 mode?)
  • ChatGPT Plus subscription
  • Gemini access
  • GitHub Copilot subscription
  • OpenCode Zen access
  • Z.ai Coding Plan subscription
Based on your answers, it runs the installer with appropriate flags.
3. Authenticate providers

The agent will guide you through authentication for each provider you selected.

Method 2: Interactive installer

Run the installer manually and answer the prompts:
bunx oh-my-opencode install
The CLI ships with standalone binaries for all major platforms:
  • macOS (ARM64, x64)
  • Linux (x64, ARM64, Alpine/musl)
  • Windows (x64)
No runtime (Bun/Node.js) is required to run the CLI after installation.

Method 3: Non-interactive installer

If you know exactly which providers you have, use flags:
bunx oh-my-opencode install --no-tui \
  --claude=max20 \
  --openai=yes \
  --gemini=yes \
  --copilot=no
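The subscription answers map one-to-one onto installer flags, so you can compose the command programmatically. A sketch, using the flag names from the example above (the values here are placeholders for your own subscriptions):

```shell
# Sketch: compose the non-interactive install command from subscription answers.
# Flag names come from the example above; adjust values to your subscriptions.
CLAUDE=max20 OPENAI=yes GEMINI=yes COPILOT=no
CMD="bunx oh-my-opencode install --no-tui --claude=$CLAUDE --openai=$OPENAI --gemini=$GEMINI --copilot=$COPILOT"
echo "$CMD"
```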

Provider authentication

After installation, authenticate the providers you selected.

Anthropic (Claude)

1. Start authentication

opencode auth login
2. Select the Anthropic provider

In the interactive terminal, choose:
  • Provider: Anthropic
  • Login method: Claude Pro/Max
3. Complete the OAuth flow

A browser window will open. Sign in with your Anthropic account and authorize OpenCode.
4. Verify success

opencode config list
You should see Anthropic listed with models like anthropic/claude-opus-4-6.
Claude Opus 4.6 is strongly recommended for Sisyphus; using other models may significantly degrade the experience. Sisyphus has Claude-optimized prompts, and no GPT prompt exists for it.

Google Gemini (Antigravity OAuth)

1. Add the Antigravity auth plugin

Edit ~/.config/opencode/opencode.json:
{
  "plugin": [
    "oh-my-opencode",
    "opencode-antigravity-auth@latest"
  ]
}
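After editing, a quick check confirms the entry made it in. The sketch below writes a temp copy for illustration; in practice, point CONFIG at ~/.config/opencode/opencode.json:

```shell
# Illustrative check that the auth plugin entry is present in opencode.json.
# Temp copy here; use CONFIG=~/.config/opencode/opencode.json for real use.
CONFIG=/tmp/opencode.json
cat > "$CONFIG" <<'EOF'
{
  "plugin": [
    "oh-my-opencode",
    "opencode-antigravity-auth@latest"
  ]
}
EOF
grep -q 'opencode-antigravity-auth' "$CONFIG" && echo "auth plugin present"
```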
2. Configure Gemini models

Read the opencode-antigravity-auth documentation. Copy the full model configuration from its README and merge it carefully to avoid breaking your existing setup.
The plugin uses a variant system:
  • antigravity-gemini-3-pro supports low/high variants
  • Use --variant=high instead of separate -high model entries
3. Override Oh My OpenCode agent models

In ~/.config/opencode/oh-my-opencode.json:
{
  "agents": {
    "multimodal-looker": {
      "model": "google/antigravity-gemini-3-flash"
    }
  }
}
Available Antigravity models:
  • google/antigravity-gemini-3-pro — variants: low, high
  • google/antigravity-gemini-3-flash — variants: minimal, low, medium, high
  • google/antigravity-claude-sonnet-4-6 — no variants
  • google/antigravity-claude-sonnet-4-6-thinking — variants: low, max
  • google/antigravity-claude-opus-4-5-thinking — variants: low, max
Available Gemini CLI quota models:
  • google/gemini-2.5-flash, google/gemini-2.5-pro, google/gemini-3-flash-preview, google/gemini-3-pro-preview
4. Authenticate

opencode auth login
  • Provider: Google
  • Login method: OAuth with Google (Antigravity)
Complete the sign-in in your browser; completion is detected automatically.
5. Multi-account load balancing (optional)

The plugin supports up to 10 Google accounts. When one hits rate limits, it automatically switches to the next available account. To add more accounts, run opencode auth login again and select Google.
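The failover behavior can be pictured as a simple rotation over authenticated accounts. This is purely an illustrative sketch with made-up account names, not the plugin's actual implementation:

```shell
# Illustrative account rotation: on a rate limit, move the current account to
# the back of the queue and switch to the next one. Not the plugin's real code.
ACCOUNTS="alice@gmail.com bob@gmail.com carol@gmail.com"   # hypothetical accounts
rotate_on_rate_limit() {
  set -- $ACCOUNTS
  first=$1; shift                # demote the rate-limited account
  ACCOUNTS="$* $first"
  echo "switched to: $1"         # the next account takes over
}
rotate_on_rate_limit
```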

OpenAI (ChatGPT Plus)

1. Authenticate

opencode auth login
  • Provider: OpenAI
  • Login method: ChatGPT Plus
2. Verify GPT models are available

opencode config list
You should see:
  • openai/gpt-5.3-codex — Required for Hephaestus agent
  • openai/gpt-5.2 — Used for Oracle agent
  • openai/gpt-5-nano — Fast utility tasks
Why GPT access matters:
  • Hephaestus (deep autonomous worker) requires GPT-5.3-codex
  • Oracle (architecture consultant) prefers GPT-5.2 for high-IQ reasoning
  • Prometheus and Atlas have dual prompts that auto-switch to GPT-optimized versions when GPT models are detected

GitHub Copilot (fallback provider)

GitHub Copilot acts as a proxy provider when native providers are unavailable.
1. Run the installer with the Copilot flag

bunx oh-my-opencode install --no-tui --copilot=yes
2. Authenticate

opencode auth login
  • Provider: GitHub
  • Method: OAuth
Provider priority:
Native (anthropic/, openai/, google/) > GitHub Copilot > OpenCode Zen > Z.ai Coding Plan
Model mappings when Copilot is the best available provider:
  • Sisyphus: github-copilot/claude-opus-4-6
  • Oracle: github-copilot/gpt-5.2
  • Explore: opencode/gpt-5-nano
  • Librarian: zai-coding-plan/glm-4.7 (if Z.ai available)
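The priority order can be read as a first-match scan down the provider list. A sketch of that logic, using provider ids from this guide (illustrative only, not the plugin's actual code):

```shell
# Illustrative first-match scan over the documented provider priority order.
AVAILABLE="github-copilot zai-coding-plan"   # hypothetical: providers you have authenticated
pick_provider() {
  # Priority: native (anthropic, openai, google) > copilot > zen > z.ai
  for p in anthropic openai google github-copilot opencode zai-coding-plan; do
    case " $AVAILABLE " in
      *" $p "*) echo "$p"; return 0;;
    esac
  done
  echo "none"
}
BEST=$(pick_provider)
echo "$BEST"
```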

Z.ai Coding Plan

Z.ai provides access to GLM-4.7 models. When enabled, the Librarian agent always uses zai-coding-plan/glm-4.7, regardless of other providers.
1. Run the installer with the Z.ai flag

bunx oh-my-opencode install --no-tui --zai-coding-plan=yes
2. Authenticate via the Z.ai dashboard

Visit https://z.ai/subscribe and subscribe to the Coding Plan ($10/month). Follow Z.ai's authentication instructions to connect it with OpenCode.
Model mappings when Z.ai is the only provider:
  • Sisyphus: zai-coding-plan/glm-4.7
  • Oracle: zai-coding-plan/glm-4.7
  • Explore: zai-coding-plan/glm-4.7-flash
  • Librarian: zai-coding-plan/glm-4.7

OpenCode Zen

OpenCode Zen provides access to opencode/ prefixed models:
  • opencode/claude-opus-4-6
  • opencode/gpt-5.2
  • opencode/gpt-5-nano
  • opencode/glm-4.7-free
1. Run the installer with the OpenCode Zen flag

bunx oh-my-opencode install --no-tui --opencode-zen=yes
2. Check available models

opencode config list
Models are automatically available through OpenCode’s built-in provider.
Model mappings when OpenCode Zen is the best available provider:
  • Sisyphus: opencode/claude-opus-4-6
  • Oracle: opencode/gpt-5.2
  • Explore: opencode/gpt-5-nano
  • Librarian: opencode/glm-4.7-free

Verify installation

1. Check the OpenCode version

opencode --version
Should be 1.0.150 or higher.
2. Verify the plugin is loaded

cat ~/.config/opencode/opencode.json
Should contain "oh-my-opencode" in the plugin array:
{
  "plugin": [
    "oh-my-opencode"
  ]
}
3. Check agent configuration

cat ~/.config/opencode/oh-my-opencode.json
You should see agent models configured based on your subscriptions.
4. Run a test command

opencode
In the session, type:
@sisyphus Hello!
Sisyphus should respond, confirming the plugin is working.

Understanding your model setup

The installer auto-configured agent models based on your subscriptions. Here’s what you got:

Model families

Claude-like models (instruction-following, structured output):
  • Claude Opus 4.6, Claude Sonnet 4.6, Claude Haiku 4.5
  • Kimi K2.5 — behaves very similarly to Claude, great all-rounder
  • GLM 5 — Claude-like behavior, good for broad tasks
  • Big Pickle (GLM 4.6) — free-tier GLM, decent fallback
GPT models (explicit reasoning, principle-driven):
  • GPT-5.3-codex — deep coding powerhouse, required for Hephaestus
  • GPT-5.2 — high intelligence, default for Oracle
  • GPT-5-Nano — ultra-cheap, fast utility tasks
Different-behavior models:
  • Gemini 3 Pro — excels at visual/frontend tasks, different reasoning style
  • Gemini 3 Flash — fast, good for doc search and light tasks
  • MiniMax M2.5 — fast and smart for utility tasks
  • Grok Code Fast 1 — very fast, optimized for code grep/search

Agent model assignments

Claude-optimized agents (prompts tuned for Claude-family):
  • Sisyphus (main ultraworker): Opus (max) → Kimi K2.5 → GLM 5 → Big Pickle
  • Metis (plan review): Opus (max) → Kimi K2.5 → GPT-5.2 → Gemini 3 Pro
Never use GPT models for Sisyphus: no GPT prompt exists, and quality will degrade significantly.
Dual-prompt agents (auto-switch between Claude and GPT prompts):
  • Prometheus (strategic planner): Opus (max) → GPT-5.2 (high) → Kimi K2.5 → Gemini 3 Pro; GPT prompt: yes, XML-tagged and principle-driven
  • Atlas (todo orchestrator): Kimi K2.5 → Sonnet → GPT-5.2; GPT prompt: yes, GPT-optimized todo management
GPT-native agents (built for GPT, don’t override to Claude):
  • Hephaestus (deep autonomous worker): GPT-5.3-codex (medium) only
  • Oracle (architecture/debugging): GPT-5.2 (high) → Gemini 3 Pro → Opus
  • Momus (high-accuracy reviewer): GPT-5.2 (medium) → Opus → Gemini 3 Pro
Utility agents (speed over intelligence):
  • Explore (fast codebase grep): MiniMax M2.5 Free → Grok Code Fast → Haiku
  • Librarian (docs/code search): MiniMax M2.5 Free → Gemini Flash → Big Pickle
  • Multimodal Looker (vision/screenshots): Kimi K2.5 → Kimi Free → Gemini Flash
Don’t “upgrade” utility agents to Opus. They’re intentionally using fast, cheap models. Explore and Librarian don’t need deep reasoning for grep and search tasks.

Safe vs dangerous overrides

Safe (same family):
  • Sisyphus: Opus → Sonnet, Kimi K2.5, GLM 5
  • Prometheus: Opus → GPT-5.2 (auto-switches prompt)
  • Atlas: Kimi K2.5 → Sonnet, GPT-5.2 (auto-switches)
Dangerous (no prompt support):
  • Sisyphus → GPT: No GPT prompt. Will degrade significantly.
  • Hephaestus → Claude: Built for Codex. Claude can’t replicate this.
  • Explore → Opus: Massive cost waste. Explore needs speed, not intelligence.
  • Librarian → Opus: Same. Doc search doesn’t need Opus-level reasoning.
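The safe/dangerous rules above can be summarized as a small guard function. This is purely an illustrative sketch encoding the rules from this section; check_override is a hypothetical helper, not part of oh-my-opencode:

```shell
# Illustrative guard encoding the dangerous-override rules above.
check_override() {  # usage: check_override <agent> <model>
  case "$1:$2" in
    sisyphus:openai/*)               echo "dangerous: no GPT prompt for Sisyphus";;
    hephaestus:anthropic/*)          echo "dangerous: Hephaestus is built for Codex";;
    explore:*opus*|librarian:*opus*) echo "wasteful: utility agents need speed, not Opus";;
    *)                               echo "ok";;
  esac
}
VERDICT=$(check_override sisyphus openai/gpt-5.2)
echo "$VERDICT"
```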

Custom configuration

Override specific agents or categories in ~/.config/opencode/oh-my-opencode.json:
{
  "$schema": "https://raw.githubusercontent.com/code-yeongyu/oh-my-opencode/dev/assets/oh-my-opencode.schema.json",

  "agents": {
    "sisyphus": {
      "model": "kimi-for-coding/k2p5",
      "ultrawork": { "model": "anthropic/claude-opus-4-6", "variant": "max" }
    },
    "oracle": {
      "model": "openai/gpt-5.2",
      "variant": "high"
    }
  },

  "categories": {
    "visual-engineering": {
      "model": "google/gemini-3-pro",
      "variant": "high"
    },
    "quick": {
      "model": "anthropic/claude-haiku-4-5"
    }
  }
}
See the Configuration Reference for all available options.
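A malformed override file is an easy mistake, so it's worth validating the JSON before restarting OpenCode. The sketch below uses a temp copy and assumes python3 is available; in practice, point CONFIG at ~/.config/opencode/oh-my-opencode.json:

```shell
# Validate that an override file parses as JSON before restarting OpenCode.
# Temp copy for illustration; use the real config path in practice.
CONFIG=/tmp/oh-my-opencode.json
cat > "$CONFIG" <<'EOF'
{
  "agents": {
    "oracle": { "model": "openai/gpt-5.2", "variant": "high" }
  }
}
EOF
if python3 -m json.tool "$CONFIG" > /dev/null 2>&1; then
  RESULT="valid JSON"
else
  RESULT="invalid JSON"
fi
echo "$RESULT"
```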

Troubleshooting

Check that ~/.config/opencode/opencode.json contains "oh-my-opencode" in the plugin array. If it's missing, re-run the installer:
bunx oh-my-opencode install --no-tui
Clear auth cache and re-authenticate:
rm -rf ~/.config/opencode/auth
opencode auth login
Check if Claude is authenticated:
opencode config list | grep anthropic
If not listed, run:
opencode auth login
Select Anthropic → Claude Pro/Max.
Hephaestus requires GPT-5.3-codex access (ChatGPT Plus subscription). Verify:
opencode config list | grep gpt-5.3-codex
If not listed, authenticate OpenAI:
opencode auth login
The plugin load timeout may be too short; the plugin needs time to load all configurations. Wait 10-15 seconds after starting OpenCode, then check again:
opencode config list

Next steps

  • Run your first ultrawork: try ultrawork mode and see parallel agent execution in action
  • Learn about agents: deep dive into each agent's role and capabilities
  • Customize configuration: override models, disable features, and tune for your workflow
  • Troubleshooting guide: solutions for common issues and advanced debugging
