Codex is OpenAI’s open-source coding agent that runs in your terminal. It can read, modify, and execute code with support for multiple models through Ollama.

Installation

Install Codex via npm:
npm install -g @openai/codex
Learn more at developers.openai.com/codex.

Quick Setup

ollama launch codex
Ollama automatically:
1. Selects a model — interactive picker with local and cloud models
2. Configures routing — sets up the model aliases (primary, fast)
3. Launches Codex — starts in your current directory with OSS mode

Configuration Only

Write the Codex configuration without launching it:
ollama launch codex --config

Pass Extra Arguments

Arguments after -- are passed through to Codex unchanged:
ollama launch codex -- --sandbox workspace-write
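The -- separator follows the standard getopt convention: everything before it belongs to ollama launch, everything after it is handed to Codex verbatim. A minimal sketch of that convention (the forward_to_codex function is illustrative, not part of Ollama):

```shell
# Skip launcher arguments up to --, drop the separator, and report
# what would be forwarded to Codex
forward_to_codex() {
  while [ "$#" -gt 0 ] && [ "$1" != "--" ]; do shift; done
  [ "$#" -gt 0 ] && shift   # drop the -- separator itself
  echo "codex args: $*"
}
forward_to_codex codex -- --sandbox workspace-write
# prints: codex args: --sandbox workspace-write
```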

Manual Setup

Codex requires a large context window (at least 64k tokens) for effective code understanding.
To use Codex with Ollama manually, use the --oss flag:
codex --oss
This tells Codex to use the open-source configuration, which defaults to connecting to Ollama.

Specify a Different Model

codex --oss -m gpt-oss:120b

Use a Cloud Model

codex --oss -m gpt-oss:120b-cloud

Cloud Models

  • gpt-oss:120b-cloud — Large reasoning model (130k context)
  • qwen3-coder:480b-cloud — Advanced code generation (260k context)
  • deepseek-v3.1:671b-cloud — Massive reasoning model (160k context)
More cloud models at ollama.com/search?c=cloud.

Local Models

  • gpt-oss:20b — Default model for Codex (~16GB VRAM)
  • gpt-oss:120b — Larger reasoning model (~80GB VRAM)
  • qwen3-coder — Efficient code generation (~11GB VRAM)

Configuration File

Codex stores configuration in ~/.codex/config.toml:
model = "gpt-oss:20b"
model_provider = "ollama"

[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
env_key = "OLLAMA_API_KEY"
Ollama automatically manages this file when you use ollama launch codex.
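If you ever need to recreate the file by hand (for example, after deleting it), the same contents can be written from the shell. This sketch overwrites any existing config, so back it up first:

```shell
# Write the default local configuration shown above (overwrites the file)
mkdir -p "$HOME/.codex"
cat > "$HOME/.codex/config.toml" <<'EOF'
model = "gpt-oss:20b"
model_provider = "ollama"

[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
env_key = "OLLAMA_API_KEY"
EOF
echo "wrote $HOME/.codex/config.toml"
```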

Model Aliases

Codex uses model aliases for routing:
  • primary — Main model for complex reasoning
  • fast — Lightweight model for quick operations
Ollama configures these during launch. Cloud models automatically populate the fast alias.

Features

  • Terminal native — designed for command-line workflows
  • Multi-file editing — edit multiple files simultaneously
  • Shell integration — execute commands and scripts
  • Context-aware — understands project structure

Usage Examples

Start in a Project

cd ~/projects/my-app
codex --oss

Ask Codex to Make Changes

codex --oss "Add unit tests for the API module"

Use with Sandbox Mode

codex --oss --sandbox workspace-write
Sandbox modes:
  • workspace-write — Allow file modifications
  • workspace-read — Read-only access
  • none — No file system access

Specify a Profile

codex --oss -p myprofile

Connecting to ollama.com

To use cloud models hosted on ollama.com:
1. Create an API key on ollama.com.
2. Export the key:
export OLLAMA_API_KEY=your-key-here
3. Update the config file. Edit ~/.codex/config.toml:
model = "gpt-oss:120b"
model_provider = "ollama"

[model_providers.ollama]
name = "Ollama"
base_url = "https://ollama.com/v1"
env_key = "OLLAMA_API_KEY"
4. Run Codex:
codex
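Before launching, it is worth confirming the key actually reached the environment (the value below is a placeholder, not a real key):

```shell
# The export must happen in the same shell session that launches Codex
export OLLAMA_API_KEY=your-key-here
[ -n "$OLLAMA_API_KEY" ] && echo "OLLAMA_API_KEY is set"
```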

Troubleshooting

Model Not Found

Pull the model first:
ollama pull gpt-oss:20b
ollama list

Connection Refused

Verify Ollama is running:
ollama list
Check the base URL in ~/.codex/config.toml matches your Ollama host.
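A quick reachability probe, assuming the default localhost port (/api/version is Ollama's version endpoint and needs no API key):

```shell
# Prints the server's version JSON when Ollama is up,
# or a notice when it is not reachable
curl -sf http://localhost:11434/api/version \
  || echo "Ollama is not reachable on localhost:11434"
```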

Context Window Too Small

For local models, increase the context window from inside an interactive session:
ollama run gpt-oss:20b
>>> /set parameter num_ctx 65536
See Context Length for more details.

OSS Flag Required

If you see authentication errors, ensure you’re using the --oss flag:
codex --oss
This flag tells Codex to use Ollama instead of OpenAI’s API.

Advanced Usage

Custom Configuration

Manually edit ~/.codex/config.toml for advanced settings:
model = "qwen3-coder:480b-cloud"
model_provider = "ollama"
max_tokens = 131072
temperature = 0.7

[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
env_key = "OLLAMA_API_KEY"

[sandbox]
mode = "workspace-write"

Multiple Profiles

Create different profiles for different workflows:
codex --oss -p frontend  # Frontend development
codex --oss -p backend   # Backend development
codex --oss -p testing   # Testing workflow

Environment Variables

Codex respects these environment variables:
  • OLLAMA_API_KEY — API key for ollama.com
  • CODEX_MODEL — Override default model
  • CODEX_PROFILE — Select a configuration profile
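Assuming these variables behave as listed, an override applies per invocation, e.g. CODEX_MODEL=qwen3-coder codex --oss. The default-plus-override pattern can be sketched with a stand-in function (pick_model is illustrative, not part of Codex):

```shell
# Fall back to a default when CODEX_MODEL is unset, otherwise honor it
pick_model() { echo "model=${CODEX_MODEL:-gpt-oss:20b}"; }
unset CODEX_MODEL
pick_model                          # default from the config
CODEX_MODEL=qwen3-coder pick_model  # one-off override
```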

Learn More

  • Codex Docs — Official Codex documentation
  • OpenAI API — Ollama's OpenAI-compatible API
  • Context Length — Configure model context windows
  • GPT-OSS Models — Browse GPT-OSS model variants
