## Installation
Install Codex via npm with `npm install -g @openai/codex`.

## Quick Setup
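The setup commands did not survive extraction here. Based on the `ollama launch codex` command referenced later on this page (Configuration File section), the quickest path is presumably:

```shell
# Configure and start Codex against your local Ollama in one step
ollama launch codex
```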
### Configuration Only
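No command survived extraction for this step. A plausible sketch, assuming `ollama launch` accepts a flag that writes configuration without starting Codex (the flag name is hypothetical):

```shell
# Hypothetical flag: write Codex configuration without launching it
ollama launch codex --config
```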
### Pass Extra Arguments
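Presumably extra arguments are forwarded to Codex itself after a `--` separator (the separator convention is an assumption):

```shell
# Forward flags through to codex (separator convention assumed)
ollama launch codex -- --model gpt-oss:120b
```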
### Manual Setup
Codex requires a large context window (at least 64k tokens) for effective code understanding.
Start Codex against your local Ollama server with the `--oss` flag:
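With Ollama running locally, this looks like:

```shell
# Uses the local open-model provider; defaults to gpt-oss:20b
codex --oss
```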
### Specify a Different Model
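Codex's `-m`/`--model` flag selects the model by name, for example one of the models listed under Recommended Models:

```shell
codex --oss -m gpt-oss:120b
```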
### Cloud Models
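Assuming the same `-m` flag, a cloud-hosted model is selected by its `-cloud` name:

```shell
# Cloud model names carry a -cloud suffix (see Recommended Models)
codex --oss -m gpt-oss:120b-cloud
```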
## Recommended Models

### Cloud Models

- `gpt-oss:120b-cloud`: Large reasoning model (130k context)
- `qwen3-coder:480b-cloud`: Advanced code generation (260k context)
- `deepseek-v3.1:671b-cloud`: Massive reasoning model (160k context)
### Local Models

- `gpt-oss:20b`: Default model for Codex (~16GB VRAM)
- `gpt-oss:120b`: Larger reasoning model (~80GB VRAM)
- `qwen3-coder`: Efficient code generation (~11GB VRAM)
## Configuration File
Codex stores its configuration in `~/.codex/config.toml`:
Example configuration:
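The example itself did not survive extraction. A minimal sketch, assuming a provider-table layout (the table and field names, and the local URL, are assumptions):

```toml
# Illustrative sketch only; field names assumed
model = "gpt-oss:20b"
model_provider = "oss"

[model_providers.oss]
name = "Ollama"
base_url = "http://localhost:11434/v1"
```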
This file is created automatically when you run `ollama launch codex`.
## Model Aliases

Codex uses model aliases for routing:

- `primary`: Main model for complex reasoning
- `fast`: Lightweight model for quick operations

Quick operations are handled by whichever model is mapped to the `fast` alias.
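How the aliases map to model names is not shown here; a hypothetical mapping (the TOML schema is assumed) might look like:

```toml
# Hypothetical alias-to-model mapping; schema assumed
[aliases]
primary = "gpt-oss:120b"
fast = "gpt-oss:20b"
```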
## Features

- **Terminal Native**: Designed for command-line workflows
- **Multi-file Editing**: Edit multiple files simultaneously
- **Shell Integration**: Execute commands and scripts
- **Context-Aware**: Understands project structure
## Usage Examples
### Start in a Project
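For example (the directory name is a placeholder):

```shell
cd my-project    # placeholder path
codex --oss
```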
### Ask Codex to Make Changes
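Codex accepts an initial prompt on the command line; for example (the prompt is illustrative):

```shell
codex --oss "add docstrings to every public function"
```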
### Use with Sandbox Mode
- `workspace-write`: Allow file modifications
- `workspace-read`: Read-only access
- `none`: No file system access
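Assuming the mode is passed via Codex's `--sandbox` flag, this would look like:

```shell
codex --oss --sandbox workspace-write
```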
### Specify a Profile
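Codex's `--profile` flag selects a profile defined in `~/.codex/config.toml` (the profile name is a placeholder):

```shell
codex --oss --profile work
```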
## Connecting to ollama.com
To use cloud models hosted on ollama.com:

1. Create an API key: go to ollama.com/settings/keys.
2. Set it in the `OLLAMA_API_KEY` environment variable.
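Per the `OLLAMA_API_KEY` variable listed under Environment Variables, this might look like (the key value is a placeholder):

```shell
export OLLAMA_API_KEY="your-key"   # placeholder value
codex --oss -m gpt-oss:120b-cloud
```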
## Troubleshooting
### Model Not Found
Pull the model first, e.g. `ollama pull gpt-oss:20b`.

### Connection Refused

Verify that Ollama is running, and check that the host in `~/.codex/config.toml` matches your Ollama host.
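A quick check, assuming the default local endpoint:

```shell
ollama ps                     # lists loaded models if the server is up
curl http://localhost:11434   # a running server replies "Ollama is running"
```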
### Context Window Too Small
For local models, increase the context window; one way is Ollama's `OLLAMA_CONTEXT_LENGTH` environment variable (for example, `OLLAMA_CONTEXT_LENGTH=64000 ollama serve`).

### OSS Flag Required

If you see authentication errors, ensure you're using the `--oss` flag:
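That is, launch with:

```shell
codex --oss
```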
## Advanced Usage
### Custom Configuration
Manually edit `~/.codex/config.toml` for advanced settings:
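For instance, pointing Codex at a non-default Ollama host (the table and field names are assumptions; the address is a placeholder):

```toml
# Assumed schema: point Codex at a remote Ollama host
[model_providers.oss]
name = "Ollama"
base_url = "http://192.168.1.50:11434/v1"
```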
### Multiple Profiles
Create different profiles for different workflows:
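A sketch of what this might look like (the section names and fields are assumptions; the model names come from this page):

```toml
# Hypothetical profiles; schema assumed
[profiles.local]
model = "gpt-oss:20b"

[profiles.cloud]
model = "qwen3-coder:480b-cloud"
```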
### Environment Variables

Codex respects these environment variables:

- `OLLAMA_API_KEY`: API key for ollama.com
- `CODEX_MODEL`: Override the default model
- `CODEX_PROFILE`: Select a configuration profile
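For example, combining these for a one-off run (the values are placeholders):

```shell
CODEX_MODEL=gpt-oss:120b CODEX_PROFILE=work codex --oss
```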
## Learn More

- **Codex Docs**: Official Codex documentation
- **OpenAI API**: Ollama's OpenAI-compatible API
- **Context Length**: Configure model context windows
- **GPT-OSS Models**: Browse GPT-OSS model variants