@ccusage/codex
Analyze OpenAI Codex CLI usage logs with the same reporting experience as ccusage. Track token usage, costs, and sessions for GPT-5 and other OpenAI models.

Beta: Codex CLI support is experimental. Expect breaking changes until the upstream Codex tooling stabilizes.
Quick Start
Installation
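No install step is strictly required; assuming standard npm distribution, the package can be run directly or installed globally (the global install is optional):

```shell
# Run directly without installing
npx @ccusage/codex@latest --help

# Or install globally (optional)
npm install -g @ccusage/codex
```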
Shell Alias (Recommended)
Since `npx @ccusage/codex@latest` is long to type repeatedly, set up a shell alias:
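For example, in `~/.bashrc` or `~/.zshrc` (the alias name here is just a suggestion):

```shell
# Short alias for the Codex usage analyzer (pick any name you like)
alias ccusage-codex="npx @ccusage/codex@latest"
```

After reloading your shell, `ccusage-codex daily` runs the daily report.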
Data Source
The CLI looks for Codex session JSONL files under:

- Default: `~/.codex/sessions/`
- Custom: set the `CODEX_HOME` environment variable

Token usage is read from `event_msg` records with `payload.type === "token_count"` containing usage data.
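An illustrative `token_count` line is sketched below; the field names match the token table in this document, but the exact payload envelope is an assumption and may differ between Codex versions:

```json
{"type":"event_msg","payload":{"type":"token_count","input_tokens":1200,"cached_input_tokens":1024,"output_tokens":310,"reasoning_output_tokens":128,"total_tokens":1510}}
```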
Common Commands
Daily Usage Report
View token usage and costs grouped by date:
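Assuming the same subcommand layout as ccusage:

```shell
# Daily token and cost report across all Codex sessions
npx @ccusage/codex@latest daily
```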
Date Range Filtering

Filter reports by specific date ranges:
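ccusage accepts `--since` and `--until` in `YYYYMMDD` form; assuming the same flags here:

```shell
# Limit the daily report to the first week of September 2025
npx @ccusage/codex@latest daily --since 20250901 --until 20250907
```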
JSON Output

Export structured data for scripting or integration:
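Assuming the ccusage-style `--json` flag:

```shell
# Emit the daily report as JSON, e.g. to pipe into jq
npx @ccusage/codex@latest daily --json | jq '.'
```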
Monthly Report

View usage aggregated by month:
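```shell
# Usage and cost rollup per calendar month
npx @ccusage/codex@latest monthly
```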
Session Report

Detailed breakdown by individual sessions:
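```shell
# Per-session breakdown of tokens and cost
npx @ccusage/codex@latest session
```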
Token Fields

The analyzer tracks these token types:

| Field | Description | Billing |
|---|---|---|
| `input_tokens` | Prompt tokens sent this turn | Priced at input rate minus cached share |
| `cached_input_tokens` | Prompt tokens served from cache | Priced at cached input rate (cheaper) |
| `output_tokens` | Completion tokens, including reasoning | Priced at output rate |
| `reasoning_output_tokens` | Structured reasoning breakdown | Informational only (included in output) |
| `total_tokens` | Cumulative total | Sum of input + output |
Features
- Responsive Tables: beautiful terminal tables shared with the ccusage CLI
- Offline Pricing: offline-first pricing cache with automatic LiteLLM refresh
- Per-Model Tracking: token and cost aggregation by model, including cached tokens
- Multiple Reports: daily, monthly, and session rollups with identical CLI options
Environment Variables
| Variable | Description | Default |
|---|---|---|
| `CODEX_HOME` | Override the root directory for Codex sessions | `~/.codex` |
| `LOG_LEVEL` | Control log verbosity (`0` silent … `5` trace) | - |
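Both variables are set the usual way; the path below is just an example:

```shell
# Read sessions from a non-default Codex home (CLI looks in $CODEX_HOME/sessions)
export CODEX_HOME="$HOME/work/.codex"

# One-off run with trace-level logging
LOG_LEVEL=5 npx @ccusage/codex@latest daily
```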
Model Support
GPT-5 and Aliases
The CLI supports GPT-5 models and automatically resolves aliases:

- `gpt-5-codex` maps to LiteLLM's `gpt-5` pricing
- `gpt-5` uses a 1M-token context window
- Automatic fallback for legacy sessions without model metadata
Pricing Notes
Example GPT-5 pricing (as of 2025-08-07):

- Input: $1.25 per 1M tokens
- Cached Input: $0.125 per 1M tokens (10x cheaper)
- Output: $10 per 1M tokens
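The rates above combine as in this sketch (the token counts are made up; per the token-fields table, the cached share is subtracted from billable input):

```shell
# Worked example at the GPT-5 rates above:
# a turn with 500k input tokens (400k of them cached) and 20k output tokens.
awk 'BEGIN {
  fresh  = (500000 - 400000) * 1.25  / 1e6;  # non-cached input: $0.125
  cached = 400000            * 0.125 / 1e6;  # cached input:     $0.05
  output = 20000             * 10    / 1e6;  # output:           $0.20
  printf "total: $%.3f\n", fresh + cached + output;
}'
# prints: total: $0.375
```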
Sessions from early September 2025 that lack model metadata are treated as `gpt-5` and marked with `isFallback: true` in JSON output. Pricing for these sessions is approximate.

Legacy JSONL Handling
For legacy JSONL files missing `turn_context` metadata:

- Tokens are attributed to `gpt-5` for visibility
- Pricing is approximate (flagged in reports)
- JSON output includes `"isFallback": true` on affected model entries
Links
npm Package
View on npm registry
OpenAI Codex
Official Codex CLI repository
ccusage Family
Explore all related tools
Main Documentation
Full ccusage documentation