Introduction
Jan’s CLI enables you to run local AI models from the command line and wire them to AI coding agents like Claude Code or OpenCode, with no cloud required. The CLI shares all core logic with the Jan desktop app, so models downloaded in the desktop app are automatically available in the CLI.
Key Features
- OpenAI-Compatible API: Serves models via an OpenAI-compatible endpoint
- Auto-Detection: Automatically detects LlamaCPP or MLX engines
- Agent Integration: Pre-wires environment variables for Claude Code, Codex, and OpenClaw
- HuggingFace Downloads: Auto-download models from HuggingFace repos
- Background Mode: Run models in detached mode with --detach
- Context Auto-Fit: Maximize the context window based on available VRAM
Installation
The Jan CLI is included with the Jan desktop application. It can also be built from source; the resulting binary is at target/release/jan-cli (or target/debug/jan-cli for debug builds).
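The build command itself is not shown on this page; given the target/release output path, a standard Cargo build is the likely invocation (an assumption inferred from the path convention, not confirmed here):

```shell
# Assumption: the CLI lives in a standard Cargo workspace,
# implied by the target/release/ output path mentioned above.
cargo build --release

# The optimized binary then lands at:
#   target/release/jan-cli
```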
Quick Start
Serve a Model
Start a local model server.
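The exact command is covered in the Serve Command reference; as a hedged sketch, serving likely looks like the following, where the subcommand name and model ID are illustrative assumptions (only the --detach flag is documented above):

```shell
# Hypothetical invocation: subcommand and model ID are assumptions.
jan-cli serve <model-id>

# Background mode, using the documented --detach flag:
jan-cli serve <model-id> --detach
```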
Launch an AI Agent
Start a model and automatically launch Claude Code with pre-configured environment variables. The launch command will:
- Load the specified model
- Set environment variables (OPENAI_BASE_URL, ANTHROPIC_BASE_URL, etc.)
- Launch Claude Code with the local model pre-wired
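As a sketch of what "pre-wired" means: the agent simply inherits base-URL variables that point at the local server. The host and port below are assumptions; substitute whatever your server actually reports.

```shell
# Hypothetical wiring: the local endpoint and port are assumptions,
# not values documented on this page.
export OPENAI_BASE_URL="http://127.0.0.1:1337/v1"
export ANTHROPIC_BASE_URL="http://127.0.0.1:1337"

# Any OpenAI-compatible client launched from this shell now talks
# to the local model instead of a cloud endpoint.
echo "$OPENAI_BASE_URL"
```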
List Available Models
View all models installed in your Jan data folder.
Architecture
Data Folder
Jan stores models, threads, and configuration in a platform-specific data folder:
- macOS: ~/Library/Application Support/Jan
- Linux: ~/.config/Jan
- Windows: %APPDATA%\Jan
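The platform-specific paths above can be resolved in a shell like so (a convenience sketch for scripting; Jan resolves this internally):

```shell
# Resolve Jan's data folder using the paths listed above.
case "$(uname -s)" in
  Darwin)        JAN_DATA="$HOME/Library/Application Support/Jan" ;;
  Linux)         JAN_DATA="$HOME/.config/Jan" ;;
  MINGW*|MSYS*)  JAN_DATA="$APPDATA/Jan" ;;  # Windows (Git Bash / MSYS)
esac
echo "$JAN_DATA"
```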
Engine Auto-Detection
Jan automatically detects the appropriate inference engine:
- LlamaCPP: For GGUF models (most common)
- MLX: For MLX models on macOS/Apple Silicon
Detection is based on each model's model.yml file in the data folder.
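The selection rule can be illustrated with a toy function; this mirrors the description above, not Jan's actual implementation:

```shell
# Toy illustration of the detection rule described above
# (not Jan's real code): GGUF files map to LlamaCPP,
# anything else is treated as an MLX model.
detect_engine() {
  case "$1" in
    *.gguf) echo "llamacpp" ;;
    *)      echo "mlx" ;;
  esac
}

detect_engine "qwen3-4b-q4_k_m.gguf"   # prints: llamacpp
```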
Binary Discovery
Jan auto-discovers inference binaries from the Jan app installation:
- llama-server: Found in Jan’s app bundle or data folder
- mlx-server: Found in Jan.app on macOS
You can override the discovered binary with --bin <path> if needed.
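The --bin flag is the documented escape hatch when auto-discovery fails; the subcommand and binary path below are illustrative assumptions:

```shell
# Hypothetical: point the CLI at a specific llama-server binary.
# The subcommand and path are assumptions; only --bin <path> is
# documented above.
jan-cli serve <model-id> --bin /usr/local/bin/llama-server
```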
Environment Variables
HuggingFace Authentication
Set a HuggingFace token to download private/gated models.
Logging
Enable verbose logging.
Next Steps
Commands Reference
Complete reference for all CLI commands
Serve Command
Detailed guide for serving models
Launch Command
Wire AI agents to local models