Quick Start

Get ZeroClaw installed and running your first AI agent in minutes.

Prerequisites

You’ll need an API key from one of the supported providers. We recommend OpenRouter for quick starts (one key covers 200+ models). Get one at openrouter.ai/keys; a free tier is available.

Installation

Choose your preferred installation method:
brew install zeroclaw
Fastest method for macOS and Linux Homebrew users. Pre-built binaries, no compilation needed.

First-Time Setup

Run the onboarding wizard to configure your agent:
zeroclaw onboard
You’ll be prompted for:
1. Provider Selection

Choose your AI provider (default: openrouter). Popular options:
  • openrouter — Multi-provider gateway (200+ models)
  • anthropic — Claude models
  • openai — GPT-4, GPT-5
  • ollama — Local models
2. API Key

Enter your API key (securely encrypted at rest using ChaCha20-Poly1305)
# Environment variable (temporary)
export API_KEY="sk-or-v1-..."
zeroclaw onboard

# Or provide inline (non-interactive)
zeroclaw onboard --api-key "sk-or-v1-..." --provider openrouter
3. Model Selection

Pick a default model (optional; smart defaults are provided). Examples:
  • anthropic/claude-sonnet-4.6 (default for OpenRouter)
  • gpt-5.2 (OpenAI)
  • llama3.2 (Ollama)
4. Memory Backend

Choose how to store conversation history:
  • sqlite (default) — Embedded, zero-config
  • markdown — Human-readable, git-friendly
  • postgres — Distributed, multi-instance
  • lucid — High-performance vector search
  • none — Stateless mode
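Your choice is written to config.toml during onboarding. A sketch of what the resulting entry might look like (the table and key names here are assumptions, not confirmed by this guide; check the file onboarding actually generates):

```toml
# ~/.zeroclaw/config.toml (illustrative; verify against your generated file)
[memory]
backend = "sqlite"              # sqlite | markdown | postgres | lucid | none
# path = "~/.zeroclaw/memory"   # hypothetical override for the storage location
```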
5. OTP Pairing (Recommended)

Enable gateway authentication with a time-based OTP? Recommended for security: it prevents unauthorized API access. Skip with --no-totp if testing locally behind a firewall.
Skip the wizard? Use non-interactive mode:
zeroclaw onboard \
  --api-key "$API_KEY" \
  --provider openrouter \
  --model "anthropic/claude-sonnet-4.6" \
  --memory sqlite

Start Your Agent

Option 1: Interactive Chat

Direct command-line chat session:
zeroclaw chat "Hello! What can you help me with?"
For multi-turn conversations, start the agent loop:
zeroclaw agent
Type your messages and press Enter. The agent can:
  • Execute shell commands (with approval)
  • Read and write files in your workspace
  • Search the web
  • Call custom tools
By default, all tool executions require user approval. Configure autonomy levels in config.toml to adjust this behavior.

Option 2: Gateway + Dashboard

Start the HTTP gateway server with web UI:
zeroclaw gateway
By default, the gateway listens on http://127.0.0.1:3000/
1. Access Dashboard

Open the URL shown in startup logs (usually http://127.0.0.1:3000/)
2. Pair Device (if OTP enabled)

Enter the 6-digit TOTP code from your authenticator app
3. Send Messages

Use the web UI or POST to /api/chat:
curl -X POST http://127.0.0.1:3000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "List files in current directory"}'
The gateway supports Server-Sent Events (SSE) for streaming responses. Connect via /api/stream for real-time output.
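The streaming endpoint can be exercised from the command line as well. A minimal sketch, assuming /api/stream accepts the same JSON payload as /api/chat (the payload shape for the stream endpoint is not specified in this guide):

```shell
# Stream a response via SSE; -N disables curl's output buffering so
# events print as they arrive. Payload shape is assumed to match /api/chat.
curl -N -X POST http://127.0.0.1:3000/api/stream \
  -H "Content-Type: application/json" \
  -H "Accept: text/event-stream" \
  -d '{"message": "Summarize the workspace"}'
```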

Option 3: Channel Integration

Connect to communication platforms:
zeroclaw channel setup telegram
# Follow prompts to enter bot token

zeroclaw channel run telegram
Run multiple channels simultaneously in daemon mode:
zeroclaw service start
This starts all configured channels in the background.
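If you prefer editing configuration directly, channel settings live in config.toml after zeroclaw channel setup writes them. An illustrative fragment (the table and key names are assumptions; check what setup actually produces):

```toml
# ~/.zeroclaw/config.toml (illustrative; key names are assumptions)
[channels.telegram]
enabled = true
bot_token = "123456:ABC..."   # placeholder; use the token from @BotFather

[channels.discord]
enabled = false
```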

Example Workflows

1. File Operations

zeroclaw chat "Create a README.md with project description"
zeroclaw chat "Search for TODO comments in src/ directory"
zeroclaw chat "Count lines of Rust code in this project"

2. Web Research

zeroclaw chat "Search for latest Rust async best practices"
zeroclaw chat "Fetch and summarize https://blog.rust-lang.org/"

3. Code Analysis

zeroclaw chat "Analyze the architecture of src/agent/mod.rs"
zeroclaw chat "Suggest performance improvements for this codebase"

4. System Administration

zeroclaw chat "Check disk usage and clean up logs older than 30 days"
zeroclaw chat "Monitor CPU temperature and alert if over 80°C"
Shell commands execute with your user permissions. Review approval prompts carefully before accepting destructive operations.

Configuration Files

After onboarding, you’ll find:
~/.zeroclaw/
├── config.toml          # Main configuration
├── .secret_key          # Encryption key (DO NOT COMMIT)
├── workspace/           # Agent workspace directory
└── memory/              # Conversation history (sqlite/markdown)
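Since .secret_key must never be committed, it is worth locking down its permissions and ignoring it explicitly. A small sketch (the guards make it safe to run even before onboarding has created the file):

```shell
# Restrict the encryption key to your user and keep it out of git.
mkdir -p "$HOME/.zeroclaw"    # guard: no-op if onboarding already ran
key="$HOME/.zeroclaw/.secret_key"
if [ -f "$key" ]; then
  chmod 600 "$key"            # owner read/write only
fi
# If you version-control ~/.zeroclaw, make sure the key file is ignored.
ignore="$HOME/.zeroclaw/.gitignore"
grep -qx '.secret_key' "$ignore" 2>/dev/null || echo '.secret_key' >> "$ignore"
```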

Quick Config Edits

# ~/.zeroclaw/config.toml

# Change default model
default_model = "anthropic/claude-sonnet-4.6"
default_temperature = 0.7

# Adjust autonomy (approve, auto, supervised)
[autonomy]
level = "approve"  # Require approval for all tools
file_operations = "auto"  # Auto-approve file reads
shell_commands = "approve"  # Always ask for shell commands

# Configure gateway
[gateway]
host = "127.0.0.1"
port = 3000
pairing_required = true
See the Configuration Guide for the full reference.

Next Steps

Installation Details

Platform-specific instructions, Docker setup, hardware builds

Configuration Guide

Deep dive into config.toml, environment variables, advanced settings

Tool Reference

Built-in tools, custom tool creation, WASM plugins

Channel Setup

Configure Telegram, Discord, Slack, Matrix, and more

Troubleshooting

Command not found after install

Ensure ~/.cargo/bin is in your PATH:
export PATH="$HOME/.cargo/bin:$PATH"

# Add to ~/.bashrc or ~/.zshrc to persist
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.bashrc
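To confirm the fix took effect in your current shell, a quick check (prints a hint if the binary still isn't found):

```shell
# Report where zeroclaw resolves from, or hint at the likely fix.
if command -v zeroclaw >/dev/null 2>&1; then
  echo "zeroclaw found at: $(command -v zeroclaw)"
else
  echo "zeroclaw not on PATH; is ~/.cargo/bin (or your install dir) included?"
fi
```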
Build runs out of memory

Source builds need ~2 GB of RAM. Solutions:
  1. Use pre-built binaries: ./bootstrap.sh --prefer-prebuilt
  2. Enable swap space:
    sudo dd if=/dev/zero of=/swapfile bs=1M count=2048
    sudo mkswap /swapfile
    sudo swapon /swapfile
    
  3. Reduce parallelism: CARGO_BUILD_JOBS=1 ./bootstrap.sh
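Before choosing between swap and reduced parallelism, it helps to see how much memory is actually available. A quick check on Linux (reads /proc/meminfo; macOS users can use vm_stat instead):

```shell
# Print available memory in MiB and warn if below the ~2 GB source builds need.
avail_mib=$(awk '/MemAvailable/ {print int($2/1024)}' /proc/meminfo)
echo "Available memory: ${avail_mib} MiB"
if [ "${avail_mib:-0}" -lt 2048 ]; then
  echo "Below 2 GiB: add swap or build with CARGO_BUILD_JOBS=1"
fi
```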
Authentication errors

Check your API key:
# Re-run onboarding
zeroclaw onboard --force

# Or edit config directly
nano ~/.zeroclaw/config.toml
Verify the key is valid at your provider’s dashboard.
Port 3000 already in use

Another service is using port 3000:
# Find the process
lsof -i :3000

# Change ZeroClaw port
zeroclaw gateway --port 3001

# Or edit config
# [gateway]
# port = 3001
For more troubleshooting, see the Troubleshooting Guide in the repository.
