OneClawConfig struct represents the top-level configuration for the OneClaw agent, loaded from a TOML file or using defaults.
Struct Definition
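A sketch of the top-level struct is shown below. Field names and sub-struct contents are inferred from the sections that follow and are assumptions, not the crate's actual definitions; the sub-structs are shown as empty placeholders.

```rust
// Placeholder sub-structs; their fields are described in the sections below.
#[derive(Debug, Default)]
pub struct SecurityConfig;
#[derive(Debug, Default)]
pub struct RuntimeConfig;
#[derive(Debug, Default)]
pub struct ProvidersConfig;
#[derive(Debug, Default)]
pub struct MemoryConfig;
#[derive(Debug, Default)]
pub struct ChannelsConfig;
#[derive(Debug, Default)]
pub struct TelegramConfig;
#[derive(Debug, Default)]
pub struct MqttConfig;
#[derive(Debug, Default)]
pub struct ProviderConfigToml;

// Hedged sketch of the top-level struct; field names are inferred
// from the section list and may differ from the real crate.
#[derive(Debug, Default)]
pub struct OneClawConfig {
    pub security: SecurityConfig,
    pub runtime: RuntimeConfig,
    pub providers: ProvidersConfig,           // legacy, pre-v1.5
    pub memory: MemoryConfig,
    pub channels: ChannelsConfig,
    pub telegram: Option<TelegramConfig>,     // optional section
    pub mqtt: Option<MqttConfig>,             // optional section
    pub provider: Option<ProviderConfigToml>, // v1.5 multi-provider
}
```

The optional sections (`telegram`, `mqtt`, `provider`) are modeled as `Option` so a missing TOML table simply deserializes to `None`.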
Configuration Sections
SecurityConfig
Security layer configuration.
Whether to deny all unauthorized requests by default
Whether device pairing is required before interaction
Whether to restrict operations to the workspace directory
Path to SQLite database for persistent device pairing
Whether to persist pairing state to SQLite
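A possible TOML fragment for this section is sketched below; the key names are inferred from the field descriptions above and may differ from the actual schema.

```toml
# Hypothetical key names, inferred from the field descriptions above.
[security]
deny_by_default = true            # deny all unauthorized requests by default
require_pairing = true            # require device pairing before interaction
workspace_only = true             # restrict operations to the workspace directory
persist_pairing = true            # persist pairing state to SQLite
pairing_db_path = "pairing.db"    # SQLite database for persistent device pairing
```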
RuntimeConfig
Runtime configuration (agent name, logging).
The agent instance name
The log level filter (e.g., “debug”, “info”, “warn”, “error”)
ProvidersConfig
LLM provider configuration (legacy; use the provider section for v1.5+).
Default provider: “ollama”, “openai”, “noop”
LLM call timeout threshold in seconds (monitoring, not cancellation)
Ollama provider configuration (url, model)
OpenAI provider configuration (api_key, model, base_url)
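A sketch of the legacy providers section in TOML; the key names are assumptions based on the field descriptions above.

```toml
# Hypothetical key names for the legacy providers section.
[providers]
default = "ollama"                # "ollama", "openai", or "noop"
timeout_secs = 120                # LLM call timeout threshold (monitoring, not cancellation)

[providers.ollama]
url = "http://localhost:11434"
model = "llama3"

[providers.openai]
api_key = "sk-..."                # placeholder
model = "gpt-4o"
base_url = "https://api.openai.com/v1"
```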
MemoryConfig
Memory backend configuration.
Backend type: “sqlite” or “noop”
Path to SQLite database file for memory storage
ChannelsConfig
Communication channels configuration.
Active channels: [“cli”], [“cli”, “mqtt”], [“cli”, “telegram”], etc.
TelegramConfig (Optional)
Telegram bot configuration.
Bot token from @BotFather
Allowed chat IDs (empty = allow all)
Long-polling timeout in seconds
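A possible TOML fragment for the optional Telegram section; key names are inferred from the descriptions above and may differ from the actual schema.

```toml
# Hypothetical key names for the optional [telegram] section.
[telegram]
bot_token = "123456:ABC..."       # token from @BotFather (placeholder)
allowed_chat_ids = []             # empty = allow all
poll_timeout_secs = 30            # long-polling timeout in seconds
```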
MqttConfig (Optional)
MQTT broker configuration.
Broker host (e.g., “localhost”, “mqtt.example.com”)
Broker port
Client ID (unique per device)
Topics to subscribe (e.g., [“sensors/#”, “devices/+/data”])
Topic prefix for publishing responses/alerts
Username (optional)
Password (optional, masked in Debug output)
Keep-alive interval in seconds
ProviderConfigToml
v1.5 multi-provider configuration with FallbackChain support.
Primary provider ID: “anthropic”, “openai”, “google”, “deepseek”, “groq”, “ollama”
Model name (e.g., “claude-sonnet-4-20250514”, “gpt-4o”, “gemini-2.0-flash”)
Max tokens for response
Temperature (0.0 - 1.0) for response randomness
API key for primary provider (also checked: ONECLAW_API_KEY env)
Fallback provider chain (tried in order after primary fails)
Max retries per provider before moving to next in chain
Per-provider API keys (overrides primary api_key for specific provider)
Ollama endpoint override (default: http://localhost:11434)
Ollama model override (separate from primary model)
Methods
load
Load config from a TOML file.
Argument: path to the TOML configuration file.
Returns: the loaded configuration, or an error if the file is not found or contains invalid TOML.
default_config
Load with defaults (no file needed).
Returns: a configuration with all default values.
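A hedged sketch of how the two methods might be combined so the agent falls back to defaults when the file is missing. The stand-in struct and error type below exist only to make the snippet self-contained; the real `load` parses and validates TOML.

```rust
use std::path::Path;

// Stand-in: the real OneClawConfig comes from the OneClaw crate
// and carries the fields described in the sections above.
#[derive(Debug, Default)]
struct OneClawConfig;

impl OneClawConfig {
    // Stand-in load(): the real method parses the TOML file.
    fn load(path: &Path) -> Result<Self, String> {
        if path.exists() {
            Ok(Self::default())
        } else {
            Err(format!("config file not found: {}", path.display()))
        }
    }

    fn default_config() -> Self {
        Self::default()
    }
}

fn main() {
    // Prefer the file; fall back to defaults if loading fails.
    let config = OneClawConfig::load(Path::new("oneclaw.toml"))
        .unwrap_or_else(|err| {
            eprintln!("config error: {err}; using defaults");
            OneClawConfig::default_config()
        });
    println!("{config:?}");
}
```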
Example Configuration (TOML)
Minimal Configuration
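A minimal sketch, assuming the v1.5 provider section and key names inferred from the field descriptions above (actual names may differ):

```toml
# Hypothetical minimal configuration.
[provider]
id = "anthropic"
model = "claude-sonnet-4-20250514"
# api_key omitted: read from the ONECLAW_API_KEY environment variable

[channels]
active = ["cli"]
```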
Full Configuration with Fallback Chain
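A fuller sketch with a fallback chain; all key names are assumptions derived from the ProviderConfigToml field descriptions above.

```toml
# Hypothetical key names throughout.
[runtime]
name = "oneclaw-01"
log_level = "info"

[security]
deny_by_default = true

[memory]
backend = "sqlite"
path = "memory.db"

[channels]
active = ["cli"]

[provider]
id = "anthropic"
model = "claude-sonnet-4-20250514"
max_tokens = 4096
temperature = 0.7
api_key = "sk-ant-..."                 # placeholder; ONECLAW_API_KEY also checked
fallback = ["openai", "groq", "ollama"] # tried in order after the primary fails
max_retries = 2                         # per provider before moving to the next
ollama_url = "http://localhost:11434"   # Ollama endpoint override
ollama_model = "llama3"                 # Ollama model override

[provider.api_keys]                     # per-provider overrides
openai = "sk-..."
groq = "gsk_..."
```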
Configuration with MQTT
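A sketch of an MQTT-enabled setup; key names are inferred from the MqttConfig field descriptions above and may differ from the actual schema.

```toml
# Hypothetical key names for an MQTT setup.
[channels]
active = ["cli", "mqtt"]

[mqtt]
host = "mqtt.example.com"
port = 1883
client_id = "oneclaw-device-01"                    # unique per device
subscribe_topics = ["sensors/#", "devices/+/data"]
publish_prefix = "oneclaw/out"                     # prefix for responses/alerts
username = "agent"                                 # optional
password = "secret"                                # optional; masked in Debug output
keep_alive_secs = 60
```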
Environment Variable Overrides
The following environment variables override config file values:
ONECLAW_API_KEY - Primary provider API key
OPENAI_API_KEY - OpenAI-specific key (if using the OpenAI provider)
ANTHROPIC_API_KEY - Anthropic-specific key (if using the Anthropic provider)
Security Notes
Debug output automatically masks API keys and bot tokens, showing only first 4 and last 4 characters.