The OneClawConfig struct represents the top-level configuration for the OneClaw agent, loaded from a TOML file or using defaults.

Struct Definition

pub struct OneClawConfig {
    pub security: SecurityConfig,
    pub runtime: RuntimeConfig,
    pub providers: ProvidersConfig,
    pub memory: MemoryConfig,
    pub channels: ChannelsConfig,
    pub telegram: Option<TelegramConfig>,
    pub mqtt: Option<MqttConfig>,
    pub provider: ProviderConfigToml,
}

Configuration Sections

SecurityConfig

Security layer configuration.
  • deny_by_default (bool, default: true): Whether to deny all unauthorized requests by default
  • pairing_required (bool, default: true): Whether device pairing is required before interaction
  • workspace_only (bool, default: true): Whether to restrict operations to the workspace directory
  • persist_path (String, default: "data/security.db"): Path to the SQLite database for persistent device pairing
  • persist_pairing (bool, default: true): Whether to persist pairing state to SQLite

RuntimeConfig

Runtime configuration (agent name, logging).
  • name (String, default: "oneclaw"): The agent instance name
  • log_level (String, default: ""): The log level filter (e.g., "debug", "info", "warn", "error")

ProvidersConfig

LLM provider configuration (legacy; prefer the [provider] section on v1.5+).
  • default (String, default: "noop"): Default provider: "ollama", "openai", or "noop"
  • llm_timeout_secs (u64, default: 30): LLM call timeout threshold in seconds (used for monitoring, not cancellation)
  • ollama (OllamaConfig): Ollama provider configuration (url, model)
  • openai (OpenAIConfig): OpenAI provider configuration (api_key, model, base_url)

MemoryConfig

Memory backend configuration.
  • backend (String, default: "sqlite"): Backend type: "sqlite" or "noop"
  • db_path (String, default: "data/oneclaw.db"): Path to the SQLite database file for memory storage

ChannelsConfig

Communication channels configuration.
  • active (Vec<String>, default: ["cli"]): Active channels, e.g. ["cli"], ["cli", "mqtt"], or ["cli", "telegram"]

TelegramConfig (Optional)

Telegram bot configuration.
  • bot_token (String, required): Bot token from @BotFather
  • allowed_chat_ids (Vec<i64>, default: []): Allowed chat IDs (empty = allow all)
  • polling_timeout (u64, default: 30): Long-polling timeout in seconds

MqttConfig (Optional)

MQTT broker configuration.
  • host (String, required): Broker host (e.g., "localhost", "mqtt.example.com")
  • port (u16, default: 1883): Broker port
  • client_id (String, default: "oneclaw-{pid}"): Client ID (unique per device)
  • subscribe_topics (Vec<String>, default: []): Topics to subscribe to (e.g., ["sensors/#", "devices/+/data"])
  • publish_prefix (String, default: "oneclaw/alerts"): Topic prefix for publishing responses and alerts
  • username (Option<String>, default: None): Username (optional)
  • password (Option<String>, default: None): Password (optional, masked in Debug output)
  • keep_alive_secs (u64, default: 30): Keep-alive interval in seconds

ProviderConfigToml

v1.5 multi-provider configuration with FallbackChain support.
  • primary (String, default: "anthropic"): Primary provider ID: "anthropic", "openai", "google", "deepseek", "groq", or "ollama"
  • model (String, default: "claude-sonnet-4-20250514"): Model name (e.g., "claude-sonnet-4-20250514", "gpt-4o", "gemini-2.0-flash")
  • max_tokens (u32, default: 1024): Maximum tokens for the response
  • temperature (f32, default: 0.3): Temperature (0.0-1.0) for response randomness
  • api_key (Option<String>, default: None): API key for the primary provider (the ONECLAW_API_KEY environment variable is also checked)
  • fallback (Vec<String>, default: []): Fallback provider chain, tried in order after the primary fails
  • max_retries (u32, default: 1): Maximum retries per provider before moving to the next in the chain
  • keys (HashMap<String, String>, default: {}): Per-provider API keys (overrides api_key for the matching provider)
  • ollama_endpoint (Option<String>, default: None): Ollama endpoint override (default: http://localhost:11434)
  • ollama_model (Option<String>, default: None): Ollama model override (separate from the primary model)
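
The ollama_endpoint and ollama_model overrides do not appear in the later examples. A sketch of a fallback chain that ends at a local Ollama instance might look like this (endpoint and model values are illustrative, not defaults):

```toml
[provider]
primary = "anthropic"
model = "claude-sonnet-4-20250514"
fallback = ["ollama"]   # tried in order after the primary fails
max_retries = 2

# Overrides applied when the Ollama fallback is used; values illustrative.
ollama_endpoint = "http://192.168.1.50:11434"
ollama_model = "llama3.1"
```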

Methods

load

Load config from a TOML file.
pub fn load(path: impl AsRef<Path>) -> Result<Self>
  • path (impl AsRef<Path>, required): Path to the TOML configuration file
  • Returns Result<OneClawConfig>: The loaded configuration, or an error if the file is not found or the TOML is invalid
Example:
use oneclaw_core::config::OneClawConfig;

// `?` propagates the error, so call this from a function that returns
// a compatible Result (e.g. main or a setup function).
let config = OneClawConfig::load("oneclaw.toml")?;
println!("Agent name: {}", config.runtime.name);

default_config

Load with defaults (no file needed).
pub fn default_config() -> Self
  • Returns Self: A configuration with all default values
Example:
let config = OneClawConfig::default_config();
assert_eq!(config.runtime.name, "oneclaw");
assert!(config.security.deny_by_default);

Example Configuration (TOML)

Minimal Configuration

[runtime]
name = "my-agent"
log_level = "info"

[provider]
primary = "anthropic"
model = "claude-sonnet-4-20250514"
max_tokens = 2048
temperature = 0.5

Full Configuration with Fallback Chain

[security]
deny_by_default = true
pairing_required = true
workspace_only = true
persist_path = "data/security.db"
persist_pairing = true

[runtime]
name = "oneclaw-prod"
log_level = "info"

[provider]
primary = "anthropic"
model = "claude-sonnet-4-20250514"
max_tokens = 2048
temperature = 0.5
api_key = "sk-ant-..."
fallback = ["openai", "ollama"]
max_retries = 2

[provider.keys]
openai = "sk-..."
google = "AIza..."

[memory]
backend = "sqlite"
db_path = "data/oneclaw.db"

[channels]
active = ["cli", "telegram"]

[telegram]
bot_token = "123456:ABC-DEF..."
allowed_chat_ids = [12345, 67890]
polling_timeout = 30

Configuration with MQTT

[runtime]
name = "iot-agent"

[channels]
active = ["cli", "mqtt"]

[mqtt]
host = "mqtt.local"
port = 1883
client_id = "oneclaw-01"
subscribe_topics = ["sensors/#", "devices/+/data"]
publish_prefix = "oneclaw/alerts"
username = "device"
password = "secret"
keep_alive_secs = 60

Environment Variable Overrides

The following environment variables override config file values:
  • ONECLAW_API_KEY - Primary provider API key
  • OPENAI_API_KEY - OpenAI-specific key (if using OpenAI provider)
  • ANTHROPIC_API_KEY - Anthropic-specific key (if using Anthropic provider)
Example:
export ONECLAW_API_KEY="sk-ant-api03-..."
./oneclaw
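
The precedence (environment variable first, then config-file value) can be sketched with a small helper; resolve_key is hypothetical and not part of oneclaw_core:

```rust
use std::env;

// Hypothetical helper illustrating env-first precedence; not the
// crate's actual implementation.
fn resolve_key(env_name: &str, file_value: Option<&str>) -> Option<String> {
    // Prefer the environment variable; fall back to the config value.
    env::var(env_name).ok().or_else(|| file_value.map(String::from))
}

fn main() {
    // With the variable unset, the config-file value wins.
    let key = resolve_key("ONECLAW_API_KEY_DEMO", Some("sk-from-file"));
    assert_eq!(key.as_deref(), Some("sk-from-file"));
    println!("resolved: {:?}", key);
}
```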

Security Notes

API keys in config files should be protected with appropriate file permissions (0600). Consider using environment variables for production deployments.
Debug output automatically masks API keys and bot tokens, showing only first 4 and last 4 characters.
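
The described masking rule (first 4 and last 4 characters visible) can be approximated with a small helper; mask_secret is a sketch, not the crate's actual Debug implementation:

```rust
// Sketch of the described masking rule; mask_secret is hypothetical.
fn mask_secret(s: &str) -> String {
    if s.len() <= 8 {
        // Too short to mask meaningfully; hide everything.
        "****".to_string()
    } else {
        // Keep the first 4 and last 4 characters, elide the middle.
        format!("{}...{}", &s[..4], &s[s.len() - 4..])
    }
}

fn main() {
    println!("{}", mask_secret("sk-ant-api03-abcd1234")); // sk-a...1234
    println!("{}", mask_secret("short"));                 // ****
}
```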
