Overview
Nanobot supports multiple LLM providers through a unified configuration system. The provider registry automatically detects the correct provider based on model names, API keys, or explicit configuration.

ProvidersConfig
The root configuration object for all LLM providers, with one field per provider:
- Custom OpenAI-compatible endpoint (direct, bypasses LiteLLM)
- Anthropic (Claude)
- OpenAI (GPT)
- OpenRouter gateway
- DeepSeek
- Groq
- Zhipu AI (GLM)
- DashScope (Alibaba Cloud Tongyi Qianwen)
- vLLM local deployment
- Google Gemini
- Moonshot (Kimi)
- MiniMax
- AiHubMix gateway
- SiliconFlow gateway
- VolcEngine gateway
- OpenAI Codex (OAuth)
- GitHub Copilot (OAuth)
ProviderConfig
Individual provider configuration:
- api_key — API key for the provider; can also be set via environment variables
- api_base — custom API base URL; if not specified, the provider's default endpoint is used
- Custom HTTP headers to include in requests (e.g., APP-Code for AiHubMix)

Provider Registry
The provider registry (PROVIDERS) defines metadata for each supported provider:
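The registry's exact shape isn't shown here; below is a minimal Python sketch of what such a registry might look like. The class fields, defaults, and entries are illustrative assumptions drawn from the ProviderSpec attributes documented in this page, not nanobot's actual definitions.

```python
from dataclasses import dataclass

@dataclass
class ProviderSpec:
    # Field names and defaults are assumptions, not nanobot's real code.
    name: str                  # config field name, e.g. "anthropic"
    keywords: tuple[str, ...]  # lowercase model-name keywords for auto-detection
    env_key: str               # API key environment variable, e.g. "ANTHROPIC_API_KEY"
    display_name: str          # human-readable name shown in `nanobot status`
    litellm_prefix: str = ""   # e.g. "deepseek/" -> "deepseek/deepseek-chat"
    is_gateway: bool = False   # can route any model (OpenRouter, AiHubMix)
    is_local: bool = False     # local deployment (vLLM)
    key_prefix: str = ""       # auto-detect by API key prefix, e.g. "sk-or-"
    uses_oauth: bool = False   # OAuth instead of API keys

# Two illustrative entries; the real registry covers every provider listed above.
PROVIDERS = {
    "anthropic": ProviderSpec("anthropic", ("claude",), "ANTHROPIC_API_KEY", "Anthropic"),
    "openrouter": ProviderSpec("openrouter", (), "OPENROUTER_API_KEY", "OpenRouter",
                               key_prefix="sk-or-", is_gateway=True),
}
```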
Provider Types
Standard Providers: directly supported by LiteLLM
- Anthropic (Claude)
- OpenAI (GPT)
- DeepSeek
- Gemini
- And more

Gateway Providers: route any model through a unified API
- OpenRouter
- AiHubMix
- SiliconFlow
- VolcEngine

Local Providers: self-hosted deployments
- vLLM
- Ollama (via vLLM config)

OAuth Providers: authenticate without API keys
- OpenAI Codex
- GitHub Copilot
ProviderSpec
Each provider is defined by a ProviderSpec with the following attributes:
- Configuration field name (e.g., dashscope, anthropic)
- Model name keywords for auto-detection (lowercase)
- Environment variable name for the API key (e.g., ANTHROPIC_API_KEY)
- Human-readable name shown in nanobot status
- Prefix for LiteLLM routing (e.g., deepseek/ → deepseek/deepseek-chat)
- Don't add the prefix if the model already starts with one of these
- Additional environment variables to set; supports the placeholders {api_key} and {api_base}
- If true, the provider can route any model (OpenRouter, AiHubMix)
- If true, the provider is a local deployment (vLLM)
- Auto-detect by API key prefix (e.g., sk-or- for OpenRouter)
- Auto-detect by substring in the api_base URL
- Default API base URL if none is specified in config
- If true, strip the provider prefix before re-prefixing (AiHubMix behavior)
- Per-model parameter overrides (e.g., Kimi K2.5 requires temperature: 1.0)
- If true, the provider uses OAuth instead of API keys (OpenAI Codex, GitHub Copilot)
- If true, the provider bypasses LiteLLM entirely (custom provider)
- If true, the provider supports prompt caching (Anthropic, OpenRouter)

Provider Matching
Nanobot automatically selects the correct provider using this priority, highest first:
- Explicit provider prefix: github-copilot/claude → GitHub Copilot
- Model name keywords: claude-opus → Anthropic
- API key prefix: sk-or-xxx → OpenRouter
- API base URL: https://openrouter.ai/... → OpenRouter
- Forced provider: provider: anthropic in config
- Fallback: first available gateway with an API key
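The priority chain above can be sketched in Python. The function name, signature, and hard-coded rules here are illustrative assumptions covering one provider per rule, not nanobot's actual matching code.

```python
def match_provider(model: str, api_key: str = "", api_base: str = "",
                   forced: str = "") -> str:
    """Illustrative resolution order; each rule checks one example provider."""
    # 1. Explicit provider prefix, e.g. "github-copilot/claude"
    if "/" in model and model.split("/", 1)[0] in ("github-copilot", "openrouter"):
        return model.split("/", 1)[0]
    # 2. Model name keywords, e.g. "claude-opus" -> Anthropic
    if "claude" in model.lower():
        return "anthropic"
    # 3. API key prefix, e.g. "sk-or-..." -> OpenRouter
    if api_key.startswith("sk-or-"):
        return "openrouter"
    # 4. API base URL substring
    if "openrouter.ai" in api_base:
        return "openrouter"
    # 5. Forced provider from config (provider: anthropic)
    if forced:
        return forced
    # 6. Fallback: first available gateway with an API key (elided here)
    return "openrouter" if api_key else ""
```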
Configuration Examples
Multiple Providers
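A hypothetical sketch of configuring several providers at once, assuming a YAML config where each ProviderConfig sits under a providers key (the top-level key and exact file format are assumptions):

```yaml
providers:
  anthropic:
    api_key: sk-ant-placeholder
  deepseek:
    api_key: sk-deepseek-placeholder
  openrouter:
    api_key: sk-or-placeholder   # the sk-or- prefix also enables auto-detection
```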
Custom API Base
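A hypothetical sketch pointing Moonshot at its China-region endpoint (the YAML shape and providers key are assumptions; the URL comes from the Moonshot notes below):

```yaml
providers:
  moonshot:
    api_key: sk-placeholder
    api_base: https://api.moonshot.cn/v1   # overrides the default international endpoint
```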
Gateway with Custom Headers
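A hypothetical sketch of passing the AiHubMix APP-Code header (the extra_headers field name and YAML shape are assumptions):

```yaml
providers:
  aihubmix:
    api_key: placeholder-key
    extra_headers:        # field name is an assumption
      APP-Code: my-app-code
```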
OAuth Providers
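A hypothetical sketch for OAuth providers, which need no api_key since credentials come from the OAuth flow (the field names and empty-mapping form are assumptions):

```yaml
providers:
  github_copilot: {}   # no api_key: OAuth supplies credentials
  openai_codex: {}
```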
Environment Variables
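A hedged shell sketch: ANTHROPIC_API_KEY appears in the registry description above, while the other variable names follow the same per-provider pattern and are assumptions.

```shell
export ANTHROPIC_API_KEY="sk-ant-placeholder"
export DEEPSEEK_API_KEY="sk-deepseek-placeholder"   # assumed name, per the pattern
export OPENROUTER_API_KEY="sk-or-placeholder"       # assumed name, per the pattern
```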
Providers can be configured via environment variables instead of the config file.

Provider-Specific Notes
Anthropic
- Supports prompt caching via cache_control
- No prefix needed (claude-opus-4-5 works directly)
OpenRouter
- Auto-detected by the sk-or- key prefix
- Routes any model through a unified API
- Supports prompt caching
DeepSeek
- Requires the deepseek/ prefix for LiteLLM
- Models: deepseek-chat, deepseek-reasoner
Moonshot (Kimi)
- Requires the moonshot/ prefix
- Kimi K2.5 enforces temperature >= 1.0
- Default base: https://api.moonshot.ai/v1 (international)
- Use https://api.moonshot.cn/v1 for the China region
Zhipu AI
- Uses the zai/ prefix for LiteLLM
- Also sets the ZHIPUAI_API_KEY env var for compatibility
AiHubMix
- Gateway that strips provider prefixes
- Example: anthropic/claude-3 → claude-3 → openai/claude-3
- Requires an APP-Code header for some configurations
vLLM (Local)
- For self-hosted models
- Uses the hosted_vllm/ prefix
- Requires api_base configuration
GitHub Copilot
- OAuth-based (no API key)
- Uses the github_copilot/ prefix
- Models must be explicitly selected