The [llm] section configures LLM provider credentials and custom provider endpoints. You can use built-in providers or define custom ones.
Built-in Provider Keys
Spacebot supports multiple LLM providers. Configure credentials for the providers you want to use. Every credential field accepts either a literal value or an env:VAR_NAME reference.

- Anthropic API key
- OpenAI API key
- OpenRouter API key
- Kilo Gateway API key
- Google Gemini API key
- Groq API key
- DeepSeek API key
- xAI (Grok) API key
- Mistral AI API key
- Together AI API key
- Fireworks AI API key
- NVIDIA NIM API key
- Z.AI (GLM) API key
- MiniMax API key (international endpoint)
- MiniMax API key (China endpoint)
- Moonshot AI API key
- Z.AI Coding Plan API key
- OpenCode Zen API key
- OpenCode Go API key
- Ollama server base URL (defaults to http://localhost:11434)
- Optional Ollama API key (rarely needed for local deployments)
Custom Providers
Define custom provider endpoints to use models from any OpenAI-compatible or Anthropic-compatible API. In the custom provider configuration, replace <name> with your provider identifier.

Examples
Using Built-in Providers
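A minimal sketch of configuring built-in providers. The exact field names below (`anthropic_api_key`, `openai_api_key`, and so on) are assumptions, not confirmed by this page; check the Spacebot configuration reference for the canonical keys.

```toml
# Hypothetical field names -- consult the Spacebot reference for exact keys.
[llm]
anthropic_api_key = "env:ANTHROPIC_API_KEY"  # read from environment at startup
openai_api_key = "env:OPENAI_API_KEY"
groq_api_key = "your-groq-key-here"          # literal values also work
```

Using env: references keeps secrets out of the config file, which is preferable when the file is committed to version control.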
Custom Provider with OpenAI API
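A sketch of a custom provider speaking the OpenAI-compatible wire format. The table name `[llm.custom.<name>]` and the field names are assumptions about Spacebot's schema; substitute the names your version documents.

```toml
# Hypothetical schema for an OpenAI-compatible custom provider.
[llm.custom.my-provider]          # "my-provider" is the <name> identifier
api_type = "openai"               # OpenAI-compatible request/response format
base_url = "https://api.example.com/v1"
api_key = "env:MY_PROVIDER_API_KEY"
```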
Custom Provider with Anthropic API
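The same shape for an Anthropic-compatible endpoint; only the API type and endpoint change. Field names remain assumptions.

```toml
# Hypothetical schema for an Anthropic-compatible custom provider.
[llm.custom.my-anthropic-proxy]
api_type = "anthropic"            # Anthropic-compatible request/response format
base_url = "https://claude-proxy.internal.example.com"
api_key = "env:PROXY_API_KEY"
```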
Local Ollama Instance
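A sketch for a local Ollama server. The default base URL (`http://localhost:11434`) is stated on this page; the field names are assumptions.

```toml
# Hypothetical field names for the Ollama settings.
[llm]
ollama_base_url = "http://localhost:11434"  # the documented default
# ollama_api_key = "env:OLLAMA_API_KEY"     # optional; rarely needed locally
```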
Multiple Providers
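Built-in and custom providers can be combined in one config. A sketch under the same assumed field names as the examples above:

```toml
# Hypothetical combined configuration.
[llm]
anthropic_api_key = "env:ANTHROPIC_API_KEY"
openrouter_api_key = "env:OPENROUTER_API_KEY"
ollama_base_url = "http://localhost:11434"

[llm.custom.my-provider]
api_type = "openai"
base_url = "https://api.example.com/v1"
api_key = "env:MY_PROVIDER_API_KEY"
```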
Environment Variable References
All credential fields support env:VAR_NAME references to read values from environment variables:
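For instance, assuming a hypothetical `anthropic_api_key` field, the value after the `env:` prefix names the environment variable to read:

```toml
[llm]
# Resolved from the ANTHROPIC_API_KEY environment variable at startup;
# the field name here is illustrative.
anthropic_api_key = "env:ANTHROPIC_API_KEY"
```

Export the variable (e.g. in your shell profile or service unit) before starting Spacebot, so the value never appears in the config file itself.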