## Syntax
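Configuration is interactive (see the flow below), so the command is typically invoked bare:

```shell
adist llm-config
```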
## Supported Providers

### Anthropic Claude
Requirements: `ANTHROPIC_API_KEY` environment variable
- Claude 3 Opus (most capable, slower)
- Claude 3 Sonnet (recommended)
- Claude 3 Haiku (fastest)
### OpenAI
Requirements: `OPENAI_API_KEY` environment variable
- GPT-4o (recommended)
- GPT-4 Turbo
- GPT-3.5 Turbo
### Ollama (Local)

Requirements:

- Ollama running locally
- Models pulled via `ollama pull`
- API URL (default: `http://localhost:11434`)
- Model selection from available models
## Interactive Flow
1. Provider Selection
2. Model Selection (Anthropic)
3. Model Selection (OpenAI)
4. Ollama Configuration
## Examples
### Configure Anthropic Claude
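A typical session, assuming the key is already set (the key value and prompt choices in the comments are illustrative, not actual tool output):

```shell
export ANTHROPIC_API_KEY="your-anthropic-key"  # placeholder value
adist llm-config
# At the prompts: select "Anthropic Claude", then a model
# (e.g. Claude 3 Sonnet, the recommended default)
```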
### Configure OpenAI
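The same pattern applies for OpenAI (placeholder key value; prompt choices shown as comments):

```shell
export OPENAI_API_KEY="your-openai-key"  # placeholder value
adist llm-config
# At the prompts: select "OpenAI", then a model (e.g. GPT-4o)
```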
### Configure Ollama
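Ollama should be running with at least one model pulled before configuring (the model name below is illustrative):

```shell
ollama pull llama3  # any locally available model works
adist llm-config
# At the prompts: select "Ollama", confirm the API URL
# (default http://localhost:11434), then pick a model from the list
```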
## Environment Variables
- `ANTHROPIC_API_KEY`: API key for Anthropic Claude. Get one at console.anthropic.com
- `OPENAI_API_KEY`: API key for OpenAI. Get one at platform.openai.com
## Setting Environment Variables
### Temporary (Current Session)
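For example (placeholder values):

```shell
# Applies only to the current shell session
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
```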
### Permanent (Bash/Zsh)

Add to `~/.bashrc` or `~/.zshrc`:
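For example (placeholder values; reload with `source ~/.bashrc` or open a new shell afterwards):

```shell
# In ~/.bashrc or ~/.zshrc
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
```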
## Ollama Setup
### Install Ollama

Visit ollama.com/download

### Start Ollama
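On Linux the server is typically started in the foreground with the command below; the macOS and Windows desktop apps start it automatically:

```shell
ollama serve
```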
### Pull Models
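For example (model names are illustrative; pull whichever models you want available to adist):

```shell
ollama pull llama3
ollama pull mistral
```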
### List Available Models
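To see which models are already pulled locally:

```shell
ollama list
```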
## Validation
### Anthropic/OpenAI
Checks for the required environment variable before allowing configuration.

### Ollama
- Tests connection to Ollama service
- Retrieves available models
- Allows configuration even if service is down (for later use)
## Success Messages
## Configuration Storage
Settings are stored in the adist config at:

## Switching Providers
Re-run `adist llm-config` anytime to switch providers or models. The last configured provider becomes the default.
## Related Commands
- `adist query` - Use the configured provider for queries
- `adist chat` - Use the configured provider for chat
- `adist reindex` - Use the configured provider for summaries (with `-s`)