Rowboat supports multiple LLM providers and allows you to bring your own API keys. Configure your preferred model in ~/.rowboat/config/models.json.

Configuration File

The models configuration is stored in:
~/.rowboat/config/models.json

Configuration Schema

{
  "provider": {
    "flavor": "openai",
    "apiKey": "sk-...",
    "baseURL": "https://api.openai.com/v1",
    "headers": {}
  },
  "model": "gpt-4o",
  "knowledgeGraphModel": "gpt-4o-mini"
}

Schema Fields

Field               | Type   | Required    | Description
provider.flavor     | string | Yes         | Provider type (see supported providers below)
provider.apiKey     | string | Conditional | API key (not required for Ollama)
provider.baseURL    | string | No          | Custom API endpoint
provider.headers    | object | No          | Additional HTTP headers
model               | string | Yes         | Model identifier
knowledgeGraphModel | string | No          | Model for knowledge graph processing (defaults to main model)
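The required/conditional rules in the table above can be sketched as a small validation helper. This is an illustrative sketch, not Rowboat's actual config loader:

```python
import json
from pathlib import Path

# Flavors that do not need an API key (per the schema table above).
KEYLESS_FLAVORS = {"ollama"}

def validate_models_config(path: str) -> dict:
    """Load models.json and check the required schema fields.

    Illustrative only: Rowboat's real loader may apply additional checks.
    """
    cfg = json.loads(Path(path).read_text())
    provider = cfg.get("provider", {})
    if not provider.get("flavor"):
        raise ValueError("provider.flavor is required")
    if not cfg.get("model"):
        raise ValueError("model is required")
    # apiKey is conditional: only keyless flavors may omit it.
    if provider["flavor"] not in KEYLESS_FLAVORS and not provider.get("apiKey"):
        raise ValueError(f"apiKey is required for flavor {provider['flavor']!r}")
    return cfg
```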

Supported Providers

1. OpenAI

Use OpenAI’s GPT models.
{
  "provider": {
    "flavor": "openai",
    "apiKey": "sk-..."
  },
  "model": "gpt-4o"
}
Default baseURL: https://api.openai.com/v1
Popular models:
  • gpt-4o - Latest GPT-4 optimized
  • gpt-4o-mini - Faster, more affordable
  • gpt-4-turbo - Previous generation
Get your API key: OpenAI Platform
2. Anthropic

Use Claude models from Anthropic.
{
  "provider": {
    "flavor": "anthropic",
    "apiKey": "sk-ant-..."
  },
  "model": "claude-sonnet-4-5"
}
Default baseURL: https://api.anthropic.com/v1
Popular models:
  • claude-sonnet-4-5 - Latest Sonnet
  • claude-opus-4 - Most capable
  • claude-haiku-4 - Fast and efficient
Get your API key: Anthropic Console
3. Google

Use Google’s Gemini models.
{
  "provider": {
    "flavor": "google",
    "apiKey": "AIza..."
  },
  "model": "gemini-2.5-pro"
}
Default baseURL: https://generativelanguage.googleapis.com/v1beta
Popular models:
  • gemini-2.5-pro - Latest Pro model
  • gemini-2.5-flash - Fast responses
  • gemini-pro-1.5 - Previous generation
Get your API key: Google AI Studio
4. Ollama (Local)

Run models locally with Ollama.
{
  "provider": {
    "flavor": "ollama",
    "baseURL": "http://localhost:11434"
  },
  "model": "llama3.1"
}
No API key required for Ollama. Make sure Ollama is running locally.
Default baseURL: http://localhost:11434
Popular models:
  • llama3.1 - Meta’s Llama 3.1
  • mistral - Mistral 7B
  • mixtral - Mixtral 8x7B
Install Ollama: ollama.com
5. OpenRouter

Access multiple models through OpenRouter.
{
  "provider": {
    "flavor": "openrouter",
    "apiKey": "sk-or-..."
  },
  "model": "openrouter/auto"
}
Default baseURL: https://openrouter.ai/api/v1
Special models:
  • openrouter/auto - Automatically selects best model
  • Or use any model from their catalog
Get your API key: OpenRouter
6. OpenAI-Compatible

Use any OpenAI-compatible API (LM Studio, vLLM, etc.).
{
  "provider": {
    "flavor": "openai-compatible",
    "apiKey": "optional",
    "baseURL": "http://localhost:1234/v1"
  },
  "model": "local-model"
}
Well suited to self-hosted solutions such as LM Studio, vLLM, or text-generation-webui.
7. AI Gateway (Vercel)

Use Vercel AI Gateway for unified access.
{
  "provider": {
    "flavor": "aigateway",
    "apiKey": "your-gateway-key"
  },
  "model": "gpt-4o"
}
Default baseURL: https://ai-gateway.vercel.sh/v1/ai

Environment Variables

You can use environment variables instead of hardcoding API keys:
export OPENAI_API_KEY="sk-..."
Then omit the apiKey field in your config:
{
  "provider": {
    "flavor": "openai"
  },
  "model": "gpt-4o"
}

Knowledge Graph Model

Optionally specify a separate (typically faster/cheaper) model for knowledge graph processing:
{
  "provider": {
    "flavor": "openai",
    "apiKey": "sk-..."
  },
  "model": "gpt-4o",
  "knowledgeGraphModel": "gpt-4o-mini"
}
The knowledge graph model is used for extracting entities and relationships from emails and meetings. Using a faster model can significantly reduce processing time.
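The fallback rule, expressed as a one-line sketch (hypothetical helper, not Rowboat's internal code):

```python
def pick_kg_model(cfg: dict) -> str:
    """knowledgeGraphModel falls back to the main model when unset."""
    return cfg.get("knowledgeGraphModel") or cfg["model"]
```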

Custom Headers

Add custom HTTP headers for advanced use cases:
{
  "provider": {
    "flavor": "openai",
    "apiKey": "sk-...",
    "headers": {
      "X-Custom-Header": "value",
      "Organization": "org-id"
    }
  },
  "model": "gpt-4o"
}
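One way a client might merge configured headers with its defaults is sketched below. This is illustrative: the exact auth header per flavor may differ (Anthropic, for example, uses x-api-key rather than Authorization):

```python
def build_request_headers(provider: dict) -> dict:
    """Merge an assumed Bearer auth header with custom headers.

    Custom headers from models.json are applied last, so they
    override the defaults on conflict.
    """
    headers = {"Content-Type": "application/json"}
    if provider.get("apiKey"):
        headers["Authorization"] = f"Bearer {provider['apiKey']}"
    headers.update(provider.get("headers", {}))
    return headers
```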

Testing Configuration

After creating your config file, Rowboat will automatically test the connection on startup. Check the logs for any connection errors.
Connection Issues?
  • Verify your API key is correct
  • Check that the baseURL is accessible
  • For local models (Ollama), ensure the server is running
  • Check firewall settings for network requests

Switching Models

You can switch models at any time by editing models.json. Changes take effect after restarting Rowboat.

Cost Optimization Tips

  1. Use different models for different tasks: Set a cheaper knowledgeGraphModel for background processing
  2. Try local models: Use Ollama for privacy and zero API costs
  3. OpenRouter auto: Let OpenRouter pick the best price/performance model
  4. Monitor usage: Keep track of API costs in your provider dashboard
