Supported Providers
Genkit includes official support for the following model providers:
Google AI Providers
- Google AI (Gemini Developer API) - Quick access to Gemini models with API key authentication. Ideal for prototyping and smaller projects.
- Vertex AI - Enterprise-grade Google Cloud AI platform with advanced features, IAM integration, and broader model access including Imagen, Lyria, and Model Garden.
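As a concrete starting point, initializing the Google AI plugin in Genkit's JavaScript SDK looks roughly like the sketch below. The model identifier is an assumption; substitute whatever Gemini model is current when you read this.

```typescript
// Minimal Google AI (Gemini Developer API) setup sketch.
// The plugin reads your API key (e.g. GEMINI_API_KEY) from the
// environment by default, so no key appears in source code.
import { genkit } from 'genkit';
import { googleAI } from '@genkit-ai/googleai';

const ai = genkit({
  plugins: [googleAI()],
});

async function main() {
  const { text } = await ai.generate({
    model: googleAI.model('gemini-2.5-flash'), // assumed model name
    prompt: 'Tell me a fun fact about the Roman Empire.',
  });
  console.log(text);
}
```

Set your API key in the environment before running; check the plugin documentation for the exact environment variable and model names your installed version expects.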
Third-Party Providers
- Anthropic - Access to Claude models (Haiku, Sonnet, Opus) with advanced reasoning capabilities.
- OpenAI-Compatible APIs - Support for OpenAI, xAI (Grok), DeepSeek, and any OpenAI-compatible endpoint.
- Ollama - Run open-source models locally for privacy and offline usage.
Custom Providers
- Custom Providers - Build your own model provider plugin to integrate any AI service.
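To make the custom-provider path concrete, here is a hedged sketch of registering a model with Genkit's `defineModel` API. The provider/model name is hypothetical, and the response shape shown follows recent Genkit JS versions (older releases used a different, candidates-based format), so treat this as illustrative rather than definitive.

```typescript
// Illustrative sketch: a trivial custom model provider.
// 'my-provider/echo-model' is a hypothetical name for this example.
import { genkit } from 'genkit';

const ai = genkit({});

const echoModel = ai.defineModel(
  { name: 'my-provider/echo-model' },
  async (request) => {
    // A real plugin would call the remote AI service here;
    // this stub simply echoes the last user message back.
    const lastMessage = request.messages.at(-1);
    return {
      message: {
        role: 'model' as const,
        content: lastMessage?.content ?? [{ text: '' }],
      },
      finishReason: 'stop' as const,
    };
  }
);
```

A production plugin would wrap this in a plugin factory, handle streaming, and map the remote service's request/response formats onto Genkit's.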
Provider Comparison
| Provider | Best For | Authentication | Key Features |
|---|---|---|---|
| Google AI | Rapid prototyping, small projects | API Key | Gemini models, image/video generation, simple setup |
| Vertex AI | Production apps, enterprise | GCP IAM / API Key (Express Mode) | Model Garden, Vector Search, fine-tuning, governance |
| Anthropic | Advanced reasoning tasks | API Key | Claude models, extended thinking, document citations |
| OpenAI | GPT models, wide adoption | API Key | GPT-4o, o1, DALL-E, Whisper, multi-modal |
| Ollama | Local development, privacy | None (local) | Open-source models, offline, no API costs |
How to Choose a Provider
For Prototyping
Use Google AI if you want to get started quickly with powerful multimodal models.
For Production
Use Vertex AI for enterprise applications with advanced features:
- IAM-based access control
- Integration with other Google Cloud services
- Model versioning and governance
- Access to Model Garden (Anthropic, Meta, and more)
- Vector Search for RAG applications
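Configuring the Vertex AI plugin is similar, but authentication goes through Google Cloud credentials rather than an API key. A minimal sketch, assuming Application Default Credentials are set up and with placeholder project/location values:

```typescript
// Vertex AI plugin setup sketch. Authenticates via Application Default
// Credentials (e.g. `gcloud auth application-default login`), not an API key.
import { genkit } from 'genkit';
import { vertexAI } from '@genkit-ai/vertexai';

const ai = genkit({
  plugins: [
    vertexAI({
      projectId: 'my-gcp-project', // placeholder: your GCP project ID
      location: 'us-central1',     // placeholder: your preferred region
    }),
  ],
});
```

Both values can often be picked up from the environment instead of being passed explicitly; see the plugin documentation for the supported environment variables.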
For Advanced Reasoning
Use Anthropic when you need:
- Extended thinking for complex problem-solving
- Document citations for factual accuracy
- Long context windows (200K+ tokens)
- Prompt caching for efficiency
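A sketch of wiring up Claude models, assuming the community `genkitx-anthropic` plugin (the package name and exported model reference here are assumptions; verify them against the plugin's own documentation):

```typescript
// Anthropic setup sketch using the community genkitx-anthropic plugin.
// Package name and the claude35Sonnet export are assumptions.
import { genkit } from 'genkit';
import { anthropic, claude35Sonnet } from 'genkitx-anthropic';

const ai = genkit({
  plugins: [anthropic({ apiKey: process.env.ANTHROPIC_API_KEY })],
});

async function main() {
  const { text } = await ai.generate({
    model: claude35Sonnet,
    prompt: 'Summarize the trade-offs between breadth-first and depth-first search.',
  });
  console.log(text);
}
```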
For Local Development
Use Ollama when you need:
- Privacy (data never leaves your machine)
- Offline capabilities
- No API costs
- Experimentation with open-source models
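For local models, a configuration sketch using the `genkitx-ollama` plugin. It assumes an Ollama server is already running and the named model has been pulled (e.g. `ollama pull gemma`); the model name and server address are assumptions.

```typescript
// Ollama plugin setup sketch: points Genkit at a locally running
// Ollama server. No API key is needed.
import { genkit } from 'genkit';
import { ollama } from 'genkitx-ollama';

const ai = genkit({
  plugins: [
    ollama({
      models: [{ name: 'gemma' }],              // assumed local model
      serverAddress: 'http://127.0.0.1:11434',  // default Ollama address
    }),
  ],
});

async function main() {
  const { text } = await ai.generate({
    model: 'ollama/gemma', // 'ollama/<name>' references a configured model
    prompt: 'Why is the sky blue?',
  });
  console.log(text);
}
```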
For OpenAI Compatibility
Use the OpenAI-Compatible plugin to connect to:
- OpenAI (GPT-4o, o1, etc.)
- xAI (Grok models)
- DeepSeek
- Any service with an OpenAI-compatible API
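A sketch of the OpenAI case, assuming the OpenAI-compatible plugin package and entry point shown below (verify the import path and model id against the current plugin docs, as these have changed between releases):

```typescript
// OpenAI-compatible plugin setup sketch. Package path and model id
// are assumptions; the plugin reads OPENAI_API_KEY from the
// environment by default. Other OpenAI-compatible services are
// typically reached by passing a different baseURL and apiKey.
import { genkit } from 'genkit';
import { openAI } from '@genkit-ai/compat-oai/openai';

const ai = genkit({
  plugins: [openAI()],
});

async function main() {
  const { text } = await ai.generate({
    model: openAI.model('gpt-4o'), // assumed model id
    prompt: 'Write a haiku about type systems.',
  });
  console.log(text);
}
```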