Prerequisites
Before installing AI Providers, make sure you have:
- Obsidian 0.15.0 or later (desktop or mobile)
- An active internet connection (for cloud providers)
- Optional: Local AI runtime like Ollama or LM Studio for offline use
AI Providers works on both desktop and mobile versions of Obsidian.
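If you plan to use a local runtime, you can confirm it is reachable before configuring the plugin. The sketch below assumes Ollama on its default port (11434); LM Studio and other runtimes listen on different ports.

```shell
# Check whether a local Ollama server is reachable.
# Port 11434 is Ollama's default; adjust if you changed it.
# /api/tags lists the models installed locally.
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama is running"
else
  echo "Ollama not reachable (start it with 'ollama serve')"
fi
```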
Installation Methods
To install from the community plugin store, open Settings (Cmd/Ctrl + ,), go to Community plugins, search for "AI Providers", then install and enable the plugin. You can also install AI Providers directly from the web: obsidian.md/plugins?id=ai-providers
If you prefer to install using BRAT (Beta Reviewers Auto-update Tool), add the repository pfrankov/obsidian-ai-providers as a beta plugin in BRAT's settings. Note that BRAT installs the latest development version, which may include experimental features or bugs. Use the community plugin store for stable releases.
Post-Installation
Configure Your First Provider
Once installed, AI Providers won’t do anything until you configure at least one provider. You have several options:
Local (Free)
Ollama or LM Studio for completely offline AI without API keys or costs.
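If you go the local route with Ollama, you first need to download a model. A minimal sketch (the model name llama3.2 is only an example; pick any model from Ollama's library):

```shell
# Pull a model for local, offline use. The model name is an example.
# Prints a hint instead if the Ollama CLI is not installed.
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.2
else
  echo "Install Ollama first: https://ollama.com"
fi
```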
Cloud (Paid)
OpenAI, Anthropic, or Google Gemini for powerful models with API keys.
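Cloud providers require an API key from the vendor's dashboard. As a quick sanity check before pasting a key into the plugin (OpenAI shown here, using its public models-list endpoint):

```shell
# Verify an OpenAI API key by listing available models.
# Expects the key in the OPENAI_API_KEY environment variable.
if [ -z "$OPENAI_API_KEY" ]; then
  echo "Set OPENAI_API_KEY first"
else
  curl -s https://api.openai.com/v1/models \
    -H "Authorization: Bearer $OPENAI_API_KEY"
fi
```

A valid key returns a JSON list of models; an invalid one returns a JSON error object.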
Settings Overview
The AI Providers settings page includes:
- Provider List: View all configured providers
- Add Provider Button: Create a new provider configuration
- Provider Cards: Each provider shows:
- Name and type
- Selected model
- Edit and delete buttons
- Test connection option
Understanding Provider Types
When you add a provider, you’ll choose from these types:
Cloud API Providers
- OpenAI: GPT-4, GPT-3.5, and embedding models
- Anthropic: Claude 3 (Opus, Sonnet, Haiku)
- Google Gemini: Gemini 1.5 Pro/Flash
- OpenRouter: Access to 100+ models through one API
- Groq: Ultra-fast inference for Llama, Mixtral
- Mistral AI: Mistral Large, Medium, Small
- Perplexity AI: Online LLMs with web search
- DeepSeek: DeepSeek Coder and Chat models
- xAI: Grok models from X.AI
- Together AI: Hosted open-source models
- Fireworks AI: Fast inference for open models
- And more…
Local/Self-Hosted Providers
- Ollama: Local inference with Llama, Gemma, Mistral, etc.
- LM Studio: Desktop app with local model hosting
- Open WebUI: Web interface for local LLMs
- OpenAI Compatible API: Any server implementing OpenAI’s API format (llama.cpp, LocalAI, Text generation web UI, etc.)
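An OpenAI-compatible server is configured by pointing AI Providers at its base URL. As a hedged illustration of the request format these servers accept, assuming a llama.cpp llama-server on its default port 8080 (the port and model name are assumptions; match your server's settings):

```shell
# Send an OpenAI-format chat request to a local OpenAI-compatible server.
# Port 8080 and the model name are assumptions for this sketch.
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local-model", "messages": [{"role": "user", "content": "Hello"}]}' \
  || echo "No OpenAI-compatible server at localhost:8080"
```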
Troubleshooting
Plugin Not Showing Up
- Make sure you enabled the plugin after installation
- Try restarting Obsidian
- Check that you’re running Obsidian 0.15.0 or later
Settings Page is Empty
This is normal after first install. Click Add Provider to create your first provider configuration.
Community Plugins Disabled
If you can’t browse community plugins:
- Go to Settings → Community plugins
- Click Turn on community plugins
- Accept the security warning
- Obsidian will enable the community plugin store
Mobile Installation Issues
- Ensure you’re on a stable internet connection
- Try closing and reopening Obsidian
- Mobile plugin installation can be slower—wait 30-60 seconds
Uninstalling
If you need to remove AI Providers:
- Open Settings → Community plugins
- Find AI Providers in the installed plugins list
- Click the toggle to disable the plugin
- Click the trash icon to uninstall
- Confirm the uninstall action
Uninstalling AI Providers will remove all provider configurations and cached embeddings. Plugins that depend on AI Providers will show fallback settings to help you reinstall.
Next Steps
Now that AI Providers is installed, let’s set up your first provider:
Quick Start Guide
Follow our step-by-step guide to configure Ollama or OpenAI and test your first AI request
Cloud Providers
OpenAI, Anthropic, Gemini, and more
Local Providers
Ollama, LM Studio, and self-hosted options