LLM Providers
Microsoft Agent Framework supports multiple LLM providers, giving you the flexibility to choose the best model for your use case. Whether you’re using Azure OpenAI, OpenAI, Anthropic, AWS Bedrock, or local models with Ollama, the framework provides a consistent API across all providers.

Supported Providers
Azure OpenAI
Enterprise-grade OpenAI models hosted on Azure
OpenAI
Direct integration with OpenAI’s API
Anthropic
Claude models via Anthropic’s API
Other Providers
AWS Bedrock, Ollama, GitHub Copilot, and more
Choosing a Provider
When selecting a provider, consider:

- Enterprise requirements: Azure OpenAI offers enterprise-grade SLAs, compliance, and regional deployment
- Model capabilities: Different providers offer different models with varying capabilities (reasoning, vision, function calling)
- Cost: Pricing varies significantly between providers and models
- Latency: Geographic proximity and infrastructure affect response times
- Privacy: Some providers offer better data privacy guarantees
- Local development: Ollama enables local development without API costs
Common Patterns
All providers follow a consistent pattern in the framework.

Chat Clients
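The actual client classes vary by provider, but the call shape stays the same. A minimal plain-Python sketch of that idea — every class and method name here is invented for illustration, not the framework's verbatim API:

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class ChatResponse:
    text: str


class ChatClient(Protocol):
    """The shared call shape every provider's chat client follows."""

    def get_response(self, prompt: str) -> ChatResponse: ...


class FakeAzureOpenAIChatClient:
    def get_response(self, prompt: str) -> ChatResponse:
        return ChatResponse(text=f"[azure-openai] {prompt}")


class FakeOllamaChatClient:
    def get_response(self, prompt: str) -> ChatResponse:
        return ChatResponse(text=f"[ollama] {prompt}")


def ask(client: ChatClient, prompt: str) -> str:
    # Caller code is identical regardless of which provider backs the client.
    return client.get_response(prompt).text


print(ask(FakeAzureOpenAIChatClient(), "hello"))  # [azure-openai] hello
print(ask(FakeOllamaChatClient(), "hello"))       # [ollama] hello
```

Swapping providers means constructing a different client; the calling code does not change.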
Most providers offer a chat client for direct chat completions.

Assistant/Agent Providers
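The persistent-agent idea can be sketched the same way — unlike a stateless chat completion, the agent keeps conversation state between calls (again, the names below are invented for illustration):

```python
class FakeAgent:
    """Toy stand-in for a provider-managed agent: it retains conversation
    state across calls instead of treating each request as independent."""

    def __init__(self, instructions: str):
        self.instructions = instructions
        self.history: list = []  # managed state survives between run() calls

    def run(self, message: str) -> str:
        self.history.append(message)
        return f"reply #{len(self.history)} to: {message}"


agent = FakeAgent("You are a helpful assistant.")
agent.run("first message")
print(agent.run("second message"))  # reply #2 to: second message
```

With a real provider-managed agent, that state typically lives server-side (a thread or session), not in local memory as in this toy.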
Some providers offer persistent agents with managed state.

Authentication
Authentication varies by provider:

| Provider | Python | .NET |
|---|---|---|
| Azure OpenAI | AzureCliCredential(), API keys | DefaultAzureCredential, API keys |
| OpenAI | OPENAI_API_KEY env var | API key in constructor |
| Anthropic | ANTHROPIC_API_KEY env var | API key in constructor |
| AWS Bedrock | AWS credentials (access key/secret) | AWS credentials |
| Ollama | No authentication (local) | No authentication (local) |
For production deployments, always use managed identities or secure credential management rather than API keys in code.
Configuration
Providers can be configured via:

- Environment variables - Most providers read from environment variables by default
- Constructor parameters - Explicit configuration overrides environment variables
- Settings classes - Pydantic settings for structured configuration (Python)
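The precedence between the first two options can be sketched as follows — the function name is invented for illustration, and OPENAI_API_KEY is the variable from the authentication table above:

```python
import os
from typing import Optional


def resolve_api_key(explicit: Optional[str] = None,
                    env_var: str = "OPENAI_API_KEY") -> Optional[str]:
    # An explicit constructor argument wins over the environment variable.
    if explicit is not None:
        return explicit
    return os.environ.get(env_var)


os.environ["OPENAI_API_KEY"] = "sk-from-env"   # placeholder value
print(resolve_api_key())               # sk-from-env (read from the environment)
print(resolve_api_key("sk-explicit"))  # sk-explicit (explicit value overrides)
```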
Environment Variables
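A typical shell setup might look like this. The API-key variables come from the authentication table above; AZURE_OPENAI_ENDPOINT is an assumption based on common Azure SDK conventions — check each provider's page for the exact names:

```shell
# Azure OpenAI (endpoint variable name is an assumption; verify per provider docs)
export AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/"

# API keys, as listed in the authentication table (values are placeholders)
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```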
The most common variables are the per-provider API keys listed in the authentication table above.

Streaming Support
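The streaming shape can be modeled as an iterator of chunks that the caller consumes incrementally — a plain-Python illustration of the pattern, not the framework's API:

```python
from typing import Iterator


def fake_stream(reply: str) -> Iterator[str]:
    """Stand-in for a streaming response: yields the reply chunk by chunk
    instead of returning it all at once."""
    for token in reply.split():
        yield token + " "


chunks = []
for chunk in fake_stream("streamed tokens arrive incrementally"):
    chunks.append(chunk)  # a real app would render each chunk as it arrives
print("".join(chunks).strip())  # streamed tokens arrive incrementally
```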
All providers support streaming responses.

Function Calling
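At its core, function calling is a dispatch loop: the model emits a structured call, the application executes the matching function, and the result is fed back to the model. A toy illustration of that loop (all names invented for the sketch):

```python
import json


def get_weather(city: str) -> str:
    # Stand-in for a real lookup the model cannot do itself.
    return f"sunny in {city}"


TOOLS = {"get_weather": get_weather}

# Pretend the model returned this structured tool call.
model_output = json.dumps({"tool": "get_weather", "arguments": {"city": "Paris"}})

call = json.loads(model_output)
result = TOOLS[call["tool"]](**call["arguments"])
print(result)  # sunny in Paris — this result would be sent back to the model
```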
Most providers support function calling (tool use).

Model Capabilities
| Feature | Azure OpenAI | OpenAI | Anthropic | Bedrock | Ollama |
|---|---|---|---|---|---|
| Function Calling | ✅ | ✅ | ✅ | ✅ | ⚠️ Limited |
| Vision | ✅ | ✅ | ✅ | ✅ | ⚠️ Some models |
| Streaming | ✅ | ✅ | ✅ | ✅ | ✅ |
| Reasoning | ✅ o1/o3 | ✅ o1/o3 | ✅ Extended thinking | ⚠️ Model dependent | ❌ |
| Code Interpreter | ✅ Assistants | ✅ Assistants | ❌ | ❌ | ❌ |
| File Search | ✅ Assistants | ✅ Assistants | ❌ | ❌ | ❌ |
Next Steps
Explore the provider-specific documentation for detailed setup instructions, authentication patterns, and advanced features:

Azure OpenAI
Enterprise-grade models with Azure integration
OpenAI
Latest GPT models directly from OpenAI
Anthropic
Claude models with advanced capabilities
Other Providers
Bedrock, Ollama, GitHub Copilot, and more