The Microsoft Agent Framework .NET SDK supports multiple AI providers through the IChatClient interface from Microsoft.Extensions.AI. This unified interface allows you to switch between providers with minimal code changes.
```csharp
using Azure.AI.OpenAI;
using Azure.Identity;
using Microsoft.Agents.AI;
using OpenAI.Chat;

var endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")
    ?? throw new InvalidOperationException("AZURE_OPENAI_ENDPOINT not set");
var deploymentName = Environment.GetEnvironmentVariable("AZURE_OPENAI_DEPLOYMENT_NAME") ?? "gpt-4o-mini";

// Use Azure AD authentication
AIAgent agent = new AzureOpenAIClient(
        new Uri(endpoint),
        new DefaultAzureCredential())
    .GetChatClient(deploymentName)
    .AsAIAgent(
        instructions: "You are a helpful assistant.",
        name: "Assistant");

var response = await agent.RunAsync("Hello!");
Console.WriteLine(response.Text);
```
DefaultAzureCredential is convenient for development but needs care in production: it probes multiple credential sources in sequence, which adds startup latency and widens the attack surface. Prefer ManagedIdentityCredential or another specific credential type in production.
```csharp
using Azure.AI.OpenAI;
using Microsoft.Agents.AI;
using OpenAI.Chat;

var endpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")
    ?? throw new InvalidOperationException("AZURE_OPENAI_ENDPOINT not set");
var apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")
    ?? throw new InvalidOperationException("AZURE_OPENAI_API_KEY not set");
var deploymentName = "gpt-4o-mini";

AIAgent agent = new AzureOpenAIClient(
        new Uri(endpoint),
        new System.ClientModel.ApiKeyCredential(apiKey))
    .GetChatClient(deploymentName)
    .AsAIAgent(
        instructions: "You are a helpful assistant.");
```
```bash
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT_NAME="gpt-4o-mini"

# For Azure AD auth, authenticate with: az login
# Or for API key:
export AZURE_OPENAI_API_KEY="your-api-key"
```
```csharp
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
using OpenAI;
using OpenAI.Chat;

var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")
    ?? throw new InvalidOperationException("OPENAI_API_KEY not set");
var model = Environment.GetEnvironmentVariable("OPENAI_CHAT_MODEL_NAME") ?? "gpt-4o-mini";

AIAgent agent = new OpenAIClient(apiKey)
    .GetChatClient(model)
    .AsAIAgent(
        instructions: "You are a helpful assistant.",
        name: "Assistant");

var response = await agent.RunAsync("Tell me a joke.");
Console.WriteLine(response.Text);
```
```csharp
using Anthropic.Foundry;
using Azure.Identity;
using Microsoft.Agents.AI;

var resource = Environment.GetEnvironmentVariable("ANTHROPIC_RESOURCE")
    ?? throw new InvalidOperationException("ANTHROPIC_RESOURCE not set");
var model = "claude-haiku-4-5";

// With Azure AD
using var client = new AnthropicFoundryClient(
    new AnthropicFoundryIdentityTokenCredentials(
        new DefaultAzureCredential(),
        resource,
        ["https://ai.azure.com/.default"]));

AIAgent agent = client.AsAIAgent(
    model: model,
    instructions: "You are a helpful assistant.");
```
```bash
# Direct Anthropic API
export ANTHROPIC_API_KEY="sk-ant-..."
export ANTHROPIC_CHAT_MODEL_NAME="claude-haiku-4-5"

# Via Azure AI Foundry
export ANTHROPIC_RESOURCE="your-resource-name"
```
```csharp
using Microsoft.Agents.AI;
using Mscc.GenerativeAI.Microsoft;

var apiKey = Environment.GetEnvironmentVariable("GOOGLE_GENAI_API_KEY")
    ?? throw new InvalidOperationException("GOOGLE_GENAI_API_KEY not set");
var model = "gemini-2.5-flash";

var agent = new ChatClientAgent(
    new GeminiChatClient(apiKey: apiKey, model: model),
    name: "Assistant",
    instructions: "You are a helpful assistant.");
```
```csharp
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
using Microsoft.ML.OnnxRuntimeGenAI;

var modelPath = Environment.GetEnvironmentVariable("ONNX_MODEL_PATH")
    ?? throw new InvalidOperationException("ONNX_MODEL_PATH not set");

using var model = new Model(modelPath);
var chatClient = new OnnxRuntimeGenAIChatClient(model);

var agent = new ChatClientAgent(
    chatClient,
    name: "LocalAgent",
    instructions: "You are a helpful assistant.");

var response = await agent.RunAsync("What is AI?");
Console.WriteLine(response.Text);
```
```csharp
using Microsoft.Agents.AI;
using Microsoft.Agents.AI.GitHub.Copilot;

var token = Environment.GetEnvironmentVariable("GITHUB_TOKEN")
    ?? throw new InvalidOperationException("GITHUB_TOKEN not set");
var model = "gpt-4o";

var chatClient = new GitHubCopilotChatClient(
    token: token,
    model: model);

var agent = new ChatClientAgent(
    chatClient,
    name: "CopilotAgent",
    instructions: "You are a coding assistant.");

var response = await agent.RunAsync(
    "Write a function to calculate factorial.");
Console.WriteLine(response.Text);
```
```csharp
using Azure.AI.Inference;
using Azure.Identity;
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI; // for the AsIChatClient extension

var endpoint = Environment.GetEnvironmentVariable("AZURE_FOUNDRY_ENDPOINT")
    ?? throw new InvalidOperationException("AZURE_FOUNDRY_ENDPOINT not set");
var model = "gpt-4o";

var client = new ChatCompletionsClient(
    new Uri(endpoint),
    new DefaultAzureCredential());

var chatClient = client.AsIChatClient(model);

var agent = new ChatClientAgent(
    chatClient,
    name: "FoundryAgent",
    instructions: "You are a helpful assistant.");
```
The unified interface makes switching providers easy:
```csharp
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
using OllamaSharp;

// Development: Use Ollama locally
IChatClient chatClient = new OllamaApiClient(
    new Uri("http://localhost:11434"),
    "llama3.2");

// Production: Use Azure OpenAI
// chatClient = new AzureOpenAIClient(
//         new Uri(azureEndpoint),
//         new DefaultAzureCredential())
//     .GetChatClient(deploymentName)
//     .AsIChatClient();

// Same agent code works with any provider
AIAgent agent = chatClient.AsAIAgent(
    instructions: "You are a helpful assistant.",
    name: "Assistant");
```
Never hardcode credentials. Use environment variables:
```csharp
var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")
    ?? throw new InvalidOperationException("OPENAI_API_KEY not set");
```
Handle Provider-Specific Features Gracefully
Not all providers support all features (e.g., function calling, structured output). Implement fallbacks:
```csharp
try
{
    var response = await agent.RunAsync<CityInfo>(query);
}
catch (NotSupportedException)
{
    // Fallback to text parsing
    var textResponse = await agent.RunAsync(query);
}
```
Monitor Token Usage
Track token consumption across providers:
```csharp
var response = await agent.RunAsync(query);
var usage = response.Usage;
Console.WriteLine($"Tokens: {usage?.TotalTokenCount}");
```
Use Appropriate Models
Choose models based on task requirements:
- Fast/cheap: `gpt-4o-mini`, `claude-haiku`
- Balanced: `gpt-4o`, `claude-sonnet`
- Advanced: `o1`, `claude-opus`
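As a sketch of how the tiers above might drive model selection, here is a small hypothetical helper: the `TaskTier` enum and `ModelSelector` are illustrative names (not part of the SDK), and the returned strings are the model names listed above.

```csharp
using System;

// Hypothetical task tiers; the mapping to model names mirrors the list above.
public enum TaskTier { FastCheap, Balanced, Advanced }

public static class ModelSelector
{
    public static string ForTier(TaskTier tier) => tier switch
    {
        TaskTier.FastCheap => "gpt-4o-mini",
        TaskTier.Balanced  => "gpt-4o",
        TaskTier.Advanced  => "o1",
        _ => throw new ArgumentOutOfRangeException(nameof(tier))
    };
}

public static class Program
{
    public static void Main()
    {
        // The selected name would be passed to GetChatClient(...) as the
        // deployment/model name.
        Console.WriteLine(ModelSelector.ForTier(TaskTier.FastCheap)); // gpt-4o-mini
    }
}
```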
Consider Latency and Availability
- Cloud providers: higher latency, always available
- Local models: lower latency, offline capable, but less capable models
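One way to get the best of both is a cloud-first call with a local fallback. The sketch below is self-contained: the two delegates stand in for `RunAsync` calls on a cloud-backed agent and a local (e.g. ONNX or Ollama) agent, and `RunWithFallbackAsync` is a hypothetical helper, not an SDK API.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class FallbackDemo
{
    // Try the cloud agent first; if the network call fails, use the
    // offline-capable local agent instead.
    public static async Task<string> RunWithFallbackAsync(
        Func<Task<string>> cloudAgent,
        Func<Task<string>> localAgent)
    {
        try
        {
            return await cloudAgent();
        }
        catch (HttpRequestException)
        {
            // Network unavailable: fall back to the local model.
            return await localAgent();
        }
    }

    public static async Task Main()
    {
        // Simulate an offline cloud provider with a throwing delegate.
        var result = await RunWithFallbackAsync(
            cloudAgent: () => throw new HttpRequestException("offline"),
            localAgent: () => Task.FromResult("local answer"));
        Console.WriteLine(result); // prints "local answer"
    }
}
```

In real code both delegates would close over `AIAgent` instances, e.g. `() => cloudAgent.RunAsync(query).ContinueWith(t => t.Result.Text)`.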
Secure Credentials
For Azure providers, prefer managed identities in production:
```csharp
// Production
new AzureOpenAIClient(
    endpoint,
    new ManagedIdentityCredential());

// NOT recommended for production
new AzureOpenAIClient(
    endpoint,
    new DefaultAzureCredential()); // May probe multiple credential sources
```