The AutoGen.Anthropic package provides integration with Anthropic’s Claude models, including support for function calling, streaming, and prompt caching.
```csharp
using AutoGen.Anthropic.Utils;

// Claude 3.5 Sonnet (latest and most capable)
var agent = new AnthropicClientAgent(
    anthropicClient,
    "assistant",
    AnthropicConstants.Claude35Sonnet)
    .RegisterMessageConnector();
```
```csharp
using AutoGen.Anthropic.Utils;

// Claude 3 Opus (most capable of the Claude 3 family)
var opus = new AnthropicClientAgent(
    anthropicClient,
    "opus_assistant",
    AnthropicConstants.Claude3Opus)
    .RegisterMessageConnector();

// Claude 3 Sonnet (balanced)
var sonnet = new AnthropicClientAgent(
    anthropicClient,
    "sonnet_assistant",
    AnthropicConstants.Claude3Sonnet)
    .RegisterMessageConnector();

// Claude 3 Haiku (fast and affordable)
var haiku = new AnthropicClientAgent(
    anthropicClient,
    "haiku_assistant",
    AnthropicConstants.Claude3Haiku)
    .RegisterMessageConnector();
```
```csharp
// Use model strings directly
var agent = new AnthropicClientAgent(
    anthropicClient,
    "assistant",
    "claude-3-5-sonnet-20241022") // latest Sonnet
    .RegisterMessageConnector();

// Or use the provided constants:
// AnthropicConstants.Claude35Sonnet
// AnthropicConstants.Claude3Opus
// AnthropicConstants.Claude3Sonnet
// AnthropicConstants.Claude3Haiku
```
```csharp
using AutoGen.Core;

var agent = new AnthropicClientAgent(
    anthropicClient,
    "assistant",
    AnthropicConstants.Claude3Sonnet)
    .RegisterMessageConnector();

var messages = new[]
{
    new TextMessage(Role.User, "Write a story about a robot", from: "user")
};

await foreach (var message in agent.GenerateStreamingReplyAsync(messages))
{
    if (message.GetContent() is string content)
    {
        Console.Write(content);
    }
}
```
Claude supports function calling through its tool use API:
1. Define functions
```csharp
using AutoGen.Core;

public partial class WeatherTools
{
    /// <summary>
    /// Get current weather
    /// </summary>
    /// <param name="location">city name</param>
    [Function]
    public async Task<string> GetWeather(string location)
    {
        return $"Weather in {location}: Sunny, 72°F";
    }

    /// <summary>
    /// Get forecast
    /// </summary>
    /// <param name="location">city name</param>
    /// <param name="days">number of days</param>
    [Function]
    public async Task<string> GetForecast(string location, int days)
    {
        return $"{days}-day forecast for {location}: Mostly sunny";
    }
}
```
2. Register tools with the agent
```csharp
using AutoGen.Anthropic.Extensions;
using Microsoft.Extensions.AI;

var tools = new WeatherTools();

// Create tool definitions
AIFunction[] aiFunctions =
[
    AIFunctionFactory.Create(tools.GetWeather),
    AIFunctionFactory.Create(tools.GetForecast),
];

var functionCallMiddleware = new FunctionCallMiddleware(aiFunctions);

var agent = new AnthropicClientAgent(
    anthropicClient,
    "assistant",
    AnthropicConstants.Claude3Sonnet)
    .RegisterMessageConnector()
    .RegisterStreamingMiddleware(functionCallMiddleware)
    .RegisterPrintMessage();
```
3. Use the agent
```csharp
var response = await agent.SendAsync(
    new TextMessage(
        Role.User,
        "What's the weather in Seattle?",
        from: "user"));

Console.WriteLine(response.GetContent());

// Claude calls the GetWeather function and returns something like:
// "The current weather in Seattle is sunny with a temperature of 72°F."
```
Claude supports prompt caching to reduce costs for repeated context:
```csharp
using AutoGen.Anthropic;
using AutoGen.Anthropic.DTOs;
using AutoGen.Core;

var agent = new AnthropicClientAgent(
    anthropicClient,
    "assistant",
    AnthropicConstants.Claude35Sonnet);

// Create system messages with cache control
var systemMessage = new SystemMessage
{
    Text = @"
        You are an expert C# developer.
        You know all about .NET, ASP.NET Core, Entity Framework, and modern C# patterns.
        Always write clean, maintainable, well-documented code.
    ",
    CacheControl = new CacheControl { Type = "ephemeral" }
};

var contextMessage = new SystemMessage
{
    Text = File.ReadAllText("large_codebase_context.txt"), // large context
    CacheControl = new CacheControl { Type = "ephemeral" }
};

// These cached system messages will be reused across requests
var request = new ChatCompletionRequest
{
    Model = AnthropicConstants.Claude35Sonnet,
    MaxTokens = 1024,
    SystemMessage = new[] { systemMessage, contextMessage },
    Messages = new[]
    {
        new RequestMessage
        {
            Role = "user",
            Content = new ContentBase[] { new TextContent { Text = "Explain this code" } }
        }
    }
};

var response = await anthropicClient.CreateChatCompletionsAsync(
    request, CancellationToken.None);
```
Prompt caching is opt-in: you mark the content to cache by setting `CacheControl` on it, as shown above. On cache hits, cached input tokens are billed at a fraction of the normal rate, reducing costs by up to 90% for repeated context.
```csharp
using AutoGen.Core;

var agent = new AnthropicClientAgent(
    anthropicClient,
    "vision_assistant",
    AnthropicConstants.Claude3Opus)
    .RegisterMessageConnector();

// Send an image along with text
var messages = new IMessage[]
{
    new ImageMessage(
        Role.User,
        "https://example.com/diagram.png",
        from: "user"),
    new TextMessage(
        Role.User,
        "What's in this diagram? Explain the architecture.",
        from: "user")
};

var response = await agent.GenerateReplyAsync(messages);
Console.WriteLine(response.GetContent());
```
- Claude 3.5 Sonnet: Best overall performance, coding, analysis
- Claude 3 Opus: Most capable for complex reasoning
- Claude 3 Sonnet: Balanced performance and cost
- Claude 3 Haiku: Fast and affordable for simple tasks
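The tiers above can be encoded in a small routing helper so callers never hard-code model IDs. This is a sketch, not part of the package: the `TaskComplexity` enum and `ModelSelector` class are illustrative names; only the `AnthropicConstants` values come from AutoGen.Anthropic.

```csharp
using AutoGen.Anthropic.Utils;

// Illustrative helper: route tasks to a model tier.
// TaskComplexity and ModelSelector are hypothetical names, not package APIs.
public enum TaskComplexity { Simple, Balanced, Complex }

public static class ModelSelector
{
    public static string ForTask(TaskComplexity complexity) => complexity switch
    {
        TaskComplexity.Simple => AnthropicConstants.Claude3Haiku,    // cheap, fast
        TaskComplexity.Balanced => AnthropicConstants.Claude35Sonnet, // default choice
        TaskComplexity.Complex => AnthropicConstants.Claude3Opus,     // hardest reasoning
        _ => AnthropicConstants.Claude35Sonnet,
    };
}
```

Centralizing the choice this way also makes it easy to swap in newer model constants in one place.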
Cost Optimization
```csharp
// Use Haiku for simple tasks
var simpleAgent = new AnthropicClientAgent(
    anthropicClient,
    "simple_assistant",
    AnthropicConstants.Claude3Haiku,
    maxTokens: 500)
    .RegisterMessageConnector();

// Use prompt caching for large contexts:
// cache system messages and large documents
// to reduce cost by up to 90% on cache hits.

// Use Opus only when necessary
var expertAgent = new AnthropicClientAgent(
    anthropicClient,
    "expert",
    AnthropicConstants.Claude3Opus)
    .RegisterMessageConnector();
```
Error Handling
```csharp
using AutoGen.Anthropic.DTOs;

try
{
    var response = await agent.SendAsync(message);
}
catch (AnthropicException ex) when (ex.StatusCode == 429)
{
    // Rate limit exceeded: wait, then retry
    Console.WriteLine("Rate limited. Waiting...");
    await Task.Delay(TimeSpan.FromSeconds(60));
}
catch (AnthropicException ex)
{
    // Handle other API errors
    Console.WriteLine($"Anthropic API error: {ex.Message}");
}
```
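For production use, the fixed 60-second wait above can be generalized into exponential backoff. This is a sketch under the same assumptions as the example, namely that `AnthropicException` exposes the HTTP status code via `StatusCode`; adjust to the exception type your package version actually throws.

```csharp
using AutoGen.Core;

// Sketch: retry an agent call with exponential backoff on rate limits.
// Assumes AnthropicException.StatusCode as used in the example above.
public static async Task<IMessage> SendWithRetryAsync(
    IAgent agent, IMessage message, int maxAttempts = 3)
{
    for (var attempt = 1; ; attempt++)
    {
        try
        {
            return await agent.SendAsync(message);
        }
        catch (AnthropicException ex) when (ex.StatusCode == 429 && attempt < maxAttempts)
        {
            // Back off: 2s, 4s, 8s, ...
            var delay = TimeSpan.FromSeconds(Math.Pow(2, attempt));
            Console.WriteLine($"Rate limited; retrying in {delay.TotalSeconds}s...");
            await Task.Delay(delay);
        }
    }
}
```

Honoring a `Retry-After` header, when the exception surfaces one, is generally preferable to a fixed backoff schedule.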