AutoGen extensions (`autogen-ext`) provide modular components that extend the core framework with support for different LLM providers, code execution environments, and tool integrations.
## Architecture
The extensions package follows a plugin architecture where each component type implements a specific interface:
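The idea can be sketched in plain Python. The classes below are illustrative, not actual `autogen-ext` types: framework code programs against an interface, and each extension supplies a concrete implementation.

```python
from typing import Protocol


class ChatClient(Protocol):
    """Illustrative interface that every provider-specific client implements."""

    async def create(self, prompt: str) -> str: ...


class EchoChatClient:
    """A toy extension: one concrete implementation of the interface."""

    async def create(self, prompt: str) -> str:
        return f"echo: {prompt}"


async def run(client: ChatClient, prompt: str) -> str:
    # Framework code depends only on the interface, so any
    # conforming client can be swapped in.
    return await client.create(prompt)
```

Because callers depend only on the interface, swapping OpenAI for Ollama (or a test stub, as here) requires no changes to framework code.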
## Extension Categories

AutoGen extensions are organized into four main categories:

- **Model Clients**: connect to OpenAI, Anthropic, Azure, Ollama, Gemini, Mistral, and more
- **Code Executors**: execute code safely in Docker containers, Jupyter notebooks, or locally
- **Tools**: integrate MCP servers, HTTP APIs, LangChain tools, and GraphRAG
- **Custom Extensions**: build your own extensions following AutoGen’s component model
## Installation
Extensions are installed with optional dependencies based on what you need:
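For example (extras names may vary by release; check the package metadata for the full list):

```shell
# Core extensions package plus the OpenAI model client
pip install "autogen-ext[openai]"

# Multiple extras can be combined
pip install "autogen-ext[openai,docker,mcp]"

# Other commonly used extras
pip install "autogen-ext[anthropic]"   # Anthropic model client
pip install "autogen-ext[docker]"      # Docker code executors
pip install "autogen-ext[langchain]"   # LangChain tool adapter
pip install "autogen-ext[graphrag]"    # GraphRAG search tools
```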
## Component Model

All extensions implement the AutoGen component model, which provides:
### Configuration Serialization

Components can be serialized to/from configuration:
### Lifecycle Management

Components support async lifecycle methods:
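The pattern looks like this generic sketch (an illustrative class, not a real `autogen-ext` component; real executors acquire containers or kernels in `start()`):

```python
import asyncio


class ManagedComponent:
    """Illustrative component with explicit async lifecycle methods."""

    def __init__(self) -> None:
        self.running = False

    async def start(self) -> None:
        # Acquire resources here, e.g. start a container or kernel.
        self.running = True

    async def stop(self) -> None:
        # Release resources here.
        self.running = False


async def main() -> None:
    component = ManagedComponent()
    await component.start()
    try:
        print("running:", component.running)
    finally:
        # Always stop, even if the work in between raised.
        await component.stop()


asyncio.run(main())
```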
### Context Manager Support

Many components can be used as async context managers:
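Continuing the illustrative sketch above, `async with` ties start and stop to block entry and exit, so cleanup runs even on error:

```python
import asyncio


class ManagedComponent:
    """Illustrative component usable as an async context manager."""

    def __init__(self) -> None:
        self.running = False

    async def start(self) -> None:
        self.running = True

    async def stop(self) -> None:
        self.running = False

    async def __aenter__(self) -> "ManagedComponent":
        await self.start()
        return self

    async def __aexit__(self, *exc_info) -> None:
        # Runs even if the body raised.
        await self.stop()


async def main() -> bool:
    async with ManagedComponent() as component:
        in_context = component.running
    # After the block, the component has been stopped.
    return in_context and not component.running


print(asyncio.run(main()))
```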
## Available Extensions

### Model Clients
| Provider | Client Class | Models |
|---|---|---|
| OpenAI | `OpenAIChatCompletionClient` | GPT-4, GPT-3.5, o1, o3 |
| Azure OpenAI | `AzureOpenAIChatCompletionClient` | GPT-4, GPT-3.5 |
| Anthropic | `AnthropicChatCompletionClient` | Claude 3.5 Sonnet, Claude 3 Opus |
| AWS Bedrock | `AnthropicBedrockChatCompletionClient` | Claude via Bedrock |
| Azure AI | `AzureAIChatCompletionClient` | Azure AI models |
| Ollama | `OllamaChatCompletionClient` | Llama, Mistral, Qwen |
| Llama.cpp | `LlamaCppChatCompletionClient` | Local GGUF models |
### Code Executors
| Executor | Description | Safety |
|---|---|---|
| `DockerCommandLineCodeExecutor` | Execute code in Docker containers | High |
| `DockerJupyterCodeExecutor` | Jupyter notebook in Docker | High |
| `LocalCommandLineCodeExecutor` | Execute code locally | Low |
| `JupyterCodeExecutor` | Local Jupyter execution | Medium |
### Tool Integrations
| Integration | Description |
|---|---|
| `McpWorkbench` | Model Context Protocol servers |
| `HttpTool` | HTTP/REST API endpoints |
| `LangChainToolAdapter` | Wrap LangChain tools |
| `GlobalSearchTool` | GraphRAG global search |
| `LocalSearchTool` | GraphRAG local search |
## Best Practices
### Use Type Hints
Leverage Python’s type system for better IDE support:
### Handle Cancellation

Always pass and respect `CancellationToken`:
### Dispose Resources Properly
Use context managers or explicit cleanup:
### Configure Logging

Extensions use AutoGen’s event and trace logging:
## Next Steps

- **Model Clients**: learn how to configure different LLM providers
- **Code Executors**: set up safe code execution environments
- **Tools**: integrate external tools and APIs
- **Custom Extensions**: create your own custom extensions