## Overview
The Model Context Protocol (MCP) standardizes how applications provide context to LLMs. LiteLLM provides a client implementation for connecting to MCP servers and using their tools, prompts, and resources.

## What is MCP?
MCP enables:

- Tools: External functions that LLMs can call (e.g., database queries, API calls)
- Prompts: Pre-configured prompt templates with variables
- Resources: Context data like files, database records, or API responses
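For example, a tool is described to the client as a name plus a JSON Schema for its arguments. The dict below is illustrative (the field names follow the MCP specification; the tool itself is made up):

```python
# Illustrative MCP tool definition: a name, a description, and a JSON
# Schema ("inputSchema") describing the arguments the tool accepts.
weather_tool = {
    "name": "get_weather",
    "description": "Look up the current weather for a city",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}
```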
## Supported Transports
- HTTP: Connect to HTTP-based MCP servers
- SSE: Server-Sent Events for real-time updates
- Stdio: Process-based communication
## Installation
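Assuming the experimental client ships with the main `litellm` package and uses the official MCP Python SDK for transports, installation is:

```shell
pip install litellm mcp
```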
## Quick Start
## Tool Integration

### Load MCP Tools for LLM Use
Convert MCP tools to an OpenAI-compatible format with `load_mcp_tools`, as in the Quick Start above.

### Execute Tool Calls
Handle tool calls from LLM responses by executing them against the MCP session and feeding the results back to the model.

## Transport Types
## Authentication

### Bearer Token

### Basic Auth

### API Key

### Custom Headers
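All four schemes above reduce to request headers on the HTTP-based transports (assumed; the header names for the API-key and custom cases are common conventions, not fixed by MCP). A stdlib-only sketch:

```python
import base64

# Bearer token:
bearer = {"Authorization": "Bearer YOUR_TOKEN"}

# Basic auth: "user:password", base64-encoded per RFC 7617.
credentials = base64.b64encode(b"user:password").decode()
basic = {"Authorization": f"Basic {credentials}"}

# API key (header name varies by server; X-API-Key is a common choice):
api_key = {"X-API-Key": "YOUR_KEY"}

# Arbitrary custom headers:
custom = {"X-Custom-Header": "value", "X-Request-Source": "litellm"}
```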
## Working with Prompts

### List Available Prompts

### Get a Prompt
## Working with Resources

### List Resources

### Read a Resource

### Resource Templates
## Advanced Features

### Tool Transformation
Manually transform between MCP and OpenAI formats.

#### MCP to OpenAI Tool Format
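A stdlib-only sketch of the mapping (the actual helper lives in `tools.py`; this illustrates the shape, not litellm's exact implementation):

```python
def mcp_tool_to_openai(tool: dict) -> dict:
    """Map an MCP tool definition onto the OpenAI function-calling shape."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP's inputSchema is already JSON Schema, which is what
            # OpenAI's `parameters` field expects.
            "parameters": tool.get("inputSchema",
                                   {"type": "object", "properties": {}}),
        },
    }

openai_tool = mcp_tool_to_openai(
    {"name": "get_weather", "inputSchema": {"type": "object", "properties": {}}}
)
```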
#### OpenAI to MCP Request Format
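Going the other way, an OpenAI tool call carries its arguments as a JSON string that must be decoded before calling the MCP tool. An illustrative sketch:

```python
import json

def openai_call_to_mcp(tool_call: dict) -> tuple:
    """Split an OpenAI tool call into (name, arguments) as expected by
    session.call_tool(name, arguments)."""
    function = tool_call["function"]
    arguments = json.loads(function.get("arguments") or "{}")
    return function["name"], arguments

call = {"id": "call_1", "type": "function",
        "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'}}
name, args = openai_call_to_mcp(call)
```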
### SSL Configuration
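For HTTPS servers, TLS behavior can be controlled with a standard `ssl.SSLContext` (assuming the HTTP transport accepts one or an equivalent verify option; the CA path is a placeholder):

```python
import ssl

# Default: verify the server certificate against the system CA bundle.
strict = ssl.create_default_context()

# Self-signed development server: trust a specific CA file instead.
# dev = ssl.create_default_context(cafile="./dev-ca.pem")

# Disable verification entirely (local testing only):
insecure = ssl.create_default_context()
insecure.check_hostname = False
insecure.verify_mode = ssl.CERT_NONE
```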
### Progress Callbacks
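One shape such a callback can take; the `(progress, total, message)` signature mirrors what recent MCP Python SDK versions pass to `progress_callback` on `call_tool` (assumed; check your SDK version):

```python
from typing import Optional

def format_progress(progress: float, total: Optional[float] = None,
                    message: Optional[str] = None) -> str:
    """Render one progress update as a log line."""
    pct = f"{progress / total:.0%}" if total else f"{progress}"
    text = f"tool progress: {pct}"
    return f"{text} ({message})" if message else text
```

A callback that prints `format_progress(...)` would then be passed alongside the tool name and arguments when invoking a long-running tool.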
Progress callbacks let you monitor long-running tool executions.

## Complete Example
### Full LLM + MCP Integration
## Error Handling
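MCP calls can fail for transport reasons (server exited, timeout) as well as protocol errors (`McpError` in the Python SDK). One pattern is to convert failures into values the caller can inspect; a sketch, demonstrated with a stub session:

```python
import asyncio

async def safe_call_tool(session, name: str, arguments: dict):
    """Call an MCP tool, returning (result, error) instead of raising."""
    try:
        result = await session.call_tool(name, arguments)
        return result, None
    except Exception as exc:  # McpError, dropped connections, timeouts, ...
        return None, f"{type(exc).__name__}: {exc}"

# Stub session for illustration only:
class _StubSession:
    async def call_tool(self, name, arguments):
        raise TimeoutError("server did not respond")

result, error = asyncio.run(safe_call_tool(_StubSession(), "search", {}))
# error -> "TimeoutError: server did not respond"
```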
## Best Practices

### Session Management
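A stdlib-only stand-in for the kind of scoping a `run_with_session` helper provides (illustrative; the names and stubs here are not litellm's API): the session is always closed, even if the body raises.

```python
import asyncio
import contextlib

@contextlib.asynccontextmanager
async def session_scope(connect, close):
    """Open a session, yield it, and guarantee cleanup on exit."""
    session = await connect()
    try:
        yield session
    finally:
        await close(session)

# Demonstration with stub connect/close functions:
log = []

async def _connect():
    log.append("open")
    return "session"

async def _close(session):
    log.append("close")

async def _demo():
    try:
        async with session_scope(_connect, _close):
            raise RuntimeError("body failed")
    except RuntimeError:
        pass

asyncio.run(_demo())
# log is now ["open", "close"]: cleanup ran despite the exception.
```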
Always use `run_with_session` to ensure proper cleanup.

### Tool Validation
Validate tool schemas before using them with LLMs.
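A minimal structural check (illustrative, not a full JSON Schema validation) that an entry has the OpenAI function-calling shape before it is sent to a model:

```python
def is_valid_openai_tool(tool: dict) -> bool:
    """Check the minimum structure an OpenAI-format tool entry needs."""
    fn = tool.get("function") or {}
    params = fn.get("parameters") or {}
    return (
        tool.get("type") == "function"
        and bool(fn.get("name"))
        and params.get("type") == "object"
    )

good = {"type": "function",
        "function": {"name": "search",
                     "parameters": {"type": "object", "properties": {}}}}
bad = {"type": "function", "function": {"name": ""}}
```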
### Error Recovery
Implement graceful degradation.
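For example, if the MCP server is unreachable, the completion can still proceed without tools rather than failing the whole request. A sketch, demonstrated with a stub loader:

```python
import asyncio

async def load_tools_or_empty(load_tools):
    """Try to load MCP tools; fall back to an empty tool list."""
    try:
        return await load_tools()
    except Exception:
        return []  # the LLM call can still run, just without tools

# Stub loader simulating an unreachable server:
async def _unreachable():
    raise ConnectionError("MCP server down")

tools = asyncio.run(load_tools_or_empty(_unreachable))
# tools == []
```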
## Reference

### Source Code

- MCP Client: `litellm/experimental_mcp_client/client.py:53`
- Tool utilities: `litellm/experimental_mcp_client/tools.py:18`