Introduction
The DeerFlowClient is an embedded Python client that provides direct programmatic access to DeerFlow’s agent capabilities without requiring separate LangGraph Server or Gateway API processes.
The Python Client implements the same protocol as the Gateway API, making it easy to switch between embedded mode and HTTP streaming without changing your event-handling logic.
Installation
The client is included in the DeerFlow backend package; no separate installation is required.
Basic Usage
Simple Chat
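A minimal single-turn sketch. The import path and method name (`DeerFlowClient`, `chat`) are assumptions for illustration; check the package for the actual entry points.

```python
# Hypothetical usage sketch -- import path and method names are assumptions.
from deerflow.client import DeerFlowClient

client = DeerFlowClient()  # uses default config resolution

# Single-turn, blocking call; returns the final response.
result = client.chat("Summarize the latest AI research trends.")
print(result)
```

Without a checkpointer, each call like this is stateless (see Initialization Options below).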
Streaming Responses
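A streaming sketch under the same assumptions: a `stream()` method that yields events in the LangGraph SSE event shape. The exact event schema shown is illustrative, not the documented one.

```python
# Hypothetical streaming sketch -- method name and event shape are assumptions.
from deerflow.client import DeerFlowClient

client = DeerFlowClient()

# stream() yields events as the agent works; event types follow the
# LangGraph SSE protocol, matching the Gateway API.
for event in client.stream("Explain quantum computing in one paragraph."):
    if event["event"] == "messages":
        print(event["data"]["content"], end="", flush=True)
```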
Query Configuration
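Per-query options can be sketched as keyword arguments on the call itself. The parameter names here (`thread_id`, `model`) and the model string are assumptions; only thread-scoped state and model override are described by this document.

```python
# Hypothetical per-query options -- parameter names and values are assumptions.
from deerflow.client import DeerFlowClient

client = DeerFlowClient()

result = client.chat(
    "Compare Rust and Go for systems programming.",
    thread_id="research-42",  # conversation thread (requires a checkpointer)
    model="some-model-name",  # per-call override of the configured model
)
```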
Initialization Options
The client accepts several configuration parameters:
- Path to config.yaml; uses default resolution if None.
- LangGraph checkpointer instance for state persistence. Required for multi-turn conversations on the same thread_id; without a checkpointer, each call is stateless.
- Override of the default model name from config.
- Toggle for the model’s extended thinking mode.
- Toggle for subagent delegation capabilities.
- Toggle for the TodoList middleware for plan mode.
Example: Custom Configuration
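A constructor sketch covering the options listed above. The keyword names are assumptions inferred from those descriptions; `MemorySaver` is LangGraph’s in-memory checkpointer.

```python
# Hypothetical configuration sketch -- constructor parameter names are
# assumptions based on the options listed under Initialization Options.
from langgraph.checkpoint.memory import MemorySaver
from deerflow.client import DeerFlowClient

client = DeerFlowClient(
    config_path="config.yaml",   # explicit config instead of default resolution
    checkpointer=MemorySaver(),  # enables multi-turn state on a thread_id
    model="my-model",            # override the configured default model
    enable_thinking=True,        # extended thinking mode
    enable_subagents=True,       # subagent delegation
    enable_todo=True,            # TodoList middleware for plan mode
)
```

MemorySaver is suitable for single-process use; a persistent checkpointer (e.g. a database-backed one) would be the choice when conversations must survive restarts.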
Gateway API Alignment
The DeerFlowClient is designed to align with the Gateway API:
- Event types match the LangGraph SSE protocol
- Response schemas mirror Gateway API responses
- File operations use the same virtual path structure
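Because the event types match, one handler can serve both transports. The event shape below is an illustrative assumption, not the documented schema:

```python
# A handler written against the Gateway API's SSE event types also works
# for embedded streaming, since the protocols match. The exact event
# dict shape here is an illustrative assumption.
def handle_event(event: dict):
    """Return printable text for message chunks, None for other events."""
    if event.get("event") == "messages":
        return event.get("data", {}).get("content")
    return None

# The same handler can consume events from embedded streaming or from
# an SSE connection to the Gateway API.
print(handle_event({"event": "messages", "data": {"content": "hello"}}))
```

This symmetry is what makes switching between embedded mode and HTTP streaming a transport change rather than a logic change.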
Multi-Turn Conversations
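A multi-turn sketch: per the initialization options above, a checkpointer plus a shared thread_id carries state across calls. Method and parameter names remain assumptions.

```python
# Hypothetical multi-turn sketch -- a checkpointer plus a shared thread_id
# persists conversation state across calls; names are assumptions.
from langgraph.checkpoint.memory import MemorySaver
from deerflow.client import DeerFlowClient

client = DeerFlowClient(checkpointer=MemorySaver())

client.chat("My name is Ada.", thread_id="intro")
reply = client.chat("What is my name?", thread_id="intro")
# With the checkpointer in place, the second call can recall "Ada".
# Omitting the checkpointer (or using a new thread_id) makes each call stateless.
```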
Agent Recreation
The system prompt (including date, memory, and skills context) is generated when the internal agent is first created and cached until the configuration key changes.
Call reset_agent() to force a refresh in long-running processes after:
- External changes to memory
- Skill installations
- Configuration updates that should be reflected in the system prompt
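A sketch of the refresh cycle; `reset_agent()` comes from this document, while the surrounding client calls are assumptions.

```python
# Hypothetical sketch: refresh the cached system prompt in a long-running
# process after out-of-band changes (memory edits, skill installs, config).
from deerflow.client import DeerFlowClient

client = DeerFlowClient()
client.chat("Hello")        # internal agent (and system prompt) created and cached

# ... memory is edited or a skill is installed externally ...

client.reset_agent()        # drop the cached agent
client.chat("Hello again")  # next call rebuilds it with a fresh system prompt
```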
API Reference
For detailed method documentation, see:
- Chat: single-message request/response
- Streaming: real-time event streaming
- Configuration: models, skills, memory, and file management