Installation Problems
Command not found: make, pnpm, uv, or nginx
Problem: Missing required dependencies when running make check or make dev.
Solution: Install the missing tools with your platform's package manager. Instructions differ for macOS, Linux (Ubuntu/Debian), and Windows.
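The exact install commands vary by platform; a typical sketch (the package names and installer URLs below are assumptions based on the tools listed above, not taken from the project docs):

```shell
# macOS (Homebrew)
brew install make pnpm uv nginx

# Ubuntu/Debian
sudo apt-get update && sudo apt-get install -y make nginx
# pnpm and uv are not in the default apt repositories; use their official installers
curl -fsSL https://get.pnpm.io/install.sh | sh -
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (PowerShell) -- package IDs vary; check winget search output first
# winget install pnpm.pnpm astral-sh.uv
```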
Python version mismatch or module not found errors
Problem: Errors like ModuleNotFoundError or Python 3.12+ required.
Solution: DeerFlow requires Python 3.12 or higher. Install it using uv. If you still encounter issues after switching interpreters, recreate the virtual environment and reinstall dependencies.
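With uv, this usually comes down to pinning a compatible interpreter and re-syncing the environment (a sketch; the project may wrap these steps in its own make targets):

```shell
# Install a compatible interpreter and pin it for this project
uv python install 3.12
uv python pin 3.12

# Recreate the virtual environment and reinstall dependencies
rm -rf .venv
uv sync

# Confirm the interpreter version the project will use
uv run python --version   # should report Python 3.12.x
```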
Docker Desktop not running or Docker daemon not accessible
Problem: Cannot connect to the Docker daemon when using Docker sandbox mode.
Solution:
1. Start Docker Desktop:
   - macOS/Windows: launch the Docker Desktop application
   - Verify it's running: look for the Docker icon in the system tray
2. Check the Docker daemon is reachable.
3. Restart the Docker service (Linux).
4. Add your user to the docker group (Linux).
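Assuming a standard Docker setup, steps 2-4 above look roughly like:

```shell
# Check the daemon is reachable (fails with "Cannot connect..." if not)
docker info

# Restart the Docker service (Linux, systemd)
sudo systemctl restart docker
sudo systemctl status docker

# Allow your user to talk to the daemon without sudo (Linux)
sudo usermod -aG docker "$USER"
newgrp docker   # or log out and back in for the group change to apply
```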
Configuration Errors
Config file not found
Problem: Configuration file not found error when starting services.
Solution: The configuration file must be in the project root directory, not the backend directory.
1. Create the config from the example.
2. Verify its location.
3. Alternative: set the DEER_FLOW_CONFIG_PATH environment variable.

Config lookup order:
- DEER_FLOW_CONFIG_PATH environment variable
- backend/config.yaml (current directory)
- deer-flow/config.yaml (parent directory - recommended)
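A sketch of the steps above (the example filename `config.example.yaml` is an assumption; use whatever example file the repository ships):

```shell
# From the project root (deer-flow/), copy the example config
cp config.example.yaml config.yaml   # example filename is an assumption

# Verify the file sits in the project root, not in backend/
ls -l config.yaml

# Alternative: point DeerFlow at an explicit path
export DEER_FLOW_CONFIG_PATH=/absolute/path/to/config.yaml
```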
Invalid API key or authentication failed
Problem: Invalid API key or Authentication failed errors when making LLM requests.
Solution:
1. Verify environment variables are set.
2. Check that config.yaml uses environment variable syntax.
3. Set environment variables properly.
4. Restart services after setting the variables.
5. Test the API key directly.
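For an OpenAI-style provider, the checks might look like this (the `OPENAI_API_KEY` variable name and the `${...}` config syntax are assumptions; substitute your provider's equivalents):

```shell
# 1. Verify the variable is set (prints "set" only if non-empty)
echo "${OPENAI_API_KEY:+set}"

# 2. config.yaml should reference the variable, e.g.  api_key: ${OPENAI_API_KEY}

# 3. Export the key so child processes inherit it (add to your shell profile)
export OPENAI_API_KEY="sk-..."

# 5. Test the key directly against the provider's API
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" | head
```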
Extensions config not found or MCP servers failing to load
Problem: extensions_config.json not found or MCP servers not working.
Solution:
1. Create the extensions config from the example.
2. Verify the MCP server configuration.
3. Check environment variables in the MCP config.
4. Validate the JSON syntax.
5. Restart to reload the config.
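A sketch, assuming the example file ships alongside the real one (the `.example` filename is an assumption):

```shell
# Create the config from the shipped example
cp extensions_config.example.json extensions_config.json

# Validate the JSON syntax; exits non-zero and points at the error if invalid
python -m json.tool extensions_config.json > /dev/null && echo "JSON OK"

# Restart services so the new config is picked up
make dev   # or however you normally start DeerFlow
```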
Model not found or unsupported provider
Problem: Model 'xxx' not found or Failed to import provider.
Solution:
1. Install the required provider package. DeerFlow supports any LangChain-compatible provider; install the appropriate package.
2. Verify the model configuration in config.yaml.
3. Check for typos in the provider path:
   - Correct: langchain_openai:ChatOpenAI (underscore, not hyphen)
   - Correct: langchain_anthropic:ChatAnthropic
   - Wrong: langchain-openai:ChatOpenAI
4. Test the model directly.
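For example, with the OpenAI provider, installing the package and smoke-testing it might look like this (the model name is an assumption, and a valid `OPENAI_API_KEY` must be set for the invoke call to succeed):

```shell
# Install the LangChain provider package (note: hyphen in the pip name,
# underscore in the import path)
uv pip install langchain-openai

# Smoke-test that the provider imports and can answer a trivial request
uv run python - <<'EOF'
from langchain_openai import ChatOpenAI  # import path uses an underscore

model = ChatOpenAI(model="gpt-4o-mini")  # reads OPENAI_API_KEY from the env
print(model.invoke("Reply with OK").content)
EOF
```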
Dependency Issues
Port already in use (2024, 8001, 3000, 2026)
Problem: Address already in use when starting services.
Solution: DeerFlow uses multiple ports:
- 2024: LangGraph Server
- 8001: Gateway API
- 3000: Frontend
- 2026: Nginx (unified entry point)
- 8002: Provisioner (optional, only in Kubernetes mode)
- 8080+: Sandbox containers (dynamic)

To resolve the conflict:
1. Find the process using the port.
2. Kill the conflicting process.
3. Stop DeerFlow services properly.
4. Change ports (advanced). Edit the service configurations if you need different ports:
   - LangGraph: backend/langgraph.json
   - Gateway: backend/src/gateway/app.py
   - Frontend: frontend/package.json (dev script)
   - Nginx: nginx.conf
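On macOS/Linux, finding and killing the conflicting process typically looks like (the `make stop` target is an assumption; check the project's Makefile):

```shell
# Find the process listening on a DeerFlow port (e.g. the Gateway on 8001)
lsof -i :8001          # or: ss -ltnp | grep 8001  (Linux)

# Kill the conflicting process by PID
kill <PID>             # add -9 only if it refuses to exit

# Prefer stopping DeerFlow cleanly so its ports are released properly
make stop              # assumed make target
```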
npm or pnpm install failures
Problem: Frontend dependencies fail to install with errors like EACCES or ERESOLVE.
Solution:
1. Use pnpm instead of npm.
2. Clear the cache and reinstall.
3. Fix permissions (macOS/Linux).
4. Use Node.js 22+.
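A typical recovery sequence, run from the `frontend/` directory (the pnpm store paths vary by pnpm version and are an assumption here):

```shell
cd frontend

# Remove installed modules and prune the content-addressable store, then reinstall
rm -rf node_modules
pnpm store prune
pnpm install

# EACCES errors usually mean a root-owned cache; fix ownership (macOS/Linux)
sudo chown -R "$USER" ~/.local/share/pnpm 2>/dev/null

# Confirm the toolchain versions
node --version   # should be v22 or newer
pnpm --version
```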
General Troubleshooting
Services start but frontend shows connection error
Problem: Frontend loads but cannot connect to backend services.
Solution:
1. Verify all services are running.
2. Test the backend endpoints directly.
3. Check the nginx configuration.
4. Restart the services in order.
5. Check the frontend environment variables.
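The checks below use the ports from the table in the previous section; the exact health-check paths are assumptions:

```shell
# Are the processes listening on their ports?
lsof -i :2024 -i :8001 -i :3000 -i :2026

# Hit the backends directly, bypassing nginx (paths are assumptions)
curl -i http://localhost:8001/health
curl -i http://localhost:2024/ok        # LangGraph server liveness

# Validate the nginx config before restarting it (needs an absolute path)
nginx -t -c "$(pwd)/nginx.conf"
```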
Clean reinstall procedure
Problem: Multiple issues or corrupted state requiring a fresh start.
Solution: Perform a complete clean reinstall.
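A clean-reinstall sketch (the directory layout and make targets are assumptions; adjust to the project's Makefile):

```shell
# Stop everything first
make stop                      # assumed target

# Wipe generated state
rm -rf backend/.venv frontend/node_modules frontend/.next

# Reinstall backend and frontend dependencies
(cd backend && uv sync)
(cd frontend && pnpm install)

# Bring the stack back up
make dev
```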
Next Steps
- Sandbox Errors - Docker and container issues
- Model Configuration - LLM provider problems
- Performance Optimization - Speed and resource issues
- Configuration Guide - Complete configuration reference