Introduction
The DeerFlow Gateway API provides RESTful endpoints for managing models, skills, MCP servers, memory, file uploads, and artifacts. It runs as a separate FastAPI service alongside the LangGraph Server.

Base URL

The API Gateway runs on port 8001 by default, and all endpoints are prefixed with /api:

http://localhost:8001/api
Architecture
The Gateway API is built with FastAPI and provides:

- Models Management: Query available AI models and their capabilities
- Custom Agents: Create and manage specialized agents with SOUL.md profiles
- Skills Management: List, enable/disable, and install custom skills
- MCP Configuration: Manage Model Context Protocol server configurations
- Memory Management: Access and reload global memory data
- File Uploads: Upload files to thread-specific directories
- Artifacts: Access and download thread-generated files
Authentication
The Gateway API currently does not require authentication for local development. CORS is handled by nginx in production deployments.

CORS Origins: Configurable via the CORS_ORIGINS environment variable (default: http://localhost:3000)
Configuration
The Gateway can be configured using environment variables:

- Host to bind the gateway server
- Port to bind the gateway server
- CORS_ORIGINS: Comma-separated list of allowed CORS origins
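For example, CORS origins can be set before starting the service. Only CORS_ORIGINS is named on this page; consult the project source for the host and port variable names. The second origin below is purely illustrative.

```shell
# CORS_ORIGINS is documented above; the default is http://localhost:3000.
export CORS_ORIGINS="http://localhost:3000,https://app.example.com"
```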
Error Handling
The API uses standard HTTP status codes:

- 200 OK: Request succeeded
- 400 Bad Request: Invalid request parameters or malformed data
- 403 Forbidden: Access denied (e.g., path traversal attempt)
- 404 Not Found: Resource not found
- 409 Conflict: Resource already exists (e.g., duplicate skill)
- 500 Internal Server Error: Server error
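On the client side, these documented codes can be turned into readable error messages. A small sketch (the helper name is our own, not part of the API):

```python
# Map the Gateway's documented status codes to the meanings listed above.
GATEWAY_STATUS = {
    200: "Request succeeded",
    400: "Invalid request parameters or malformed data",
    403: "Access denied (e.g., path traversal attempt)",
    404: "Resource not found",
    409: "Resource already exists (e.g., duplicate skill)",
    500: "Server error",
}

def explain_status(code: int) -> str:
    """Return the documented meaning of a Gateway status code."""
    return GATEWAY_STATUS.get(code, f"Undocumented status code {code}")
```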
Error Response Format
Error responses follow this format:
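FastAPI services return errors in a JSON envelope by default; the body below is a sketch of that default shape, and the exact fields and message text may differ in DeerFlow.

```json
{
  "detail": "Resource not found"
}
```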
Interactive Documentation

The Gateway API provides interactive documentation:

- Swagger UI: http://localhost:8001/docs
- ReDoc: http://localhost:8001/redoc
- OpenAPI Schema: http://localhost:8001/openapi.json
Health Check
Check the Gateway service health:
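A typical check with curl; the /health path is an assumption (FastAPI services commonly expose one), so verify the actual path against the OpenAPI schema at http://localhost:8001/openapi.json.

```shell
# Hypothetical health endpoint path; confirm via the OpenAPI schema.
curl http://localhost:8001/health
```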
Next Steps

- Models API: Query available AI models
- Custom Agents API: Create specialized agents
- Skills API: Manage skills and extensions
- MCP API: Configure MCP servers
- Memory API: Access global memory
- Uploads API: Manage file uploads