Overview
Most agents forget everything when a conversation ends. DeerFlow remembers:

- User Context: Your work, preferences, and habits
- Conversation History: Recent and historical interactions
- Extracted Facts: Discrete facts with confidence scores
Memory is stored locally in backend/.deer-flow/memory.json and stays under your control. No data is sent to external services.
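The file's layout can be pictured roughly like this. This is a sketch with assumed field names, not the exact schema; the three top-level sections mirror the structure described below:

```json
{
  "user_context": { "role": "senior engineer", "editor": "VS Code" },
  "history": { "recent": ["Discussed memory design"], "historical": [] },
  "facts": [
    { "content": "Prefers Python", "category": "preference", "confidence": 0.95 }
  ]
}
```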
Memory Structure
Memory is organized into three main sections:

User Context
Current state and preferences.

History
Temporal context organization.

Facts
Discrete, scored knowledge.

How Memory Works
1. Conversation Capture
MemoryMiddleware filters relevant messages.

2. Debounced Updates
Memory updates are debounced to batch changes.

3. Fact Extraction
An LLM extracts facts from the conversation and assigns each a confidence score:

- 0.9-1.0: Explicit statements (“I prefer X”)
- 0.7-0.9: Strong inference (“I always use X”)
- 0.5-0.7: Weak inference (“I might use X”)
- Below 0.5: Discarded
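The thresholding step can be sketched as follows. The extractor itself is assumed here; only the filtering of (fact, confidence) pairs against the 0.5 cutoff is shown:

```python
# Hypothetical post-processing of LLM-extracted facts.
MIN_CONFIDENCE = 0.5  # facts scored below this are discarded

def keep_facts(extracted: list[tuple[str, float]]) -> list[dict]:
    """Filter extracted (fact, confidence) pairs by confidence score."""
    return [
        {"content": fact, "confidence": score}
        for fact, score in extracted
        if score >= MIN_CONFIDENCE
    ]

facts = keep_facts([
    ("Prefers Python over JavaScript", 0.95),  # explicit statement, kept
    ("Might use Docker", 0.4),                 # weak inference, discarded
])
```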
4. Memory Storage
Facts are merged with existing memory.

5. Context Injection
On the next conversation, memory is injected into the system prompt.

Configuration
Memory is configured in config.yaml:
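A hypothetical fragment is shown below; the key names are illustrative assumptions, not the actual schema (see the Memory Configuration page for the real options):

```yaml
# Hypothetical memory settings -- key names are illustrative only.
memory:
  enabled: true
  storage_path: backend/.deer-flow/memory.json
  min_confidence: 0.5   # discard facts scored below this
  inject_context: true  # add memory to the system prompt
```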
Memory Configuration
Detailed configuration options
Fact Categories
Facts are categorized for organization:

preference
User preferences and likes/dislikes. Examples:
- “Prefers Python over JavaScript”
- “Uses VS Code as primary editor”
- “Likes dark themes”
knowledge
User’s expertise and knowledge areas. Examples:
- “Expert in distributed systems”
- “Familiar with Kubernetes”
- “Knows React and Next.js”
context
Current situation and environment. Examples:
- “Works at TechCorp as senior engineer”
- “Based in San Francisco”
- “Team size is 8 engineers”
behavior
How the user works and makes decisions. Examples:
- “Prefers test-driven development”
- “Always writes documentation first”
- “Uses git rebase instead of merge”
goal
User’s objectives and aspirations. Examples:
- “Learning Rust for systems programming”
- “Building a SaaS product”
- “Planning to migrate to microservices”
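Internally, each fact can be thought of as a small record carrying its category and confidence. A sketch under assumed field names (the actual schema may differ):

```python
from dataclasses import dataclass

# The five categories documented above; field names are illustrative.
CATEGORIES = {"preference", "knowledge", "context", "behavior", "goal"}

@dataclass
class Fact:
    content: str
    category: str
    confidence: float

    def __post_init__(self):
        # Reject categories outside the documented set.
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")

fact = Fact("Learning Rust for systems programming", "goal", 0.8)
```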
Memory API
Manage memory via the Gateway API:

Get Memory
Reload Memory
Get Configuration
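The three operations above might be addressed from a script like this. The endpoint paths and base URL below are assumptions for illustration, not the documented routes (see the Memory API Reference for those):

```python
# Hypothetical endpoint map for the Gateway API memory operations.
# Paths and base URL are illustrative assumptions.
BASE_URL = "http://localhost:8000"

ENDPOINTS = {
    "get_memory": ("GET", "/api/memory"),
    "reload_memory": ("POST", "/api/memory/reload"),
    "get_config": ("GET", "/api/memory/config"),
}

def describe(operation: str) -> str:
    """Return 'METHOD URL' for a named memory operation."""
    method, path = ENDPOINTS[operation]
    return f"{method} {BASE_URL}{path}"
```

The returned strings can be handed to any HTTP client, e.g. `curl -X POST` for the reload operation.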
Memory API Reference
Complete API documentation
Python Client
Access memory programmatically.

Best Practices
Provide explicit information
The more explicit you are, the higher the confidence:

✅ Good: “I prefer Python because it’s more readable”
❌ Bad: “I guess Python is okay”
Correct inaccuracies
If the agent has wrong information, correct it:

“Actually, I don’t use VS Code anymore. I switched to Neovim last month.”

The system will update or replace the fact.
Review memory periodically
Check memory data for accuracy. Manually edit backend/.deer-flow/memory.json if needed.

Adjust confidence threshold
If you get too many low-quality facts, raise the confidence threshold in config.yaml.
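Raising the threshold might look like this in config.yaml (the key name is an assumption about the schema):

```yaml
memory:
  min_confidence: 0.7  # hypothetical key; keep only strong inferences and explicit statements
```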
Memory Privacy
Memory is completely private:

- Stored locally in backend/.deer-flow/memory.json
- Never sent to external services
- Only used for prompt injection in your own agent
- Fully under your control
Troubleshooting
Memory not persisting
Check that memory is enabled in the configuration, and verify that the storage path exists.
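A quick way to check the storage path, as a sketch to be run from the repository root:

```python
from pathlib import Path

# Default storage location from the docs.
mem_path = Path("backend/.deer-flow/memory.json")

# Report whether the memory file has been created yet.
status = "exists" if mem_path.exists() else "missing"
print(f"{mem_path}: {status}")
```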
Too many low-quality facts
Raise the confidence threshold in config.yaml.
Memory not appearing in prompts
Check that context injection is enabled in the configuration, and verify that facts exist in backend/.deer-flow/memory.json.
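Verifying that facts exist can be sketched like this. The `facts` key is an assumption about the file's schema; an inline sample stands in for the real memory.json:

```python
import json

def count_facts(raw: str) -> int:
    """Count facts in a memory.json payload.

    The 'facts' key is an assumption about the file's schema.
    """
    data = json.loads(raw)
    return len(data.get("facts", []))

# Inline sample standing in for backend/.deer-flow/memory.json.
sample = '{"facts": [{"content": "Prefers Python", "confidence": 0.95}]}'
print(count_facts(sample))  # a count of 0 suggests extraction never ran
```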
Next Steps
Memory Configuration
Configure memory system
Memory API
API reference
Context Engineering
How memory injection works
Agent System
Learn about the agent