Overview
AgentChat supports memory through the Memory interface. Agents can query memory systems to augment their context before generating responses.
Memory interface
Memory systems implement the Memory protocol:
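The library's actual protocol is asynchronous; the following pure-Python sketch only illustrates its shape. The add/query/clear method names are modeled on AgentChat's protocol, but this toy class is not the library API:

```python
from dataclasses import dataclass, field

@dataclass
class MemoryContent:
    """Illustrative memory entry: raw content plus optional metadata."""
    content: str
    metadata: dict = field(default_factory=dict)

class ListMemory:
    """Toy list-backed memory; the real implementation is async and richer."""
    def __init__(self) -> None:
        self._store: list[MemoryContent] = []

    def add(self, item: MemoryContent) -> None:
        self._store.append(item)

    def query(self, text: str) -> list[MemoryContent]:
        # Naive relevance: keep entries sharing at least one word with the query.
        words = set(text.lower().split())
        return [m for m in self._store if words & set(m.content.lower().split())]

    def clear(self) -> None:
        self._store.clear()

memory = ListMemory()
memory.add(MemoryContent("the user prefers metric units"))
print(memory.query("which units does the user prefer")[0].content)
# → the user prefers metric units
```

An agent holding such a memory would call query with the incoming message and splice the results into its model context before generating a reply.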
Vector memory
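The core idea: documents are embedded as vectors, and a query retrieves the nearest ones. In this self-contained sketch a bag-of-words count stands in for a real embedding model, and the VectorMemory class is illustrative rather than an actual AgentChat type:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": word counts; real systems use dense model embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorMemory:
    """Toy vector store: rank documents by cosine similarity to the query."""
    def __init__(self) -> None:
        self._docs: list[tuple[Counter, str]] = []

    def add(self, text: str) -> None:
        self._docs.append((embed(text), text))

    def query(self, text: str, k: int = 1) -> list[str]:
        q = embed(text)
        ranked = sorted(self._docs, key=lambda d: cosine(q, d[0]), reverse=True)
        return [t for _, t in ranked[:k]]

mem = VectorMemory()
mem.add("refunds are processed within five business days")
mem.add("the office is closed on public holidays")
print(mem.query("how long do refunds take?"))
# → ['refunds are processed within five business days']
```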
Vector memory supports semantic search over documents.
Agent state
Agents maintain state across messages through AgentState:
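A sketch of the save/restore pattern, where saving produces JSON-serializable data. The method names mirror AgentChat's save_state/load_state, but this toy class is illustrative:

```python
import json

class StatefulAgent:
    """Toy agent whose conversation state can be exported and restored."""
    def __init__(self, name: str) -> None:
        self.name = name
        self.history: list[dict] = []

    def on_message(self, content: str) -> None:
        self.history.append({"source": "user", "content": content})

    def save_state(self) -> dict:
        # Everything needed to reconstruct the agent, as plain JSON-safe data.
        return {"name": self.name, "history": list(self.history)}

    def load_state(self, state: dict) -> None:
        self.name = state["name"]
        self.history = list(state["history"])

agent = StatefulAgent("assistant")
agent.on_message("hello")
snapshot = json.dumps(agent.save_state())  # serialize for a later session

restored = StatefulAgent("assistant")
restored.load_state(json.loads(snapshot))
print(restored.history)  # → [{'source': 'user', 'content': 'hello'}]
```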
Team state
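A team's state is essentially the shared message thread plus each member's own state. A hypothetical sketch of that aggregation, not the library's actual team implementation:

```python
class Member:
    """Toy team member with its own saveable state."""
    def __init__(self, name: str) -> None:
        self.name = name
        self.notes: list[str] = []

    def save_state(self) -> dict:
        return {"notes": list(self.notes)}

    def load_state(self, state: dict) -> None:
        self.notes = list(state["notes"])

class Team:
    """Toy team: saving gathers the shared thread and every member's state."""
    def __init__(self, members: list[Member]) -> None:
        self.members = members
        self.thread: list[str] = []

    def save_state(self) -> dict:
        return {
            "thread": list(self.thread),
            "members": {m.name: m.save_state() for m in self.members},
        }

    def load_state(self, state: dict) -> None:
        self.thread = list(state["thread"])
        for m in self.members:
            m.load_state(state["members"][m.name])

team = Team([Member("planner"), Member("coder")])
team.thread.append("user: build a widget")
team.members[0].notes.append("plan drafted")
snapshot = team.save_state()

fresh = Team([Member("planner"), Member("coder")])
fresh.load_state(snapshot)
print(fresh.members[0].notes)  # → ['plan drafted']
```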
Teams also maintain state.
Conversation history
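The mechanism is simple: every user message and every reply is appended to a per-agent message list. A sketch, with an echo reply standing in for the model call:

```python
class Agent:
    """Toy agent that accumulates conversation history with each exchange."""
    def __init__(self) -> None:
        self.history: list[dict] = []

    def respond(self, user_msg: str) -> str:
        self.history.append({"role": "user", "content": user_msg})
        reply = f"echo: {user_msg}"  # stand-in for an LLM call
        self.history.append({"role": "assistant", "content": reply})
        return reply

a = Agent()
a.respond("hi")
a.respond("bye")
print(len(a.history))  # → 4: two user turns, two assistant turns
```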
Agents automatically maintain conversation history.
Context management
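The buffered-context idea is to cap how many recent messages are forwarded to the model on each call. This sketch is modeled on what BufferedChatCompletionContext does, but the implementation here is illustrative:

```python
class BufferedContext:
    """Keep only the most recent buffer_size messages when building the prompt."""
    def __init__(self, buffer_size: int) -> None:
        self.buffer_size = buffer_size
        self._messages: list[str] = []

    def add_message(self, message: str) -> None:
        self._messages.append(message)

    def get_messages(self) -> list[str]:
        # Only the tail of the history is ever sent to the model.
        return self._messages[-self.buffer_size:]

ctx = BufferedContext(buffer_size=2)
for m in ["m1", "m2", "m3", "m4"]:
    ctx.add_message(m)
print(ctx.get_messages())  # → ['m3', 'm4']
```

The full history is still retained internally; only the view given to the model is truncated, which bounds token usage on long conversations.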
Control how much context is sent to the LLM.
Memory with RAG
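A minimal sketch of the RAG flow: retrieve passages relevant to the question, then prepend them to the prompt before calling the model. The keyword retriever and document set here are hypothetical stand-ins for a vector memory and real knowledge base:

```python
DOCS = {
    "returns": "Items can be returned within 30 days.",
    "delivery": "Standard delivery takes 3-5 days.",
}

def retrieve(query: str) -> list[str]:
    # Stand-in retriever: keyword overlap instead of embedding search.
    words = set(query.lower().split())
    return [text for key, text in DOCS.items()
            if key in words or words & set(text.lower().split())]

def build_prompt(query: str) -> str:
    # RAG step: retrieved passages are prepended to the user question.
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("what is your returns policy?"))
```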
Combine memory with Retrieval-Augmented Generation.
Persistent memory
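A sketch of session persistence: serialize saved state to disk, and load it when a new session starts. The file layout and helper names are hypothetical:

```python
import json
import os
import tempfile

def save_session(path: str, state: dict) -> None:
    with open(path, "w") as f:
        json.dump(state, f)

def load_session(path: str) -> dict:
    if not os.path.exists(path):
        return {"history": []}  # no file yet: start a fresh session
    with open(path) as f:
        return json.load(f)

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "agent_state.json")
    state = load_session(path)  # fresh state on first run
    state["history"].append("user: remember my name is Ada")
    save_session(path, state)

    restored = load_session(path)  # simulates a later session
    print(restored["history"])  # → ['user: remember my name is Ada']
```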
Save and restore agent state across sessions.
Memory events
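A sketch of the observable pattern: the memory invokes a callback with a query event each time it is consulted. The event shape echoes the idea behind AgentChat's MemoryQueryEvent but is purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class MemoryQueryEvent:
    """Illustrative event payload: what was asked and what memory returned."""
    query: str
    results: list

class ObservableMemory:
    def __init__(self, entries: list[str], on_event) -> None:
        self.entries = entries
        self.on_event = on_event  # callback invoked on every query

    def query(self, text: str) -> list[str]:
        results = [e for e in self.entries if text.lower() in e.lower()]
        self.on_event(MemoryQueryEvent(query=text, results=results))
        return results

events: list[MemoryQueryEvent] = []
mem = ObservableMemory(["Ada likes tea", "Bob likes coffee"], events.append)
mem.query("tea")
print(events[0])  # → MemoryQueryEvent(query='tea', results=['Ada likes tea'])
```

Subscribing to such events is useful for logging and for debugging why a given piece of context reached the model.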
Agents emit memory query events.
Best practices
Use semantic search for large document sets
Vector memory with embeddings is ideal for searching large knowledge bases.
Limit context window for long conversations
Use BufferedChatCompletionContext to prevent context overflow and reduce costs.
Separate short-term and long-term memory
Keep recent conversation in agent state, use vector memory for long-term knowledge.
Persist important state
Save agent state after critical interactions for recovery and continuity.
Memory integration examples
Example: FAQ bot with memory
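A self-contained sketch of an FAQ bot that stores question/answer pairs in memory and answers by word overlap. All data and class names are hypothetical; a real bot would pair vector memory with a model client:

```python
class FaqBot:
    """Toy FAQ bot: looks up stored Q/A pairs before answering."""
    def __init__(self) -> None:
        self.memory: list[tuple[str, str]] = []

    def learn(self, question: str, answer: str) -> None:
        self.memory.append((question, answer))

    def answer(self, question: str) -> str:
        # Pick the stored question sharing the most words with the new one.
        words = set(question.lower().split())
        best, best_score = "Sorry, I don't know.", 0
        for known_q, known_a in self.memory:
            score = len(words & set(known_q.lower().split()))
            if score > best_score:
                best, best_score = known_a, score
        return best

bot = FaqBot()
bot.learn("how do I reset my password", "Use the 'Forgot password' link.")
bot.learn("how do I contact support", "Email the support address on the site.")
print(bot.answer("password reset help"))  # → Use the 'Forgot password' link.
```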
Example: Conversation summarization
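A sketch of history compaction: older turns are collapsed into one summary message while recent turns stay verbatim. The summarize stand-in just keeps speaker names; a real implementation would ask the model for the summary:

```python
def summarize(messages: list[str]) -> str:
    # Stand-in for an LLM summarization call: keep each turn's speaker label.
    return "Summary: " + "; ".join(m.split(":", 1)[0] for m in messages)

def compact(history: list[str], keep_last: int = 2) -> list[str]:
    """Replace all but the most recent turns with a single summary message."""
    if len(history) <= keep_last:
        return history
    older, recent = history[:-keep_last], history[-keep_last:]
    return [summarize(older)] + recent

history = [
    "user: hi",
    "assistant: hello",
    "user: book a flight",
    "assistant: where to?",
]
print(compact(history))
# → ['Summary: user; assistant', 'user: book a flight', 'assistant: where to?']
```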
Next steps
Extensions Overview: advanced extension patterns
State Management: team state patterns
Examples: see memory in action
Custom Agents: build memory-aware agents