Thread-based conversation persistence with semantic recall and working memory in Mastra
Memory in Mastra provides thread-based conversation persistence with semantic recall, working memory, and observational memory. It enables agents to maintain context across conversations and recall relevant information.
```typescript
import { Agent } from '@mastra/core/agent';
import { Memory } from '@mastra/memory';
import { PostgresStore } from '@mastra/postgres';

const memory = new Memory({
  name: 'conversation',
  storage: new PostgresStore({
    connectionString: process.env.DATABASE_URL
  }),
  options: {
    lastMessages: 20 // Include last 20 messages in context
  }
});

const agent = new Agent({
  id: 'assistant',
  instructions: 'You are a helpful assistant',
  model: 'openai/gpt-5',
  memory
});

// Conversation persists across calls
const result1 = await agent.generate('My name is Alice', {
  threadId: 'user-123'
});

const result2 = await agent.generate('What is my name?', {
  threadId: 'user-123'
});

console.log(result2.text); // "Your name is Alice"
```
Memory organizes conversations using threads and resources:
- Thread: a single conversation (e.g., a chat session)
- Resource: an entity that owns threads (e.g., a user)
```typescript
const result = await agent.generate('Hello', {
  threadId: 'thread-456', // Specific conversation
  resourceId: 'user-123'  // User who owns the thread
});

// List threads for a user
const threads = await memory.listThreads({
  resourceId: 'user-123'
});

// Get messages from a thread
const messages = await memory.getMessages({
  threadId: 'thread-456'
});
```
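Thread and resource IDs are plain strings, so a common pattern is to derive them deterministically from your application's own user and session identifiers. A minimal sketch, assuming you hold those identifiers already (the helper names below are illustrative, not part of the Mastra API):

```typescript
// Illustrative helpers for deriving Mastra thread/resource IDs
// from application-level user and session identifiers.
// These function names are hypothetical, not part of Mastra.

function resourceIdForUser(userId: string): string {
  return `user-${userId}`;
}

function threadIdForSession(userId: string, sessionId: string): string {
  // One thread per (user, session) pair keeps conversations isolated
  // while all of a user's threads share one resource ID.
  return `user-${userId}-session-${sessionId}`;
}

console.log(resourceIdForUser('123'));         // "user-123"
console.log(threadIdForSession('123', 'abc')); // "user-123-session-abc"
```

Deterministic IDs mean a returning user resumes the same thread without any lookup table on your side.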
Retrieve relevant past messages using vector similarity:
```typescript
import { Agent } from '@mastra/core/agent';
import { Memory } from '@mastra/memory';
import { PostgresStore } from '@mastra/postgres';
import { PineconeVector } from '@mastra/pinecone';

const memory = new Memory({
  name: 'conversation',
  storage: new PostgresStore({
    connectionString: process.env.DATABASE_URL
  }),
  vector: new PineconeVector({
    apiKey: process.env.PINECONE_API_KEY,
    indexName: 'conversations'
  }),
  embedder: 'openai/text-embedding-3-small',
  options: {
    semanticRecall: {
      topK: 5,       // Retrieve the 5 most similar messages
      threshold: 0.7 // Minimum similarity score
    }
  }
});

const agent = new Agent({
  id: 'assistant',
  model: 'openai/gpt-5',
  memory
});

// The agent automatically recalls relevant past messages
const result = await agent.generate(
  'What did I say about my vacation plans?',
  { threadId: 'user-123' }
);
// Memory retrieves similar messages from history
```
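To build intuition for what `topK` and `threshold` control, here is a self-contained sketch of threshold-filtered top-K retrieval over cosine similarity. This is a simplified stand-in for what the vector store does internally, with made-up two-dimensional embeddings; real embeddings come from the configured embedder and retrieval runs inside the store:

```typescript
// Simplified sketch of threshold-filtered top-K retrieval,
// illustrating the semanticRecall options above. The embeddings
// here are toy 2-D vectors, not real model output.

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

function recall(
  query: number[],
  history: { text: string; embedding: number[] }[],
  topK: number,
  threshold: number
): string[] {
  return history
    .map((m) => ({ text: m.text, score: cosineSimilarity(query, m.embedding) }))
    .filter((m) => m.score >= threshold) // drop weak matches
    .sort((a, b) => b.score - a.score)   // most similar first
    .slice(0, topK)                      // keep at most topK
    .map((m) => m.text);
}

const history = [
  { text: 'I am planning a vacation to Lisbon', embedding: [1, 0] },
  { text: 'My favorite color is green', embedding: [0, 1] },
  { text: 'Booking flights for the trip', embedding: [0.9, 0.1] }
];

console.log(recall([1, 0], history, 5, 0.7));
// → ["I am planning a vacation to Lisbon", "Booking flights for the trip"]
```

Raising `threshold` trades recall for precision: the unrelated message about colors scores 0 here and is filtered out before the top-K cut.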
Observational memory learns patterns from a user's conversations over time:

```typescript
const memory = new Memory({
  name: 'conversation',
  options: {
    observationalMemory: {
      enabled: true,
      updateInterval: 5 // Update after every 5 messages
    }
  }
});

// Memory tracks patterns like:
// - Communication style
// - Common topics
// - Interaction patterns
// - User preferences

const agent = new Agent({ memory });

// The agent adapts based on learned patterns
const result = await agent.generate('How should I plan my day?', {
  resourceId: 'user-123'
});
// The response considers learned preferences and patterns
```
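Conceptually, this kind of memory accumulates per-resource observations as messages arrive and refreshes its learned summary at a fixed interval. A minimal, hypothetical sketch of that bookkeeping, mirroring the `updateInterval` option above (this is not the actual Mastra implementation):

```typescript
// Hypothetical sketch of interval-based observation updates.
// Not the real Mastra implementation; it only illustrates the
// "refresh every N messages" idea behind updateInterval.

class ObservationTracker {
  private counts = new Map<string, number>();
  private topicTallies = new Map<string, Map<string, number>>();

  constructor(private updateInterval: number) {}

  // Record a message; return true when a summary refresh is due.
  observe(resourceId: string, topic: string): boolean {
    const n = (this.counts.get(resourceId) ?? 0) + 1;
    this.counts.set(resourceId, n);

    const tally = this.topicTallies.get(resourceId) ?? new Map<string, number>();
    tally.set(topic, (tally.get(topic) ?? 0) + 1);
    this.topicTallies.set(resourceId, tally);

    return n % this.updateInterval === 0;
  }

  // Most common topics so far, e.g. to bias agent responses.
  commonTopics(resourceId: string): string[] {
    const tally = this.topicTallies.get(resourceId) ?? new Map<string, number>();
    return [...tally.entries()]
      .sort((a, b) => b[1] - a[1])
      .map(([topic]) => topic);
  }
}

const tracker = new ObservationTracker(5); // refresh every 5 messages
const incoming = ['planning', 'planning', 'travel', 'planning', 'travel'];
const refreshes = incoming.map((t) => tracker.observe('user-123', t));

console.log(refreshes);                        // [false, false, false, false, true]
console.log(tracker.commonTopics('user-123')); // ["planning", "travel"]
```

Batching updates this way keeps the learned profile current without paying a summarization cost on every single message.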