Long-term memory storage and semantic search for AI agents
Memory services enable agents to store and retrieve relevant information from past conversations. Unlike sessions, which track active conversations, memory provides long-term context that persists indefinitely.
```typescript
import { MemoryService, InMemoryStorageProvider } from '@iqai/adk';

// Minimal configuration - no summarization or embeddings
const memoryService = new MemoryService({
  storage: new InMemoryStorageProvider(),
});
```
```typescript
interface MemoryRecord {
  id: string;              // Unique identifier
  sessionId: string;       // Original session ID
  userId: string;          // User who owns this memory
  appName: string;         // Application name
  timestamp: string;       // When memory was created
  content: MemoryContent;  // Summarized or raw content
  embedding?: number[];    // Vector for semantic search
}
```
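As a concrete illustration, here is a hand-built record matching that shape. The `MemoryContent` definition below is a simplified stand-in for the library's actual type, and the field values are invented for the example:

```typescript
// Simplified stand-in for the library's MemoryContent type (assumption)
interface MemoryContent {
  summary?: string;
  rawText?: string;
}

interface MemoryRecord {
  id: string;
  sessionId: string;
  userId: string;
  appName: string;
  timestamp: string;
  content: MemoryContent;
  embedding?: number[];
}

// A hypothetical record as it might look after summarization
const record: MemoryRecord = {
  id: 'memory-1',
  sessionId: 'session-456',
  userId: 'user-123',
  appName: 'pizza-bot',
  timestamp: new Date('2024-06-01T12:00:00Z').toISOString(),
  content: { summary: 'User prefers vegetarian pizza with extra cheese.' },
  // embedding is optional; present only when an embedding provider is configured
};
```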
If no `summaryProvider` is configured, the service stores minimal content with raw text extracted from the session events.
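Conceptually, that fallback amounts to concatenating the text of each event into one raw string. The sketch below illustrates the idea; the event shape is an assumption for the example, not the library's actual type:

```typescript
// Assumed minimal event shape for illustration
interface SessionEvent {
  role: 'user' | 'assistant';
  text?: string;
}

// Sketch of the no-summarizer fallback: join each event's text into one raw string
function extractRawText(events: SessionEvent[]): string {
  return events
    .filter((e) => typeof e.text === 'string' && e.text.length > 0)
    .map((e) => `${e.role}: ${e.text}`)
    .join('\n');
}

const events: SessionEvent[] = [
  { role: 'user', text: 'I love vegetarian pizza' },
  { role: 'assistant', text: 'Noted! Extra cheese too?' },
];
const raw = extractRawText(events);
// raw contains one line per event, prefixed with the speaker role
```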
```typescript
const results = await memoryService.search({
  query: 'vegetarian pizza with extra cheese',
  userId: 'user-123',
});
// Finds memories about dietary preferences, cheese, pizza
// even if exact words differ
```
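Semantic search can match on meaning because it compares embedding vectors rather than words. A minimal sketch of the underlying idea using cosine similarity; the vectors below are toy values, not real embeddings:

```typescript
// Cosine similarity between two vectors: 1 means same direction, 0 means unrelated
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy 3-dimensional "embeddings" - real providers return hundreds of dimensions
const queryVec = [0.9, 0.1, 0.0];      // "vegetarian pizza with extra cheese"
const dietMemory = [0.8, 0.2, 0.1];    // memory about dietary preferences
const weatherMemory = [0.0, 0.1, 0.9]; // unrelated memory

const dietScore = cosineSimilarity(queryVec, dietMemory);
const weatherScore = cosineSimilarity(queryVec, weatherMemory);
// dietScore > weatherScore, so the dietary memory ranks first
// even though the query and the memory share no exact words
```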
When no `embeddingProvider` is configured:

- Storage uses keyword-based search
- Simple text matching on content fields
- Basic relevance scoring
- Requires exact keyword matches
```typescript
const results = await memoryService.search({
  query: 'pizza cheese',
  userId: 'user-123',
});
// Finds memories containing words "pizza" or "cheese"
```
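The fallback behavior can be pictured as a simple token-overlap score. This sketch only illustrates keyword matching in general; it is not the library's actual scoring code:

```typescript
// Score a memory by the fraction of query keywords that appear in its text
function keywordScore(query: string, text: string): number {
  const keywords = query.toLowerCase().split(/\s+/).filter(Boolean);
  const haystack = text.toLowerCase();
  const hits = keywords.filter((k) => haystack.includes(k)).length;
  return keywords.length === 0 ? 0 : hits / keywords.length;
}

const memories = [
  'User loves vegetarian pizza with extra cheese',
  'User booked a flight to Tokyo',
];
const ranked = [...memories].sort(
  (a, b) => keywordScore('pizza cheese', b) - keywordScore('pizza cheese', a)
);
// ranked[0] is the pizza memory: both keywords match exactly,
// whereas a query like 'dietary preferences' would score 0 here -
// exact keyword overlap is required, unlike semantic search
```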
```typescript
// Delete all memories for a user
const deleted = await memoryService.delete({
  userId: 'user-123',
});

// Delete memories from a specific session
await memoryService.delete({
  userId: 'user-123',
  sessionId: 'session-456',
});

// Delete old memories (before a date)
await memoryService.delete({
  userId: 'user-123',
  before: '2024-01-01T00:00:00Z',
});

// Delete specific memory IDs
await memoryService.delete({
  ids: ['memory-1', 'memory-2'],
});
```
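These filters can be thought of as predicates over stored records: a record is deleted when it matches every provided condition. The in-memory version below is an illustration of that semantics, an assumption for the example rather than the library's implementation:

```typescript
interface StoredMemory {
  id: string;
  sessionId: string;
  userId: string;
  timestamp: string; // ISO 8601
}

interface DeleteFilter {
  userId?: string;
  sessionId?: string;
  before?: string;
  ids?: string[];
}

// Keep every record that does NOT match all provided filter fields
function applyDelete(records: StoredMemory[], filter: DeleteFilter): StoredMemory[] {
  return records.filter((r) => {
    if (filter.ids) return !filter.ids.includes(r.id);
    if (filter.userId && r.userId !== filter.userId) return true;
    if (filter.sessionId && r.sessionId !== filter.sessionId) return true;
    // ISO 8601 strings compare correctly as plain strings
    if (filter.before && r.timestamp >= filter.before) return true;
    return false; // matched every provided condition - delete it
  });
}

const records: StoredMemory[] = [
  { id: 'm1', sessionId: 's1', userId: 'user-123', timestamp: '2023-12-01T00:00:00Z' },
  { id: 'm2', sessionId: 's2', userId: 'user-123', timestamp: '2024-03-01T00:00:00Z' },
];
const remaining = applyDelete(records, {
  userId: 'user-123',
  before: '2024-01-01T00:00:00Z',
});
// Only m2 survives: m1 belongs to user-123 and predates the cutoff
```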
```typescript
import { LlmSummaryProvider } from '@iqai/adk';

const summary = new LlmSummaryProvider({
  model: 'gpt-4o-mini',
  // Custom prompt to guide summarization
  prompt: `Summarize this conversation focusing on:
- User preferences and interests
- Important decisions or commitments
- Key facts to remember
Format as structured JSON with summary, keyFacts, and entities.`,
});
```
Store raw session data without summarization:
```typescript
import { PassthroughSummaryProvider } from '@iqai/adk';

// Simply extracts raw text from session events
const summary = new PassthroughSummaryProvider();
```
```typescript
import {
  AgentBuilder,
  MemoryService,
  VectorStorageProvider,
  QdrantVectorStore,
  OpenAIEmbeddingProvider,
  LlmSummaryProvider,
  createDatabaseSessionService,
} from '@iqai/adk';

// Configure memory service
const memoryService = new MemoryService({
  storage: new VectorStorageProvider({
    vectorStore: new QdrantVectorStore({
      url: process.env.QDRANT_URL,
      collectionName: 'agent-memories',
    }),
  }),
  embeddingProvider: new OpenAIEmbeddingProvider(),
  summaryProvider: new LlmSummaryProvider({
    model: 'gpt-4o-mini',
  }),
  searchLimit: 10,
});

// Configure session service
const sessionService = createDatabaseSessionService(
  process.env.DATABASE_URL
);

// Create agent with memory
const { runner } = await AgentBuilder
  .withModel('gpt-4')
  .withInstruction('You are a helpful assistant with long-term memory.')
  .withSessionService(sessionService)
  .withMemoryService(memoryService)
  .build();

// Have a conversation
const session1 = await runner.ask({
  prompt: 'I love vegetarian pizza with extra cheese',
  userId: 'user-123',
});

// End session and save to memory
const finalSession = await sessionService.endSession(
  session1.appName,
  session1.userId,
  session1.id
);
await memoryService.addSessionToMemory(finalSession);

// Later, in a new conversation
const session2 = await runner.ask({
  prompt: 'What kind of pizza do I like?',
  userId: 'user-123',
});

// The agent can search its memory
const memories = await memoryService.search({
  query: 'pizza preferences',
  userId: 'user-123',
});
console.log('Agent remembers:', memories[0]?.memory.content.summary);
```