Introduction to Memori
Memori is the memory fabric for enterprise AI that plugs into the software and infrastructure you already use. It is LLM, datastore, and framework agnostic, integrating seamlessly into the architecture you’ve already designed.

Quickstart
Get started in minutes with Memori Cloud or bring your own database
Memori Cloud
Zero-config managed memory service. Start in minutes with no database setup.
BYODB (Bring Your Own Database)
Full control with your own database. Works with PostgreSQL, MySQL, MongoDB, and more.
API reference
Explore the complete SDK reference for Python and TypeScript
What is Memori?
Memori enables your AI applications to remember and recall information across conversations. Instead of starting fresh with every interaction, your AI can build on past conversations, understand user preferences, and maintain context over time.

Key features
Zero-latency memory
Background processing ensures your LLM calls are never slowed down. Memories are persisted and recalled asynchronously.
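The zero-latency claim rests on a standard pattern: the request path only enqueues the conversation turn, and a background worker does the slow persistence work. A minimal sketch of that pattern, using hypothetical names rather than the actual Memori SDK API:

```python
import queue
import threading

# Illustrative sketch of background persistence: the chat path only enqueues
# the turn; a worker thread writes it out later. All names are hypothetical,
# not the Memori SDK API.
store = []  # stand-in for a real database


def persist_memory(turn):
    store.append(turn)  # slow I/O would happen here, off the hot path


_work = queue.Queue()


def _worker():
    while True:
        turn = _work.get()
        persist_memory(turn)
        _work.task_done()


threading.Thread(target=_worker, daemon=True).start()


def chat(user_message):
    reply = f"echo: {user_message}"  # stand-in for the real LLM call
    _work.put({"user": user_message, "assistant": reply})  # non-blocking
    return reply


chat("hello")
_work.join()  # sketch only: wait so the write is visible below
print(store[0]["assistant"])
```

The key property is that `chat()` returns as soon as the reply is ready; the queue put is effectively free, so memory persistence adds no latency to the call.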
Advanced augmentation
Automatically extracts and structures facts, preferences, relationships, events, and more from conversations.
LLM agnostic
Native support for OpenAI, Anthropic, Gemini, Bedrock, and Grok. Works with any LLM provider.
Framework integration
Seamless integration with popular frameworks like LangChain and Agno.
Flexible storage
Use Memori Cloud for zero-config hosting or bring your own database (PostgreSQL, MongoDB, SQLite, and more).
Automatic recall
Relevant memories are automatically injected into your prompts; no manual context management is required.
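Conceptually, automatic recall means retrieving memories relevant to the incoming message and folding them into the prompt before the LLM call. A toy sketch of that flow, with a naive keyword match standing in for real retrieval (this illustrates the behavior described above, not the Memori SDK API):

```python
# Hypothetical memory store; in practice these would come from the database.
MEMORIES = [
    "User prefers concise answers.",
    "User's favorite language is Python.",
]


def recall(query):
    # Naive keyword overlap, standing in for real semantic retrieval.
    words = query.lower().split()
    return [m for m in MEMORIES if any(w in m.lower() for w in words)]


def build_messages(user_message):
    # Inject recalled memories into the system prompt automatically.
    system = "You are a helpful assistant."
    context = recall(user_message)
    if context:
        system += "\nKnown about this user:\n" + "\n".join(f"- {m}" for m in context)
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_message},
    ]


msgs = build_messages("What language should I use?")
print(msgs[0]["content"])
```

The application code only ever calls `build_messages()` (or, with the real SDK, the wrapped client); the recall step happens transparently.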
How it works
Memori operates at three distinct levels to provide comprehensive memory management:

Entity level
Track information about individual users, customers, or any distinct person, place, or thing in your system.
Memory types
Memori’s Advanced Augmentation automatically extracts and organizes several types of information:

- Facts - Objective information learned from conversations
- Preferences - User likes, dislikes, and choices
- Attributes - Characteristics of entities and processes
- Relationships - Connections between different entities
- Events - Important occurrences and milestones
- Skills - Capabilities and competencies
- Rules - Guidelines and constraints to follow
- People - Information about individuals mentioned in conversations
Advanced Augmentation runs asynchronously in the background, ensuring zero latency impact on your LLM calls.
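To make the categories above concrete, here is a hypothetical example of what augmentation output for one user might look like, grouped by memory type. The schema and sample values are purely illustrative; the actual extracted structure is defined by the SDK:

```python
# Illustrative only: one possible shape for extracted, typed memories.
extracted = {
    "facts": ["Works at Acme Corp."],
    "preferences": ["Prefers email over phone calls."],
    "attributes": ["Time zone: UTC+2."],
    "relationships": ["Reports to Dana (manager)."],
    "events": ["Renewed subscription on 2024-03-01."],
    "skills": ["Fluent in SQL."],
    "rules": ["Never contact before 9am local time."],
    "people": ["Dana: the user's manager."],
}

for kind, items in extracted.items():
    for item in items:
        print(f"[{kind}] {item}")
```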
Supported platforms
LLM providers
Memori works with all major LLM providers:

- OpenAI (Chat Completions & Responses API)
- Anthropic (Claude)
- Google Gemini
- AWS Bedrock
- Grok (xAI)
Frameworks
- Agno - Build AI agents with persistent memory
- LangChain - Add memory to your LangChain applications
Cloud platforms
- Nebius AI Studio - Integrated support for Nebius platform
Databases (BYODB)
- PostgreSQL - Including Neon, DigitalOcean, and other PostgreSQL-compatible services
- CockroachDB - Distributed SQL database
- MongoDB - Document database
- SQLite - Local file-based database
- OceanBase - Distributed database
- Oracle - Enterprise database
- MySQL - Popular relational database
Installation
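The SDK is installed from the usual package registries. The package names below are assumptions for illustration; confirm the exact names in the Quickstart guide:

```shell
# Python (assumed package name; verify in the Quickstart)
pip install memori

# TypeScript (assumed package name; verify in the Quickstart)
npm install memori
```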
Getting started
Ready to add memory to your AI application? Check out the Quickstart guide to build your first memory-enabled application in minutes. For detailed documentation:

- Memori Cloud: Complete guide for the managed solution
- BYODB: Setup instructions for using your own database
- Examples: Browse the Memori Cookbook for more examples
Support
Need help getting started?

- Documentation: memorilabs.ai/docs
- Discord: Join our community
- GitHub: Report issues or contribute on GitHub