Introduction to Memori

Memori is the memory fabric for enterprise AI that plugs into the software and infrastructure you already use. It is LLM-, datastore-, and framework-agnostic, so it fits into the architecture you’ve already designed.

Quickstart

Get started in minutes with Memori Cloud or bring your own database.

Memori Cloud

Zero-config managed memory service. Start in minutes with no database setup.

BYODB (Bring Your Own Database)

Full control with your own database. Works with PostgreSQL, MySQL, MongoDB, and more.

API reference

Explore the complete SDK reference for Python and TypeScript

What is Memori?

Memori enables your AI applications to remember and recall information across conversations. Instead of starting fresh with every interaction, your AI can build on past conversations, understand user preferences, and maintain context over time.

Key features

Zero-latency memory

Background processing ensures your LLM calls are never slowed down. Memories are persisted and recalled asynchronously.
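The idea behind this claim can be sketched in plain Python. The following is an illustrative pattern only, not Memori's actual internals: a background worker thread persists memories off the request path, so the calling code returns immediately instead of waiting on storage I/O.

```python
import queue
import threading

class BackgroundMemoryWriter:
    """Toy sketch of asynchronous memory persistence (hypothetical, not Memori's code)."""

    def __init__(self):
        self._queue = queue.Queue()
        self._store = []  # stands in for a real database
        worker = threading.Thread(target=self._drain, daemon=True)
        worker.start()

    def persist(self, memory: dict) -> None:
        """Enqueue and return immediately -- no latency added to the caller."""
        self._queue.put(memory)

    def _drain(self) -> None:
        # Runs on the background thread, off the request path.
        while True:
            memory = self._queue.get()
            self._store.append(memory)  # real code would write to a database
            self._queue.task_done()

    def flush(self) -> None:
        """Block until all queued memories are persisted (e.g. at shutdown)."""
        self._queue.join()

writer = BackgroundMemoryWriter()
writer.persist({"type": "fact", "text": "User prefers dark mode"})
writer.flush()
print(len(writer._store))  # 1
```

The same shape works with an asyncio task instead of a thread; the key design point is that `persist` only enqueues, so the LLM call path never blocks on the datastore.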

Advanced augmentation

Automatically extracts and structures facts, preferences, relationships, events, and more from conversations.

LLM agnostic

Native support for OpenAI, Anthropic, Gemini, Bedrock, and Grok. Works with any LLM provider.

Framework integration

Seamless integration with popular frameworks like LangChain and Agno.

Flexible storage

Use Memori Cloud for zero-config hosting or bring your own database (PostgreSQL, MongoDB, SQLite, and more).

Automatic recall

Relevant memories are automatically injected into your prompts; no manual context management required.
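To make the injection pattern concrete, here is a deliberately simple sketch. The keyword-overlap scoring below is a toy stand-in for real retrieval (Memori's actual recall is more sophisticated); it only illustrates the shape of the pattern: score stored memories against the user's message, then prepend the best matches to the prompt.

```python
def recall(memories: list[str], query: str, top_k: int = 2) -> list[str]:
    """Toy retrieval: rank memories by word overlap with the query (illustrative only)."""
    query_words = set(query.lower().split())
    scored = [(len(query_words & set(m.lower().split())), m) for m in memories]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [m for score, m in scored[:top_k] if score > 0]

def build_prompt(memories: list[str], user_message: str) -> str:
    """Inject recalled memories above the user's message."""
    context = "\n".join(f"- {m}" for m in recall(memories, user_message))
    return f"Relevant memories:\n{context}\n\nUser: {user_message}"

memories = [
    "User prefers Python over TypeScript",
    "User works at Acme Corp",
    "User's deploy target is AWS",
]
print(build_prompt(memories, "Which language should the new service use: Python or Go?"))
```

In Memori this scoring, selection, and injection happens for you behind the provider integration; the sketch only shows where recalled context ends up relative to the user's message.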

How it works

Memori operates at three distinct levels to provide comprehensive memory management:

1. Entity level

Track information about individual users, customers, or any distinct person, place, or thing in your system.

2. Process level

Maintain context for specific agents, workflows, or application processes.

3. Session level

Group related interactions together, such as a conversation thread or multi-step agent execution.
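One way to picture the three levels is as a composite key on stored memories. The sketch below is a hypothetical model, not Memori's storage schema: each memory is scoped by entity, process, and session, and queries can slice along any one of those axes.

```python
from collections import defaultdict

class ScopedMemoryStore:
    """Illustrative model of entity/process/session scoping (not Memori's schema)."""

    def __init__(self):
        # (entity_id, process_id, session_id) -> list of memories
        self._data = defaultdict(list)

    def add(self, entity_id: str, process_id: str, session_id: str, memory: str) -> None:
        self._data[(entity_id, process_id, session_id)].append(memory)

    def for_entity(self, entity_id: str) -> list[str]:
        """Everything known about one user/customer, across processes and sessions."""
        return [m for key, ms in self._data.items() if key[0] == entity_id for m in ms]

    def for_session(self, session_id: str) -> list[str]:
        """Memories from one conversation thread or agent run."""
        return [m for key, ms in self._data.items() if key[2] == session_id for m in ms]

store = ScopedMemoryStore()
store.add("user-42", "support-agent", "sess-1", "Prefers email replies")
store.add("user-42", "billing-agent", "sess-2", "On the Pro plan")
print(len(store.for_entity("user-42")))  # 2
print(store.for_session("sess-2"))       # ['On the Pro plan']
```

Scoping this way lets one agent recall everything about a user while another agent's recall stays limited to its own process or session.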

Memory types

Memori’s Advanced Augmentation automatically extracts and organizes several types of information:
  • Facts - Objective information learned from conversations
  • Preferences - User likes, dislikes, and choices
  • Attributes - Characteristics of entities and processes
  • Relationships - Connections between different entities
  • Events - Important occurrences and milestones
  • Skills - Capabilities and competencies
  • Rules - Guidelines and constraints to follow
  • People - Information about individuals mentioned in conversations
Advanced Augmentation runs asynchronously in the background, ensuring zero latency impact on your LLM calls.
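The categories above can be modeled as a small typed record. The field and class names below are assumptions for illustration, not Memori's actual schema; the point is that each extracted memory carries its category alongside its content.

```python
from dataclasses import dataclass
from enum import Enum

class MemoryType(Enum):
    """The memory categories listed above (names are illustrative)."""
    FACT = "fact"
    PREFERENCE = "preference"
    ATTRIBUTE = "attribute"
    RELATIONSHIP = "relationship"
    EVENT = "event"
    SKILL = "skill"
    RULE = "rule"
    PERSON = "person"

@dataclass
class Memory:
    """A single extracted memory; fields are a hypothetical sketch."""
    type: MemoryType
    text: str
    entity_id: str

m = Memory(MemoryType.PREFERENCE, "Prefers concise answers", "user-42")
print(m.type.value)  # preference
```

Typing memories this way is what makes targeted recall possible, e.g. fetching only a user's preferences or only the rules an agent must follow.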

Supported platforms

LLM providers

Memori works with all major LLM providers:
  • OpenAI (Chat Completions & Responses API)
  • Anthropic (Claude)
  • Google Gemini
  • AWS Bedrock
  • Grok (xAI)
All providers support streaming and non-streaming calls, in both synchronous and asynchronous modes.

Frameworks

  • Agno - Build AI agents with persistent memory
  • LangChain - Add memory to your LangChain applications

Cloud platforms

  • Nebius AI Studio - Integrated support for Nebius platform

Databases (BYODB)

  • PostgreSQL - Including Neon, DigitalOcean, and other PostgreSQL-compatible services
  • CockroachDB - Distributed SQL database
  • MongoDB - Document database
  • SQLite - Local file-based database
  • OceanBase - Distributed database
  • Oracle - Enterprise database
  • MySQL - Popular relational database

Installation

pip install memori
The Python SDK requires Python 3.10 or higher; the TypeScript SDK requires Node.js 18.0.0 or higher.

Getting started

Ready to add memory to your AI application? Check out the Quickstart guide to build your first memory-enabled application in minutes.

Support

Need help getting started?

License

Memori is open source under the Apache 2.0 license.
