
Quick Start

Get started with Memori in under 3 minutes. Memori is open source and follows a BYODB (bring your own database) model, so you bring your own database; for this quick start we will use SQLite, which means there is nothing extra to install.
Want a zero-setup option? Try Memori Cloud at app.memorilabs.ai.
In this example, we will use Memori with OpenAI and SQLite. Check out the LLM providers and database guides for other integrations.

Prerequisites

  • Python 3.10 or higher
  • An OpenAI API key

Step 1: Install Libraries

Install Memori and the OpenAI SDK:
pip install memori openai

Step 2: Set Environment Variables

Set your OpenAI API key as an environment variable:
export OPENAI_API_KEY="your-openai-api-key"
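If you want your script to fail fast with a clear message when the key is missing, a small check like the following can help. This is a minimal sketch using only the standard library; the helper name require_api_key is illustrative and not part of Memori or the OpenAI SDK.

```python
import os

def require_api_key(name: str = "OPENAI_API_KEY") -> str:
    """Return the named API key, or raise a clear error if it is unset."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(
            f'{name} is not set. Run: export {name}="your-openai-api-key"'
        )
    return key
```

You can then pass `require_api_key()` to `OpenAI(api_key=...)` instead of calling `os.getenv` directly.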

Step 3: Run Your First Memori Application

Create a new Python file named quickstart.py and add the following code:

Setup & Configuration

Import libraries, set up a SQLite database with Python’s built-in sqlite3, and initialize Memori with your OpenAI client.
  • conn accepts a connection factory (SQLAlchemy, DB-API 2.0, Django ORM, or MongoDB callable)
  • llm.register() wraps your LLM client for automatic memory capture
  • attribution() links memories to a specific user and process
  • build() creates the Memori schema tables in your database
import os
import sqlite3
from memori import Memori
from openai import OpenAI

def get_sqlite_connection():
    return sqlite3.connect("memori.db")

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
mem = Memori(conn=get_sqlite_connection).llm.register(client)
mem.attribution(entity_id="user_123", process_id="test-ai-agent")
mem.config.storage.build()
If your app already uses SQLAlchemy, you can pass a sessionmaker instead of a DB-API 2.0 connection function.

SQLAlchemy Alternative

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from memori import Memori

engine = create_engine("sqlite:///memori.db")
SessionLocal = sessionmaker(bind=engine)

mem = Memori(conn=SessionLocal)

First Conversation

Tell the LLM a fact about yourself. Memori automatically captures the conversation and processes it through Advanced Augmentation. Since augmentation runs asynchronously, call augmentation.wait() in short-lived scripts to ensure memories are fully processed before continuing.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "My favorite color is blue."}
    ]
)
print(response.choices[0].message.content + "\n")

# Wait for background augmentation to finish
mem.augmentation.wait()

Memory Recall

Create a completely new client and Memori instance, with no prior context carried over. When you ask the LLM what it remembers, Memori automatically injects the relevant facts via semantic search. The second response should correctly recall your favorite color, demonstrating that memory persists across sessions.
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

mem = Memori(conn=get_sqlite_connection).llm.register(client)
mem.attribution(entity_id="user_123", process_id="test-ai-agent")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "What's my favorite color?"}
    ]
)
print(response.choices[0].message.content + "\n")

Complete Code

import os
import sqlite3
from memori import Memori
from openai import OpenAI

# Setup and configuration
def get_sqlite_connection():
    return sqlite3.connect("memori.db")

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
mem = Memori(conn=get_sqlite_connection).llm.register(client)
mem.attribution(entity_id="user_123", process_id="test-ai-agent")
mem.config.storage.build()

# First conversation - store a fact
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "My favorite color is blue."}
    ]
)
print(response.choices[0].message.content + "\n")

# Wait for background augmentation to finish
mem.augmentation.wait()

# Memory recall - new instance, no prior context
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
mem = Memori(conn=get_sqlite_connection).llm.register(client)
mem.attribution(entity_id="user_123", process_id="test-ai-agent")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "What's my favorite color?"}
    ]
)
print(response.choices[0].message.content + "\n")

Step 4: Run the Application

Execute your Python file:
python quickstart.py
You should see the AI respond to both prompts, with the second response correctly recalling that your favorite color is blue!

Step 5: Inspect Your Memories

Since you own the database, you can inspect what Memori stored directly:
sqlite3 memori.db "SELECT * FROM memori_conversation_message;"
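If you prefer Python over the sqlite3 CLI, the same inspection can be scripted. Here is a minimal sketch using only the standard library; the database filename memori.db and the memori_conversation_message table come from this guide, and the exact schema may vary by Memori version, so listing the tables first is a safe starting point.

```python
import sqlite3

def list_memori_tables(db_path: str = "memori.db") -> list[str]:
    """Return the names of all tables in the given SQLite database."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        ).fetchall()
    return [name for (name,) in rows]

for table in list_memori_tables():
    print(table)
```

From there you can SELECT from any table of interest, exactly as with the CLI command above.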

What You Learned

1. Database Connection: Memori accepts a connection factory (conn) that works with SQLAlchemy, DB-API 2.0, Django ORM, or MongoDB.
2. Schema Creation: The build() method creates all required Memori tables in your database.
3. Memory Capture: Memori automatically captures all LLM conversations when you wrap your client with llm.register().
4. Attribution: Using attribution() links memories to specific users and processes, enabling personalized recall.
5. Automatic Recall: Memori injects relevant memories into new conversations without manual retrieval code.
6. Data Ownership: All memories are stored in your database, which you can query directly for custom analytics.

Next Steps

1. Explore Other Databases: Learn how to connect PostgreSQL, MySQL, MongoDB, and more. Go to Databases →
2. Try Other LLMs: Integrate with Anthropic, Gemini, Bedrock, or OpenAI-compatible providers. View the LLM integrations in the sidebar.
3. Advanced Features: Learn about session management, manual recall, and custom augmentation. View the concepts in the sidebar.