
Quickstart

Get started with Memori in just a few minutes. This guide shows you how to add persistent memory to your LLM applications using either Memori Cloud or your own database.

Installation

First, install the Memori SDK:
pip install memori
The Python SDK requires Python 3.10+; the TypeScript SDK requires Node.js 18.0.0+.

Choose your setup

Memori Cloud

Zero configuration: get an API key and start building immediately

Bring your own database

Use your existing database infrastructure for complete control
Option 1: Memori Cloud

The fastest way to get started. Memori Cloud provides fully managed memory infrastructure with zero configuration.
1. Get your API key

Sign up at app.memorilabs.ai and get your Memori API key.
2. Set environment variables

Export your Memori API key and LLM provider API key:
export MEMORI_API_KEY="your_memori_api_key"
export OPENAI_API_KEY="your_openai_api_key"
3. Run your first memory-enabled app

Create and run the example below.
from memori import Memori
from openai import OpenAI

# Initialize OpenAI and Memori
client = OpenAI()
mem = Memori().llm.register(client)

# Set attribution to identify the user and process
mem.attribution(entity_id="user_123", process_id="support_agent")

# First conversation - teach the AI
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "My favorite color is blue."}]
)
print(response.choices[0].message.content)

# Second conversation - Memori recalls automatically
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's my favorite color?"}]
)
print(response.choices[0].message.content)
# Output: Your favorite color is blue.
Memori Cloud is free for developers with generous rate limits. Advanced Augmentation runs in the background with zero latency impact.

Option 2: Bring your own database (BYODB)

Use your existing database infrastructure. Memori supports PostgreSQL, MongoDB, SQLite, CockroachDB, and more.
1. Choose your database

Memori supports PostgreSQL, MongoDB, SQLite, CockroachDB, OceanBase, Oracle, and MySQL.
2. Install database dependencies

Install the required database driver for your chosen database:
Python
# For PostgreSQL/CockroachDB
pip install "memori[cockroachdb]"

# For MongoDB
pip install pymongo
3. Set up your connection

Configure Memori with your database connection.

SQLite example

Perfect for local development and testing:
import os
from openai import OpenAI
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from memori import Memori

# Setup OpenAI
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

# Setup SQLite database
engine = create_engine("sqlite:///memori.db")
Session = sessionmaker(bind=engine)

# Setup Memori with SQLite
mem = Memori(conn=Session).llm.register(client)
mem.attribution(entity_id="user-123", process_id="my-app")
mem.config.storage.build()

# First conversation
print("You: My favorite color is blue and I live in Paris")
response1 = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user", 
        "content": "My favorite color is blue and I live in Paris"
    }],
)
print(f"AI: {response1.choices[0].message.content}\n")

# Second conversation - Memori recalls context
print("You: What's my favorite color?")
response2 = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's my favorite color?"}],
)
print(f"AI: {response2.choices[0].message.content}\n")

# Third conversation
print("You: What city do I live in?")
response3 = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What city do I live in?"}],
)
print(f"AI: {response3.choices[0].message.content}")

# Wait for augmentation to complete
mem.augmentation.wait()

PostgreSQL example

For production deployments:
import os
from openai import OpenAI
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from memori import Memori

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

# Connect to PostgreSQL
database_connection_string = os.getenv("DATABASE_CONNECTION_STRING")
engine = create_engine(database_connection_string)
Session = sessionmaker(bind=engine)

# Setup Memori with PostgreSQL
mem = Memori(conn=Session).llm.register(client)
mem.attribution(entity_id="user-123", process_id="my-app")
mem.config.storage.build()

# Use as normal
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": "My favorite color is blue and I live in Paris"
    }],
)
print(response.choices[0].message.content)

mem.augmentation.wait()
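The DATABASE_CONNECTION_STRING above is a standard SQLAlchemy database URL. A typical value looks like the following (the user, password, host, and database name are placeholders; substitute your own):

```shell
# SQLAlchemy URL format: dialect+driver://user:password@host:port/database
export DATABASE_CONNECTION_STRING="postgresql+psycopg2://memori_user:secret@localhost:5432/memori"
```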

MongoDB example

For document-based storage:
import os
from openai import OpenAI
from pymongo import MongoClient
from memori import Memori

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

# Connect to MongoDB
mongo_client = MongoClient(os.getenv("MONGODB_CONNECTION_STRING"))
db = mongo_client["memori"]

# Setup Memori with MongoDB
mem = Memori(conn=lambda: db).llm.register(client)
mem.attribution(entity_id="user-123", process_id="my-app")
mem.config.storage.build()

# Use as normal
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": "My favorite color is blue and I live in Paris"
    }],
)
print(response.choices[0].message.content)

mem.augmentation.wait()
The mem.config.storage.build() call creates the necessary database tables/collections on first run.

Understanding attribution

Attribution tells Memori who is interacting and what process is handling the interaction. This is essential for memory organization.
Python
mem.attribution(entity_id="user_123", process_id="support_agent")
TypeScript
mem.attribution('user-123', 'my-app');
  • entity_id: Identifies the user, customer, or any distinct entity
  • process_id: Identifies your agent, workflow, or application process
Without attribution, Memori cannot create memories. Always set attribution before making LLM calls.
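One way to picture why both identifiers are needed: conceptually, memories are partitioned per (entity, process) pair, so the same user can accumulate separate memory for different agents. The toy sketch below is purely illustrative (an in-memory dict, not Memori's implementation):

```python
from collections import defaultdict

# Toy store: memories are keyed by (entity_id, process_id),
# so "user_123" has independent memory per process.
store: dict[tuple[str, str], list[str]] = defaultdict(list)

def remember(entity_id: str, process_id: str, fact: str) -> None:
    store[(entity_id, process_id)].append(fact)

def recall(entity_id: str, process_id: str) -> list[str]:
    return store[(entity_id, process_id)]

remember("user_123", "support_agent", "favorite color is blue")
remember("user_123", "billing_agent", "plan is Pro")

print(recall("user_123", "support_agent"))  # ['favorite color is blue']
print(recall("user_123", "billing_agent"))  # ['plan is Pro']
```

Because the key is the pair, the support agent never sees billing facts and vice versa.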

Session management

Memori automatically manages sessions to group related interactions. You can also control sessions manually:
# Start a new session
mem.new_session()

# Or restore a previous session
session_id = mem.config.session_id
# ... later ...
mem.set_session(session_id)
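To make the save-and-restore pattern concrete, here is a minimal stand-in session tracker (the `Sessions` class is hypothetical, not part of the Memori SDK): each session id groups the interactions recorded while it is active, and restoring an id resumes that group.

```python
import uuid

# Illustrative session tracker: each session id groups related interactions.
class Sessions:
    def __init__(self) -> None:
        self.session_id = str(uuid.uuid4())
        self.log: dict[str, list[str]] = {self.session_id: []}

    def new_session(self) -> str:
        # Start a fresh group of interactions.
        self.session_id = str(uuid.uuid4())
        self.log[self.session_id] = []
        return self.session_id

    def set_session(self, session_id: str) -> None:
        # Resume a previously saved session.
        self.session_id = session_id

    def record(self, message: str) -> None:
        self.log[self.session_id].append(message)

s = Sessions()
first = s.session_id     # save the id, as with mem.config.session_id
s.record("hello")
s.new_session()
s.record("new topic")
s.set_session(first)     # restore the earlier session
s.record("back again")
print(s.log[first])      # ['hello', 'back again']
```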

What’s happening behind the scenes?

When you make an LLM call with Memori registered:
1. Recall

Memori retrieves relevant memories and automatically injects them into your prompt’s system context.
2. LLM call

Your LLM call executes normally with the enhanced context.
3. Persistence

The conversation is persisted asynchronously in the background.
4. Augmentation

Advanced augmentation extracts facts, preferences, relationships, and more, all with zero latency impact.
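The four steps above can be sketched as a wrapper around a plain chat call. Everything below is illustrative, not Memori's actual pipeline: `call_llm` is a stand-in for the provider call, and a plain list stands in for the memory store.

```python
memories: list[str] = []  # stand-in for the memory store

def call_llm(messages: list[dict]) -> str:
    # Stand-in for the real LLM call; reports how much context it received.
    return f"(answered with {len(messages)} messages of context)"

def chat_with_memory(user_message: str) -> str:
    # 1. Recall: inject stored memories into the system context.
    system = {"role": "system", "content": "Known facts: " + "; ".join(memories)}
    # 2. LLM call: executes normally with the enhanced context.
    reply = call_llm([system, {"role": "user", "content": user_message}])
    # 3. Persistence: store the conversation (Memori does this asynchronously).
    memories.append(user_message)
    # 4. Augmentation would then distill facts and preferences in the background.
    return reply

print(chat_with_memory("My favorite color is blue."))
print(chat_with_memory("What's my favorite color?"))
print(memories)  # ['My favorite color is blue.', "What's my favorite color?"]
```

The key point is that recall happens before the call and persistence after it, so your own calling code stays unchanged.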

Framework integration

Memori works seamlessly with popular AI frameworks:

Agno example

import os
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from memori import Memori

# Setup database
engine = create_engine("sqlite:///memori_agno.db")
Session = sessionmaker(bind=engine)

# Setup Agno with OpenAI
model = OpenAIChat(id="gpt-4o-mini")

# Register Memori with Agno
mem = Memori(conn=Session).llm.register(openai_chat=model)
mem.attribution(entity_id="customer-456", process_id="support-agent")
mem.config.storage.build()

# Create agent
agent = Agent(
    model=model,
    instructions=[
        "You are a helpful customer support agent.",
        "Remember customer preferences and history.",
    ],
    markdown=True,
)

# Use the agent - memory is automatic
response = agent.run(
    "Hi, I'd like to order a large pepperoni pizza with extra cheese"
)
print(response.content)

response = agent.run("Can you remind me what I just ordered?")
print(response.content)

mem.augmentation.wait()

View your memories

If you’re using Memori Cloud, explore your memories in the dashboard:
  • Dashboard: app.memorilabs.ai
  • View memories, analytics, and manage API keys
  • Test memories in the interactive playground

Managing quota

Memori Advanced Augmentation is free for developers with generous rate limits.
# Check your current quota
python -m memori quota

# Sign up for an API key
python -m memori sign-up [email protected]
You can also manage your quota at app.memorilabs.ai.

Next steps

Now that you have Memori running, explore more advanced features:

Memori Cloud guide

Learn about cloud-specific features and configuration

BYODB guide

Deep dive into database configuration options

API reference

Explore the complete SDK reference

Examples

Browse the Memori Cookbook for more examples
