Tabby uses Mem0 for persistent AI memory, allowing the assistant to remember your preferences, coding style, and past interactions across sessions.

Architecture

The memory system consists of three components:
  1. Vector Store - Supabase PostgreSQL with pgvector for semantic search
  2. Graph Store - Neo4j for knowledge graph relationships (optional)
  3. Memory Backend - FastAPI server running Mem0

Configuration

Backend Environment

Configure the memory backend in backend/.env:
# Required: OpenAI for memory operations
OPENAI_API_KEY="sk-..."

# Required: Supabase PostgreSQL connection
SUPABASE_CONNECTION_STRING="postgresql://postgres:[email protected]:54322/postgres"

# Optional: Neo4j knowledge graph
NEO4J_URL="neo4j+s://xxx.databases.neo4j.io"
NEO4J_USERNAME="neo4j"
NEO4J_PASSWORD="..."
NEO4J_DATABASE="neo4j"
AURA_INSTANCEID="..."
AURA_INSTANCENAME="..."

Frontend Configuration

Point the frontend and Next.js backend to the memory API:
NEXT_PUBLIC_MEMORY_API_URL="http://localhost:8000"

Mem0 Configuration

The memory backend uses the following configuration (from backend/main.py:36-76):
backend/main.py
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4.1-nano-2025-04-14",
            "enable_vision": True,
        }
    },
    "vector_store": {
        "provider": "supabase",
        "config": {
            "connection_string": supabase_connection_string,
            "collection_name": os.environ.get("SUPABASE_COLLECTION_NAME", "memories"),
            "index_method": os.environ.get("SUPABASE_INDEX_METHOD", "hnsw"),
            "index_measure": os.environ.get("SUPABASE_INDEX_MEASURE", "cosine_distance")
        }
    }
}

# Optional: Neo4j graph store
if neo4j_url and neo4j_password:
    config["graph_store"] = {
        "provider": "neo4j",
        "config": {
            "url": neo4j_url,
            "username": neo4j_username,
            "password": neo4j_password,
        }
    }

Configuration Options

| Variable | Default | Description |
|---|---|---|
| SUPABASE_COLLECTION_NAME | memories | Collection name for vector storage |
| SUPABASE_INDEX_METHOD | hnsw | Vector index method (hnsw or ivfflat) |
| SUPABASE_INDEX_MEASURE | cosine_distance | Distance metric for similarity search |
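The defaults in the table above are resolved with `os.environ.get` fallbacks, mirroring the config in backend/main.py. A minimal standalone sketch (the helper name is hypothetical; the variable names and defaults come from the source):

```python
import os

def resolve_vector_store_config() -> dict:
    """Resolve Supabase vector-store options from the environment,
    falling back to the documented defaults."""
    return {
        "collection_name": os.environ.get("SUPABASE_COLLECTION_NAME", "memories"),
        "index_method": os.environ.get("SUPABASE_INDEX_METHOD", "hnsw"),
        "index_measure": os.environ.get("SUPABASE_INDEX_MEASURE", "cosine_distance"),
    }

cfg = resolve_vector_store_config()
print(cfg["index_method"])  # hnsw, unless overridden in the environment
```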

Vector Store Setup

Tabby uses a local Supabase instance running in Docker for vector storage.
1. Start Docker Desktop

Ensure Docker Desktop is running before proceeding.

2. Initialize Supabase

Run this command once in the project root:
npx supabase init

3. Start Supabase

npx supabase start

The first run downloads ~13 Docker images. Subsequent starts take ~10 seconds.

4. Note Credentials

Copy the DB URL from the output of:
npx supabase status

Look for DB URL (format: postgresql://postgres:[email protected]:54322/postgres)

5. Update Backend Environment

Add the connection string to backend/.env:
SUPABASE_CONNECTION_STRING="postgresql://postgres:[email protected]:54322/postgres"

The database schema is automatically applied from supabase/migrations/ when Supabase starts.

Supabase Commands

| Command | Description |
|---|---|
| npx supabase start | Start local Supabase |
| npx supabase stop | Stop Supabase containers |
| npx supabase status | View connection details |
| npx supabase db reset | Reset database to migrations |

Docker Desktop must be running before executing npx supabase start.

Neo4j Knowledge Graph (Optional)

Neo4j provides a knowledge graph visualization of memory relationships, enhancing contextual understanding.

Setup Neo4j AuraDB

1. Create Free Instance

Visit Neo4j AuraDB and create a free instance.

2. Save Credentials

Download the credentials text file when creating the instance. It contains:
  • URI (e.g., neo4j+s://xxx.databases.neo4j.io)
  • Username (usually neo4j)
  • Password

3. Get Instance Details

Note your Instance ID and Instance Name from the dashboard.

4. Configure Backend

Add credentials to backend/.env:
NEO4J_URL="neo4j+s://xxx.databases.neo4j.io"
NEO4J_USERNAME="neo4j"
NEO4J_PASSWORD="your-password"
NEO4J_DATABASE="neo4j"
AURA_INSTANCEID="your-instance-id"
AURA_INSTANCENAME="your-instance-name"
The graph store is automatically enabled when NEO4J_URL and NEO4J_PASSWORD are set.
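That enablement rule mirrors the conditional in backend/main.py: the graph store is configured only when both values are present. A standalone sketch of the same check (the function name is hypothetical):

```python
def graph_store_enabled(env: dict) -> bool:
    """Graph store is enabled only when both NEO4J_URL and
    NEO4J_PASSWORD are set (mirrors the check in backend/main.py)."""
    return bool(env.get("NEO4J_URL")) and bool(env.get("NEO4J_PASSWORD"))

print(graph_store_enabled({"NEO4J_URL": "neo4j+s://xxx.databases.neo4j.io"}))  # False: password missing
print(graph_store_enabled({
    "NEO4J_URL": "neo4j+s://xxx.databases.neo4j.io",
    "NEO4J_PASSWORD": "secret",
}))  # True
```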

Benefits of Neo4j Integration

  • Relationship Mapping - Visualize connections between memories
  • Contextual Retrieval - Better understanding of related concepts
  • Knowledge Graph - See how information connects over time
  • Brain Panel Visualization - Interactive graph view in the app

Memory Types

Tabby automatically classifies memories into five types using AI:
| Type | Description | Examples |
|---|---|---|
| SHORT_TERM | Temporary states, current activities | "Currently working on…", "Right now I'm…" |
| LONG_TERM | Permanent preferences, identity | "I prefer dark mode", "My name is…" |
| EPISODIC | Past events with time context | "Yesterday I had a meeting" |
| SEMANTIC | General knowledge and facts | "Python uses indentation" |
| PROCEDURAL | How-to knowledge, processes | "To deploy, run npm build" |
Memory classification uses OpenAI’s gpt-4.1-nano-2025-04-14 model and happens automatically when memories are added.
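Classification itself is performed by the LLM, but the cue phrases in the table hint at how the types differ. A purely illustrative keyword heuristic, not the actual model-based classifier:

```python
# Illustrative only: the real classifier is the OpenAI model, not keyword matching.
CUES = {
    "SHORT_TERM": ("currently", "right now"),
    "EPISODIC": ("yesterday", "last week"),
    "PROCEDURAL": ("to deploy", "how to"),
    "LONG_TERM": ("i prefer", "my name is"),
}

def guess_type(text: str) -> str:
    """Return the first type whose cue phrase appears in the text,
    falling back to SEMANTIC (general knowledge)."""
    lowered = text.lower()
    for mem_type, cues in CUES.items():
        if any(cue in lowered for cue in cues):
            return mem_type
    return "SEMANTIC"

print(guess_type("I prefer dark mode"))         # LONG_TERM
print(guess_type("Yesterday I had a meeting"))  # EPISODIC
print(guess_type("Python uses indentation"))    # SEMANTIC
```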

Starting the Memory Backend

cd backend
uv run main.py
The server runs on http://localhost:8000 with auto-reload enabled.

Memory API Endpoints

The FastAPI backend provides the following endpoints:

Core Operations

| Endpoint | Method | Description |
|---|---|---|
| /memory/add | POST | Add memories from conversation |
| /memory/search | POST | Search memories by query |
| /memory/get_all | POST | Get all memories for a user |
| /memory/{memory_id} | GET | Get specific memory by ID |
| /memory/update | PUT | Update existing memory |
| /memory/{memory_id} | DELETE | Delete specific memory |
| /memory/user/{user_id} | DELETE | Delete all user memories |

Special Features

| Endpoint | Method | Description |
|---|---|---|
| /memory/add_image | POST | Add image-based memory with vision |
| /memory/history/{memory_id} | GET | Get memory change history |
All memory operations support automatic type classification and metadata filtering.
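Request bodies for the core endpoints follow a conventional shape. The field names below (messages, user_id, query, limit) are assumptions based on typical Mem0-backed APIs, not confirmed from this backend's source:

```python
import json

# Hypothetical payload shapes; field names are assumptions, not from the source.
add_payload = {
    "messages": [
        {"role": "user", "content": "I prefer dark mode"},
    ],
    "user_id": "user-123",
}

search_payload = {
    "query": "What theme does the user prefer?",
    "user_id": "user-123",
    "limit": 5,
}

# Both serialize cleanly for POSTs to /memory/add and /memory/search.
print(sorted(search_payload))  # ['limit', 'query', 'user_id']
print(json.dumps(add_payload)["" == "" and 0:40])
```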

Alternative Vector Store

Mem0 also supports ChromaDB as an alternative to Supabase. To use Chroma:
backend/main.py
config = {
    "vector_store": {
        "provider": "chroma",
        "config": {
            "collection_name": "memories",
            "path": "db",
        }
    }
}
Using ChromaDB requires modifying backend/main.py. Supabase is recommended for production use.

Troubleshooting

  • Check OpenAI API key: The backend requires OPENAI_API_KEY in backend/.env
  • Verify Supabase is running: npx supabase status
  • Check Python dependencies: cd backend and run uv sync
  • Verify connection string: Ensure SUPABASE_CONNECTION_STRING points to the correct local database
  • Check Supabase logs: docker logs supabase_db_postgres
  • Test database connection: Visit Studio at http://localhost:54323 → SQL Editor
  • Verify Neo4j credentials: Check that all NEO4J_* variables are set in backend/.env
  • Test the Neo4j connection: Visit your Neo4j instance dashboard to verify it's active
  • Check backend logs: The startup message shows whether the graph store is enabled
  • Verify image URLs: Ensure image URLs are accessible
  • Check OpenAI API access: Vision features require enable_vision: True and proper API access
  • Review request format: Image URLs must be in the correct format (see API endpoint docs)

Best Practices

Memory Management

  • Use memory types to organize and filter memories effectively
  • Enable auto-classification for automatic memory categorization
  • Add metadata for better search and filtering
  • Regularly review memories in the Brain Panel

Performance

  • Use HNSW indexing for faster similarity search (default)
  • Set appropriate search limits to avoid overwhelming results
  • Consider Neo4j for complex relationship queries
  • Monitor vector store size and clean up old memories
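The default cosine_distance measure ranks results by one minus the cosine similarity of the embedding vectors; lower distance means more similar. A small pure-Python illustration:

```python
import math

def cosine_distance(a: list[float], b: list[float]) -> float:
    """1 - cosine similarity; 0.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

print(cosine_distance([1.0, 0.0], [1.0, 0.0]))  # 0.0 (same direction)
print(cosine_distance([1.0, 0.0], [0.0, 1.0]))  # 1.0 (orthogonal)
```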

Security

  • Keep Supabase local for development (Docker)
  • Use secure connections for production Neo4j (SSL/TLS)
  • Rotate database passwords regularly
  • Never commit .env files with credentials

Next Steps

  • AI Providers - Configure OpenAI and other AI providers
  • Application Settings - Customize Tabby settings and preferences
