The Graphiti MCP server provides Model Context Protocol integration, allowing AI assistants like Claude Desktop and Cursor to interact with Graphiti’s knowledge graph capabilities through a standardized protocol.

Overview

The MCP server exposes Graphiti’s core functionality through two transport modes:
  • HTTP transport (default): Accessible at http://localhost:8000/mcp/ for broad client compatibility
  • stdio transport: For clients that only support standard input/output
Key capabilities:
  • Episode management (add, retrieve, delete)
  • Entity and relationship operations
  • Semantic and hybrid search
  • Group-based data organization
  • Graph maintenance operations

Quick Start

The simplest way to run the MCP server is using Docker with the bundled FalkorDB database:
cd graphiti/mcp_server
docker compose up
This starts:
  • MCP server on http://localhost:8000/mcp/
  • FalkorDB on localhost:6379
  • FalkorDB web UI on http://localhost:3000

Local Development

For development or customization:
# Install dependencies
curl -LsSf https://astral.sh/uv/install.sh | sh
uv sync

# Run with default settings
uv run main.py

# Run with custom configuration
uv run main.py --config config/custom.yaml

Configuration

The server supports three configuration methods, listed from highest to lowest precedence:
  1. Command-line arguments
  2. Environment variables
  3. config.yaml file
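Conceptually, this precedence is a layered merge where higher-precedence sources win. The function and keys below are an illustrative sketch, not the server's actual code:

```python
# Sketch of the precedence order above: command-line arguments override
# environment variables, which override values from config.yaml.
# resolve() and the keys used here are illustrative only.
def resolve(cli: dict, env: dict, yaml_cfg: dict) -> dict:
    merged = dict(yaml_cfg)  # lowest precedence: config.yaml
    merged.update({k: v for k, v in env.items() if v is not None})
    merged.update({k: v for k, v in cli.items() if v is not None})
    return merged
```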

Database Configuration

FalkorDB (Default)

FalkorDB is a Redis-based graph database bundled in a single container:
database:
  provider: "falkordb"
  providers:
    falkordb:
      uri: "redis://localhost:6379"
      password: ""  # Optional
      database: "default_db"

Neo4j

For production deployments:
database:
  provider: "neo4j"
  providers:
    neo4j:
      uri: "bolt://localhost:7687"
      username: "neo4j"
      password: "your_password"
      database: "neo4j"
Run with Neo4j:
docker compose -f docker/docker-compose-neo4j.yml up

LLM Provider Configuration

The server supports multiple LLM providers:
llm:
  provider: "openai"  # or "anthropic", "gemini", "groq", "azure_openai"
  model: "gpt-4.1"

embedder:
  provider: "openai"  # or "voyage", "sentence_transformers", "gemini"
  model: "text-embedding-3-small"

Ollama for Local LLM

Run LLMs locally using Ollama:
llm:
  provider: "openai"
  model: "gpt-oss:120b"  # Your Ollama model
  api_base: "http://localhost:11434/v1"
  api_key: "ollama"  # Dummy key required

embedder:
  provider: "sentence_transformers"
  model: "all-MiniLM-L6-v2"
Ensure Ollama is running:
ollama serve

Environment Variables

The config.yaml supports environment variable expansion:
database:
  providers:
    neo4j:
      uri: "${NEO4J_URI:bolt://localhost:7687}"
      username: "${NEO4J_USER:neo4j}"
      password: "${NEO4J_PASSWORD}"

llm:
  api_key: "${OPENAI_API_KEY}"
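The ${VAR:default} syntax shown above can be sketched in a few lines of Python. The exact expansion rules the server implements are an assumption here, inferred from the examples (default after a colon, required value otherwise):

```python
import os
import re

# Matches "${NAME}" or "${NAME:default}"; the default may contain
# colons (e.g. a bolt:// URI) but not a closing brace.
_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)(?::([^}]*))?\}")

def expand(value: str) -> str:
    """Expand environment-variable placeholders in a config value."""
    def repl(match: re.Match) -> str:
        name, default = match.group(1), match.group(2)
        if name in os.environ:
            return os.environ[name]
        if default is not None:
            return default
        raise KeyError(f"required environment variable {name} is not set")
    return _PATTERN.sub(repl, value)
```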
Key environment variables:
  • OPENAI_API_KEY - OpenAI API key
  • ANTHROPIC_API_KEY - Anthropic API key
  • GOOGLE_API_KEY - Google Gemini API key
  • GROQ_API_KEY - Groq API key
  • AZURE_OPENAI_API_KEY - Azure OpenAI key
  • SEMAPHORE_LIMIT - Concurrency control (see Performance Tuning)

Command-Line Arguments

uv run main.py \
  --config config.yaml \
  --llm-provider openai \
  --database-provider neo4j \
  --model gpt-4.1 \
  --temperature 0.7 \
  --transport http \
  --group-id my-namespace \
  --destroy-graph  # CAUTION: Destroys all data on startup

Entity Types

The MCP server includes built-in entity types for structured knowledge extraction:
  • Preference: User preferences, choices, opinions
  • Requirement: Specific needs or functionality requirements
  • Procedure: Standard operating procedures
  • Location: Physical or virtual places
  • Event: Time-bound activities
  • Organization: Companies, institutions, groups
  • Document: Information content (books, articles, videos)
  • Topic: Subject of conversation (fallback)
  • Object: Physical items, tools, devices (fallback)
Customize in config.yaml:
graphiti:
  entity_types:
    - name: "Preference"
      description: "User preferences, choices, opinions, or selections"
    - name: "CustomType"
      description: "Your custom entity type"
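Each entity type is just a name/description pair; a hypothetical sketch of how such definitions could be modeled in client code (not the server's actual classes):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EntityType:
    """One entry under graphiti.entity_types in config.yaml."""
    name: str
    description: str

# Mirrors the YAML example above.
ENTITY_TYPES = [
    EntityType("Preference", "User preferences, choices, opinions, or selections"),
    EntityType("CustomType", "Your custom entity type"),
]
```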

MCP Client Integration

Cursor IDE

  1. Start the MCP server:
    uv run main.py --group-id cursor-workspace
    
  2. Configure Cursor (.cursor/mcp.json):
    {
      "mcpServers": {
        "graphiti-memory": {
          "url": "http://localhost:8000/mcp/"
        }
      }
    }
    
  3. Add Graphiti rules to Cursor’s User Rules (see cursor_rules.md)

Claude Desktop (via mcp-remote)

Claude Desktop requires the mcp-remote gateway for HTTP transport:
  1. Start the MCP server:
    docker compose up
    
  2. Install mcp-remote globally (optional; npx can fetch it on demand):
    npm install -g mcp-remote
    
  3. Configure Claude Desktop (claude_desktop_config.json):
    {
      "mcpServers": {
        "graphiti-memory": {
          "command": "npx",
          "args": [
            "mcp-remote",
            "http://localhost:8000/mcp/"
          ]
        }
      }
    }
    
  4. Restart Claude Desktop

stdio Transport

For clients supporting only stdio transport:
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/path/to/graphiti/mcp_server",
        "--project",
        ".",
        "main.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX"
      }
    }
  }
}

Available Tools

The MCP server exposes these tools to AI assistants:

Episode Management

  • add_episode - Add text, JSON, or message episodes to the graph
  • get_episodes - Retrieve recent episodes for a group
  • delete_episode - Remove an episode by UUID

Entity & Relationship Operations

  • get_entity_edge - Get an entity edge by UUID
  • delete_entity_edge - Delete an entity edge
  • search_nodes - Search for relevant node summaries
  • search_facts - Search for relevant facts (entity relationships)

Graph Maintenance

  • clear_graph - Clear all data and rebuild indices
  • get_status - Check server and database connection status

Working with JSON Data

Ingest structured JSON data for entity extraction:
import json

add_episode(
    name="Customer Profile",
    episode_body=json.dumps({
        "company": {"name": "Acme Technologies"},
        "products": [
            {"id": "P001", "name": "CloudSync"},
            {"id": "P002", "name": "DataMiner"}
        ]
    }),
    source="json",
    source_description="CRM data"
)

Deployment

Production Deployment with Neo4j

# docker-compose.prod.yml
version: '3.8'

services:
  graphiti-mcp:
    image: zepai/knowledge-graph-mcp
    ports:
      - "8000:8000"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - NEO4J_URI=bolt://neo4j:7687
      - NEO4J_USER=neo4j
      - NEO4J_PASSWORD=${NEO4J_PASSWORD}
      - SEMAPHORE_LIMIT=15
    depends_on:
      - neo4j

  neo4j:
    image: neo4j:5.26.0
    ports:
      - "7474:7474"
      - "7687:7687"
    volumes:
      - neo4j_data:/data
    environment:
      - NEO4J_AUTH=neo4j/${NEO4J_PASSWORD}

volumes:
  neo4j_data:

Health Checks

Monitor the MCP server:
curl http://localhost:8000/health
Expected response:
{
  "status": "healthy",
  "database": "connected",
  "version": "0.22.1"
}
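A deployment script can gate readiness on this response. The sketch below only parses a body in the shape shown above; field names may differ across versions:

```python
import json

def is_healthy(body: str) -> bool:
    """Return True if a /health response body reports a healthy server
    with a connected database."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return False
    return (payload.get("status") == "healthy"
            and payload.get("database") == "connected")
```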

Telemetry

The MCP server uses Graphiti core’s telemetry. To disable:
export GRAPHITI_TELEMETRY_ENABLED=false
Or in .env:
GRAPHITI_TELEMETRY_ENABLED=false
Telemetry collects:
  • Anonymous usage statistics
  • System information
  • Configuration choices (LLM provider, database type)
Never collected: API keys, personal data, graph content, queries

Troubleshooting

Connection Issues

Problem: MCP client cannot connect to the server
Solutions:
  • Verify server is running: curl http://localhost:8000/health
  • Check firewall allows port 8000
  • For Docker: ensure containers are on same network

Database Connection Errors

Problem: Cannot connect to Neo4j/FalkorDB
Solutions:
  • Verify database is running
  • Check credentials in config or environment
  • For Neo4j: ensure Bolt port 7687 is accessible
  • For FalkorDB: ensure Redis port 6379 is accessible

Rate Limit Errors

Problem: 429 errors from the LLM provider
Solution: Lower SEMAPHORE_LIMIT (see Performance Tuning)
