Welcome to Basic Memory

Basic Memory lets you build persistent knowledge through natural conversations with Large Language Models (LLMs) like Claude, while keeping everything in simple Markdown files on your computer. It uses the Model Context Protocol (MCP) to enable any compatible LLM to read and write to your local knowledge base.

Pick up your conversation right where you left off

AI assistants can load context from local files in a new conversation

No more starting from scratch. Your AI assistant remembers what you’ve discussed before.

Notes are saved locally as Markdown files in real time

Everything stays on your computer in simple, readable Markdown that you can edit with any text editor.

No project knowledge or special prompting required

Just talk naturally. The AI can create, read, and update your knowledge base automatically.

Build a traversable knowledge graph

Notes link together using WikiLinks, creating a semantic web of connected ideas.

Why Basic Memory?

Most LLM interactions are ephemeral: you ask a question, get an answer, and everything is forgotten. Each conversation starts fresh, without the context or knowledge from previous ones. Current workarounds have limitations:

Chat histories

Capture conversations but aren’t structured knowledge

RAG systems

Can query documents but don’t let LLMs write back

Vector databases

Require complex setups and often live in the cloud

Knowledge graphs

Typically need specialized tools to maintain

The Basic Memory Approach

Basic Memory addresses these problems with a simple approach: structured Markdown files that both humans and LLMs can read and write to.

Local-first

All knowledge stays in files you control

Bi-directional

Both you and the LLM read and write to the same files

Structured yet simple

Uses familiar Markdown with semantic patterns

Traversable knowledge graph

LLMs can follow links between topics

Standard formats

Works with existing editors like Obsidian

Lightweight infrastructure

Just local files indexed in a local SQLite database
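To make the "local files indexed in a local SQLite database" idea concrete, here is a minimal sketch of that pattern. The schema, function name, and title-extraction rule are illustrative assumptions, not Basic Memory's actual implementation.

```python
import sqlite3
from pathlib import Path

# Illustrative sketch only: index a directory of Markdown notes into SQLite.
# The schema here (path, title, body) is an assumption for demonstration.
def index_notes(notes_dir: str, db_path: str = ":memory:") -> sqlite3.Connection:
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS notes (path TEXT PRIMARY KEY, title TEXT, body TEXT)"
    )
    for md in Path(notes_dir).glob("**/*.md"):
        body = md.read_text(encoding="utf-8")
        # Use the first Markdown heading as the title, falling back to the filename.
        title = next(
            (line[2:].strip() for line in body.splitlines() if line.startswith("# ")),
            md.stem,
        )
        conn.execute(
            "INSERT OR REPLACE INTO notes VALUES (?, ?, ?)", (str(md), title, body)
        )
    conn.commit()
    return conn
```

A real index would also want full-text search, which SQLite provides out of the box via its FTS5 extension; the point is that no server or cloud service is required.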

Key Features

Have conversations that build on previous knowledge. Ask the LLM to recall what you discussed last week, and it can load the relevant context from your knowledge base.
Create structured notes during natural conversations. The LLM can organize information into Entities with Observations (facts) and Relations (links to other entities).
Navigate your knowledge graph semantically. The LLM can follow relations like relates_to [[Coffee Bean Origins]] to build rich context from interconnected notes.
Keep everything local and under your control. Your knowledge stays on your computer in simple Markdown files.
Use familiar tools like Obsidian to view and edit notes. Basic Memory uses standard Markdown with WikiLinks.
Sync your knowledge to the cloud with bidirectional synchronization for cross-device access.

Quick Example

Here’s how a typical interaction works:
Step 1: Chat naturally with your AI assistant

I've been experimenting with different coffee brewing methods. Key things I've learned:

- Pour over gives more clarity in flavor than French press
- Water temperature is critical - around 205°F seems best
- Freshly ground beans make a huge difference
Step 2: Ask the AI to structure your knowledge

Let's write a note about coffee brewing methods.
Step 3: The AI creates a Markdown file

The file appears instantly in ~/basic-memory/coffee-brewing-methods.md:
```markdown
---
title: Coffee Brewing Methods
permalink: coffee-brewing-methods
tags:
- coffee
- brewing
---

# Coffee Brewing Methods

## Observations

- [method] Pour over provides more clarity and highlights subtle flavors
- [technique] Water temperature at 205°F (96°C) extracts optimal compounds
- [principle] Freshly ground beans preserve aromatics and flavor

## Relations

- relates_to [[Coffee Bean Origins]]
- requires [[Proper Grinding Technique]]
- affects [[Flavor Extraction]]
```
Step 4: Reference it in future conversations

Look at coffee-brewing-methods for context about pour over coffee

The AI loads the note and follows the relations to build comprehensive context from your knowledge graph.
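Following relations like this is simple graph traversal over files. The sketch below is an assumption about how it could work, not Basic Memory's actual code: it resolves each [[WikiLink]] by slugifying the target title into a filename (the convention the example note above appears to follow) and loads linked notes up to a fixed depth.

```python
import re
from pathlib import Path

WIKILINK = re.compile(r"\[\[([^\]]+)\]\]")

# Hypothetical convention: "Coffee Bean Origins" -> coffee-bean-origins.md
def slugify(title: str) -> str:
    return title.lower().replace(" ", "-")

def load_context(start: str, notes_dir: str, depth: int = 1) -> dict:
    """Load a note plus the notes it links to, up to `depth` hops away."""
    notes, frontier = {}, [start]
    for _ in range(depth + 1):
        next_frontier = []
        for slug in frontier:
            if slug in notes:
                continue
            path = Path(notes_dir) / f"{slug}.md"
            if not path.exists():
                continue  # dangling link: the target note hasn't been created yet
            body = path.read_text(encoding="utf-8")
            notes[slug] = body
            next_frontier += [slugify(t) for t in WIKILINK.findall(body)]
        frontier = next_frontier
    return notes
```

With the example above, asking for coffee-brewing-methods at depth 1 would also pull in Coffee Bean Origins, Proper Grinding Technique, and Flavor Extraction (wherever those notes exist), which is what lets the AI build context beyond the single note you named.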

How It Works

Under the hood, Basic Memory:
  1. Stores everything in Markdown files
  2. Uses a SQLite database for searching and indexing
  3. Extracts semantic meaning from simple Markdown patterns:
    • Files become Entity objects
    • Each Entity can have Observations (facts)
    • Relations connect entities together to form the knowledge graph
  4. Maintains the local knowledge graph derived from the files
  5. Provides bidirectional synchronization between files and the knowledge graph
  6. Implements the Model Context Protocol (MCP) for AI integration
  7. Exposes tools that let AI assistants traverse and manipulate the knowledge graph
  8. Uses memory:// URLs to reference entities across tools and conversations
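Step 3 above, extracting semantic meaning from simple Markdown patterns, can be sketched with two regular expressions: one for `- [category] fact` observation lines and one for `- relation_type [[Target]]` relation lines. The `Entity` class and function names below are illustrative assumptions, not Basic Memory's actual API.

```python
import re
from dataclasses import dataclass, field

# Matches observation lines like: - [method] Pour over provides more clarity
OBSERVATION = re.compile(r"^- \[(\w+)\] (.+)$")
# Matches relation lines like: - relates_to [[Coffee Bean Origins]]
RELATION = re.compile(r"^- (\w+) \[\[([^\]]+)\]\]$")

@dataclass
class Entity:
    title: str
    observations: list = field(default_factory=list)  # (category, fact) pairs
    relations: list = field(default_factory=list)     # (relation_type, target) pairs

def parse_entity(title: str, markdown: str) -> Entity:
    """Illustrative parse of one note file into an Entity."""
    entity = Entity(title)
    for line in markdown.splitlines():
        if m := OBSERVATION.match(line):
            entity.observations.append((m.group(1), m.group(2)))
        elif m := RELATION.match(line):
            entity.relations.append((m.group(1), m.group(2)))
    return entity
```

Because the patterns are this simple, a note you edit by hand in any text editor parses exactly the same as one the LLM wrote.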
Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. Basic Memory implements MCP to enable AI assistants like Claude Desktop to interact with your knowledge base.
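To show the shape of the tool interface, here is a toy illustration of an MCP-style tool call as a request/response exchange. This is not the real protocol (MCP is JSON-RPC based, and real servers use an MCP SDK); the tool name and payload fields are invented for demonstration.

```python
import json

# Toy tool registry: maps tool names to Python functions.
TOOLS = {}

def tool(fn):
    TOOLS[fn.__name__] = fn
    return fn

@tool
def read_note(identifier: str) -> str:
    # A real implementation would load the note from disk by permalink.
    return f"(contents of {identifier})"

def handle(request_json: str) -> str:
    """Dispatch a tool-call request to the registered tool."""
    request = json.loads(request_json)
    result = TOOLS[request["tool"]](**request["arguments"])
    return json.dumps({"result": result})
```

The real server exposes tools analogous to this (reading, writing, and searching notes) so that any MCP-compatible client, such as Claude Desktop, can call them during a conversation.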

Get Started

Quick Start

Get up and running in 5 minutes

Installation

Detailed installation instructions

User Guide

Learn how to use Basic Memory effectively

CLI Reference

Complete command-line interface documentation

Community

Discord

Join our community

GitHub

View the source code

Website

Learn more
