
MemoryStore

The MemoryStore class implements a two-layer memory system:
  1. Long-term memory (MEMORY.md): Structured facts and knowledge
  2. History log (HISTORY.md): Chronological, grep-searchable conversation log
This design allows the agent to maintain context across conversations while staying within token limits.

Constructor

MemoryStore(workspace: Path)
workspace (Path, required): The workspace directory where memory files will be stored
File locations:
  • Long-term memory: {workspace}/memory/MEMORY.md
  • History log: {workspace}/memory/HISTORY.md
Example:
from nanobot.agent.memory import MemoryStore
from pathlib import Path

workspace = Path("/home/user/workspace")
memory = MemoryStore(workspace)

Methods

read_long_term

def read_long_term() -> str
Read the long-term memory file (MEMORY.md).
Returns (str): Contents of MEMORY.md, or an empty string if the file doesn't exist
Example:
memory = MemoryStore(workspace)
current_memory = memory.read_long_term()
print(f"Current memory has {len(current_memory)} characters")

write_long_term

def write_long_term(content: str) -> None
Write new content to the long-term memory file (MEMORY.md).
content (str, required): The complete memory content to write (overwrites the existing file)
Example:
memory = MemoryStore(workspace)

# Read, update, write pattern
current = memory.read_long_term()
updated = current + "\n\n## New Facts\n- User prefers Python 3.11\n"
memory.write_long_term(updated)

append_history

def append_history(entry: str) -> None
Append an entry to the history log (HISTORY.md).
entry (str, required): A history entry, typically starting with a [YYYY-MM-DD HH:MM] timestamp
Example:
memory = MemoryStore(workspace)

memory.append_history(
    "[2026-03-06 14:30] User requested to create a FastAPI project. "
    "Generated project structure with main.py, requirements.txt, and Docker config."
)

get_memory_context

def get_memory_context() -> str
Get formatted memory content for inclusion in system prompt.
Returns (str): Formatted memory string with a "## Long-term Memory" header, or an empty string if no memory exists
Example:
memory = MemoryStore(workspace)
context = memory.get_memory_context()

# This is used by ContextBuilder
system_prompt = f"""
You are nanobot...

{context}
"""

consolidate

async def consolidate(
    session: Session,
    provider: LLMProvider,
    model: str,
    *,
    archive_all: bool = False,
    memory_window: int = 50,
) -> bool
Consolidate old conversation messages into memory using an LLM.
session (Session, required): The session containing messages to consolidate
provider (LLMProvider, required): LLM provider used for consolidation
model (str, required): Model name to use for consolidation
archive_all (bool, default: False): If True, archives all messages (used by the /new command). If False, archives only the oldest messages
memory_window (int, default: 50): Number of recent messages to keep in active memory
Returns (bool): True if consolidation succeeded, False on failure
Behavior:
  1. Extracts old messages from session (before memory_window)
  2. Formats messages with timestamps and roles
  3. Calls LLM with current memory and messages to process
  4. LLM calls save_memory tool with:
    • history_entry: Summary paragraph for HISTORY.md
    • memory_update: Updated long-term memory content
  5. Updates the session's last_consolidated pointer
Example:
from nanobot.session.manager import SessionManager
from nanobot.providers.anthropic import AnthropicProvider

sessions = SessionManager(workspace)
memory = MemoryStore(workspace)
provider = AnthropicProvider(api_key="sk-...")

session = sessions.get_or_create("cli:user123")

# Consolidate when history gets long
if len(session.messages) > 100:
    success = await memory.consolidate(
        session,
        provider,
        model="claude-3-5-sonnet-20241022",
        memory_window=50
    )
    
    if success:
        print(f"Consolidated {len(session.messages) - 50} messages")
        sessions.save(session)
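Step 2 of the behavior above (formatting messages with timestamps and roles) might look roughly like this sketch; the role, timestamp, and content field names are assumptions for illustration:

```python
def format_messages(messages) -> str:
    """Render messages as '[timestamp] ROLE: content' lines for the consolidation prompt."""
    lines = []
    for msg in messages:
        role = msg["role"].upper()  # USER / ASSISTANT, matching the log format shown below
        lines.append(f"[{msg['timestamp']}] {role}: {msg['content']}")
    return "\n".join(lines)
```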

Memory Consolidation Flow

The consolidation process uses a specialized system prompt:
You are a memory consolidation agent. Call the save_memory tool with your
consolidation of the conversation.
The prompt also provides:
## Current Long-term Memory
(existing MEMORY.md content)

## Conversation to Process
[2026-03-06 14:20] USER: Create a Python web server
[2026-03-06 14:20] ASSISTANT [tools: exec, write_file]: Created FastAPI server...
[2026-03-06 14:25] USER: Add authentication
...
The LLM must call save_memory with:
{
  "history_entry": "[2026-03-06 14:20] Created FastAPI server with basic routes. Added JWT authentication middleware. User prefers async handlers.",
  "memory_update": "# User Preferences\n- Prefers FastAPI over Flask\n- Uses JWT for auth\n..."
}
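Conceptually, the consolidation step then parses the tool-call arguments and applies them to both files. A simplified sketch, assuming the arguments arrive as a JSON string (as with most provider APIs) and that malformed arguments should leave both files untouched:

```python
import json

def apply_save_memory(raw_args: str, store) -> bool:
    """Parse a save_memory tool call and apply it to both memory layers.

    `store` is any object exposing append_history()/write_long_term(),
    e.g. a MemoryStore. Returns False on malformed arguments.
    """
    try:
        args = json.loads(raw_args)
        history_entry = args["history_entry"]
        memory_update = args["memory_update"]
    except (json.JSONDecodeError, KeyError):
        return False  # malformed tool call: do not touch either file
    store.append_history(history_entry)   # HISTORY.md: append-only
    store.write_long_term(memory_update)  # MEMORY.md: full rewrite
    return True
```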

Tool Definition

The save_memory tool definition used during consolidation:
{
  "type": "function",
  "function": {
    "name": "save_memory",
    "description": "Save the memory consolidation result to persistent storage.",
    "parameters": {
      "type": "object",
      "properties": {
        "history_entry": {
          "type": "string",
          "description": "A paragraph (2-5 sentences) summarizing key events/decisions/topics. Start with [YYYY-MM-DD HH:MM]. Include detail useful for grep search."
        },
        "memory_update": {
          "type": "string",
          "description": "Full updated long-term memory as markdown. Include all existing facts plus new ones. Return unchanged if nothing new."
        }
      },
      "required": ["history_entry", "memory_update"]
    }
  }
}

Memory vs History

| Feature | MEMORY.md         | HISTORY.md           |
|---------|-------------------|----------------------|
| Purpose | Structured facts  | Chronological log    |
| Format  | Markdown sections | Timestamped entries  |
| Updates | Merged/updated    | Append-only          |
| Search  | Semantic (by LLM) | Grep by keyword/date |
| Size    | Stays bounded     | Grows indefinitely   |
When to use each:
  • MEMORY.md: User preferences, project facts, important decisions
  • HISTORY.md: Detailed log of what happened when, for auditing and context

Automatic Consolidation

The AgentLoop automatically triggers consolidation when:
unconsolidated = len(session.messages) - session.last_consolidated
if unconsolidated >= memory_window:
    # Start background consolidation
This happens asynchronously without blocking message processing.
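A sketch of how such a non-blocking trigger can be wired up with asyncio (maybe_consolidate and consolidate_fn are illustrative names; the actual AgentLoop internals may differ):

```python
import asyncio

async def maybe_consolidate(session, consolidate_fn, memory_window: int = 50):
    """Fire-and-forget consolidation once enough unconsolidated messages pile up.

    `consolidate_fn` stands in for memory.consolidate with its arguments bound.
    Returns the background task, or None if no consolidation is needed.
    """
    unconsolidated = len(session.messages) - session.last_consolidated
    if unconsolidated >= memory_window:
        # Run in the background so message processing is not blocked.
        return asyncio.create_task(consolidate_fn())
    return None
```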

Manual Consolidation

The /new slash command triggers full consolidation:
await memory.consolidate(
    session,
    provider,
    model,
    archive_all=True  # Archive everything, start fresh
)
session.clear()

Error Handling

Consolidation can fail if:
  • LLM doesn't call the save_memory tool
  • LLM returns malformed arguments
  • Network errors during LLM call
On failure:
  • Returns False
  • Logs error with logger.exception()
  • Session state is not modified
  • Can be retried safely
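Because a failed consolidation never mutates session state, wrapping it in a retry loop is safe. A sketch with illustrative retry counts and backoff (consolidate_with_retry is not part of the library):

```python
import asyncio

async def consolidate_with_retry(
    memory, session, provider, model, *, attempts: int = 3, base_delay: float = 1.0
) -> bool:
    """Retry consolidation a few times; safe because failures don't mutate the session."""
    for attempt in range(attempts):
        if await memory.consolidate(session, provider, model):
            return True
        if attempt < attempts - 1:
            # Simple exponential backoff between attempts.
            await asyncio.sleep(base_delay * (2 ** attempt))
    return False
```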

Architecture Notes

  • Memory files are created in {workspace}/memory/ directory
  • Directory is created automatically if it doesn't exist
  • Files use UTF-8 encoding
  • History entries are double-newline separated for readability
  • The agent can read these files directly using read_file tool
  • Users can manually edit these files to correct or add information
