Logicore agents have two distinct memory layers that work together to give agents both immediate conversational continuity and durable recall across sessions.

Short-Term Memory

Per-session conversation history stored in AgentSession. Tracks all messages, roles, and metadata for the active context window.

Long-Term Memory

Persistent vector storage via AgentrySimpleMem and LanceDB. Survives process restarts and can be shared across sessions.

Architecture

The two layers are independently managed, but both feed the agent’s LLM context at inference time.

How the Two Layers Work Together

1. User message arrives

   The agent receives the user message and appends it to the active AgentSession message list.

2. Long-term retrieval runs

   If memory=True, AgentrySimpleMem.on_user_message() is called. It performs a fast embedding-based semantic search against the LanceDB table and returns relevant memory strings to inject into the LLM context.

3. LLM is called with full context

   The LLM receives the full session message history (short-term) augmented with any retrieved memory snippets (long-term).

4. Response is queued for persistence

   After the assistant responds, AgentrySimpleMem.on_assistant_message() queues the turn. When process_pending() is called (automatically at the end of each chat), high-signal facts are extracted, embedded, and written to LanceDB.
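The four steps above can be sketched end to end. This is a self-contained illustration, not Logicore code: plain lists stand in for AgentSession and the LanceDB table, and a simple keyword overlap stands in for embedding-based semantic search.

```python
class SketchSession:
    """Stand-in for AgentSession: short-term, ordered message history."""
    def __init__(self):
        self.messages = []

class SketchMem:
    """Stand-in for AgentrySimpleMem backed by LanceDB."""
    def __init__(self):
        self.store = []    # long-term: persisted facts
        self.pending = []  # turns queued for persistence

    def on_user_message(self, text, k=5):
        # Stand-in for semantic search: return up to k stored facts
        # that share at least one word with the query.
        words = set(text.lower().split())
        hits = [f for f in self.store if words & set(f.lower().split())]
        return hits[:k]

    def on_assistant_message(self, user_text, assistant_text):
        self.pending.append((user_text, assistant_text))

    def process_pending(self):
        # Stand-in for fact extraction + embedding + LanceDB write.
        for user_text, _ in self.pending:
            self.store.append(user_text)
        self.pending.clear()

def chat_turn(session, mem, user_text, llm):
    session.messages.append({"role": "user", "content": user_text})  # step 1
    retrieved = mem.on_user_message(user_text)                       # step 2
    reply = llm(session.messages, retrieved)                         # step 3
    session.messages.append({"role": "assistant", "content": reply})
    mem.on_assistant_message(user_text, reply)                       # step 4
    mem.process_pending()
    return reply
```

Note that short-term state (session.messages) grows every turn regardless, while long-term state (mem.store) only gains entries once process_pending() runs.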

Key Behaviors

Short-term memory is always available for any active session. Long-term memory requires memory=True on agent construction and a running Ollama embedding service.
| Behavior           | Short-Term               | Long-Term                              |
|--------------------|--------------------------|----------------------------------------|
| Scope              | Single session           | Configurable (per-session or per-user) |
| Persistence        | In-process only          | Survives restarts (LanceDB on disk)    |
| Retrieval          | Full history in order    | Top-K semantic similarity              |
| Filtering          | None — all messages kept | Score-based; small talk is dropped     |
| Enabled by default | Yes                      | No — requires memory=True              |
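Enabling the long-term column is a one-flag change. The snippet below assumes the same Agent constructor and chat call used elsewhere on this page, plus a running Ollama embedding service:

```python
from logicore.agents.agent import Agent

# Long-term memory is opt-in; short-term session history is always on.
agent = Agent(llm="ollama", memory=True)
await agent.chat("Remember that I prefer dark mode", session_id="my-session")
```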

The AgentrySimpleMem Class

AgentrySimpleMem is the primary interface for long-term memory. It lives at logicore.simplemem.AgentrySimpleMem and is instantiated automatically by Agent when memory=True.
from logicore.simplemem import AgentrySimpleMem

memory = AgentrySimpleMem(
    user_id="alice",
    session_id="project-alpha",
    max_context_entries=5,
    isolate_by_session=True,
    debug=True
)
You can interact with the instance directly, or let the agent manage it for you. See the SimpleMem Integration page for full constructor details.
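If you do drive the instance directly, the flow mirrors the four steps described above. The hook names come from this page, but their exact signatures are an assumption here; check the SimpleMem Integration page before relying on them:

```python
# Assumed signatures; see the SimpleMem Integration page for the real API.
retrieved = memory.on_user_message("What did we decide about auth?")  # semantic search
# ...call your LLM with the session history plus `retrieved`...
memory.on_assistant_message("We settled on OAuth2 with PKCE.")  # queue the turn
memory.process_pending()  # extract, embed, and write facts to LanceDB
```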

When to Use Each Type

Use short-term memory alone when:
  • The conversation is self-contained within a single session
  • You don’t need recall across process restarts
  • You want the simplest possible setup
from logicore.agents.agent import Agent

agent = Agent(llm="ollama")  # memory defaults to False
await agent.chat("Hello", session_id="my-session")

Next Steps

Short-Term Memory

Session history, AgentSession, context compression, and SessionManager.

Long-Term Memory

The LanceDB storage pipeline, scoring logic, and retrieval flow.

SimpleMem Integration

Full AgentrySimpleMem API reference and configuration options.
