Data directories
[data]
Controls where Lerim stores its data.

Global data directory path. Used for:
- User config (config.toml)
- Global memory store
- Session database
- Platform configuration
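A minimal TOML sketch of this section (the key name dir is hypothetical; only the [data] section name and the ~/.lerim path appear in this reference):

```toml
# Hypothetical sketch: "dir" is an illustrative key name.
[data]
dir = "~/.lerim"
```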
Memory settings
[memory]
Controls memory scope and storage behavior.

Memory read/write scope. Determines where Lerim looks for and stores memories. Options:
- project_fallback_global — Read from project first, fall back to global. Write to project. (recommended)
- project_only — Read and write only in <repo>/.lerim/
- global_only — Read and write only in ~/.lerim/
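The scope options might look like this in config.toml (a sketch; the key names scope and project_dir are illustrative, not confirmed by this reference):

```toml
# Hypothetical sketch: key names are illustrative.
[memory]
scope = "project_fallback_global"  # or "project_only" / "global_only"
project_dir = ".lerim"             # directory used inside each repo
```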
Name of the project memory directory inside repositories.

Use project_fallback_global to keep project-specific memories in the repo while still having access to global learnings. Use project_only for strict project isolation.

[memory.decay]
Controls automatic confidence decay for memories.

Enable time-based memory decay. When enabled, memories that haven't been accessed lose confidence over time.

Number of days of no access before full decay. Memories decay gradually from last access to this threshold.

Minimum confidence multiplier. Decay never drops confidence below this value (0.0-1.0).

Effective confidence threshold for archiving. Memories with confidence below this value become archive candidates during maintain (0.0-1.0).

Grace period in days. Memories accessed within this window skip archiving even if confidence is below threshold.
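Putting these settings together, a hypothetical config.toml fragment (the key names match those used in the decay walkthrough; "enabled" and the numeric values are illustrative, except min_confidence_floor, which matches the documented 10% default):

```toml
# Illustrative values; min_confidence_floor = 0.1 matches the
# documented 10% default floor.
[memory.decay]
enabled = true                 # "enabled" is a hypothetical key name
decay_days = 90
min_confidence_floor = 0.1
archive_threshold = 0.3
recent_access_grace_days = 7
```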
How memory decay works
Memory decay keeps your knowledge store relevant by automatically reducing confidence for unused memories:
- Each memory tracks a last_accessed timestamp
- Confidence decays linearly from last_accessed to last_accessed + decay_days
- The decay multiplier is clamped to min_confidence_floor (never goes below 10% by default)
- Effective confidence = base_confidence * decay_multiplier
- During maintain, memories with effective confidence below archive_threshold are archived
- Recently accessed memories (within recent_access_grace_days) are protected from archiving

The result:
- Frequently used knowledge stays strong
- Stale information gradually fades
- Critical decisions don't disappear (the floor prevents total decay)
- Recent memories aren't archived prematurely
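The decay computation described above can be sketched in Python (an illustration of the documented formula, not Lerim's actual implementation; the 90-day decay_days value is only an example):

```python
from datetime import datetime

def effective_confidence(base_confidence: float,
                         last_accessed: datetime,
                         now: datetime,
                         decay_days: int = 90,
                         min_confidence_floor: float = 0.1) -> float:
    """Effective confidence after linear time-based decay.

    The multiplier falls linearly from 1.0 at last_accessed toward 0
    at last_accessed + decay_days, but is clamped so it never drops
    below min_confidence_floor.
    """
    elapsed_days = (now - last_accessed).days
    multiplier = 1.0 - elapsed_days / decay_days
    multiplier = max(min_confidence_floor, min(1.0, multiplier))
    return base_confidence * multiplier
```

A memory at the halfway point of its decay window keeps half its base confidence; past the window it bottoms out at the floor instead of vanishing.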
Server settings
[server]
Controls the daemon server and sync/maintain intervals.

Server bind address. Use 127.0.0.1 for local-only access or 0.0.0.0 to allow network access.

Server port for HTTP API and dashboard.
How often the sync (hot) path runs. Sync indexes new sessions and extracts memories.
How often the maintain (cold) path runs. Maintain merges duplicates, archives stale entries, and applies decay.
How many days back to scan for new sessions during sync.
Maximum number of sessions to process in a single sync run.
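A hypothetical [server] fragment (only the 127.0.0.1 recommendation comes from this reference; key names and other values are illustrative):

```toml
# Hypothetical sketch: key names and the port are illustrative.
[server]
host = "127.0.0.1"   # local-only access
port = 7437
```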
The daemon runs two independent loops: sync (hot path, frequent) and maintain (cold path, less frequent). Adjust intervals based on how actively you use your coding agents.

Model roles
Lerim uses four model roles, each independently configurable. See the model roles guide for a detailed explanation.

[roles.lead]

Orchestrates chat, sync, and maintain flows using PydanticAI.

Provider name: openrouter, openai, zai, anthropic, ollama

Model identifier for the provider.
Custom API base URL. Overrides the provider default.
List of fallback models to try if primary model fails.
Request timeout in seconds.
Maximum agent iterations per run.
OpenRouter provider routing preference (e.g., ["Together", "Lepton"]).

[roles.explorer]

Read-only subagent for candidate gathering.

Provider name.
Model identifier.
Custom API base URL.
List of fallback models.
Request timeout in seconds.
Maximum agent iterations.
OpenRouter provider routing preference.
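Every role section follows the same shape. A hypothetical [roles.lead] fragment (api_base is the only key name confirmed in this reference; the other keys and all values are illustrative):

```toml
# Hypothetical sketch: only api_base is a confirmed key name.
[roles.lead]
provider = "openrouter"                    # openrouter, openai, zai, anthropic, ollama
model = "example/model-id"                 # illustrative identifier
api_base = "https://openrouter.ai/api/v1"  # optional override
fallback_models = []
timeout = 120
```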
[roles.extract]
DSPy extraction pipeline for identifying decisions and learnings.

Provider name.
Model identifier.
Custom API base URL.
List of fallback models.
Request timeout in seconds.
Maximum tokens per transcript window. Increase for large-context models.
Token overlap between consecutive windows.
OpenRouter provider routing preference.
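To see how the window size and overlap settings interact, here is a short sketch of overlapping-window chunking (an illustration of the general technique, not Lerim's actual extraction code):

```python
def window_bounds(total_tokens: int, window: int, overlap: int):
    """Yield (start, end) token offsets covering a transcript with
    overlapping windows; consecutive windows share `overlap` tokens.
    Requires overlap < window so each step advances."""
    step = window - overlap
    start = 0
    while start < total_tokens:
        end = min(start + window, total_tokens)
        yield (start, end)
        if end == total_tokens:
            break
        start += step
```

Larger windows mean fewer LLM calls per transcript; the overlap keeps decisions that straddle a boundary from being split across windows.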
[roles.summarize]
DSPy summarization pipeline for session summaries.

Provider name.
Model identifier.
Custom API base URL.
List of fallback models.
Request timeout in seconds.
Maximum tokens per transcript window.
Token overlap between windows.
OpenRouter provider routing preference.
Provider API bases
[providers]
Default API base URLs per provider. Per-role api_base settings take precedence.
ZAI API base URL.
OpenAI API base URL.
OpenRouter API base URL.
Ollama API base URL.
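A hypothetical [providers] fragment (key names are illustrative; the Ollama URL shown is its conventional local default):

```toml
# Hypothetical sketch: key names are illustrative.
[providers]
ollama = "http://localhost:11434"
openrouter = "https://openrouter.ai/api/v1"
```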
You can point all roles at a custom endpoint by changing the provider base URL here, or override on a per-role basis using [roles.*.api_base].

Tracing
[tracing]
OpenTelemetry tracing settings. See the tracing guide for setup instructions.

Enable OpenTelemetry tracing. Can also be enabled with LERIM_TRACING=1.

Capture raw HTTP request/response bodies in traces.
Include prompt and completion text in trace spans.
Agents and projects
[agents]
Connected coding agent platforms. Written by lerim init and lerim connect.
Example:
[projects]
Registered project paths. Written by lerim project add.
Example:
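A hypothetical illustration of the shape this section might take (the actual format written by lerim project add may differ):

```toml
# Hypothetical illustration only.
[projects]
my-app = "/home/user/code/my-app"
```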
When running in Docker (lerim up), these paths determine the volume mounts. Adding or removing a project restarts the container to update mounts.