Overview
Ingests external file content into LongMem’s user observations table for enhanced context awareness. This allows the AI to reference documentation, configuration files, or other codebase artifacts that weren’t directly modified during the session.

This is an advanced API, primarily used by ecosystem integrations that want to enrich LongMem’s context with additional files.
Authentication
Requires a Bearer token if `authToken` is configured in daemon settings.
Request
- `files`: Array of file objects to ingest
  - `path`: File path (used as identifier)
  - `content`: File content to ingest
  - `hash`: Content hash (e.g., SHA-256) for deduplication
  - `source`: Source identifier (e.g., “vscode-extension”, “cli-tool”)
Response
- Status: always “ok” on success
- Ingested: number of files newly ingested or updated
- Skipped: number of files skipped (identical hash already stored)
- Total: number of files in the request
Example
cURL
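This page does not show the original request sample, so the following is an illustrative sketch. The endpoint path (`/ingest`), port, and token variable are placeholders, not confirmed by this page; check your daemon’s routes.

```shell
# Hypothetical request: endpoint path and port are placeholders.
# Omit the Authorization header if authToken is not configured.
curl -X POST "http://localhost:3000/ingest" \
  -H "Authorization: Bearer $LONGMEM_TOKEN" \
  -H "Content-Type: application/json" \
  -d @- <<'EOF'
{
  "files": [
    {
      "path": "docs/setup.md",
      "content": "# Setup\n\nRun the install script before first use.",
      "hash": "sha256-placeholder",
      "source": "cli-tool"
    }
  ]
}
EOF
```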
Response
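A successful response might look like the following. The field names here are assumptions inferred from the descriptions above; this page does not show the literal JSON keys.

```json
{
  "status": "ok",
  "ingested": 1,
  "skipped": 0,
  "total": 1
}
```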
Behavior
Deduplication
The endpoint uses content hashing to avoid storing duplicate content:

- Same hash: increments `access_count` and updates the `last_accessed` timestamp
- Different hash: updates content in place and rebuilds the FTS index
- New path: inserts as a new observation
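The three branches above can be sketched as a small decision helper. This is a hypothetical illustration; the names are not taken from `daemon/routes.ts`.

```typescript
// Minimal sketch of the deduplication decision (illustrative names).
type StoredFile = { path: string; hash: string };

type DedupAction = "touch" | "update" | "insert";

function decideDedupAction(
  existing: StoredFile | undefined,
  incoming: StoredFile,
): DedupAction {
  // New path: insert as a new observation
  if (!existing) return "insert";
  // Same hash: just bump access_count and last_accessed
  if (existing.hash === incoming.hash) return "touch";
  // Different hash: rewrite content in place and rebuild the FTS index
  return "update";
}
```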
Full-Text Search
Ingested content is automatically indexed in SQLite’s FTS5 table, making it searchable via `GET /search`.

Use Cases
Documentation Indexing

Index project documentation for AI-assisted coding.

Configuration Context

Ingest configuration files so the AI understands project setup.

API Reference Caching

Cache external API documentation for offline reference.

Implementation Details
From `daemon/routes.ts:325-384`:

- Files are stored in the `user_observations` table with type `"ecosystem"`
- Uses the `metadata` field to store path, hash, and source
- Updates existing files in place if the hash changes
- Tracks access count and last-accessed timestamp
- Automatically rebuilds the FTS index on content changes
Storage Schema

Ecosystem observations are stored with type `"ecosystem"` and a `metadata` field holding the file’s path, hash, and source.

Error Responses
Missing files array
Invalid file objects
Files without `path` or `content` are silently skipped and counted in the response totals.
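Because files missing `path` or `content` are skipped server-side, a client can pre-validate entries and attach hashes before sending. A minimal sketch, assuming Node’s built-in `crypto` module; the helper name and `source` value are illustrative:

```typescript
import { createHash } from "node:crypto";

interface IngestFile {
  path: string;
  content: string;
  hash: string;
  source: string;
}

// Drops entries the server would skip and attaches a SHA-256 content hash
// so the daemon can deduplicate without re-storing identical content.
function buildIngestPayload(
  files: Array<{ path?: string; content?: string }>,
  source: string,
): { files: IngestFile[] } {
  return {
    files: files
      .filter((f): f is { path: string; content: string } =>
        Boolean(f.path && f.content),
      )
      .map((f) => ({
        path: f.path,
        content: f.content,
        hash: createHash("sha256").update(f.content).digest("hex"),
        source,
      })),
  };
}
```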
Performance Notes
- Batch ingestion: Send multiple files in one request
- Deduplication: Same hash = no database write
- FTS rebuild: Only happens on content changes
- No size limits: none are enforced, but keep individual files under 1 MB for best performance
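A client-side pre-flight check along these lines keeps requests within the suggested bounds. The threshold constant and helper are illustrative, not part of the API:

```typescript
// Illustrative guard for the notes above: batch many files per request,
// but flag any single file at or above ~1 MB so it can be split or skipped.
const MAX_FILE_BYTES = 1_000_000;

type FileEntry = { path: string; content: string };

function partitionBySize(files: FileEntry[]): {
  ok: FileEntry[];
  tooLarge: FileEntry[];
} {
  const ok: FileEntry[] = [];
  const tooLarge: FileEntry[] = [];
  for (const f of files) {
    // Count UTF-8 bytes, not characters: multi-byte content counts fully.
    const bytes = new TextEncoder().encode(f.content).length;
    (bytes < MAX_FILE_BYTES ? ok : tooLarge).push(f);
  }
  return { ok, tooLarge };
}
```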
Related
- `GET /search` - Search ingested content
- `POST /observe` - Record tool observations