Macroa Pulse is in active development as a Proof of Concept. It is the first component of the Macroa platform — an AI agent operating system.
The Problem
Every AI agent framework today is reactive. An agent acts because something triggered it — a user message, a cron job, a webhook, an API call. Remove the trigger, and the agent does nothing. Building a genuinely useful AI assistant requires developers to anticipate every situation in which the agent should act and write an explicit trigger for it. The Pulse eliminates this requirement.

What is Macroa Pulse?
Macroa Pulse is a three-layer hierarchical signal perception system that enables AI agents to notice when something is worth attention and act on it — without being explicitly triggered.

Key Features
Zero-Cost Operation
No LLM polling required. Runs entirely on local compute with deterministic detection and tiny neural networks.
Hierarchical Filtering
Three-layer architecture inspired by the brain’s perceptual processing ensures only relevant signals reach the agent.
Online Learning
Improves accuracy over time through implicit and explicit feedback from agent activations.
Privacy by Design
All training data and models stay local. No external API calls or data transmission.
Module Fingerprints
Cold-start initialization using signal fingerprints ensures models are useful from day one.
Scoped Questions
Agents wake up with specific, focused questions — not generic polling loops.
How It Works
The Pulse continuously monitors your environment through four signal sources:

- File system events — New, modified, or deleted files in monitored directories
- Memory namespace deltas — Facts written to the agent’s persistent memory
- Time signals — Cyclically encoded hour, day of week, and elapsed time
- Network events (optional) — Hash comparison on monitored HTTP endpoints
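The time-signal source above mentions cyclic encoding. A minimal sketch of the idea, in which an hour of day is mapped onto the unit circle so that 23:00 and 01:00 come out numerically close (the function names here are illustrative, not the actual Pulse API):

```python
import math

def encode_hour(hour: float) -> tuple[float, float]:
    # Hypothetical helper: project an hour of day (0-24) onto the unit
    # circle, so wrap-around neighbors like 23:00 and 01:00 are close
    # in feature space, unlike their raw numeric values.
    angle = 2.0 * math.pi * (hour / 24.0)
    return (math.sin(angle), math.cos(angle))

def cyclic_distance(a: float, b: float) -> float:
    # Euclidean distance between two encoded hours.
    (sa, ca), (sb, cb) = encode_hour(a), encode_hour(b)
    return math.hypot(sa - sb, ca - cb)
```

With this encoding, cyclic_distance(23, 1) is small even though the raw values differ by 22, which is what lets a tiny network learn daily patterns. Day-of-week values can be encoded the same way with a period of 7.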
Signals then pass through three layers:

- Layer 1 (Retina) detects changes deterministically and emits structured SignalEvent objects
- Layer 2 (Limbic) uses small LSTM networks (~50k-200k parameters) to compute relevance scores per module cluster
- Layer 3 (Prefrontal) converts high-relevance signals into specific, actionable questions using template interpolation
“A new file appeared at /home/user/Downloads/hw3.pdf. Is this file related to a course assignment?”
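Layer 3's template interpolation can be pictured with a short sketch. The SignalEvent fields and the template dictionary below are illustrative assumptions, not the real Pulse schema:

```python
from dataclasses import dataclass

@dataclass
class SignalEvent:
    # Illustrative fields; the actual SignalEvent structure may differ.
    source: str   # e.g. "filesystem"
    kind: str     # e.g. "file_created"
    path: str

# Hypothetical question templates, keyed by event kind.
TEMPLATES = {
    "file_created": (
        "A new file appeared at {path}. "
        "Is this file related to a course assignment?"
    ),
}

def to_question(event: SignalEvent) -> str:
    # Layer 3 (Prefrontal): interpolate a high-relevance signal
    # into a specific, scoped question for the agent.
    return TEMPLATES[event.kind].format(path=event.path)
```

Calling to_question on a file-creation event for /home/user/Downloads/hw3.pdf reproduces the sample question quoted in this section, rather than waking the agent with a generic "anything new?" poll.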
Why Pulse?
Traditional approaches to proactive agents:

- Cron jobs — Static, can’t adapt to user patterns
- Webhooks — Reactive by definition, require external triggers
- LLM polling — Effective but economically infeasible (~$10/minute at current API prices)
The Pulse, by contrast:

- Learns user-specific patterns from usage
- Runs continuously at near-zero cost
- Achieves proactivity without invoking an LLM, reserving LLM inference for the tasks it’s uniquely suited to
Part of Macroa
Macroa Pulse is the first module of the Macroa platform — an AI agent operating system designed around the principle that AI should be used only where no deterministic process can do the job. The Pulse embodies this principle: it achieves proactive behavior using deterministic detection and tiny neural networks, ensuring that when the LLM is finally invoked, it’s for focused reasoning, not continuous monitoring.

Get Started
Quickstart
Get up and running with Pulse in minutes
Architecture
Understand the three-layer hierarchical design
API Reference
Explore the public API surface
Design Paper
Read the full design rationale and approach