Design Principles
Loom’s architecture is built on three core principles:

- Modularity: clean separation between core abstractions, LLM providers, and tools
- Extensibility: easy addition of new LLM providers and tools via trait implementations
- Reliability: robust error handling with retry mechanisms and structured logging
High-Level Architecture
Workspace Structure
Loom is organized as a Cargo workspace with 80+ crates under `crates/`. The workspace follows a layered architecture:
Layer Organization
Common Layer (loom-common-*)
Foundation crates providing shared functionality:
- `loom-common-core` - Core abstractions (`LlmClient`, `Agent`, state machine)
- `loom-common-http` - HTTP client with retry logic and User-Agent
- `loom-common-config` - Configuration management
- `loom-common-secret` - Secret handling with `Secret<T>` wrapper
- `loom-common-thread` - Conversation persistence and sync
- `loom-common-webhook` - Webhook handling
- `loom-common-i18n` - Internationalization (17 locales)
- `loom-common-spool` - Version control integration
- `loom-common-version` - Version management
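The `Secret<T>` wrapper mentioned above is not shown in this document; the idea can be sketched as a newtype that redacts its contents from `Debug` output so secrets never leak into logs, with an explicit, greppable accessor. All names below are illustrative, not `loom-common-secret`'s actual API.

```rust
use std::fmt;

/// Illustrative sketch of a secret wrapper: the inner value is
/// reachable only through an explicit `expose` call, and the Debug
/// implementation redacts it so it cannot leak into logs.
pub struct Secret<T>(T);

impl<T> Secret<T> {
    pub fn new(value: T) -> Self {
        Secret(value)
    }

    /// Explicit, easy-to-audit access point for the underlying value.
    pub fn expose(&self) -> &T {
        &self.0
    }
}

impl<T> fmt::Debug for Secret<T> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.write_str("Secret([REDACTED])")
    }
}

fn main() {
    let api_key = Secret::new(String::from("sk-123"));
    // Logging the wrapper never reveals the inner value.
    println!("{:?}", api_key); // prints: Secret([REDACTED])
    assert_eq!(api_key.expose(), "sk-123");
}
```

The key design choice is that there is no `Display` or `Debug` path to the raw value, so accidental leaks via logging or error formatting are ruled out at the type level.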
Server Layer (loom-server-*)
Server-side components:
- `loom-server` - Main HTTP API server
- `loom-server-api` - API route handlers
- `loom-server-db` - Database access layer
- `loom-server-llm-service` - LLM provider abstraction
- `loom-server-llm-proxy` - LLM proxy endpoints
- `loom-server-llm-anthropic` - Anthropic Claude client
- `loom-server-llm-openai` - OpenAI GPT client
- `loom-server-llm-vertex` - Google Vertex AI client
- `loom-server-llm-zai` - Z.ai client
- `loom-server-auth*` - Authentication providers (OAuth, magic links, OIDC)
- `loom-server-k8s` - Kubernetes integration
- `loom-server-weaver` - Remote execution pods
- `loom-server-scm` - Git hosting
- `loom-server-analytics` - PostHog-style analytics
- And many more specialized services…
CLI Layer (loom-cli-*)
Command-line interface crates:
- `loom-cli` - Main CLI binary
- `loom-cli-config` - CLI configuration
- `loom-cli-credentials` - Credential management
- `loom-cli-tools` - Agent tool implementations
- `loom-cli-git` - Git operations
- `loom-cli-auto-commit` - Auto-commit after tool execution
- `loom-cli-acp` - Agent Client Protocol
- `loom-cli-spool` - Spool (jj-based VCS) integration
Observability Suite
Integrated observability platform:
- `loom-analytics-core` - Analytics core types
- `loom-analytics` - Analytics client
- `loom-crash-core` - Crash reporting types
- `loom-crash` - Crash reporter
- `loom-crash-symbolicate` - Source map symbolication
- `loom-crons-core` - Cron monitoring types
- `loom-crons` - Cron monitoring client
- `loom-sessions-core` - Session analytics types
- `loom-flags-core` - Feature flags core
- `loom-flags` - Feature flags client
TUI Layer (loom-tui-*)
Terminal UI components (Ratatui 0.30):
- `loom-tui-app` - Main TUI application
- `loom-tui-core` - Core TUI abstractions
- `loom-tui-component` - Component system
- `loom-tui-theme` - Theming support
- `loom-tui-widget-*` - Reusable widgets (message list, input box, markdown, etc.)
- `loom-tui-storybook` - Visual snapshot testing
Weaver (Remote Execution)
Kubernetes-based remote execution:
- `loom-weaver-secrets` - SPIFFE-style identity and secrets
- `loom-weaver-audit-sidecar` - eBPF syscall auditing
- `loom-weaver-wgtunnel` - WireGuard tunnel client
- `loom-weaver-ebpf` - eBPF programs
- `loom-wgtunnel-*` - WireGuard tunnel with DERP relay
Key Components
Core Agent
The agent state machine manages conversation flow and tool orchestration. See Agent State Machine for details.

- Event-driven architecture
- Explicit state transitions
- Built-in retry mechanisms
- Clean separation of I/O from state logic
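Loom's actual states and transitions are documented in Agent State Machine; as a hedged sketch, the combination of explicit transitions, retry budgets, and I/O-free state logic listed above might look like the following. Every identifier here (`AgentState`, `step`, `MAX_RETRIES`, and so on) is illustrative, not Loom's real API.

```rust
/// Illustrative agent states. The real state machine is richer, but
/// the shape is the same: every transition is explicit.
#[derive(Debug, Clone, PartialEq)]
enum AgentState {
    Idle,
    CallingLlm { attempt: u32 },
    RunningTool { name: String },
    Done,
    Failed { reason: String },
}

/// Events that drive transitions. I/O happens elsewhere; only its
/// *outcome* is fed into the state machine, keeping `step` pure.
#[derive(Debug)]
enum AgentEvent {
    UserMessage,
    LlmResponded { tool_call: Option<String> },
    LlmErrored,
    ToolFinished,
}

const MAX_RETRIES: u32 = 3;

/// Pure transition function: (state, event) -> next state.
fn step(state: AgentState, event: AgentEvent) -> AgentState {
    match (state, event) {
        (AgentState::Idle, AgentEvent::UserMessage) => AgentState::CallingLlm { attempt: 1 },
        (AgentState::CallingLlm { .. }, AgentEvent::LlmResponded { tool_call: Some(name) }) => {
            AgentState::RunningTool { name }
        }
        (AgentState::CallingLlm { .. }, AgentEvent::LlmResponded { tool_call: None }) => {
            AgentState::Done
        }
        // Built-in retry: re-enter CallingLlm until the budget is spent.
        (AgentState::CallingLlm { attempt }, AgentEvent::LlmErrored) if attempt < MAX_RETRIES => {
            AgentState::CallingLlm { attempt: attempt + 1 }
        }
        (AgentState::CallingLlm { .. }, AgentEvent::LlmErrored) => AgentState::Failed {
            reason: String::from("LLM retries exhausted"),
        },
        (AgentState::RunningTool { .. }, AgentEvent::ToolFinished) => {
            AgentState::CallingLlm { attempt: 1 }
        }
        // Any other (state, event) pair leaves the state unchanged.
        (state, _) => state,
    }
}

fn main() {
    let mut s = AgentState::Idle;
    s = step(s, AgentEvent::UserMessage);
    s = step(s, AgentEvent::LlmResponded { tool_call: Some(String::from("read_file")) });
    s = step(s, AgentEvent::ToolFinished);
    s = step(s, AgentEvent::LlmResponded { tool_call: None });
    assert_eq!(s, AgentState::Done);
}
```

Because `step` is a pure function over explicit states, it can be unit-tested exhaustively without mocking any LLM or tool I/O, which is the payoff of separating I/O from state logic.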
LLM Proxy
Server-side proxy architecture keeps API keys secure. See LLM Proxy Architecture for details. API keys never leave the server; clients communicate through proxy endpoints using provider-specific paths like `/proxy/anthropic/stream`.
Tool System
Extensible tool registry for agent capabilities:

- `Tool` trait for implementation
- JSON Schema-based input validation
- Async execution
- Progress reporting
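Loom's real `Tool` trait is async, validates input against a JSON Schema, and reports progress; a minimal dependency-free sketch of the trait-plus-registry pattern might look like this. The `Echo` tool, `ToolRegistry`, and string-based input are illustrative stand-ins, not Loom's actual definitions.

```rust
use std::collections::HashMap;

/// Simplified, synchronous sketch of a tool trait. The real trait
/// is async with JSON Schema validation; input here is a plain
/// string to keep the sketch dependency-free.
trait Tool {
    fn name(&self) -> &str;
    fn execute(&self, input: &str) -> Result<String, String>;
}

/// Trivial example tool that returns its input unchanged.
struct Echo;

impl Tool for Echo {
    fn name(&self) -> &str {
        "echo"
    }
    fn execute(&self, input: &str) -> Result<String, String> {
        if input.is_empty() {
            // Stand-in for schema validation: reject bad input early.
            Err(String::from("echo requires non-empty input"))
        } else {
            Ok(input.to_string())
        }
    }
}

/// Registry mapping tool names to implementations, as an agent
/// would consult when the LLM requests a tool call.
struct ToolRegistry {
    tools: HashMap<String, Box<dyn Tool>>,
}

impl ToolRegistry {
    fn new() -> Self {
        Self { tools: HashMap::new() }
    }
    fn register(&mut self, tool: Box<dyn Tool>) {
        self.tools.insert(tool.name().to_string(), tool);
    }
    fn call(&self, name: &str, input: &str) -> Result<String, String> {
        match self.tools.get(name) {
            Some(tool) => tool.execute(input),
            None => Err(format!("unknown tool: {name}")),
        }
    }
}

fn main() {
    let mut registry = ToolRegistry::new();
    registry.register(Box::new(Echo));
    assert_eq!(registry.call("echo", "hi"), Ok(String::from("hi")));
    assert!(registry.call("missing", "x").is_err());
}
```

Dispatching through `Box<dyn Tool>` is what makes the registry extensible: adding a capability means implementing the trait and registering the new tool, with no changes to the agent loop.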
Thread System
Conversation persistence with FTS5 search:

- SQLite-based storage
- Full-text search across all messages
- Git metadata tracking
- Sync across devices
Dependency Flow
Build System
Nix + cargo2nix (Preferred)
Loom uses cargo2nix for reproducible builds with per-crate caching.
Cargo (Development)
Standard Cargo builds work for quick iteration during development.
Deployment
Deployments happen automatically via `git push` to the trunk branch:
- NixOS server checks for new commits every 10 seconds
- Automatic rebuild when changes detected
- Service restart with new binary
- Health checks verify deployment
Next Steps

- LLM Proxy: learn about the server-side LLM proxy architecture
- State Machine: understand the agent state machine design
- Workspace Structure: explore the complete crate organization
- Specifications: read detailed specs for all features