What is Loom?
Loom is an AI-powered coding agent built in Rust that provides a REPL interface for interacting with LLM-powered agents. It can execute tools to perform file system operations, code analysis, and other development tasks through a conversational interface. The system uses a server-side proxy architecture where API keys are stored server-side only. Clients communicate through the proxy, ensuring your LLM provider credentials never leave the server.

Core Principles

Loom is designed around three fundamental principles:

Modularity
Clean separation between core abstractions, LLM providers, and tools
Extensibility
Easy addition of new LLM providers and tools via trait implementations
Reliability
Robust error handling with retry mechanisms and structured logging
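The "extensibility via trait implementations" principle can be illustrated with a minimal sketch. Note that the `Tool` trait, its method names, and `EchoTool` below are hypothetical names chosen for illustration; Loom's actual trait definitions are not shown in this document.

```rust
// Hypothetical sketch of trait-based extensibility: adding a new agent
// capability means implementing one trait. Names here are illustrative,
// not Loom's real API.
trait Tool {
    /// Name the agent uses to invoke the tool.
    fn name(&self) -> &'static str;
    /// Run the tool against a raw string argument.
    fn execute(&self, input: &str) -> Result<String, String>;
}

struct EchoTool;

impl Tool for EchoTool {
    fn name(&self) -> &'static str {
        "echo"
    }
    fn execute(&self, input: &str) -> Result<String, String> {
        Ok(input.to_string())
    }
}

fn main() {
    let tool = EchoTool;
    println!("{}: {:?}", tool.name(), tool.execute("hello"));
}
```

With this shape, a new provider or tool plugs in without touching core code: the core only ever sees the trait, never the concrete type.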
Architecture Overview
Loom is organized as a Cargo workspace with 30+ specialized crates.

Key Components
| Component | Description |
|---|---|
| Core Agent | State machine for conversation flow and tool orchestration |
| LLM Proxy | Server-side proxy architecture - API keys never leave the server |
| Tool System | Registry and execution framework for agent capabilities |
| Weaver | Remote execution environments via Kubernetes pods |
| Thread System | Conversation persistence with FTS5 search |
| Analytics | PostHog-style product analytics with identity resolution |
| Auth | OAuth, magic links, ABAC authorization |
| Feature Flags | Runtime feature toggles, experiments, and kill switches |
Server-Side LLM Proxy
All LLM interactions flow through a server-side proxy: API keys are stored server-side only. The CLI communicates through the /proxy/{provider}/complete and /proxy/{provider}/stream endpoints.
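The endpoint shapes above can be sketched as a small helper. The path templates come from this document; the `proxy_url` function itself is a hypothetical illustration, not part of Loom's client.

```rust
// Sketch of how a client might derive proxy endpoint paths from a
// provider name. The /proxy/{provider}/{mode} shape is documented;
// this helper is illustrative only.
fn proxy_url(provider: &str, mode: &str) -> String {
    format!("/proxy/{provider}/{mode}")
}

fn main() {
    // One-shot completion vs. streaming endpoint for the same provider.
    println!("{}", proxy_url("anthropic", "complete"));
    println!("{}", proxy_url("anthropic", "stream"));
}
```

Because the client only ever constructs these relative paths, it never needs, or sees, the provider API key that the server attaches on the other side of the proxy.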
Available Tools
Loom agents have access to the following tools:

- ReadFileTool - Read file contents
- EditFileTool - Edit files with precise string replacements
- ListFilesTool - List directory contents
- BashTool - Execute shell commands
- OracleTool - Query server-side knowledge base
- WebSearchToolGoogle - Search using Google Custom Search
- WebSearchToolSerper - Search using Serper API
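A registry over tools like these could look roughly as follows. This is a sketch under assumptions: the `Tool` trait, `ToolRegistry`, and the `list_files` stub are invented for illustration and do not reflect Loom's internal types.

```rust
use std::collections::HashMap;

// Hypothetical minimal tool abstraction; Loom's real trait is not shown
// in this document.
trait Tool {
    fn name(&self) -> &'static str;
    fn execute(&self, input: &str) -> Result<String, String>;
}

// Stub standing in for something like ListFilesTool.
struct ListFilesTool;
impl Tool for ListFilesTool {
    fn name(&self) -> &'static str {
        "list_files"
    }
    fn execute(&self, path: &str) -> Result<String, String> {
        Ok(format!("listing {path}"))
    }
}

// Registry mapping tool names to boxed trait objects, so the agent can
// dispatch a tool call by name at runtime.
struct ToolRegistry {
    tools: HashMap<&'static str, Box<dyn Tool>>,
}

impl ToolRegistry {
    fn new() -> Self {
        Self { tools: HashMap::new() }
    }
    fn register(&mut self, tool: Box<dyn Tool>) {
        self.tools.insert(tool.name(), tool);
    }
    fn dispatch(&self, name: &str, input: &str) -> Result<String, String> {
        self.tools
            .get(name)
            .ok_or_else(|| format!("unknown tool: {name}"))?
            .execute(input)
    }
}

fn main() {
    let mut registry = ToolRegistry::new();
    registry.register(Box::new(ListFilesTool));
    println!("{:?}", registry.dispatch("list_files", "src/"));
}
```

The registry pattern keeps the agent loop generic: it receives a tool name from the model, looks it up, and either executes it or returns a structured "unknown tool" error back to the conversation.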
Next Steps
Quick Start
Get up and running with your first REPL session
Installation
Build Loom using Nix or Cargo
Specifications
Explore design docs in specs/README.md for implementation details

Source Code
All code is proprietary - Copyright (c) 2025 Geoffrey Huntley
LLM Providers
Loom supports multiple LLM providers through the ProxyLlmClient:
- Anthropic (Claude models) - Default provider
- OpenAI (GPT models)
- Vertex AI (Google Cloud)
- zAI (Custom provider)
Select a provider with the --provider flag or the LOOM_LLM_PROVIDER environment variable.
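Provider resolution might be sketched like this. The precedence (flag over environment variable), the lowercase string identifiers, and the `resolve_provider` helper are all assumptions for illustration; only the provider list and the Anthropic default come from this document.

```rust
#[derive(Debug, PartialEq)]
enum Provider {
    Anthropic,
    OpenAi,
    VertexAi,
    ZAi,
}

// Hypothetical precedence: an explicit --provider flag wins over the
// LOOM_LLM_PROVIDER environment variable; Anthropic is the documented
// default when neither is set.
fn resolve_provider(flag: Option<&str>, env: Option<&str>) -> Result<Provider, String> {
    let name = flag.or(env).unwrap_or("anthropic");
    match name {
        "anthropic" => Ok(Provider::Anthropic),
        "openai" => Ok(Provider::OpenAi),
        "vertex" => Ok(Provider::VertexAi),
        "zai" => Ok(Provider::ZAi),
        other => Err(format!("unknown provider: {other}")),
    }
}

fn main() {
    println!("{:?}", resolve_provider(None, None));
    println!("{:?}", resolve_provider(Some("openai"), Some("zai")));
}
```

Failing loudly on an unrecognized name, rather than silently falling back to the default, keeps misconfigured environments from quietly routing traffic to the wrong provider.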