Overview
LLM Gateway provides a client-side state management system that transforms flat event streams into a conversation graph, then projects that graph into different view formats. This architecture separates event collection from rendering, enabling efficient UI updates and multiple simultaneous views (chat thread, token usage, DAG visualization).

Architecture
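The flow is events reduced one at a time into an immutable graph, with views derived by projection. A minimal sketch, assuming hypothetical `GatewayEvent` and `Graph` shapes rather than the library's actual types:

```typescript
// Hypothetical shapes illustrating the event -> graph -> projection flow.
type GatewayEvent = { id: string; parentId?: string; kind: string; data: string };
type Graph = { nodes: Map<string, GatewayEvent>; edges: Array<[string, string]> };

// Reduce one event into the graph: one node, at most one edge.
function reduceEvent(graph: Graph, ev: GatewayEvent): Graph {
  const nodes = new Map(graph.nodes).set(ev.id, ev);
  const edges: Array<[string, string]> = ev.parentId
    ? [...graph.edges, [ev.parentId, ev.id]]
    : graph.edges;
  return { nodes, edges }; // immutable: the previous graph is untouched
}

// One possible projection: flatten the graph into renderable lines.
function projectThread(graph: Graph): string[] {
  return [...graph.nodes.values()].map((e) => `${e.kind}: ${e.data}`);
}
```

Because each reduction returns a fresh object, older graph snapshots remain valid for diffing against the latest one.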
Complete Example
- React
- Vue
- Svelte
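Whatever the framework, the flow is the same: subscribe to state updates, project, render. A framework-agnostic sketch, assuming hypothetical `ConversationState` and store shapes rather than the library's actual API:

```typescript
// Hypothetical state shape; the real one comes from the client library.
type ViewNode = { id: string; type: string; text: string };
type ConversationState = { status: "idle" | "streaming"; nodes: ViewNode[] };

// Render the projected thread, with a trailing marker while streaming.
function renderThread(state: ConversationState): string {
  const lines = state.nodes.map((n) => `[${n.type}] ${n.text}`);
  if (state.status === "streaming") lines.push("...");
  return lines.join("\n");
}

// Tiny store standing in for the library's subscription mechanism.
function createStore(initial: ConversationState) {
  let state = initial;
  const listeners: Array<(s: ConversationState) => void> = [];
  return {
    get: () => state,
    set(next: ConversationState) { state = next; listeners.forEach((l) => l(next)); },
    subscribe(listener: (s: ConversationState) => void) { listeners.push(listener); },
  };
}
```

Each framework adapter (React hook, Vue composable, Svelte store) wraps this subscribe/render loop in its own reactivity primitive.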
ViewNode Structure
The `projectThread` projection produces a flat array of ViewNodes:
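The exact fields are defined by the library; a plausible shape, with hypothetical field names, looks like this:

```typescript
// Hypothetical ViewNode: a flat, render-ready item with a stable key.
type ViewNode = {
  id: string;                        // stable key for list rendering
  type: "message" | "tool" | "subagent";
  status: "streaming" | "complete";
  content: string;
  children?: ViewNode[];             // present for nested subagent runs
};

// A projected thread is just ViewNode[], so rendering is a flat map.
const thread: ViewNode[] = [
  { id: "m1", type: "message", status: "complete", content: "Hello" },
  { id: "t1", type: "tool", status: "streaming", content: "searching" },
];
```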
ViewContent Types
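The concrete variants come from the library; a sketch of how such a discriminated union typically renders, with hypothetical variant names:

```typescript
// Hypothetical ViewContent union: each variant renders differently.
type ViewContent =
  | { kind: "text"; text: string }
  | { kind: "tool_call"; name: string; args: unknown }
  | { kind: "tool_result"; output: string };

// Exhaustive switch: the compiler flags any unhandled variant.
function renderContent(c: ViewContent): string {
  switch (c.kind) {
    case "text": return c.text;
    case "tool_call": return `-> ${c.name}(${JSON.stringify(c.args)})`;
    case "tool_result": return `<- ${c.output}`;
  }
}
```

Discriminating on a `kind` tag keeps rendering logic exhaustive and type-safe as new content types are added.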
Rendering Patterns
Streaming Indicators
Use `status` to show loading states:
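A minimal sketch, assuming a hypothetical four-value `status` field on each node:

```typescript
// Hypothetical status values; map each to a UI indicator.
type Status = "pending" | "streaming" | "complete" | "error";

function indicator(status: Status): string {
  switch (status) {
    case "pending": return "waiting...";
    case "streaming": return "generating...";
    case "error": return "failed";
    case "complete": return "";        // no indicator once a node settles
  }
}
```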
Tool Progress
Render incremental tool output:
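A sketch of accumulating chunks as they arrive, assuming a hypothetical `ToolView` shape:

```typescript
// Accumulate incremental output chunks for a running tool call.
type ToolView = { name: string; chunks: string[]; done: boolean };

function appendChunk(view: ToolView, chunk: string): ToolView {
  return { ...view, chunks: [...view.chunks, chunk] }; // immutable update
}

function renderTool(view: ToolView): string {
  const body = view.chunks.join("");
  return view.done ? body : body + " ..."; // trailing marker while running
}
```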
Nested Branches

Render subagent runs recursively:
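A sketch of recursive rendering over child branches, assuming a hypothetical nested `ViewNode` shape:

```typescript
// Hypothetical nested node shape: subagent runs hang off `children`.
type ViewNode = { id: string; content: string; children?: ViewNode[] };

// Recurse into child branches, indenting one level per depth.
function renderNode(node: ViewNode, depth = 0): string {
  const line = "  ".repeat(depth) + node.content;
  const kids = (node.children ?? []).map((c) => renderNode(c, depth + 1));
  return [line, ...kids].join("\n");
}
```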
Pending States

Show placeholders for streaming subagents:
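A sketch, assuming a hypothetical `SubagentView` whose summary arrives only once the run completes:

```typescript
// Show a placeholder until a streaming subagent produces output.
type SubagentView = {
  id: string;
  status: "pending" | "streaming" | "complete";
  summary?: string;
};

function renderSubagent(view: SubagentView): string {
  if (view.status === "complete" && view.summary) return view.summary;
  return view.status === "pending" ? "Starting subagent..." : "Subagent running...";
}
```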
Conversation State

The `ConversationState` includes metadata beyond the graph:
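A plausible shape, with hypothetical field names; the real type is defined by the client library:

```typescript
// Hypothetical ConversationState: the graph plus request-level metadata.
type ConversationState = {
  graph: { nodeCount: number };        // stand-in for the event graph
  status: "idle" | "streaming";
  usage: { inputTokens: number; outputTokens: number };
  pendingRelays: string[];             // relay ids awaiting a response
};

// Metadata supports views beyond the thread, e.g. a token-usage panel.
function totalTokens(state: ConversationState): number {
  return state.usage.inputTokens + state.usage.outputTokens;
}
```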
Handling Relays
Access pending relays from state:
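A sketch of draining the pending queue, assuming a hypothetical `Relay` shape and a state field holding them:

```typescript
// Hypothetical relay shape: a request the gateway needs answered.
type Relay = { id: string; question: string };
type RelayState = { pendingRelays: Relay[] };

// Drain pending relays, returning them plus a state with the queue cleared.
function takePendingRelays(state: RelayState): [Relay[], RelayState] {
  return [state.pendingRelays, { ...state, pendingRelays: [] }];
}
```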
Other Projections

Messages Projection
Convert to LLM API format for follow-up requests:
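A sketch of the conversion, assuming hypothetical node and message shapes:

```typescript
// Project view nodes into an LLM-API message list for a follow-up turn.
type ThreadNode = { role: "user" | "assistant"; content: string };
type Message = { role: "user" | "assistant"; content: string };

function projectMessages(nodes: ThreadNode[]): Message[] {
  return nodes
    .filter((n) => n.content.length > 0)            // drop empty placeholders
    .map(({ role, content }) => ({ role, content }));
}
```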
DAG Projection

Visualize the event graph:
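One simple target is Graphviz dot output; a sketch over hypothetical node/edge lists:

```typescript
// Emit the event graph as Graphviz dot for quick visualization.
type Edge = [from: string, to: string];

function toDot(nodes: string[], edges: Edge[]): string {
  return [
    "digraph conversation {",
    ...nodes.map((n) => `  "${n}";`),
    ...edges.map(([a, b]) => `  "${a}" -> "${b}";`),
    "}",
  ].join("\n");
}
```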
Progressive Enhancement

Render immediately, enhance with streaming:
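The idea in miniature: render whatever has arrived right away, then re-render on every chunk. A sketch with a hypothetical `makeProgressiveRenderer` helper:

```typescript
// Render whatever has arrived immediately; re-render on each new chunk.
function makeProgressiveRenderer(render: (text: string) => void) {
  let buffer = "";
  return {
    push(chunk: string) { buffer += chunk; render(buffer); },
    text: () => buffer,
  };
}
```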
Performance

The graph reducer is optimized for streaming:

- O(1) event reduction: Each event adds a constant number of nodes and edges
- Immutable updates: Old state remains valid for diffing
- Efficient projections: Walk only the active set, not the full graph
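The immutability property can be seen in miniature: reduction never mutates the previous state, so a UI can diff by reference. A sketch:

```typescript
// Immutable reduction: each event yields a new state object, so an old
// reference stays valid and reference-equality diffing works.
type ReducerState = { readonly events: readonly string[] };

function reduce(state: ReducerState, event: string): ReducerState {
  return { events: [...state.events, event] };
}
```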
Next Steps
- Recursive Language Model: Process arbitrarily long inputs with RLM
- Client Library API: Full client API documentation
