Overview
The agent is built using LangGraph, a framework for creating stateful, multi-actor applications with LLMs. The architecture is based on a state graph that routes execution through different nodes based on conditional logic.
State Graph Structure
Visual Representation
The graph consists of:
- 3 Nodes: Processing units that execute specific functions
- 2 Edge Types: Conditional edges (decision-based) and direct edges (automatic)
- 1 Entry Point: All conversations start at the support_bot node
- 1 End State: Conversations conclude when no more tools are needed
State Schema
The agent state is defined using a TypedDict in src/copilot/graph.py:74:
State Fields
- Conversation history, with a special add_messages reducer that intelligently merges new messages
- Generated conversation title (2-4 words summarizing the conversation)
- Unique identifier for the conversation thread (used for checkpointing)
- User identifier for tracing and observability
- Flag to enable Langfuse tracing (default: False for privacy)
- Whether to generate a title in the graph (default: True)
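The schema can be sketched as a TypedDict. Field names below are illustrative guesses (only thread_id, user_id, and langfuse_enabled appear verbatim elsewhere in this document), and the real code annotates the messages field with langgraph's add_messages function rather than a string placeholder:

```python
from typing import Annotated, TypedDict

class AgentState(TypedDict):
    # The real schema passes langgraph's add_messages reducer here;
    # a string placeholder keeps this sketch dependency-free.
    messages: Annotated[list, "add_messages"]
    title: str              # generated 2-4 word conversation title
    thread_id: str          # unique conversation id used for checkpointing
    user_id: str            # user identifier for tracing/observability
    langfuse_enabled: bool  # opt-in Langfuse tracing (default False)
    generate_title: bool    # whether to generate a title (default True)
```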
State Reducers
The add_messages reducer is crucial for state management. It is used to:
- Append new messages to the existing list
- Update messages with matching IDs
- Preserve conversation history across nodes
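Its merge behavior can be illustrated with a simplified stand-in (langgraph's real add_messages operates on LangChain message objects with generated IDs, not plain dicts):

```python
def add_messages(existing: list, new: list) -> list:
    """Simplified stand-in for langgraph's add_messages reducer:
    append new messages, replacing any whose id already exists."""
    merged = {m["id"]: m for m in existing}
    for m in new:
        merged[m["id"]] = m  # same id -> update in place; new id -> append
    return list(merged.values())

history = [{"id": "1", "content": "Hi"}]
update = [{"id": "1", "content": "Hi (edited)"}, {"id": "2", "content": "Hello!"}]
merged = add_messages(history, update)
print([m["content"] for m in merged])  # → ['Hi (edited)', 'Hello!']
```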
Node Functions
Each node is a Python function that takes the current state and returns a partial state update.
1. Support Bot Node (call_model)
Location: src/copilot/graph.py:210
Purpose: Invokes the LLM with tool bindings to generate responses or tool calls
Process:
- Extract User Query: Gets the latest user message from state
- Search Golden Examples: Finds similar past conversations for context
- Enhance System Prompt: Injects golden examples into the system message
- Bind Tools: Attaches available tools to the LLM
- Invoke LLM: Calls the model with enhanced prompt and conversation history
- Return Response: Adds LLM response (text or tool calls) to state
The system prompt (src/copilot/graph.py:156):
- Prioritizes verified knowledge from golden examples
- Provides tool selection guidance
- Enforces query rewriting for tools
- Sets citation and formatting rules
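The steps above can be sketched as a node function. Everything here is illustrative: the golden-example search and LLM call are stubs, and the real call_model binds tools and invokes an actual model:

```python
SYSTEM_PROMPT = "You are a support copilot."

def search_golden_examples(query: str) -> list:
    # Stub: the real implementation retrieves similar past conversations
    return [f"Q: {query} -> verified answer"]

def llm_invoke(messages: list) -> dict:
    # Stub for llm.bind_tools(...).invoke(...); may return text or tool calls
    return {"role": "assistant", "content": "stubbed response"}

def call_model(state: dict) -> dict:
    user_query = state["messages"][-1]["content"]       # 1. extract user query
    examples = search_golden_examples(user_query)       # 2. search golden examples
    system = SYSTEM_PROMPT + "\n\nExamples:\n" + "\n".join(examples)  # 3. enhance prompt
    response = llm_invoke([{"role": "system", "content": system},
                           *state["messages"]])         # 4-5. bind tools and invoke
    return {"messages": [response]}                     # 6. partial state update

state = {"messages": [{"role": "user", "content": "VPN outage?"}]}
print(call_model(state)["messages"][0]["content"])  # → stubbed response
```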
2. Incident Tools Node (tool_wrapper)
Location: src/copilot/graph.py:266
Purpose: Executes tool calls requested by the LLM
Process:
- Extract Tool Calls: Gets tool name and arguments from LLM response
- Execute Tools: Runs the appropriate tool function
- Stream Status: Updates UI with search progress
- Return Results: Adds tool results to message history
Available tools (src/copilot/tools/__init__.py:14):
- lookup_incident_by_id: Direct ID-based lookup
- search_similar_incidents: Semantic similarity search
- get_incidents_by_application: Application-filtered search
- get_recent_incidents: Time-based filtering
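A minimal sketch of the wrapper pattern (the tool implementation is a stub, and the real node also streams status updates to the UI):

```python
# Hypothetical tool registry; real tools query the incident store.
TOOLS = {
    "lookup_incident_by_id": lambda args: {"id": args["incident_id"], "status": "resolved"},
}

def tool_wrapper(state: dict) -> dict:
    last = state["messages"][-1]
    results = []
    for call in last["tool_calls"]:                   # extract name and arguments
        output = TOOLS[call["name"]](call["args"])    # execute the tool
        results.append({"role": "tool",
                        "tool_call_id": call["id"],
                        "content": str(output)})      # result returned to the LLM
    return {"messages": results}

state = {"messages": [{"role": "assistant", "tool_calls": [
    {"id": "c1", "name": "lookup_incident_by_id", "args": {"incident_id": "INC-42"}}]}]}
print(tool_wrapper(state)["messages"][0]["content"])
```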
3. Title Generation Node (title_generation_node)
Location: src/copilot/graph.py:316
Purpose: Generates a concise title for the conversation
Process:
- Extract Conversation: Collects all messages from state
- Create Summary Prompt: Instructs LLM to generate 2-4 word title
- Invoke LLM: Calls model without tools
- Update State: Adds title to state and streams to UI
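Sketched with a stubbed model call (the real node invokes the LLM without tool bindings and streams the title to the UI):

```python
def summarize(prompt: str) -> str:
    # Stub for a plain, tool-free LLM call
    return "VPN Outage Help"

def title_generation_node(state: dict) -> dict:
    transcript = "\n".join(m["content"] for m in state["messages"])   # collect messages
    prompt = f"Summarize this conversation in 2-4 words:\n{transcript}"
    return {"title": summarize(prompt)}                               # update state

state = {"messages": [{"role": "user", "content": "VPN is down"},
                      {"role": "assistant", "content": "Let me check."}]}
print(title_generation_node(state)["title"])  # → VPN Outage Help
```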
Edge Logic
Direct Edges
Direct edges create automatic transitions between nodes:
- Tools → Support Bot: After tool execution, always return to the LLM for response generation
- Title Generation → End: After generating title, conversation is complete
Conditional Edge (wants_qdrant_tool)
Location: src/copilot/graph.py:286
Purpose: Decides the next node based on LLM response
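The routing logic can be sketched as follows; the returned target node names are assumptions for illustration:

```python
def wants_qdrant_tool(state: dict) -> str:
    """Route to the tools node if the last LLM message requested tool
    calls; otherwise move on (node names here are illustrative)."""
    last = state["messages"][-1]
    if last.get("tool_calls"):
        return "incident_tools"     # execute the requested tools
    return "title_generation"       # no tools needed; wrap up

assert wants_qdrant_tool({"messages": [{"tool_calls": [{"name": "x"}]}]}) == "incident_tools"
assert wants_qdrant_tool({"messages": [{"content": "done"}]}) == "title_generation"
```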
Graph Compilation
Location: src/copilot/graph.py:417
The graph is compiled with PostgreSQL checkpointing:
State Persistence
Checkpointing
The agent uses PostgreSQL for state persistence:
- Connection: Uses the same database as vector storage (VECTOR_DATABASE_URL)
- Configuration: Auto-commit enabled, prepare threshold set to 0
- Thread ID: Each conversation has a unique thread_id for checkpoint retrieval
Invoking with Persistence
Continuing Conversations
LLM Configuration
Model Caching
The agent caches LLM instances to avoid recreating them on every invocation (src/copilot/graph.py:85):
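The pattern can be sketched as follows (function names are illustrative, not the project's actual API):

```python
_llm_cache: dict = {}

def make_llm(provider: str, model: str):
    # Stub standing in for expensive client construction
    return object()

def get_llm(provider: str, model: str):
    key = (provider, model)
    if key not in _llm_cache:                  # construct only on first request
        _llm_cache[key] = make_llm(provider, model)
    return _llm_cache[key]                     # later calls reuse the cached instance

assert get_llm("openai", "gpt-4o") is get_llm("openai", "gpt-4o")  # cached
```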
Dynamic Provider Selection
Function: set_llm_from_config (src/copilot/graph.py:89)
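A hypothetical sketch of provider dispatch; the actual providers and config keys used by set_llm_from_config are not shown in this document:

```python
def set_llm_from_config(config: dict) -> str:
    """Pick an LLM client based on config (provider names are illustrative)."""
    provider = config.get("provider", "openai")
    clients = {
        "openai": "openai-client",        # placeholders for real client objects
        "anthropic": "anthropic-client",
    }
    if provider not in clients:
        raise ValueError(f"Unknown provider: {provider}")
    return clients[provider]

print(set_llm_from_config({"provider": "anthropic"}))  # → anthropic-client
```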
Observability
Langfuse Integration
Optional tracing and observability:
- Lazy Initialization: Handler created only when needed
- Opt-in: Default is langfuse_enabled=False for privacy
- Attribute Propagation: session_id and user_id attached to traces
Stream Updates
The agent streams status updates (such as search progress and the generated title) to the UI.
Next Steps
Workflow
Learn how queries flow through the graph from input to response