Overview
The ZeroEval LangChain integration provides automatic tracing for LangChain and LangGraph applications through a callback handler. The handler captures spans for LLM calls, chains, tools, retrievers, and agents.

Note: the LangChain integration is exported from `zeroeval/langchain`, not the main SDK export.

ZeroEvalCallbackHandler
A callback handler that implements LangChain's `BaseCallbackHandler` interface to trace all operations.
Constructor
Options
- `debug`: Enable debug mode. When `true`, includes `runId` and `parentRunId` in span attributes.
- Metadata filter: Regular expression used to filter LangChain/LangGraph internal metadata properties out of span attributes.
- Concurrent span limit: Maximum number of concurrent spans before warnings are issued. Prevents memory issues in high-throughput scenarios.
- Cleanup interval: Interval in milliseconds for cleaning up orphaned spans (spans that weren't properly closed).
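A construction sketch. Of the options above, only `debug` is named elsewhere on this page; the other option names are not given here, so they are omitted below.

```typescript
// Sketch only: import path from this page; `debug` is the only
// option name confirmed by this page.
import { ZeroEvalCallbackHandler } from "zeroeval/langchain";

const handler = new ZeroEvalCallbackHandler({
  debug: true, // adds runId/parentRunId to span attributes
});
```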
Methods
Cleans up resources including the cleanup timer and any active spans. Call this when you’re done with the handler.
Traced Operations
The callback handler automatically traces the following operations.

LLM Calls
- Methods: `handleLLMStart`, `handleLLMEnd`, `handleLLMError`
- Captures: Model name, prompts, completions, token usage, temperature, and other parameters
- Span kind: `llm`
Chat Model Calls
- Methods: `handleChatModelStart`
- Captures: Messages, model parameters, token usage, tool calls
- Span kind: `llm`
Chains
- Methods: `handleChainStart`, `handleChainEnd`, `handleChainError`
- Captures: Chain inputs, outputs, nested chain execution
- Span kind: `chain`
Tools
- Methods: `handleToolStart`, `handleToolEnd`, `handleToolError`
- Captures: Tool name, input arguments, output results
- Span kind: `tool`
Agents
- Methods: `handleAgentAction`, `handleAgentEnd`
- Captures: Agent actions, tool selection, final outputs
- Span kind: `agent`
Retrievers
- Methods: `handleRetrieverStart`, `handleRetrieverEnd`, `handleRetrieverError`
- Captures: Queries, retrieved documents, document count
- Span kind: `retriever`
Global Callback Handler
Set a callback handler globally to trace all LangChain operations without passing handlers to individual components.

setGlobalCallbackHandler
The callback handler instance to register globally.
clearGlobalHandler

Removes the globally registered handler.
getGlobalHandler

Returns the currently registered global handler, or `undefined` if no handler is set.
Example:
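A sketch of the global-handler lifecycle, assuming these functions are exported from `zeroeval/langchain` alongside the handler (this page does not state their export location):

```typescript
// Sketch based on the API described on this page.
import {
  ZeroEvalCallbackHandler,
  setGlobalCallbackHandler,
  getGlobalHandler,
  clearGlobalHandler,
} from "zeroeval/langchain";

const handler = new ZeroEvalCallbackHandler();
setGlobalCallbackHandler(handler); // all LangChain runs are now traced

console.log(getGlobalHandler() === handler); // true while registered

// ...application code using LangChain/LangGraph...

clearGlobalHandler(); // stop global tracing; getGlobalHandler() is
                      // undefined once cleared, per the docs above
```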
Usage Patterns
Per-Component Callbacks
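A sketch of this pattern; the model class and prompt are illustrative, and the handler rides in LangChain's standard `callbacks` option:

```typescript
// Illustrative sketch: ChatOpenAI stands in for any LangChain component.
import { ChatOpenAI } from "@langchain/openai";
import { ZeroEvalCallbackHandler } from "zeroeval/langchain";

const handler = new ZeroEvalCallbackHandler();

const model = new ChatOpenAI({ model: "gpt-4o-mini" });

// The invocation is traced because the handler is in `callbacks`.
const result = await model.invoke("Say hello", { callbacks: [handler] });
```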
The handler can be passed to any individual component or invocation via LangChain's standard `callbacks` option.

Global Callbacks
Set the handler once at startup (see `setGlobalCallbackHandler` above) for application-wide tracing.

LangGraph Tracing
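A minimal sketch, assuming the `@langchain/langgraph` package alongside the import path from this page; the one-node graph is illustrative:

```typescript
import { StateGraph, MessagesAnnotation } from "@langchain/langgraph";
import { ZeroEvalCallbackHandler } from "zeroeval/langchain";

const handler = new ZeroEvalCallbackHandler();

// Illustrative one-node graph; any compiled graph works the same way.
const graph = new StateGraph(MessagesAnnotation)
  .addNode("respond", async (state) => state)
  .addEdge("__start__", "respond")
  .addEdge("respond", "__end__")
  .compile();

// The handler rides along in the standard `callbacks` config, so the
// whole state-machine execution is traced.
await graph.invoke({ messages: [] }, { callbacks: [handler] });
```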
Attach the handler when invoking a compiled graph to trace LangGraph state-machine executions.

Span Attributes
Spans created by the callback handler include:

- LangChain component type: `"llm"`, `"chain"`, `"tool"`, `"agent"`, or `"retriever"`
- For LLM operations, set to `"llm"`
- For LLM operations, defaults to `"openai"`
- Service identifier, typically matches the provider
- Model name for LLM/chat operations
- Chat messages for chat model operations
- Temperature parameter (if provided)
- Max tokens parameter (if provided)
- Tool definitions (if provided)
- Prompt tokens consumed
- Completion tokens generated
- Tokens per second (calculated from duration and token usage)
- LangChain run ID (only when `debug: true`)
- Parent run ID for nested operations (only when `debug: true`)

Performance Optimizations
The callback handler includes several optimizations:

- Object pooling: Reuses metadata objects to reduce allocations
- Lazy serialization: Defers JSON serialization until needed
- Orphan cleanup: Automatically closes spans that weren’t properly ended
- Metadata filtering: Excludes internal LangChain/LangGraph properties via regex
- Concurrent span limits: Prevents memory issues in high-throughput scenarios
Related Documentation
- `wrap()` - Auto-wrap AI clients
- `wrapOpenAI()` - OpenAI-specific wrapper
- `wrapVercelAI()` - Vercel AI SDK wrapper
- LangChain Documentation
- LangGraph Documentation