Auto-detection with wrap()
The wrap() function automatically detects the client type and applies the appropriate wrapper:
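The detection itself can be pictured as a shape check on the client object. The sketch below is purely illustrative: the real SDK's detection logic, and the `__zeWrapped` marker used here, are assumptions, not its actual implementation.

```typescript
// Hypothetical sketch of client-type detection; the real SDK's logic may differ.
type Wrapped<T> = T & { __zeWrapped?: true };

function detectClientType(client: unknown): "openai" | "unknown" {
  // An OpenAI client exposes `chat.completions.create`.
  const c = client as { chat?: { completions?: { create?: unknown } } };
  if (typeof c?.chat?.completions?.create === "function") return "openai";
  return "unknown";
}

function wrap<T extends object>(client: T): Wrapped<T> {
  switch (detectClientType(client)) {
    case "openai":
      // The OpenAI proxy wrapper would be applied here.
      return Object.assign(client, { __zeWrapped: true as const });
    default:
      return client;
  }
}
```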
If you haven't called `ze.init()` and `ZEROEVAL_API_KEY` is set in your environment, the SDK initializes automatically the first time you use `wrap()`.

Supported integrations
- OpenAI - Wrap OpenAI clients for automatic tracing
- Vercel AI SDK - Wrap AI SDK functions for automatic tracing
- LangChain - Use callback handlers for LangChain and LangGraph
What gets traced
All integrations automatically capture:
- Input and output data - Prompts, messages, and completions
- Token usage - Prompt tokens, completion tokens, and total tokens
- Latency metrics - Time to first token (streaming), total duration
- Throughput - Characters or tokens per second
- Model parameters - Temperature, max tokens, top-p, etc.
- Errors - Exception messages and stack traces
- Metadata - Tags, custom attributes, and ZeroEval prompt metadata
Integration patterns
Proxy-based wrapping
OpenAI and Vercel AI SDK integrations use JavaScript proxies to intercept API calls:
- Non-invasive - Original client behavior is preserved
- Type-safe - Full TypeScript support with original types
- Double-wrap protection - Safe to wrap multiple times
- Automatic initialization - Uses `ZEROEVAL_API_KEY` from the environment
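As a rough illustration of the proxy pattern described above (the `WRAPPED` symbol, the timing bookkeeping, and the function name are hypothetical, not the SDK's real internals):

```typescript
// Illustrative sketch of proxy-based wrapping with double-wrap protection.
const WRAPPED = Symbol("ze.wrapped");

function wrapWithProxy<T extends object>(client: T): T {
  // Double-wrap protection: return the client unchanged if already wrapped.
  if ((client as Record<symbol, unknown>)[WRAPPED]) return client;

  return new Proxy(client, {
    get(target, prop, receiver) {
      if (prop === WRAPPED) return true;
      const value = Reflect.get(target, prop, receiver);
      if (typeof value === "function") {
        // Intercept method calls; the original behavior is preserved.
        return (...args: unknown[]) => {
          const start = Date.now();
          try {
            return value.apply(target, args);
          } finally {
            // Here the SDK would attach duration, inputs, and outputs to a span.
            void (Date.now() - start);
          }
        };
      }
      return value;
    },
  });
}
```

Because the proxy only intercepts property access, the original client keeps its TypeScript types, and the `WRAPPED` check makes repeated wrapping a no-op.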
Callback handlers
LangChain uses the callback handler pattern:
- Set globally with `setGlobalCallbackHandler()`, or pass per-invocation in the `callbacks` option
- Traces chains, LLMs, tools, retrievers, and agents
- Supports LangGraph workflows
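The global-versus-per-invocation dispatch can be modeled in a few lines. Everything here is a simplified stand-in: `setGlobalCallbackHandler` mirrors the function name used above, but the `Handler` type and `invokeChain` are invented for illustration and are not LangChain's API.

```typescript
// Simplified model of callback-handler dispatch (not the real LangChain integration).
type Handler = (event: string) => void;

let globalHandler: Handler | undefined;

function setGlobalCallbackHandler(h: Handler): void {
  globalHandler = h;
}

function invokeChain(input: string, options?: { callbacks?: Handler[] }): string {
  // Per-invocation callbacks take precedence; otherwise fall back to the global handler.
  const handlers = options?.callbacks ?? (globalHandler ? [globalHandler] : []);
  for (const h of handlers) h(`chain_start:${input}`);
  const output = input.toUpperCase(); // stand-in for the real chain's work
  for (const h of handlers) h(`chain_end:${output}`);
  return output;
}
```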
Streaming support
All integrations support streaming responses:
- Captures time to first token as a latency metric
- Tracks throughput as characters per second
- Automatically accumulates full response for tracing
- Yields chunks transparently to your code
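A transparent stream wrapper along these lines might look like the following sketch (the function name and stats shape are assumptions; the real instrumentation differs):

```typescript
// Sketch of transparent stream instrumentation: measure time to first token,
// accumulate the full response, and pass chunks through unchanged.
async function* traceStream(
  stream: AsyncIterable<string>,
  onDone: (stats: { firstTokenMs: number; fullText: string }) => void,
): AsyncGenerator<string> {
  const start = Date.now();
  let firstTokenMs = -1;
  let fullText = "";
  for await (const chunk of stream) {
    if (firstTokenMs < 0) firstTokenMs = Date.now() - start; // time to first token
    fullText += chunk; // accumulate full response for the trace
    yield chunk; // yield transparently to the caller
  }
  onDone({ firstTokenMs, fullText });
}
```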
ZeroEval metadata extraction
Integrations automatically detect and process ZeroEval prompt metadata embedded in system messages:
- Extract metadata (`task`, `prompt_version_id`, `variables`)
- Strip the `<zeroeval>` tags from the message
- Interpolate variables like `{{name}}`
- Attach metadata to the span for filtering and analysis
- Look up bound models from prompt versions
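The extraction steps above can be sketched with a simple tag format. The `<zeroeval>` tag name and `{{name}}` syntax come from this page; the key=value body format and the `processSystemMessage` helper are assumptions made for illustration.

```typescript
// Hypothetical sketch of metadata extraction; the tag body format is illustrative,
// not the SDK's exact wire format.
function processSystemMessage(
  message: string,
  variables: Record<string, string>,
): { cleaned: string; metadata: Record<string, string> } {
  const metadata: Record<string, string> = {};
  const cleaned = message
    // Strip <zeroeval>…</zeroeval> blocks, keeping their key=value pairs as metadata.
    .replace(/<zeroeval>([\s\S]*?)<\/zeroeval>/g, (_, body: string) => {
      for (const line of body.trim().split("\n")) {
        const [key, value] = line.split("=");
        if (key && value) metadata[key.trim()] = value.trim();
      }
      return "";
    })
    // Interpolate {{name}}-style variables.
    .replace(/\{\{(\w+)\}\}/g, (_, name: string) => variables[name] ?? "");
  return { cleaned: cleaned.trim(), metadata };
}
```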