Overview
The @contextcompany/otel package integrates with Vercel’s OpenTelemetry implementation to automatically capture:
- LLM calls and streaming responses
- Tool/function calls and results
- Token usage and costs
- Latency and performance metrics
- Errors and exceptions
Installation
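A typical install command, assuming the package is published on npm under the name used above:

```shell
npm install @contextcompany/otel
```
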
Set your API key
Add your Observatory API key to your environment variables:
.env.local
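A sketch of the entry; the key value below is a placeholder, not a real key format:

```shell
# .env.local — placeholder value; use your real key from the Observatory dashboard
TCC_API_KEY=your-observatory-api-key
```
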
Get your API key from the Observatory dashboard.
Create instrumentation file
Create or update instrumentation.ts in your project root:
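A minimal sketch of the file, assuming registerOTelTCC (named in the Configuration section below) is the package’s entry point and reads TCC_API_KEY from the environment by default:

```typescript
// instrumentation.ts
import { registerOTelTCC } from '@contextcompany/otel';

// Next.js calls this exported register() function once at server startup.
export function register() {
  registerOTelTCC();
}
```
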
For Next.js 15+, this file should be at the root of your project. For older versions, place it in the src directory if you’re using one.
Enable instrumentation in Next.js
Update your next.config.js to enable instrumentation:
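A sketch of the relevant setting (the flag name comes from the Troubleshooting section below; in recent Next.js versions the instrumentation hook is enabled by default):

```javascript
// next.config.js
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // Required for instrumentation.ts to be picked up on older Next.js versions.
    instrumentationHook: true,
  },
};

module.exports = nextConfig;
```
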
Configuration
The registerOTelTCC function accepts an optional configuration object:
instrumentation.ts
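A sketch of the configuration object’s shape. The option names below (apiKey, endpoint, debug, local, spanProcessor, otel) are assumptions inferred from the option descriptions that follow, not confirmed API:

```typescript
// instrumentation.ts — option names are illustrative assumptions
import { registerOTelTCC } from '@contextcompany/otel';

export function register() {
  registerOTelTCC({
    apiKey: process.env.TCC_API_KEY,  // overrides the TCC_API_KEY env var
    // endpoint: '...',               // custom ingestion endpoint (self-hosted)
    debug: process.env.NODE_ENV === 'development', // verbose instrumentation logs
    // local: true,                   // local mode: WebSocket server, no cloud export
    // spanProcessor: myProcessor,    // extra OTel span processor run alongside Observatory's
    // otel: { serviceName: 'my-app' } // passed through to Vercel's registerOTel
  });
}
```
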
Configuration Options
Your Observatory API key. Overrides the TCC_API_KEY environment variable.
Custom ingestion endpoint URL. Defaults to Observatory’s production endpoint.
Enable debug logging to see detailed instrumentation information in the console.
Enable local mode for development. Starts a WebSocket server for real-time trace viewing without sending data to the cloud.
Optional OpenTelemetry span processor to run alongside Observatory’s processor.
Additional configuration options passed to Vercel’s registerOTel function. See the Vercel OTel docs for available options.
Usage Examples
Basic AI SDK Usage
No changes are needed to your existing AI SDK code. A standard route (app/api/chat/route.ts) is traced automatically, capturing:
- LLM request and response
- Token usage (prompt, completion, total)
- Latency metrics
- Model information
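A sketch of such a route, assuming the Vercel AI SDK v4 API (generateText) and the OpenAI provider; the model name and request shape are illustrative:

```typescript
// app/api/chat/route.ts — unchanged AI SDK code; tracing happens automatically
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  // This call is captured by the instrumentation: request/response,
  // token usage, latency, and model metadata.
  const { text } = await generateText({
    model: openai('gpt-4o-mini'),
    prompt,
  });

  return Response.json({ text });
}
```
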
With Tools
Tool calls are automatically captured:
- Tool definitions sent to the LLM
- Tool call arguments
- Tool execution results
- Tool execution time
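A sketch of a route with a tool, assuming AI SDK v4’s tool() helper with a Zod schema; the getWeather tool and its stubbed result are illustrative:

```typescript
// app/api/chat/route.ts — tool calls are traced along with the LLM call
import { openai } from '@ai-sdk/openai';
import { generateText, tool } from 'ai';
import { z } from 'zod';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const { text } = await generateText({
    model: openai('gpt-4o-mini'),
    prompt,
    maxSteps: 2, // allow a follow-up step so the model can use the tool result
    tools: {
      // The definition, call arguments, result, and execution time are captured.
      getWeather: tool({
        description: 'Get the current weather for a city',
        parameters: z.object({ city: z.string() }),
        execute: async ({ city }) => ({ city, tempC: 21 }), // stubbed result
      }),
    },
  });

  return Response.json({ text });
}
```
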
Streaming Responses
Streaming is fully supported (app/api/chat/route.ts):
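A sketch of a streaming route, assuming AI SDK v4 (streamText and toDataStreamResponse):

```typescript
// app/api/chat/route.ts — streaming responses are traced end to end
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // The stream is instrumented as it flows; token usage and latency
  // are recorded once the stream completes.
  const result = streamText({
    model: openai('gpt-4o-mini'),
    messages,
  });

  return result.toDataStreamResponse();
}
```
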
Local Development Mode
For local development, enable local mode to view traces without sending them to the cloud (instrumentation.ts):
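A sketch, where the local option name is an assumption based on the configuration description above:

```typescript
// instrumentation.ts — 'local' option name is an illustrative assumption
import { registerOTelTCC } from '@contextcompany/otel';

export function register() {
  registerOTelTCC({
    local: true, // starts a local WebSocket server; no data leaves your machine
    // apiKey: process.env.TCC_API_KEY, // uncomment to also send traces to the cloud
  });
}
```
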
Local mode can be used alongside cloud mode by providing an API key. Traces will be sent to both destinations.
Environment Variables
Your Observatory API key. Get it from the Observatory dashboard.
Custom ingestion endpoint URL. Only needed if using a self-hosted instance.
Set to 1 or true to enable debug logging.
Submitting Feedback
You can submit user feedback for specific runs:
Troubleshooting
Traces not appearing
Verify instrumentation is enabled
Ensure experimental.instrumentationHook is set to true in your next.config.js.
Check API key
Verify your TCC_API_KEY is set correctly in your environment variables. Enable debug mode to see connection logs:
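For example, a sketch where the debug option name is an assumption (the Environment Variables section above suggests an env-var toggle also exists):

```typescript
// instrumentation.ts — 'debug' option name is an illustrative assumption
import { registerOTelTCC } from '@contextcompany/otel';

export function register() {
  registerOTelTCC({ debug: true }); // logs exporter connection details to the console
}
```
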
Runtime compatibility
The instrumentation only works in the Node.js runtime. If you’re using Edge runtime, traces won’t be captured. Check your route files:
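A route can pin the Node.js runtime with Next.js route segment config, for example:

```typescript
// app/api/chat/route.ts — force the Node.js runtime for this route
export const runtime = 'nodejs'; // 'edge' would bypass the instrumentation
```
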
Development API key
If you’re using a development API key (prefix dev_), ensure you’re not pointing to the production endpoint. The package automatically routes dev keys to the dev environment.
Performance impact
The instrumentation uses OpenTelemetry’s batching span processor, which:
- Batches traces before sending
- Sends traces asynchronously
- Has minimal impact on request latency (typically less than 5ms)
Vercel Deployment
The instrumentation works seamlessly on Vercel:
- Ensure your TCC_API_KEY is added to your Vercel project’s environment variables
- The instrumentation hook is automatically enabled during deployment
- Traces will appear in Observatory for all production traffic
Next Steps
Widget
Learn about the real-time visualization widget
Custom Instrumentation
Add manual instrumentation for custom logic
API Reference
Complete API documentation
Feedback
Learn how to collect and analyze user feedback
