Prerequisites
Before you begin, make sure you have:
- Node.js 18+ installed
- An existing Next.js project (or create one with `npx create-next-app@latest`)
- The Vercel AI SDK installed (`npm install ai`)
- An LLM provider API key (OpenAI, Anthropic, etc.)
Step 1: Install Dependencies
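This step is typically a single npm command. The Observatory package name isn't given in this guide, so only the OpenTelemetry side is shown here (a sketch):

```shell
# Also install the Observatory package itself (name per the Observatory docs).
npm install @vercel/otel @opentelemetry/api
```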
Install the required Observatory packages along with OpenTelemetry dependencies. The `@vercel/otel` package provides Next.js-specific OpenTelemetry utilities that Observatory uses for instrumentation.

Step 2: Add Instrumentation to Next.js
Create an `instrumentation.ts` file in the root directory of your project (or inside the `src` folder if you’re using one). Next.js uses this file to set up observability before your application starts.
instrumentation.ts
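A sketch of what this file might contain. The package and function names below are assumptions (substitute the actual Observatory import); only the `local: true` option comes from this guide:

```typescript
// instrumentation.ts: Next.js calls register() once, before the app starts.
export async function register() {
  // Instrument only the Node.js runtime; the edge runtime is not supported.
  if (process.env.NEXT_RUNTIME === 'nodejs') {
    // Hypothetical package/function names; use the real Observatory import here.
    const { registerObservatory } = await import('@context-co/observatory')
    registerObservatory({
      local: true, // local-first mode: offline, no account or API key needed
    })
  }
}
```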
Setting `local: true` enables local-first mode, with no account or API key required. Observatory will run completely offline and display traces in your browser via the widget.

More instrumentation options
You can customize the instrumentation with additional options. For production use with The Context Company backend, set your API key as an environment variable:
.env
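The variable name below is an assumption; match whatever key your Observatory configuration actually reads:

```
# .env (variable name is illustrative)
OBSERVATORY_API_KEY=your-api-key-here
```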
Step 3: Add the Visualization Widget
Add the Observatory widget to your root layout. This provides the real-time visualization overlay in your browser.

app/layout.tsx
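A sketch of the layout change using Next.js's `next/script` component. The widget URL is a placeholder, not the real bundle path:

```typescript
// app/layout.tsx
import Script from 'next/script'

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <body>
        {children}
        {/* Placeholder URL: substitute the actual Observatory widget bundle */}
        <Script src="https://unpkg.com/..." strategy="afterInteractive" />
      </body>
    </html>
  )
}
```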
The widget is loaded from unpkg.com for convenience. For production use or air-gapped environments, you can self-host the widget bundle.
Step 4: Enable Telemetry in AI SDK Calls
As of AI SDK v5, telemetry is experimental and requires the `experimental_telemetry` flag. Add this flag to every AI SDK call you want to instrument.
Basic Example
app/api/chat/route.ts
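A minimal handler might look like the following. The model choice and response helper are illustrative (this assumes an AI SDK v5 route called from `useChat`); the `experimental_telemetry` flag is the part Observatory needs:

```typescript
// app/api/chat/route.ts
import { openai } from '@ai-sdk/openai'
import { convertToModelMessages, streamText } from 'ai'

export async function POST(req: Request) {
  const { messages } = await req.json()

  const result = streamText({
    model: openai('gpt-4o'),
    messages: convertToModelMessages(messages),
    // Without this flag, the AI SDK emits no telemetry and Observatory sees nothing.
    experimental_telemetry: { isEnabled: true },
  })

  return result.toTextStreamResponse()
}
```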
With Session and Run Tracking
For better observability, track sessions (entire conversations) and runs (individual AI calls):

app/api/chat/route.ts
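The AI SDK records anything placed under `experimental_telemetry.metadata` on its spans, so session and run IDs can be attached with a small helper. A runnable sketch of just the options object (the route wiring is omitted; the `sessionId`/`runId` key names follow the metadata listed under "What You'll See"):

```typescript
import { randomUUID } from 'node:crypto'

// Telemetry options for one AI SDK call: sessionId spans the whole
// conversation, runId is unique to this individual call.
function telemetryOptions(sessionId: string) {
  return {
    isEnabled: true,
    metadata: {
      sessionId,
      runId: randomUUID(),
    },
  }
}

// Spread into a call, e.g.:
//   streamText({ model, messages, experimental_telemetry: telemetryOptions(id) })
const opts = telemetryOptions('session-abc')
console.log(opts.isEnabled, opts.metadata.sessionId)
// → true session-abc
```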
With Tool Calling
Observatory automatically traces tool calls when you use AI SDK tools:

app/api/chat/route.ts
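A sketch with one made-up tool, assuming AI SDK v5's `tool()` helper with zod for the input schema (the weather tool and its payload are hypothetical):

```typescript
// app/api/chat/route.ts: tool-calling sketch
import { openai } from '@ai-sdk/openai'
import { convertToModelMessages, streamText, tool } from 'ai'
import { z } from 'zod'

export async function POST(req: Request) {
  const { messages } = await req.json()

  const result = streamText({
    model: openai('gpt-4o'),
    messages: convertToModelMessages(messages),
    tools: {
      // Hypothetical tool; Observatory traces its arguments and results.
      getWeather: tool({
        description: 'Get the current weather for a city',
        inputSchema: z.object({ city: z.string() }),
        execute: async ({ city }) => ({ city, temperatureC: 21 }),
      }),
    },
    experimental_telemetry: { isEnabled: true },
  })

  return result.toTextStreamResponse()
}
```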
Observatory will automatically capture:
- Tool definitions
- Tool arguments
- Tool results
- Execution time
- Any errors
Step 5: Test Your Setup
Start your Next.js development server, then:
- Open your application in a browser
- Trigger an AI interaction (chat message, etc.)
- Look for the Observatory widget in the bottom-right corner
- Click the widget to see your AI traces in real-time
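Starting the dev server is the standard command:

```shell
npm run dev
```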

What You’ll See
The Observatory widget displays:
- Request details — Model name, prompt, system message
- Response data — LLM output, finish reason
- Token usage — Input tokens (cached/uncached), output tokens
- Timing — Total duration, time to first token
- Tool calls — Arguments, results, execution time
- Metadata — Session ID, run ID, custom metadata
Troubleshooting
No traces appearing in the widget
Check these common issues:
- Verify `instrumentation.ts` is in the correct location (root or `src/`)
- Ensure `experimental_telemetry.isEnabled` is set to `true`
- Restart your Next.js dev server after adding instrumentation
- Check browser console for any errors
- Verify the widget script is loading (check Network tab)
instrumentation.ts
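As a quick sanity check, `instrumentation.ts` can log when it runs. A temporary sketch; keep your real registration code in place of the trailing comment:

```typescript
// instrumentation.ts: temporary logging to confirm the hook is picked up
export async function register() {
  console.log('[observatory] register() called, runtime:', process.env.NEXT_RUNTIME)
  // ...your existing Observatory/OpenTelemetry setup
}
```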
Widget not appearing
Possible causes:
- Script tag is missing from layout
- CSP (Content Security Policy) is blocking the script
- Ad blocker is interfering
Instrumentation not working in production
Important notes:
- Make sure `instrumentation.ts` is included in your build
- For Vercel deployments, instrumentation is automatically supported
- For other hosting providers, ensure the Node.js runtime is used
- Check that the `NEXT_RUNTIME` check isn’t being stripped by your bundler
Only seeing some AI calls, not all
Common reasons:
- Missing `experimental_telemetry.isEnabled: true` on some calls
- Some calls happen before instrumentation is registered
- Calls are made from the edge runtime (not currently supported)
Next Steps
Now that you have Observatory running, explore these topics:

Session & Run Tracking
Learn how to track conversations and individual AI calls
User Feedback
Collect and link user feedback to specific agent runs
Custom Metadata
Add custom metadata to filter and group traces
Configuration
Configure Observatory for different environments
Example Application
For a complete working example, check out the Next.js AI SDK example in the Observatory repository:

Next.js + AI SDK Example
View a complete example with weather agent, tool calling, and feedback integration
Need Help?
If you run into issues:
- Check the troubleshooting guide
- Open an issue on GitHub
- Join our community (coming soon)
