AI Agent Observability

Monitor, trace, and debug your AI agents with local-first visualization. No account required.

Quick start

Get Observatory running in your AI application in minutes

1. Install the package

Choose your framework and install the corresponding Observatory package.
npm install @contextcompany/otel @vercel/otel @opentelemetry/api
2. Configure instrumentation

Add the Observatory instrumentation to your application. For Next.js with AI SDK, create an instrumentation.ts file:
instrumentation.ts
export async function register() {
  if (process.env.NEXT_RUNTIME === "nodejs") {
    const { registerOTelTCC } = await import("@contextcompany/otel/nextjs");
    registerOTelTCC({ local: true });
  }
}
Setting local: true enables local-first mode with no account or API key required.
3. Add the visualization widget

Include the Observatory widget in your application layout:
app/layout.tsx
import Script from "next/script";

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <head>
        <Script
          crossOrigin="anonymous"
          src="//unpkg.com/@contextcompany/widget/dist/auto.global.js"
        />
      </head>
      <body>{children}</body>
    </html>
  );
}
4. Enable telemetry in your AI calls

Add the telemetry flag to your AI SDK calls:
route.ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const result = await generateText({
  model: openai("gpt-4"),
  prompt: "What is AI observability?",
  experimental_telemetry: { isEnabled: true },
});
Once configured, you’ll see a real-time visualization overlay in your browser showing:
  • AI model requests and responses
  • Token usage and costs
  • Tool calls and execution traces
  • Performance metrics
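Token cost, for example, is just arithmetic over the usage counts the SDK reports. A minimal sketch, assuming a usage object shaped like the AI SDK's `{ promptTokens, completionTokens }`; the per-token prices below are made up for illustration:

```typescript
// Hypothetical sketch: derive a dollar cost from token counts.
interface Usage {
  promptTokens: number;
  completionTokens: number;
}

export function estimateCostUSD(
  usage: Usage,
  pricePerMInput = 30, // hypothetical $ per 1M input tokens
  pricePerMOutput = 60, // hypothetical $ per 1M output tokens
): number {
  return (
    (usage.promptTokens / 1_000_000) * pricePerMInput +
    (usage.completionTokens / 1_000_000) * pricePerMOutput
  );
}
```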

Explore by framework

Observatory integrates seamlessly with popular AI frameworks

Vercel AI SDK

OpenTelemetry integration for Next.js applications with local-first visualization

Claude Agent SDK

Built-in instrumentation for Anthropic’s Claude agents with streaming support

Mastra

Native observability integration for the Mastra AI framework

Custom Agents

Manual instrumentation SDK for any TypeScript or Python agent

Key features

Everything you need for AI agent observability

Local-first mode

Run completely offline with no account, API key, or external dependencies required

Real-time widget

In-browser visualization overlay showing traces, metrics, and performance data

OpenTelemetry

Standards-based instrumentation built on OpenTelemetry specifications

Multi-language SDKs

TypeScript and Python SDKs with consistent APIs across platforms

User feedback

Collect and associate user feedback with specific agent runs

Session tracking

Group related interactions and maintain context across conversations
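With the AI SDK, session grouping can build on the telemetry settings shown in step 4: the `functionId` and `metadata` fields are part of the SDK's `experimental_telemetry` options and are attached to the spans for that call. A minimal sketch of a helper that labels calls consistently; the helper name and label values are hypothetical:

```typescript
// Hypothetical sketch: build a reusable experimental_telemetry payload
// so related calls share a session label in the trace view.
export function telemetryFor(functionId: string, sessionId?: string) {
  return {
    isEnabled: true,
    functionId,
    ...(sessionId ? { metadata: { sessionId } } : {}),
  };
}

// Usage with generateText:
// const result = await generateText({
//   model: openai("gpt-4"),
//   prompt: "...",
//   experimental_telemetry: telemetryFor("faq-answer", "session-123"),
// });
```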

Ready to start observing?

Get Observatory up and running in your application in less than 5 minutes.

Get Started
