The @contextcompany/otel package provides automatic instrumentation for the Vercel AI SDK via OpenTelemetry. It captures traces from AI SDK spans and exports them to The Context Company for observability.

Installation

npm install @contextcompany/otel

Setup

Register the TCC span processor in your OpenTelemetry configuration:
// instrumentation.ts (or instrumentation.node.ts)
import { registerOTelTCC } from '@contextcompany/otel';

export async function register() {
  registerOTelTCC();
}
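
Note that AI SDK telemetry is opt-in per call: the Vercel AI SDK only emits OpenTelemetry spans when experimental_telemetry is enabled. If no traces appear after registering the processor, check that your AI SDK calls enable it (a sketch; functionId is an optional illustrative label):

```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const result = await generateText({
  model: openai('gpt-4'),
  prompt: 'Explain quantum computing in simple terms',
  // Telemetry is disabled by default in the AI SDK; enable it per call
  // so the TCC span processor has spans to capture.
  experimental_telemetry: {
    isEnabled: true,
    functionId: 'explain-topic', // optional label attached to the span
  },
});
```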

API Reference

registerOTelTCC

Registers The Context Company’s OpenTelemetry span processor to automatically capture AI SDK telemetry.
function registerOTelTCC(options?: RegisterOpts): void
options (RegisterOpts, optional)
Configuration options for the span processor.

TCCSpanProcessor

The OpenTelemetry span processor class that filters and exports AI SDK spans.
class TCCSpanProcessor implements SpanProcessor {
  constructor(options?: {
    apiKey?: string;
    url?: string;
    baseProcessor?: SpanProcessor;
  });
  onStart(span: Span, parentContext: Context): void;
  onEnd(span: ReadableSpan): void;
  shutdown(): Promise<void>;
  forceFlush(): Promise<void>;
}
options (object, optional)
Configuration options for the span processor.
The processor automatically filters spans, capturing only those emitted by the Vercel AI SDK (span names prefixed with ai.).
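
If you manage your own tracer provider rather than calling registerOTelTCC, the processor can be wired in directly. A hypothetical sketch (the exact provider API depends on your @opentelemetry/sdk-trace-node version; older versions use provider.addSpanProcessor instead of the spanProcessors constructor option):

```typescript
import { NodeTracerProvider } from '@opentelemetry/sdk-trace-node';
import { TCCSpanProcessor } from '@contextcompany/otel';

const provider = new NodeTracerProvider({
  spanProcessors: [
    new TCCSpanProcessor({
      apiKey: process.env.TCC_API_KEY,
      // url: '...', // optional custom ingestion endpoint
    }),
  ],
});

// Make this provider the global tracer provider so AI SDK spans reach it.
provider.register();
```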

submitFeedback

Submit user feedback for a specific run.
function submitFeedback(params: {
  runId: string;
  score?: "thumbs_up" | "thumbs_down";
  text?: string;
}): Promise<Response | undefined>
params (object, required)
The feedback payload.
Returns Promise<Response | undefined>: the fetch Response object if the request succeeds, or undefined if the request fails or a validation error occurs.
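
Because submitFeedback resolves to undefined on failure instead of throwing, callers should check the return value. A minimal sketch (the runId value is a placeholder):

```typescript
import { submitFeedback } from '@contextcompany/otel';

const res = await submitFeedback({
  runId: 'run_123', // placeholder; use the run ID from your trace
  score: 'thumbs_down',
  text: 'Answer was off-topic',
});

// undefined means the request failed or did not validate; nothing was thrown.
if (!res || !res.ok) {
  console.warn('Feedback was not recorded');
}
```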

Environment Variables

TCC_API_KEY (string, required)
Your Context Company API key. Get one from the dashboard.

TCC_URL (string)
Custom ingestion endpoint URL. Overrides the default endpoint.

TCC_DEBUG (boolean)
Enable debug logging. Set to true or 1.
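
For example, set the variables before starting your app (the key and URL values below are placeholders):

```shell
# Required: authenticates span export to The Context Company
export TCC_API_KEY="<your-api-key>"

# Optional: override the default ingestion endpoint (placeholder URL)
export TCC_URL="https://your-ingest-endpoint.example"

# Optional: verbose logging from the span processor
export TCC_DEBUG=1
```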

Usage Example

import { registerOTelTCC, submitFeedback } from '@contextcompany/otel';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Register once at app startup
registerOTelTCC({ debug: true });

// Use AI SDK normally - traces are captured automatically
const result = await generateText({
  model: openai('gpt-4'),
  prompt: 'Explain quantum computing in simple terms',
});

console.log(result.text);

// Optionally submit user feedback
await submitFeedback({
  runId: result.runId,
  score: 'thumbs_up',
  text: 'Great explanation!',
});

How It Works

  1. The TCCSpanProcessor registers with OpenTelemetry’s tracing system
  2. It filters spans to only process those from Vercel AI SDK (prefixed with ai.)
  3. Captured spans are batched and exported to The Context Company’s ingestion endpoint
  4. Traces appear in your dashboard with full observability into model calls, tokens, costs, and latency
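
The filtering rule in step 2 can be sketched as a simple name-prefix check (an illustrative stand-in, not the package's actual internals; AI SDK span names such as ai.generateText follow this convention):

```typescript
// Only spans whose names start with "ai." are forwarded for export;
// everything else (HTTP, DB, etc.) is ignored by the processor.
function isAiSdkSpan(spanName: string): boolean {
  return spanName.startsWith('ai.');
}

console.log(isAiSdkSpan('ai.generateText'));        // true
console.log(isAiSdkSpan('ai.streamText.doStream')); // true
console.log(isAiSdkSpan('http.request'));           // false
```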

Next Steps

AI SDK Integration

Learn more about AI SDK integration patterns

View Traces

View your traces in the dashboard
