Observatory provides multiple TypeScript packages for different instrumentation approaches and frameworks.
Package Overview
@contextcompany/otel: OpenTelemetry integration for Vercel AI SDK
@contextcompany/custom: Manual instrumentation for custom agents
@contextcompany/claude: Anthropic Claude Agent SDK integration
@contextcompany/mastra: Mastra framework integration
@contextcompany/widget: Local-first UI widget for Next.js
@contextcompany/api: Core API utilities (feedback, config)
@contextcompany/otel
OpenTelemetry integration for Vercel AI SDK with automatic span collection and Next.js support.
Installation
npm install @contextcompany/otel @opentelemetry/api
Exports
Main entry point: TCCSpanProcessor, submitFeedback
import { TCCSpanProcessor } from '@contextcompany/otel';

type TCCSpanProcessorOptions = {
  apiKey?: string;               // TCC API key (defaults to TCC_API_KEY env var)
  otlpUrl?: string;              // Custom endpoint URL
  baseProcessor?: SpanProcessor; // Custom base processor
  debug?: boolean;               // Enable debug logging
};

const processor = new TCCSpanProcessor({
  apiKey: 'your-api-key',
  debug: true
});
Next.js Integration
import { registerOTelTCC } from '@contextcompany/otel/nextjs';

export function register() {
  registerOTelTCC({
    apiKey: process.env.TCC_API_KEY, // Optional if TCC_API_KEY is set
    debug: true,
    local: true // Enable local widget mode
  });
}
Registers OpenTelemetry with the TCC span processor for Next.js applications. Options:
apiKey: TCC API key. Defaults to the TCC_API_KEY environment variable.
otlpUrl: Custom OTLP endpoint URL. Auto-detects prod/dev based on the API key.
baseProcessor: Custom span processor for additional processing.
debug: Enable debug logging to console.
local: Enable local widget mode with a WebSocket server.
Any additional Vercel OTEL configuration options are passed through.
Usage with Vercel AI SDK
Next.js App Router
With the processor registered in instrumentation.ts, AI SDK calls in a route handler are captured. Note that the Vercel AI SDK emits telemetry only when experimental_telemetry is enabled:

import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'),
    messages,
    experimental_telemetry: { isEnabled: true }
  });

  return result.toDataStreamResponse();
}
Environment Variables
TCC_API_KEY: Your Observatory API key. Keys starting with dev_ route to development.
TCC_URL: Override the default ingestion endpoint URL.
@contextcompany/custom
Manual instrumentation SDK for custom TypeScript agents. Two patterns are supported:
Builder pattern — instrument as you go
Factory pattern — send pre-built data
Installation
npm install @contextcompany/custom
Core Exports
import { run } from '@contextcompany/custom';

const r = run({
  runId?: string,           // Custom run ID (auto-generated if omitted)
  sessionId?: string,       // Group runs into sessions
  conversational?: boolean, // Mark as multi-turn conversation
  timeout?: number          // Auto-flush timeout in ms (default: 20 min)
});
// Builder methods
r.prompt(text: string | { user_prompt: string; system_prompt?: string });
r.response(text: string);
r.metadata(...entries: Record<string, string>[]);
r.status(code: number, message?: string);
r.endTime(date: Date);

// Child objects
const step = r.step(stepId?: string);
const toolCall = r.toolCall(name?: string);

// Finalize
await r.end();      // Success
await r.error(msg); // Error
Step Builder
const step = r.step({ stepId?: string, startTime?: Date });

step.prompt(text: string);
step.response(text: string);
step.model(config: string | { requested?: string; used?: string });
step.finishReason(reason: string);
step.tokens({
  uncached?: number,
  cached?: number,
  completion?: number
});
step.cost(amount: number); // USD
step.toolDefinitions(defs: string | unknown[]);
step.status(code: number, message?: string);
step.endTime(date: Date);

step.end();      // Success
step.error(msg); // Error
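step.cost expects a USD amount. One way to derive it from the token breakdown above is a small pricing helper; the per-million-token rates here are illustrative assumptions, not official prices:

```typescript
// Token counts as accepted by step.tokens(...).
type TokenUsage = { uncached?: number; cached?: number; completion?: number };

// Hypothetical USD rates per million tokens; real prices vary by model and provider.
const PRICE_PER_MILLION = { input: 2.5, cachedInput: 1.25, output: 10 };

// Estimate the USD cost of a step from its token usage.
function estimateCost(usage: TokenUsage): number {
  const input = (usage.uncached ?? 0) * PRICE_PER_MILLION.input;
  const cached = (usage.cached ?? 0) * PRICE_PER_MILLION.cachedInput;
  const output = (usage.completion ?? 0) * PRICE_PER_MILLION.output;
  return (input + cached + output) / 1_000_000;
}
```

The result can be passed straight to the builder, e.g. step.cost(estimateCost({ uncached: 120, cached: 30, completion: 45 })).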
Tool Call Builder

const tc = r.toolCall('tool_name' | { name?: string, toolCallId?: string });

tc.name(toolName: string);
tc.args(value: string | Record<string, unknown>);
tc.result(value: string | Record<string, unknown>);
tc.status(code: number, message?: string);
tc.endTime(date: Date);

tc.end();      // Success
tc.error(msg); // Error
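The tool-call builder pairs naturally with the actual tool invocation. The helper below is a hypothetical convenience (not part of the SDK) written against only the builder surface shown above, so the interface here is a minimal stand-in:

```typescript
// Minimal shape of the tool-call builder assumed by the helper.
interface ToolCallBuilder {
  args(value: string | Record<string, unknown>): void;
  result(value: string | Record<string, unknown>): void;
  end(): void;
  error(msg: string): void;
}

// Hypothetical helper: record args, run the tool, record the result or the error.
async function tracedTool<A extends Record<string, unknown>, R extends Record<string, unknown>>(
  tc: ToolCallBuilder,
  fn: (args: A) => Promise<R>,
  args: A
): Promise<R> {
  tc.args(args);
  try {
    const result = await fn(args);
    tc.result(result);
    tc.end();
    return result;
  } catch (err) {
    tc.error(err instanceof Error ? err.message : String(err));
    throw err; // rethrow so the caller still sees the failure
  }
}
```

Usage would look like: await tracedTool(r.toolCall('get_weather'), fetchWeather, { city: 'San Francisco' }), where fetchWeather is your own tool function.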
Complete Example
Builder pattern (the factory pattern sends the same data as a single pre-built object):
import { run } from '@contextcompany/custom';

const r = run({ sessionId: 'session_123', conversational: true });

r.prompt('What\'s the weather in SF?');
r.metadata({ agent: 'weather-bot', version: '1.0' });

const step = r.step();
step.prompt(JSON.stringify(messages)); // messages: the model input for this step
step.response(assistantContent);       // assistantContent: the model's reply text
step.model('gpt-4o');
step.tokens({ uncached: 120, cached: 30, completion: 45 });
step.cost(0.0042);
step.end();

const tc = r.toolCall('get_weather');
tc.args({ city: 'San Francisco' });
tc.result({ temp: 72, unit: 'F' });
tc.end();

r.response('72°F and sunny in San Francisco.');
await r.end();
Configuration
import { configure } from '@contextcompany/custom';

configure({
  apiKey?: string,    // Override TCC_API_KEY env var
  debug?: boolean,    // Enable debug logging
  url?: string,       // Custom ingestion endpoint
  runTimeout?: number // Default auto-flush timeout (ms)
});
Type Exports
import type {
  RunOptions,
  StepOptions,
  ToolCallOptions,
  TokenUsage,
  ModelConfig,
  RunInput,
  StepInput,
  ToolCallInput,
  ClientConfig
} from '@contextcompany/custom';
@contextcompany/claude
Instrumentation for Anthropic Claude Agent SDK.
Installation
npm install @contextcompany/claude @anthropic-ai/claude-agent-sdk
Usage
import { Agent } from '@anthropic-ai/claude-agent-sdk';
import { instrumentClaudeAgent, submitFeedback } from '@contextcompany/claude';

const agent = new Agent({
  model: 'claude-4-sonnet',
  tools: [ /* ... */ ]
});

const instrumented = instrumentClaudeAgent(agent);

// Query with TCC metadata
for await (const message of instrumented.query({
  prompt: 'What is the weather in SF?',
  tcc: {
    runId: 'custom-run-id',
    sessionId: 'session-123',
    metadata: { user: 'alice' },
    debug: true
  }
})) {
  console.log(message);
}

// Submit feedback
await submitFeedback({
  runId: 'custom-run-id',
  score: 'thumbs_up'
});
Type Signatures
export function instrumentClaudeAgent<T extends object>(
  sdk: T
): WrappedSDK<T>;

export type TCCConfig = {
  runId?: string;
  sessionId?: string;
  metadata?: Record<string, unknown>;
  debug?: boolean;
};

export type WrappedSDK<T> = T extends { query: infer Q extends (...args: any[]) => any }
  ? Omit<T, 'query'> & {
      query: (params: {
        prompt: Parameters<Q>[0]['prompt'];
        options?: Parameters<Q>[0]['options'];
        tcc?: TCCConfig;
      }) => ReturnType<Q>;
    }
  : T;
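To make the WrappedSDK type concrete: the wrapper accepts the original query parameters plus an optional tcc field, which it strips before delegating. The sketch below mirrors that shape for a single function; it is a simplified illustration, not the package's actual implementation, and onRunStart stands in for whatever the instrumentation does with the metadata:

```typescript
type TCCConfig = {
  runId?: string;
  sessionId?: string;
  metadata?: Record<string, unknown>;
  debug?: boolean;
};

// Simplified: wrap a query function so callers can pass an extra tcc field,
// which is removed before the underlying query sees the params.
function wrapQuery<P extends object, R>(
  query: (params: P) => R,
  onRunStart: (tcc: TCCConfig | undefined) => void
) {
  return (params: P & { tcc?: TCCConfig }): R => {
    const { tcc, ...rest } = params;
    onRunStart(tcc); // where instrumentation would open a run
    return query(rest as P);
  };
}
```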
@contextcompany/mastra
Integration for the Mastra AI framework.
Installation
npm install @contextcompany/mastra @mastra/core
Usage
import { Mastra } from '@mastra/core';
import { TCCMastraExporter } from '@contextcompany/mastra';

const framework = new Mastra({
  tracing: {
    serviceName: 'my-mastra-agent',
    exporters: [
      new TCCMastraExporter({
        apiKey: process.env.TCC_API_KEY,
        debug: true
      })
    ]
  }
});
TCCMastraExporter
AI tracing exporter for the Mastra framework that sends spans to Observatory. Options:
apiKey: TCC API key. Defaults to the TCC_API_KEY environment variable.
endpoint: Custom endpoint URL. Auto-detects prod/dev based on the API key.
debug: Enable debug logging to console.
Type Exports
import type { TCCMastraExporterConfig } from '@contextcompany/mastra';

type TCCMastraExporterConfig = {
  apiKey?: string;
  endpoint?: string;
  debug?: boolean;
};
@contextcompany/widget
Local-first UI widget for Next.js applications that displays real-time agent traces.
Installation
npm install @contextcompany/widget
Usage
The widget can be initialized manually, auto-initialized via a script tag, or mounted from a Next.js layout. Manual init:
import { initWidget } from '@contextcompany/widget';

if (typeof window !== 'undefined') {
  initWidget({
    enabled: true,
    onMount: () => console.log('Widget mounted')
  });
}
API Reference
export interface WidgetOptions {
  enabled?: boolean;    // Enable/disable widget (default: true)
  onMount?: () => void; // Callback when widget mounts
}

export function initWidget(options?: WidgetOptions): () => void;
export function cleanup(): void;
Integration with OTEL
The widget automatically connects to the local WebSocket server started by @contextcompany/otel when local: true is set:
import { registerOTelTCC } from '@contextcompany/otel/nextjs';

export function register() {
  registerOTelTCC({
    local: true // Starts WebSocket server on port 3002
  });
}
@contextcompany/api
Core API utilities used by other Observatory packages.
Installation
npm install @contextcompany/api
Exports
submitFeedback
Configuration Helpers
import { submitFeedback } from '@contextcompany/api';

await submitFeedback({
  runId: string,
  score?: 'thumbs_up' | 'thumbs_down',
  text?: string
});
Function Signatures
export function submitFeedback(params: {
  runId: string;
  score?: 'thumbs_up' | 'thumbs_down';
  text?: string;
}): Promise<Response | undefined>;

export function getTCCApiKey(): string | undefined;

export function getTCCUrl(
  apiKey: string | undefined,
  prodUrl: string,
  devUrl: string
): string;

export function getTCCFeedbackUrl(): string;
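getTCCUrl's prod/dev auto-detection is described elsewhere in these docs as "keys starting with dev_ route to development". A hedged sketch of that rule (the real function may do more, e.g. honor TCC_URL):

```typescript
// Sketch of the routing rule: dev_ keys go to the dev endpoint, everything else to prod.
function resolveUrl(apiKey: string | undefined, prodUrl: string, devUrl: string): string {
  return apiKey?.startsWith('dev_') ? devUrl : prodUrl;
}
```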
Common Patterns
Environment Configuration
All packages respect these environment variables:
TCC_API_KEY=your_api_key_here        # Required (keys starting with dev_ route to dev)
TCC_URL=https://custom.endpoint.com  # Optional (override default endpoint)
TCC_DEBUG=true                       # Optional (enable debug logging)
Feedback Collection
Every package exports submitFeedback for collecting user feedback:
import { submitFeedback } from '@contextcompany/otel';
// or '@contextcompany/custom'
// or '@contextcompany/claude'
// or '@contextcompany/mastra'
// or '@contextcompany/api'

await submitFeedback({
  runId: 'run_abc123',
  score: 'thumbs_up',
  text: 'Excellent response!'
});
Error Handling
import { run } from '@contextcompany/custom';

const r = run();

try {
  r.prompt('Process this data');
  // ... agent logic
  r.response(result);
  await r.end();
} catch (error) {
  await r.error(`Failed: ${error instanceof Error ? error.message : String(error)}`);
}
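The try/catch above can be factored into a reusable wrapper so every run is finalized exactly once. This sketch is generic over a run-like object, so it depends only on the builder surface documented earlier; withRun is a hypothetical helper, not an SDK export:

```typescript
// Minimal run surface assumed by the wrapper.
interface RunLike {
  response(text: string): void;
  end(): Promise<void>;
  error(msg: string): Promise<void>;
}

// Hypothetical helper: run the agent body, then finalize success or failure.
async function withRun<T extends RunLike>(
  r: T,
  body: (r: T) => Promise<string>
): Promise<string | undefined> {
  try {
    const result = await body(r);
    r.response(result);
    await r.end();
    return result;
  } catch (err) {
    await r.error(`Failed: ${err instanceof Error ? err.message : String(err)}`);
    return undefined;
  }
}
```

Usage would be withRun(run(), async (r) => { r.prompt('Process this data'); /* agent logic */ return output; }) under the same assumption.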
Session Tracking
Group related runs into sessions for multi-turn conversations:
import { run } from '@contextcompany/custom';

const sessionId = 'session_' + userId;

// First turn
const r1 = run({ sessionId, conversational: true });
r1.prompt('Hello');
r1.response('Hi! How can I help?');
await r1.end();

// Second turn (same session)
const r2 = run({ sessionId, conversational: true });
r2.prompt('What is the weather?');
r2.response('It is sunny.');
await r2.end();
Next Steps
Quickstart: Get started with Observatory in 5 minutes
Python SDK: Python SDK reference documentation
TypeScript API: Complete TypeScript API reference
Configuration: Configuration guide