OpenInference provides a comprehensive set of JavaScript/TypeScript packages for instrumenting AI/ML applications with OpenTelemetry.

Instrumentation Packages

These packages provide automatic instrumentation for popular AI SDKs and frameworks:
| Package | Description |
| --- | --- |
| @arizeai/openinference-instrumentation-openai | Auto-instrumentation for the OpenAI Node.js SDK |
| @arizeai/openinference-instrumentation-anthropic | Auto-instrumentation for the Anthropic SDK |
| @arizeai/openinference-instrumentation-langchain | Auto-instrumentation for LangChain.js v1+ |
| @arizeai/openinference-instrumentation-langchain-v0 | Auto-instrumentation for LangChain.js v0.x (deprecated) |
| @arizeai/openinference-instrumentation-bedrock | Auto-instrumentation for AWS Bedrock Runtime |
| @arizeai/openinference-instrumentation-bedrock-agent-runtime | Auto-instrumentation for AWS Bedrock Agent Runtime |
| @arizeai/openinference-instrumentation-beeai | Auto-instrumentation for the BeeAI framework |
| @arizeai/openinference-instrumentation-claude-agent-sdk | Auto-instrumentation for the Claude Agent SDK |
| @arizeai/openinference-instrumentation-mcp | Auto-instrumentation for the MCP TypeScript SDK |
| @arizeai/openinference-vercel | Utilities for ingesting Vercel AI SDK spans |
| @arizeai/openinference-mastra | Utilities for ingesting Mastra spans (deprecated) |

Core Packages

These packages provide foundational functionality for building instrumentations and working with OpenInference:
| Package | Description |
| --- | --- |
| @arizeai/openinference-core | Shared tracing foundation with context propagation, span wrappers, and attribute helpers |
| @arizeai/openinference-genai | Utilities to convert OpenTelemetry GenAI span attributes to OpenInference |
| @arizeai/openinference-semantic-conventions | OpenInference semantic conventions for JavaScript |

Installation

All packages are available via npm and can be installed using your preferred package manager:
npm install @arizeai/openinference-instrumentation-openai
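Equivalent commands for other common package managers (substitute any package name from the tables above):

```shell
# pnpm
pnpm add @arizeai/openinference-instrumentation-openai

# yarn
yarn add @arizeai/openinference-instrumentation-openai
```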

Common Features

All OpenInference JavaScript instrumentations provide:
  • Automatic Tracing: Capture traces without modifying application code
  • OpenTelemetry Compatible: Works with standard OpenTelemetry infrastructure
  • Context Propagation: Automatically propagate session ID, user ID, metadata, and tags
  • Data Masking: Support for hiding sensitive inputs/outputs via trace configuration
  • Custom Tracer Providers: Ability to use non-global tracer providers
  • TypeScript Support: Full TypeScript type definitions included
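To make the data-masking feature concrete, here is a minimal sketch in plain TypeScript of the underlying idea: attributes whose keys match input/output conventions (such as the OpenInference `input.value` / `output.value` keys) are replaced with a redaction marker. The `MaskConfig` shape and `maskAttributes` helper are illustrative, not the library's actual API; in the real packages masking is configured via trace configuration options on the instrumentation.

```typescript
// Illustrative sketch of input/output masking, NOT the library's API.
type Attributes = Record<string, string | number | boolean>;

const REDACTED = "__REDACTED__";

interface MaskConfig {
  hideInputs?: boolean;
  hideOutputs?: boolean;
}

// Return a copy of the attributes with sensitive keys redacted.
function maskAttributes(attrs: Attributes, config: MaskConfig): Attributes {
  const masked: Attributes = {};
  for (const [key, value] of Object.entries(attrs)) {
    const isInput = key.startsWith("input.") || key.startsWith("llm.input");
    const isOutput = key.startsWith("output.") || key.startsWith("llm.output");
    if ((config.hideInputs && isInput) || (config.hideOutputs && isOutput)) {
      masked[key] = REDACTED;
    } else {
      masked[key] = value;
    }
  }
  return masked;
}
```

The key point is that masking happens at the attribute level before spans are exported, so sensitive payloads never leave the process.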

Getting Started

Most instrumentations follow a similar pattern:
  1. Install the instrumentation package
  2. Create a tracer provider
  3. Register the instrumentation
  4. Use your AI SDK normally
Example:
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";
import { registerInstrumentations } from "@opentelemetry/instrumentation";

// Create a tracer provider and register it as the global provider.
// (Configure a span processor/exporter to send the captured traces somewhere.)
const provider = new NodeTracerProvider();
provider.register();

registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation()],
});

// Now use OpenAI SDK as normal - traces will be captured automatically
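The context-propagation feature listed above can be sketched in plain TypeScript using Node's built-in AsyncLocalStorage. This is an illustration of the mechanism only: the real packages propagate session, user, metadata, and tags through OpenTelemetry context (for example via helpers in @arizeai/openinference-core), and the `withSession` / `currentSessionAttributes` names here are hypothetical. The `session.id` attribute key does follow the OpenInference semantic conventions.

```typescript
// Illustrative sketch of session propagation; names are hypothetical.
import { AsyncLocalStorage } from "node:async_hooks";

const sessionStorage = new AsyncLocalStorage<string>();

// Run fn with a session ID bound to the current async context.
function withSession<T>(sessionId: string, fn: () => T): T {
  return sessionStorage.run(sessionId, fn);
}

// Attributes a span created inside withSession would inherit.
function currentSessionAttributes(): Record<string, string> {
  const sessionId = sessionStorage.getStore();
  return sessionId ? { "session.id": sessionId } : {};
}
```

Because the session ID rides on the async context rather than on function arguments, every span created anywhere inside the `withSession` callback (including in awaited calls) can pick it up automatically.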
