OpenInference provides Python auto-instrumentation for AI/ML frameworks and LLM providers. All instrumentations are OpenTelemetry-compatible and can export traces to any OpenTelemetry collector or tracing backend, such as Arize Phoenix.

Available Instrumentations

The following table lists all available Python instrumentation packages:
| Package | Description |
| --- | --- |
| openinference-instrumentation-openai | Auto-instrumentation for OpenAI's Python SDK |
| openinference-instrumentation-anthropic | Auto-instrumentation for Anthropic SDK |
| openinference-instrumentation-langchain | Auto-instrumentation for LangChain (1.x and Classic) |
| openinference-instrumentation-llama-index | Auto-instrumentation for LlamaIndex |
| openinference-instrumentation-bedrock | Auto-instrumentation for AWS Bedrock (boto3 and aioboto3) |
| openinference-instrumentation-vertexai | Auto-instrumentation for Google Vertex AI |
| openinference-instrumentation-mistralai | Auto-instrumentation for Mistral AI SDK |
| openinference-instrumentation-dspy | Auto-instrumentation for DSPy |
| openinference-instrumentation-haystack | Auto-instrumentation for Haystack pipelines |
| openinference-instrumentation-groq | Auto-instrumentation for Groq SDK |
| openinference-instrumentation-litellm | Auto-instrumentation for LiteLLM |
| openinference-instrumentation-instructor | Auto-instrumentation for Instructor library |
| openinference-instrumentation-crewai | Auto-instrumentation for CrewAI agents |
| openinference-instrumentation-google-genai | Auto-instrumentation for Google GenAI |
| openinference-instrumentation-google-adk | Auto-instrumentation for Google ADK |
| openinference-instrumentation-pydantic-ai | Auto-instrumentation for PydanticAI |
| openinference-instrumentation-openai-agents | Auto-instrumentation for OpenAI Agents SDK |
| openinference-instrumentation-autogen | Auto-instrumentation for Autogen (ag2) |
| openinference-instrumentation-autogen-agentchat | Auto-instrumentation for Autogen AgentChat |
| openinference-instrumentation-agent-framework | Auto-instrumentation for Microsoft Agent Framework |
| openinference-instrumentation-agentspec | Auto-instrumentation for Agent Spec |
| openinference-instrumentation-agno | Auto-instrumentation for Agno agents |
| openinference-instrumentation-smolagents | Auto-instrumentation for smolagents |
| openinference-instrumentation-strands-agents | Auto-instrumentation for Strands Agents |
| openinference-instrumentation-beeai | Auto-instrumentation for BeeAI |
| openinference-instrumentation-guardrails | Auto-instrumentation for Guardrails |
| openinference-instrumentation-openllmetry | Converts OpenLLMetry traces to OpenInference |
| openinference-instrumentation-openlit | Auto-instrumentation for OpenLit |
| openinference-instrumentation-pipecat | Auto-instrumentation for Pipecat |
| openinference-instrumentation-portkey | Auto-instrumentation for Portkey |
| openinference-instrumentation-mcp | Auto-instrumentation for MCP SDK (context propagation) |
| openinference-instrumentation-promptflow | Auto-instrumentation for PromptFlow |

Getting Started

All instrumentations follow a similar pattern:
  1. Install the instrumentation package
  2. Configure the OpenTelemetry tracer provider
  3. Instrument your application
  4. Run your code and view traces

Basic Example

The code below is a template: replace <package> and <Package> with the instrumentation you installed (for example, openai and OpenAI).

from openinference.instrumentation.<package> import <Package>Instrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Configure a tracer provider (port 6006 is Phoenix's default collector endpoint)
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

# Instrument
<Package>Instrumentor().instrument(tracer_provider=tracer_provider)

# Your application code here

Common Features

All OpenInference instrumentations support:
  • Session tracking - Track user sessions across multiple requests
  • User identification - Associate traces with specific users
  • Metadata and tags - Add custom metadata and tags to traces
  • Prompt template tracking - Track prompt templates and variables
  • PII masking - Configure trace masking for sensitive data
  • Context propagation - Automatic propagation of tracing context

Using Context Attributes

from openinference.instrumentation import using_attributes

with using_attributes(
    session_id="my-session-id",
    user_id="user-123",
    metadata={"key": "value"},
    tags=["production", "feature-x"],
):
    # Your instrumented code here
    pass


Contributing

To contribute a new instrumentation or report issues, visit the OpenInference GitHub repository.
