OpenInference provides auto-instrumentation packages for popular AI/ML frameworks and LLM providers.

Available Instrumentations

| Package | Framework | Description |
|---|---|---|
| `openinference-instrumentation-openai` | OpenAI | OpenAI Python SDK (chat, completions, embeddings, assistants) |
| `openinference-instrumentation-langchain` | LangChain | LangChain framework for LLM applications |
| `openinference-instrumentation-llama-index` | LlamaIndex | LlamaIndex data framework (formerly GPT Index) |
| `openinference-instrumentation-anthropic` | Anthropic | Anthropic Claude API |
| `openinference-instrumentation-bedrock` | AWS Bedrock | Amazon Bedrock LLM service |
| `openinference-instrumentation-vertexai` | Vertex AI | Google Cloud Vertex AI |
| `openinference-instrumentation-mistralai` | Mistral AI | Mistral AI models |
| `openinference-instrumentation-groq` | Groq | Groq LLM API |
| `openinference-instrumentation-litellm` | LiteLLM | LiteLLM unified LLM interface |
| `openinference-instrumentation-dspy` | DSPy | DSPy framework for LM programs |
| `openinference-instrumentation-haystack` | Haystack | Haystack NLP framework |
| `openinference-instrumentation-crewai` | CrewAI | CrewAI multi-agent framework |
| `openinference-instrumentation-autogen` | AutoGen | Microsoft AutoGen framework |
| `openinference-instrumentation-autogen-agentchat` | AutoGen AgentChat | AutoGen AgentChat API |
| `openinference-instrumentation-instructor` | Instructor | Instructor structured outputs library |
| `openinference-instrumentation-guardrails` | Guardrails AI | Guardrails AI validation framework |
| `openinference-instrumentation-pydantic-ai` | Pydantic AI | Pydantic AI agent framework |
| `openinference-instrumentation-smolagents` | smol-agents | Hugging Face smol-agents |
| `openinference-instrumentation-google-genai` | Google GenAI | Google Generative AI SDK |
| `openinference-instrumentation-google-adk` | Google ADK | Google Agent Developer Kit |
| `openinference-instrumentation-openai-agents` | OpenAI Agents | OpenAI Agents API |
| `openinference-instrumentation-mcp` | MCP | Model Context Protocol servers |
| `openinference-instrumentation-beeai` | BeeAI | BeeAI agent framework |
| `openinference-instrumentation-agno` | Agno | Agno agent framework |
| `openinference-instrumentation-strands-agents` | Strands Agents | Strands agent framework |
| `openinference-instrumentation-portkey` | Portkey | Portkey AI gateway |
| `openinference-instrumentation-openllmetry` | OpenLLMetry | OpenLLMetry observability |
| `openinference-instrumentation-openlit` | OpenLIT | OpenLIT observability |
| `openinference-instrumentation-pipecat` | Pipecat | Pipecat voice agents |
| `openinference-instrumentation-agentspec` | AgentSpec | AgentSpec protocol |
| `openinference-instrumentation-agent-framework` | Agent Framework | Generic agent framework support |

Installation

Install any instrumentation package using pip:
pip install openinference-instrumentation-{name}
For example:
pip install openinference-instrumentation-openai
pip install openinference-instrumentation-langchain
pip install openinference-instrumentation-llama-index

Basic Usage Pattern

All instrumentations follow a similar pattern:
from openinference.instrumentation.{name} import {Name}Instrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Configure tracer provider
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter("http://localhost:6006/v1/traces"))
)

# Instrument
{Name}Instrumentor().instrument(tracer_provider=tracer_provider)

Example: OpenAI

import openai
from openinference.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Setup tracing
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter("http://localhost:6006/v1/traces"))
)

# Instrument OpenAI
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Use OpenAI as normal - automatically traced
client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)

Example: LangChain

from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI
from openinference.instrumentation.langchain import LangChainInstrumentor

# Instrument LangChain (tracer_provider is configured as in the
# basic usage pattern above)
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)

# Use LangChain as normal - automatically traced
llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
chain = prompt | llm
response = chain.invoke({"product": "eco-friendly water bottles"})

Example: LlamaIndex

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

# Instrument LlamaIndex (tracer_provider is configured as in the
# basic usage pattern above)
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)

# Use LlamaIndex as normal - automatically traced
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What is the main topic?")

Features

All instrumentations provide:
  • Automatic span creation - Traces are created without code changes
  • OpenInference semantic conventions - Standardized attributes
  • Context propagation - Session, user, metadata, tags
  • TraceConfig support - Privacy and payload controls
  • Suppression support - Use suppress_tracing() to pause tracing
  • OpenTelemetry compatible - Works with any OTel collector

Testing

Instrumentations use pytest with VCR cassettes for testing:
# Run tests for a specific instrumentation
cd python/instrumentation/openinference-instrumentation-openai
pytest tests/

# Record new cassettes (requires API key)
pytest tests/ -k test_name --vcr-record=once

Contributing

See the development guide for creating new instrumentations.
