## Available Instrumentations
| Framework | Package Name | Version | Maven Central |
|---|---|---|---|
| LangChain4j | openinference-instrumentation-langchain4j | 0.1.5 | View on Maven Central |
| Spring AI | openinference-instrumentation-springAI | 0.1.0 | View on Maven Central |
## Core Packages

In addition to framework-specific instrumentations, OpenInference provides core packages:

| Package | Description | Version | Maven Central |
|---|---|---|---|
| openinference-semantic-conventions | Java constants for OpenInference semantic conventions | 0.1.1 | View on Maven Central |
| openinference-instrumentation | Base instrumentation utilities and OITracer | 0.1.1 | View on Maven Central |
## Installation
### Gradle
Add the desired instrumentation to your `build.gradle`:
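For example, for the LangChain4j instrumentation (version taken from the table above; the `com.arize` group ID is an assumption — confirm the exact coordinates on Maven Central):

```groovy
dependencies {
    // LangChain4j instrumentation (group ID assumed; verify on Maven Central)
    implementation 'com.arize:openinference-instrumentation-langchain4j:0.1.5'
}
```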
### Maven
Add the desired instrumentation to your `pom.xml`:
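For example, for the LangChain4j instrumentation (version from the table above; the group ID is an assumption — confirm on Maven Central):

```xml
<dependency>
    <!-- group ID assumed; verify the coordinates on Maven Central -->
    <groupId>com.arize</groupId>
    <artifactId>openinference-instrumentation-langchain4j</artifactId>
    <version>0.1.5</version>
</dependency>
```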
## Requirements

### General Requirements
- Java Version: Java 11 or higher (Java 17+ for Spring AI)
- OpenTelemetry: OpenTelemetry Java 1.49.0 or higher
- Build Tools: Gradle 7.0+ or Maven 3.6+
### Framework-Specific Requirements

#### LangChain4j
- LangChain4j 1.0.0 or higher
- Java 11+
#### Spring AI
- Spring AI 1.0.0 or higher
- Micrometer Observation 1.15.0 or higher
- Java 17+
## Quick Start Examples
### LangChain4j
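A minimal sketch of an instrumented LangChain4j application. The `LangChain4jInstrumentor` entry point is a hypothetical name modeled on other OpenInference instrumentations — consult the package README for the exact API; it assumes OpenTelemetry has already been initialized (see the setup section below):

```java
import dev.langchain4j.model.openai.OpenAiChatModel;

public class LangChain4jExample {
    public static void main(String[] args) {
        // Register the instrumentation (class and method names are hypothetical;
        // check the openinference-instrumentation-langchain4j README)
        LangChain4jInstrumentor.instrument();

        // Calls made through the model are traced automatically from here on
        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        System.out.println(model.chat("What is OpenInference?"));
    }
}
```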
### Spring AI
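The Spring AI instrumentation hooks into Micrometer Observations, so no instrumentation-specific code is needed at call sites once the dependency is on the classpath and OpenTelemetry is configured. A sketch of ordinary Spring AI `ChatClient` usage (the service class and model name are illustrative):

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

@Service
public class ChatService {

    private final ChatClient chatClient;

    // Spring Boot auto-configures a ChatClient.Builder for the configured model
    public ChatService(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    // This call is observed via Micrometer and exported as an OpenInference span
    public String ask(String question) {
        return chatClient.prompt().user(question).call().content();
    }
}
```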
## Features
All OpenInference Java instrumentations provide:

- Automatic Trace Capture: LLM calls, parameters, and responses
- Token Usage Tracking: Prompt, completion, and total token counts
- Message Tracing: Input/output messages with roles (user, assistant, system, tool)
- Tool Call Tracing: Function names, arguments, and results
- Error Tracking: Exceptions and error messages
- Configurable Privacy: Hide sensitive input/output data
- OpenTelemetry Native: Works with any OTel-compatible backend
- Context Propagation: Distributed tracing support
## OpenTelemetry Setup
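A minimal SDK setup sketch, assuming an OTLP/gRPC collector (such as Phoenix) at `localhost:4317`; the service name is illustrative:

```java
import io.opentelemetry.api.OpenTelemetry;
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.exporter.otlp.trace.OtlpGrpcSpanExporter;
import io.opentelemetry.sdk.OpenTelemetrySdk;
import io.opentelemetry.sdk.resources.Resource;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;

public final class OpenTelemetrySetup {

    public static OpenTelemetry init() {
        // Identify this service in trace backends
        Resource resource = Resource.getDefault().merge(Resource.create(
                Attributes.of(AttributeKey.stringKey("service.name"), "my-llm-app")));

        // Batch spans and export them over OTLP/gRPC
        SdkTracerProvider tracerProvider = SdkTracerProvider.builder()
                .setResource(resource)
                .addSpanProcessor(BatchSpanProcessor.builder(
                        OtlpGrpcSpanExporter.builder()
                                .setEndpoint("http://localhost:4317")
                                .build()).build())
                .build();

        // Register globally so instrumentations can pick it up
        return OpenTelemetrySdk.builder()
                .setTracerProvider(tracerProvider)
                .buildAndRegisterGlobal();
    }
}
```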
All instrumentations require OpenTelemetry to be initialized before any instrumented client is created.

## Viewing Traces
### Using Phoenix (Recommended)
Arize Phoenix is an open-source observability platform designed for LLM applications:

1. Start Phoenix
2. Configure your application to send traces to `http://localhost:4317`
3. View traces at `http://localhost:6006`
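Step 1 can be done with Docker; the image name below is the one Phoenix publishes, but verify it against the Phoenix documentation:

```shell
# 6006 = Phoenix UI, 4317 = OTLP/gRPC trace ingestion
docker run -p 6006:6006 -p 4317:4317 arizephoenix/phoenix:latest
```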
### Other OpenTelemetry Backends
OpenInference instrumentations work with any OpenTelemetry-compatible backend:

- Jaeger: Distributed tracing platform
- Zipkin: Distributed tracing system
- Grafana Tempo: Scalable distributed tracing backend
- Cloud Services: AWS X-Ray, Google Cloud Trace, Azure Monitor
## Semantic Conventions
OpenInference Java packages follow the OpenInference semantic conventions, which define:

- Span Kinds: LLM, Chain, Tool, Agent, Retriever, Embedding, Reranker, Guardrail, Evaluator
- Attributes: Standardized attribute names for model info, tokens, messages, etc.
- Message Format: Structured format for capturing LLM conversations
## Common Patterns
### Configuring Privacy Controls
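The OpenInference specification defines environment variables for redacting trace content. A sketch (variable names come from the OpenInference configuration spec; verify that the Java packages honor them):

```shell
# Keep span structure and token counts, but redact message contents
export OPENINFERENCE_HIDE_INPUTS=true
export OPENINFERENCE_HIDE_OUTPUTS=true
```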
All instrumentations support hiding sensitive input and output data.

### Graceful Shutdown
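A sketch using the OpenTelemetry SDK, assuming `tracerProvider` is the `SdkTracerProvider` created during setup:

```java
import java.util.concurrent.TimeUnit;

// Register a JVM shutdown hook that flushes pending spans, then shuts down
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    tracerProvider.forceFlush().join(10, TimeUnit.SECONDS);
    tracerProvider.shutdown().join(10, TimeUnit.SECONDS);
}));
```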
Always flush spans before shutdown.

### Custom Attributes
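For example, with the OpenTelemetry SDK `Resource` API (the attribute values are illustrative):

```java
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.sdk.resources.Resource;

// Merge custom attributes into the default resource
Resource resource = Resource.getDefault().merge(Resource.create(
        Attributes.builder()
                .put("service.name", "my-llm-app")
                .put("deployment.environment", "production")
                .build()));

// Pass this resource to SdkTracerProvider.builder().setResource(resource)
```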
Add custom attributes to your resource.

## Troubleshooting
### Dependency Conflicts
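One common fix is importing the OpenTelemetry BOM so every transitive OpenTelemetry artifact resolves to a single version. A Gradle sketch (the version matches the requirement stated above):

```groovy
dependencies {
    // Align all OpenTelemetry artifacts on one version via the BOM
    implementation platform('io.opentelemetry:opentelemetry-bom:1.49.0')
    implementation 'io.opentelemetry:opentelemetry-sdk'
    implementation 'io.opentelemetry:opentelemetry-exporter-otlp'
}
```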
If you encounter OpenTelemetry version conflicts, align all OpenTelemetry artifacts on a single version.

### Missing Traces
- Verify OpenTelemetry is initialized before instrumentation
- Check that OTLP endpoint is reachable
- Ensure `forceFlush()` is called before application exit
- Enable debug logging: `-Dio.opentelemetry.javaagent.debug=true`
## Performance Considerations
- Use `BatchSpanProcessor` instead of `SimpleSpanProcessor` in production
- Configure appropriate batch sizes and delays
- Consider sampling for high-volume applications
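The points above can be sketched with the OpenTelemetry SDK builder options (the values shown are illustrative, not recommendations; `exporter` is the span exporter from setup):

```java
import java.time.Duration;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;
import io.opentelemetry.sdk.trace.samplers.Sampler;

BatchSpanProcessor processor = BatchSpanProcessor.builder(exporter)
        .setMaxQueueSize(2048)                    // spans buffered before drops occur
        .setMaxExportBatchSize(512)               // spans per export call
        .setScheduleDelay(Duration.ofSeconds(5))  // max wait between exports
        .build();

// For high-volume applications, sample a fraction of traces
// on the tracer provider:
// SdkTracerProvider.builder().setSampler(Sampler.traceIdRatioBased(0.1)) ...
```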
## Examples and Resources

### Example Applications
Complete working examples are available in the OpenInference repository.

### Documentation Links
- OpenInference Specification
- OpenTelemetry Java Documentation
- LangChain4j Documentation
- Spring AI Documentation
- Arize Phoenix Documentation
## Community and Support
- GitHub Issues: Report bugs or request features
- Discussions: Ask questions and share ideas
- Slack Community: Join the Arize community