Available Instrumentations
LangChain4j
Auto-instrumentation for LangChain4j applications.
Package: com.arize.instrumentation.langchain4j
Maven Artifact: com.arize:openinference-instrumentation-langchain4j:0.1.1
Spring AI
Instrumentation for Spring AI applications using Micrometer observation handlers.
Package: com.arize.instrumentation.springAI
Maven Artifact: com.arize:openinference-instrumentation-springAI:0.1.1
LangChain4j Instrumentation
Installation
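Add the dependency to your build. For example, with Maven (coordinates taken from the artifact listed above):

```xml
<dependency>
  <groupId>com.arize</groupId>
  <artifactId>openinference-instrumentation-langchain4j</artifactId>
  <version>0.1.1</version>
</dependency>
```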
Basic Usage
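A minimal sketch of enabling the instrumentation and making a chat call. The instrumentor class name and its instrument() entry point are assumptions based on the package naming above, not confirmed API; the LangChain4j model builder calls are standard LangChain4j 1.0 API.

```java
import com.arize.instrumentation.langchain4j.LangChain4jInstrumentor; // assumed class name

import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;

public class BasicUsage {
    public static void main(String[] args) {
        // Assumed entry point: instrument() wires up instrumentation
        // against the globally registered TracerProvider.
        LangChain4jInstrumentor.instrument();

        // Use LangChain4j as usual; chat calls now emit LLM spans.
        ChatModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        System.out.println(model.chat("Hello!"));
    }
}
```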
With Custom TracerProvider
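You can also hand the instrumentor an explicitly configured TracerProvider. The OpenTelemetry SDK setup below is standard; the instrument(tracerProvider) overload is an assumption.

```java
import com.arize.instrumentation.langchain4j.LangChain4jInstrumentor; // assumed class name

import io.opentelemetry.exporter.otlp.trace.OtlpGrpcSpanExporter;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;

public class CustomTracerProvider {
    public static void main(String[] args) {
        // Standard OpenTelemetry SDK setup: batch-export spans over OTLP/gRPC.
        SdkTracerProvider tracerProvider = SdkTracerProvider.builder()
                .addSpanProcessor(BatchSpanProcessor.builder(
                        OtlpGrpcSpanExporter.builder()
                                .setEndpoint("http://localhost:4317")
                                .build())
                        .build())
                .build();

        // Assumed overload: pass the provider to the instrumentor.
        LangChain4jInstrumentor.instrument(tracerProvider);
    }
}
```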
With Custom TraceConfig
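A sketch of passing a TraceConfig to the instrumentor. The builder method names are assumptions modeled on OpenInference's Python TraceConfig; verify them against the Javadoc.

```java
import com.arize.instrumentation.langchain4j.LangChain4jInstrumentor; // assumed class name

public class CustomTraceConfigExample {
    public static void main(String[] args) {
        // All TraceConfig method names below are assumptions.
        TraceConfig config = TraceConfig.builder()
                .hideInputs(true)   // redact input.value on spans
                .hideOutputs(false)
                .build();

        // Assumed overload accepting a TraceConfig.
        LangChain4jInstrumentor.instrument(config);
    }
}
```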
Manual Model Listener
For finer control, you can manually create and attach model listeners.
Uninstrumenting
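A sketch covering both manual attachment and removal. ChatModelListener and the builder's listeners(...) method are real LangChain4j API; the OpenInference listener class name is an assumption.

```java
import com.arize.instrumentation.langchain4j.OpenInferenceChatModelListener; // assumed class name

import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.openai.OpenAiChatModel;

import java.util.List;

public class ManualListener {
    public static void main(String[] args) {
        // ChatModelListener is LangChain4j's listener SPI; the
        // OpenInference implementation class name is an assumption.
        ChatModelListener listener = new OpenInferenceChatModelListener();

        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .listeners(List.of(listener)) // attach manually
                .build();

        // To "uninstrument", simply build the model without the
        // listener: listeners are fixed at build time.
    }
}
```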
Spring AI Instrumentation
Installation
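Add the dependency to your build. For example, with Maven (coordinates taken from the artifact listed above):

```xml
<dependency>
  <groupId>com.arize</groupId>
  <artifactId>openinference-instrumentation-springAI</artifactId>
  <version>0.1.1</version>
</dependency>
```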
Configuration
Spring AI instrumentation uses Micrometer’s observation API. Register the instrumentor as an observation handler.
With Custom TraceConfig
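A sketch of registering the handler, optionally with a custom TraceConfig. The ObservationRegistry and observationConfig().observationHandler(...) calls are real Micrometer API; the handler and TraceConfig class and method names are assumptions.

```java
import io.micrometer.observation.ObservationRegistry;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ObservabilityConfig {

    @Bean
    ObservationRegistry observationRegistry() {
        ObservationRegistry registry = ObservationRegistry.create();
        // Handler and TraceConfig names below are assumptions; the
        // TraceConfig argument is optional if you want default capture.
        registry.observationConfig()
                .observationHandler(new OpenInferenceObservationHandler(
                        TraceConfig.builder().hideInputs(true).build()));
        return registry;
    }
}
```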
Usage with Spring AI
Once configured, Spring AI automatically creates observations that the instrumentor converts to OpenInference traces.
Captured Information
The Spring AI instrumentor automatically captures:
- Model name and provider
- Input/output messages
- Token counts
- Tool calls
- Invocation parameters (temperature, max_tokens, etc.)
- Error information
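The usage described above can be sketched with Spring AI's ChatClient (real Spring AI 1.0 API; the observation handler is assumed to be registered as shown earlier):

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

@Service
public class ChatService {

    private final ChatClient chatClient;

    // Spring AI auto-configures a ChatClient.Builder when a model
    // starter (e.g., OpenAI) is on the classpath.
    public ChatService(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    public String ask(String question) {
        // Each call emits a Micrometer observation, which the handler
        // converts into an OpenInference LLM span.
        return chatClient.prompt()
                .user(question)
                .call()
                .content();
    }
}
```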
Environment Variables
LangChain4j
OTEL_INSTRUMENTATION_LANGCHAIN4J_ENABLED: Enable/disable LangChain4j auto-instrumentation (default: true)
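For example, to disable auto-instrumentation for a single run (the application jar name is a placeholder):

```shell
export OTEL_INSTRUMENTATION_LANGCHAIN4J_ENABLED=false
# java -jar my-app.jar   # hypothetical application jar
```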
Captured Span Attributes
Both instrumentations automatically set the following OpenInference attributes:
Core Attributes
- openinference.span.kind: Set to LLM
- llm.model_name: Model identifier
- llm.provider: Provider name (e.g., “openai”)
- llm.system: System name (e.g., “openai”, “spring-ai”)
Input/Output
- input.value: Input messages as JSON
- input.mime_type: "application/json"
- output.value: Output messages as JSON
- output.mime_type: "application/json"
Token Counts
- llm.token_count.prompt: Input token count
- llm.token_count.completion: Output token count
- llm.token_count.total: Total token count
Invocation Parameters
llm.invocation_parameters: JSON string of model parameters (temperature, max_tokens, etc.)
Tool Calls
- llm.input_messages.{i}.message.tool_calls: Tool calls in messages
- tool_call.id: Tool call identifier
- tool_call.function.name: Function name
- tool_call.function.arguments: Function arguments
Privacy Controls
Use TraceConfig to control what data is captured:
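A sketch of the privacy flags. The builder method names are assumptions modeled on OpenInference's Python TraceConfig (hide_inputs, hide_outputs); check the Javadoc for the exact Java names.

```java
public class PrivacyControls {
    public static void main(String[] args) {
        // All method names below are assumptions.
        TraceConfig config = TraceConfig.builder()
                .hideInputs(true)   // drop input.value from spans
                .hideOutputs(true)  // drop output.value from spans
                .build();
        // Pass the config to the instrumentor or observation handler
        // when constructing it.
    }
}
```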
Dependencies
LangChain4j Instrumentation
- OpenInference Base Instrumentation
- OpenInference Semantic Conventions
- LangChain4j (1.0.0+)
- OpenTelemetry API and SDK
Spring AI Instrumentation
- OpenInference Base Instrumentation
- OpenInference Semantic Conventions
- Spring AI Core
- Micrometer Observation API
- OpenTelemetry API and SDK
Complete Example
LangChain4j Application
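An end-to-end sketch combining the pieces above. The OpenTelemetry SDK setup and LangChain4j model calls are real API; the instrumentor class and its instrument(...) entry point are assumptions.

```java
import com.arize.instrumentation.langchain4j.LangChain4jInstrumentor; // assumed class name

import dev.langchain4j.model.openai.OpenAiChatModel;

import io.opentelemetry.exporter.otlp.trace.OtlpGrpcSpanExporter;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;

public class LangChain4jExample {
    public static void main(String[] args) {
        // 1. Configure an OpenTelemetry tracer provider (OTLP/gRPC export).
        SdkTracerProvider tracerProvider = SdkTracerProvider.builder()
                .addSpanProcessor(BatchSpanProcessor.builder(
                        OtlpGrpcSpanExporter.builder()
                                .setEndpoint("http://localhost:4317")
                                .build())
                        .build())
                .build();

        // 2. Enable the instrumentation (entry point assumed).
        LangChain4jInstrumentor.instrument(tracerProvider);

        // 3. Use LangChain4j as usual; chat calls now produce LLM spans.
        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        System.out.println(model.chat("Summarize OpenInference in one sentence."));
    }
}
```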
Spring AI Application
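An end-to-end Spring Boot sketch. The Spring Boot, Micrometer, and ChatClient calls are real API; the OpenInference handler class name is an assumption, and note that Spring Boot normally auto-configures an ObservationRegistry, so you may prefer customizing that one instead of defining your own bean.

```java
import io.micrometer.observation.ObservationRegistry;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class SpringAiExampleApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringAiExampleApplication.class, args);
    }

    // Register the OpenInference handler (class name assumed) so Spring
    // AI's observations are converted into OpenInference spans.
    @Bean
    ObservationRegistry observationRegistry() {
        ObservationRegistry registry = ObservationRegistry.create();
        registry.observationConfig()
                .observationHandler(new OpenInferenceObservationHandler());
        return registry;
    }

    @RestController
    static class ChatController {
        private final ChatClient chatClient;

        ChatController(ChatClient.Builder builder) {
            this.chatClient = builder.build();
        }

        @GetMapping("/chat")
        String chat(@RequestParam String q) {
            return chatClient.prompt().user(q).call().content();
        }
    }
}
```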
Next Steps
- Review Semantic Conventions for all available attributes
- Learn about Base Instrumentation for manual instrumentation
- Check the Installation guide for dependency details