Overview
The @contextcompany/mastra package provides a TCCMastraExporter that integrates with Mastra’s built-in AI tracing system to capture:
- Agent executions and workflows
- LLM calls (generations, embeddings, etc.)
- Tool calls and results
- Streaming responses
- Token usage and costs
- Custom metadata
Installation
Install the package
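Install with npm (or your preferred package manager):

```shell
npm install @contextcompany/mastra
```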
This package requires @mastra/core version 0.24.0 or higher as a peer dependency.
Configuration
The TCCMastraExporter accepts an optional configuration object:
mastra.config.ts
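The code sample for this file was not preserved; the sketch below is a plausible reconstruction. It assumes Mastra's observability config accepts the exporters array mentioned later in this guide, and the exact nesting may differ across @mastra/core versions.

```typescript
import { Mastra } from "@mastra/core";
import { TCCMastraExporter } from "@contextcompany/mastra";

export const mastra = new Mastra({
  // ...agents, workflows, tools...
  observability: {
    configs: {
      default: {
        serviceName: "my-app", // hypothetical service name
        exporters: [
          new TCCMastraExporter({
            // apiKey defaults to the TCC_API_KEY environment variable
            debug: false,
          }),
        ],
      },
    },
  },
});
```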
Configuration Options
- apiKey: Your Observatory API key. Overrides the TCC_API_KEY environment variable.
- endpoint: Custom ingestion endpoint URL. Defaults to Observatory’s production endpoint.
- debug: Enable debug logging to see detailed trace information in the console.
Usage Examples
Basic Agent Usage
Once configured, all your Mastra agents are automatically instrumented. Each run captures:
- Agent configuration
- User prompt
- Model selection
- Complete response
- Token usage
- Execution time
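A minimal invocation sketch; the agent name and prompt are hypothetical, and getAgent/generate are assumed from @mastra/core's agent API:

```typescript
import { mastra } from "./mastra.config";

// "assistant" is a hypothetical agent registered in mastra.config.ts
const agent = mastra.getAgent("assistant");
const result = await agent.generate("Summarize the quarterly report.");
console.log(result.text);
// the exporter sends the full trace once this run completes
```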
Tracking Runs with Custom Metadata
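A hedged sketch of passing metadata at generation time; it assumes the generate options accept a metadata object as described in this section, and the userId key is illustrative:

```typescript
const result = await agent.generate("Hello!", {
  metadata: {
    "tcc.runId": "run-2024-001", // custom run ID
    userId: "user-42",           // illustrative custom key
  },
});
```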
Add custom metadata to track runs and sessions. Use the tcc.runId metadata key to specify a custom run ID. If not provided, Observatory generates one automatically.
Using Workflows
Mastra workflows are fully supported, capturing:
- Step dependencies
- Individual step timing
- Data flow between steps
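A workflow sketch using @mastra/core's createWorkflow/createStep helpers (the exact API shape varies across versions); each step produces a traced span:

```typescript
import { createWorkflow, createStep } from "@mastra/core/workflows";
import { z } from "zod";

const fetchStep = createStep({
  id: "fetch-page",
  inputSchema: z.object({ url: z.string() }),
  outputSchema: z.object({ body: z.string() }),
  execute: async ({ inputData }) => ({
    body: await (await fetch(inputData.url)).text(),
  }),
});

export const pageWorkflow = createWorkflow({
  id: "page-workflow",
  inputSchema: z.object({ url: z.string() }),
  outputSchema: z.object({ body: z.string() }),
})
  .then(fetchStep)
  .commit();
```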
Using Tools
Tools are automatically captured, including:
- Tool definitions
- Tool invocation arguments
- Tool execution results
- Tool execution time
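A tool sketch using @mastra/core's createTool helper; the tool id, schema, and return value are illustrative:

```typescript
import { createTool } from "@mastra/core/tools";
import { z } from "zod";

export const weatherTool = createTool({
  id: "get-weather",
  description: "Look up the current temperature for a city",
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ context }) => {
    // context holds the validated input; the exporter records
    // these arguments, the result, and the execution time
    return { city: context.city, tempC: 21 };
  },
});
```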
Streaming Responses
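A streaming sketch, assuming the agent's stream method exposes a textStream async iterable (as in recent @mastra/core versions):

```typescript
const stream = await agent.stream("Write a haiku about tracing.");
for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}
// spans are batched in memory and exported once the streamed run completes
```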
Streaming is fully supported.
Run and Session Tracking
Run IDs
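Using the tcc.runId metadata key described earlier (the prompt and ID are illustrative):

```typescript
await agent.generate("Classify this ticket.", {
  metadata: { "tcc.runId": "ticket-8675309" }, // illustrative run ID
});
```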
Specify a custom run ID using metadata.
Session IDs
Group related runs together with a session ID.
Submitting Feedback
Collect user feedback for specific runs.
Environment Variables
- TCC_API_KEY: Your Observatory API key. Get it from the Observatory dashboard.
- Custom ingestion endpoint URL. Only needed for self-hosted instances.
How It Works
The integration works through Mastra’s AI tracing system:
- Span Collection: The exporter receives span events from Mastra’s tracer
- Batching: Spans are collected in memory until the root span (agent run) completes
- Export: When the root span ends, all spans are sent to Observatory in a single batch
- Metadata Extraction: Custom metadata and run IDs are extracted from the root span
This design provides:
- Complete traces (all spans are sent together)
- Accurate timing (captures the full execution)
- Minimal overhead (single network request per run)
API Reference
TCCMastraExporter
Implements Mastra’s AITracingExporter interface.
Constructor:
- apiKey: API key for authentication. Defaults to the TCC_API_KEY environment variable.
- endpoint: Custom ingestion endpoint. Defaults to Observatory’s production endpoint.
- debug: Enable debug logging.
Methods:
- exportEvent(event: AITracingEvent): Promise<void> - Called by Mastra for each span event
- shutdown(): Promise<void> - Exports any remaining traces before shutdown
- init(config: TracingConfig): void - Called by Mastra during initialization
Troubleshooting
Traces not appearing
- Verify TCC_API_KEY is set correctly
- Enable debug mode (set debug: true in the exporter options)
- Check console for error messages
- Ensure agent execution completes (traces are sent after completion)
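Debug mode from the checklist above can be switched on via the exporter's debug option:

```typescript
new TCCMastraExporter({ debug: true });
```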
Missing spans
Make sure:
- Your Mastra version is 0.24.0 or higher
- AI tracing is enabled in your Mastra config
- The exporter is added to the exporters array
Custom metadata not showing
Metadata must be passed in the agent’s metadata option, not in the prompt or other locations.
Development vs production endpoints
API keys with the dev_ prefix automatically route to the development endpoint. Production keys route to production. To override, use the endpoint config option.
Performance Considerations
- Memory: Spans are held in memory until the root span completes. For very long-running agents, consider the memory footprint.
- Network: One HTTP request per agent run after completion.
- Latency: Zero impact on agent execution time - export happens asynchronously after completion.
Next Steps
Configuration
Learn about configuration options
Sessions & Runs
Learn more about tracking runs and sessions
Feedback
Set up user feedback collection
API Reference
Complete API documentation
