Before you begin
Make sure you have:
- A LangSmith account (sign up at smith.langchain.com)
- A LangSmith API key from your Settings page
- Python 3.10+ or Node.js 18+
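Before running the examples on this page, you'll typically export your credentials as environment variables. This is a minimal sketch; the variable names `LANGSMITH_TRACING` and `LANGSMITH_API_KEY` are the ones referenced later on this page, and `OPENAI_API_KEY` is assumed for the OpenAI examples:

```shell
# Enable tracing and authenticate the LangSmith SDK
# (copy the key from your Settings page; it starts with ls_).
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY="<your-langsmith-api-key>"

# Needed for the OpenAI calls on this page.
export OPENAI_API_KEY="<your-openai-api-key>"
```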
Get your first trace
Write and run your first traced code
Create a simple script that calls OpenAI and automatically traces it:
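The original snippet isn't shown here; a minimal sketch using the LangSmith OpenAI wrapper might look like the following (it assumes the `langsmith` and `openai` packages are installed and the environment variables above are set — the model name and prompt are illustrative):

```python
from openai import OpenAI
from langsmith.wrappers import wrap_openai

# wrap_openai patches the client so every chat completion call
# is automatically logged as a trace in your LangSmith project.
client = wrap_openai(OpenAI())

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello, LangSmith!"}],
)
print(response.choices[0].message.content)
```

Running this script once is enough to produce a trace you can inspect in the UI.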
View your trace
Visit smith.langchain.com and navigate to your default project. You should see your trace with:
- Input messages and output response
- Token usage and costs
- Latency metrics
- Model parameters
Try streaming
Streaming responses are automatically traced too:
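A streaming sketch under the same assumptions as the first script (wrapped client, environment variables set; model and prompt are illustrative):

```python
from openai import OpenAI
from langsmith.wrappers import wrap_openai

client = wrap_openai(OpenAI())

# stream=True yields incremental chunks; the wrapper collects them
# into a single trace with the full response.
stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Stream a short haiku."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```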
Add metadata and tags
Organize your traces with custom metadata and tags:
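One way to attach metadata and tags is the `langsmith_extra` argument accepted by wrapped client calls — a sketch, with hypothetical metadata keys and tag names:

```python
from openai import OpenAI
from langsmith.wrappers import wrap_openai

client = wrap_openai(OpenAI())

# langsmith_extra attaches metadata and tags to this call's trace,
# so you can filter and group runs in the LangSmith UI.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    langsmith_extra={
        "metadata": {"user_id": "user-123", "env": "dev"},  # hypothetical values
        "tags": ["quickstart", "experiment-a"],             # hypothetical tags
    },
)
```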
Trace custom code
Beyond OpenAI calls, you can trace any function with the @traceable decorator:
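A minimal sketch of the decorator (function names and the prompt are illustrative; nested decorated calls appear as child runs of the outer run):

```python
from langsmith import traceable

@traceable  # records inputs, outputs, and latency as a run
def format_prompt(subject: str) -> str:
    return f"Write a one-line poem about {subject}."

@traceable(run_type="chain")  # groups nested traced calls under one parent run
def pipeline(subject: str) -> str:
    prompt = format_prompt(subject)
    # ...call your model here; the call would appear as a child run...
    return prompt

print(pipeline("the ocean"))
```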
Next steps
Now that you have tracing working, explore more features:
- Installation: Configure advanced settings and environment variables
- Tracing concepts: Learn about run trees, projects, and filtering
- OpenAI integration: Explore advanced OpenAI features like structured outputs
- Evaluation: Test your LLM applications with datasets
Troubleshooting
No traces appearing
If traces aren’t showing up:
- Verify your API key is set: `echo $LANGSMITH_API_KEY`
- Check that your key starts with `ls_`
- Look for error messages in your application logs
- Try setting `export LANGSMITH_TRACING=true` explicitly
- Ensure you're viewing the correct project in the LangSmith UI
Import errors
If you see import errors, make sure you're using Python 3.10+ or Node.js 18+ and that the SDK is installed for your runtime (Python or TypeScript).
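The install commands — a sketch assuming the standard `langsmith` package names on PyPI and npm, with `openai` included for the examples on this page:

```shell
# Python
pip install -U langsmith openai

# TypeScript
npm install langsmith openai
```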
Authentication errors
If you see “Unauthorized” or 401 errors:
- Verify your API key is correct in your Settings page
- Check that the key hasn't been revoked
- If using an organization-scoped key, set `LANGSMITH_WORKSPACE_ID`
- Ensure there are no extra spaces in your environment variable