This guide will get you from zero to tracing your first OpenAI call in under 5 minutes.

Before you begin

Make sure you have:
  • A LangSmith account (sign up at smith.langchain.com)
  • A LangSmith API key (create one from your Settings page)
  • An OpenAI API key
  • Python 3.10+ installed

Get your first trace

1. Install the packages

Install LangSmith and the OpenAI SDK:
pip install langsmith openai
2. Set your API keys

Configure your environment with your credentials and enable tracing:
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY=ls_...
export OPENAI_API_KEY=sk-...
Get your OpenAI API key from platform.openai.com/api-keys
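If you are working in a notebook or another environment where shell exports are inconvenient, you can set the same variables from Python with os.environ. The placeholder values below are stand-ins; replace them with your real keys.

```python
import os

# Equivalent to the export commands above; the values here are
# placeholders, not real keys.
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = "<your-langsmith-key>"
os.environ["OPENAI_API_KEY"] = "<your-openai-key>"
```

Set these before constructing the OpenAI client so the wrapper picks them up.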
3. Write and run your first traced code

Create a simple script that calls OpenAI and automatically traces it:
import openai
from langsmith import wrappers

# Wrap your OpenAI client
client = wrappers.wrap_openai(openai.OpenAI())

# Make an API call - automatically traced!
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is LangSmith?"}
    ]
)

print(response.choices[0].message.content)
4. View your trace

Visit smith.langchain.com and navigate to your default project. You should see your trace with:
  • Input messages and output response
  • Token usage and costs
  • Latency metrics
  • Model parameters
Traces typically appear within 1-2 seconds of making the API call.

Try streaming

Streaming responses are automatically traced too:
import openai
from langsmith import wrappers

client = wrappers.wrap_openai(openai.OpenAI())

# Stream the response
stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Count to 5"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
The complete streamed response will appear in your LangSmith trace, including all chunks and final token counts.
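The loop above prints each delta and then discards it. If you also want the assembled text after the stream ends, accumulate the deltas as they arrive. A live stream needs an API key, so the sketch below uses hypothetical stand-in chunks that mimic the shape of the SDK's streamed chunks (choices[0].delta.content, with None on the final chunk):

```python
from types import SimpleNamespace

# Stand-in chunks shaped like streamed chat-completion chunks;
# the final chunk carries content=None, as real streams do.
def fake_stream():
    for piece in ["1", " 2", " 3", " 4", " 5", None]:
        yield SimpleNamespace(
            choices=[SimpleNamespace(delta=SimpleNamespace(content=piece))]
        )

# Accumulate deltas so the full text survives after the loop.
full_text = ""
for chunk in fake_stream():
    delta = chunk.choices[0].delta.content
    if delta:
        full_text += delta

print(full_text)  # 1 2 3 4 5
```

The same accumulation pattern works unchanged with the real wrapped client's stream.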

Add metadata and tags

Organize your traces with custom metadata and tags:
import openai
from langsmith import wrappers

# Add metadata at the client level
client = wrappers.wrap_openai(
    openai.OpenAI(),
    tracing_extra={
        "metadata": {"user_id": "user_123", "environment": "production"},
        "tags": ["quickstart", "openai"]
    }
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}]
)

Trace custom code

Beyond OpenAI calls, you can trace any function with the @traceable decorator:
import openai
from langsmith import traceable, wrappers

client = wrappers.wrap_openai(openai.OpenAI())

@traceable
def generate_poem(topic: str) -> str:
    """Generate a haiku about the given topic."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are a poet. Write a haiku."},
            {"role": "user", "content": f"Write a haiku about {topic}"}
        ]
    )
    return response.choices[0].message.content

# This creates a trace with a nested OpenAI call
poem = generate_poem("coding")
print(poem)
This creates a hierarchical trace showing your custom function with the OpenAI call nested inside.
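To see why the trace comes out nested, it helps to know that a tracing decorator wraps each call and records it under whichever traced call is currently active. The toy sketch below is NOT LangSmith's implementation, just an illustration of the run-tree idea using a context-variable stack:

```python
import contextvars
import functools

# Toy sketch of run-tree nesting (not LangSmith's actual code):
# each traced call records its depth relative to the active call stack.
_stack = contextvars.ContextVar("stack", default=())
runs = []

def traceable_sketch(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        parent = _stack.get()
        runs.append({"name": fn.__name__, "depth": len(parent)})
        token = _stack.set(parent + (fn.__name__,))
        try:
            return fn(*args, **kwargs)
        finally:
            _stack.reset(token)
    return wrapper

@traceable_sketch
def call_model():
    return "haiku"

@traceable_sketch
def generate_poem():
    return call_model()

generate_poem()
print(runs)
# [{'name': 'generate_poem', 'depth': 0}, {'name': 'call_model', 'depth': 1}]
```

In the real SDK, the wrapped OpenAI client participates in the same mechanism, which is why its call appears as a child of generate_poem in the trace.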

Next steps

Now that you have tracing working, explore more features:

Installation

Configure advanced settings and environment variables

Tracing concepts

Learn about run trees, projects, and filtering

OpenAI integration

Explore advanced OpenAI features like structured outputs

Evaluation

Test your LLM applications with datasets

Troubleshooting

If traces aren’t showing up:
  1. Verify your API key is set: echo $LANGSMITH_API_KEY
  2. Check that your key starts with ls_
  3. Look for error messages in your application logs
  4. Try setting export LANGSMITH_TRACING=true explicitly
  5. Ensure you’re viewing the correct project in the LangSmith UI
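The first few checks in the list above can be scripted. This is a minimal stdlib sketch (the expected ls_ prefix follows this guide's example key format, and check_langsmith_env is a hypothetical helper, not part of the SDK):

```python
import os

def check_langsmith_env() -> list[str]:
    """Return a list of problems found with the tracing environment."""
    problems = []
    key = os.environ.get("LANGSMITH_API_KEY", "")
    if not key:
        problems.append("LANGSMITH_API_KEY is not set")
    elif key != key.strip():
        problems.append("LANGSMITH_API_KEY has leading/trailing whitespace")
    elif not key.startswith("ls"):
        problems.append("LANGSMITH_API_KEY does not look like a LangSmith key")
    if os.environ.get("LANGSMITH_TRACING", "").lower() not in ("true", "1"):
        problems.append("LANGSMITH_TRACING is not set to true")
    return problems

for problem in check_langsmith_env():
    print("!!", problem)
```

Run it in the same environment as your application; an empty result means the basic configuration looks sound.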
If you see import errors, reinstall the SDK for your language.
Python:
pip install --upgrade langsmith openai
TypeScript:
npm install langsmith openai
Make sure you’re using Python 3.10+ or Node.js 18+.
If you see “Unauthorized” or 401 errors:
  1. Verify your API key is correct in your Settings page
  2. Check that the key hasn’t been revoked
  3. If using an organization-scoped key, set LANGSMITH_WORKSPACE_ID
  4. Ensure there are no extra spaces in your environment variable
