What You’ll Build
You’ll:
- Install Phoenix
- Launch the Phoenix server
- Instrument a simple OpenAI application
- View traces in the Phoenix UI
Install Phoenix
Install Phoenix using pip. This installs the complete Phoenix platform, including the server, tracing capabilities, and evaluation tools.
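The install command, assuming the standard `arize-phoenix` package name on PyPI:

```shell
pip install arize-phoenix
```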
Phoenix requires Python 3.10 or higher. See the Installation page for other installation methods.
Launch the Phoenix Server
Start the Phoenix server with a single command. Once you see output indicating the server has started, open your browser and navigate to http://localhost:6006 to see the Phoenix UI.
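A sketch of the launch command (the exact startup banner varies by version):

```shell
phoenix serve
```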
Install OpenAI and Instrumentation
In a new terminal, install the OpenAI SDK and Phoenix’s OpenAI instrumentation:
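For example, assuming the OpenInference instrumentation package for OpenAI:

```shell
pip install openai openinference-instrumentation-openai
```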
Trace Your First Application
Create a file called app.py with the following code:
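A minimal sketch of app.py, assuming Phoenix is running locally on port 6006 and using the OpenInference instrumentor registered against Phoenix’s OTel helper (the model name is illustrative; any chat model works):

```python
from openai import OpenAI
from openinference.instrumentation.openai import OpenAIInstrumentor
from phoenix.otel import register

# Point traces at the local Phoenix collector under the project "my-first-app"
tracer_provider = register(
    project_name="my-first-app",
    endpoint="http://localhost:6006/v1/traces",
)

# Instrument the OpenAI SDK so every call emits a trace
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

client = OpenAI(api_key="your-api-key-here")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a haiku about tracing."}],
)
print(response.choices[0].message.content)
```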
Replace "your-api-key-here" with your actual OpenAI API key.
Run Your Application
Execute your application. You should see the LLM’s response printed to your console.
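Run the script directly:

```shell
python app.py
```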
View Traces in Phoenix
Return to the Phoenix UI at http://localhost:6006. You should now see:
- Your project “my-first-app” in the projects list
- A trace showing the complete LLM interaction
- Detailed information including:
- Input messages
- Model response
- Token usage
- Latency
- Model parameters (temperature, etc.)
What’s Next?
Congratulations! You’ve successfully traced your first LLM application with Phoenix. Here’s what to explore next:
Run Evaluations
Learn how to evaluate your LLM outputs for quality, hallucinations, and relevance.
Explore Integrations
Instrument LangChain, LlamaIndex, or other frameworks you’re using.
Create Datasets
Build datasets from your traces for experimentation and evaluation.
Deploy to Production
Learn how to deploy Phoenix for production use.
Tracing Multiple Applications
You can trace multiple applications by using different project names.
Using Phoenix Cloud
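For example, a second application can register under its own project name (the name below is hypothetical); each project then appears separately in the Phoenix UI:

```python
from phoenix.otel import register

# Registering under a different project name keeps this app's
# traces separate from "my-first-app" in the Phoenix UI.
tracer_provider = register(
    project_name="my-second-app",  # hypothetical project name
    endpoint="http://localhost:6006/v1/traces",
)
```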
If you prefer not to self-host, you can use Phoenix Cloud instead:
- Sign up for a free account at app.phoenix.arize.com
- Get your API key from the settings page
- Update your code to point to Phoenix Cloud:
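One way to do this is to set Phoenix’s collector endpoint and API-key header via environment variables before registering (a sketch; substitute your own API key):

```python
import os

# Point the exporter at Phoenix Cloud instead of localhost
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"
# API key from the Phoenix Cloud settings page
os.environ["PHOENIX_CLIENT_HEADERS"] = "api_key=your-phoenix-api-key"

from phoenix.otel import register

tracer_provider = register(project_name="my-first-app")
```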
Troubleshooting
I don't see any traces in Phoenix
Check that:
- The Phoenix server is running (visit http://localhost:6006)
- Your application is using the correct endpoint (http://localhost:6006/v1/traces)
- The instrumentation is properly configured before making LLM calls
- There are no firewall rules blocking localhost:6006
ImportError: cannot import name 'OpenAIInstrumentor'
Make sure you’ve installed the instrumentation package:
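Assuming the OpenInference package name used earlier in this guide:

```shell
pip install openinference-instrumentation-openai
```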
Connection refused errors
Ensure the Phoenix server is running. You should see it listening on port 6006:
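For example (commands are illustrative; use whichever is available on your system):

```shell
lsof -i :6006
# or simply request the UI:
curl http://localhost:6006
```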
Need help? Join our Slack community and ask in the #phoenix-support channel.