Quick Start
Set your API key
Configure your OpenAI API key as a secret:
Secrets are encrypted at rest and injected at the network level. The agent never sees them in logs or LLM context.
Configuration Options
Pydantic AI supports various configuration options when creating an Agent:
Model Selection
Pydantic AI supports multiple LLM providers:

- openai:gpt-4o - GPT-4 Omni (recommended)
- openai:gpt-4-turbo - GPT-4 Turbo
- anthropic:claude-3-5-sonnet-20241022 - Claude 3.5 Sonnet
- google:gemini-pro - Google Gemini Pro
Type-Safe Responses
Pydantic AI uses Pydantic models for type-safe structured outputs (agent.py):
Adding Tools
Pydantic AI supports tools with type-safe parameters (agent.py):
Async Support
For better performance with I/O operations, use async (agent.py):
Dependency Injection
Pydantic AI supports dependency injection for sharing state across tools (agent.py):
Deployment Configuration
Create a superserve.yaml file for advanced deployment options:
superserve.yaml
Dependencies
Create a requirements.txt with your dependencies:
requirements.txt
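A minimal example only needs the framework itself (add version pins as your project requires):

```
pydantic-ai
```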
Or list them in pyproject.toml:
pyproject.toml
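A minimal PEP 621 sketch; the project name and version are placeholders:

```toml
[project]
name = "my-agent"        # placeholder project name
version = "0.1.0"
dependencies = ["pydantic-ai"]
```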
Session Persistence
The /workspace directory persists across turns and restarts. Here’s an example that saves conversation history:
agent.py
Validation and Error Handling
Pydantic AI provides built-in validation for structured outputs (agent.py):
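The underlying mechanism is ordinary Pydantic validation. A sketch of what happens when the LLM returns out-of-range data (the Sentiment schema and its fields are hypothetical):

```python
from pydantic import BaseModel, Field, ValidationError

class Sentiment(BaseModel):
    """Hypothetical output schema with a constrained score."""
    label: str
    score: float = Field(ge=0.0, le=1.0)  # must lie between 0 and 1

# Valid data passes:
ok = Sentiment(label="positive", score=0.9)

# Out-of-range data raises ValidationError; pydantic-ai uses this
# to ask the LLM to try again, up to the agent's retry limit.
try:
    Sentiment(label="positive", score=1.5)
except ValidationError as exc:
    print(f"rejected: {len(exc.errors())} error(s)")
```

Constraints like `Field(ge=..., le=...)` and clear field names double as guidance for the LLM, which is why tightening the model often fixes validation failures.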
Troubleshooting
Import error: No module named 'pydantic_ai'
Make sure you have a requirements.txt or pyproject.toml with pydantic-ai listed, then redeploy your agent.
API key not found
Set your API key as a secret:
Validation errors with structured outputs
Check your Pydantic model definitions and ensure the system prompt guides the LLM to produce valid data:
Next Steps
Core Concepts
Learn about isolation, persistence, and credentials
CLI Reference
Explore deployment options and CLI commands
Secrets Management
Manage API keys and environment variables
Session Management
Work with persistent sessions