## Overview
TypeAgent requires API credentials and configuration via environment variables for its LLM and embedding providers. The system supports both public OpenAI and Azure OpenAI services, with automatic provider detection.

Recommended: use a `.env` file and call `load_dotenv()` at the start of your program.

## OpenAI Environment Variables
For public OpenAI services (api.openai.com).

### Required

`OPENAI_API_KEY`: your OpenAI API key.

### Optional
#### `OPENAI_MODEL`

The chat model to use for knowledge extraction and structured output.

Common values:

- `gpt-4o` (default)
- `gpt-4o-mini`
- `gpt-4-turbo`
- `gpt-3.5-turbo`
The embedding model to use for semantic search and indexing.

Common values:

- `text-embedding-3-small` (recommended, 1536 dims)
- `text-embedding-3-large` (best quality, 3072 dims)
- `text-embedding-ada-002` (legacy, 1536 dims)
Custom base URL for OpenAI-compatible embedding servers (e.g., Infinity).

Note: `OPENAI_API_KEY` must still be set (it can be any value for self-hosted servers).

Custom endpoint URL for an OpenAI-compatible Chat Completions API.

Important: ensure `OPENAI_MODEL` matches the deployed model name.

## Azure OpenAI Environment Variables
For the OpenAI service hosted on Microsoft Azure.

### Required

#### `AZURE_OPENAI_API_KEY`

Your Azure OpenAI API key, or `"identity"` to use Azure Managed Identity.

Format: string key or `"identity"`

#### `AZURE_OPENAI_ENDPOINT`

Full URL of the Azure OpenAI REST API endpoint for chat completions.

Format: `https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/chat/completions?api-version=YYYY-MM-DD`

#### `AZURE_OPENAI_ENDPOINT_EMBEDDING`

Full URL of the Azure OpenAI REST API endpoint for embeddings.

Format: `https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_EMBEDDING_DEPLOYMENT_NAME/embeddings?api-version=YYYY-MM-DD`

### Optional (Model-Specific Endpoints)
For scenarios with multiple embedding model deployments:

Azure endpoint specifically for the `text-embedding-3-small` model. If set, it takes precedence over `AZURE_OPENAI_ENDPOINT_EMBEDDING` when using this model.

Azure endpoint specifically for the `text-embedding-3-large` model. If set, it takes precedence over `AZURE_OPENAI_ENDPOINT_EMBEDDING` when using this model.

Separate API key for embedding endpoints (optional). If not set, it falls back to `AZURE_OPENAI_API_KEY`.

## Provider Conflict Resolution
When both OpenAI and Azure OpenAI credentials are present, the provider is detected automatically; see the `OPENAI_MODEL` warning under Troubleshooting for how the Azure endpoint determines the model.

## Other Provider Environment Variables
TypeAgent supports 25+ providers via pydantic-ai. Common providers:

### Anthropic

### Groq

### Cohere

### AWS Bedrock

Uses standard AWS credentials (boto3).

### Ollama

No environment variables required. Defaults to `http://localhost:11434`.
Optional:
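A sketch of the usual environment variables for these providers, as a `.env` fragment. `ANTHROPIC_API_KEY` and `GROQ_API_KEY` are the standard SDK names; the Cohere and AWS names follow those SDKs' conventions and should be verified against the pydantic-ai provider docs:

```
# Anthropic
ANTHROPIC_API_KEY=sk-ant-your-key-here

# Groq
GROQ_API_KEY=gsk-your-key-here

# Cohere
CO_API_KEY=your-cohere-key-here

# AWS Bedrock (standard boto3 credentials)
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_DEFAULT_REGION=us-east-1
```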
## Example .env Files
### Public OpenAI
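A minimal sketch (the key is a placeholder; `OPENAI_MODEL` may be omitted to use the default):

```
OPENAI_API_KEY=sk-your-key-here
OPENAI_MODEL=gpt-4o
```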
### Azure OpenAI
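A sketch using the variable names documented above; resource, deployment, and `api-version` values are placeholders:

```
AZURE_OPENAI_API_KEY=your-azure-key
AZURE_OPENAI_ENDPOINT=https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/chat/completions?api-version=YYYY-MM-DD
AZURE_OPENAI_ENDPOINT_EMBEDDING=https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_EMBEDDING_DEPLOYMENT_NAME/embeddings?api-version=YYYY-MM-DD
```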
### Azure OpenAI with Managed Identity
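The same layout, with `identity` in place of a key so TypeAgent authenticates via Azure Managed Identity (placeholders as above):

```
AZURE_OPENAI_API_KEY=identity
AZURE_OPENAI_ENDPOINT=https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/chat/completions?api-version=YYYY-MM-DD
AZURE_OPENAI_ENDPOINT_EMBEDDING=https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_EMBEDDING_DEPLOYMENT_NAME/embeddings?api-version=YYYY-MM-DD
```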
### Mixed Providers
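Keys for several providers can coexist in one file; a sketch mixing OpenAI with two of the providers above (all keys are placeholders):

```
OPENAI_API_KEY=sk-your-key-here
ANTHROPIC_API_KEY=sk-ant-your-key-here
GROQ_API_KEY=gsk-your-key-here
```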
### Local Development (Ollama)
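Since Ollama needs no credentials and defaults to `http://localhost:11434`, an empty or comment-only `.env` suffices:

```
# No API keys required; Ollama serves at http://localhost:11434 by default.
```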
## Loading Environment Variables
### Using python-dotenv (Recommended)
### Using python-dotenv with Custom Path
### Manual Setting (Not Recommended)
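Setting variables directly in code works, but it risks committing keys to source control; shown only for completeness:

```python
import os

# Hardcoding credentials is discouraged -- values end up in version control.
os.environ["OPENAI_API_KEY"] = "sk-placeholder"  # placeholder, not a real key
os.environ["OPENAI_MODEL"] = "gpt-4o"
```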
## Security Best Practices
### Never Commit Keys
Add `.env` to `.gitignore`. Never commit API keys to version control.

### Use Secret Managers
In production, use Azure Key Vault, AWS Secrets Manager, or similar.
### Rotate Keys
Regularly rotate API keys and revoke old keys.
### Managed Identity

Use Azure Managed Identity (`AZURE_OPENAI_API_KEY=identity`) in Azure environments.

## Troubleshooting
### "No API key found"

Set the `OPENAI_API_KEY` or `AZURE_OPENAI_API_KEY` environment variable.
### "Embedding model mismatch"

- Use the same model: `create_embedding_model("openai:text-embedding-ada-002")`
- Create a new database file
### Azure endpoint format errors

Correct format: `https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT_NAME/chat/completions?api-version=YYYY-MM-DD`

### "OPENAI_MODEL ignored; Azure deployment determined by AZURE_OPENAI_ENDPOINT"

This is a warning, not an error. When using Azure, the deployment name in the endpoint URL determines the model, not `OPENAI_MODEL`.
## Verification Script
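A sketch of such a script, using the variable names documented on this page (`check_config` is a hypothetical helper, not part of TypeAgent):

```python
import os


def check_config() -> bool:
    """Report whether the minimum TypeAgent provider variables are set."""
    has_openai = bool(os.environ.get("OPENAI_API_KEY"))
    has_azure = bool(os.environ.get("AZURE_OPENAI_API_KEY"))
    if not (has_openai or has_azure):
        print("No API key found: set OPENAI_API_KEY or AZURE_OPENAI_API_KEY")
        return False
    if has_azure and not os.environ.get("AZURE_OPENAI_ENDPOINT"):
        print("AZURE_OPENAI_API_KEY is set but AZURE_OPENAI_ENDPOINT is missing")
        return False
    print("Environment configuration looks OK")
    return True


if __name__ == "__main__":
    check_config()
```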
Test your configuration by confirming that the variables for your provider are set before starting TypeAgent.

## Related
- Model Adapters - Using models in code
- ConversationSettings - Configuring conversations
- Embedding Models - Embedding interfaces