# @agentlib/openai
OpenAI model provider for AgentLIB. Supports GPT-4o, GPT-4, o1, o3-mini, and any OpenAI-compatible API.

## Installation
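Assuming the package is published to npm under this name (inside the AgentLIB monorepo it is consumed as a workspace dependency instead):

```shell
npm install @agentlib/openai
```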
## Overview
The `@agentlib/openai` package provides a model provider that integrates OpenAI's Chat Completions API with AgentLIB agents. It handles:
- Message format conversion between AgentLIB and OpenAI formats
- Tool calling (function calling) support
- Token usage tracking
- Streaming support
- Model-specific handling (e.g., o1/o3 models)
- Custom base URLs for OpenAI-compatible APIs
## Quick Start
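A minimal sketch using the `openai` factory this README documents. The exact config fields (`apiKey`, `model`) are assumptions inferred from the sections below, and the agent wiring belongs to `@agentlib/core`:

```typescript
import { openai } from "@agentlib/openai";

// Create a provider for GPT-4o; pass it to an AgentLIB agent afterwards.
const provider = openai({
  apiKey: process.env.OPENAI_API_KEY,
  model: "gpt-4o",
});
```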
## Configuration
### OpenAIProviderConfig
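A plausible sketch of the config interface, inferred from the options this README mentions; the field names are assumptions, so check the package's type definitions for the authoritative shape:

```typescript
// Hypothetical shape of OpenAIProviderConfig, inferred from this README.
interface OpenAIProviderConfig {
  apiKey?: string;        // assumed to fall back to the OPENAI_API_KEY env var
  model: string;          // e.g. "gpt-4o", "gpt-4o-mini", "o1"
  baseURL?: string;       // override for OpenAI-compatible APIs
  organization?: string;  // OpenAI organization ID
  maxTokens?: number;     // cap on completion tokens
}

const config: OpenAIProviderConfig = { model: "gpt-4o" };
console.log(config.model); // "gpt-4o"
```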
## Usage Examples
### Basic Configuration
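A minimal configuration sketch (field names assumed, as noted under Configuration):

```typescript
import { openai } from "@agentlib/openai";

const provider = openai({
  apiKey: process.env.OPENAI_API_KEY, // or rely on the env-var default
  model: "gpt-4o",
});
```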
### GPT-4o Mini (Cost-Effective)
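For cheaper, lighter workloads, the same configuration with the smaller model:

```typescript
import { openai } from "@agentlib/openai";

// gpt-4o-mini trades some capability for substantially lower cost.
const provider = openai({ model: "gpt-4o-mini" });
```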
### OpenAI o1 Models
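Per the overview, o1/o3 models get model-specific handling; a hedged sketch, since the exact adjustments the provider makes are not documented here:

```typescript
import { openai } from "@agentlib/openai";

// o1-family models may reject options such as temperature or system
// messages; the provider is said to adjust requests for them.
const provider = openai({ model: "o1" });
```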
### Custom Base URL (OpenAI-Compatible APIs)
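Pointing the provider at any OpenAI-compatible server; the URL and model name below are illustrative placeholders for a local endpoint (e.g. Ollama or vLLM):

```typescript
import { openai } from "@agentlib/openai";

const provider = openai({
  baseURL: "http://localhost:11434/v1", // placeholder: local OpenAI-compatible API
  apiKey: "not-needed-locally",         // many local servers ignore the key
  model: "llama3",                      // whichever model the endpoint serves
});
```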
### With Organization
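Scoping requests to an OpenAI organization (the ID below is a placeholder):

```typescript
import { openai } from "@agentlib/openai";

const provider = openai({
  model: "gpt-4o",
  organization: "org-your-org-id", // placeholder organization ID
});
```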
### Token Limits
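Capping completion length; `maxTokens` is an assumed field name:

```typescript
import { openai } from "@agentlib/openai";

// Limit each completion to at most 1024 tokens.
const provider = openai({ model: "gpt-4o", maxTokens: 1024 });
```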
## Tool Calling Support
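As a self-contained illustration of the conversion involved (the `AgentTool` type and `toOpenAITool` helper are hypothetical, not the package's actual internals), a tool definition maps onto OpenAI's function-calling schema like this:

```typescript
// Hypothetical AgentLIB-side tool shape.
interface AgentTool {
  name: string;
  description: string;
  parameters: Record<string, unknown>; // JSON Schema for the arguments
}

// Sketch: wrap a tool definition in the OpenAI "tools" entry format.
function toOpenAITool(tool: AgentTool) {
  return {
    type: "function" as const,
    function: {
      name: tool.name,
      description: tool.description,
      parameters: tool.parameters,
    },
  };
}

const weather: AgentTool = {
  name: "get_weather",
  description: "Look up current weather for a city",
  parameters: {
    type: "object",
    properties: { city: { type: "string" } },
    required: ["city"],
  },
};

console.log(toOpenAITool(weather).function.name); // "get_weather"
```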
The provider automatically handles tool calling (function calling).

## Streaming Support
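A self-contained sketch of the streaming pattern (the simulated stream stands in for OpenAI chunk deltas; the real provider's callback API may differ):

```typescript
// Simulated stream of text deltas, standing in for OpenAI chunk deltas.
async function* fakeStream(): AsyncGenerator<string> {
  yield "Hel";
  yield "lo";
  yield "!";
}

// A provider typically accumulates streamed deltas into the final text
// while forwarding each delta to the caller as it arrives.
async function collect(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const delta of stream) {
    text += delta; // a real provider would also emit `delta` to a callback here
  }
  return text;
}

collect(fakeStream()).then((text) => console.log(text)); // prints "Hello!"
```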
The provider supports streaming responses.

## Token Usage Tracking
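A self-contained sketch of usage accounting (the `TokenUsage` shape and `addUsage` helper are hypothetical; they illustrate how per-request usage might be folded into a running total):

```typescript
interface TokenUsage {
  promptTokens: number;
  completionTokens: number;
  totalTokens: number;
}

// Merge one request's usage into a running total for the agent run.
function addUsage(total: TokenUsage, step: TokenUsage): TokenUsage {
  return {
    promptTokens: total.promptTokens + step.promptTokens,
    completionTokens: total.completionTokens + step.completionTokens,
    totalTokens: total.totalTokens + step.totalTokens,
  };
}

const runTotal = addUsage(
  { promptTokens: 12, completionTokens: 8, totalTokens: 20 },
  { promptTokens: 30, completionTokens: 10, totalTokens: 40 },
);
console.log(runTotal.totalTokens); // 60
```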
Token usage is automatically tracked and returned.

## Exports
### Classes

- `OpenAIProvider` - The main provider class implementing `ModelProvider`

### Functions

- `openai(config)` - Factory function to create an OpenAI provider

### Types

- `OpenAIProviderConfig` - Configuration interface
## Error Handling

The provider handles common API error conditions, including:
- Invalid API key
- Rate limiting
- Maximum token length exceeded
- Invalid tool call JSON
## Requirements
- Node.js: >= 18.0.0
- Dependencies:
  - `@agentlib/core` (workspace dependency)
  - `openai` `^4.52.0`
## Environment Variables
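The only variable this README implies is the standard OpenAI key; whether the provider reads it automatically when `apiKey` is omitted is an assumption, so pass `apiKey` explicitly if in doubt:

```shell
export OPENAI_API_KEY="sk-your-key-here"  # placeholder value
```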
## Related Packages
- `@agentlib/core` - Core runtime
- `@agentlib/reasoning` - Reasoning engines that work with this provider