OpenAI SDK Integration
The OpenAI SDK provides official client libraries for Python and Node.js. Since LLM Gateway is fully OpenAI-compatible, you can use these SDKs with minimal configuration changes.

Quick Start
To use LLM Gateway with the OpenAI SDK, you only need to:

- Set the base URL to `https://api.llmgateway.io/v1`
- Use your LLM Gateway API key for authentication
- Specify models using the `provider/model` format, or use `auto` for automatic routing
Installation
For Python, install the official SDK with `pip install openai`; for Node.js, run `npm install openai`.
Before and After Comparison
Streaming
LLM Gateway fully supports streaming responses.
Function Calling (Tools)
LLM Gateway supports OpenAI’s function calling API.
Environment Variables
You can use environment variables instead of hardcoding credentials, for example loaded from a `.env` file.
Model Selection
LLM Gateway supports multiple ways to specify models: a fully qualified `provider/model` identifier, a bare model name, or `auto` for automatic routing.

Advanced Features
JSON Output
Force the model to output valid JSON.

Reasoning Models
Control reasoning effort for models like o1.

Caveats and Limitations
- Model Names: Use LLM Gateway’s model naming scheme (e.g., `gpt-5` instead of `gpt-4o`)
- Authentication: Use your LLM Gateway API key, not provider-specific keys
- Base URL: Always set `base_url` (Python) or `baseURL` (Node.js) to `https://api.llmgateway.io/v1`
- Response Metadata: LLM Gateway adds extra metadata to the response (provider used, routing info, costs)