Route, manage, and analyze your LLM requests
A unified API gateway for multiple LLM providers with built-in analytics, caching, and cost tracking
Quick start
Get up and running with LLM Gateway in minutes
Sign up for an account
Visit llmgateway.io to create your account, or self-host the gateway on your own infrastructure.
Get your API key
After signing in, navigate to your project settings and generate an API key. This key will authenticate your requests to the gateway.
Make your first request
Use the OpenAI-compatible API to route requests through the gateway:
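The original page's cURL tab and its example response did not survive extraction. As a stand-in, here is a minimal sketch of the same first request using only the Python standard library. The base URL `https://api.llmgateway.io/v1` and the model name are assumptions; check your dashboard for the exact endpoint your project uses.

```python
import json
import urllib.request

# Assumed gateway endpoint -- substitute your own URL if self-hosting.
BASE_URL = "https://api.llmgateway.io/v1"
API_KEY = "YOUR_API_KEY"  # generated in your project settings

# The request body follows the OpenAI chat-completions schema.
payload = {
    "model": "gpt-4o-mini",  # illustrative model name
    "messages": [{"role": "user", "content": "Hello from LLM Gateway!"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# With a valid key, uncommenting these lines sends the request
# and prints the assistant's reply:
# body = json.load(urllib.request.urlopen(req))
# print(body["choices"][0]["message"]["content"])
```

Because the endpoint is OpenAI-compatible, the response has the familiar shape: a `choices` array whose first entry carries `message.content`, plus a `usage` object with token counts.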
View analytics
Check your dashboard to see request logs, token usage, costs, and performance metrics across all your LLM calls.
Core features
Everything you need to manage your LLM infrastructure
Unified API interface
OpenAI-compatible API that works with all major LLM providers. Drop-in replacement for existing integrations.
Multi-provider support
Connect to OpenAI, Anthropic, Google, AWS Bedrock, and more through a single gateway.
Usage analytics
Track requests, tokens, costs, and performance metrics with detailed dashboards and exportable reports.
Response caching
Reduce costs and latency with intelligent Redis-based response caching across providers.
API key management
Generate, rotate, and manage API keys with fine-grained permissions and usage limits.
Guardrails
Implement content filters, rate limits, and safety policies to protect your applications.
Explore by topic
Learn how to use LLM Gateway for your use case
Projects & organizations
Organize your work with projects and manage team access with organizations.
Playground
Test and compare models interactively with the built-in playground.
MCP integration
Connect LLM Gateway with Model Context Protocol-compatible tools.
OpenAI SDK
Use the OpenAI Python or Node.js SDK with LLM Gateway.
LangChain
Integrate LLM Gateway with your LangChain applications.
Vercel AI SDK
Build AI-powered apps with Vercel AI SDK and LLM Gateway.
API reference
Complete API documentation for the Gateway and Management APIs
Gateway API
OpenAI-compatible endpoints for chat completions, images, and models.
Management API
Manage API keys, provider keys, projects, organizations, and view activity logs.
Ready to get started?
Start routing your LLM requests through a unified gateway with built-in analytics and caching.
Get Started