Portkey AI Gateway seamlessly integrates with popular agent frameworks and SDKs, enabling you to route to 250+ LLMs while maintaining familiar APIs and patterns.

Why Integrate with Portkey?

Integrating Portkey with your existing frameworks provides:
  • Universal Access: Connect to 250+ LLMs through a single, unified interface
  • Production Reliability: Automatic retries, fallbacks, and load balancing
  • Performance: Smart caching to reduce latency and costs by up to 20x
  • Observability: Complete logging and tracing for all requests
  • Easy Migration: Minimal code changes to switch between providers
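To make the reliability bullet concrete, Portkey expresses retries, fallbacks, and load balancing as a gateway config. The sketch below builds a minimal fallback config as a plain dict; the virtual keys are placeholders, and the exact schema should be checked against Portkey’s config reference.

```python
import json

# Hedged sketch: a gateway config asking Portkey to try one provider first,
# then fall back to a second if the call fails. The virtual keys are
# placeholders for keys created in the Portkey dashboard.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtual_key": "openai-virtual-key"},
        {"virtual_key": "anthropic-virtual-key"},
    ],
}

# The config travels to the gateway as JSON, e.g. via a config header or
# a saved config ID referenced at request time.
config_header = json.dumps(fallback_config)
print(config_header)
```

The same `targets` list can carry weights for load balancing instead of an ordered fallback chain.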

Agent Frameworks

Portkey integrates with leading agent frameworks to bring production-grade LLM routing to your agentic workflows.

LangChain

Build LLM applications with LangChain’s composable framework

LlamaIndex

Create RAG applications with the LlamaIndex data framework

Autogen

Build multi-agent conversations with Microsoft Autogen

CrewAI

Orchestrate role-playing autonomous AI agents

Phidata

Build AI assistants with memory and knowledge

SDK Integrations

Use Portkey through native SDKs or existing OpenAI-compatible clients.

Python SDK

Native Python SDK with full type support

JavaScript SDK

TypeScript/JavaScript SDK for Node.js and browsers

OpenAI SDK

Use Portkey as a drop-in replacement for OpenAI SDK

REST API

Direct HTTP API access for any language
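For languages without a native SDK, the gateway is a plain HTTPS endpoint. This sketch builds (but does not send) a chat completion request with Python’s standard library; the `x-portkey-*` header names follow Portkey’s documented auth headers, and the placeholder keys must be replaced before dispatching.

```python
import json
import urllib.request

# Hedged sketch: the quickstart chat completion as a raw HTTP request.
# Only constructed here, not sent; fill in real keys to run it.
payload = {
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    "https://api.portkey.ai/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "x-portkey-api-key": "your-portkey-api-key",
        "x-portkey-virtual-key": "your-provider-virtual-key",
    },
    method="POST",
)

# urllib.request.urlopen(req) would dispatch it once real keys are set.
```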

Quick Comparison

| Framework | Call 250+ LLMs | Advanced Routing | Caching | Logging & Tracing | Observability |
| --- | --- | --- | --- | --- | --- |
| LangChain | ✅ | ✅ | ✅ | ✅ | ✅ |
| LlamaIndex | ✅ | ✅ | ✅ | ✅ | ✅ |
| Autogen | ✅ | ✅ | ✅ | ✅ | ✅ |
| CrewAI | ✅ | ✅ | ✅ | ✅ | ✅ |
| Phidata | ✅ | ✅ | ✅ | ✅ | ✅ |
| Python SDK | ✅ | ✅ | ✅ | ✅ | ✅ |
| JavaScript SDK | ✅ | ✅ | ✅ | ✅ | ✅ |

Getting Started

Most integrations follow a similar pattern:

1. Install Portkey

   Install the Portkey SDK or configure your existing client:

   ```bash
   pip install portkey-ai
   ```

2. Configure Gateway

   Point your client to Portkey’s gateway URL and add headers:

   ```python
   from portkey_ai import Portkey

   client = Portkey(
       api_key="your-portkey-api-key",
       virtual_key="your-provider-virtual-key"
   )
   ```

3. Make Requests

   Use your framework’s standard API; no code changes needed:

   ```python
   response = client.chat.completions.create(
       model="gpt-4",
       messages=[{"role": "user", "content": "Hello!"}]
   )
   ```
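Beyond the basic request, per-request options such as tracing, metadata, and caching ride along as `x-portkey-*` headers. The sketch below builds such a header dict; the header names follow Portkey’s public docs, but verify them against the current header reference before relying on them.

```python
import json

# Hedged sketch: per-request gateway options expressed as headers.
trace_headers = {
    # groups related calls under one trace in the logs
    "x-portkey-trace-id": "checkout-flow-0042",
    # attaches searchable metadata to the request
    "x-portkey-metadata": json.dumps({"_user": "u-123"}),
    # enables simple (exact-match) response caching via an inline config
    "x-portkey-config": json.dumps({"cache": {"mode": "simple"}}),
}
print(sorted(trace_headers))
```

Most SDK clients accept extra headers per call, so these can be added without changing the request body.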

Next Steps

Select your framework or SDK from the integrations above to see its specific integration guide with code examples.
All integrations support Portkey’s advanced features, including fallbacks, load balancing, caching, and observability.
