Introduction

LangChain provides a rich ecosystem of integrations through partner packages that connect your applications to popular AI services and tools. Each integration is distributed as a standalone package, giving you flexibility to install only what you need.

How integrations work

LangChain integrations are built on standardized abstractions from langchain-core, ensuring consistent interfaces across different providers. This means you can:
  • Switch providers easily: Change from OpenAI to Anthropic with minimal code changes
  • Mix and match: Use embeddings from one provider with chat models from another
  • Install selectively: Only install the integrations you need for your project
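The reason provider swaps are cheap is that application code depends only on the shared abstraction, not on any one provider's class. As a toy illustration of that idea (plain Python with made-up `Fake*` classes, not actual LangChain code):

```python
from abc import ABC, abstractmethod

class BaseChatModel(ABC):
    """Minimal stand-in for the shared interface defined in langchain-core."""

    @abstractmethod
    def invoke(self, prompt: str) -> str:
        ...

class FakeOpenAIChat(BaseChatModel):
    def invoke(self, prompt: str) -> str:
        return f"[openai] {prompt}"

class FakeAnthropicChat(BaseChatModel):
    def invoke(self, prompt: str) -> str:
        return f"[anthropic] {prompt}"

def answer(llm: BaseChatModel, question: str) -> str:
    # Application code only touches the abstract interface,
    # so swapping providers requires no changes here.
    return llm.invoke(question)

print(answer(FakeOpenAIChat(), "hello"))     # [openai] hello
print(answer(FakeAnthropicChat(), "hello"))  # [anthropic] hello
```

The real packages follow the same shape: `ChatOpenAI` and `ChatAnthropic` both expose `invoke`, so the calling code stays identical when you switch.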

Installation patterns

Partner packages follow a consistent naming convention: langchain-{provider}. Install them using pip:
# Install a single integration
pip install langchain-openai

# Install multiple integrations
pip install langchain-openai langchain-anthropic langchain-chroma
Each package is independently versioned and maintained, allowing you to update integrations without affecting your entire LangChain installation.

Available integrations

Chat Models

Chat models provide conversational AI capabilities from leading providers. These integrations support features like streaming, function calling, and vision where available.

OpenAI

GPT-4, GPT-3.5, and other OpenAI models with tool calling and vision support

Anthropic

Claude models with extended context windows and advanced reasoning

Ollama

Run open-source models locally with Llama, Mistral, and more

Groq

Ultra-fast inference with Llama and Mixtral models

Fireworks

High-performance inference for open-source models

DeepSeek

Powerful reasoning and coding models

Mistral AI

Mistral and Mixtral models with function calling

OpenRouter

Access to multiple model providers through a unified API

Perplexity

Online models with real-time web search capabilities

xAI

Grok models from xAI

Embeddings

Embeddings models convert text into vector representations for semantic search and retrieval.
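Conceptually, an embedding is just a vector of floats, and "semantic search" means comparing those vectors, most commonly with cosine similarity. A stdlib-only sketch of the comparison step (the three-dimensional vectors here are invented for illustration; real models emit hundreds or thousands of dimensions):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" of three texts.
cat = [0.9, 0.1, 0.0]
kitten = [0.85, 0.15, 0.05]
invoice = [0.0, 0.2, 0.95]

print(cosine_similarity(cat, kitten))   # near 1.0: related meanings
print(cosine_similarity(cat, invoice))  # near 0.0: unrelated meanings
```

Embedding models are trained so that texts with similar meanings land close together under exactly this kind of comparison.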

OpenAI Embeddings

Industry-standard text-embedding-3 models for semantic search

HuggingFace

Access thousands of open-source embedding models

Nomic

High-quality embeddings optimized for retrieval tasks

Vector Stores

Vector stores enable efficient similarity search and retrieval of embedded documents.
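Under the hood, a vector store pairs each stored text with its embedding and, given a query vector, returns the texts whose vectors are closest. A brute-force, stdlib-only sketch of that core loop (real stores like Chroma and Qdrant add indexing, persistence, metadata filtering, and far better scaling):

```python
import math

def _cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

class ToyVectorStore:
    """In-memory store doing brute-force cosine search over stored vectors."""

    def __init__(self) -> None:
        self._docs: list[tuple[str, list[float]]] = []

    def add(self, text: str, embedding: list[float]) -> None:
        self._docs.append((text, embedding))

    def similarity_search(self, query_embedding: list[float], k: int = 1) -> list[str]:
        # Rank every stored document by similarity to the query vector.
        ranked = sorted(
            self._docs,
            key=lambda doc: _cosine(doc[1], query_embedding),
            reverse=True,
        )
        return [text for text, _ in ranked[:k]]

store = ToyVectorStore()
store.add("cats are mammals", [0.9, 0.1])
store.add("tax forms are due in April", [0.1, 0.9])
print(store.similarity_search([0.8, 0.2], k=1))  # ['cats are mammals']
```

The LangChain vector store integrations expose a similar `similarity_search` method, but with real embeddings supplied by an embeddings model rather than hand-written vectors.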

Chroma

Open-source embedding database for AI applications

Qdrant

High-performance vector search engine with advanced filtering

Tools

Tools extend your AI applications with external capabilities like search, data retrieval, and API interactions.

Exa Search

Neural search engine for finding high-quality web content

Using integrations in your code

Here’s a typical pattern for using LangChain integrations:
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic
from langchain_chroma import Chroma
from langchain_openai import OpenAIEmbeddings

# Initialize a chat model
llm = ChatOpenAI(model="gpt-4")

# Or switch to a different provider
# llm = ChatAnthropic(model="claude-3-5-sonnet-20241022")

# Set up embeddings and vector store
embeddings = OpenAIEmbeddings(model="text-embedding-3-small")
vector_store = Chroma(embedding_function=embeddings)

# Use them together
response = llm.invoke("What can you help me with?")

Authentication

Most integrations require API keys for authentication. Set them as environment variables:
# OpenAI
export OPENAI_API_KEY="your-api-key"

# Anthropic
export ANTHROPIC_API_KEY="your-api-key"

# HuggingFace
export HUGGINGFACEHUB_API_TOKEN="your-api-key"

# Exa
export EXA_API_KEY="your-api-key"
Alternatively, pass API keys directly when initializing:
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(api_key="your-api-key", model="gpt-4")
Never commit API keys to version control. Use environment variables or secure secret management systems.
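A common pattern is a small helper that reads the key from the environment and fails loudly when it is missing, so keys never appear in source code. A stdlib-only sketch (the `require_api_key` helper is our own illustration, not a LangChain API):

```python
import os

def require_api_key(env_var: str) -> str:
    """Fetch an API key from the environment, failing fast if it is unset."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(
            f"{env_var} is not set; export it in your shell "
            "or load it from a secret manager."
        )
    return key

# Usage (assuming the shell has exported OPENAI_API_KEY):
#   llm = ChatOpenAI(api_key=require_api_key("OPENAI_API_KEY"), model="gpt-4")
```

Failing fast at startup gives a clear error instead of an opaque authentication failure on the first model call.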

Next steps

Explore providers

Dive into detailed guides for each integration

Build with LangChain

Start building AI applications with these integrations

API Reference

View complete API documentation for all integrations

Contributing

Learn how to contribute new integrations or improve existing ones

Package reference

All LangChain partner packages are available on PyPI and follow semantic versioning:
  • langchain-openai: OpenAI models and embeddings
  • langchain-anthropic: Anthropic (Claude) integration
  • langchain-ollama: Local model support via Ollama
  • langchain-groq: Groq inference engine
  • langchain-fireworks: Fireworks AI models
  • langchain-deepseek: DeepSeek models
  • langchain-mistralai: Mistral AI models
  • langchain-openrouter: OpenRouter unified API
  • langchain-perplexity: Perplexity online models
  • langchain-huggingface: HuggingFace models and embeddings
  • langchain-nomic: Nomic embeddings
  • langchain-chroma: Chroma vector database
  • langchain-qdrant: Qdrant vector database
  • langchain-exa: Exa search integration
  • langchain-xai: xAI (Grok) models