Introduction
LangChain provides a rich ecosystem of integrations through partner packages that connect your applications to popular AI services and tools. Each integration is distributed as a standalone package, giving you flexibility to install only what you need.

How integrations work
LangChain integrations are built on standardized abstractions from langchain-core, ensuring consistent interfaces across different providers. This means you can:
- Switch providers easily: Change from OpenAI to Anthropic with minimal code changes
- Mix and match: Use embeddings from one provider with chat models from another
- Install selectively: Only install the integrations you need for your project
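As a toy illustration of why a shared interface makes providers interchangeable (this is a minimal sketch of the pattern, not LangChain's actual class hierarchy):

```python
from typing import Protocol


class ChatModel(Protocol):
    """Stand-in for the shared chat-model interface."""

    def invoke(self, prompt: str) -> str: ...


class ProviderA:
    def invoke(self, prompt: str) -> str:
        return f"[provider-a] {prompt}"


class ProviderB:
    def invoke(self, prompt: str) -> str:
        return f"[provider-b] {prompt}"


def summarize(model: ChatModel, text: str) -> str:
    # Application code depends only on the interface, so any
    # conforming provider can be swapped in without changes.
    return model.invoke(f"Summarize: {text}")


print(summarize(ProviderA(), "hello"))  # [provider-a] Summarize: hello
print(summarize(ProviderB(), "hello"))  # [provider-b] Summarize: hello
```

Because every partner package implements the same core abstractions, switching providers in real LangChain code is similarly a matter of changing the import and the constructor.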
Installation patterns
Partner packages follow a consistent naming convention: langchain-{provider}. Install them using pip:
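For example, to add the OpenAI and Chroma integrations (install only the packages your project uses):

```shell
pip install langchain-openai
pip install langchain-chroma
```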
Available integrations
Chat Models
Chat models provide conversational AI capabilities from leading providers. These integrations support features like streaming, function calling, and vision where available.

OpenAI
GPT-4, GPT-3.5, and other OpenAI models with tool calling and vision support
Anthropic
Claude models with extended context windows and advanced reasoning
Ollama
Run open-source models locally with Llama, Mistral, and more
Groq
Ultra-fast inference with Llama and Mixtral models
Fireworks
High-performance inference for open-source models
DeepSeek
Powerful reasoning and coding models
Mistral AI
Mistral and Mixtral models with function calling
OpenRouter
Access to multiple model providers through a unified API
Perplexity
Online models with real-time web search capabilities
xAI
Grok models from xAI
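A minimal usage sketch for one of these chat models, assuming langchain-openai is installed and OPENAI_API_KEY is set in the environment (the model name is illustrative):

```python
from langchain_openai import ChatOpenAI

# Instantiate the provider's chat model; the standard interface
# (invoke, stream, batch) is shared across partner packages.
llm = ChatOpenAI(model="gpt-4o-mini")

response = llm.invoke("Explain vector embeddings in one sentence.")
print(response.content)
```

Switching to, say, Anthropic would mean importing ChatAnthropic from langchain-anthropic and changing only the constructor line; the rest of the code stays the same.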
Embeddings
Embedding models convert text into vector representations for semantic search and retrieval.

OpenAI Embeddings
Industry-standard text-embedding-3 models for semantic search
HuggingFace
Access thousands of open-source embedding models
Nomic
High-quality embeddings optimized for retrieval tasks
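For instance, a sketch using the OpenAI embeddings integration (assumes langchain-openai is installed and OPENAI_API_KEY is set; the model name is illustrative):

```python
from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(model="text-embedding-3-small")

# Embed a single query and a batch of documents; each result
# is a list of floats suitable for similarity search.
query_vector = embeddings.embed_query("How do I reset my password?")
doc_vectors = embeddings.embed_documents(
    ["Reset your password from the settings page.", "Contact support for billing."]
)
print(len(query_vector))  # dimensionality of the embedding
```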
Vector Stores
Vector stores enable efficient similarity search and retrieval of embedded documents.

Chroma
Open-source embedding database for AI applications
Qdrant
High-performance vector search engine with advanced filtering
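To make the idea concrete, here is a toy in-memory sketch of what a vector store does (cosine-similarity top-k search); production stores like Chroma and Qdrant add persistence, approximate-nearest-neighbor indexing, and metadata filtering:

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


class TinyVectorStore:
    """Toy illustration only -- not the LangChain VectorStore API."""

    def __init__(self):
        self.entries = []  # (vector, document) pairs

    def add(self, vector, document):
        self.entries.append((vector, document))

    def search(self, query, k=2):
        # Rank stored documents by similarity to the query vector.
        ranked = sorted(self.entries, key=lambda e: cosine(query, e[0]), reverse=True)
        return [doc for _, doc in ranked[:k]]


store = TinyVectorStore()
store.add([1.0, 0.0], "doc about cats")
store.add([0.9, 0.1], "doc about kittens")
store.add([0.0, 1.0], "doc about finance")
print(store.search([1.0, 0.05], k=2))  # ['doc about cats', 'doc about kittens']
```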
Tools
Tools extend your AI applications with external capabilities like search, data retrieval, and API interactions.

Exa Search
Neural search engine for finding high-quality web content
Using integrations in your code
Here's a typical pattern for using LangChain integrations: install the provider package, instantiate its model class, and call it through the standard interface (invoke, stream, batch).

Authentication
Most integrations require API keys for authentication. Set them as environment variables, such as OPENAI_API_KEY or ANTHROPIC_API_KEY, before running your application.

Next steps
Explore providers
Dive into detailed guides for each integration
Build with LangChain
Start building AI applications with these integrations
API Reference
View complete API documentation for all integrations
Contributing
Learn how to contribute new integrations or improve existing ones
Package reference
All LangChain partner packages are available on PyPI and follow semantic versioning:

| Package | Description | PyPI |
|---|---|---|
| langchain-openai | OpenAI models and embeddings | PyPI |
| langchain-anthropic | Anthropic (Claude) integration | PyPI |
| langchain-ollama | Local model support via Ollama | PyPI |
| langchain-groq | Groq inference engine | PyPI |
| langchain-fireworks | Fireworks AI models | PyPI |
| langchain-deepseek | DeepSeek models | PyPI |
| langchain-mistralai | Mistral AI models | PyPI |
| langchain-openrouter | OpenRouter unified API | PyPI |
| langchain-perplexity | Perplexity online models | PyPI |
| langchain-huggingface | HuggingFace models and embeddings | PyPI |
| langchain-nomic | Nomic embeddings | PyPI |
| langchain-chroma | Chroma vector database | PyPI |
| langchain-qdrant | Qdrant vector database | PyPI |
| langchain-exa | Exa search integration | PyPI |
| langchain-xai | xAI (Grok) models | PyPI |