LangChain.js provides a rich ecosystem of integrations for building LLM-powered applications. These integrations are distributed across multiple packages to keep your dependencies minimal and your bundle sizes small.

Integration Categories

Chat Models

Connect to various LLM providers like OpenAI, Anthropic, Google, and more

Embeddings

Generate vector embeddings for semantic search and similarity matching

Vector Stores

Store and retrieve embeddings with popular vector databases

Document Loaders

Load and process documents from various sources and formats

Package Structure

LangChain.js integrations are organized into several package types:

Provider Packages

First-party integrations maintained by the LangChain team:
  • @langchain/openai - OpenAI models and embeddings
  • @langchain/anthropic - Anthropic Claude models
  • @langchain/google-genai - Google Gemini models
  • @langchain/cohere - Cohere models and embeddings
  • @langchain/mistralai - Mistral AI models
  • And many more…
Each provider package includes only the dependencies needed for that specific integration, keeping your bundle size minimal.

Community Package

The @langchain/community package contains community-maintained integrations:
  • Additional chat models
  • Vector store implementations
  • Document loaders
  • Utility integrations
Install community integrations:
npm install @langchain/community

Installation

Install only the integrations you need:
# Install specific provider packages
npm install @langchain/openai
npm install @langchain/anthropic
npm install @langchain/pinecone

# Or install the community package for broader access
npm install @langchain/community

Environment Support

LangChain.js integrations work across multiple JavaScript environments:
  • Node.js (20.x, 22.x, 24.x)
  • Browser
  • Cloudflare Workers
  • Vercel Edge Functions
  • Deno
  • Bun
Most integrations support all environments, but some may have environment-specific limitations. Check the individual integration documentation for details.

Next Steps

Chat Models

Start building with LLM chat models

Embeddings

Learn about embedding models

Vector Stores

Set up vector storage

Document Loaders

Load documents into your application
