
Requirements

LangChain requires Python 3.10 or higher. You can check your Python version:
python --version
Python versions supported: 3.10, 3.11, 3.12, 3.13, and 3.14

Install LangChain

1. Choose your package manager

LangChain can be installed using pip, uv, or poetry:
pip install langchain
# or: uv add langchain
# or: poetry add langchain
2. Verify installation

Verify LangChain is installed correctly:
import langchain
print(langchain.__version__)

Core dependencies

When you install langchain, the following core dependencies are automatically included:
  • langchain-core (>=1.2.10) - Base abstractions and interfaces
  • langgraph (>=1.0.8) - Agent orchestration framework
  • pydantic (>=2.7.4) - Data validation and settings management
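If you want to confirm which versions of these dependencies actually landed in your environment, the standard library's importlib.metadata can report them (a minimal sketch; it simply queries whatever is installed):

```python
from importlib.metadata import version, PackageNotFoundError

# Report the installed version of each core dependency
for pkg in ("langchain-core", "langgraph", "pydantic"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```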

Optional integrations

LangChain follows a modular architecture. Install only the integrations you need:
Install specific model provider packages:
pip install langchain-openai
Install embedding provider packages:
pip install langchain-openai
Install vector store packages:
pip install langchain-chroma
For document processing and chunking:
pip install langchain-text-splitters
For additional community-maintained integrations:
pip install langchain-community
The community package is large and includes many dependencies. Only install if you need integrations not available as standalone packages.
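To see at a glance which of these integration packages are present in your environment, you can probe for their top-level modules (a sketch using only the standard library; the module names follow the packages listed above):

```python
import importlib.util

# Top-level import names of the integration packages above
modules = (
    "langchain_openai",
    "langchain_chroma",
    "langchain_text_splitters",
    "langchain_community",
)
for name in modules:
    # find_spec returns None when a top-level module is not importable
    status = "installed" if importlib.util.find_spec(name) else "missing"
    print(f"{name}: {status}")
```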

Install with extras

You can install LangChain with optional dependencies using extras (quote the argument in shells like zsh that treat square brackets specially):
pip install "langchain[openai]"

Development setup

For development or contributing to LangChain:
1. Clone the repository

git clone https://github.com/langchain-ai/langchain.git
cd langchain
2. Install uv

LangChain uses uv for dependency management:
curl -LsSf https://astral.sh/uv/install.sh | sh
3. Install dependencies

Install all packages in editable mode:
uv sync --all-groups
Or install specific dependency groups:
# Test dependencies only
uv sync --group test

# Lint dependencies
uv sync --group lint
4. Run tests

make test

Environment variables

Configure API keys for the integrations you’re using:
.env
# OpenAI
OPENAI_API_KEY=your-openai-key

# Anthropic
ANTHROPIC_API_KEY=your-anthropic-key

# Google GenAI
GOOGLE_API_KEY=your-google-key

# Groq
GROQ_API_KEY=your-groq-key

# LangSmith (optional, for monitoring)
LANGSMITH_API_KEY=your-langsmith-key
LANGSMITH_TRACING=true
Use a .env file with python-dotenv to manage environment variables locally:
from dotenv import load_dotenv
load_dotenv()
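Once load_dotenv() has run, you can confirm which keys were picked up without printing the secrets themselves (a minimal sketch; the key names follow the .env example above):

```python
import os

# Report presence of each expected API key, not its value
for key in ("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GOOGLE_API_KEY",
            "GROQ_API_KEY", "LANGSMITH_API_KEY"):
    print(f"{key}: {'set' if os.getenv(key) else 'not set'}")
```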

Verify your setup

Create a simple test script to verify everything works:
test_setup.py
from langchain_core.messages import HumanMessage
from langchain_core.prompts import ChatPromptTemplate

# Test core imports
print("✓ LangChain core imported successfully")

# If you installed a chat model provider:
try:
    from langchain_openai import ChatOpenAI
    model = ChatOpenAI(model="gpt-4o-mini")
    response = model.invoke([HumanMessage(content="Hello!")])
    print(f"✓ Chat model works: {response.content[:50]}")
except ImportError:
    print("! Chat model provider not installed")
except Exception as e:
    print(f"! Error testing chat model: {e}")

print("\n✅ LangChain setup complete!")
Run the test:
python test_setup.py

Next steps

  • Quickstart: Build your first LLM application
  • Core concepts: Learn about the framework architecture
  • Integrations: Explore available providers
  • Guides: Follow step-by-step guides

Troubleshooting

Make sure you installed LangChain in the correct Python environment:
python -m pip install langchain
If you encounter dependency conflicts, try creating a fresh virtual environment:
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install langchain
If you encounter SSL errors when installing:
pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org langchain
Ensure your environment variables are set correctly. Check with:
import os
print(os.getenv("OPENAI_API_KEY"))
