
Quick Install

Install dlt using your preferred package manager:
pip install dlt

Python Version Requirements

dlt supports Python 3.9 through Python 3.14.
Python 3.14 support is experimental. Some optional extras may not yet be available for this version.
Supported Python versions:
  • Python 3.9.2+
  • Python 3.10
  • Python 3.11
  • Python 3.12
  • Python 3.13
  • Python 3.14 (experimental)
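If you want a deployment script to guard against unsupported interpreters, the range above can be checked programmatically. A minimal sketch (the bounds are copied from the list above; is_supported is an illustrative helper, not part of dlt):

```python
import sys

# Supported range per the list above: 3.9.2 up to the 3.14 series.
MIN_VERSION = (3, 9, 2)
MAX_MINOR = 14

def is_supported(version_info=None):
    """Return True if the given (or running) interpreter version
    falls within dlt's supported range."""
    v = tuple(version_info or sys.version_info[:3])
    return v >= MIN_VERSION and v[:2] <= (3, MAX_MINOR)

print(is_supported())
```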

Installing with Destinations

To use dlt with specific destinations, install with the appropriate extras:
pip install "dlt[duckdb]"

All Destination Extras

Here’s the complete list of destination extras:
Extra | Description | Key Dependencies
duckdb | DuckDB database | duckdb>=0.9
ducklake | DuckDB with Iceberg support | duckdb>=1.2.0, pyarrow>=16.0.0
postgres | PostgreSQL database | psycopg2-binary>=2.9.1
bigquery | Google BigQuery | google-cloud-bigquery>=2.26.0, pyarrow>=16.0.0
snowflake | Snowflake data warehouse | snowflake-connector-python>=3.12.0
redshift | Amazon Redshift | psycopg2-binary>=2.9.1
athena | AWS Athena | pyathena>=3.8.2, pyarrow>=16.0.0
databricks | Databricks | databricks-sql-connector>=3.0.0
mssql | Microsoft SQL Server | pyodbc>=4.0.39
clickhouse | ClickHouse | clickhouse-driver>=0.2.7
motherduck | MotherDuck (cloud DuckDB) | duckdb>=1.0.0
qdrant | Qdrant vector database | qdrant-client>=1.11.3
weaviate | Weaviate vector database | weaviate-client>=4.9.3
lancedb | LanceDB vector database | lancedb>=0.10.0, pyarrow>=16.0.0
filesystem | File system (S3, GCS, Azure) | s3fs>=2022.4.0, botocore>=1.28
You can install multiple extras at once:
pip install "dlt[postgres,bigquery,filesystem]"

Format and Feature Extras

Install additional features based on your data format needs:
pip install "dlt[parquet]"

Common Feature Extras

Extra | Description | Use Case
parquet | Parquet file format support | Efficient columnar storage and loading
gcp | Google Cloud Platform services | BigQuery, GCS access
hub | dlt Hub integration | Access to 5000+ verified sources
cli | Enhanced CLI tools | Pipeline scaffolding and management

Complete Installation Example

For a typical production setup with PostgreSQL and Parquet support:
pip install "dlt[postgres,parquet]"

Development Installation

To install dlt from source for development:
git clone https://github.com/dlt-hub/dlt.git
cd dlt
uv pip install -e ".[duckdb,postgres,parquet]"
The dlt repository uses uv for dependency management. Check the Makefile for common development commands.

Verifying Installation

Verify that dlt is installed correctly:
import dlt

print(f"dlt version: {dlt.__version__}")

# Test a simple pipeline
pipeline = dlt.pipeline(
    pipeline_name='test_pipeline',
    destination='duckdb',
    dataset_name='test_data'
)

print("Installation successful!")
Expected output:
dlt version: 1.22.2
Installation successful!

Core Dependencies

dlt has minimal required dependencies that are always installed:
# Key dependencies (version 1.22.2)
requests>=2.26.0
pendulum>=2.1.2
simplejson>=3.17.5
PyYAML>=5.4.1
semver>=3.0.0
click>=7.1
fsspec>=2022.4.0
sqlglot>=25.4.0
orjson>=3.6.7  # Fast JSON processing
tenacity>=8.0.2  # Retry logic
dlt follows semantic versioning. To allow only patch updates, use the compatible release specifier dlt~=1.22.0 (note that dlt~=1.22 would also allow new minor releases).

Dependency Versioning

When adding dlt as a dependency to your project:

Using Compatible Release Specifier

pyproject.toml
[project]
dependencies = [
    "dlt~=1.22",  # Allows >=1.22.0, <1.23.0
]

Using Poetry

pyproject.toml
[tool.poetry.dependencies]
dlt = "^1.22"  # Allows >=1.22.0, <2.0.0
  • Major version changes may include breaking changes and removed deprecations
  • Minor version changes add new features, sometimes with automatic migrations
  • Patch version changes include bug fixes only
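To see what a compatible release specifier actually accepts, the rule can be sketched in a few lines. This is a simplified model of PEP 440's ~= operator on version tuples, not a real resolver:

```python
def compatible_release(spec, version):
    """Simplified PEP 440 compatible-release (~=) check on version tuples.
    ~= trims the last component of the spec and requires the version to
    match the remaining prefix while being >= the spec itself."""
    prefix = spec[:-1]
    return version >= spec and version[:len(prefix)] == prefix

# dlt~=1.22.0 pins to the 1.22 patch series:
print(compatible_release((1, 22, 0), (1, 22, 7)))  # True
print(compatible_release((1, 22, 0), (1, 23, 0)))  # False
# dlt~=1.22 is looser: it also accepts new minor releases below 2.0:
print(compatible_release((1, 22), (1, 23, 0)))     # True
```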

Platform Support

dlt works on all major platforms:
  • Linux (all distributions)
  • macOS
  • Windows (Windows 10+)
  • WebAssembly/Pyodide (experimental)

Configuration After Installation

After installing dlt, you’ll typically need to configure credentials for your destinations.

Initialize a New Project

Use the dlt CLI to scaffold a new project:
dlt init my_source postgres
This creates:
  • my_source.py - Your source/pipeline code
  • .dlt/config.toml - Configuration file
  • .dlt/secrets.toml - Credentials file (gitignored)
  • requirements.txt - Python dependencies
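A quick way to confirm the scaffold was created is to check for the files listed above. This is a throwaway helper for illustration, not part of the dlt CLI:

```python
from pathlib import Path

# Files that `dlt init my_source postgres` should create, per the list above.
EXPECTED = [
    "my_source.py",
    ".dlt/config.toml",
    ".dlt/secrets.toml",
    "requirements.txt",
]

def missing_scaffold_files(root="."):
    """Return any expected scaffold files that are absent under root."""
    return [p for p in EXPECTED if not (Path(root) / p).exists()]
```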

Manual Configuration

Create configuration files manually:
.dlt/secrets.toml
[destination.postgres.credentials]
host = "localhost"
port = 5432
username = "your_username"
password = "your_password"
database = "your_database"
.dlt/config.toml
[runtime]
log_level = "INFO"

[pipeline]
workers = 4
Never commit .dlt/secrets.toml to version control. Add it to your .gitignore file.

Environment Variables

Alternatively, configure dlt using environment variables:
export DESTINATION__POSTGRES__CREDENTIALS__HOST="localhost"
export DESTINATION__POSTGRES__CREDENTIALS__PORT="5432"
export DESTINATION__POSTGRES__CREDENTIALS__USERNAME="your_username"
export DESTINATION__POSTGRES__CREDENTIALS__PASSWORD="your_password"
export DESTINATION__POSTGRES__CREDENTIALS__DATABASE="your_database"
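The double underscores in these names act as section separators, mirroring the TOML structure of the config files. A small sketch of the mapping (illustrative only; dlt performs this resolution internally):

```python
def env_to_config_path(name):
    """Map a dlt-style environment variable name to its TOML config path,
    splitting on the double-underscore section separator."""
    return ".".join(part.lower() for part in name.split("__"))

print(env_to_config_path("DESTINATION__POSTGRES__CREDENTIALS__HOST"))
# destination.postgres.credentials.host
```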

Docker Installation

Run dlt in a Docker container:
Dockerfile
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

CMD ["python", "pipeline.py"]
requirements.txt
dlt[postgres,parquet]==1.22.2

Troubleshooting

If Python cannot import dlt, first ensure you’ve activated the correct Python environment and that dlt is actually installed in it:
pip list | grep dlt
Try installing dlt in a fresh virtual environment:
python -m venv dlt-env
source dlt-env/bin/activate  # On Windows: dlt-env\Scripts\activate
pip install "dlt[duckdb]"
Some dependencies require compilation. Install build tools:
Linux (Ubuntu/Debian):
sudo apt-get install build-essential python3-dev
macOS:
xcode-select --install
Windows: Install Microsoft C++ Build Tools
Python 3.14 support is experimental. Some dependencies may not have wheels available yet. Use Python 3.11 or 3.12 for production workloads.

Upgrading dlt

Upgrade to the latest version:
pip install --upgrade dlt
Check the release notes for migration guides and breaking changes.
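You can also check which version is installed without shelling out to pip, using the standard library's importlib.metadata (the helper name here is ours, not dlt's):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist="dlt"):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None

print(installed_version())
```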

Next Steps

Now that dlt is installed:
  • Quickstart - Build your first pipeline in 5 minutes
  • Core Concepts - Learn about sources, resources, and pipelines
  • Configuration - Set up credentials and configuration
  • Destinations - Configure your destination
