The `dlt pipeline` command provides operations to inspect pipeline working directories, tables, and data in the destination, and to troubleshoot loading problems.
Synopsis
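A sketch of the general invocation, assembled from the global options and operations documented below (the exact synopsis may differ):

```shell
dlt pipeline [--pipelines-dir DIR] [--verbose] PIPELINE_NAME COMMAND [ARGS]

# List all pipelines in the working directory (no PIPELINE_NAME needed)
dlt pipeline --list-pipelines
```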
Description
Every pipeline run creates a working directory in `~/.dlt/pipelines/[PIPELINE_NAME]` that contains:
- Pipeline state (metadata, resource states, source states)
- Schemas with table and column definitions
- Load packages (extracted, normalized, and completed)
- Trace information from the last run
The `dlt pipeline` command lets you inspect all of this without writing code.
Global Options
PIPELINE_NAME
The name of the pipeline to inspect. Required for all operations except `--list-pipelines`.
--pipelines-dir
Path to the pipelines working directory. Defaults to `~/.dlt/pipelines`.
--verbose, -v
Increases output verbosity. Can be used multiple times for more detail.
Operations
list (default)
Lists all pipelines in the working directory, sorted by last run time.
info
Displays comprehensive information about the pipeline state, schemas, and working directory contents. With the `-v` flag, it also shows:
- Full source state JSON
- Detailed column information for each table
- Whether tables have received data
- Incomplete columns count
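For example, to show the detailed view (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline info -v
```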
show
Launches the interactive workspace dashboard for exploring pipeline data, schemas, and state. Requires `marimo` to be installed.
Options
--streamlit Launches the legacy Streamlit dashboard instead. Requires `streamlit` to be installed.
--edit
Creates an editable version of the workspace dashboard in the current directory and launches it in edit mode. This option does not apply to `--streamlit`.
trace
Displays the execution trace from the last pipeline run, including timing information and any errors. With the `-v` or `-vv` flags, shows more detailed timing and load information.
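For example (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline trace
```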
failed-jobs
Displays information about all failed load jobs in completed packages. With the `-v` flag, shows the full job details and error stack traces.
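For example, to include stack traces (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline failed-jobs -v
```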
schema
Displays the default schema for the pipeline.
Options
--format Output format for the schema. Choices: `json`, `yaml`, `dbml`, `dot`, `mermaid`. Default: `yaml`.
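For example, to emit the schema as JSON (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline schema --format json
```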
load-package
Displays detailed information about a specific load package. If `LOAD_ID` is omitted, shows the most recent package.
With the `-v` flag, also displays the schema update that was applied during this load.
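For example, to inspect the most recent package (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline load-package
```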
sync
Drops the local pipeline state and restores it from the destination. Useful for:
- Recovering from corrupted local state
- Syncing state across different machines
- Resetting after manual destination changes
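For example (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline sync
```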
Options
--destination Specify the destination name when the local pipeline state is missing.
drop
Selectively drop tables and reset resource state. Use this to force a full refresh of specific resources.
Arguments
RESOURCES One or more resources to drop. Can be:
- Exact resource names: `issues`, `pull_requests`
- Regex patterns (prefixed with `re:`): `"re:^repo"`
Options
--drop-all Drop all resources in the schema. Supersedes the `RESOURCES` argument.
Example
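For instance, dropping every resource whose name starts with `repo`, using the regex form shown above (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline drop "re:^repo"
```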
- All indicated tables are dropped in the destination
- Tables are removed from the schema
- Resource state is reset
- Updated schema and state are stored in the destination
drop-pending-packages
Deletes all extracted and normalized packages, including partially loaded ones. Afterwards, run `dlt pipeline ... sync` to restore the state from the destination.
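For example (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline drop-pending-packages
```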
mcp
Launches an MCP (Model Context Protocol) server for the pipeline, enabling programmatic access to pipeline data and schemas.
Options
--port Port number for the MCP server. Default: `43656`.
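For example, starting the server on the default port (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline mcp --port 43656
```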
Working Directory Location
The default pipeline working directory is `~/.dlt/pipelines/[PIPELINE_NAME]`.
Troubleshooting
Cannot Restore Pipeline
If the working directory is corrupted, use `sync` to restore it from the destination:
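A minimal sketch (`my_pipeline` is a placeholder name):

```shell
dlt pipeline my_pipeline sync
```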
Credentials Not Found
The `drop` and `sync` commands require destination credentials. Run them from the same directory as your pipeline script, or set credentials as environment variables:
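For example, assuming a Postgres destination, credentials can be supplied via dlt's double-underscore environment variable convention (the connection string values here are placeholders):

```shell
export DESTINATION__POSTGRES__CREDENTIALS="postgresql://user:password@host:5432/dbname"
```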
See Also
- dlt init - Initialize new pipelines
- dlt deploy - Deploy pipelines to production
- Pipeline Working Directory