The `dlt deploy` command prepares your pipeline for deployment and provides step-by-step instructions for deploying to GitHub Actions or Apache Airflow.
## Synopsis
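The general shape of the command, reconstructed from the Arguments section below (check `dlt deploy --help` for the authoritative form):

```sh
dlt deploy <PIPELINE_SCRIPT> <DEPLOYMENT_METHOD> [OPTIONS]
```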
## Installation
The deploy command requires additional dependencies:

- `pipdeptree` - for dependency analysis
- `cron-descriptor` - for schedule validation
- additional deployment utilities
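These dependencies ship as an extra of the `dlt` package; installing the `cli` extra (verify the extra name against your dlt version) pulls them in:

```sh
pip install "dlt[cli]"
```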
## Description
The `dlt deploy` command:
- Validates that your pipeline has been run successfully at least once locally
- Detects the Git repository for your pipeline script
- Generates deployment configuration files
- Extracts required credentials and environment variables
- Provides detailed setup instructions for the selected deployment method
## Arguments
### `PIPELINE_SCRIPT`

Path to your pipeline script (relative or absolute).

### `DEPLOYMENT_METHOD`

The deployment target. Available methods:

- `github-action` - GitHub Actions (free tier available)
- `airflow-composer` - Apache Airflow / Google Cloud Composer
## Global Options
### `--location`

Advanced option. URL or local path to the deployment templates repository. Default: `https://github.com/dlt-hub/dlt-deploy-template.git`
### `--branch`

Advanced option. Specific branch of the deployment repository to use.

## Deployment Methods
### GitHub Actions
Deploys your pipeline to GitHub Actions, a CI/CD platform with a generous free tier.

#### Usage
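A minimal invocation might look like this (`my_pipeline.py` is a placeholder for your script):

```sh
dlt deploy my_pipeline.py github-action --schedule "*/30 * * * *"
```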
#### Required Options
##### `--schedule`

Cron expression defining when the pipeline runs. Must be quoted. The five fields are:

```text
minute hour day month weekday
```
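A few common expressions in this format (illustrative values):

```text
*/30 * * * *   every 30 minutes
0 6 * * *      daily at 06:00
0 6 * * 1-5    at 06:00 on weekdays
```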
#### Optional Flags
##### `--run-manually`

Allows triggering the pipeline manually from the GitHub Actions UI. Default: `true`.
##### `--run-on-push`

Runs the pipeline on every push to the repository. Default: `false`.
#### Example
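For instance, deploying a pipeline that runs every 30 minutes and also on each push (`my_pipeline.py` is a placeholder, and the bare-flag form of `--run-on-push` is an assumption; check `dlt deploy --help` for the exact syntax):

```sh
dlt deploy my_pipeline.py github-action \
  --schedule "*/30 * * * *" \
  --run-on-push
```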
#### Generated Files
##### `.github/workflows/run_PIPELINE_workflow.yml`

The GitHub Actions workflow file.

##### `requirements_github_action.txt`

Frozen dependencies from your current environment (excluding system packages).

#### Setting Up GitHub Secrets
1. Navigate to your repository on GitHub
2. Go to Settings > Secrets and variables > Actions
3. Click New repository secret
4. Add each secret listed in the deploy output, for example:
   - Name: `SOURCES__GITHUB__API_KEY`; Value: your GitHub API token from `.dlt/secrets.toml`
5. Repeat for all destination credentials
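The secret names follow dlt's configuration convention: the TOML section path is upper-cased and joined with double underscores. For example, a `.dlt/secrets.toml` entry like:

```toml
[sources.github]
api_key = "<your token>"
```

becomes the secret `SOURCES__GITHUB__API_KEY`.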
### Airflow / Google Cloud Composer
Deploys your pipeline to Apache Airflow or Google Cloud Composer.

#### Usage
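A minimal invocation (`my_pipeline.py` is a placeholder for your script):

```sh
dlt deploy my_pipeline.py airflow-composer
```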
#### Options
##### `--secrets-format`

Format for providing secrets to Airflow. Choices: `env`, `toml`. Default: `toml`.
**`env` format** - each secret as a separate environment variable:
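A sketch of what this looks like, using the credential names that appear elsewhere on this page (values are placeholders):

```shell
# Each dlt credential becomes one environment variable
export SOURCES__GITHUB__API_KEY="<api key>"
export DESTINATION__BIGQUERY__CREDENTIALS__PROJECT_ID="<project id>"
```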
**`toml` format** - all secrets in a single TOML string:
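In this mode the whole `.dlt/secrets.toml` travels as one value, stored as `DLT_SECRETS_TOML` in the Composer setup steps below. A sketch with placeholder values:

```toml
[sources.github]
api_key = "<api key>"

[destination.bigquery.credentials]
project_id = "<project id>"
```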
#### Example
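For instance, to deploy with secrets as individual environment variables (`my_pipeline.py` is a placeholder):

```sh
dlt deploy my_pipeline.py airflow-composer --secrets-format env
```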
#### Generated Files
##### `.airflow/dags/dag_PIPELINE.py`

The Airflow DAG template.

##### `.airflow/build/cloudbuild.yaml`

Google Cloud Build configuration for syncing DAGs.

#### Setting Up in Google Cloud Composer
1. **Create a Composer environment**

2. **Install the dlt package**
   - Go to Cloud Composer in the Google Cloud Console
   - Select your environment
   - Navigate to PyPI Packages
   - Add `dlt[bigquery]>=1.0.0`

3. **Add secrets**

   For the `toml` format:
   - Navigate to Environment Variables
   - Add the variable `DLT_SECRETS_TOML`
   - Paste in the entire contents of your `.dlt/secrets.toml`

   For the `env` format:
   - Add each credential as a separate environment variable:
     - `SOURCES__GITHUB__API_KEY`
     - `DESTINATION__BIGQUERY__CREDENTIALS__PROJECT_ID`
     - etc.

4. **Configure Cloud Build**
   - Edit `.airflow/build/cloudbuild.yaml`
   - Set `_BUCKET_NAME` to your Composer bucket (found in the environment details)
   - Set up a Cloud Build trigger for your repository

5. **Push to GitHub**
## Prerequisites
### Git Repository Required

The deploy command requires your pipeline script to be in a Git repository.

### Pipeline Must Be Run Locally

Run your pipeline at least once before deploying, so that dlt can detect its configuration and extract the required credentials.

## Troubleshooting
### Error: No git repository found

The pipeline script is not inside a Git repository. Initialize one with `git init`, commit your script, and run the deploy command again.

### Error: Pipeline was not run

dlt found no successful local run of the pipeline, so it cannot extract the configuration and credentials it needs. Run the pipeline script once locally, then deploy again.

### Error: Invalid schedule expression

The value passed to `--schedule` is not a valid cron expression. Check the field order (`minute hour day month weekday`) and make sure the whole expression is quoted.
### Modified Files Warning

If your working tree contains uncommitted changes, deploy warns you: the deployed workflow runs the committed and pushed version of your files, so commit and push your changes before deploying.
## Best Practices
### Test Locally First

Always test your pipeline locally before deploying.

### Use Descriptive Schedules
Choose schedules that match your data freshness requirements:

- Real-time APIs: Every 5-15 minutes
- Daily reports: Once per day at off-peak hours
- Historical data: Weekly or monthly
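As concrete cron expressions for `--schedule`, those tiers might look like this (illustrative values):

```text
*/15 * * * *   every 15 minutes
0 3 * * *      daily at 03:00 (off-peak)
0 4 * * 0      weekly, Sundays at 04:00
```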
### Secure Your Secrets
- Never commit `.dlt/secrets.toml` to Git
- Use environment variables in production
- Rotate credentials regularly
- Use least-privilege service accounts
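One simple safeguard for the first point is to make sure Git ignores the secrets file (if you scaffolded the project with `dlt init`, a suitable `.gitignore` may already exist — verify it covers the secrets file):

```shell
# Keep local credentials out of version control
echo ".dlt/secrets.toml" >> .gitignore
```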
### Monitor Your Pipelines
- Set up alerts for failures
- Check logs regularly
- Monitor data quality
- Track load times and volumes