Source: dvc/api/experiments.py:62-120
Description
Retrieves DVC experiments tracked in a repository. Without arguments, this function retrieves all experiments derived from the Git HEAD.
This function provides programmatic access to experiment data, including parameters, metrics, and metadata, making it easy to analyze, compare, and report on experiments.
Signature
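An approximate signature, reconstructed from the parameters described below (a sketch, not the verbatim source):

```python
from typing import Dict, List, Optional, Union

def exp_show(
    repo: Optional[str] = None,
    revs: Optional[Union[str, List[str]]] = None,
    num: int = 1,
    param_deps: bool = False,
    force: bool = False,
    config: Optional[dict] = None,
) -> List[Dict]:
    """Return a list of dicts, one per experiment."""
    ...
```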
Parameters
repo — Location of the DVC repository.
- Defaults to the current project (found by walking up from the current working directory)
- Can be a URL or a file system path
- Both HTTP and SSH protocols are supported for online Git repos
revs — Git revision(s) to use as a reference point for listing experiments.
- Defaults to None, which uses HEAD as the starting point
- Can be a single string or a list of strings
- Each revision can be a branch, tag, or commit SHA
num — Show experiments from the last num commits (first parents) starting from the revs baseline.
- Give a negative value to include all first-parent commits (similar to git log -n)
- Defaults to 1 (only experiments from the most recent commit)
param_deps — Include only parameters that are stage dependencies.
- When True, filters to show only parameters explicitly listed as dependencies
- When False, shows all parameters
force — Force re-collection of experiments instead of loading from cache.
- DVC caches experiment data for performance
- Use force=True to reload all experiment data and ignore cached results
- Useful when you need the most up-to-date data
config — Config dictionary to be passed through to the DVC project.
Returns
A list of dictionaries, where each dictionary contains information about an individual experiment. Each experiment dict includes:
- Experiment: Name of the experiment
- rev: Git revision/commit hash
- Created: Timestamp when created
- State: Experiment state (Queued, Running, Success, Failed)
- metrics.*: All metrics (e.g., metrics.accuracy)
- params.*: All parameters (e.g., params.train.lr)
Examples
Basic Usage - Show All Experiments
Show Experiments from Multiple Commits
Show Experiments from Specific Branch
Compare Experiment Metrics
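Each row is a flat dict, so metrics can be compared directly by key. The rows below are shaped like exp_show() output, with made-up values:

```python
# Sample rows shaped like exp_show() output; values are illustrative
exps = [
    {"Experiment": "exp-1", "metrics.accuracy": 0.88, "metrics.loss": 0.35},
    {"Experiment": "exp-2", "metrics.accuracy": 0.91, "metrics.loss": 0.29},
]

for exp in exps:
    print(f"{exp['Experiment']}: acc={exp['metrics.accuracy']} loss={exp['metrics.loss']}")
```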
Find Best Performing Experiment
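Picking the best run is a max() over the rows, skipping experiments that never logged the metric. The sample rows stand in for real exp_show() output:

```python
# `exps` would come from dvc.api.exp_show(); sample rows for illustration
exps = [
    {"Experiment": "exp-1", "metrics.accuracy": 0.88},
    {"Experiment": "exp-2", "metrics.accuracy": 0.91},
    {"Experiment": "baseline", "metrics.accuracy": None},
]

best = max(
    (e for e in exps if e.get("metrics.accuracy") is not None),
    key=lambda e: e["metrics.accuracy"],
)
print("best:", best["Experiment"])  # best: exp-2
```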
Filter Experiments by Criteria
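A list comprehension filters on any combination of state, metrics, or parameters; thresholds and values here are made up:

```python
# Sample rows shaped like exp_show() output
exps = [
    {"Experiment": "exp-1", "State": "Success", "metrics.accuracy": 0.93},
    {"Experiment": "exp-2", "State": "Failed", "metrics.accuracy": None},
    {"Experiment": "exp-3", "State": "Success", "metrics.accuracy": 0.87},
]

# Keep only successful runs above an accuracy threshold
good = [
    e for e in exps
    if e["State"] == "Success" and (e["metrics.accuracy"] or 0) > 0.9
]
print([e["Experiment"] for e in good])  # ['exp-1']
```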
Analyze Parameter Impact
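Grouping rows by a parameter value and averaging a metric shows that parameter's effect; the parameter key and values are illustrative:

```python
from collections import defaultdict

# Sample rows shaped like exp_show() output
exps = [
    {"params.train.lr": 0.01, "metrics.accuracy": 0.91},
    {"params.train.lr": 0.01, "metrics.accuracy": 0.89},
    {"params.train.lr": 0.1, "metrics.accuracy": 0.80},
]

# Group accuracies by learning rate, then average per group
by_lr = defaultdict(list)
for e in exps:
    by_lr[e["params.train.lr"]].append(e["metrics.accuracy"])

for lr, accs in sorted(by_lr.items()):
    print(f"lr={lr}: mean acc={sum(accs) / len(accs):.3f}")
```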
Export Experiments to CSV
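Because each row is a flat dict, csv.DictWriter can serialize the list directly; take the union of keys since experiments may differ in fields:

```python
import csv
import io

# Sample rows shaped like exp_show() output; note differing keys
exps = [
    {"Experiment": "exp-1", "metrics.accuracy": 0.88},
    {"Experiment": "exp-2", "metrics.accuracy": 0.91, "params.train.lr": 0.1},
]

# Union of all keys, so no row loses a column
fields = sorted({k for e in exps for k in e})
buf = io.StringIO()  # swap in open("experiments.csv", "w") for a real file
writer = csv.DictWriter(buf, fieldnames=fields, restval="")
writer.writeheader()
writer.writerows(exps)
print(buf.getvalue())
```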
Compare Across Multiple Branches
Remote Repository Access
Force Refresh Cache
Only Show Dependency Parameters
Build Leaderboard
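Sorting the rows by a metric yields a ranked table; sample rows stand in for real exp_show() output:

```python
# Sample rows shaped like exp_show() output
exps = [
    {"Experiment": "exp-1", "metrics.accuracy": 0.88},
    {"Experiment": "exp-2", "metrics.accuracy": 0.91},
    {"Experiment": "exp-3", "metrics.accuracy": None},
]

# Rank by accuracy, best first, skipping rows without the metric
board = sorted(
    (e for e in exps if e["metrics.accuracy"] is not None),
    key=lambda e: e["metrics.accuracy"],
    reverse=True,
)
for rank, e in enumerate(board, start=1):
    print(f"{rank}. {e['Experiment']}  acc={e['metrics.accuracy']}")
```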
Use Cases
Experiment Tracking
Track and analyze all experiments in your project.
Model Selection
Find the best performing model based on metrics.
Hyperparameter Analysis
Understand the impact of different parameters.
Team Collaboration
Share and compare experiments across team members.
Return Value Structure
Each experiment dictionary contains:
- Experiment metadata
- Metrics (prefixed with 'metrics.')
- Parameters (prefixed with 'params.')
- Nested parameters
Parameters are flattened with dot notation:
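For example, a nested params.yaml section becomes dot-separated keys in each experiment dict (the parameter names below are illustrative):

```python
# Given a params.yaml like:
#   train:
#     lr: 0.01
#     optimizer:
#       name: adam
# each experiment dict exposes the nesting as dot-separated keys:
exp = {
    "params.train.lr": 0.01,
    "params.train.optimizer.name": "adam",
}
print(exp["params.train.optimizer.name"])  # adam
```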
Best Practices
Use for model selection
Systematically find the best model:
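A reusable helper keeps the selection rule in one place; it considers only successful runs that actually logged the metric (sample rows are illustrative):

```python
def pick_best(exps, metric="metrics.accuracy"):
    """Return the successful experiment with the highest metric, or None."""
    done = [
        e for e in exps
        if e.get("State") == "Success" and e.get(metric) is not None
    ]
    return max(done, key=lambda e: e[metric], default=None)

# Sample rows shaped like exp_show() output
sample = [
    {"Experiment": "exp-1", "State": "Success", "metrics.accuracy": 0.88},
    {"Experiment": "exp-2", "State": "Failed", "metrics.accuracy": None},
]
print(pick_best(sample)["Experiment"])  # exp-1
```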
Track experiment progress
Monitor running experiments:
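Counting the State field gives a quick progress summary (sample rows are illustrative; in practice pass force=True for fresh data):

```python
from collections import Counter

# Sample rows shaped like exp_show() output
sample = [
    {"State": "Success"},
    {"State": "Running"},
    {"State": "Queued"},
    {"State": "Success"},
]

counts = Counter(e.get("State") for e in sample)
print(dict(counts))  # {'Success': 2, 'Running': 1, 'Queued': 1}
```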
Compare against baseline
Always maintain and compare against a baseline:
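One way to do this, assuming a row named "baseline" exists in the output (the name and values are illustrative):

```python
# Sample rows shaped like exp_show() output
sample = [
    {"Experiment": "baseline", "metrics.accuracy": 0.85},
    {"Experiment": "exp-1", "metrics.accuracy": 0.91},
]

baseline = next(e for e in sample if e["Experiment"] == "baseline")
for e in sample:
    if e is not baseline:
        delta = e["metrics.accuracy"] - baseline["metrics.accuracy"]
        print(f"{e['Experiment']}: {delta:+.2f} vs baseline")
```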
Handle missing values
Experiments may have different metrics/parameters:
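Use dict.get() rather than direct indexing, since a key may be absent when an experiment never logged that metric or parameter:

```python
# Sample rows shaped like exp_show() output; exp-2 never logged accuracy
sample = [
    {"Experiment": "exp-1", "metrics.accuracy": 0.9},
    {"Experiment": "exp-2"},
]

for e in sample:
    acc = e.get("metrics.accuracy")  # None instead of KeyError
    print(e["Experiment"], "n/a" if acc is None else acc)
```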
Integration Examples
Streamlit Dashboard
MLflow Integration
Related Functions
params_show()
Show parameters only
metrics_show()
Show metrics only
exp_save()
Create new experiments