This example demonstrates how to use dbt’s MetricFlow semantic layer with DuckDB (local) and MotherDuck (cloud). Define metrics once and query them from anywhere.

Overview

MetricFlow is dbt’s semantic layer that allows you to:
  • Define metrics once in YAML
  • Query metrics with a simple CLI
  • Ensure consistency across your organization
  • Support both local DuckDB and MotherDuck
DuckDB and MotherDuck are fully supported by MetricFlow. The same metric definitions work seamlessly in both environments.

Installation

1. Create virtual environment

python3 -m venv venv
source venv/bin/activate

2. Install packages

pip install dbt-duckdb dbt-metricflow

Configuration

profiles.yml

Configure both local and cloud targets:
profiles.yml
ecommerce_metrics:
  target: local
  outputs:
    local:
      type: duckdb
      path: ecommerce_local.duckdb
      threads: 4

    motherduck:
      type: duckdb
      path: md:ecommerce_test_db
      threads: 4
      # MotherDuck token is picked up from the MOTHERDUCK_TOKEN env var
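Since the motherduck target authenticates via the MOTHERDUCK_TOKEN environment variable, a quick pre-flight check can save a confusing connection error. A minimal sketch (the helper name is hypothetical, not part of dbt or MetricFlow):

```python
import os

def motherduck_ready(env=os.environ):
    """Return True if a non-empty MotherDuck token is available for md: connections."""
    token = env.get("MOTHERDUCK_TOKEN", "").strip()
    return bool(token)

if __name__ == "__main__":
    if not motherduck_ready():
        print("MOTHERDUCK_TOKEN is not set; the motherduck target will fail to connect.")
```
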

Project Structure

metricflow-example/
├── profiles.yml                    # dbt configs (local & cloud)
└── ecommerce_metrics/
    ├── seeds/raw_orders.csv        # 20 sample orders
    └── models/
        ├── fct_orders.sql          # Fact table
        ├── metricflow_time_spine.sql
        └── semantic_models.yml     # Metrics definitions

Models

Fact Table

Create a fact table from your raw data:
models/fct_orders.sql
{{
  config(
    materialized='table'
  )
}}

SELECT
    order_id,
    customer_id,
    CAST(order_date AS DATE) as order_date,
    status,
    amount
FROM {{ ref('raw_orders') }}

Time Spine

Define a time dimension for time-based queries:
models/metricflow_time_spine.sql
{{
  config(
    materialized='table'
  )
}}

WITH date_spine AS (
    SELECT
        CAST(date_day AS DATE) AS date_day
    FROM (
        SELECT
            UNNEST(generate_series(
                DATE '2024-01-01',
                DATE '2025-12-31',
                INTERVAL '1 day'
            )) AS date_day
    ) dates
)

SELECT
    date_day
FROM date_spine
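The time spine is just one row per calendar day over the configured range. The same series can be sketched in plain Python to see what the model materializes (2024 is a leap year, so the range above yields 366 + 365 = 731 rows):

```python
from datetime import date, timedelta

def date_spine(start: date, end: date):
    """Yield every calendar day from start to end inclusive,
    mirroring generate_series(start, end, INTERVAL '1 day')."""
    d = start
    while d <= end:
        yield d
        d += timedelta(days=1)

days = list(date_spine(date(2024, 1, 1), date(2025, 12, 31)))
```
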

Semantic Models

Define your semantic model and metrics:
models/semantic_models.yml
time_spines:
  - name: time_spine
    model: ref('metricflow_time_spine')
    time_column: date_day
    grains:
      - name: day
        column: date_day

semantic_models:
  - name: orders
    description: E-commerce order transactions
    model: ref('fct_orders')
    defaults:
      agg_time_dimension: order_date

    entities:
      - name: order_id
        type: primary
        expr: order_id
      - name: customer
        type: foreign
        expr: customer_id

    dimensions:
      - name: order_date
        type: time
        type_params:
          time_granularity: day
      - name: status
        type: categorical

    measures:
      - name: order_count
        agg: count
        expr: order_id
      - name: total_revenue
        agg: sum
        expr: amount
      - name: average_order_value
        agg: average
        expr: amount
      - name: unique_customers
        agg: count_distinct
        expr: customer_id

metrics:
  - name: revenue
    description: Total revenue from all orders
    type: simple
    label: Total Revenue
    type_params:
      measure: total_revenue

  - name: orders
    description: Total number of orders
    type: simple
    label: Order Count
    type_params:
      measure: order_count

  - name: avg_order_value
    description: Average order value
    type: simple
    label: Average Order Value
    type_params:
      measure: average_order_value

  - name: revenue_per_customer
    description: Average revenue per customer
    type: derived
    label: Revenue Per Customer
    type_params:
      expr: revenue / customers
      metrics:
        - revenue
        - customers

  - name: customers
    description: Count of unique customers
    type: simple
    label: Customer Count
    type_params:
      measure: unique_customers
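The distinction between simple and derived metrics can be illustrated in plain Python over a few hypothetical rows in the shape of fct_orders. Simple metrics apply one aggregation to one measure; a derived metric like revenue_per_customer is an expression over other metrics:

```python
from datetime import date

# Hypothetical sample rows in the shape of fct_orders
orders = [
    {"order_id": 1, "customer_id": "c1", "order_date": date(2024, 1, 5), "status": "completed", "amount": 120.00},
    {"order_id": 2, "customer_id": "c1", "order_date": date(2024, 1, 9), "status": "completed", "amount": 80.00},
    {"order_id": 3, "customer_id": "c2", "order_date": date(2024, 2, 2), "status": "returned", "amount": 50.00},
]

# Simple metrics: one aggregation over one measure
revenue = sum(o["amount"] for o in orders)            # agg: sum, expr: amount
order_count = len(orders)                             # agg: count, expr: order_id
customers = len({o["customer_id"] for o in orders})   # agg: count_distinct, expr: customer_id

# Derived metric: an expression over other metrics
revenue_per_customer = revenue / customers            # expr: revenue / customers
```
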

Usage

Local DuckDB

1. Load data

DBT_PROFILES_DIR=.. dbt seed
DBT_PROFILES_DIR=.. dbt run

2. Query metrics

DBT_PROFILES_DIR=.. mf query --metrics revenue,orders,customers --group-by metric_time__month
Expected output:
metric_time__month      revenue    orders    customers
--------------------  ---------  --------  -----------
2024-01-01T00:00:00     1962.29        10            6
2024-02-01T00:00:00     2621.73        10           10
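Grouping by metric_time__month truncates each order date to the first of its month and aggregates within each bucket. A minimal Python sketch of that logic, using a few hypothetical orders rather than the seed data:

```python
from collections import defaultdict
from datetime import date

# Hypothetical orders spanning two months: (order_date, customer_id, amount)
orders = [
    (date(2024, 1, 3), "c1", 100.0),
    (date(2024, 1, 20), "c2", 60.0),
    (date(2024, 2, 14), "c2", 40.0),
]

by_month = defaultdict(list)
for order_date, customer_id, amount in orders:
    # metric_time__month truncates the time dimension to the first of the month
    by_month[order_date.replace(day=1)].append((customer_id, amount))

rows = {
    month: {
        "revenue": sum(a for _, a in group),
        "orders": len(group),
        "customers": len({c for c, _ in group}),
    }
    for month, group in by_month.items()
}
```
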

MotherDuck

1. Set token

export MOTHERDUCK_TOKEN='your_token_here'

2. Create database (first time)

python -c "import duckdb; conn = duckdb.connect('md:'); conn.execute('CREATE DATABASE ecommerce_test_db')"

3. Load data to cloud

DBT_PROFILES_DIR=.. dbt seed --target motherduck
DBT_PROFILES_DIR=.. dbt run --target motherduck

4. Query metrics from cloud

DBT_PROFILES_DIR=.. DBT_TARGET=motherduck mf query --metrics revenue,orders,customers --group-by metric_time__month

Query Examples

Query a single metric:
mf query --metrics revenue

Available Metrics

revenue: Total revenue from all orders
  • Type: Simple
  • Aggregation: Sum of order amounts
  • Supports: Time dimensions, filtering

orders: Total number of orders
  • Type: Simple
  • Aggregation: Count of orders
  • Supports: Time dimensions, filtering

customers: Count of unique customers
  • Type: Simple
  • Aggregation: Count distinct customer IDs
  • Supports: Time dimensions, filtering

avg_order_value: Average order value
  • Type: Simple
  • Aggregation: Average of order amounts
  • Supports: Time dimensions, filtering

revenue_per_customer: Average revenue per customer
  • Type: Derived
  • Calculation: revenue / customers
  • Supports: Time dimensions, filtering

How It Works

Your Query → MetricFlow (generates SQL) → dbt-duckdb Adapter → DuckDB/MotherDuck

MetricFlow has a native DuckDB SQL renderer. The dbt-duckdb adapter handles connections to both local DuckDB and MotherDuck through the same code path; only the connection string differs.
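As a rough sketch (assuming the fact model materializes as a table named fct_orders), a query like `mf query --metrics revenue --group-by metric_time__month` compiles to SQL of this general shape; run the command with --explain to see the exact SQL MetricFlow generates:

```sql
SELECT
    DATE_TRUNC('month', order_date) AS metric_time__month,
    SUM(amount) AS revenue
FROM fct_orders
GROUP BY 1
ORDER BY 1
```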

Benefits

  • Define metrics once in YAML
  • Same logic everywhere it’s used
  • No SQL copy-paste errors
  • Single source of truth
  • Query from CLI, BI tools, or applications
  • Support multiple time grains (day/month/year)
  • Filter and group dynamically
  • Work locally or in cloud
  • Update metrics in one place
  • Version control metric definitions
  • Document metrics alongside code
  • Easy to audit and review
Use the --explain flag to see the SQL that MetricFlow generates. This helps you understand what’s happening under the hood and optimize your semantic models.
