Zuko provides a comprehensive collection of normalizing flow architectures for density estimation and generative modeling. All flows are implemented as PyTorch modules that can be easily integrated into your models.

Available Flows

Zuko includes the following flow architectures:

  • Neural Spline Flow (NSF): Monotonic rational-quadratic spline transformations
  • Masked Autoregressive Flow (MAF): Autoregressive transformations with masked networks
  • RealNVP: Coupling transformations with affine layers
  • NICE: Non-linear Independent Components Estimation
  • Neural Autoregressive Flow (NAF): Monotonic neural networks for autoregressive flows
  • Unconstrained NAF (UNAF): Unconstrained monotonic neural networks
  • Continuous Normalizing Flow (CNF): ODE-based continuous-time flows
  • Neural Circular Spline Flow (NCSF): Spline flows for circular/periodic data
  • Sum-of-Squares Polynomial Flow (SOSPF): Polynomial transformations with sum-of-squares constraints
  • Bernstein Polynomial Flow (BPF): Bounded Bernstein polynomial transformations
  • Gaussianization Flow (GF): Element-wise Gaussianization transformations
  • Gaussian Mixture Model (GMM): Mixture of Gaussians for density estimation

Flow Comparison

| Flow    | Type           | Key Features                              | Use Cases                              |
|---------|----------------|-------------------------------------------|----------------------------------------|
| NSF     | Autoregressive | High expressivity, smooth transformations | General-purpose, complex distributions |
| MAF     | Autoregressive | Fast training, flexible                   | General-purpose density estimation     |
| RealNVP | Coupling       | Fast sampling, parallel                   | Real-time generation                   |
| NICE    | Coupling       | Simple, interpretable                     | Baseline, feature learning             |
| NAF     | Autoregressive | Universal approximation                   | High-dimensional data                  |
| UNAF    | Autoregressive | Flexible monotonicity                     | Complex dependencies                   |
| CNF     | Continuous     | Theoretically unlimited capacity          | Research, complex dynamics             |
| NCSF    | Autoregressive | Handles periodicity                       | Circular/angular data                  |
| SOSPF   | Autoregressive | Polynomial expressivity                   | Smooth distributions                   |
| BPF     | Autoregressive | Bounded transformations                   | Constrained domains                    |
| GF      | Element-wise   | Rotation-invariant                        | Tabular data                           |
| GMM     | Mixture        | Interpretable clusters                    | Clustering, simple densities           |

General Usage Pattern

All flows in Zuko follow a consistent API:
import torch
import zuko

# Create a flow
flow = zuko.flows.NSF(
    features=3,      # Number of features
    context=4,       # Number of context features (optional)
    transforms=5,    # Number of transformations
    hidden_features=[128, 128],  # Hidden layer sizes
)

# Sample from the flow
context = torch.randn(4)  # Optional context
dist = flow(context)
samples = dist.sample((1000,))

# Compute log probabilities
log_prob = dist.log_prob(samples)

# Training loop
optimizer = torch.optim.Adam(flow.parameters(), lr=1e-3)

for x, c in dataloader:  # each batch yields data and its matching context
    optimizer.zero_grad()
    loss = -flow(c).log_prob(x).mean()  # negative log-likelihood
    loss.backward()
    optimizer.step()

Common Parameters

Most flows share these parameters:
  • features (int, required): The number of features in the data.
  • context (int, default: 0): The number of context features for conditional flows.
  • transforms (int, default: 3): The number of transformation layers to stack.
  • hidden_features (List[int], default: [64, 64]): The sizes of the hidden layers in the underlying neural networks.

Methods

A flow exposes forward(); the distribution it returns exposes sample() and log_prob():

forward(c=None)

Returns a PyTorch distribution object. Arguments:
  • c (Tensor, optional): Context tensor of shape (*, context)
Returns:
  • Distribution: A PyTorch distribution with sample() and log_prob() methods

sample()

Sample from the flow via the returned distribution:
dist = flow(context)
samples = dist.sample((n_samples,))

log_prob()

Compute log probability of samples:
dist = flow(context)
log_p = dist.log_prob(samples)

Choosing a Flow

For General-Purpose Density Estimation

  • NSF: Best overall performance, smooth transformations
  • MAF: Fast training, good baseline

For Fast Sampling

  • RealNVP: Parallel inverse computation
  • NICE: Simple and fast

For High-Dimensional Data

  • NAF/UNAF: Universal approximation with neural networks
  • CNF: Continuous-time dynamics

For Specialized Data

  • NCSF: Circular/periodic features (angles, phases)
  • BPF: Bounded domains

For Interpretability

  • GMM: Clear cluster structure
  • GF: Element-wise transformations

Next Steps

Transforms

Learn about the underlying transformations

Examples

See flows in action with complete examples
