# Available Flows
Zuko includes the following flow architectures:

- **Neural Spline Flow (NSF)**: Monotonic rational-quadratic spline transformations
- **Masked Autoregressive Flow (MAF)**: Autoregressive transformations with masked networks
- **RealNVP**: Coupling transformations with affine layers
- **NICE**: Non-linear independent components estimation
- **Neural Autoregressive Flow (NAF)**: Monotonic neural networks for autoregressive flows
- **Unconstrained NAF (UNAF)**: Unconstrained monotonic neural networks
- **Continuous Normalizing Flow (CNF)**: ODE-based continuous-time flows
- **Neural Circular Spline Flow (NCSF)**: Spline flows for circular/periodic data
- **Sum-of-Squares Polynomial Flow (SOSPF)**: Polynomial transformations with sum-of-squares constraints
- **Bernstein Polynomial Flow (BPF)**: Bounded Bernstein polynomial transformations
- **Gaussianization Flow (GF)**: Element-wise Gaussianization transformations
- **Gaussian Mixture Model (GMM)**: Mixture of Gaussians for density estimation
## Flow Comparison
| Flow | Type | Key Features | Use Cases |
|---|---|---|---|
| NSF | Autoregressive | High expressivity, smooth transformations | General-purpose, complex distributions |
| MAF | Autoregressive | Fast training, flexible | General-purpose density estimation |
| RealNVP | Coupling | Fast sampling, parallel | Real-time generation |
| NICE | Coupling | Simple, interpretable | Baseline, feature learning |
| NAF | Autoregressive | Universal approximation | High-dimensional data |
| UNAF | Autoregressive | Flexible monotonicity | Complex dependencies |
| CNF | Continuous | Theoretically unlimited capacity | Research, complex dynamics |
| NCSF | Autoregressive | Handles periodicity | Circular/angular data |
| SOSPF | Autoregressive | Polynomial expressivity | Smooth distributions |
| BPF | Autoregressive | Bounded transformations | Constrained domains |
| GF | Element-wise | Rotation-invariant | Tabular data |
| GMM | Mixture | Interpretable clusters | Clustering, simple densities |
## General Usage Pattern
All flows in Zuko follow a consistent API.

### Common Parameters
Most flows share these parameters:

- `features`: The number of features in the data.
- `context`: The number of context features for conditional flows.
- `transforms`: The number of transformation layers to stack.
- `hidden_features`: The sizes of hidden layers in the neural networks.
### Methods

All flows provide these key methods:

#### forward(c=None)

Returns a PyTorch distribution object.

Arguments:

- `c` (Tensor, optional): Context tensor of shape `(*, context)`

Returns:

- `Distribution`: A PyTorch distribution with `sample()` and `log_prob()` methods
#### sample()

Sample from the flow via the returned distribution:
#### log_prob()

Compute log probability of samples:
## Choosing a Flow

### For General-Purpose Density Estimation
- NSF: Best overall performance, smooth transformations
- MAF: Fast training, good baseline
### For Fast Sampling
- RealNVP: Parallel inverse computation
- NICE: Simple and fast
### For High-Dimensional Data
- NAF/UNAF: Universal approximation with neural networks
- CNF: Continuous-time dynamics
### For Specialized Data
- NCSF: Circular/periodic features (angles, phases)
- BPF: Bounded domains
### For Interpretability
- GMM: Clear cluster structure
- GF: Element-wise transformations
## Next Steps

- **Transforms**: Learn about the underlying transformations
- **Examples**: See flows in action with complete examples
