
What is PARC?

PARC (Physics-based Augmentation with Reinforcement Learning for Character Controllers) is a self-consuming, self-correcting generative framework that combines kinematic motion generation with physics-based tracking controllers. Presented at SIGGRAPH 2025, PARC generates spatial, temporal, and functional variations of motions while iteratively improving quality through physics validation. The framework trains both a motion diffusion model for kinematic motion generation and a PPO-based tracking controller for physics validation, creating an iterative self-improvement loop that produces realistic, physically plausible character animations.
PARC was developed by Michael Xu, Yi Shi, KangKang Yin, and Xue Bin Peng. Project page: https://michaelx.io/parc

Key Features

Motion Diffusion Model

Transformer-based architecture with local heightmap and target direction conditioning for procedural motion generation

Physics-Based Tracking

PPO reinforcement learning controller built on Isaac Gym for physics validation and motion tracking

Iterative Self-Improvement

4-stage PARC loop: train generator → generate motions → train tracker → record physics-validated motions

Terrain-Aware Generation

Procedural terrain generation with A* path planning and autoregressive motion synthesis along paths
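The path-planning step above can be illustrated with a small grid A* search over a heightfield, where cells whose height difference exceeds a step limit are treated as untraversable. This is a minimal sketch for intuition; the function name, cost rule, and step threshold are illustrative assumptions, not PARC's actual implementation:

```python
import heapq

def astar_heightmap(heights, start, goal, max_step=0.5):
    """Grid A* over a 2D heightfield (list of lists). Neighboring cells
    whose height difference exceeds max_step are untraversable
    (an illustrative traversal rule, not PARC's actual cost model)."""
    rows, cols = len(heights), len(heights[0])

    def h(n):  # Manhattan-distance heuristic (admissible for 4-connected moves)
        return abs(n[0] - goal[0]) + abs(n[1] - goal[1])

    open_set = [(h(start), 0.0, start, None)]  # (f, g, node, parent)
    came_from, g_cost = {}, {start: 0.0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:  # already expanded with a better cost
            continue
        came_from[node] = parent
        if node == goal:  # reconstruct the path back to the start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols):
                continue
            if abs(heights[nr][nc] - heights[r][c]) > max_step:
                continue  # step too tall to traverse
            ng = g + 1.0
            if ng < g_cost.get((nr, nc), float("inf")):
                g_cost[(nr, nc)] = ng
                heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), node))
    return None  # goal unreachable
```

The resulting cell path would then serve as the guide along which motion is synthesized autoregressively.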

How PARC Works

The PARC training loop consists of four main stages:
Stage 1: Train Motion Generator

Train a motion diffusion model on the dataset using heightmap and target direction conditions
python scripts/parc_1_train_gen.py --config path/to/config
Stage 2: Generate New Motions

Generate terrain, plan paths, and synthesize motions using the trained MDM with kinematic optimization
python scripts/parc_2_kin_gen.py --config path/to/config
Stage 3: Train Tracking Controller

Train a physics-based tracking controller to follow the generated reference motions
python scripts/parc_3_tracker.py --config path/to/config
Stage 4: Record Physics-Validated Motions

Record physically-simulated motions that successfully track the references, filtering out failures
python scripts/parc_4_phys_record.py --config path/to/config
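The filtering in this stage can be sketched as a simple acceptance test on per-frame tracking error. The metric and both thresholds here are illustrative assumptions, not the paper's exact criteria:

```python
def passes_tracking_filter(frame_errors, mean_thresh=0.10, max_thresh=0.25):
    """Accept a simulated rollout only if its per-frame tracking error
    (e.g. mean joint-position error in meters) stays low on average and
    never spikes past a hard cutoff. Thresholds are illustrative."""
    if not frame_errors:
        return False
    mean_err = sum(frame_errors) / len(frame_errors)
    return mean_err <= mean_thresh and max(frame_errors) <= max_thresh

# Hypothetical usage: keep rollouts that track well, discard failures.
good = [0.02, 0.05, 0.04]
bad = [0.02, 0.40, 0.03]  # one frame deviates badly (e.g. the character fell)
kept = [e for e in (good, bad) if passes_tracking_filter(e)]
```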
The physics-validated motions from stage 4 are then added back to the dataset, and the loop repeats, continuously improving motion quality and diversity.
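Putting the stages together, the self-improvement loop can be sketched as follows. The four stage functions are stubs standing in for the scripts above; everything inside them is an assumption for illustration, not PARC's actual code:

```python
def run_parc_loop(dataset, iterations, stages):
    """Each iteration: train the generator, generate kinematic motions,
    train the tracker, then append only physics-validated motions back
    into the dataset. `stages` supplies the four stage functions."""
    train_gen, kin_gen, train_tracker, phys_record = stages
    for _ in range(iterations):
        generator = train_gen(dataset)                # stage 1
        candidates = kin_gen(generator)               # stage 2
        tracker = train_tracker(candidates)           # stage 3
        validated = phys_record(tracker, candidates)  # stage 4
        dataset = dataset + validated                 # grow the dataset
    return dataset

# Stub stages for illustration: "motions" are strings, and physics
# validation arbitrarily keeps every other candidate.
stages = (
    lambda data: len(data),                        # "generator" = dataset size
    lambda gen: [f"motion_{gen}_{i}" for i in range(4)],
    lambda cands: None,                            # tracker stub
    lambda tracker, cands: cands[::2],             # keep every other motion
)
final = run_parc_loop(["seed_motion"], iterations=2, stages=stages)
```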

Installation

Set up your development environment with conda, PyTorch, and Isaac Gym

Quick Start

Get started with the Motionscope viewer to visualize motions and terrains

GitHub Repository

View source code and contribute to the project

Datasets & Models

Download pre-trained models and motion datasets from HuggingFace

Use Cases

While the SIGGRAPH 2025 demonstration focuses on terrain-traversal motions (running, climbing, vaulting), PARC’s architecture is designed to be extensible to other character animation tasks:
  • Object interaction sequences
  • Multi-character interactions
  • Complex locomotion behaviors
  • Athletic movements and stunts
The procedural generation modules are optimized for terrain traversal. Applying PARC to different tasks will require custom heuristics and domain knowledge for motion synthesis.

Technical Highlights

  • Motion Diffusion Model: Transformer-based (4 layers, 256 hidden dim, 8 attention heads) with DDIM sampling
  • Reinforcement Learning: PPO algorithm for physics-based tracking in Isaac Gym simulator
  • Motion Representation: Root position/rotation + joint rotations with contact labels
  • Conditioning: Local heightmaps (31×31 grid, 0.2m resolution) + target direction vectors
  • Augmentation: Procedural terrain generation, heightfield augmentation, noise injection
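The local-heightmap conditioning above can be illustrated with a small sampling routine: extract a 31×31 patch at 0.2 m spacing, centered on the character, from a global heightfield via nearest-cell lookup. The function signature and nearest-neighbor rule are illustrative assumptions (PARC's actual sampling may differ, e.g. in interpolation or orientation handling):

```python
def local_heightmap(heights, cell_size, center_xy, grid=31, res=0.2):
    """Sample a grid x grid patch of terrain heights at `res` spacing,
    centered on the character's xy position. `heights` is a 2D list on a
    regular grid with `cell_size` spacing; out-of-bounds samples clamp
    to the terrain edge. Nearest-cell lookup is an illustrative choice."""
    rows, cols = len(heights), len(heights[0])
    half = (grid - 1) / 2.0
    patch = []
    for i in range(grid):
        row = []
        for j in range(grid):
            # world-space sample point for this patch entry
            x = center_xy[0] + (i - half) * res
            y = center_xy[1] + (j - half) * res
            # nearest heightfield cell, clamped to the terrain bounds
            r = min(max(round(x / cell_size), 0), rows - 1)
            c = min(max(round(y / cell_size), 0), cols - 1)
            row.append(heights[r][c])
        patch.append(row)
    return patch
```

The flattened patch, together with the target direction vector, would then form the conditioning input to the diffusion model.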

What’s Next?

Ready to get started? Check out the Installation Guide to set up your environment, or jump straight to the Quick Start to run the Motionscope viewer and explore pre-generated motions.
