This project spans Weeks 2-3 and covers the complete browser rendering pipeline from HTML parsing to GPU compositing.

Overview

Build a fully functional browser engine capable of rendering complex web pages like Wikipedia. This project implements the core rendering pipeline: HTML parsing, CSS engine, layout calculation, and paint/composite phases.

Project requirements

Language

C++ or Rust

Target site

Render Wikipedia pages

Assets

Handle fonts and images

Animations

Support CSS animations

Core components

1. HTML parsing

Implement a complete HTML parser with:
  • Tokenization state machine (13 states, 67 transitions)
  • Tree construction algorithm
  • Error recovery and quirks mode
  • Parser-blocking scripts and async/defer handling
The tokenizer must handle all HTML5 parsing states and properly construct the DOM tree.
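The state-machine structure above can be sketched in miniature. This is not the full HTML5 tokenizer, just three illustrative states (names hypothetical) showing the shape of the transitions you will implement; attributes, comments, character references, and error recovery are omitted.

```rust
// Minimal tokenizer sketch: three states instead of the full HTML5 set.
#[derive(Debug, PartialEq)]
enum Token {
    Text(String),
    StartTag(String),
    EndTag(String),
}

#[derive(Clone, Copy)]
enum State {
    Data,
    TagOpen,
    TagName { closing: bool },
}

fn tokenize(input: &str) -> Vec<Token> {
    let mut state = State::Data;
    let mut tokens = Vec::new();
    let mut buf = String::new();
    for c in input.chars() {
        match state {
            State::Data => {
                if c == '<' {
                    // Flush accumulated character data before entering a tag.
                    if !buf.is_empty() {
                        tokens.push(Token::Text(std::mem::take(&mut buf)));
                    }
                    state = State::TagOpen;
                } else {
                    buf.push(c);
                }
            }
            State::TagOpen => {
                if c == '/' {
                    state = State::TagName { closing: true };
                } else {
                    buf.push(c);
                    state = State::TagName { closing: false };
                }
            }
            State::TagName { closing } => {
                if c == '>' {
                    let name = std::mem::take(&mut buf);
                    tokens.push(if closing { Token::EndTag(name) } else { Token::StartTag(name) });
                    state = State::Data;
                } else {
                    buf.push(c);
                }
            }
        }
    }
    if !buf.is_empty() {
        tokens.push(Token::Text(buf));
    }
    tokens
}

fn main() {
    println!("{:?}", tokenize("<p>hi</p>"));
}
```

The token stream then feeds the tree-construction stage, which builds the DOM.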
2. CSS engine

Build a CSS processor that handles:
  • Tokenizer and parser (grammar implementation)
  • Selector matching with specificity calculation
  • Cascade resolution algorithm
  • Style computation and inheritance
  • Media query evaluation
The engine must correctly apply CSS rules according to specificity and cascade order.
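Specificity is conventionally modelled as an (id, class, type) triple compared lexicographically. A minimal sketch for simple compound selectors like `div.note#intro` (pseudo-classes, attribute selectors, and combinators omitted):

```rust
// Specificity of a compound selector as (ids, classes, types).
fn specificity(selector: &str) -> (u32, u32, u32) {
    let (mut ids, mut classes, mut types) = (0, 0, 0);
    let mut kind = 't'; // kind of the current simple selector: 'i', 'c', or 't'
    let mut counted = false;
    for c in selector.chars() {
        match c {
            '#' => { kind = 'i'; counted = false; }
            '.' => { kind = 'c'; counted = false; }
            _ => {
                // Count each simple selector once, at its first name character.
                if !counted {
                    match kind {
                        'i' => ids += 1,
                        'c' => classes += 1,
                        _ => types += 1,
                    }
                    counted = true;
                }
            }
        }
    }
    (ids, classes, types)
}

fn main() {
    // Rust tuples compare lexicographically, matching cascade ordering:
    // one id outweighs any number of classes.
    println!("{:?}", specificity("div.note#intro"));
    println!("{}", specificity("#x") > specificity(".a.b.c"));
}
```

Because tuples compare lexicographically, `>` directly implements the "higher specificity wins" rule within a cascade origin.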
3. Layout engine

Implement layout algorithms for:
  • Box model calculation
  • Normal flow, floats, positioning schemes
  • Flexbox algorithm: main/cross axis, flex basis, grow, shrink
  • Grid algorithm: track sizing, placement, spanning
  • Line breaking and text shaping
The layout engine must produce pixel-perfect positioning for all elements.
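As one concrete piece of the box model, block-level width resolution in normal flow can be sketched as follows. This assumes symmetric per-side edges for brevity; a real implementation resolves each side, `auto` margins, and min/max constraints separately.

```rust
// Per-side edge sizes, assumed symmetric here for brevity.
struct Edges {
    margin: f32,
    border: f32,
    padding: f32,
}

// Used content width of a block box in normal flow:
// an explicit width is honoured; width:auto fills the containing block
// after margins, borders, and padding are subtracted.
fn used_content_width(containing: f32, specified: Option<f32>, e: &Edges) -> f32 {
    let extras = 2.0 * (e.margin + e.border + e.padding);
    match specified {
        Some(w) => w,
        None => (containing - extras).max(0.0),
    }
}

fn main() {
    let e = Edges { margin: 10.0, border: 1.0, padding: 4.0 };
    println!("{}", used_content_width(800.0, None, &e));
}
```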
4. Paint and composite

Create the rendering backend with:
  • Layer tree construction
  • Paint recording and display lists
  • Rasterization strategies
  • GPU compositing and texture management
This phase converts layout information into pixels on screen.
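Paint recording decouples "what to draw" from "when to rasterize": layout output is recorded into a display list of commands that the rasterizer replays later, possibly per layer. A minimal sketch (types and command set hypothetical):

```rust
// A display item is one recorded drawing command.
#[derive(Debug)]
enum DisplayItem {
    Rect { x: f32, y: f32, w: f32, h: f32, color: u32 },
    Text { x: f32, y: f32, content: String },
}

struct DisplayList(Vec<DisplayItem>);

impl DisplayList {
    fn new() -> Self {
        DisplayList(Vec::new())
    }

    // Paint phase: record commands without drawing anything.
    fn push(&mut self, item: DisplayItem) {
        self.0.push(item);
    }

    // Raster phase: replay the recorded commands. A real backend would
    // draw into a CPU buffer or GPU texture here; we just count commands.
    fn replay(&self) -> usize {
        for item in &self.0 {
            let _ = item;
        }
        self.0.len()
    }
}

fn main() {
    let mut dl = DisplayList::new();
    dl.push(DisplayItem::Rect { x: 0.0, y: 0.0, w: 10.0, h: 10.0, color: 0x00ff_0000 });
    dl.push(DisplayItem::Text { x: 2.0, y: 2.0, content: "hi".to_string() });
    println!("{} items", dl.replay());
}
```

Keeping one display list per layer is what makes GPU compositing cheap: unchanged layers keep their texture and only changed layers are re-rasterized.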

Performance measurement

Measure every phase of the rendering pipeline to understand performance characteristics.
Your engine must instrument and report timing for:
  • Parse: HTML tokenization and tree construction
  • Style: CSS parsing and rule matching
  • Layout: Box model and positioning calculations
  • Paint: Display list creation
  • Composite: GPU texture upload and compositing
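The instrumentation above can be as simple as wrapping each phase in a timer and collecting (phase, duration) pairs for the report. A sketch using `std::time::Instant` (the `PhaseTimer` type is hypothetical):

```rust
use std::time::Instant;

// Collects one (phase name, milliseconds) entry per measured phase.
struct PhaseTimer {
    timings: Vec<(String, f64)>,
}

impl PhaseTimer {
    fn new() -> Self {
        PhaseTimer { timings: Vec::new() }
    }

    // Run a phase, record its wall-clock duration, and pass its result through.
    fn measure<T>(&mut self, phase: &str, f: impl FnOnce() -> T) -> T {
        let start = Instant::now();
        let result = f();
        let ms = start.elapsed().as_secs_f64() * 1000.0;
        self.timings.push((phase.to_string(), ms));
        result
    }

    fn report(&self) {
        for (phase, ms) in &self.timings {
            println!("{:<10} {:8.3} ms", phase, ms);
        }
    }
}

fn main() {
    let mut timer = PhaseTimer::new();
    let _dom = timer.measure("parse", || "<html>...</html>".len());
    let _style = timer.measure("style", || 0);
    timer.report();
}
```

Because `measure` passes the phase's return value through, it can wrap each pipeline stage without restructuring the code.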

Deliverables

  • A working implementation that can load and render Wikipedia pages with all content visible and properly laid out
  • Support for web fonts and text shaping with proper glyph positioning
  • Image loading and display (PNG, JPEG, GIF) with proper sizing and positioning
  • Smooth CSS animations and transitions at 60fps
  • Detailed timing data for each rendering phase with profiling output

Technical challenges

State machine complexity

The HTML tokenizer requires 13 states and 67 transitions for spec compliance.

CSS specificity

Implement the complete cascade algorithm with specificity calculation.

Flexbox algorithm

Handle complex flex layouts with nested containers and varying content.
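One step of the algorithm illustrates its character: distributing positive free space along the main axis in proportion to each item's `flex-grow` factor. This sketch ignores shrinking, min/max clamping, and the multi-pass freezing of inflexible items that the full algorithm requires.

```rust
// Distribute positive free space across flex items by their grow factors.
// `basis[i]` is item i's flex base size; `grow[i]` its flex-grow factor.
fn resolve_flex_grow(container: f32, basis: &[f32], grow: &[f32]) -> Vec<f32> {
    let used: f32 = basis.iter().sum();
    let free = (container - used).max(0.0);
    let total_grow: f32 = grow.iter().sum();
    basis
        .iter()
        .zip(grow)
        .map(|(b, g)| {
            if total_grow > 0.0 {
                b + free * g / total_grow
            } else {
                *b // no flexible items: keep base sizes
            }
        })
        .collect()
}

fn main() {
    // Two items in a 600px container: 400px of free space split 1:3.
    println!("{:?}", resolve_flex_grow(600.0, &[100.0, 100.0], &[1.0, 3.0]));
}
```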

GPU compositing

Optimize layer creation and texture management for smooth rendering.

Success criteria

  • Wikipedia pages render correctly with all layout modes
  • Text is readable with proper font rendering
  • Images load and display at correct sizes
  • CSS animations run smoothly at 60fps
  • Performance metrics show timing for all pipeline phases
  • Engine handles malformed HTML gracefully
