Learning game rules from event sequences
A causal transformer trained on gameplay traces learns the grammar of videogames — physics, rules, and player behaviors — through next-token prediction on event streams. Built in pure Python with zero dependencies.
Quick Start
Get up and running with Game Grammar in three steps
1. Generate gameplay episodes. This produces episodes.json containing 200 tokenized gameplay sequences.
2. Train the transformer.
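The episode-generation step can be sketched as follows. The file name and episode count come from the quick start above; the event names and generator logic are purely illustrative stand-ins, not the project's actual script.

```python
import json
import random

def generate_episode(length=10):
    # Toy stand-in for a gameplay trace: a list of event tokens.
    # Real traces would come from actual gameplay, not random sampling.
    actions = ["UP", "DOWN", "LEFT", "RIGHT", "EAT", "DIE"]
    return ["<BOS>"] + [random.choice(actions) for _ in range(length)] + ["<EOS>"]

# Write 200 tokenized sequences, matching the quick-start description.
episodes = [generate_episode() for _ in range(200)]
with open("episodes.json", "w") as f:
    json.dump(episodes, f)

print(len(episodes))  # 200
```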
Core Concepts
Understand the theoretical foundation and architecture
- Wittgensteinian Theory
- Event Streams
- Tokenization
- Transformer Architecture
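The tokenization concept can be sketched with a toy codec that maps event strings to integer ids and back. The vocabulary and event names here are illustrative assumptions, not the project's actual encoding.

```python
class SimpleCodec:
    """Toy event codec: bidirectional mapping between event strings and ids."""
    def __init__(self, events):
        self.stoi = {e: i for i, e in enumerate(events)}  # string -> id
        self.itos = {i: e for e, i in self.stoi.items()}  # id -> string

    def encode(self, stream):
        return [self.stoi[e] for e in stream]

    def decode(self, ids):
        return [self.itos[i] for i in ids]

codec = SimpleCodec(["<BOS>", "<EOS>", "UP", "DOWN", "LEFT", "RIGHT", "EAT"])
ids = codec.encode(["<BOS>", "UP", "EAT", "<EOS>"])
print(ids)  # [0, 2, 6, 1]
print(codec.decode(ids))  # ['<BOS>', 'UP', 'EAT', '<EOS>']
```

Because the transformer only ever sees integer ids, swapping in a different game means swapping the codec, not the model.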
Key Features
What makes Game Grammar unique
Zero Dependencies
Built entirely in pure Python with a custom autograd engine. No PyTorch, TensorFlow, or other external frameworks required.
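A custom autograd in pure Python typically centers on a scalar value node that records its operation and backpropagates gradients via the chain rule. This is a minimal illustrative sketch of that idea, not the project's actual engine.

```python
class Value:
    """Minimal scalar autograd node: tracks data, gradient, and a backward rule."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._children = _children
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():  # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():  # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x          # y = x^2 + x
y.backward()
print(y.data, x.grad)  # 12.0 7.0  (dy/dx = 2x + 1)
```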
Game-Agnostic
Event stream abstraction works across any game. Tokenization layer handles game-specific encoding.
Three-Tier Validation
Validates structural correctness, physical plausibility, and rule consistency of generated sequences.
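The three tiers can be sketched as successive checks on a generated token sequence. The tier names come from the feature description above; the concrete rules shown (Snake-style moves, no reversal, nothing after a terminal DIE) are illustrative assumptions.

```python
MOVES = {"UP", "DOWN", "LEFT", "RIGHT"}
OPPOSITE = {"UP": "DOWN", "DOWN": "UP", "LEFT": "RIGHT", "RIGHT": "LEFT"}
VOCAB = MOVES | {"<BOS>", "<EOS>", "EAT", "DIE"}

def structurally_valid(seq):
    # Tier 1: well-formed framing and known tokens only.
    return seq[:1] == ["<BOS>"] and seq[-1:] == ["<EOS>"] and set(seq) <= VOCAB

def physically_plausible(seq):
    # Tier 2: a snake cannot reverse direction on consecutive moves.
    moves = [t for t in seq if t in MOVES]
    return all(b != OPPOSITE[a] for a, b in zip(moves, moves[1:]))

def rule_consistent(seq):
    # Tier 3: DIE is terminal, so only <EOS> may follow it.
    return "DIE" not in seq or seq.index("DIE") >= len(seq) - 2

def validate(seq):
    return structurally_valid(seq) and physically_plausible(seq) and rule_consistent(seq)

print(validate(["<BOS>", "UP", "UP", "EAT", "DIE", "<EOS>"]))  # True
print(validate(["<BOS>", "UP", "DOWN", "<EOS>"]))              # False (reversal)
```

Running the tiers in order lets a failure report say not just that a sequence is bad, but which level of the grammar it violates.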
Learned Archetypes
Player behaviors emerge as statistical regularities without explicit labels or supervision.
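The kind of statistical regularity that such archetypes rest on can be illustrated by comparing action-frequency profiles of traces. In the real system the regularities emerge inside the trained transformer; the traces and the "greedy"/"cautious" framing below are illustrative only.

```python
from collections import Counter

def action_profile(seq):
    """Normalized frequency vector over event tokens: a crude behavioral signature."""
    counts = Counter(t for t in seq if not t.startswith("<"))
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

# Two illustrative traces: one dominated by EAT events, one by cautious movement.
greedy = ["<BOS>", "EAT", "EAT", "UP", "EAT", "<EOS>"]
cautious = ["<BOS>", "UP", "LEFT", "UP", "RIGHT", "<EOS>"]
print(action_profile(greedy)["EAT"])  # 0.75
```

No labels are involved: the profiles differ purely because the underlying behaviors differ, which is what lets archetypes surface without supervision.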
Explore the API
Deep dive into the implementation
- GameGPT Model
- EventCodec
- Snake Game
- Agent Types
- Data Pipeline
- Validation
Ready to explore?
Start training your own transformer, or dive into the theoretical foundation.
