# Welcome to Flower Engine Development
Flower Engine is a split-architecture narrative system with a Python FastAPI backend and a Rust Ratatui terminal UI. This guide will help you set up your development environment and understand the project structure.

## Architecture Overview
Flower Engine uses a decoupled architecture:

- **Python Brain** (`engine/`): FastAPI backend with WebSocket server, SQLite database, and LLM integration
- **Rust Face** (`tui/`): Ratatui-based terminal UI communicating via WebSockets
## Project Structure
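A partial layout, reconstructed from the paths referenced in this guide (nesting of individual files is assumed):

```text
.
├── engine/              # Python Brain: FastAPI backend, SQLite, LLM integration
│   ├── commands.py
│   ├── database.py
│   └── utils.py
├── tui/                 # Rust Face: Ratatui terminal UI
│   └── src/
│       ├── main.rs
│       └── models.rs
├── assets/              # YAML worlds, characters, rules (copied from assets_example/)
├── assets_example/
├── docs/
│   └── AGENTS.md
├── config.yaml          # gitignored; copied from config.yaml.example
└── README.md
```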
## Prerequisites

Before contributing, ensure you have:

- Python 3.12+
- Rust & Cargo (latest stable)
- Git
- API keys (OpenRouter, Google Gemini, Groq, or DeepSeek)
### System Requirements

- OS: Linux, macOS, or Windows (WSL2 recommended)
- Memory: 4 GB+ RAM (embeddings run on CPU)
- Disk: ~1 GB
## Initial Setup
### 1. Clone the Repository
### 2. Run Setup Script
The automated setup handles virtual-environment creation and dependencies. It will:

- Check for Python 3.12+ and Rust/Cargo
- Create a Python virtual environment
- Install Python dependencies (optimized for CPU)
- Copy `assets_example/` to `assets/`
- Copy `config.yaml.example` to `config.yaml`
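The environment and copy steps above can be reproduced by hand roughly as follows (a sketch only; the setup script itself is authoritative, and file names such as `requirements.txt` are assumptions):

```shell
# Manual equivalent of the setup script's steps. Each step is guarded so the
# commands are safe to re-run; adjust paths to the actual repository layout.
python3 -m venv .venv
. .venv/bin/activate
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
if [ -d assets_example ] && [ ! -d assets ]; then cp -r assets_example assets; fi
if [ -f config.yaml.example ] && [ ! -f config.yaml ]; then cp config.yaml.example config.yaml; fi
```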
### 3. Configure API Keys
Edit `config.yaml` and add your API keys:
`config.yaml` is gitignored and should NEVER be committed.
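For illustration, the relevant keys might look like this (key names taken from the Configuration section below; values are placeholders, never real secrets):

```yaml
# Placeholder values only — config.yaml is gitignored; never commit real keys.
database_path: data/flower.db        # illustrative path
default_model: example/model-name    # illustrative model id
supported_models:
  - example/model-name
OPENAI_API_KEY: "sk-..."
DEEPSEEK_API_KEY: "..."
GEMINI_API_KEY: "..."
GROQ_API_KEY: "..."
```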
### 4. Manual Installation (Alternative)
If you prefer manual setup, perform the setup script's steps by hand.

## Development Workflow
### Running the Full System
Use the start script to launch both the backend and the TUI. It will:

- Start the FastAPI backend on port 8000
- Wait for backend readiness
- Launch the Rust TUI in full-screen mode
### Running Components Separately
For development, you often want to run components independently.

#### Backend Only
The backend serves the API at `http://localhost:8000`; the WebSocket endpoint is `ws://localhost:8000/ws/rpc`.
#### TUI Only
## Asset Files
Assets are YAML files defining worlds, characters, and rules:

- Location: `assets/`
- Format: YAML (`.yaml`)
- Structure: each asset has an `id` field plus type-specific fields
- Loading: `engine/utils.py::load_yaml_assets(pattern)`
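The loading contract can be illustrated with a minimal structural check (a sketch only; the real loader lives in `engine/utils.py::load_yaml_assets(pattern)` and `validate_asset` is a hypothetical helper):

```python
from typing import Any


def validate_asset(data: dict[str, Any], source: str) -> dict[str, Any]:
    """Check the one structural rule every asset shares: a required `id` field.

    Hypothetical helper for illustration; the real loading logic is in
    engine/utils.py::load_yaml_assets(pattern).
    """
    if "id" not in data:
        raise ValueError(f"{source}: asset is missing required 'id' field")
    return data


# A world asset carries `id` plus type-specific fields.
world = validate_asset({"id": "example-world", "type": "world"}, "assets/world.yaml")
print(world["id"])  # → example-world
```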
### Example World Asset
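A minimal illustrative world asset — only `id` is a known required field; the other field names are assumptions:

```yaml
# Illustrative only: `id` is the one field every asset must have.
id: example-world
name: Example World
description: A small demonstration setting used for testing.
```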
### Example Character Asset
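A minimal illustrative character asset — again, field names beyond `id` are assumptions:

```yaml
# Illustrative only: field names beyond `id` are assumptions.
id: example-character
name: Example Character
world: example-world
persona: A friendly guide who explains the setting.
```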
## Configuration
Edit `config.yaml` to set:

- `database_path`: SQLite storage location
- `default_model`: default LLM model
- `supported_models`: list of available models
- API keys for providers (OpenRouter, DeepSeek, Groq, Gemini)

The configuration also references the keys `MODEL_NAME`, `OPENAI_API_KEY`, `DEEPSEEK_API_KEY`, `GEMINI_API_KEY`, and `GROQ_API_KEY`.
## WebSocket Protocol
The Python backend and Rust TUI communicate via JSON messages.

### Known Events
- `sync_state`: state synchronization
- `chat_history`: historical messages
- `system_update`: system notifications
- `chat_chunk`: streaming LLM response
- `chat_end`: stream completion
- `error`: error messages
See `tui/src/models.rs` for complete message schemas.
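A hedged sketch of how a client might route these events, assuming an `{"event": ..., "data": ...}` envelope (the envelope shape is an assumption; the real schemas are defined in `tui/src/models.rs`):

```python
import json

# Event names from the list above.
KNOWN_EVENTS = {"sync_state", "chat_history", "system_update",
                "chat_chunk", "chat_end", "error"}


def dispatch(raw: str) -> str:
    """Parse one WebSocket frame and route it by event name.

    The {"event": ..., "data": ...} envelope is assumed for illustration.
    """
    msg = json.loads(raw)
    event = msg.get("event")
    if event not in KNOWN_EVENTS:
        raise ValueError(f"unknown event: {event!r}")
    if event == "chat_chunk":
        return f"stream: {msg['data']}"  # streaming chunks carry text
    return f"handled: {event}"


print(dispatch('{"event": "chat_chunk", "data": "Hello"}'))  # → stream: Hello
```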
## Common Development Tasks
### Adding a New Command
- Add a handler in `engine/commands.py`
- Parse the command string with `parts = cmd_str.split(" ", 2)`
- Send the response via `await websocket.send_text(build_ws_payload(...))`
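The parsing step can be sketched in isolation (the `split(" ", 2)` pattern is from the list above; the surrounding handler shape is an assumption):

```python
def parse_command(cmd_str: str) -> tuple[str, list[str]]:
    """Split a command string into its name and up to two trailing arguments.

    Mirrors the `cmd_str.split(" ", 2)` pattern used in engine/commands.py;
    the function itself is a hypothetical helper for illustration.
    """
    parts = cmd_str.split(" ", 2)  # at most 3 pieces: name + 2 args
    return parts[0], parts[1:]


name, args = parse_command("/roll 2d6 advantage")
print(name, args)  # → /roll ['2d6', 'advantage']
```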
Adding a New WebSocket Event
- Define the event in both Python and Rust
- Python: send via `build_ws_payload()`
- Rust: handle in the `main.rs::run_app()` match statement
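On the Python side, the payload builder might look like this (a hypothetical shape; the real `build_ws_payload()` in the engine may use a different envelope):

```python
import json


def build_ws_payload(event: str, data: object) -> str:
    """Serialize an event into the assumed {"event": ..., "data": ...} envelope.

    Hypothetical sketch of build_ws_payload(); check the engine source for
    the actual envelope before relying on this shape.
    """
    return json.dumps({"event": event, "data": data})


print(build_ws_payload("system_update", {"msg": "saved"}))
# → {"event": "system_update", "data": {"msg": "saved"}}
```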
### Modifying Database Schema
- Add migration logic in `database.py::init_db()`
- Use the `try/except sqlite3.OperationalError` pattern for `ALTER TABLE`
- Update Pydantic models accordingly
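The `try/except sqlite3.OperationalError` pattern makes `ALTER TABLE` idempotent, so `init_db()` can run safely on both fresh and already-migrated databases. A self-contained sketch (table and column names are illustrative):

```python
import sqlite3


def migrate(conn: sqlite3.Connection) -> None:
    """Idempotent schema migration in the style described for init_db().

    Adding a column that already exists raises OperationalError, which we
    swallow so re-running initialization is safe.
    """
    conn.execute("CREATE TABLE IF NOT EXISTS messages (id INTEGER PRIMARY KEY)")
    try:
        conn.execute("ALTER TABLE messages ADD COLUMN role TEXT")
    except sqlite3.OperationalError:
        pass  # column already present — nothing to do


conn = sqlite3.connect(":memory:")
migrate(conn)
migrate(conn)  # safe to run twice
cols = [row[1] for row in conn.execute("PRAGMA table_info(messages)")]
print(cols)  # → ['id', 'role']
```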
## Getting Help
For development questions:

- Check the detailed guides
- Review the codebase: `docs/AGENTS.md` contains agent-specific guidance; `README.md` covers user documentation
- Open an issue on GitHub for bugs or feature requests
## Next Steps
- Learn about Python backend development
- Explore Rust TUI development
- Understand the testing workflow