# Installation

This guide covers installing all prerequisites and setting up MoneyPrinter for local development.

## System Requirements
- Operating System: Linux, macOS, or Windows
- Python: 3.11 or higher
- Memory: 8GB RAM minimum (16GB recommended for larger models)
- Storage: 10GB+ for models and temporary video files
## Prerequisites
### 1. Python 3.11+
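A quick way to check, assuming `python3` is on your `PATH`:

```shell
# Print the interpreter version; it should report 3.11 or newer
python3 --version
```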
Verify that `python3 --version` reports 3.11 or newer.

### 2. uv Package Manager
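If uv is not installed yet, Astral's installer script is one option on Unix-like systems (uv can also be installed with `pip install uv`):

```shell
# Download and run the official uv installer from astral.sh
curl -LsSf https://astral.sh/uv/install.sh | sh
```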
MoneyPrinter uses uv for fast, reliable dependency management.

### 3. FFmpeg
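One way to install it, assuming a Debian/Ubuntu system or Homebrew on macOS (standard package names):

```shell
# Debian/Ubuntu
sudo apt install ffmpeg

# macOS (Homebrew)
brew install ffmpeg

# Verify the install
ffmpeg -version
```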
FFmpeg is required for video processing and encoding.

### 4. ImageMagick
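As with FFmpeg, the system package manager is the usual route (package names assumed standard):

```shell
# Debian/Ubuntu
sudo apt install imagemagick

# macOS (Homebrew)
brew install imagemagick

# Verify: "convert -version" on ImageMagick 6, "magick -version" on 7
convert -version
```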
ImageMagick is used for subtitle rendering and text overlays.

### 5. Ollama
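Ollama's published install script is one route on Linux (macOS and Windows installers are available from ollama.com):

```shell
# Download and run the official Ollama installer
curl -fsSL https://ollama.com/install.sh | sh
```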
Ollama provides local LLM inference for script generation. Recommended models:
- `llama3.1:8b` - Balanced performance and quality
- `mistral:7b` - Fast inference
- `llama3.1:70b` - Best quality (requires a high-end GPU)
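Once Ollama is installed, pull one of the models above with `ollama pull`, which downloads it for local use:

```shell
# Download the balanced default model listed above
ollama pull llama3.1:8b
```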
## Install MoneyPrinter
### Clone Repository
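A typical clone step; `<repository-url>` is a placeholder for the actual MoneyPrinter remote, and the directory name may differ:

```shell
# Replace <repository-url> with the MoneyPrinter repository URL
git clone <repository-url>
cd MoneyPrinter
```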
### Install Python Dependencies
Using uv (recommended): uv reads `pyproject.toml` and installs all required packages into a managed virtual environment.
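The standard invocation from the project root:

```shell
# Resolve and install everything declared in pyproject.toml
uv sync
```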
### Alternative: Using pip
If you prefer pip over uv:
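A conventional venv-based setup (a sketch; assumes the project is installable from its `pyproject.toml`):

```shell
# Create and activate a virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install the project and its dependencies in editable mode
pip install -e .
```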
### Create Environment File
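If the repository ships an example file (the `.env.example` name here is an assumption), copy it and fill in your values:

```shell
# Create your local environment file from the template
cp .env.example .env
```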
### Verify Installation
Run the startup checks. If a check reports missing configuration, edit `.env` to add the required values.
### Optional: Development Dependencies
To run tests and development tools, install the development dependencies declared in `pyproject.toml`.
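With uv, development dependencies are typically synced as a dependency group; the `dev` group name here is an assumption about this project's `pyproject.toml`:

```shell
# Install the project plus the "dev" dependency group
uv sync --group dev
```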
## Directory Structure
After installation, your project should contain the cloned source tree, the virtual environment created by uv, and your `.env` file.

## Next Steps
- **Configuration** - Configure environment variables and API keys
- **Quickstart** - Generate your first video
- **Docker Setup** - Deploy with Docker Compose
- **Troubleshooting** - Common installation issues
## Common Issues
### `uv sync` fails with dependency conflicts
Ensure you're using Python 3.11+. If you have multiple Python versions installed, tell uv which interpreter to use explicitly.
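A sketch of both checks (the `--python` flag is uv's standard way to select an interpreter):

```shell
# Confirm the default interpreter version
python3 --version

# Pin uv to a specific interpreter when several are installed
uv sync --python 3.11
```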
### ImageMagick policy error when rendering subtitles
Edit ImageMagick's policy file to enable video processing. On Linux, the file is at `/etc/ImageMagick-6/policy.xml` or `/etc/ImageMagick-7/policy.xml`. Find and comment out (or remove) the restrictive policy line, or add explicit permissions for video formats.
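The restrictive line is typically the `@*` path policy, a common culprit for subtitle and text rendering failures with ImageMagick-backed tools:

```xml
<!-- Comment out or delete this line in policy.xml -->
<policy domain="path" rights="none" pattern="@*" />
```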
### Ollama connection refused
Ensure Ollama is running and check that it is accessible on its default port. If Ollama runs on another machine, set `OLLAMA_BASE_URL` in `.env`.
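Assuming a default local install (Ollama listens on port 11434):

```shell
# Start the server if it is not already running as a service
ollama serve &

# Check that the API responds
curl http://localhost:11434/api/tags
```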