Prerequisites
System Requirements
- Operating System: Linux (Ubuntu 20.04+, Debian 11+, CentOS 8+), macOS, Windows with WSL2
- CPU: 2 cores minimum, 4+ recommended
- RAM: 4GB minimum, 8GB+ recommended
- Disk Space: 50GB minimum for application and data
Software Requirements
Install Docker Engine and Docker Compose:
- Docker Engine: 20.10.0 or higher
- Docker Compose: 2.0.0 or higher
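The exact install steps depend on your distribution; one common route on Ubuntu/Debian is Docker's official convenience script, sketched below (check your distribution's packaging guidance before running it):

```shell
# Install Docker Engine via Docker's convenience script (Ubuntu/Debian sketch)
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# Verify the versions meet the requirements above
docker --version          # needs 20.10.0+
docker compose version    # needs 2.0.0+
```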
Quick Start
1. Clone the Repository
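For example, using CVAT's public GitHub repository:

```shell
# Clone the CVAT source and enter the checkout
git clone https://github.com/cvat-ai/cvat
cd cvat
```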
2. Set Environment Variables
Set the hostname that CVAT will be accessible from:
3. Start CVAT
Pull the latest images and start all services:
- Pull all required Docker images (~5-10 minutes depending on connection)
- Create Docker volumes for persistent storage
- Start all CVAT services in the background
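Steps 2 and 3 above might look like this (the hostname value is an example; run the commands from the repository root where docker-compose.yml lives):

```shell
# Step 2: hostname CVAT will be served from (example value — use your domain)
export CVAT_HOST=localhost

# Step 3: pull images and start all services in the background
docker compose up -d
```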
4. Create a Superuser
After services start, create an admin account. You will be prompted for:
- Username
- Email address
- Password (twice)
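One way to do this, assuming the server container is named cvat_server (as listed under Core Services below) and that manage.py sits in the container user's home directory as in upstream CVAT images:

```shell
# Run Django's createsuperuser inside the running server container
docker exec -it cvat_server bash -ic 'python3 ~/manage.py createsuperuser'
```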
5. Access CVAT
Open your browser and navigate to the hostname you configured (by default http://localhost:8080).
Service Architecture
The default Docker Compose deployment includes these services:
Core Services
- cvat_server: Main API server (Django)
- cvat_ui: Frontend React application
- traefik: Reverse proxy and load balancer
Data Services
- cvat_db: PostgreSQL 15 database
- cvat_redis_inmem: Redis 7.2.11 for caching
- cvat_redis_ondisk: Apache Kvrocks 2.12.1 for persistent cache
- cvat_clickhouse: ClickHouse for analytics
Worker Services
- cvat_worker_import: Dataset imports (NUMPROCS=2)
- cvat_worker_export: Annotation exports (NUMPROCS=2)
- cvat_worker_annotation: Annotation processing
- cvat_worker_webhooks: Webhook delivery
- cvat_worker_quality_reports: Quality metrics
- cvat_worker_chunks: Media chunk processing (NUMPROCS=2)
- cvat_worker_consensus: Consensus calculations
- cvat_worker_utils: Notifications and cleanup
Analytics Services
- cvat_vector: Log collection and forwarding
- cvat_grafana: Analytics dashboards
Security
- cvat_opa: Open Policy Agent for authorization
Configuration Options
Environment Variables
Create a .env file or export variables:
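A minimal example .env; the values are placeholders, and the variable names follow upstream CVAT conventions, so check them against your compose file:

```shell
# .env — example values; adjust for your deployment
CVAT_HOST=cvat.example.com   # hostname CVAT is served from
CVAT_VERSION=dev             # image tag to deploy
```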
Using Specific Versions
To use a specific CVAT version instead of dev:
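For instance, pinning a release tag through an environment variable (CVAT_VERSION follows upstream convention; the tag shown is illustrative):

```shell
# Deploy a pinned release tag instead of the dev images (tag is an example)
CVAT_VERSION=v2.11.0 docker compose up -d
```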
Scaling Workers
Edit docker-compose.yml to adjust worker processes:
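A sketch of the kind of change involved, raising NUMPROCS for the import worker (service name taken from the list above; the value 4 is an example):

```yaml
services:
  cvat_worker_import:
    environment:
      NUMPROCS: 4   # default is 2; raise for heavier import load
```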
Production Deployment
HTTPS Configuration
Use the HTTPS overlay for Let’s Encrypt SSL certificates:
- Redirects HTTP (port 80) to HTTPS (port 443)
- Automatically obtains SSL certificates from Let’s Encrypt
- Renews certificates automatically
Exposed ports:
- 80: HTTP (redirects to HTTPS)
- 443: HTTPS
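Assuming the overlay file is named docker-compose.https.yml as in the upstream repository, and that CVAT_HOST and ACME_EMAIL are the variables it expects, starting with HTTPS might look like:

```shell
export CVAT_HOST=cvat.example.com     # your public domain (example)
export ACME_EMAIL=admin@example.com   # contact address for Let's Encrypt (example)
docker compose -f docker-compose.yml -f docker-compose.https.yml up -d
```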
External Database
For production, consider using an external PostgreSQL database. Create docker-compose.external_db.yml:
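A sketch of such an override; the environment variable names follow upstream CVAT's settings, and the host and credentials are placeholders:

```yaml
# docker-compose.external_db.yml — point the server at an external PostgreSQL
services:
  cvat_server:
    environment:
      CVAT_POSTGRES_HOST: db.example.com   # external DB host (placeholder)
      CVAT_POSTGRES_USER: cvat
      CVAT_POSTGRES_DBNAME: cvat
      CVAT_POSTGRES_PASSWORD: change-me    # use a proper secret in production
```

Start CVAT with both files: docker compose -f docker-compose.yml -f docker-compose.external_db.yml up -d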
Persistent Storage
Docker volumes are created automatically.
Common Operations
Start/Stop Services
View Logs
Update CVAT
Restart a Single Service
Check Service Status
Execute Commands in Containers
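With Docker Compose v2, the operations above map onto standard commands (service names as listed in the architecture section):

```shell
docker compose up -d                          # start all services
docker compose down                           # stop and remove containers
docker compose logs -f cvat_server            # follow logs for one service
docker compose pull && docker compose up -d   # update to newer images
docker compose restart cvat_server            # restart a single service
docker compose ps                             # check service status
docker exec -it cvat_server bash              # open a shell in a container
```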
Troubleshooting
Services Won’t Start
Check logs:
- Port 8080 already in use: Change port or stop conflicting service
- Insufficient memory: Increase Docker memory limit
- Permission errors: Check file ownership and Docker socket access
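Standard Docker and system commands cover these checks:

```shell
docker compose ps                            # which services are up or restarting
docker compose logs --tail=100 cvat_server   # recent server logs
sudo lsof -i :8080                           # what is holding port 8080
docker system df                             # disk used by images, containers, volumes
```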
Database Connection Errors
Wait for database initialization to complete.
Worker Process Issues
Check worker status.
Out of Disk Space
Check volume sizes.
Performance Issues
Increase worker processes: edit docker-compose.yml and increase NUMPROCS for workers (see Scaling Workers above).
Cannot Access UI
Check Traefik.
Development Setup
For development with hot-reload and debugging:
- Builds images from local source
- Enables debug ports
- Exposes service ports for direct access
- Mounts source code for development
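Assuming the development overlay is named docker-compose.dev.yml as in the upstream repository, a development stack can be brought up with:

```shell
# Build from local source and start with the dev overlay
docker compose -f docker-compose.yml -f docker-compose.dev.yml up -d --build
```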
Security Considerations
- Change default passwords: PostgreSQL, Redis, and ClickHouse
- Use HTTPS in production: Always use SSL/TLS certificates
- Configure ALLOWED_HOSTS: Restrict to your domain(s)
- Enable firewall: Only expose ports 80 and 443
- Regular updates: Keep images and dependencies updated
- Backup regularly: Automate backups of volumes and database
- Monitor logs: Set up log aggregation and alerts
Resource Requirements
Minimum Configuration
- 2 CPU cores
- 4GB RAM
- 50GB disk space
- Suitable for 1-5 users, small datasets
Recommended Configuration
- 4+ CPU cores
- 8GB+ RAM
- 100GB+ SSD storage
- Suitable for 5-20 users, medium datasets
Production Configuration
- 8+ CPU cores
- 16GB+ RAM
- 500GB+ SSD storage
- External PostgreSQL with replication
- Shared network storage for media
- Suitable for 20+ users, large datasets