Overview
Watch N Chill can be deployed to various platforms. This guide covers production considerations, CORS configuration, Redis hosting, and platform-specific instructions.

Pre-Deployment Checklist
Environment Variables
Ensure all required production variables are set:
- `NODE_ENV=production` - Enables production optimizations
- `REDIS_URL` - Connection to production Redis instance
- `ALLOWED_ORIGINS` - Comma-separated list of allowed domains
- `PORT` - HTTP port (usually auto-configured by platform)
- `RATE_LIMIT_WINDOW_MS` - Rate limit window in ms (default: 60000)
- `RATE_LIMIT_MAX_REQUESTS` - Max requests per window (default: 360)
- `RATE_LIMIT_SOCKET_MAX_PER_IP` - Max concurrent connections per IP (default: 10)
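A production environment file might look like the following (values are illustrative; substitute your own domains, password, and Redis host):

```
NODE_ENV=production
REDIS_URL=rediss://default:your-password@your-redis-host:6379
ALLOWED_ORIGINS=https://watchnchill.example.com,https://www.watchnchill.example.com
PORT=3000
RATE_LIMIT_WINDOW_MS=60000
RATE_LIMIT_MAX_REQUESTS=360
RATE_LIMIT_SOCKET_MAX_PER_IP=10
```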
Redis Instance
Set up a production Redis instance. See Redis Hosting Options.
Redis Hosting Options
Upstash (Recommended)
Upstash provides serverless Redis with automatic TLS configuration.

Create Database
- Sign up at upstash.com
- Create a new Redis database
- Choose a region close to your application
Benefits:
- Free tier with 10,000 commands/day
- Global edge replication
- Built-in metrics and monitoring
- Automatic backups
Redis Cloud
Redis Cloud (by Redis Labs) offers managed Redis:

- Sign up at redis.com/cloud
- Create a database
- Copy the connection URL
- Set `REDIS_URL` in your environment
Self-Hosted Redis
For self-hosted deployments, ensure:

- Redis persistence is enabled (`appendonly yes`)
- Backups are configured
- Access is restricted by IP or VPN
- TLS is enabled if exposed to the internet
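A matching `redis.conf` excerpt might look like this (illustrative values; adjust the interface and password for your environment):

```
# Persistence: append-only file, fsync once per second
appendonly yes
appendfsync everysec
# Restrict access: bind to a private interface and require a password
bind 10.0.0.5
requirepass replace-with-a-strong-password
```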
CORS Configuration
Setting ALLOWED_ORIGINS
The `ALLOWED_ORIGINS` environment variable controls Socket.IO CORS.
CORS Implementation
The Socket.IO CORS configuration in `src/backend/socket/index.ts:20-28`:

- Development mode auto-allows `http://localhost:3000`
- Production requires explicit `ALLOWED_ORIGINS`
- Origins are split by comma
- Credentials are enabled for cookie support
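The origin-resolution behavior described above can be sketched as follows (the function name and shape are assumptions, not the actual contents of `src/backend/socket/index.ts`):

```typescript
// Sketch of the described CORS origin logic: development auto-allows
// localhost:3000; production splits ALLOWED_ORIGINS on commas.
type Env = Record<string, string | undefined>;

export function resolveAllowedOrigins(env: Env): string[] {
  if (env.NODE_ENV !== "production") {
    return ["http://localhost:3000"];
  }
  return (env.ALLOWED_ORIGINS ?? "")
    .split(",")
    .map((origin) => origin.trim())
    .filter(Boolean);
}
```

The resulting array would then be passed to Socket.IO's `cors.origin` option, with `credentials: true` for cookie support.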
Rate Limiting
Rate limiting is enabled automatically in production (`NODE_ENV=production`).
HTTP Rate Limiting
Implemented in `server.ts:56-67`:

- Window: 60 seconds (configurable via `RATE_LIMIT_WINDOW_MS`)
- Limit: 360 requests (configurable via `RATE_LIMIT_MAX_REQUESTS`)
- Per: IP address
- Headers: `X-RateLimit-Limit`, `X-RateLimit-Remaining`, `Retry-After`
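A minimal fixed-window limiter illustrating the behavior above (names and structure are assumptions; the real logic lives in `server.ts`):

```typescript
// Fixed-window rate limiter: allow up to `max` requests per `windowMs`
// per key (here, the client IP).
export function makeRateLimiter(windowMs = 60_000, max = 360) {
  const windows = new Map<string, { start: number; count: number }>();
  return (ip: string, now: number): boolean => {
    const w = windows.get(ip);
    if (!w || now - w.start >= windowMs) {
      // New window: reset the counter for this IP
      windows.set(ip, { start: now, count: 1 });
      return true;
    }
    w.count += 1;
    return w.count <= max;
  };
}
```

On rejection, the server would respond with HTTP 429 and set `Retry-After` to the time remaining in the window.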
Socket.IO Connection Limiting
Implemented in `src/backend/rate-limit.ts:86-102`:

- Limit: 10 concurrent connections per IP (configurable via `RATE_LIMIT_SOCKET_MAX_PER_IP`)
- Behavior: new connections are rejected once the limit is exceeded
- Fallback: connections are allowed if Redis is unavailable
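The per-IP connection cap can be sketched in memory like this (names are assumptions; the real implementation tracks counts in Redis and, per the fallback above, allows connections when Redis is down):

```typescript
// Track concurrent connections per IP; reject once the cap is hit.
export function makeConnectionLimiter(maxPerIp = 10) {
  const counts = new Map<string, number>();
  return {
    tryConnect(ip: string): boolean {
      const n = counts.get(ip) ?? 0;
      if (n >= maxPerIp) return false; // cap reached: reject
      counts.set(ip, n + 1);
      return true;
    },
    disconnect(ip: string): void {
      const n = counts.get(ip) ?? 0;
      if (n <= 1) counts.delete(ip);
      else counts.set(ip, n - 1);
    },
  };
}
```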
Adjusting Rate Limits
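For example, to double the HTTP allowance and the per-IP socket cap (illustrative values):

```
RATE_LIMIT_WINDOW_MS=60000
RATE_LIMIT_MAX_REQUESTS=720
RATE_LIMIT_SOCKET_MAX_PER_IP=20
```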
Platform-Specific Guides
Render
Render provides easy deployment with Docker support.

Create Web Service
- Go to render.com
- Click “New +” → “Web Service”
- Connect your GitHub repository
Configure Service
- Name: `watchnchill`
- Environment: `Docker`
- Region: choose the region closest to your users
- Branch: `main`
- Dockerfile Path: `./Dockerfile`
Benefits:
- Free tier available (with cold starts)
- Automatic SSL certificates
- Health checks via the `/health` endpoint
- Auto-deploys on git push
Railway
Railway offers simple deployment with built-in Redis.

Create Project
- Go to railway.app
- Click “New Project” → “Deploy from GitHub repo”
Benefits:
- $5/month free credit
- Built-in Redis plugin
- Automatic domains
- Zero-config deployments
Vercel
DigitalOcean App Platform
Create App
- Go to cloud.digitalocean.com/apps
- Click “Create App” → “GitHub”
Add Managed Redis
- Click “Add Resource” → “Database”
- Select “Redis” → Create
- DigitalOcean auto-configures `REDIS_URL`
Fly.io
Fly.io provides edge deployment with built-in Redis (Upstash integration).

Docker on VPS (AWS EC2, Hetzner, etc.)
For self-hosted deployments, run the container alongside a Redis instance (see Self-Hosted Redis above).

Monitoring and Health Checks
Health Endpoint
The application exposes a health check endpoint at `/health` (implemented in `server.ts:48-54`). Use it for:
- Container health checks
- Load balancer health checks
- Uptime monitoring (UptimeRobot, Pingdom)
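A handler of roughly this shape is a reasonable sketch (the payload fields are assumptions; the actual implementation is in `server.ts:48-54`):

```typescript
// Build the health-check payload: status plus basic process stats.
export function healthPayload(now: number = Date.now()) {
  return {
    status: "ok" as const,
    uptimeSeconds: Math.floor(process.uptime()),
    timestamp: now,
  };
}
```

The HTTP route would return this object as JSON with status 200, so load balancers and uptime monitors can treat any non-200 response as unhealthy.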
Monitoring Redis
Check the Redis connection and monitor:

- Connection count
- Memory usage
- Command rate
- Error rate
Application Logs
The application logs important events. In production, aggregate logs with a service such as:

- CloudWatch (AWS)
- Logtail/BetterStack
- Datadog
- New Relic
Performance Optimization
CDN for Static Assets
Use a CDN for Next.js static assets via `next.config.ts`:
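A hedged sketch of the relevant option (`assetPrefix` is the standard Next.js setting; the CDN URL is a placeholder):

```typescript
// next.config.ts — serve _next/static assets from a CDN in production
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  assetPrefix:
    process.env.NODE_ENV === "production"
      ? "https://cdn.example.com" // placeholder CDN origin
      : undefined,
};

export default nextConfig;
```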
Redis Connection Pooling
The application uses ioredis with optimized settings (in `src/backend/redis/client.ts:4-36`):
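An illustrative ioredis configuration in the spirit of those settings (the options here are assumptions, not the exact contents of `src/backend/redis/client.ts`):

```typescript
import Redis from "ioredis";

// Reconnect with capped exponential backoff; fail fast on commands
// rather than queueing forever when Redis is unreachable.
export const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379", {
  maxRetriesPerRequest: 3,
  retryStrategy: (attempt) => Math.min(attempt * 200, 2000),
  enableReadyCheck: true,
  lazyConnect: true, // connect on first command, not at import time
});
```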
Socket.IO Optimization
For high-traffic deployments:

- Use the Redis adapter for horizontal scaling
- Enable compression
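A sketch of both steps, assuming the official `@socket.io/redis-adapter` and `redis` packages and a reachable `REDIS_URL`:

```typescript
import { Server } from "socket.io";
import { createClient } from "redis";
import { createAdapter } from "@socket.io/redis-adapter";

const pubClient = createClient({ url: process.env.REDIS_URL });
const subClient = pubClient.duplicate();
await Promise.all([pubClient.connect(), subClient.connect()]);

const io = new Server({
  // Redis adapter broadcasts events across all server instances
  adapter: createAdapter(pubClient, subClient),
  // Compress WebSocket frames larger than 1 KB
  perMessageDeflate: { threshold: 1024 },
});
```

With the adapter in place, rooms and broadcasts work across instances, but load balancers still need sticky sessions for the HTTP long-polling fallback.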
Security Best Practices
Security Checklist
- `NODE_ENV=production` is set
- `ALLOWED_ORIGINS` is configured
- Redis uses TLS (`rediss://`)
- Redis password is strong
- Rate limiting is enabled
- HTTPS is enabled (via reverse proxy/platform)
- Security headers are set (via reverse proxy)
- Dependencies are up to date
Troubleshooting Production Issues
Socket.IO Won’t Connect
Check CORS:

- Ensure the reverse proxy forwards the `Upgrade` header
- Check that the firewall allows WebSocket connections
- Verify the platform supports persistent connections
Redis Connection Errors
High Memory Usage
Monitor Redis memory usage, and tune `src/backend/redis/room-handler.ts` if needed.
Rate Limit Too Restrictive
Adjust the limits via the `RATE_LIMIT_*` environment variables described above.

Scaling Considerations
For high-traffic deployments:

- Horizontal Scaling: Use Redis adapter for Socket.IO
- Redis Scaling: Use Redis Cluster or replicas
- Load Balancing: Use sticky sessions for Socket.IO
- CDN: Serve static assets from CDN
- Monitoring: Set up APM (Application Performance Monitoring)
Next Steps
Environment Variables
Complete environment variable reference
Docker Deployment
Deploy with Docker and Docker Compose
Architecture
Understand the system architecture
Rate Limiting
Configure rate limiting for production