Overview
As your Chatwoot usage grows, you’ll need to scale your infrastructure. This guide covers vertical scaling (more powerful servers) and horizontal scaling (more servers).

Performance Indicators
Monitor these metrics to determine when scaling is needed:

Key Metrics
- Response Time: API requests taking >500ms consistently
- Queue Latency: Sidekiq queues with >10 second latency
- Database Connections: Pool exhaustion (waiting connections)
- CPU Usage: Sustained >70% CPU usage
- Memory Usage: >80% memory consumption
- Disk I/O: I/O wait time >10%
- Concurrent Users: sustained growth in simultaneous agents and visitors
Monitoring Commands
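A few commands for spot-checking these metrics (assuming a standard Linux host with the sysstat package installed; the Sidekiq check runs from the Chatwoot directory):

```shell
# CPU, memory, and I/O wait
top
free -m
iostat -x 5

# Redis memory usage
redis-cli info memory | grep used_memory_human

# PostgreSQL connection states (watch for many waiting or "idle in transaction" clients)
sudo -u postgres psql -c "SELECT state, count(*) FROM pg_stat_activity GROUP BY state;"

# Sidekiq queue latency
bundle exec rails runner 'Sidekiq::Queue.all.each { |q| puts "#{q.name}: #{q.latency.round(1)}s" }'
```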
Vertical Scaling
Vertical scaling means increasing resources on existing servers.

Database Scaling
Increase Connection Pool
From config/database.yml:
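For example (a sketch; the pool is sized per process, so it must cover every thread that checks out a connection, whether Puma threads or Sidekiq concurrency):

```yaml
production:
  adapter: postgresql
  # Pool >= threads per process (Puma threads for web, Sidekiq concurrency for workers)
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 10 } %>
```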
Optimize PostgreSQL
Update postgresql.conf:
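Example values for a dedicated 8 GB database server (tune to your hardware; a tool such as pgtune can generate a starting point):

```
max_connections = 200
shared_buffers = 2GB          # ~25% of RAM
effective_cache_size = 6GB    # ~75% of RAM
work_mem = 16MB
maintenance_work_mem = 512MB
```

Changing shared_buffers requires a PostgreSQL restart, not just a reload.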
Database Maintenance
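Routine maintenance can be sketched with standard PostgreSQL commands (the messages table is used only as an example of a large table):

```sql
-- Reclaim dead tuples and refresh planner statistics
VACUUM (ANALYZE, VERBOSE);

-- Rebuild a table's indexes without blocking writes (PostgreSQL 12+)
REINDEX TABLE CONCURRENTLY messages;

-- Find the largest tables
SELECT relname, pg_size_pretty(pg_total_relation_size(relid))
FROM pg_statio_user_tables
ORDER BY pg_total_relation_size(relid) DESC
LIMIT 10;
```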
Redis Scaling
Increase Memory
Edit redis.conf:
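For example:

```
maxmemory 2gb
# Sidekiq stores jobs in Redis; never evict keys under memory pressure.
# Fail loudly instead, and scale memory up.
maxmemory-policy noeviction
```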
Redis Optimization
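A few settings commonly tuned on a dedicated Redis instance (a sketch; weigh each against your durability requirements):

```
appendonly yes        # AOF persistence is safer for queued jobs than snapshots
appendfsync everysec
save ""               # disable RDB snapshots to avoid fork-related latency spikes
tcp-keepalive 60
```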
Application Scaling
Increase Sidekiq Concurrency
From config/sidekiq.yml:
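For example (queue names and weights here are illustrative; keep the queues defined in the sidekiq.yml your installation ships with):

```yaml
:concurrency: 25
:queues:
  - [critical, 4]
  - [high, 3]
  - [default, 2]
  - [low, 1]
```

Each Sidekiq thread needs a database connection, so the pool in config/database.yml must be at least the concurrency value.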
Multiple Sidekiq Processes
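A sketch with hypothetical queue groupings:

```shell
# Worker 1: latency-sensitive queues, lower concurrency
bundle exec sidekiq -C config/sidekiq.yml -q critical -q high -c 10

# Worker 2: everything else
bundle exec sidekiq -C config/sidekiq.yml -q default -q low -c 20
```

In production, run each process under systemd or your process supervisor rather than in an interactive shell.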
Run specialized workers dedicated to specific queues.

Increase Rails Threads
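For example, in the environment of each web process (WEB_CONCURRENCY controls the number of Puma workers, RAILS_MAX_THREADS the threads per worker; these values are starting points, not recommendations):

```shell
RAILS_MAX_THREADS=10
WEB_CONCURRENCY=4
```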
Horizontal Scaling
Horizontal scaling distributes load across multiple servers.

Architecture Overview
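One possible topology:

```
clients
   |
   v
load balancer (e.g. Nginx)
   |-- Rails instance 1 --+
   |-- Rails instance 2 --+--> PostgreSQL / Redis / object storage (shared)
   +-- Sidekiq workers ---+
```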
Multiple Rails Instances
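Every instance must point at the same backing services and share the same secret material, for example (hostnames are placeholders):

```shell
SECRET_KEY_BASE=<identical value on every instance>
POSTGRES_HOST=db.internal
REDIS_URL=redis://redis.internal:6379
```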
Load Balancer Configuration
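A sketch of an Nginx front end (backend addresses, ports, and the server name are placeholders):

```nginx
upstream chatwoot {
    # ip_hash pins each client to one backend (see Session Affinity below)
    ip_hash;
    server 10.0.0.11:3000;
    server 10.0.0.12:3000;
}

server {
    listen 80;
    server_name chat.example.com;

    location / {
        proxy_pass http://chatwoot;
        proxy_http_version 1.1;
        # Required for ActionCable WebSocket upgrades
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```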
Nginx is a common choice of load balancer.

Session Affinity
For WebSocket (ActionCable) connections, use sticky sessions so each client keeps hitting the same instance.

Shared Storage
Rails instances need shared file storage for Active Storage.

S3-Compatible Storage
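With standard Rails Active Storage, the service is defined in config/storage.yml; a sketch (bucket and region are placeholders, and Chatwoot selects the active service through its environment configuration, so check your version's documentation for the exact variables):

```yaml
amazon:
  service: S3
  access_key_id: <%= ENV["AWS_ACCESS_KEY_ID"] %>
  secret_access_key: <%= ENV["AWS_SECRET_ACCESS_KEY"] %>
  region: us-east-1
  bucket: chatwoot-uploads
```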
S3-compatible object storage is the recommended option for horizontal scaling.

NFS Shared Storage
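A sketch of the mount on each Rails instance (server address and paths are placeholders):

```
# /etc/fstab
nfs.internal:/exports/chatwoot  /home/chatwoot/chatwoot/storage  nfs  defaults,_netdev  0  0
```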
NFS is a simpler alternative when S3-compatible storage is not available.

Multiple Sidekiq Workers
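On each additional worker server, point at the shared Redis and start Sidekiq (the hostname is a placeholder):

```shell
export REDIS_URL=redis://redis.internal:6379
bundle exec sidekiq -C config/sidekiq.yml
```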
Distributing background jobs across multiple servers isolates job load from web traffic.

Database Scaling
Read Replicas
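Rails supports this natively through its multiple-database configuration; a sketch (hostnames are placeholders, and routing reads to the replica requires additional application configuration, so treat this as the underlying mechanism rather than a drop-in Chatwoot change):

```yaml
production:
  primary:
    adapter: postgresql
    host: db-primary.internal
  primary_replica:
    adapter: postgresql
    host: db-replica.internal
    replica: true
```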
Read replicas offload read-heavy queries from the primary.

Connection Pooling
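A minimal pgbouncer.ini sketch (database name and limits are examples):

```ini
[databases]
chatwoot = host=127.0.0.1 port=5432 dbname=chatwoot_production

[pgbouncer]
listen_addr = 0.0.0.0
listen_port = 6432
pool_mode = transaction
default_pool_size = 25
max_client_conn = 500
```

With transaction pooling, disable prepared statements in Rails (prepared_statements: false in database.yml), since prepared statements are tied to individual server connections.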
PgBouncer multiplexes many application connections over a small pool of server connections.

Database Partitioning
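A generic PostgreSQL declarative-partitioning sketch (the table and columns are illustrative, not a Chatwoot migration):

```sql
CREATE TABLE events (
    id bigserial,
    account_id bigint NOT NULL,
    created_at timestamptz NOT NULL
) PARTITION BY RANGE (created_at);

CREATE TABLE events_2024 PARTITION OF events
    FOR VALUES FROM ('2024-01-01') TO ('2025-01-01');
```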
For very large tables, consider partitioning.

Redis Scaling
Redis Sentinel (High Availability)
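Run one Redis primary, one or more replicas, and at least three Sentinel processes; a sentinel.conf sketch (addresses and quorum are examples):

```
sentinel monitor mymaster 10.0.0.21 6379 2
sentinel down-after-milliseconds mymaster 5000
sentinel failover-timeout mymaster 10000
```

Clients must also be Sentinel-aware; check your Chatwoot version's environment variables for Sentinel support.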
Redis Cluster
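A cluster can be created with redis-cli (example addresses; three primaries with one replica each):

```shell
redis-cli --cluster create \
  10.0.0.31:6379 10.0.0.32:6379 10.0.0.33:6379 \
  10.0.0.34:6379 10.0.0.35:6379 10.0.0.36:6379 \
  --cluster-replicas 1
```

Note that Sidekiq does not support Redis Cluster, so for Chatwoot's job queue Sentinel is usually the better fit.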
Redis Cluster shards data across nodes for very high throughput.

Docker Swarm / Kubernetes
Docker Swarm
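For example (service names depend on your stack file):

```shell
docker service scale chatwoot_rails=3 chatwoot_sidekiq=2
```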
Kubernetes
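For example (deployment names depend on your manifests or Helm chart):

```shell
kubectl scale deployment chatwoot-web --replicas=3

# Or autoscale on CPU
kubectl autoscale deployment chatwoot-web --min=2 --max=6 --cpu-percent=70
```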
Caching Strategies
Application-Level Caching
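In Rails this is the cache-aside pattern behind Rails.cache.fetch("some_key", expires_in: 5.minutes) { expensive_work }. A self-contained sketch of the idea (TtlCache is illustrative, not a Chatwoot class):

```ruby
# Cache-aside with expiry, mirroring Rails.cache.fetch semantics.
class TtlCache
  def initialize
    @store = {}
  end

  # Return the cached value if fresh; otherwise compute, store, and return it.
  def fetch(key, expires_in:)
    entry = @store[key]
    return entry[:value] if entry && Time.now < entry[:expires_at]

    value = yield
    @store[key] = { value: value, expires_at: Time.now + expires_in }
    value
  end
end

cache = TtlCache.new
calls = 0
2.times { cache.fetch("dashboard_stats", expires_in: 300) { calls += 1; "expensive result" } }
puts calls  # => 1
```

Good candidates are values that are expensive to compute but tolerate short staleness, such as dashboard counts.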
HTTP Caching with Nginx
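Static, fingerprinted assets can be cached aggressively at the Nginx layer (a sketch; chatwoot here is a hypothetical upstream or backend address):

```nginx
location ~* ^/(packs|assets)/ {
    proxy_pass http://chatwoot;
    expires 1y;
    add_header Cache-Control "public, immutable";
}
```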
Performance Optimization
Database Indexing
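Before adding indexes, check which large tables are being sequentially scanned; then add indexes concurrently so writes are not blocked (the columns here are illustrative, and Chatwoot already ships indexes for most hot paths):

```sql
SELECT relname, seq_scan, idx_scan
FROM pg_stat_user_tables
ORDER BY seq_scan DESC
LIMIT 10;

CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_messages_account_created
    ON messages (account_id, created_at);
```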
Background Job Optimization
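One frequent win is enqueuing work in batches instead of one job per record; a self-contained sketch of the batching step (with Sidekiq, each batch would become one perform_async call, or Sidekiq::Client.push_bulk would enqueue many jobs in a single Redis round trip):

```ruby
record_ids = (1..1000).to_a

# One job per record would mean 1000 enqueues; batching by 100 means 10.
enqueued = []
record_ids.each_slice(100) { |batch| enqueued << batch }

puts enqueued.size  # => 10
```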
Scaling Checklist
- Monitor performance metrics
- Identify bottlenecks (database, Redis, application)
- Start with vertical scaling if feasible
- Optimize database queries and add indexes
- Configure database connection pooling
- Increase Sidekiq concurrency
- Set up multiple Sidekiq workers for different queues
- Implement horizontal scaling for Rails instances
- Configure load balancer with health checks
- Use shared storage (S3 or NFS)
- Set up database read replicas
- Configure Redis Sentinel or Cluster
- Implement caching strategies
- Monitor after scaling changes
- Load test to verify capacity
- Document your scaling configuration

