Monitor and maintain your ArcHive deployment to ensure reliability and performance.

PM2 Monitoring

PM2 provides built-in monitoring and management tools for Node.js processes.

Real-Time Monitoring

# Interactive process monitor
pm2 monit
Shows in real time:
  • CPU usage per process
  • Memory consumption
  • Process status
  • Log stream

Process Status

# List all processes
pm2 list

# Detailed process information
pm2 show archive-api

# Process metrics (describe is an alias of show)
pm2 describe archive-api

PM2 Dashboard

For web-based monitoring, use PM2 Plus (optional):
# Link to PM2 Plus (free tier available)
pm2 link <secret_key> <public_key>

# Start monitoring
pm2 monitor
Access the dashboard at app.pm2.io

Log Management

Viewing Logs

# Stream all application logs
pm2 logs

# Show last 100 lines
pm2 logs --lines 100

# Show raw output (no process-name prefixes)
pm2 logs --raw

Log File Locations

/home/archive-api/htdocs/logs/
├── api-error.log
├── api-out.log
├── screenshot-worker-error.log
├── screenshot-worker-out.log
├── tag-worker-error.log
└── tag-worker-out.log

Log Rotation

Automatically rotate logs to prevent disk space issues:

1. Install PM2 Log Rotate

pm2 install pm2-logrotate

2. Configure Log Rotation

# Maximum log file size before rotation
pm2 set pm2-logrotate:max_size 10M

# Number of rotated logs to keep
pm2 set pm2-logrotate:retain 7

# Compress rotated logs
pm2 set pm2-logrotate:compress true

# Rotate logs at specific time (default: midnight)
pm2 set pm2-logrotate:rotateInterval '0 0 * * *'

3. Verify Configuration

pm2 conf pm2-logrotate

System Logs

# Access logs
sudo tail -f /var/log/nginx/access.log

# Error logs
sudo tail -f /var/log/nginx/error.log

# Site-specific logs (CloudPanel)
sudo tail -f /home/archive-api/logs/nginx/access.log
sudo tail -f /home/archive-api/logs/nginx/error.log

Clear Logs

# Clear all PM2 logs
pm2 flush

# Clear logs for specific process
pm2 flush archive-api

Database Backups

Automated MongoDB Backup

Create a backup script for automated daily backups:
/root/backup-mongodb.sh
#!/bin/bash

BACKUP_DIR="/home/archive-api/backups/mongodb"
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")
MONGODB_USER="archiveuser"
MONGODB_PASS="YOUR_ARCHIVE_DB_PASSWORD"
MONGODB_DB="archive"

# Create backup directory
mkdir -p $BACKUP_DIR

# Create backup
mongodump --username=$MONGODB_USER --password=$MONGODB_PASS \
  --authenticationDatabase=$MONGODB_DB --db=$MONGODB_DB \
  --out=$BACKUP_DIR/$TIMESTAMP

# Compress backup
tar -czf $BACKUP_DIR/archive_$TIMESTAMP.tar.gz -C $BACKUP_DIR $TIMESTAMP
rm -rf $BACKUP_DIR/$TIMESTAMP

# Keep only last 7 days of backups
find $BACKUP_DIR -name "archive_*.tar.gz" -mtime +7 -delete

echo "Backup completed: $BACKUP_DIR/archive_$TIMESTAMP.tar.gz"

1. Create Backup Script

sudo nano /root/backup-mongodb.sh
# Paste the script above

2. Make Executable

sudo chmod +x /root/backup-mongodb.sh

3. Test Backup

sudo /root/backup-mongodb.sh

4. Schedule with Cron

# Edit crontab
sudo crontab -e

# Add this line for daily backups at 2 AM:
0 2 * * * /root/backup-mongodb.sh >> /home/archive-api/htdocs/logs/backup.log 2>&1
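Before relying on the scheduled backups, confirm that the most recent archive is actually readable. The sketch below uses a hypothetical helper name (verify_backup); the directory matches the backup script above.

```shell
#!/bin/bash
# Sketch: check that the newest archive_*.tar.gz in a directory is a
# readable gzip tarball. verify_backup is a hypothetical helper name.
verify_backup() {
  # $1 = directory containing archive_*.tar.gz files
  local latest
  latest=$(ls -t "$1"/archive_*.tar.gz 2>/dev/null | head -n 1)
  if [ -n "$latest" ] && tar -tzf "$latest" >/dev/null 2>&1; then
    echo "OK: $latest"
  else
    echo "WARNING: no valid backup in $1"
    return 1
  fi
}

# Check the directory used by the backup script
verify_backup /home/archive-api/backups/mongodb || true
```

A check like this can run from the same cron job, right after the backup script, so a silent backup failure shows up in the backup log.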

Manual Backup

# Create manual backup
mongodump --username=archiveuser --password=YOUR_ARCHIVE_DB_PASSWORD \
  --authenticationDatabase=archive --db=archive \
  --out=/home/archive-api/backups/manual_backup

# Compress backup (run from the directory containing manual_backup/)
cd /home/archive-api/backups
tar -czf manual_backup_$(date +"%Y%m%d").tar.gz manual_backup/

Restore Database

# Extract backup
tar -xzf archive_20240315_020000.tar.gz

# Restore to database
mongorestore --username=archiveuser --password=YOUR_ARCHIVE_DB_PASSWORD \
  --authenticationDatabase=archive --db=archive \
  --drop \
  archive_20240315_020000/archive/
The --drop flag will drop existing collections before restoring. Use with caution in production.

System Monitoring

Resource Usage

# Monitor CPU and memory
htop

# Check disk usage
df -h

# Check memory usage
free -h

# Monitor disk I/O
sudo iotop

# Network connections
sudo netstat -tulpn

Service Status

# Check all services
sudo systemctl status nginx
sudo systemctl status mongod
sudo systemctl status redis-server

# CloudPanel status
sudo systemctl status clp

Monitor Specific Ports

# Check if API is listening
sudo netstat -tulpn | grep :3000

# Check MongoDB
sudo netstat -tulpn | grep :27017

# Check Redis
sudo netstat -tulpn | grep :6379

Performance Monitoring

Application Metrics

# PM2 metrics
pm2 describe archive-api

# Memory usage per process
pm2 status

# CPU usage
pm2 monit

Database Performance

# Connect to MongoDB
mongosh -u archiveuser -p YOUR_ARCHIVE_DB_PASSWORD --authenticationDatabase archive

# Database statistics
use archive
db.stats()

# Collection statistics
db.contentitems.stats()

# Current operations
db.currentOp()

# Slow queries (> 100ms)
db.setProfilingLevel(1, { slowms: 100 })
db.system.profile.find().limit(5).sort({ ts: -1 }).pretty()

Alerts and Notifications

PM2 Process Alerts

PM2 itself does not send crash notifications; for hosted alerts, link the deployment to PM2 Plus (see PM2 Dashboard above). To keep a crashing process from restart-looping, set restart limits per app in ecosystem.config.js:
# In ecosystem.config.js, per app:
#   max_restarts: 10,
#   min_uptime: "10s"

# Apply the change
pm2 reload ecosystem.config.js
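A small cron script can also watch restart counts via pm2 jlist and flag a flapping process. This is a sketch, not a built-in PM2 feature: it assumes jq is installed, and the threshold is an example value.

```shell
#!/bin/bash
# Sketch: warn when any PM2 process has restarted more than MAX_RESTARTS
# times. Assumes jq is installed; the threshold is an example value.
MAX_RESTARTS=10

check_restarts() {
  # $1 = process name, $2 = observed restart count
  if [ "$2" -gt "$MAX_RESTARTS" ]; then
    echo "ALERT: $1 restarted $2 times"
    return 1
  fi
  return 0
}

# Walk the live process list when pm2 is available
if command -v pm2 >/dev/null; then
  pm2 jlist | jq -r '.[] | "\(.name) \(.pm2_env.restart_time)"' |
    while read -r name count; do
      check_restarts "$name" "$count" || logger -t pm2-alert "$name restarted $count times"
    done
fi
```

Scheduled hourly alongside the disk-space check below, this surfaces restart loops in syslog without any external service.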

Disk Space Monitoring

Create a disk space alert script:
/root/check-disk-space.sh
#!/bin/bash

THRESHOLD=80
USAGE=$(df / | awk 'NR==2 {print $5}' | tr -d '%')

if [ "$USAGE" -gt "$THRESHOLD" ]; then
    echo "WARNING: Disk usage is at ${USAGE}%" | logger -t disk-space
    # Add email notification or webhook here
fi
Schedule with cron:
# Run every hour
0 * * * * /root/check-disk-space.sh
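The notification placeholder in the script above can be filled with a webhook call. The sketch below keeps the curl commented out so the script is safe to dry-run; WEBHOOK_URL is a placeholder, not a real endpoint.

```shell
#!/bin/bash
# Sketch: post a disk-usage warning to a webhook. WEBHOOK_URL is a
# placeholder endpoint; the curl call is commented out by default.
WEBHOOK_URL="https://example.com/hooks/disk-alert"

notify_disk_usage() {
  # $1 = usage percent, $2 = threshold percent
  if [ "$1" -gt "$2" ]; then
    echo "WARNING: Disk usage is at ${1}%"
    # curl -s -X POST -H 'Content-Type: application/json' \
    #   -d "{\"text\":\"Disk usage at ${1}%\"}" "$WEBHOOK_URL"
    return 1
  fi
  return 0
}

# Check the root filesystem against an 80% threshold
notify_disk_usage "$(df / | awk 'NR==2 {print $5}' | tr -d '%')" 80 || true
```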

Maintenance Tasks

Weekly Maintenance

# Update system packages
sudo apt update && sudo apt upgrade -y

# Clean package cache
sudo apt autoremove -y
sudo apt autoclean

# Check disk usage
df -h

# Review logs for errors
pm2 logs --lines 100 --err

Monthly Maintenance

# Verify backups
ls -lh /home/archive-api/backups/mongodb/

# Test backup restoration (on staging)
# mongorestore --username=archiveuser ...

# Review MongoDB indexes
mongosh -u archiveuser -p PASSWORD --authenticationDatabase archive --eval "db.contentitems.getIndexes()"

# Optimize Redis memory
redis-cli MEMORY PURGE

Update Application

Deployments are typically automated via GitHub Actions (see deployment guide), but for manual updates:
# Pull latest changes
cd /home/archive-api/htdocs
git pull origin main

# Install dependencies
cd backend
bun install

# Reload PM2 processes
pm2 reload ecosystem.config.js

# Verify status
pm2 status
pm2 logs --lines 50

Troubleshooting Common Issues

High Memory Usage

# Check which process is using memory
pm2 monit

# Restart high-memory process
pm2 restart archive-screenshot-worker

# Reduce max_memory_restart in ecosystem.config.js
# Then reload:
pm2 reload ecosystem.config.js

Process Keeps Restarting

# Check error logs
pm2 logs archive-api --err --lines 100

# Check application errors
cat /home/archive-api/htdocs/logs/api-error.log

# Verify environment variables
pm2 env 0  # Replace 0 with process ID

# Check database connection
mongosh -u archiveuser -p PASSWORD --authenticationDatabase archive

Disk Space Full

# Check disk usage
df -h

# Find large files
sudo du -h --max-depth=1 /var/log | sort -hr
sudo du -h --max-depth=1 /home/archive-api | sort -hr

# Clean old logs
pm2 flush
sudo journalctl --vacuum-time=7d

# Remove old backups
find /home/archive-api/backups -name "*.tar.gz" -mtime +30 -delete

Health Check Endpoint

Monitor API health programmatically:
# Check API health
curl http://localhost:3000/api/health

# Expected response:
# {"status":"ok","timestamp":"2024-03-15T10:30:00.000Z","uptime":3600}
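A small cron-able probe can act on this endpoint, restarting the API when it stops answering. This is a sketch: the URL and process name follow this guide, and treating only HTTP 200 as healthy is an assumption about the health route.

```shell
#!/bin/bash
# Sketch: probe the health endpoint and restart the API via PM2 if it
# stops responding. URL and process name follow this guide.
HEALTH_URL="http://localhost:3000/api/health"

check_health() {
  # $1 = URL; succeeds only on HTTP 200 within 5 seconds
  local code
  code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 5 "$1")
  [ "$code" = "200" ]
}

if ! check_health "$HEALTH_URL"; then
  echo "API health check failed" | logger -t archive-health
  if command -v pm2 >/dev/null; then
    pm2 restart archive-api
  fi
fi
```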
Set up external uptime monitoring to poll this endpoint on a schedule, so you are alerted when the API goes down.

Next Steps

  • Configuration: review environment variables
  • Backend Deployment: review the deployment process
