CockroachDB provides enterprise-grade backup and restore capabilities to protect your data. Learn how to create full and incremental backups, schedule automated backups, and restore data when needed.
Backup Types
Full Backup: Complete snapshot of all data at a specific point in time
Incremental Backup: Only changes since the last backup, reducing time and storage
Revision History: Point-in-time recovery with historical data versions
Scheduled Backups: Automated backups running on a regular schedule
Creating Backups
Full Cluster, Database, and Table Backups
Back up the entire cluster:
BACKUP INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Back up a single database:
BACKUP DATABASE mydb
INTO 's3://my-bucket/backups/mydb?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Back up specific tables:
BACKUP TABLE mydb.users, mydb.orders
INTO 's3://my-bucket/backups/tables?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Full backups create a new backup in the specified location. Use BACKUP INTO LATEST for incremental backups.
Incremental Backups
Create incremental backups to capture only changes since the last backup:
BACKUP INTO LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
1. Create the initial full backup:
BACKUP INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
2. Create the first incremental:
BACKUP INTO LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
3. Continue incrementals: run the same BACKUP INTO LATEST command periodically to create a chain of incremental backups.
Backup Destinations
CockroachDB supports multiple storage backends for backups:
Amazon S3
Google Cloud Storage
Azure Blob Storage
Local/Network Storage
HTTP(S) Storage
BACKUP DATABASE mydb
INTO 's3://bucket-name/path?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx&AWS_REGION=us-east-1';
nodelocal:// is only suitable for development. Use cloud storage for production backups to ensure durability and accessibility.
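For local development, a nodelocal destination can stand in for cloud storage. A minimal sketch, assuming a node with ID 1 (files land under that node's extern directory):

```sql
-- Development only: store the backup under node 1's extern directory.
BACKUP DATABASE mydb INTO 'nodelocal://1/backups';

-- List the backups written to that location.
SHOW BACKUPS IN 'nodelocal://1/backups';
```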
Scheduled Backups
Automate backups with schedules:
CREATE SCHEDULE daily_backup
FOR BACKUP DATABASE mydb
INTO 's3://my-bucket/scheduled-backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
RECURRING '@daily'
FULL BACKUP '@weekly'
WITH SCHEDULE OPTIONS first_run = 'now';
Schedule Options
Common Schedule Expressions
@hourly: Every hour
@daily: Every day at midnight UTC
@weekly: Every Sunday at midnight UTC
'*/15 * * * *': Every 15 minutes (cron format)
'0 2 * * *': Every day at 2 AM UTC
'0 0 * * 0': Every Sunday at midnight UTC
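To show the cron format in context, here is a sketch of a schedule that takes incremental backups every 15 minutes with a nightly full backup; the label and bucket path are illustrative:

```sql
-- Incremental backups every 15 minutes, full backup nightly at 2 AM UTC.
CREATE SCHEDULE frequent_backup
FOR BACKUP DATABASE mydb
INTO 's3://my-bucket/frequent-backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
RECURRING '*/15 * * * *'
FULL BACKUP '0 2 * * *'
WITH SCHEDULE OPTIONS first_run = 'now';
```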
Managing Schedules
ALTER BACKUP SCHEDULE 123456 SET RECURRING '@hourly';
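Beyond changing the recurrence, schedules can be listed, paused, resumed, and removed. A sketch using an illustrative schedule ID (find real IDs with SHOW SCHEDULES):

```sql
-- List all schedules and their current states.
SHOW SCHEDULES;

-- Pause and later resume a schedule by ID.
PAUSE SCHEDULE 123456;
RESUME SCHEDULE 123456;

-- Remove a schedule that is no longer needed.
DROP SCHEDULE 123456;
```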
Backup Options
Enhance your backups with additional options:
Backup with Revision History
BACKUP DATABASE mydb
INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH revision_history;
Revision history enables point-in-time restore (PITR) capabilities, allowing you to restore to any moment within the backup’s coverage.
Encrypted Backup
BACKUP DATABASE mydb
INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH encryption_passphrase = 'my-secret-passphrase';
Detached Backup
BACKUP DATABASE mydb
INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH detached;
revision_history: Captures historical versions of data for point-in-time recovery
encryption_passphrase: Encrypts backup data for security compliance
detached: Returns immediately without waiting for backup completion
execution_locality: Runs the backup on specific nodes based on locality
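Because a detached backup returns a job ID instead of blocking, you can poll that job afterwards. A sketch, where 876543 stands in for the ID the statement actually returns:

```sql
-- Start the backup without waiting; the statement returns a job ID.
BACKUP DATABASE mydb
INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH detached;

-- Check on the job later using the returned ID (876543 is illustrative).
SHOW JOB 876543;
```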
Restoring from Backups
Full Database Restore
RESTORE FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Restore Specific Database
RESTORE DATABASE mydb
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Restore Specific Tables
RESTORE TABLE mydb.users, mydb.orders
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Restore operations replace existing data. The target database or tables must not exist unless using specific restore options.
Point-in-Time Restore
Restore to a specific timestamp using revision history:
RESTORE DATABASE mydb
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
AS OF SYSTEM TIME '2024-01-15 14:30:00';
1. Identify the target time: determine the exact timestamp to restore to, either from application logs or monitoring data.
2. Verify backup coverage: ensure your backup with revision history covers the desired timestamp:
SHOW BACKUP FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
3. Execute the PITR restore:
RESTORE DATABASE mydb
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
AS OF SYSTEM TIME '2024-01-15 14:30:00';
Restore Options
Restore to Different Database
RESTORE DATABASE mydb
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH into_db = 'mydb_restored';
Restore Encrypted Backup
RESTORE DATABASE mydb
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH encryption_passphrase = 'my-secret-passphrase';
Skip Missing Foreign Keys
RESTORE TABLE mydb.orders
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH skip_missing_foreign_keys;
Viewing Backup Details
List the backups in a collection:
SHOW BACKUPS IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Inspect the contents of the latest backup:
SHOW BACKUP FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Validate the latest backup:
SHOW BACKUP VALIDATE FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Monitoring Backup Jobs
SELECT * FROM [SHOW JOBS]
WHERE job_type IN ('BACKUP', 'RESTORE')
AND status = 'running';
SELECT
job_id,
job_type,
description,
status,
fraction_completed,
created,
started,
finished
FROM crdb_internal.jobs
WHERE job_type IN ('BACKUP', 'RESTORE')
ORDER BY created DESC
LIMIT 10;
Backup Best Practices
Use Incremental Backups
Implement a strategy with full weekly backups and daily/hourly incrementals to optimize storage and backup windows.
Enable Revision History
Use revision_history for critical databases to enable point-in-time recovery.
Encrypt Sensitive Data
Always encrypt backups containing sensitive information using encryption_passphrase.
Test Restores Regularly
Periodically restore backups to a test environment to verify data integrity and restore procedures.
Distribute Backup Storage
Store backups in a different region or cloud provider than your primary cluster for disaster recovery.
Automate with Schedules
Use scheduled backups instead of manual processes to ensure consistency and reduce human error.
Monitor Backup Jobs
Set up alerts for failed backup jobs and monitor backup durations for anomalies.
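As a starting point for alerting, here is a sketch of a query over crdb_internal.jobs that surfaces recently failed backup and restore jobs; the 24-hour window is an assumption to tune for your environment:

```sql
-- Backup/restore jobs that failed in the last 24 hours, newest first.
SELECT job_id, job_type, description, error, finished
FROM crdb_internal.jobs
WHERE job_type IN ('BACKUP', 'RESTORE')
  AND status = 'failed'
  AND finished > now() - INTERVAL '24 hours'
ORDER BY finished DESC;
```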
Backup Strategy Example
Recommended Enterprise Backup Strategy
-- Create a schedule for daily incremental backups
CREATE SCHEDULE production_incremental
FOR BACKUP DATABASE production
INTO 's3://prod-backups/daily?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH revision_history
RECURRING '@daily'
FULL BACKUP '@weekly'
WITH SCHEDULE OPTIONS first_run = 'now';
-- Create a separate schedule for weekly full backups to long-term storage
CREATE SCHEDULE production_weekly_archive
FOR BACKUP DATABASE production
INTO 's3://prod-backups-archive/weekly?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH revision_history, encryption_passphrase = 'archive-encryption-key'
RECURRING '@weekly'
FULL BACKUP ALWAYS
WITH SCHEDULE OPTIONS first_run = 'now';
This strategy provides:
Daily incremental backups for quick recovery
Weekly full backups for long-term retention
Revision history for point-in-time recovery
Encrypted archive backups for compliance
Troubleshooting
Backup Failures
Permission Errors: Verify cloud storage credentials and IAM permissions
Timeout Issues: Increase the backup job timeout or split into smaller backups
Storage Quota: Check available storage space in the backup destination
Network Connectivity: Ensure nodes can reach the backup storage endpoints
Restore Failures
If a restore fails or you suspect a damaged backup, validate the backup before retrying:
SHOW BACKUP VALIDATE FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Next Steps
Security: Secure your backups and cluster
Upgrade: Upgrade CockroachDB versions
Scaling: Scale your cluster capacity