CockroachDB provides enterprise-grade backup and restore capabilities to protect your data. Learn how to create full and incremental backups, schedule automated backups, and restore data when needed.

Backup Types

Full Backup

Complete snapshot of all data at a specific point in time

Incremental Backup

Only changes since the last backup, reducing time and storage

Revision History

Point-in-time recovery with historical data versions

Scheduled Backups

Automated backups running on a regular schedule

Creating Backups

Full Backups

Create a complete backup of your entire database:
Full Cluster Backup
BACKUP INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Specific Database Backup
BACKUP DATABASE mydb 
INTO 's3://my-bucket/backups/mydb?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Table-Level Backup
BACKUP TABLE mydb.users, mydb.orders 
INTO 's3://my-bucket/backups/tables?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Each BACKUP INTO statement creates a new full backup in a date-based subdirectory of the specified collection. Use BACKUP INTO LATEST to append an incremental backup to the most recent full backup.

Incremental Backups

Create incremental backups to capture only changes since the last backup:
Incremental Backup
BACKUP INTO LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
1. Create Initial Full Backup

BACKUP INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
2. Create First Incremental

BACKUP INTO LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
3. Continue Incrementals

Run the same BACKUP INTO LATEST command periodically to create a chain of incremental backups.

Backup Destinations

CockroachDB supports multiple storage backends for backups:
BACKUP DATABASE mydb 
INTO 's3://bucket-name/path?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx&AWS_REGION=us-east-1';
nodelocal:// is only suitable for development. Use cloud storage for production backups to ensure durability and accessibility.
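Other storage backends follow the same URL pattern. For example (bucket names and the node ID are placeholders, and GCS authentication options depend on your deployment):
Google Cloud Storage Backup
BACKUP DATABASE mydb 
INTO 'gs://my-bucket/backups?AUTH=implicit';
Local Node Storage (development only)
BACKUP DATABASE mydb 
INTO 'nodelocal://1/backups';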

Scheduled Backups

Automate backups with schedules:
Create Backup Schedule
CREATE SCHEDULE daily_backup
FOR BACKUP DATABASE mydb 
INTO 's3://my-bucket/scheduled-backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
RECURRING '@daily'
FULL BACKUP '@weekly'
WITH SCHEDULE OPTIONS first_run = 'now';

Schedule Options

  • @hourly: Every hour
  • @daily: Every day at midnight UTC
  • @weekly: Every Sunday at midnight UTC
  • '*/15 * * * *': Every 15 minutes (cron format)
  • '0 2 * * *': Every day at 2 AM UTC
  • '0 0 * * 0': Every Sunday at midnight UTC

Managing Schedules

View All Schedules
SHOW SCHEDULES;
Pause a Schedule
PAUSE SCHEDULE 123456;
Resume a Schedule
RESUME SCHEDULE 123456;
Alter a Schedule
ALTER BACKUP SCHEDULE 123456 SET RECURRING '@hourly';
Drop a Schedule
DROP SCHEDULE 123456;
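Each management command above takes the schedule's numeric ID. If you only know the label, one way to look it up (the label matches the earlier example) is:
Find a Schedule ID by Label
SELECT id, label, schedule_status 
FROM [SHOW SCHEDULES] 
WHERE label = 'daily_backup';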

Backup Options

Enhance your backups with additional options:
Backup with Revision History
BACKUP DATABASE mydb 
INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH revision_history;
Revision history enables point-in-time restore (PITR) capabilities, allowing you to restore to any moment within the backup’s coverage.
Encrypted Backup
BACKUP DATABASE mydb 
INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH encryption_passphrase = 'my-secret-passphrase';
Detached Backup
BACKUP DATABASE mydb 
INTO 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH detached;
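A detached backup returns a job ID immediately instead of blocking the SQL session until the backup finishes. You can poll that job later (the ID below is a placeholder):
Check a Detached Backup Job
SHOW JOB 123456789;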

revision_history

Captures historical versions of data for point-in-time recovery

encryption_passphrase

Encrypts backup data for security compliance

detached

Returns immediately without waiting for backup completion

execution_locality

Runs backup on specific nodes based on locality

Restoring from Backups

Full Restores

Restore Entire Cluster
RESTORE FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Restore Specific Database
RESTORE DATABASE mydb 
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Restore Specific Tables
RESTORE TABLE mydb.users, mydb.orders 
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
A restore fails if the target database or tables already exist. Drop or rename them first, or use restore options such as into_db to restore under a different name.

Point-in-Time Restore

Restore to a specific timestamp using revision history:
PITR Restore
RESTORE DATABASE mydb 
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
AS OF SYSTEM TIME '2024-01-15 14:30:00';
1. Identify Target Time

Determine the exact timestamp to restore to, either from application logs or monitoring data.
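If the data still exists in the cluster and the timestamp falls within the garbage-collection window, you can preview the historical state directly before restoring (the table name is illustrative):
SELECT * FROM mydb.users
AS OF SYSTEM TIME '2024-01-15 14:30:00'
LIMIT 10;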
2. Verify Backup Coverage

Ensure your backup with revision history covers the desired timestamp:
SHOW BACKUP FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
3. Execute PITR Restore

RESTORE DATABASE mydb 
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
AS OF SYSTEM TIME '2024-01-15 14:30:00';

Restore Options

Restore to Different Database
RESTORE DATABASE mydb 
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH into_db = 'mydb_restored';
Restore with Decryption
RESTORE DATABASE mydb 
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH encryption_passphrase = 'my-secret-passphrase';
Skip Missing Foreign Keys
RESTORE TABLE mydb.orders 
FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH skip_missing_foreign_keys;

Viewing Backup Details

List Backups in Location
SHOW BACKUPS IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Show Backup Contents
SHOW BACKUP FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';
Check Backup Validity
SHOW BACKUP VALIDATE FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';

Monitoring Backup Jobs

View Running Jobs
SELECT * FROM [SHOW JOBS]
WHERE job_type IN ('BACKUP', 'RESTORE') 
  AND status = 'running';
View Job Details
SELECT 
  job_id,
  job_type,
  description,
  status,
  fraction_completed,
  created,
  started,
  finished
FROM crdb_internal.jobs
WHERE job_type IN ('BACKUP', 'RESTORE')
ORDER BY created DESC
LIMIT 10;
Cancel a Backup Job
CANCEL JOB 123456789;

Backup Best Practices

1. Use Incremental Backups

Implement a strategy with full weekly backups and daily/hourly incrementals to optimize storage and backup windows.
2. Enable Revision History

Use revision_history for critical databases to enable point-in-time recovery.
3. Encrypt Sensitive Data

Always encrypt backups containing sensitive information using encryption_passphrase.
4. Test Restores Regularly

Periodically restore backups to a test environment to verify data integrity and restore procedures.
5. Distribute Backup Storage

Store backups in a different region or cloud provider than your primary cluster for disaster recovery.
6. Automate with Schedules

Use scheduled backups instead of manual processes to ensure consistency and reduce human error.
7. Monitor Backup Jobs

Set up alerts for failed backup jobs and monitor backup durations for anomalies.

Backup Strategy Example
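
A reasonable starting point that combines the practices above is a weekly full backup with daily incrementals and revision history (the label and paths are placeholders):
Weekly Full, Daily Incremental Schedule
CREATE SCHEDULE prod_backup
FOR BACKUP DATABASE mydb 
INTO 's3://my-bucket/scheduled-backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx'
WITH revision_history
RECURRING '@daily'
FULL BACKUP '@weekly'
WITH SCHEDULE OPTIONS first_run = 'now';
This produces one full backup every Sunday and an incremental on the other days, while revision_history preserves point-in-time restore coverage within each chain.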

Troubleshooting

Backup Failures

Permission Errors

Verify cloud storage credentials and IAM permissions

Timeout Issues

Increase backup job timeout or split into smaller backups

Storage Quota

Check available storage space in backup destination

Network Connectivity

Ensure nodes can reach backup storage endpoints

Restore Failures

Check Backup Integrity
SHOW BACKUP VALIDATE FROM LATEST IN 's3://my-bucket/backups?AWS_ACCESS_KEY_ID=xxx&AWS_SECRET_ACCESS_KEY=xxx';

Next Steps

Security

Secure your backups and cluster

Upgrade

Upgrade CockroachDB versions

Scaling

Scale your cluster capacity
