Migration Guide

This guide covers strategies for migrating between versions of Zero, upgrading dependencies, and evolving your application’s schema and queries.

Version Migration Strategy

Semantic Versioning

Zero follows semantic versioning (semver):
  • Patch releases (0.0.x): Bug fixes, no breaking changes
  • Minor releases (0.x.0): New features, backward compatible
  • Major releases (x.0.0): Breaking changes, migration required
Current status: Zero is pre-1.0, so minor version bumps may include breaking changes. Always check release notes.

Upgrade Process

  1. Review release notes for breaking changes and new features
  2. Test in development environment first
  3. Update dependencies across all packages
  4. Run type checking to catch API changes
  5. Test queries and mutations thoroughly
  6. Deploy incrementally (canary or staged rollout)

Monorepo Coordination

Zero is developed in a monorepo with multiple packages. When upgrading:
# Update all Zero packages together
npm install @rocicorp/zero@latest \
             @rocicorp/zero-client@latest \
             @rocicorp/zero-cache@latest

# Verify versions match
npm list @rocicorp/zero*
Important: Client and server packages must be compatible versions. Don’t mix major versions.
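The version-compatibility check can be automated in a pre-deploy step. A minimal sketch; the helper and the package versions shown are illustrative, not part of Zero's tooling:

```typescript
// Sketch: verify that all installed Zero-related packages share one major
// version. With pre-1.0 packages you may want to compare minor versions
// too, since minor bumps can be breaking.
function majorsMatch(versions: Record<string, string>): boolean {
  const majors = new Set(
    Object.values(versions).map(v => v.replace(/^[^\d]*/, '').split('.')[0]),
  );
  return majors.size <= 1;
}

// Example input, shaped like dependency versions reported by `npm list`
// (the version numbers here are made up for illustration)
const ok = majorsMatch({
  '@rocicorp/zero': '0.19.2',
  '@rocicorp/zero-client': '0.19.1',
});
```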

Schema Migration

Adding Tables

Adding new tables is backward compatible:
// Before
const schema = {
  tables: {
    user: table('user').columns({...}),
    post: table('post').columns({...}),
  }
};

// After: Add new table
const schema = {
  tables: {
    user: table('user').columns({...}),
    post: table('post').columns({...}),
    comment: table('comment').columns({...}), // New table
  }
};
Steps:
  1. Add table to schema definition
  2. Create database migration (PostgreSQL)
  3. Deploy server first (server can handle old and new schemas)
  4. Deploy client update

Adding Columns

Adding optional columns is backward compatible:
// Before
const user = table('user').columns({
  id: string(),
  name: string(),
});

// After: Add optional column
const user = table('user').columns({
  id: string(),
  name: string(),
  bio: string().optional(), // New optional field
});
Database migration:
ALTER TABLE users ADD COLUMN bio TEXT;
Important: Always add new columns as optional initially. Making them required needs a two-phase migration.

Making Columns Required

Requires a two-phase migration.

Phase 1: Add optional column
const user = table('user').columns({
  id: string(),
  name: string(),
  email: string().optional(), // New optional
});
-- Database migration phase 1
ALTER TABLE users ADD COLUMN email TEXT;

-- Backfill existing rows
UPDATE users SET email = id || '@example.com' WHERE email IS NULL;
Phase 2: Make column required (after all clients updated)
const user = table('user').columns({
  id: string(),
  name: string(),
  email: string(), // Now required
});
-- Database migration phase 2
ALTER TABLE users ALTER COLUMN email SET NOT NULL;

Removing Columns

Requires a two-phase migration.

Phase 1: Make column optional
// Mark deprecated column as optional
const user = table('user').columns({
  id: string(),
  name: string(),
  oldField: string().optional(), // Was required, now optional
});
Deploy client and server updates.

Phase 2: Remove column (after all clients updated)
const user = table('user').columns({
  id: string(),
  name: string(),
  // oldField removed
});
ALTER TABLE users DROP COLUMN old_field;

Renaming Columns

Renaming is a breaking change. Use a two-phase migration.

Phase 1: Add new column, keep old
const user = table('user').columns({
  id: string(),
  // Old column (optional for backward compat)
  firstName: string().optional(),
  // New column (optional during transition)
  givenName: string().optional(),
});
-- Database migration phase 1
ALTER TABLE users ADD COLUMN given_name TEXT;
UPDATE users SET given_name = first_name;
Application code during transition:
// Read from both
const name = user.givenName ?? user.firstName;

// Write to both
z.mutate.updateUser({
  id,
  givenName: newName,
  firstName: newName, // Keep in sync during transition
});
Phase 2: Remove old column (after all clients updated)
const user = table('user').columns({
  id: string(),
  givenName: string(),
  // firstName removed
});
ALTER TABLE users DROP COLUMN first_name;

Changing Column Types

Type changes are complex and risky. Recommended approaches:

Option 1: Add new column (safest)
const post = table('post').columns({
  id: string(),
  createdAt: string(), // Old: ISO string
  createdAtMs: number().optional(), // New: Unix timestamp
});
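During the dual-column window, reads can prefer the new column and fall back to the old one. A sketch using the field names from the example above:

```typescript
// Prefer the new Unix-timestamp column; fall back to parsing the old
// ISO-string column for rows that haven't been backfilled yet.
function createdAtMillis(post: {createdAt: string; createdAtMs?: number}): number {
  return post.createdAtMs ?? new Date(post.createdAt).getTime();
}
```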
Option 2: Database-level conversion (PostgreSQL only)
-- Example: string to integer
ALTER TABLE posts ALTER COLUMN id TYPE INTEGER USING id::INTEGER;
Option 3: Create new table and migrate data
// Create new table with correct types
// Migrate data in background
// Switch over when complete
// Drop old table

Query Migration

Updating Query API Usage

When Zero’s query API changes:

Before (hypothetical old API):
z.query('users').filter({active: true}).limit(10);
After (current API):
z.query.users.where('active', '=', true).limit(10);
Migration strategy:
  1. Search codebase for old API usage
  2. Update incrementally, testing each change
  3. Use TypeScript to catch incompatibilities
Find all query usage:
# Search for old patterns
grep -r "z\.query(" src/

# Or use AST-based refactoring tools
npx jscodeshift -t transform.js src/
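For the simplest call sites, even a regex-based rewrite can do the bulk of the work before a hand review. A naive sketch; a real codemod should use an AST tool like jscodeshift, and this handles only the plain z.query('name') form:

```typescript
// Rewrite z.query('users') to z.query.users. Chained calls after the
// rewritten expression are left untouched; anything more complex
// (computed table names, aliased imports) needs manual review.
function rewriteQueryCalls(source: string): string {
  return source.replace(/z\.query\('(\w+)'\)/g, 'z.query.$1');
}
```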

Migrating Complex Queries

If query semantics change between versions:

Strategy 1: Side-by-side comparison
// Run both old and new queries in parallel
const oldResult = oldQuery();
const newResult = newQuery();

// Compare results
if (!deepEqual(oldResult, newResult)) {
  console.error('Query results differ!', {oldResult, newResult});
}

// Use new result
return newResult;
Strategy 2: Feature flag
const USE_NEW_QUERY = process.env.ENABLE_NEW_QUERY === 'true';

const {data} = USE_NEW_QUERY 
  ? z.query.users.newAPI()
  : z.query.users.oldAPI();
Strategy 3: Gradual rollout
// Use new query for subset of users
const useNewQuery = hashUserId(currentUser.id) % 100 < rolloutPercent;
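One way to implement hashUserId is a small stable string hash, so each user always lands in the same rollout bucket across sessions. This FNV-1a sketch is an illustration, not part of Zero:

```typescript
// FNV-1a over the user id: deterministic, so a user's rollout bucket is
// stable across sessions and devices.
function hashUserId(id: string): number {
  let h = 0x811c9dc5;
  for (let i = 0; i < id.length; i++) {
    h ^= id.charCodeAt(i);
    h = Math.imul(h, 0x01000193);
  }
  return h >>> 0; // force unsigned so the modulo below is non-negative
}

function inRollout(userId: string, rolloutPercent: number): boolean {
  return hashUserId(userId) % 100 < rolloutPercent;
}
```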

Mutation Migration

Updating Mutation Signatures

When mutation APIs change:

Before:
z.mutate.updateUser(userId, {name: 'Alice'});
After:
z.mutate.updateUser({id: userId, name: 'Alice'});
Migration:
  1. Update all call sites
  2. Use TypeScript to find incompatible calls
  3. Test thoroughly (mutations modify data!)

Custom Mutation Evolution

When evolving custom mutations:

Backward-compatible approach:
// Old signature (keep for compatibility)
z.defineMutation('updateUser', async (tx, userId: string, data: UserUpdate) => {
  await tx.update('user', {id: userId, ...data});
});

// New signature (preferred)
z.defineMutation('updateUserV2', async (tx, params: {id: string} & UserUpdate) => {
  await tx.update('user', params);
});

// Deprecation period: Support both
// Later: Remove old mutation

Data Migrations

For complex data transformations:
// Define migration mutation
z.defineMutation('migrateUserData', async (tx) => {
  const users = await tx.scan('user');
  
  for (const user of users) {
    // Transform data
    const updated = {
      ...user,
      newField: computeValue(user),
    };
    
    await tx.update('user', updated);
  }
});

// Run migration (server-side)
await z.mutate.migrateUserData();
Best practices:
  • Run migrations in off-peak hours
  • Process in batches to avoid timeouts
  • Add progress logging
  • Make migrations idempotent (safe to re-run)
  • Test on copy of production data first
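The batching and progress-logging practices above can be factored into a small helper. A sketch: migrateInBatches and its processBatch callback are hypothetical; in Zero, the callback would wrap a transaction that updates one batch:

```typescript
// Process rows in fixed-size batches so a long migration can log progress
// and avoid transaction timeouts. Re-running it repeats the same updates,
// so the per-batch work should itself be idempotent.
async function migrateInBatches<T>(
  rows: readonly T[],
  batchSize: number,
  processBatch: (batch: readonly T[]) => Promise<void>,
): Promise<number> {
  let processed = 0;
  for (let i = 0; i < rows.length; i += batchSize) {
    const batch = rows.slice(i, i + batchSize);
    await processBatch(batch);
    processed += batch.length;
    console.log(`migrated ${processed}/${rows.length} rows`);
  }
  return processed;
}
```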

Dependency Updates

Updating Replicache

Zero uses Replicache internally. When Replicache updates:
# Zero manages Replicache version
# Update Zero, which will pull correct Replicache
npm install @rocicorp/zero@latest
Breaking changes in Replicache:
  • Check Zero release notes for compatibility
  • Test sync behavior thoroughly
  • Verify IndexedDB schema migrations

Updating Database Driver

When updating PostgreSQL driver:
// package.json
{
  "dependencies": {
    "pg": "^8.11.0" // Update version
  }
}
Test:
  • Connection pooling behavior
  • Query performance
  • Transaction handling
  • Error handling

Updating TypeScript

Zero uses TypeScript ~5.9.3:
# Update TypeScript
npm install typescript@~5.9.3

# Run type checking
npm run check-types
Common issues:
  • Stricter type checking may reveal hidden bugs
  • Update type definitions (@types/*)
  • Fix new errors before deploying

Development Environment Migration

Updating Local Database

When schema changes require database updates:
# In apps/zbugs or your app
cd apps/your-app

# Apply migrations
npm run db-migrate

# If needed, reseed data
npm run db-seed

# Restart Zero cache
npm run zero-cache-dev

Clearing Client State

Sometimes breaking changes require clearing client state:
// Clear IndexedDB (browser)
await z.clear();

// Or manually in DevTools:
// Application → Storage → IndexedDB → Delete database
When to clear:
  • Schema structure changes significantly
  • Data format changes
  • Persistent errors after update
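One way to trigger the clear automatically is to persist a schema version alongside the client state and compare it on startup. A sketch; the version key and the idea of calling z.clear() on mismatch are assumptions, not Zero APIs:

```typescript
// Clear local state only when the stored schema version differs from the
// one this build expects. A null stored version means a fresh install,
// which has nothing to clear.
function shouldClearState(stored: string | null, current: string): boolean {
  return stored !== null && stored !== current;
}

// Hypothetical startup flow (browser):
// const stored = localStorage.getItem('schemaVersion');
// if (shouldClearState(stored, SCHEMA_VERSION)) await z.clear();
// localStorage.setItem('schemaVersion', SCHEMA_VERSION);
```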

Production Migration

Zero-Downtime Deployment

Strategy 1: Backward-compatible changes
  1. Deploy server with new schema (supports old and new)
  2. Deploy client update
  3. Remove old code in next release
Strategy 2: Database migration
  1. Run database migration (add columns, indexes)
  2. Deploy server update
  3. Deploy client update
  4. Clean up deprecated code
Strategy 3: Blue-green deployment
  1. Set up parallel environment with new version
  2. Route subset of traffic to new environment
  3. Monitor for issues
  4. Gradually shift traffic
  5. Shut down old environment

Rollback Strategy

Always have a rollback plan.

Application rollback:
# Revert to previous version
git revert <commit>
npm install
npm run build
npm run deploy
Database rollback:
-- Reverse migration
-- Keep rollback scripts for every migration
ALTER TABLE users DROP COLUMN new_field;
Data rollback:
  • Restore from backup (last resort)
  • Use point-in-time recovery if available
  • Test restore process regularly

Monitoring Migration

Track metrics during migration:
// Custom metrics
metrics.increment('migration.success');
metrics.increment('migration.failure');
metrics.timing('query.materialization', duration);

// Error tracking
try {
  await runMigration();
} catch (error) {
  errorTracker.captureException(error, {
    tags: {migration: 'v2_to_v3'}
  });
  throw error;
}
Monitor:
  • Error rates
  • Query performance
  • Sync latency
  • Memory usage
  • User-reported issues

Testing Migrations

Test Checklist

  • Schema changes applied to test database
  • All queries return expected results
  • All mutations work correctly
  • No TypeScript errors
  • No runtime errors in console
  • Performance metrics acceptable
  • Sync works correctly (online/offline)
  • Data integrity maintained
  • Rollback tested successfully

Automated Tests

import {describe, test, expect} from 'vitest';

describe('Schema migration', () => {
  test('new column is optional', () => {
    const user = {id: '1', name: 'Alice'};
    expect(() => z.mutate.createUser(user)).not.toThrow();
  });
  
  test('old queries still work', () => {
    const {data} = z.query.users.where('active', '=', true).run();
    expect(data).toBeDefined();
  });
  
  test('new queries work', () => {
    const {data} = z.query.users.where('bio', 'LIKE', '%test%').run();
    expect(data).toBeDefined();
  });
});

Manual Testing

  1. Test in isolation: Fresh database, apply migrations, verify schema
  2. Test with production data: Copy production DB, test migration
  3. Test rollback: Apply migration, then rollback, verify state
  4. Test upgrade path: Migrate from each previous version

Common Migration Patterns

Adding Indexes

Safe to do anytime:
-- Create index concurrently (PostgreSQL; cannot run inside a transaction block)
CREATE INDEX CONCURRENTLY idx_posts_user_id ON posts(user_id);

-- Then refresh planner statistics
ANALYZE posts;
Impact:
  • Improves query performance
  • No application code changes needed
  • Zero downtime

Splitting Tables

Complex migration requiring multiple steps:
  1. Create new tables
  2. Dual-write to old and new tables
  3. Backfill historical data
  4. Switch reads to new tables
  5. Remove dual-write logic
  6. Drop old tables
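Step 2's dual-write can be isolated behind a small wrapper so it is easy to remove in step 5. A sketch with hypothetical writer callbacks; the old table stays the source of truth during the transition, so a failed write to the new table is logged rather than surfaced:

```typescript
// Write every row to both tables during the transition. writeOld and
// writeNew stand in for the real mutations against the old and new tables.
async function dualWrite<T>(
  row: T,
  writeOld: (row: T) => Promise<void>,
  writeNew: (row: T) => Promise<void>,
): Promise<void> {
  await writeOld(row); // old table is still authoritative
  try {
    await writeNew(row);
  } catch (err) {
    // Don't fail the user-facing write; backfill will repair the new table.
    console.error('dual-write to new table failed', err);
  }
}
```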

Merging Tables

Similar to splitting, in reverse:
  1. Create merged table
  2. Dual-write to old tables and new table
  3. Backfill historical data
  4. Switch reads to new table
  5. Remove dual-write logic
  6. Drop old tables

Version-Specific Guides

Future Migrations

As Zero evolves, version-specific migration guides will be added here. Check release notes for:
  • Breaking changes
  • Deprecated APIs
  • New features
  • Performance improvements
  • Security updates

Best Practices

  1. Always test migrations in development first
  2. Use semantic versioning to communicate changes
  3. Keep backward compatibility when possible
  4. Document breaking changes clearly
  5. Provide migration scripts for complex changes
  6. Monitor production during and after migration
  7. Have rollback plan ready
  8. Communicate with team about migration timeline
  9. Update dependencies together (don’t mix versions)
  10. Run ANALYZE after schema changes
