RestAI uses PostgreSQL 17 with Drizzle ORM for type-safe database operations and migrations. This guide covers database setup, running migrations, and maintenance tasks.

Quick Start

The simplest way to set up the database is using Docker Compose, which handles everything automatically:
Step 1: Start PostgreSQL service

docker-compose up -d postgres
This starts PostgreSQL 17 Alpine with health checks and persistent storage.
Step 2: Run migrations

docker-compose up migrate
The migrate service runs once, applies all migrations, and exits.
Step 3: Verify database

docker-compose exec postgres psql -U restai -d restai -c "\dt"
This lists all tables created by the migrations.

PostgreSQL Configuration

Docker Compose Setup

The postgres service is configured in docker-compose.yml:2-15:
postgres:
  image: postgres:17-alpine
  restart: unless-stopped
  environment:
    POSTGRES_USER: ${POSTGRES_USER:-postgres}
    POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-fenrinegro}
    POSTGRES_DB: restai
  volumes:
    - pgdata:/var/lib/postgresql/data
  healthcheck:
    test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-postgres}"]
    interval: 5s
    timeout: 5s
    retries: 5
Key features:
  • PostgreSQL 17 Alpine: Lightweight image (~80MB vs 300MB+ for full Debian)
  • Persistent storage: pgdata volume preserves data across container restarts
  • Health checks: pg_isready command runs every 5 seconds to verify database availability
  • Auto-restart: Container restarts automatically unless manually stopped

Environment Variables

Configure these in your .env file:
POSTGRES_USER=restai              # Database superuser
POSTGRES_PASSWORD=change-me       # Superuser password (REQUIRED for production)
DATABASE_URL=postgresql://restai:change-me@postgres:5432/restai
The default POSTGRES_PASSWORD value fenrinegro is for development only. Change this immediately in production to prevent unauthorized database access.
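One detail worth noting when choosing a production password: any special characters (`@`, `:`, `/`) must be percent-encoded in DATABASE_URL, or the connection string will not parse. A small sketch with a hypothetical helper (not part of the RestAI codebase):

```typescript
// Hypothetical helper: build a DATABASE_URL with user and password
// percent-encoded, so characters like "@" or ":" don't break URL parsing.
function buildDatabaseUrl(
  user: string,
  password: string,
  host: string,
  db: string,
  port = 5432,
): string {
  return `postgresql://${encodeURIComponent(user)}:${encodeURIComponent(password)}@${host}:${port}/${db}`;
}

// Example: "@" becomes %40 and ":" becomes %3A
console.log(buildDatabaseUrl("restai", "p@ss:word", "localhost", "restai"));
// → postgresql://restai:p%40ss%3Aword@localhost:5432/restai
```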

Database Migrations

RestAI uses Drizzle ORM for schema management and migrations. Migrations are version-controlled SQL files that modify the database schema.

Migration Architecture

Components:
  • Schema definition: TypeScript schema files in packages/db/src/schema/
  • Migration generator: drizzle-kit CLI generates SQL from schema changes
  • Migration runner: packages/db/src/migrate.ts applies migrations to database
  • Migration files: SQL files in packages/db/drizzle/

Drizzle Configuration

The Drizzle config is defined in packages/db/drizzle.config.ts:
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  out: "./drizzle",                    // Migration output directory
  schema: "./src/schema/index.ts",    // TypeScript schema location
  dialect: "postgresql",              // Database dialect
  dbCredentials: {
    url: process.env.DATABASE_URL!,   // Connection string from env
  },
});
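drizzle-kit reads this config automatically when invoked from packages/db. As a convenience, the generate and migrate commands are often wrapped as package scripts; a hypothetical sketch for packages/db/package.json (script names are illustrative, not confirmed by the source):

```json
{
  "scripts": {
    "db:generate": "drizzle-kit generate",
    "db:migrate": "bun run src/migrate.ts"
  }
}
```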

Current Migrations

As of this documentation, there are 2 migration files:
packages/db/drizzle/
├── 0000_bent_sentinels.sql       # Initial schema (29,465 bytes)
├── 0001_loving_namor.sql         # Schema updates (7,531 bytes)
└── meta/                         # Migration metadata

Running Migrations

Automatic (Docker Compose)

The migrate service runs automatically when you start the stack:
migrate:
  build:
    context: .
    dockerfile: Dockerfile.api
    target: builder
  working_dir: /app/packages/db
  command: ["bun", "run", "src/migrate.ts"]
  environment:
    DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/restai
  depends_on:
    postgres:
      condition: service_healthy
  restart: "no"  # Runs once and exits
Behavior:
  1. Waits for postgres to be healthy
  2. Runs src/migrate.ts with Bun
  3. Exits with code 0 on success, code 1 on failure
  4. API service waits for service_completed_successfully before starting
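Step 4 above corresponds to a depends_on condition on the api service. A sketch of how that ordering is typically expressed in Compose (the project's actual docker-compose.yml may differ):

```yaml
api:
  depends_on:
    migrate:
      condition: service_completed_successfully  # wait for migrations to finish
    postgres:
      condition: service_healthy
```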

Manual (Local Development)

Run migrations manually outside Docker:
Step 1: Set DATABASE_URL

export DATABASE_URL="postgresql://restai:dev-password@localhost:5432/restai"
Step 2: Run migration script

cd packages/db
bun run src/migrate.ts
Expected output:
Running migrations...
Migrations completed successfully

Manual (Docker)

Force re-run migrations in Docker:
# Run migrate service manually
docker-compose run --rm migrate

# Or exec into running API container
docker-compose exec api bun run /app/packages/db/src/migrate.ts

Migration Script

The migration runner is defined in packages/db/src/migrate.ts:1-25:
import { drizzle } from "drizzle-orm/postgres-js";
import { migrate } from "drizzle-orm/postgres-js/migrator";
import postgres from "postgres";

const connectionString = process.env.DATABASE_URL;
if (!connectionString) {
  console.error("DATABASE_URL is required");
  process.exit(1);
}

const client = postgres(connectionString, { max: 1 });
const db = drizzle(client);

console.log("Running migrations...");

try {
  await migrate(db, { migrationsFolder: "./drizzle" });
  console.log("Migrations completed successfully");
} catch (error) {
  console.error("Migration failed:", error);
  process.exit(1);
} finally {
  await client.end();
}
Key features:
  • Environment validation: Exits immediately if DATABASE_URL is missing
  • Single connection: Uses max: 1 connection for migration safety
  • Error handling: Logs errors and exits with code 1 on failure
  • Cleanup: Closes database connection in finally block

Creating New Migrations

When you modify the database schema:
Step 1: Update schema files

Edit TypeScript schema definitions in packages/db/src/schema/:
// Example: Add a new column to the tables table
import { pgTable, uuid, varchar, text, integer } from "drizzle-orm/pg-core";

export const tables = pgTable("tables", {
  id: uuid("id").primaryKey().defaultRandom(),
  code: varchar("code", { length: 50 }).notNull(),
  qr_code_url: text("qr_code_url"),
  capacity: integer("capacity"),
  // New column
  notes: text("notes"),
});
Step 2: Generate migration

cd packages/db
bun run drizzle-kit generate
This creates a new SQL file in drizzle/ directory:
drizzle/0002_new_migration_name.sql
Step 3: Review generated SQL

Open the new migration file and verify the SQL is correct:
ALTER TABLE "tables" ADD COLUMN "notes" text;
Step 4: Apply migration

# Local
bun run src/migrate.ts

# Docker
docker-compose run --rm migrate
Drizzle migrations are incremental. The migration runner tracks which migrations have been applied and only runs new ones. It’s safe to run migrate.ts multiple times.
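To see which migrations have been recorded, you can query Drizzle's journal table; with the postgres-js migrator this lives in drizzle.__drizzle_migrations by default (assuming no custom migrationsSchema/migrationsTable was configured):

```shell
# List applied migrations recorded by Drizzle (default schema/table names)
docker-compose exec postgres psql -U restai restai \
  -c 'SELECT id, hash, created_at FROM drizzle.__drizzle_migrations ORDER BY created_at;'
```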

Database Schema Overview

The RestAI schema includes tables for multi-tenant restaurant management.

Core entities:
  • organizations - Restaurant groups/chains
  • branches - Individual restaurant locations
  • users - Staff users (admin, manager, waiter, chef)
  • customers - Customer accounts for loyalty program
Menu management:
  • categories - Menu categories (Appetizers, Mains, Desserts)
  • items - Menu items with prices and descriptions
  • modifiers - Item customizations (size, extras, etc.)
Order processing:
  • orders - Customer orders
  • order_items - Items in each order
  • table_sessions - Active customer sessions at tables
Loyalty program:
  • loyalty_rewards - Available rewards
  • customer_rewards - Redeemed rewards by customers
Reference tables:
  • tables - Physical tables with QR codes
  • order_statuses - Order status tracking
For the complete schema definition, see the generated migration files in packages/db/drizzle/.

Database Maintenance

Backups

Manual Backup

Create a SQL dump of the database:
# Full database backup
docker-compose exec postgres pg_dump -U restai restai > backup-$(date +%Y%m%d-%H%M%S).sql

# Compressed backup
docker-compose exec postgres pg_dump -U restai restai | gzip > backup-$(date +%Y%m%d-%H%M%S).sql.gz

# Schema-only backup (no data)
docker-compose exec postgres pg_dump -U restai --schema-only restai > schema-backup.sql

Automated Backups

Set up a cron job for daily backups:
# Edit crontab
crontab -e

# Add daily backup at 2 AM
0 2 * * * cd /path/to/restai && docker-compose exec -T postgres pg_dump -U restai restai | gzip > /backups/restai-$(date +\%Y\%m\%d).sql.gz
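To enforce the retention suggested under Best Practices (at least 7 daily backups), a prune step can run after the dump. A sketch, assuming the same /backups path as the cron job above; adjust the path and retention window to your setup:

```shell
# Prune compressed backups older than 7 days.
# /backups is an assumed location -- match it to your cron job.
find /backups -name 'restai-*.sql.gz' -mtime +7 -delete
```

This can be appended to the same cron line with `&&`, or scheduled as its own daily job.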

Restore from Backup

Restoring a backup will overwrite all current data. Always verify the backup file before restoring.
Step 1: Stop the API service

docker-compose stop api web
Prevent connections during restore.
Step 2: Drop and recreate database

docker-compose exec postgres psql -U restai -d postgres -c "DROP DATABASE IF EXISTS restai;"
docker-compose exec postgres psql -U restai -d postgres -c "CREATE DATABASE restai;"
Step 3: Restore from backup

# From uncompressed SQL
cat backup-20260302.sql | docker-compose exec -T postgres psql -U restai restai

# From compressed SQL
gunzip -c backup-20260302.sql.gz | docker-compose exec -T postgres psql -U restai restai
Step 4: Restart services

docker-compose up -d api web

Database Health Checks

Monitor database health and performance:
# Check database size
docker-compose exec postgres psql -U restai restai -c "SELECT pg_size_pretty(pg_database_size('restai'));"

# List all tables with sizes
docker-compose exec postgres psql -U restai restai -c "
  SELECT schemaname, tablename, pg_size_pretty(pg_total_relation_size(schemaname||'.'||tablename)) AS size
  FROM pg_tables
  WHERE schemaname NOT IN ('pg_catalog', 'information_schema')
  ORDER BY pg_total_relation_size(schemaname||'.'||tablename) DESC;
"

# Check active connections
docker-compose exec postgres psql -U restai restai -c "SELECT count(*) FROM pg_stat_activity;"

# View slow queries (queries running > 5 seconds)
docker-compose exec postgres psql -U restai restai -c "
  SELECT pid, now() - pg_stat_activity.query_start AS duration, query
  FROM pg_stat_activity
  WHERE (now() - pg_stat_activity.query_start) > interval '5 seconds';
"

Vacuum and Analyze

PostgreSQL requires periodic vacuuming to reclaim space and update statistics:
# Manual vacuum (safe to run anytime)
docker-compose exec postgres psql -U restai restai -c "VACUUM ANALYZE;"

# Aggressive vacuum (requires exclusive lock, use during maintenance window)
docker-compose exec postgres psql -U restai restai -c "VACUUM FULL ANALYZE;"
PostgreSQL auto-vacuum runs automatically by default. Manual vacuuming is only needed for large bulk operations (imports, deletes).
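If autovacuum falls behind on write-heavy tables, its thresholds can be tuned by passing server flags to the container. A sketch with illustrative values (these are not recommendations from this project; benchmark before changing them):

```yaml
postgres:
  image: postgres:17-alpine
  command:
    - "postgres"
    - "-c"
    - "autovacuum_vacuum_scale_factor=0.05"   # vacuum after ~5% of a table's rows change
    - "-c"
    - "autovacuum_analyze_scale_factor=0.02"  # refresh statistics more often
```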

External Database Setup

To use a managed PostgreSQL service (AWS RDS, Google Cloud SQL, etc.) instead of the Docker container:
Step 1: Create database instance

  • PostgreSQL version: 17 or higher recommended
  • Enable SSL/TLS connections
  • Create database named restai
  • Create user with full privileges on restai database
Step 2: Configure connection string

Set DATABASE_URL in .env:
# AWS RDS example (replace the host with your instance endpoint)
DATABASE_URL=postgresql://restai_user:PASSWORD@your-instance.region.rds.amazonaws.com:5432/restai?sslmode=require

# Google Cloud SQL example (replace the host with your instance address)
DATABASE_URL=postgresql://restai_user:PASSWORD@CLOUD_SQL_HOST:5432/restai?sslmode=require
Step 3: Run migrations

# Local (with DATABASE_URL in .env)
cd packages/db
bun run src/migrate.ts

# Docker (mount .env with external DATABASE_URL)
docker-compose run --rm -e DATABASE_URL="$DATABASE_URL" migrate
Step 4: Update docker-compose.yml

Remove the postgres service and update dependencies:
# Remove postgres service completely

# Update API service to not depend on postgres
api:
  depends_on:
    redis:
      condition: service_healthy
    # Remove postgres dependency

Troubleshooting

Migration fails with “DATABASE_URL is required”

Cause: The DATABASE_URL environment variable is not set.
Fix:
# Add to .env file
DATABASE_URL=postgresql://restai:password@localhost:5432/restai

# Or export temporarily
export DATABASE_URL="postgresql://restai:password@localhost:5432/restai"

Migration fails with “relation already exists”

Cause: Migrations were run partially or the database wasn’t clean.
Fix: Drop and recreate the database:
# Docker
docker-compose exec postgres psql -U restai -d postgres -c "DROP DATABASE restai;"
docker-compose exec postgres psql -U restai -d postgres -c "CREATE DATABASE restai;"
docker-compose run --rm migrate

# Local
psql -U postgres -c "DROP DATABASE IF EXISTS restai;"
psql -U postgres -c "CREATE DATABASE restai;"
cd packages/db && bun run src/migrate.ts

Connection refused / Can’t connect to database

Symptoms: ECONNREFUSED or could not connect to server
Cause: PostgreSQL isn’t running, or the host/port is wrong.
Fix:
# Check postgres is running
docker-compose ps postgres

# Check postgres logs
docker-compose logs postgres

# Verify connection string
echo $DATABASE_URL

# Test connection manually
docker-compose exec postgres psql -U restai restai

“Out of memory” during migration

Cause: Large migration files or insufficient container memory.
Fix: Increase the Docker memory limit in Docker Desktop settings (minimum 4GB recommended).

“password authentication failed”

Cause: Wrong password in DATABASE_URL or POSTGRES_PASSWORD.
Fix: Verify credentials match:
# Check .env file
grep POSTGRES .env

# Reset postgres password
docker-compose down
docker volume rm restai_pgdata  # WARNING: Deletes all data (volume name may differ; check `docker volume ls`)
docker-compose up -d postgres

Best Practices

  1. Always backup before migrations - Create a backup before running migrations in production
  2. Test migrations in staging - Never run untested migrations directly in production
  3. Use SSL for external databases - Always add ?sslmode=require to external DATABASE_URL
  4. Monitor connection pool - Set appropriate max_connections in PostgreSQL config
  5. Regular vacuum - Run VACUUM ANALYZE after large data imports/deletes
  6. Rotate backups - Keep at least 7 daily backups and 4 weekly backups
  7. Track migration history - Commit all migration files to git
  8. Use read replicas - For high-traffic deployments, configure read replicas for reporting queries
