
Overview

Cabina uses Supabase as its complete backend infrastructure:
  • Authentication - Email/password, OAuth providers
  • Database - PostgreSQL with Row Level Security
  • Storage - File uploads and CDN
  • Edge Functions - Serverless API endpoints
  • Realtime - Live updates (future feature)

Initial Setup

1. Create Supabase Project

  1. Sign up - Go to supabase.com and create an account
  2. Create project:
    • Click “New Project”
    • Choose organization
    • Set project name: cabina-production
    • Set database password (save securely!)
    • Select region (closest to users)
    • Click “Create new project”
  3. Wait for provisioning - Project setup takes ~2 minutes

2. Get API Credentials

  1. Go to Settings → API
  2. Copy the following:
    • Project URL: https://xxxxx.supabase.co
    • anon public key: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...
    • service_role key: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9... (keep secret!)
Never expose the service_role key in client-side code. It bypasses all RLS policies.

Client Configuration

Supabase Client Setup

The app uses a centralized Supabase client instance:
src/lib/supabaseClient.ts
import { createClient } from '@supabase/supabase-js';

const supabaseUrl = 'https://elesttjfwfhvzdvldytn.supabase.co';
const supabaseAnonKey = 'sb_publishable_DPfOzwwv2yXK1uvya4RYhQ_uOdKIqn_';

export const supabase = createClient(supabaseUrl, supabaseAnonKey);
From source: src/lib/supabaseClient.ts
For production, move these values to environment variables.

Environment Variables

Update your configuration to use environment variables:
src/lib/supabaseClient.ts
import { createClient } from '@supabase/supabase-js';

const supabaseUrl = import.meta.env.VITE_SUPABASE_URL!;
const supabaseAnonKey = import.meta.env.VITE_SUPABASE_ANON_KEY!;

export const supabase = createClient(supabaseUrl, supabaseAnonKey);
.env.local
VITE_SUPABASE_URL=https://your-project.supabase.co
VITE_SUPABASE_ANON_KEY=your-anon-key-here
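If either variable is missing, the client constructor still succeeds and requests fail later with opaque fetch errors. A small guard can fail fast instead; `requireEnv` below is a hypothetical helper sketched for illustration, not part of supabase-js:

```typescript
// Hypothetical helper: throw immediately when a required variable is
// missing, instead of letting the Supabase client fail later.
function requireEnv(env: Record<string, string | undefined>, key: string): string {
  const value = env[key];
  if (!value) {
    throw new Error(`Missing required environment variable: ${key}`);
  }
  return value;
}

// Usage with Vite's import.meta.env:
// const supabaseUrl = requireEnv(import.meta.env, 'VITE_SUPABASE_URL');
```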

Authentication

Email/Password Authentication

Cabina implements email/password auth with Supabase Auth:
import { supabase } from './lib/supabaseClient';

// Sign up
const signUp = async (email: string, password: string, name: string) => {
  const { data, error } = await supabase.auth.signUp({
    email,
    password,
    options: {
      data: {
        full_name: name
      }
    }
  });
  
  if (error) throw error;
  return data;
};

// Sign in
const signIn = async (email: string, password: string) => {
  const { data, error } = await supabase.auth.signInWithPassword({
    email,
    password
  });
  
  if (error) throw error;
  return data;
};

// Sign out
const signOut = async () => {
  const { error } = await supabase.auth.signOut();
  if (error) throw error;
};
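The `throw error` pattern above surfaces raw `AuthError`s to the UI. One option is to map the error's HTTP status to friendly copy; the statuses and wording below are assumptions for illustration, not an official Supabase mapping:

```typescript
// Sketch: translate an AuthError-style status code into user-facing copy.
// The specific status codes handled here are assumptions, not a complete list.
function authErrorMessage(status: number | undefined): string {
  switch (status) {
    case 400: return 'Invalid email or password.';
    case 422: return 'That email address cannot be used.';
    case 429: return 'Too many attempts. Please wait and try again.';
    default:  return 'Something went wrong. Please try again.';
  }
}
```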

Session Management

import { useEffect, useState } from 'react';
import { supabase } from './lib/supabaseClient';
import type { Session } from '@supabase/supabase-js';

const App = () => {
  const [session, setSession] = useState<Session | null>(null);

  useEffect(() => {
    // Get initial session
    supabase.auth.getSession().then(({ data: { session } }) => {
      setSession(session);
    });

    // Listen for auth changes
    const {
      data: { subscription },
    } = supabase.auth.onAuthStateChange((_event, session) => {
      setSession(session);
    });

    return () => subscription.unsubscribe();
  }, []);

  return session ? <Dashboard /> : <Login />;
};
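`Session.expires_at` is a Unix timestamp in seconds. A small check like the sketch below (the helper name and threshold are hypothetical) can decide when to prompt a re-login or trigger a refresh:

```typescript
// Returns true when the session expires within `thresholdSeconds`.
// Note: expires_at is in seconds, while Date.now() is in milliseconds.
function isSessionExpiringSoon(
  expiresAt: number | undefined,
  thresholdSeconds = 60,
  nowMs: number = Date.now(),
): boolean {
  if (expiresAt === undefined) return false;
  return expiresAt - nowMs / 1000 < thresholdSeconds;
}
```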

Profile Creation Trigger

Auto-create profile when user signs up:
-- Create profile on user signup
CREATE OR REPLACE FUNCTION public.handle_new_user()
RETURNS TRIGGER AS $$
BEGIN
  INSERT INTO public.profiles (id, email, full_name)
  VALUES (
    NEW.id,
    NEW.email,
    NEW.raw_user_meta_data->>'full_name'
  );
  RETURN NEW;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

-- Trigger the function on user creation
CREATE TRIGGER on_auth_user_created
  AFTER INSERT ON auth.users
  FOR EACH ROW
  EXECUTE FUNCTION public.handle_new_user();

OAuth Providers (Optional)

Enable Google OAuth:
  1. Go to Authentication → Providers
  2. Enable Google
  3. Add OAuth credentials from Google Cloud Console
  4. Configure authorized redirect URLs:
    https://<project-ref>.supabase.co/auth/v1/callback
    
Client code:
const signInWithGoogle = async () => {
  const { error } = await supabase.auth.signInWithOAuth({
    provider: 'google',
    options: {
      redirectTo: window.location.origin
    }
  });
  
  if (error) throw error;
};

Database Operations

Querying Data

// Select all events
const { data: events, error } = await supabase
  .from('events')
  .select('*')
  .order('created_at', { ascending: false });

// Select with filters
const { data: activeEvents } = await supabase
  .from('events')
  .select('*, partners(name)')
  .eq('is_active', true)
  .gte('end_date', new Date().toISOString());

// Single row
const { data: event } = await supabase
  .from('events')
  .select('*')
  .eq('event_slug', 'quince-sofia')
  .single();
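For larger tables, supabase-js paginates with `.range(from, to)`, which takes inclusive, zero-based bounds. A helper converting a 1-based page number (hypothetical, shown as a sketch):

```typescript
// Convert a 1-based page number into the inclusive [from, to] bounds
// that supabase-js .range() expects.
function pageToRange(page: number, pageSize: number): [number, number] {
  const from = (page - 1) * pageSize;
  return [from, from + pageSize - 1];
}

// const [from, to] = pageToRange(2, 20); // rows 20..39
// await supabase.from('events').select('*').range(from, to);
```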

Inserting Data

const { data, error } = await supabase
  .from('events')
  .insert({
    partner_id: partnerId,
    event_name: 'Quince de Sofia',
    event_slug: 'quince-sofia-2024',
    credits_allocated: 1000,
    selected_styles: ['pb_a', 'suit_b'],
    config: {
      primary_color: '#FF6B35',
      logo_url: 'https://...'
    }
  })
  .select()
  .single();
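`event_slug` values like `quince-sofia-2024` must be URL-safe. A sketch of a slug generator (the helper name and normalization rules are assumptions, not taken from the codebase):

```typescript
// Build a URL-safe slug from an event name: strip accents, lowercase,
// collapse non-alphanumeric runs into hyphens, and append the year.
function toEventSlug(name: string, year?: number): string {
  const base = name
    .normalize('NFD')
    .replace(/[\u0300-\u036f]/g, '') // drop combining accent marks
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/^-+|-+$/g, '');
  return year ? `${base}-${year}` : base;
}
```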

Updating Data

// Update single row (read-modify-write; not atomic under concurrency)
const { error } = await supabase
  .from('profiles')
  .update({ credits: credits + 500 })
  .eq('id', userId);

// Increment atomically
const { error } = await supabase.rpc('increment_event_credit', {
  p_event_id: eventId
});

Deleting Data

const { error } = await supabase
  .from('generations')
  .delete()
  .eq('id', photoId);

// Bulk delete
const { error } = await supabase
  .from('generations')
  .delete()
  .in('id', [id1, id2, id3]);

Storage

Create Buckets

  1. Go to Storage in Supabase Dashboard
  2. Click New bucket
  3. Create these buckets:
  • user_photos (private) - Uploaded user selfies
  • generations (public) - AI-generated images
  • event_assets (public) - Event logos and branding

Upload Files

// Upload user photo
const uploadPhoto = async (file: File, userId: string) => {
  const fileName = `uploads/${userId}_${Date.now()}.png`;
  
  const { data, error } = await supabase.storage
    .from('user_photos')
    .upload(fileName, file, {
      contentType: 'image/png',
      upsert: true
    });
  
  if (error) throw error;
  return data.path;
};
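The path above hardcodes a `.png` extension. A variant that preserves a sanitized version of the original file's extension (a hypothetical helper, sketched for illustration):

```typescript
// Build a collision-resistant storage path, keeping a sanitized copy of
// the original file extension and falling back to png when it is absent.
function buildUploadPath(userId: string, originalName: string, now: number = Date.now()): string {
  const dot = originalName.lastIndexOf('.');
  const raw = dot > 0 ? originalName.slice(dot + 1) : 'png';
  const ext = raw.toLowerCase().replace(/[^a-z0-9]/g, '') || 'png';
  return `uploads/${userId}_${now}.${ext}`;
}
```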

Get Public URL

const { data } = supabase.storage
  .from('generations')
  .getPublicUrl('results/photo_123.png');

const publicUrl = data.publicUrl;
// https://xxxxx.supabase.co/storage/v1/object/public/generations/results/photo_123.png

Download Files

const { data, error } = await supabase.storage
  .from('generations')
  .download('results/photo_123.png');

if (data) {
  const url = URL.createObjectURL(data);
  // Use url in <img> or download
}

Delete Files

const { error } = await supabase.storage
  .from('user_photos')
  .remove(['uploads/old_photo.png']);

Edge Functions

See Edge Functions API for complete reference.

Invoke from Client

const { data, error } = await supabase.functions.invoke('cabina-vision', {
  body: {
    user_photo: base64Image,
    model_id: 'pb_a',
    event_id: eventId
  }
});
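The body's `user_photo` is a base64 string. In the browser this typically comes from `FileReader`; in a Node context (scripts, tests), raw bytes can be encoded as in this sketch:

```typescript
// Node sketch: encode raw image bytes as a base64 string for the
// function body. Browser code would usually use FileReader instead.
function toBase64(bytes: Uint8Array): string {
  return Buffer.from(bytes).toString('base64');
}
```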

Local Development

Install Supabase CLI

brew install supabase/tap/supabase

Initialize Local Project

# Initialize Supabase in project
supabase init

# Link to remote project
supabase link --project-ref your-project-ref

# Pull remote schema
supabase db pull

Start Local Supabase

# Start all services (DB, API, Storage, Auth)
supabase start
This starts:
  • API: http://localhost:54321
  • DB: postgresql://postgres:postgres@localhost:54322/postgres
  • Studio: http://localhost:54323
  • Inbucket (emails): http://localhost:54324

Stop Local Supabase

supabase stop

Supabase Configuration

supabase/config.toml

Key configuration from the project:
supabase/config.toml
project_id = "creativa-labs-cabina-de-fotos"

[api]
enabled = true
port = 54321
schemas = ["public", "graphql_public"]
max_rows = 1000

[db]
port = 54322
major_version = 17

[storage]
enabled = true
file_size_limit = "50MiB"

[auth]
enabled = true
site_url = "http://127.0.0.1:3000"
additional_redirect_urls = ["https://127.0.0.1:3000"]
enable_signup = true

[auth.email]
enable_signup = true
enable_confirmations = false

[edge_runtime]
enabled = true
policy = "per_worker"
From source: supabase/config.toml

TypeScript Types

Generate Types from Database

# Generate TypeScript types
supabase gen types typescript --local > src/types/database.types.ts
Usage:
import type { Database } from './types/database.types';

const supabase = createClient<Database>(url, key);

// Now you get full type safety
const { data } = await supabase
  .from('events')
  .select('*'); // data is typed as Event[]

Monitoring & Analytics

Database Usage

Dashboard → Reports:
  • Database size
  • Active connections
  • Queries per second
  • Slow queries

API Usage

Dashboard → API:
  • Requests per day
  • Error rate
  • Response times
  • Top endpoints

Storage Usage

Dashboard → Storage:
  • Total storage used
  • Bandwidth
  • File count per bucket

Troubleshooting

Connection errors

Causes:
  • Project paused (free tier inactivity)
  • Incorrect URL or API key
  • Network/firewall blocking Supabase
Solution:
  • Check project status in dashboard
  • Verify credentials in .env.local
  • Try from a different network

Queries return no data

Symptom: Empty results despite data existing
Solution:
  • Check RLS policies in SQL Editor
  • Temporarily disable RLS for testing (dev only):
    ALTER TABLE table_name DISABLE ROW LEVEL SECURITY;
  • Review RLS Policies

Storage uploads fail

Common causes:
  • File exceeds size limit
  • Missing RLS policy on storage.objects
  • Invalid MIME type
Solution:
  • Check bucket settings
  • Add storage policies
  • Verify file type matches allowed types

Edge function timeouts

Causes:
  • Function exceeds 150s limit
  • Infinite loop or deadlock
  • External API not responding
Solution:
  • Check function logs
  • Add timeouts to external fetch calls
  • Optimize database queries

Security Best Practices

  • Enable RLS on all tables - Never disable RLS in production
  • Use the service role sparingly - Only in edge functions, never client-side
  • Validate user input - Always sanitize data before database insertion
  • Monitor API usage - Set up alerts for unusual activity

Next Steps

  • Database Schema - Complete schema reference
  • RLS Policies - Row Level Security guide
  • Edge Functions - API documentation
  • Deployment - Deploy to production
