Postcard is built for Vercel and uses @vercel/functions for serverless background processing; Vercel is the recommended deployment target.
Postcard uses the waitUntil API from @vercel/functions to run the forensic pipeline as a background task after returning an immediate 202 Accepted response to the client. This lets long-running AI analysis complete without blocking the HTTP response or hitting serverless function timeout limits.
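The 202-then-background pattern can be sketched as follows. This is an illustrative stand-in, not Postcard's actual code: waitUntil here is a local mock of the @vercel/functions export, and runPipeline is a hypothetical placeholder for the forensic pipeline.

```typescript
// Local stand-in for @vercel/functions' waitUntil: on Vercel, waitUntil
// keeps the serverless function alive until the promise settles; here we
// just record the task so the pattern is visible.
const pending: Promise<unknown>[] = [];
function waitUntil(task: Promise<unknown>): void {
  pending.push(task);
}

// Hypothetical placeholder for the long-running AI analysis.
async function runPipeline(url: string): Promise<string> {
  return `analyzed ${url}`;
}

// The handler schedules the pipeline and returns 202 immediately;
// the client polls for results while the analysis runs in the background.
function handleSubmit(url: string): { status: number; body: { status: string } } {
  waitUntil(runPipeline(url));
  return { status: 202, body: { status: "accepted" } };
}
```

The key property is that the response is sent before the pipeline finishes, so the HTTP round trip never waits on the AI analysis.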

Deploy to Vercel

1. Fork or clone the repository

Fork postcardhq/postcard on GitHub, or push your own clone to a GitHub, GitLab, or Bitbucket repository.
2. Connect to Vercel

Go to vercel.com/new and import your repository. Vercel automatically detects the Next.js framework and configures the build settings.
3. Add environment variables

In the Vercel project settings under Environment Variables, add the following.

Required:

| Variable | Value |
| --- | --- |
| GOOGLE_GENERATIVE_AI_API_KEY | Your Gemini API key from Google AI Studio |
| NEXT_PUBLIC_FAKE_PIPELINE | false |

Recommended for production:

| Variable | Value |
| --- | --- |
| TURSO_DATABASE_URL | Your Turso database URL (e.g. libsql://your-database.turso.io) |
| TURSO_AUTH_TOKEN | Your Turso auth token |
| NEXT_PUBLIC_APP_URL | Your deployment URL (e.g. https://postcard.example.com) |

See Environment variables for the full list of options, including social media API keys for improved platform ingestion.

If you omit TURSO_DATABASE_URL, Postcard falls back to an in-memory SQLite instance that does not persist data between deployments. Set up a Turso cloud database for persistent storage.
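Put together, a minimal production environment might look like this; every value below is a placeholder to replace with your own credentials:

```shell
# Required
GOOGLE_GENERATIVE_AI_API_KEY=your-gemini-api-key
NEXT_PUBLIC_FAKE_PIPELINE=false

# Recommended for production
TURSO_DATABASE_URL=libsql://your-database.turso.io
TURSO_AUTH_TOKEN=your-turso-auth-token
NEXT_PUBLIC_APP_URL=https://postcard.example.com
```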
4. Deploy

Click Deploy. Vercel runs next build and deploys the application. Subsequent pushes to your main branch trigger automatic redeployments.

API reference endpoint

The /api/reference endpoint is powered by Scalar and serves an interactive API reference in both development and production. Visit /api/reference on your deployed instance to explore and test the Postcard API directly from the browser. The underlying OpenAPI specification is available at /openapi.json.
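A client can pull the machine-readable spec directly. This is a sketch, assuming only that /openapi.json is served as JSON; the base URL and the helper name are placeholders:

```typescript
// Load the OpenAPI spec from a deployed Postcard instance.
// loadOpenApiSpec is an illustrative helper, not part of Postcard itself.
async function loadOpenApiSpec(base: string): Promise<unknown> {
  const res = await fetch(new URL("/openapi.json", base));
  if (!res.ok) throw new Error(`spec fetch failed: ${res.status}`);
  return res.json();
}

// Usage (against your own deployment):
// const spec = await loadOpenApiSpec("https://postcard.example.com");
```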

CORS

The Postcard API allows requests from all origins by default. Responses include the following headers:
  • Content-Type: application/json
  • Access-Control-Allow-Methods: GET, POST, OPTIONS
  • Access-Control-Allow-Origin: *
No additional CORS configuration is required for cross-origin API access.
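A response carrying these headers can be built like this. This is a sketch of the header set, not Postcard's internal middleware; Response is the standard Fetch API class available in Node 18+, browsers, and edge runtimes:

```typescript
// Attach the documented CORS headers to a JSON response body.
// withCors is an illustrative helper, not Postcard's actual code.
function withCors(body: unknown, status = 200): Response {
  return new Response(JSON.stringify(body), {
    status,
    headers: {
      "Content-Type": "application/json",
      "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
      "Access-Control-Allow-Origin": "*",
    },
  });
}
```

Because Access-Control-Allow-Origin is *, any web page can call the API with a plain fetch and no preflight configuration on the caller's side.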

Using fake mode for demos

For public demos or presentations where API quota reliability matters, you can deploy with fake mode enabled:
NEXT_PUBLIC_FAKE_PIPELINE=true
Fake mode returns deterministic results instantly, without consuming Gemini API credits or making live network requests.
Do not use NEXT_PUBLIC_FAKE_PIPELINE=true in a production instance where real forensic analysis is required. Fake mode returns static mock data and does not perform any actual source tracing or credibility scoring.
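Conceptually, the flag gates the pipeline like this. The function and return shape below are illustrative, not Postcard's internals; only the env var name comes from the docs:

```typescript
type Result = { mode: "fake" | "live"; url: string };

// Sketch of how a pipeline entry point might branch on fake mode.
function analyzeSource(url: string, env: Record<string, string | undefined>): Result {
  if (env.NEXT_PUBLIC_FAKE_PIPELINE === "true") {
    // Static mock data: no network calls, no Gemini credits consumed.
    return { mode: "fake", url };
  }
  // The real forensic pipeline would run here.
  return { mode: "live", url };
}
```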

Production checklist

Before going live, confirm the following:
  • NEXT_PUBLIC_FAKE_PIPELINE is set to false
  • GOOGLE_GENERATIVE_AI_API_KEY is set to a valid, active key
  • TURSO_DATABASE_URL and TURSO_AUTH_TOKEN are configured for persistent storage
  • NEXT_PUBLIC_APP_URL is set to your canonical deployment URL for correct OG image and metadata resolution
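The checklist above can be automated as a preflight check. The variable names match the documented env vars, but the helper itself is a sketch, not a Postcard command:

```typescript
// Return a list of problems; an empty list means the checklist passes.
// checkProductionEnv is an illustrative helper, not part of Postcard.
function checkProductionEnv(env: Record<string, string | undefined>): string[] {
  const problems: string[] = [];
  if (env.NEXT_PUBLIC_FAKE_PIPELINE !== "false")
    problems.push('NEXT_PUBLIC_FAKE_PIPELINE must be "false"');
  if (!env.GOOGLE_GENERATIVE_AI_API_KEY)
    problems.push("GOOGLE_GENERATIVE_AI_API_KEY is missing");
  if (!env.TURSO_DATABASE_URL || !env.TURSO_AUTH_TOKEN)
    problems.push("Turso persistent storage is not configured");
  if (!env.NEXT_PUBLIC_APP_URL)
    problems.push("NEXT_PUBLIC_APP_URL is missing");
  return problems;
}
```

Running it against process.env in a CI step before deploying catches misconfiguration earlier than a broken production instance would.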
