Postcard uses Drizzle ORM with libSQL (SQLite) and Turso for type-safe persistence of forensic audit logs and analysis results.

Local development (default)

No database setup is needed for local development. When TURSO_DATABASE_URL is not set, Postcard automatically uses a local SQLite file named local.db in the project root.
# No configuration needed — this is the default behavior
npm run db:push   # creates local.db and syncs the schema
Use npm run db:studio to open Drizzle Studio, a visual browser-based interface for inspecting your local forensic audit logs. This is especially helpful for debugging pipeline outputs, verifying score calculations, and browsing raw corroboration data.

Turso cloud setup

For production or cloud testing, connect Postcard to a persistent Turso cloud database.
1. Create a Turso account

   Create a free account at turso.tech and install the Turso CLI.

2. Create a database

   turso db create postcard-db

3. Get the database URL

   turso db show postcard-db --url

4. Generate an auth token

   turso db tokens create postcard-db

5. Add credentials to .env

   Add the URL and token to your .env file:

   TURSO_DATABASE_URL=libsql://your-database.turso.io
   TURSO_AUTH_TOKEN=your-auth-token

6. Apply the schema

   npm run db:push
Drizzle automatically detects the libsql:// prefix and switches to the Turso dialect. The local local.db file is ignored when TURSO_DATABASE_URL is set.
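The selection logic described above can be sketched as a small helper. This is an illustrative sketch only (the function name and fallback path are assumptions, not Postcard's actual code): pick the Turso URL when it is set, otherwise fall back to the local SQLite file.

```typescript
// Sketch of the URL-selection behavior described above.
// Names here are illustrative, not Postcard's real implementation.
function resolveDatabaseUrl(env: Record<string, string | undefined>): string {
  const tursoUrl = env.TURSO_DATABASE_URL;
  // A libsql:// URL means "use Turso"; anything else falls back to local.db.
  if (tursoUrl && tursoUrl.startsWith("libsql://")) {
    return tursoUrl;
  }
  return "file:local.db";
}

console.log(resolveDatabaseUrl({})); // → "file:local.db"
console.log(resolveDatabaseUrl({ TURSO_DATABASE_URL: "libsql://your-database.turso.io" }));
// → "libsql://your-database.turso.io"
```

The resolved URL would then be handed to the libSQL client, which handles both `file:` and `libsql://` schemes.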

Schema overview

Postcard’s schema is defined in src/db/schema.ts and contains two tables.

posts table

Stores the source post URL and its scraped content.
| Column | Type | Description |
| --- | --- | --- |
| id | text (PK) | Unique identifier for the post. |
| url | text | The original post URL. Must be unique. |
| platform | text | The detected social media platform (e.g. twitter, reddit). |
| markdown | text | Full scraped content of the post, converted to Markdown. |
| username | text | Username or handle of the post author. |
| timestamp_text | text | Raw timestamp string extracted from the post. |
| main_text | text | Primary textual content of the post. |
| created_at | integer | Unix timestamp of when the record was created. |
| updated_at | integer | Unix timestamp of the last update. |
| deleted_at | integer | Soft-delete timestamp (null if not deleted). |
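For orientation, the posts table above might be declared roughly like this with drizzle-orm's sqlite-core builders. This is a sketch, not the actual contents of src/db/schema.ts; property names and constraints beyond those listed in the table are assumptions.

```typescript
import { sqliteTable, text, integer } from "drizzle-orm/sqlite-core";

// Illustrative sketch only — the real definition lives in src/db/schema.ts.
export const posts = sqliteTable("posts", {
  id: text("id").primaryKey(),        // unique identifier for the post
  url: text("url").notNull().unique(), // original post URL, must be unique
  platform: text("platform"),          // e.g. "twitter", "reddit"
  markdown: text("markdown"),          // full scraped content as Markdown
  username: text("username"),
  timestampText: text("timestamp_text"),
  mainText: text("main_text"),
  createdAt: integer("created_at"),    // Unix timestamps
  updatedAt: integer("updated_at"),
  deletedAt: integer("deleted_at"),    // null unless soft-deleted
});
```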

postcards table

Stores analysis results, scores, pipeline logs, and processing state for each forensic trace.
| Column | Type | Description |
| --- | --- | --- |
| id | text (PK) | Unique identifier for the postcard. |
| post_id | text (FK) | Reference to the associated posts record. |
| url | text | The URL that was analyzed. |
| platform | text | Detected platform for this analysis. |
| postcard_score | real | The overall weighted credibility score (0–1). |
| origin_score | real | Score from the origin reachability audit. |
| corroboration_score | real | Score from corroboration against trusted sources. |
| bias_score | real | Score from the bias analysis stage. |
| temporal_score | real | Score from the temporal alignment check. |
| verdict | text | Human-readable credibility verdict. |
| summary | text | AI-generated summary of the forensic findings. |
| confidence_score | real | Confidence level of the overall analysis. |
| primary_sources | text | JSON-encoded list of primary sources found. |
| queries_executed | text | JSON-encoded list of search queries the agent ran. |
| corroboration_log | text | Full log of the corroboration agent’s tool calls. |
| audit_log | text | Full log of the origin audit stage. |
| hits | integer | Number of times this postcard has been retrieved from cache. |
| status | text | Current pipeline status: pending, processing, completed, or failed. |
| progress | real | Pipeline progress as a value from 0 to 1. |
| stage | text | Current pipeline stage key (e.g. scraping, auditing). |
| message | text | Human-readable status message for the current stage. |
| error | text | Error message if the pipeline failed. |
| started_at | integer | Unix timestamp of when processing began. |
| created_at | integer | Unix timestamp of when the record was created. |
| updated_at | integer | Unix timestamp of the last update. |
| deleted_at | integer | Soft-delete timestamp (null if not deleted). |
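Since postcard_score is described as a weighted combination of the four component scores, here is a minimal sketch of how such a score could be computed. The weights below are hypothetical (equal weighting); Postcard's actual weights live in its pipeline code and may differ.

```typescript
// Illustrative only — the real weights are defined in Postcard's pipeline,
// not documented here. Equal weights are an assumption for this sketch.
type ComponentScores = {
  origin: number;        // origin_score
  corroboration: number; // corroboration_score
  bias: number;          // bias_score
  temporal: number;      // temporal_score
};

const EQUAL_WEIGHTS = { origin: 0.25, corroboration: 0.25, bias: 0.25, temporal: 0.25 };

function weightedScore(s: ComponentScores, w = EQUAL_WEIGHTS): number {
  return (
    s.origin * w.origin +
    s.corroboration * w.corroboration +
    s.bias * w.bias +
    s.temporal * w.temporal
  );
}

console.log(weightedScore({ origin: 1, corroboration: 0.8, bias: 0.6, temporal: 1 })); // → 0.85
```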

Managing the schema

Use the following scripts to manage the database during development:
npm run db:push     # sync schema changes to the database
npm run db:studio   # open Drizzle Studio at http://local.drizzle.studio

Cache behavior

Postcard caches analyses at the resolved URL level. Submitting the same URL a second time returns the cached postcard immediately without re-running the pipeline. To force a fresh analysis, pass refresh: true in the POST /api/postcards request body:
{
  "url": "https://x.com/user/status/123",
  "userApiKey": "your-gemini-api-key",
  "refresh": true
}
The refresh parameter is only supported in the POST body. The GET endpoint is read-only and does not support forced re-analysis.
The hits field in the postcards table increments each time a cached result is served, giving you visibility into how frequently a particular URL is being analyzed.
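The cache-or-refresh flow can be illustrated with a small in-memory sketch. The real implementation persists postcards via Drizzle rather than a Map, and the function names here are hypothetical; only the behavior (cache hit increments hits, refresh bypasses the cache) mirrors the description above.

```typescript
// In-memory sketch of the caching behavior — illustrative names, not Postcard's API.
type Postcard = { url: string; hits: number };

const cache = new Map<string, Postcard>();

function getOrAnalyze(
  url: string,
  refresh = false,
  // Stand-in for the real forensic pipeline.
  analyze: (u: string) => Postcard = (u) => ({ url: u, hits: 0 })
): Postcard {
  const cached = cache.get(url);
  if (cached && !refresh) {
    cached.hits += 1; // served from cache: bump the hits counter
    return cached;
  }
  const fresh = analyze(url); // cache miss, or refresh: true — re-run the pipeline
  cache.set(url, fresh);
  return fresh;
}
```

Submitting the same URL twice returns the cached object with hits incremented; passing refresh: true replaces it with a fresh analysis whose hits counter starts at 0.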
