Local development (default)
No database setup is needed for local development. When TURSO_DATABASE_URL is not set, Postcard automatically uses a local SQLite file named local.db in the project root.
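The fallback behavior described above can be sketched as follows (the function name is an assumption, not Postcard's actual code):

```typescript
// Sketch: how Postcard's database target can be resolved, per the
// behavior described above. Names here are illustrative only.
function resolveDatabaseUrl(env: Record<string, string | undefined>): string {
  // Fall back to the local SQLite file when TURSO_DATABASE_URL is unset.
  return env.TURSO_DATABASE_URL ?? "file:local.db";
}
```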
Turso cloud setup
For production or cloud testing, connect Postcard to a persistent Turso cloud database.

Create a Turso account
Create a free account at turso.tech and install the Turso CLI.
When TURSO_DATABASE_URL is set to a URL with the libsql:// prefix, Postcard switches to the Turso dialect. The local local.db file is ignored when TURSO_DATABASE_URL is set.
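A typical setup looks roughly like this (the database name and the auth-token variable are assumptions based on common Turso configurations; check your own client setup):

```shell
# Create a database and grab its URL and token (names are placeholders)
turso db create postcard
turso db show postcard --url        # prints the libsql:// URL
turso db tokens create postcard     # prints an auth token

# .env
TURSO_DATABASE_URL=libsql://postcard-yourname.turso.io
TURSO_AUTH_TOKEN=your-token         # assumed variable name
```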
Schema overview
Postcard’s schema is defined in src/db/schema.ts and contains two tables.
posts table
Stores the source post URL and its scraped content.
| Column | Type | Description |
|---|---|---|
| id | text (PK) | Unique identifier for the post. |
| url | text | The original post URL. Must be unique. |
| platform | text | The detected social media platform (e.g. twitter, reddit). |
| markdown | text | Full scraped content of the post, converted to Markdown. |
| username | text | Username or handle of the post author. |
| timestamp_text | text | Raw timestamp string extracted from the post. |
| main_text | text | Primary textual content of the post. |
| created_at | integer | Unix timestamp of when the record was created. |
| updated_at | integer | Unix timestamp of the last update. |
| deleted_at | integer | Soft-delete timestamp (null if not deleted). |
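The row shape implied by the table above can be written out as a TypeScript interface (a sketch mirroring the documented columns; integer timestamps are Unix epochs, and deleted_at is null unless soft-deleted):

```typescript
// Shape of a posts row, per the column descriptions above.
interface Post {
  id: string;
  url: string;             // unique per post
  platform: string;        // e.g. "twitter", "reddit"
  markdown: string;        // scraped content converted to Markdown
  username: string;
  timestamp_text: string;  // raw timestamp string from the post
  main_text: string;
  created_at: number;      // Unix timestamp
  updated_at: number;      // Unix timestamp
  deleted_at: number | null; // soft-delete marker
}
```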
postcards table
Stores analysis results, scores, pipeline logs, and processing state for each forensic trace.
| Column | Type | Description |
|---|---|---|
| id | text (PK) | Unique identifier for the postcard. |
| post_id | text (FK) | Reference to the associated posts record. |
| url | text | The URL that was analyzed. |
| platform | text | Detected platform for this analysis. |
| postcard_score | real | The overall weighted credibility score (0–1). |
| origin_score | real | Score from the origin reachability audit. |
| corroboration_score | real | Score from corroboration against trusted sources. |
| bias_score | real | Score from the bias analysis stage. |
| temporal_score | real | Score from the temporal alignment check. |
| verdict | text | Human-readable credibility verdict. |
| summary | text | AI-generated summary of the forensic findings. |
| confidence_score | real | Confidence level of the overall analysis. |
| primary_sources | text | JSON-encoded list of primary sources found. |
| queries_executed | text | JSON-encoded list of search queries the agent ran. |
| corroboration_log | text | Full log of the corroboration agent’s tool calls. |
| audit_log | text | Full log of the origin audit stage. |
| hits | integer | Number of times this postcard has been retrieved from cache. |
| status | text | Current pipeline status: pending, processing, completed, or failed. |
| progress | real | Pipeline progress as a value from 0 to 1. |
| stage | text | Current pipeline stage key (e.g. scraping, auditing). |
| message | text | Human-readable status message for the current stage. |
| error | text | Error message if the pipeline failed. |
| started_at | integer | Unix timestamp of when processing began. |
| created_at | integer | Unix timestamp of when the record was created. |
| updated_at | integer | Unix timestamp of the last update. |
| deleted_at | integer | Soft-delete timestamp (null if not deleted). |
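postcard_score is described above as a weighted combination of the four stage scores. A minimal sketch of that idea follows; the weights here are illustrative assumptions, not Postcard's actual values:

```typescript
interface StageScores {
  origin: number;
  corroboration: number;
  bias: number;
  temporal: number;
}

// Illustrative weights only; Postcard's real weighting is not documented here.
// They sum to 1 so the combined score stays in the 0–1 range.
const WEIGHTS: StageScores = {
  origin: 0.3,
  corroboration: 0.4,
  bias: 0.15,
  temporal: 0.15,
};

function postcardScore(s: StageScores): number {
  return (
    s.origin * WEIGHTS.origin +
    s.corroboration * WEIGHTS.corroboration +
    s.bias * WEIGHTS.bias +
    s.temporal * WEIGHTS.temporal
  );
}
```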
Managing the schema
Use the project's database scripts to manage the schema during development.

Cache behavior
Postcard caches analyses at the resolved URL level. Submitting the same URL a second time returns the cached postcard immediately, without re-running the pipeline. To force a fresh analysis, pass refresh: true in the POST /api/postcards request body.
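A minimal sketch of such a request (the url field in the body is an assumption about the payload shape; refresh: true is what bypasses the cache):

```typescript
// Sketch: build the request options for a forced re-analysis.
function buildRefreshRequest(url: string): {
  method: string;
  headers: Record<string, string>;
  body: string;
} {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // refresh: true tells Postcard to re-run the pipeline for this URL
    body: JSON.stringify({ url, refresh: true }),
  };
}
// Usage: await fetch("/api/postcards", buildRefreshRequest("https://…"));
```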
The refresh parameter is only supported in the POST body. The GET endpoint is read-only and does not support forced re-analysis.
The hits field in the postcards table increments each time a cached result is served, giving you visibility into how frequently a particular URL is being analyzed.
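The accounting amounts to a per-URL counter bumped on every cache hit; an in-memory sketch (not Postcard's actual implementation, which stores hits in the database):

```typescript
// In-memory sketch of cache-hit accounting: each time a cached postcard
// is served, its counter increments and the new total is returned.
const hits = new Map<string, number>();

function recordCacheHit(url: string): number {
  const next = (hits.get(url) ?? 0) + 1;
  hits.set(url, next);
  return next;
}
```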