Crawlith automatically creates a local SQLite database on first run:
~/.crawlith/crawlith.db
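To confirm the database has been created (for example, after a first run), you can check the default path from the docs above. A minimal sketch using only the Python standard library:

```python
from pathlib import Path

# Default Crawlith database location, as documented above.
db_path = Path("~/.crawlith/crawlith.db").expanduser()

if db_path.exists():
    print(f"Found database: {db_path} ({db_path.stat().st_size} bytes)")
else:
    print("No database yet; it is created on Crawlith's first run.")
```

`Path.expanduser()` resolves the `~` prefix to your home directory, so the check works regardless of where the script runs.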
This database stores:

- Sites: tracked domains and their metadata
- Snapshots: timestamped crawl sessions with their configuration
- Pages: normalized URLs, HTML content, HTTP status, and SEO metadata
- Edges: internal link relationships between pages
- Metrics: calculated scores, PageRank, HITS, and health indicators
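Because the store is plain SQLite, you can inspect it with any SQLite client. The exact table names behind the categories above are not documented here, so the sketch below makes no assumptions and simply lists whatever tables the file contains:

```python
import sqlite3


def list_tables(db_file: str) -> list[str]:
    """Return the names of all tables in a SQLite database file."""
    with sqlite3.connect(db_file) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        ).fetchall()
    return [name for (name,) in rows]


# Example (expand ~ yourself; sqlite3 does not):
# from pathlib import Path
# print(list_tables(str(Path("~/.crawlith/crawlith.db").expanduser())))
```

Note that `sqlite3.connect` creates an empty database file if the path does not exist, so check the file exists first if you only mean to read it.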
Do not delete ~/.crawlith/crawlith.db unless you want to lose all crawl history. Crawlith uses this database for incremental crawling and snapshot comparisons.
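If you do need to reset or prune the database, consider taking a backup first. One way, sketched here with the standard library's `sqlite3.Connection.backup` (the destination path is illustrative):

```python
import sqlite3


def backup_db(src_path: str, dest_path: str) -> None:
    """Copy a SQLite database to dest_path using the online backup API."""
    with sqlite3.connect(src_path) as src, sqlite3.connect(dest_path) as dest:
        # Copies the full database; safe even if src is open elsewhere.
        src.backup(dest)


# Example (expand ~ yourself; sqlite3 does not):
# backup_db("/home/you/.crawlith/crawlith.db", "crawlith-backup.db")
```

The backup API copies a consistent snapshot, unlike a raw file copy, which can capture a half-written state if a crawl is in progress.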
Run Crawlith without arguments to see the welcome banner and available commands:
crawlith
You should see:
  (CRAWLITH ASCII-art banner)                                    0.1.1

  Crawlith — Deterministic crawl intelligence.

  Usage: crawlith [options] [command]

  Options:
    -V, --version          output the version number
    -h, --help             display help for command

  Commands:
    crawl [options] [url]  Crawl an entire website and build its internal link graph
    page [options] [url]   Analyze a single URL for on-page SEO signals
    ui                     Launch the interactive web dashboard
    probe [options] [url]  Run infrastructure audit (TLS, DNS, Security)
    sites                  List all tracked sites in the database
    clean                  Clean up old snapshots and optimize database
    help [command]         display help for command