## Overview

The lncrawl CLI is built with Typer and organized into subcommands. Running `lncrawl` with no arguments starts the web server on 127.0.0.1:8080.
## Global options
| Flag | Short | Description |
|---|---|---|
| --verbose | -l | Increase log verbosity. Use -l for warn, -ll for info, -lll for debug. |
| --config PATH | -c PATH | Path to a config file to load instead of the default. |
| --install-completion | | Install shell tab-completion for the current shell. |
| --show-completion | | Print the shell completion script to stdout. |
| --help | -h | Show the help message and exit. |
## crawl

Download chapters from a novel page URL and export them to one or more formats.
### crawl options
| Flag | Description |
|---|---|
| --noin | Disable interactive mode. Requires all options to be passed as flags. |
| --all | Download all chapters. |
| --first N | Download the first N chapters (N must be ≥ 1). |
| --last N | Download the latest N chapters (N must be ≥ 1). |
| -f / --format FORMAT | Output format. Can be repeated for multiple formats. See Output formats for the full list. |
| --user TEXT | Username or email for sources that require login. |
| --pass TEXT | Password or token for sources that require login. |
| URL | (Argument) Novel details page URL. |
### Examples

When --noin is set and no format is specified with -f, all available formats are produced.
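A couple of invocation sketches using the flags documented above; the novel URL is a placeholder, and epub/pdf are assumed to be among the available output formats:

```shell
# Non-interactive: download every chapter and export as EPUB
lncrawl crawl --noin --all -f epub https://example.com/novel/some-novel

# Non-interactive: only the latest 10 chapters, exported as both EPUB and PDF
lncrawl crawl --noin --last 10 -f epub -f pdf https://example.com/novel/some-novel
```
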
## search

Search for novels across all searchable sources.
### search options
| Flag | Short | Default | Description |
|---|---|---|---|
| --source TEXT | -s | | Filter which sources to search (matches against source URL). |
| --concurrency N | -c | 15 | Maximum number of concurrent source searches (1–25). |
| --limit N | -l | 10 | Maximum number of results to return (1–25). |
| --timeout SECONDS | -t | 30 | Per-source timeout in seconds. |
| QUERY | | | (Argument) Search query string. Must be at least 2 characters. |
### Examples
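A sketch of typical searches; the source filter value below is illustrative, not a known source name:

```shell
# Search all searchable sources with the default limits
lncrawl search "martial world"

# Restrict to sources whose URL matches a keyword, with a higher
# result limit and a longer per-source timeout
lncrawl search -s examplenovelsite -l 25 -t 60 "martial world"
```
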
## server

Start the Lightnovel Crawler web server with a browser UI and REST API.
### server options
| Flag | Short | Default | Description |
|---|---|---|---|
| --host TEXT | -h | 0.0.0.0 | Host address to bind the server to. |
| --port INTEGER | -p | 8080 | Port to listen on. |
| --watch | -w | False | Run in watch/reload mode (auto-restarts on code changes). |
| --worker INTEGER | -n | 1 | Number of worker processes (only used with --watch). |
### Examples
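Two common ways to run the server, based on the flags above:

```shell
# Serve on the defaults (0.0.0.0:8080)
lncrawl server

# Bind to localhost on another port, auto-reloading on code changes
# with 4 worker processes (--worker only takes effect with --watch)
lncrawl server -h 127.0.0.1 -p 9090 --watch -n 4
```
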
## sources

Manage and inspect crawler sources.
### sources list

List all available crawler sources.
| Flag | Short | Default | Description |
|---|---|---|---|
| --query TEXT | -q | | Filter sources by keyword in the URL. |
| --can-search | -s | | Show only sources that support novel search. |
| --can-login | -l | | Show only sources that support login. |
| --mtl | -b | | Show only machine-translated sources. |
| --manga | -m | | Show only manga/manhua sources. |
| --with-rejected | | False | Include rejected or disabled sources. |
| --output-type TEXT | -o | table | Output type: table, json, yaml, csv, or text. |
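For example, combining the filters above:

```shell
# All sources that support novel search, printed as JSON
lncrawl sources list --can-search -o json

# Sources whose URL mentions "novel", including rejected/disabled ones
lncrawl sources list -q novel --with-rejected
```
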
### sources view

Inspect a single source crawler. The QUERY argument accepts a crawler name, URL, or file path.
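Since QUERY accepts a name, URL, or file path, any of the following forms should work; the specific crawler name, URL, and path here are placeholders:

```shell
# Look up a crawler by name, by site URL, or by source file path
lncrawl sources view ExampleCrawler
lncrawl sources view https://example.com/
lncrawl sources view sources/en/example.py
```
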
### sources create
Scaffold a new source crawler (optionally using OpenAI for auto-generation).
| Flag | Short | Description |
|---|---|---|
| --noin | | Disable interactive mode. |
| --locale TEXT | -l | Content language ISO 639-1 code (e.g. en, zh). |
| --features FEATURE | -f | Crawler features to enable. Repeatable: search, login, manga, mtl, volumes. |
| --openai | | Use OpenAI to auto-generate the crawler implementation. |
| --overwrite | | Replace an existing crawler file. |
| URL | | (Argument) Target website URL. |
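A scaffolding sketch using the flags above; the target URL is a placeholder:

```shell
# Non-interactive: scaffold an English-language crawler with
# search and login support
lncrawl sources create --noin -l en -f search -f login https://example.com/

# Let OpenAI draft the implementation, replacing any existing file
lncrawl sources create --openai --overwrite https://example.com/
```
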
## config

View and modify configuration settings.

### config view
Display all configuration sections and values.
### config get
Read a single configuration value.
### config set
Write a configuration value.
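A quick sketch of the three subcommands together; the key name is purely illustrative, since the available configuration keys are not listed here:

```shell
# Show all configuration sections and values
lncrawl config view

# Read, then change, a single value (key name is a placeholder)
lncrawl config get some.key
lncrawl config set some.key new-value
```
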