Lightnovel Crawler ships a self-hosted web server that provides a browser UI for managing your novel library and a REST API for integrations.

Starting the server

1. Install Lightnovel Crawler

Make sure lncrawl is installed. See the Quickstart guide for installation options.
2. Start the server

Run the server subcommand:
lncrawl server
The server starts on 0.0.0.0:8080 by default. To enable info-level logging and specify a port explicitly:
lncrawl -ll server -p 8080
3. Open the browser UI

Navigate to http://localhost:8080 in your browser.
4. Log in

Use the default credentials:
| Field    | Value |
|----------|-------|
| Username | admin |
| Password | admin |
Change the default password after your first login. The default credentials are well-known and should not be used on any network-accessible server.

Server options

| Flag | Short | Default | Description |
|------|-------|---------|-------------|
| --host TEXT | -h | 0.0.0.0 | Host address to bind to. Use 127.0.0.1 to restrict to localhost. |
| --port INTEGER | -p | 8080 | Port to listen on. |
| --watch | -w | False | Enable auto-reload on code changes (development use). |
| --worker INTEGER | -n | 1 | Number of worker processes (only applies in watch mode). |

What the web UI provides

  • Library management — add novels to your library, browse metadata, and track reading progress.
  • Crawling jobs — submit download jobs for individual novels or chapters and monitor their status in the job queue.
  • Artifact downloads — download generated e-books in any supported format directly from the browser.
  • Source browsing — explore and search across all supported crawler sources.
  • Configuration — adjust server and crawler settings through the UI.

REST API

The REST API is available at:
http://localhost:8080/api
Authentication uses the same credentials as the web UI. Endpoints are organized around the following resources:
| Path prefix | Description |
|-------------|-------------|
| /api/auth | Authentication tokens and user management |
| /api/novels | Novel library management (list/search/delete) |
| /api/novel/{id}/chapters | Chapter listing for a novel |
| /api/novel/{id}/volumes | Volume listing for a novel |
| /api/chapter/{id} | Individual chapter data and content |
| /api/jobs | Background crawl job queue |
| /api/job/create/* | Create crawl, chapter, and artifact jobs |
| /api/artifacts | Generated e-book files |
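The request and response shapes of these endpoints are not specified here, so the following is only a sketch of a minimal API client. It assumes a bearer-token scheme issued by /api/auth; the helper names (api_url, list_novels) are hypothetical, not part of the lncrawl API.

```python
import json
import urllib.request

BASE = "http://localhost:8080/api"

def api_url(path: str) -> str:
    """Join an endpoint path onto the API base URL."""
    return BASE + "/" + path.lstrip("/")

def list_novels(token: str):
    """Fetch the novel library from /api/novels.
    Assumes bearer-token authentication (an assumption;
    check your server's /api/auth behavior)."""
    req = urllib.request.Request(
        api_url("novels"),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # No network access here; just show the URL a call would hit.
    print(api_url("novels"))  # http://localhost:8080/api/novels
```

Swap in whatever authentication the server actually issues if your deployment differs.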

Port configuration

To run on a different port:
lncrawl server -p 9000
To expose only on localhost (useful when proxying through Nginx or Caddy):
lncrawl server --host 127.0.0.1 --port 8080
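When the server is bound to 127.0.0.1, a reverse proxy handles outside traffic. A minimal Nginx site configuration might look like this (server_name is a placeholder, and TLS is omitted; adapt both to your deployment):

```nginx
# Sketch only: forwards all requests to the local lncrawl server.
server {
    listen 80;
    server_name novels.example.com;  # placeholder domain

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```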

Running in Docker

For persistent, production-style deployments use Docker. See the Docker page for full instructions, including a Docker Compose setup with PostgreSQL.
docker run -v ~/.lncrawl:/data -p 8080:8080 --rm -it lncrawl server
The LNCRAWL_DATA_PATH environment variable controls where the container stores its data (/data by default).
