Quick Start
The easiest way to get started is using the standalone server, which includes both the MCP endpoints and the web interface in a single process. Two options are available:

- npx (Recommended)
- Docker
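For the Docker option, a sketch of a single-container launch is shown below. The image name `ghcr.io/arabold/docs-mcp-server` is an assumption for illustration; check the project's README for the actual published image.

```shell
# Run the standalone server in a container, mapping the default port
# (image name is an assumption -- substitute the project's actual image):
docker run --rm -p 6280:6280 ghcr.io/arabold/docs-mcp-server:latest
```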
npx (Recommended)
If you have Node.js 22+ installed, you can run the server directly with a single command. The server listens on port 6280 by default; open http://localhost:6280 to access the web interface.

With OpenAI Embeddings (Recommended): Configuring an embedding model is optional, but it dramatically improves search quality by enabling semantic vector search.
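As a sketch, the single-command startup and the embeddings-enabled variant might look like the following. The npm package name `@arabold/docs-mcp-server` and the environment variable names are assumptions; substitute the values from the project's README.

```shell
# Basic startup (package name is an assumption):
npx @arabold/docs-mcp-server@latest

# With OpenAI embeddings enabled for semantic vector search
# (variable name is an assumption; use your own API key):
OPENAI_API_KEY="sk-your-key" npx @arabold/docs-mcp-server@latest
```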
Use `nvm use 22` and run `npm rebuild` if you recently changed Node versions.

Installation Methods
Embedded Server
Run the MCP server embedded directly in your AI assistant, without a separate process or web interface. This provides MCP integration only. Add an entry like the following to your MCP settings (VS Code, Claude Desktop, etc.):
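A minimal settings fragment is sketched below. It uses the standard `mcpServers` configuration shape shared by Claude Desktop and similar clients; the package name `@arabold/docs-mcp-server` and the env variable name are assumptions, so substitute the values from the project's README.

```json
{
  "mcpServers": {
    "docs-mcp-server": {
      "command": "npx",
      "args": ["@arabold/docs-mcp-server@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-your-key"
      }
    }
  }
}
```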
Docker Compose (Scaling)
For production deployments, or when you need to scale processing, use Docker Compose to run the components as separate services.

Service Architecture:
- Worker (port 8080): Handles documentation processing jobs
- MCP Server (port 6280): Provides the /sse endpoint for AI tools
- Web Interface (port 6281): Browser-based management interface
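The three-service layout above can be sketched as a Compose file. Everything here (image name, service commands) is an assumption for illustration; the project's own `docker-compose.yml` is authoritative.

```yaml
# Hypothetical sketch of the three-service architecture.
# Image name and "command" values are assumptions.
services:
  worker:
    image: ghcr.io/arabold/docs-mcp-server:latest
    command: ["worker"]          # documentation processing jobs
    ports:
      - "8080:8080"
  mcp:
    image: ghcr.io/arabold/docs-mcp-server:latest
    command: ["mcp"]             # serves the /sse endpoint for AI tools
    ports:
      - "6280:6280"
  web:
    image: ghcr.io/arabold/docs-mcp-server:latest
    command: ["web"]             # browser-based management interface
    ports:
      - "6281:6281"
```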
Next Steps
- Configuration: Configure the server with environment variables and config files.
- Connect Clients: Connect Claude, VS Code, Cursor, and other MCP clients.
