
Quick Start

The easiest way to get started is to run the standalone server, which includes both the MCP endpoints and the web interface in a single process.

Installation Methods

Embedded Server

Run the MCP server embedded directly in your AI assistant, without a separate process or web interface. This provides MCP integration only. Add the following to your MCP settings (VS Code, Claude Desktop, etc.):
{
  "mcpServers": {
    "docs-mcp-server": {
      "command": "npx",
      "args": ["@arabold/docs-mcp-server@latest"],
      "disabled": false,
      "autoApprove": []
    }
  }
}
When running in embedded mode, the Web Interface is not available unless you launch it separately.

Docker Compose (Scaling)

For production deployments or when you need to scale processing, use Docker Compose to run separate services.
1. Clone the repository

git clone https://github.com/arabold/docs-mcp-server.git
cd docs-mcp-server
2. Set environment variables

export OPENAI_API_KEY="your-key-here"
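If you prefer not to export the key in your shell session, Docker Compose also reads variables from a `.env` file in the project root (standard Compose behavior). A minimal sketch:

```shell
# Create a .env file next to docker-compose.yml; Compose loads it automatically.
# Replace your-key-here with a real key before starting the stack.
echo 'OPENAI_API_KEY=your-key-here' > .env

# Confirm the variable is present in the file.
grep '^OPENAI_API_KEY=' .env
```

Keep `.env` out of version control, since it contains a secret.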
3. Start all services

docker compose up -d
Service Architecture:
  • Worker (port 8080): Handles documentation processing jobs
  • MCP Server (port 6280): Provides /sse endpoint for AI tools
  • Web Interface (port 6281): Browser-based management interface
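Once the stack is up, you can probe the ports listed above to confirm each service is reachable. This is a minimal sketch: it only checks that something answers on each port, not that the service is healthy, and the root path probed is an assumption.

```shell
#!/bin/sh
# Probe the three service ports from the list above. A failed check usually
# means the container is still starting or the port mapping was changed.
check_port() {
  if curl -s -o /dev/null --max-time 3 "http://localhost:$1/"; then
    echo "port $1: listening"
  else
    echo "port $1: not reachable"
  fi
}

check_port 8080   # worker
check_port 6280   # MCP server (/sse endpoint)
check_port 6281   # web interface
```

`docker compose ps` is a complementary check that shows each container's state.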

Next Steps

Configuration

Configure the server with environment variables and config files

Connect Clients

Connect Claude, VS Code, Cursor, and other MCP clients
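For clients that support an HTTP/SSE transport, you can point them at the MCP Server's /sse endpoint from the Docker Compose setup instead of spawning an embedded process. The exact key name (`url` below) varies between clients, so treat this as a sketch and check your client's documentation:

```json
{
  "mcpServers": {
    "docs-mcp-server": {
      "url": "http://localhost:6280/sse"
    }
  }
}
```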
