
Overview

This guide walks you through manually installing PentAGI without using the interactive installer. Manual installation provides more control over the configuration process and is useful for advanced deployments or custom environments.
For most users, we recommend using the Interactive Installer for a streamlined setup experience.

Prerequisites

Before starting the manual installation, ensure you have:
  • Docker and Docker Compose installed
  • Minimum 2 vCPU
  • Minimum 4GB RAM
  • 20GB free disk space
  • Internet access for downloading images
  • At least one LLM provider API key (OpenAI, Anthropic, Gemini, Bedrock, or Ollama)
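The tooling prerequisites above can be spot-checked with a small preflight script before you start. This is a sketch, not part of PentAGI; the `check_cmd` helper name is illustrative.

```shell
# Preflight sketch: verify required tools are on PATH.
# check_cmd is an illustrative helper, not part of PentAGI.
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1"
    return 1
  fi
}

# Tools this guide relies on:
for c in docker curl openssl perl; do
  check_cmd "$c" || true
done
```

Resource checks (CPU, RAM, disk) vary by platform, so verify those manually against the list above.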

Installation Steps

Step 1: Create working directory

mkdir pentagi && cd pentagi
Step 2: Download environment template

Download the .env.example template and save it as .env:
curl -o .env https://raw.githubusercontent.com/vxcontrol/pentagi/master/.env.example
Step 3: Download provider configuration examples

These files provide templates for custom LLM configurations:
curl -o example.custom.provider.yml https://raw.githubusercontent.com/vxcontrol/pentagi/master/examples/configs/custom-openai.provider.yml
curl -o example.ollama.provider.yml https://raw.githubusercontent.com/vxcontrol/pentagi/master/examples/configs/ollama-llama318b.provider.yml
Step 4: Configure LLM providers

Edit the .env file and add API keys for at least one LLM provider:
OPEN_AI_KEY=your_openai_key
OPEN_AI_SERVER_URL=https://api.openai.com/v1
You must configure at least one LLM provider for PentAGI to function. Without a configured provider, the system will not be able to perform AI-powered penetration testing.
Step 5: Configure search engines (optional)

Add search engine API keys for enhanced information gathering:
# DuckDuckGo (no API key required)
DUCKDUCKGO_ENABLED=true

# Google Custom Search
GOOGLE_API_KEY=your_google_key
GOOGLE_CX_KEY=your_google_cx

# Tavily
TAVILY_API_KEY=your_tavily_key

# Traversaal
TRAVERSAAL_API_KEY=your_traversaal_key

# Perplexity
PERPLEXITY_API_KEY=your_perplexity_key
PERPLEXITY_MODEL=sonar-pro

# Searxng (self-hosted)
SEARXNG_URL=http://your-searxng-instance:8080
Step 6: Configure security settings

Update security-related environment variables with strong, random values:
# Generate cookie signing salt (64 characters)
openssl rand -hex 32

# Generate encryption key (32 bytes in hex)
openssl rand -hex 32

# Generate database passwords
openssl rand -base64 32
Never use default passwords in production. Generate strong, unique passwords for all services.
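The three commands above can be combined into one snippet that captures each value in a variable. The variable names below are placeholders; copy each generated value into the corresponding key in your .env file.

```shell
# Generate all three secrets in one go.
# Variable names are placeholders; copy each value into the
# matching key in your .env file.
COOKIE_SALT=$(openssl rand -hex 32)    # 64 hex characters
ENCRYPTION_KEY=$(openssl rand -hex 32) # 32 bytes, hex-encoded
DB_PASSWORD=$(openssl rand -base64 32) # 32 random bytes, base64-encoded

printf 'cookie salt:    %s\n' "$COOKIE_SALT"
printf 'encryption key: %s\n' "$ENCRYPTION_KEY"
printf 'db password:    %s\n' "$DB_PASSWORD"
```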
Step 7: Configure Graphiti knowledge graph (optional)

Enable the Graphiti knowledge graph for semantic memory:
# Enable Graphiti
GRAPHITI_ENABLED=true
GRAPHITI_TIMEOUT=30
GRAPHITI_URL=http://graphiti:8000
GRAPHITI_MODEL_NAME=gpt-5-mini

# Neo4j settings (required by Graphiti)
NEO4J_USER=neo4j
NEO4J_DATABASE=neo4j
NEO4J_PASSWORD=<random_password>
NEO4J_URI=bolt://neo4j:7687

# Graphiti requires OpenAI API key
OPEN_AI_KEY=your_openai_key
Step 8: Configure assistant settings (optional)

Set default behavior for AI assistants:
# Default value for agent delegation
ASSISTANT_USE_AGENTS=false
Step 9: Remove inline comments (for IDE compatibility)

If using the .env file with VSCode or other IDEs:
perl -i -pe 's/\s+#.*$//' .env
This step removes inline comments that may cause issues with some environment file parsers.
Step 10: Download Docker Compose configuration

Download the main docker-compose.yml file:
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose.yml
Step 11: Start PentAGI

Launch the PentAGI stack:
docker compose up -d
This will start the core services:
  • Frontend UI (React + TypeScript)
  • Backend API (Go + GraphQL)
  • PostgreSQL with pgvector
  • Task queue and AI agents
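Containers can report “running” before the web server inside them is ready to accept connections. A small polling helper bridges that gap; this is a sketch, and `wait_for_url` is an illustrative name, not part of PentAGI. The `-k` flag is needed because the default certificate is self-signed.

```shell
# Poll a URL until it responds, or give up after N attempts.
# wait_for_url is an illustrative helper, not part of PentAGI.
wait_for_url() {
  url=$1
  attempts=${2:-30}
  i=0
  while [ "$i" -lt "$attempts" ]; do
    # -k: accept self-signed certs; -s/-f: quiet, fail on HTTP errors
    if curl -ksf -o /dev/null "$url"; then
      echo "up: $url"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "timed out: $url" >&2
  return 1
}

# After `docker compose up -d`:
# wait_for_url https://localhost:8443
```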

Accessing PentAGI

Once the containers are running, access PentAGI at:
https://localhost:8443
Log in with the default credentials and change the default password immediately after your first login.

Optional Services

Langfuse (LLM Observability)

For advanced LLM monitoring and analytics:
Step 1: Configure Langfuse environment variables

Add to your .env file:
# Langfuse database credentials
LANGFUSE_POSTGRES_USER=langfuse
LANGFUSE_POSTGRES_PASSWORD=<random_password>
LANGFUSE_CLICKHOUSE_USER=default
LANGFUSE_CLICKHOUSE_PASSWORD=<random_password>
LANGFUSE_REDIS_AUTH=<random_password>

# Encryption keys
LANGFUSE_SALT=<random_salt>
LANGFUSE_ENCRYPTION_KEY=<32_byte_hex>
LANGFUSE_NEXTAUTH_SECRET=<random_secret>

# Admin credentials
LANGFUSE_INIT_USER_EMAIL=[email protected]
LANGFUSE_INIT_USER_PASSWORD=<admin_password>
LANGFUSE_INIT_USER_NAME=Admin

# API keys (auto-generated)
LANGFUSE_INIT_PROJECT_PUBLIC_KEY=<public_key>
LANGFUSE_INIT_PROJECT_SECRET_KEY=<secret_key>

# S3 storage (MinIO)
LANGFUSE_S3_ACCESS_KEY_ID=<access_key>
LANGFUSE_S3_SECRET_ACCESS_KEY=<secret_key>

# Enable integration with PentAGI
LANGFUSE_BASE_URL=http://langfuse-web:3000
LANGFUSE_PUBLIC_KEY=<from_above>
LANGFUSE_SECRET_KEY=<from_above>
Step 2: Download Langfuse compose file

curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-langfuse.yml
Step 3: Start Langfuse services

docker compose -f docker-compose.yml -f docker-compose-langfuse.yml up -d
Step 4: Access Langfuse UI

Visit http://localhost:4000 and log in with your configured credentials.

Graphiti (Knowledge Graph)

For semantic relationship tracking:
Step 1: Download Graphiti compose file

curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-graphiti.yml
Step 2: Start Graphiti services

docker compose -f docker-compose.yml -f docker-compose-graphiti.yml up -d
Step 3: Verify Graphiti is running

docker compose logs -f graphiti
Neo4j browser available at: http://localhost:7474

Observability Stack

For comprehensive monitoring:
Step 1: Enable OpenTelemetry in .env

OTEL_HOST=otelcol:8148
Step 2: Download observability compose file

curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-observability.yml
Step 3: Start observability services

docker compose -f docker-compose.yml -f docker-compose-observability.yml up -d
Step 4: Access Grafana

Visit http://localhost:3000 for dashboards showing:
  • System metrics (VictoriaMetrics)
  • Distributed traces (Jaeger)
  • Log aggregation (Loki)

Run All Services

To start PentAGI with all optional services:
docker compose -f docker-compose.yml \
  -f docker-compose-langfuse.yml \
  -f docker-compose-graphiti.yml \
  -f docker-compose-observability.yml \
  up -d
Create shell aliases for easier management:
alias pentagi="docker compose -f docker-compose.yml -f docker-compose-langfuse.yml -f docker-compose-graphiti.yml -f docker-compose-observability.yml"
alias pentagi-up="pentagi up -d"
alias pentagi-down="pentagi down"
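As an alternative to aliases, Docker Compose also reads the COMPOSE_FILE environment variable, with entries separated by `:` (the default path separator on Linux and macOS):

```shell
# Equivalent to passing all four -f flags on every invocation.
export COMPOSE_FILE=docker-compose.yml:docker-compose-langfuse.yml:docker-compose-graphiti.yml:docker-compose-observability.yml

# Plain commands now pick up all four files:
# docker compose up -d
# docker compose down
```

Put the export in your shell profile if you want it to persist across sessions.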

Network Requirements

If you encounter network errors, create the required Docker networks first:
docker network create pentagi-network
docker network create observability-network
docker network create langfuse-network
Then re-run the docker compose commands above.

Verification

Verify your installation:
Step 1: Check container status

docker compose ps
All containers should show “running” status.
Step 2: Check logs for errors

docker compose logs -f pentagi
Step 3: Access web interface

Open https://localhost:8443 in your browser and log in.
Step 4: Verify LLM provider connection

Create a test assistant in the UI to verify your configured LLM provider is working.

Troubleshooting

Containers fail to start

Check logs for the specific container:
docker compose logs <container_name>
Common issues:
  • Missing or invalid API keys
  • Port conflicts (check with netstat -tlnp)
  • Insufficient resources

LLM provider errors

Verify:
  • API key is correct and active
  • No typos in environment variables
  • Network connectivity to the provider’s API
  • Proxy settings if applicable (PROXY_URL)

Database connection errors

Check PostgreSQL status:
docker compose logs pgvector
Ensure the database credentials in your .env file match across services.

SSL certificate issues

The default configuration uses self-signed certificates. To use custom certificates:
  1. Place your certificates in a secure location
  2. Update .env:
    SERVER_SSL_CRT=/path/to/cert.pem
    SERVER_SSL_KEY=/path/to/key.pem
  3. Mount certificates in docker-compose.yml
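One way to mount the certificates without editing the downloaded docker-compose.yml is a compose override file. This is a sketch: the service name and container paths below are assumptions, so verify them against your docker-compose.yml before use.

```yaml
# docker-compose.override.yml (sketch; service name and
# container paths are assumptions, verify against docker-compose.yml)
services:
  pentagi:
    volumes:
      - ./certs/cert.pem:/path/to/cert.pem:ro
      - ./certs/key.pem:/path/to/key.pem:ro
```

Docker Compose merges an override file automatically when it sits next to docker-compose.yml.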

Updating PentAGI

To update to the latest version:
# Pull latest images
docker compose pull

# Restart services
docker compose up -d

Next Steps

  • First Pentest: Run your first penetration test
  • Custom Assistants: Create custom AI assistants
  • Distributed Setup: Configure two-node deployment
  • Best Practices: Security and ethical guidelines
