
System Requirements

Before installing PentAGI, ensure your system meets these requirements:

Minimum Requirements

  • Operating System: Linux, macOS, or Windows
  • CPU: 2 vCPU minimum (4+ vCPU recommended)
  • RAM: 4GB minimum (8GB+ recommended for optimal performance)
  • Storage: 20GB free disk space
  • Software: Docker and Docker Compose
  • Network: Internet access for downloading images and API access
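
These minimums can be checked quickly from a shell. A minimal pre-flight sketch, assuming a Linux host where `nproc` and `df` are available:

```shell
# Pre-flight check against the minimums above (Linux-oriented sketch)
echo "CPUs: $(nproc)"            # want 2+, ideally 4+
df -h . | tail -1                # free space on this filesystem; want 20GB+
if command -v docker >/dev/null 2>&1; then
  docker --version               # Docker present
else
  echo "docker: not found - install Docker and Docker Compose first"
fi
```

On macOS, `nproc` may be missing; `sysctl -n hw.ncpu` is the usual equivalent.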

LLM Provider Requirements

You need at least one of the following:
  • OpenAI API Key - Access to GPT-4, GPT-5, o-series models
  • Anthropic API Key - Access to Claude 3.5, Claude 4 models
  • Google AI API Key - Access to Gemini 2.0, 2.5 models
  • AWS Bedrock Access - AWS credentials with Bedrock permissions
  • Ollama Instance - Local installation for zero-cost inference
AWS Bedrock Rate Limits: Default rate limits are extremely restrictive (2 requests/minute for Claude Sonnet 4). Request quota increases before using Bedrock in production.

Installation Methods

PentAGI offers three installation methods:
  1. Interactive Installer (Recommended) - Guided setup with terminal UI
  2. Manual Installation - Direct Docker Compose setup
  3. Two-Node Architecture - Distributed setup for production environments

The interactive installer provides a streamlined setup experience with system checks, provider configuration, and security hardening.

Supported Platforms

Linux

  • amd64: Download
  • arm64: Download

Windows

  • amd64: Download

macOS

  • Intel: Download
  • M-series: Download

Method 1: Quick Installation (Linux amd64)

Step 1: Create Installation Directory

mkdir -p pentagi && cd pentagi

Step 2: Download Installer

wget -O installer.zip https://pentagi.com/downloads/linux/amd64/installer-latest.zip

Step 3: Extract Archive

unzip installer.zip

Step 4: Run Interactive Installer

The installer requires Docker API access. Choose one of these options.

Option 1 (Recommended for production):
sudo ./installer
Option 2 (Development environments):
# Add your user to docker group
sudo usermod -aG docker $USER
newgrp docker

# Verify Docker access
docker ps

# Run installer
./installer
Adding a user to the docker group grants root-equivalent privileges. Only do this for trusted users in controlled environments.
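
Whichever option you choose, you can confirm group membership before running the installer; a small sketch:

```shell
# Check whether the current user is already in the docker group
if id -nG | grep -qw docker; then
  echo "docker group: yes - './installer' should work without sudo"
else
  echo "docker group: no - use sudo, or re-login after usermod"
fi
```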

What the Installer Does

The interactive installer guides you through:
  1. System Checks
    • Verifies Docker and Docker Compose installation
    • Tests network connectivity
    • Validates system requirements
  2. Environment Setup
    • Creates and configures .env file
    • Sets optimal defaults for your system
    • Generates secure random credentials
  3. Provider Configuration
    • OpenAI (GPT-4, GPT-5, o-series)
    • Anthropic (Claude 3.5, Claude 4)
    • Google AI (Gemini 2.0, 2.5)
    • AWS Bedrock (multi-provider access)
    • Ollama (local inference)
    • Custom providers (OpenRouter, DeepInfra, DeepSeek, Moonshot)
  4. Search Engine Setup
    • DuckDuckGo (no API key required)
    • Google Custom Search
    • Tavily AI
    • Traversaal AI
    • Perplexity AI
    • Searxng meta search
  5. Security Hardening
    • Generates secure passwords
    • Creates SSL certificates
    • Sets cookie signing salt
    • Configures database credentials
  6. Deployment
    • Downloads Docker images
    • Starts PentAGI services
    • Validates deployment

For Production Environments

For production or security-sensitive deployments, use a two-node architecture where worker operations are isolated:
  • Main Node: Runs PentAGI core services, UI, and databases
  • Worker Node: Executes penetration testing containers in isolation
Benefits:
  • Isolated execution environment
  • Separate network boundaries
  • Docker-in-Docker with TLS authentication
  • Support for out-of-band (OOB) attack techniques

Worker Node Setup Guide

Detailed instructions for distributed two-node architecture

Method 2: Manual Installation

For users who prefer direct control over the installation process.

Step 1: Create Working Directory

mkdir pentagi && cd pentagi

Step 2: Download Environment File

curl -o .env https://raw.githubusercontent.com/vxcontrol/pentagi/master/.env.example

Step 3: Download Provider Configuration Files

curl -o example.custom.provider.yml https://raw.githubusercontent.com/vxcontrol/pentagi/master/examples/configs/custom-openai.provider.yml
curl -o example.ollama.provider.yml https://raw.githubusercontent.com/vxcontrol/pentagi/master/examples/configs/ollama-llama318b.provider.yml

Step 4: Configure LLM Providers

Edit .env and add at least one LLM provider:
.env
# Required: At least one LLM provider
OPEN_AI_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key
GEMINI_API_KEY=your_gemini_key

# Optional: AWS Bedrock
BEDROCK_REGION=us-east-1
BEDROCK_ACCESS_KEY_ID=your_aws_access_key
BEDROCK_SECRET_ACCESS_KEY=your_aws_secret_key

# Optional: Ollama (local inference)
OLLAMA_SERVER_URL=http://localhost:11434
OLLAMA_SERVER_MODEL=llama3.1:8b-instruct-q8_0
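
A quick way to confirm that at least one provider key is actually set is to grep the file. The sketch below writes a throwaway file so it is self-contained; point the grep at your real `.env` instead:

```shell
# Demo: verify at least one LLM provider key is non-empty
# (replace /tmp/pentagi-check.env with your real .env)
cat > /tmp/pentagi-check.env <<'EOF'
OPEN_AI_KEY=sk-example
ANTHROPIC_API_KEY=
EOF
if grep -qE '^(OPEN_AI_KEY|ANTHROPIC_API_KEY|GEMINI_API_KEY|LLM_SERVER_KEY)=..*' /tmp/pentagi-check.env; then
  echo "provider configured"
else
  echo "no provider key set"
fi
# -> provider configured
```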

Step 5: Configure Search Engines (Optional)

.env
# DuckDuckGo (no API key required)
DUCKDUCKGO_ENABLED=true

# Google Custom Search
GOOGLE_API_KEY=your_google_key
GOOGLE_CX_KEY=your_google_cx

# Tavily AI
TAVILY_API_KEY=your_tavily_key

# Traversaal AI
TRAVERSAAL_API_KEY=your_traversaal_key

# Perplexity AI
PERPLEXITY_API_KEY=your_perplexity_key
PERPLEXITY_MODEL=sonar-pro
PERPLEXITY_CONTEXT_SIZE=medium

# Searxng meta search
SEARXNG_URL=http://your-searxng-instance:8080
SEARXNG_CATEGORIES=general

Step 6: Configure Security Settings

Update security-related variables:
.env
# Main Security Settings
COOKIE_SIGNING_SALT=your_random_salt_value
PUBLIC_URL=https://pentagi.example.com

# SSL Certificates (optional)
SERVER_SSL_CRT=/path/to/your/cert.crt
SERVER_SSL_KEY=/path/to/your/cert.key

# Database Credentials
PENTAGI_POSTGRES_USER=pentagi_user
PENTAGI_POSTGRES_PASSWORD=your_secure_password
PENTAGI_POSTGRES_DB=pentagidb

# Neo4j Credentials (for Graphiti)
NEO4J_USER=neo4j
NEO4J_PASSWORD=your_neo4j_password
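
Rather than choosing these secrets by hand, generate them. A sketch assuming `openssl` is installed on the host:

```shell
# Generate strong random values for the security variables above
salt="$(openssl rand -hex 32)"        # 64 hex chars for COOKIE_SIGNING_SALT
pgpass="$(openssl rand -base64 24)"   # random PostgreSQL password
echo "COOKIE_SIGNING_SALT=${salt}"
echo "PENTAGI_POSTGRES_PASSWORD=${pgpass}"
```

Paste the printed values into your .env file.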

Step 7: Configure Knowledge Graph (Optional)

Enable Graphiti for semantic relationship tracking:
.env
GRAPHITI_ENABLED=true
GRAPHITI_TIMEOUT=30
GRAPHITI_URL=http://graphiti:8000
GRAPHITI_MODEL_NAME=gpt-5-mini

NEO4J_USER=neo4j
NEO4J_DATABASE=neo4j
NEO4J_PASSWORD=your_secure_password
NEO4J_URI=bolt://neo4j:7687

Step 8: Remove Inline Comments (Optional)

If you use .env as an envFile in VSCode or other IDEs that don't support inline comments, strip them:
perl -i -pe 's/\s+#.*$//' .env

Step 9: Download Docker Compose File

curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose.yml

Step 10: Launch PentAGI

Start the core services:
docker compose up -d
Or with all optional services:
# Download optional compose files
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-langfuse.yml
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-graphiti.yml
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-observability.yml

# Start all services
docker compose -f docker-compose.yml \
  -f docker-compose-langfuse.yml \
  -f docker-compose-graphiti.yml \
  -f docker-compose-observability.yml \
  up -d

Step 11: Access Web Interface

Open your browser and navigate to:
https://localhost:8443
Log in with the default credentials. Change the default password immediately after first login!

Advanced Configuration

Ollama Local Inference

For zero-cost local inference using Ollama:

Step 1: Install Ollama

Install Ollama on your host system:
curl -fsSL https://ollama.ai/install.sh | sh

Step 2: Create Custom Model with Extended Context

PentAGI requires models with larger context windows. Create a custom model.

For Llama 3.1 8B:
Modelfile_llama318b
FROM llama3.1:8b-instruct-q8_0
PARAMETER num_ctx 110000
PARAMETER temperature 0.7
PARAMETER top_p 0.9
ollama create llama3.1:8b-instruct-tc -f Modelfile_llama318b
For Qwen3 32B:
Modelfile_qwen332b
FROM qwen3:32b-fp16
PARAMETER num_ctx 110000
PARAMETER temperature 0.3
PARAMETER top_p 0.8
ollama create qwen3:32b-fp16-tc -f Modelfile_qwen332b
The num_ctx parameter can only be set during model creation via Modelfile. It cannot be changed after creation or overridden at runtime.

Step 3: Configure PentAGI

Update .env file:
.env
OLLAMA_SERVER_URL=http://host.docker.internal:11434
OLLAMA_SERVER_MODEL=llama3.1:8b-instruct-tc
OLLAMA_SERVER_CONFIG_PATH=/opt/pentagi/conf/ollama-llama318b.provider.yml
OLLAMA_SERVER_PULL_MODELS_ENABLED=false
OLLAMA_SERVER_LOAD_MODELS_ENABLED=false

Custom LLM Provider Configuration

For OpenRouter, DeepInfra, DeepSeek, Moonshot, or other OpenAI-compatible APIs:
.env
# Custom Provider Configuration
LLM_SERVER_URL=https://openrouter.ai/api/v1
LLM_SERVER_KEY=your_api_key
LLM_SERVER_MODEL=                              # Leave empty
LLM_SERVER_CONFIG_PATH=/opt/pentagi/conf/openrouter.provider.yml
LLM_SERVER_PROVIDER=                           # For LiteLLM proxy
LLM_SERVER_LEGACY_REASONING=false              # Set true for OpenAI
LLM_SERVER_PRESERVE_REASONING=false            # Set true for Moonshot

Docker Image Configuration

Control which Docker images are used for task execution:
.env
# Default image for general tasks
DOCKER_DEFAULT_IMAGE=debian:latest

# Default image for penetration testing
DOCKER_DEFAULT_IMAGE_FOR_PENTEST=vxcontrol/kali-linux

OAuth Integration

Enable GitHub and Google authentication:
.env
# GitHub OAuth
OAUTH_GITHUB_CLIENT_ID=your_github_client_id
OAUTH_GITHUB_CLIENT_SECRET=your_github_client_secret

# Google OAuth
OAUTH_GOOGLE_CLIENT_ID=your_google_client_id
OAUTH_GOOGLE_CLIENT_SECRET=your_google_client_secret

Proxy Configuration

For networks requiring proxy access:
.env
# Global proxy for all LLM providers and external APIs
PROXY_URL=http://your-proxy:8080

SSL/TLS Certificate Configuration

For custom CA certificates:
.env
# Path to CA certificate bundle (inside container)
EXTERNAL_SSL_CA_PATH=/opt/pentagi/ssl/ca-bundle.pem
EXTERNAL_SSL_INSECURE=false
Place your CA bundle in the host directory:
cp ca-bundle.pem ./pentagi-ssl/
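
It is worth confirming the bundle parses before mounting it. This sketch generates a throwaway self-signed CA purely for the demo; run the final `openssl x509` line against your real `ca-bundle.pem` instead:

```shell
# Create a throwaway self-signed cert (demo only) and sanity-check it parses
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout /tmp/demo.key -out /tmp/demo-ca.pem -subj "/CN=demo-ca"
openssl x509 -in /tmp/demo-ca.pem -noout -subject -enddate
```

Note that for a multi-certificate bundle, `openssl x509` only reports the first certificate.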

Service Ports

Default ports used by PentAGI services:
| Service | Port | Description |
| --- | --- | --- |
| PentAGI Web UI | 8443 | Main web interface (HTTPS) |
| PostgreSQL | 5432 | Vector database |
| Web Scraper | 9443 | Browser automation service |
| Langfuse UI | 4000 | LLM observability dashboard |
| Grafana | 3000 | System monitoring |
| Neo4j Browser | 7474 | Knowledge graph interface |
All ports can be customized via environment variables in .env.

Verifying Installation

Step 1: Check Service Status

docker compose ps
All services should show status Up.

Step 2: View Logs

docker compose logs -f pentagi
Look for startup messages indicating successful initialization.

Step 3: Test LLM Connection

Run the LLM tester utility:
docker exec -it pentagi /opt/pentagi/bin/ctester -verbose
This validates your LLM provider configuration.

Step 4: Test Web Interface

  1. Navigate to https://localhost:8443
  2. Log in with the default credentials
  3. Create a new assistant
  4. Send a test message

Troubleshooting

If you see network creation errors:
# Create networks manually
docker network create pentagi-network
docker network create observability-network
docker network create langfuse-network

# Then restart services
docker compose up -d

If you see Docker permission denied errors:

For production:
sudo docker compose up -d

For development:
sudo usermod -aG docker $USER
newgrp docker

If PentAGI cannot connect to the database, ensure PostgreSQL is fully started, then restart PentAGI:
docker compose logs pgvector
docker compose restart pentagi

If LLM requests fail, verify your API keys:
docker exec -it pentagi /opt/pentagi/bin/ctester -verbose
Check for rate limits, invalid keys, or network issues.

If you see TLS errors with custom CA certificates:
  1. Place bundle in ./pentagi-ssl/ca-bundle.pem
  2. Set EXTERNAL_SSL_CA_PATH=/opt/pentagi/ssl/ca-bundle.pem
  3. Restart: docker compose restart pentagi

Next Steps

Configuration Guide

Detailed configuration options for all services

LLM Provider Setup

Configure and optimize LLM providers

Testing Utilities

Validate and test your configuration

Architecture

Understanding PentAGI’s architecture
Security Best Practices:
  • Change all default passwords
  • Use strong random values for all secrets
  • Configure SSL certificates for production
  • Never expose services directly to the internet
  • Review and restrict Docker permissions
  • Keep Docker images updated
