
Prerequisites

Before you begin, ensure you have Docker Engine with the Docker Compose plugin installed, plus an API key for at least one supported LLM provider.
For production deployments or security-sensitive environments, consider a two-node architecture in which worker operations are isolated on a separate server.

Quick Start with Docker Compose

Step 1: Create Working Directory

Create a directory for PentAGI and navigate to it:
mkdir pentagi && cd pentagi
Step 2: Download Environment File

Download the example environment configuration:
curl -o .env https://raw.githubusercontent.com/vxcontrol/pentagi/master/.env.example
Step 3: Download Provider Configuration Files

Download the example provider configuration files:
curl -o example.custom.provider.yml https://raw.githubusercontent.com/vxcontrol/pentagi/master/examples/configs/custom-openai.provider.yml
curl -o example.ollama.provider.yml https://raw.githubusercontent.com/vxcontrol/pentagi/master/examples/configs/ollama-llama318b.provider.yml
Step 4: Configure API Keys

Edit the .env file and add at least one LLM provider API key:
.env
# Required: At least one of these LLM providers
OPEN_AI_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key
GEMINI_API_KEY=your_gemini_key

# Optional: AWS Bedrock provider
BEDROCK_REGION=us-east-1
BEDROCK_ACCESS_KEY_ID=your_aws_access_key
BEDROCK_SECRET_ACCESS_KEY=your_aws_secret_key

# Optional: Local LLM provider (zero-cost)
OLLAMA_SERVER_URL=http://localhost:11434
OLLAMA_SERVER_MODEL=llama3.1:8b-instruct-q8_0

# Optional: Search engine capabilities
DUCKDUCKGO_ENABLED=true
GOOGLE_API_KEY=your_google_key
GOOGLE_CX_KEY=your_google_cx
TAVILY_API_KEY=your_tavily_key
You must set at least one Language Model provider to use PentAGI. Additional API keys for search engines are optional but recommended for better results.
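The "at least one provider" requirement can be checked before launching the stack. The sketch below is a hypothetical helper (not part of PentAGI) that greps the .env file for a non-empty value among the provider variables listed above; it checks presence, not key validity:

```shell
# check_llm_keys: succeed if at least one LLM provider key has a non-empty
# value in the given env file. Hypothetical helper; variable names match
# the .env example above.
check_llm_keys() {
  if grep -qE '^(OPEN_AI_KEY|ANTHROPIC_API_KEY|GEMINI_API_KEY)=..*' "$1"; then
    echo "ok: at least one LLM provider key is set"
  else
    echo "missing: set OPEN_AI_KEY, ANTHROPIC_API_KEY, or GEMINI_API_KEY" >&2
    return 1
  fi
}

# Usage: check_llm_keys .env
```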
Step 5: Update Security Settings

For production use, change security-related variables in .env:
.env
# Main Security Settings
COOKIE_SIGNING_SALT=your_random_salt_here
PUBLIC_URL=https://pentagi.example.com

# Database Credentials
PENTAGI_POSTGRES_USER=your_db_user
PENTAGI_POSTGRES_PASSWORD=your_secure_password
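One way to generate strong random values for these settings (assuming openssl is installed; the lengths shown are an illustrative choice, not a PentAGI requirement):

```shell
# Generate a random signing salt and database password, then print them
# in .env format so they can be pasted into the file.
COOKIE_SIGNING_SALT=$(openssl rand -hex 32)          # 64 hex characters
PENTAGI_POSTGRES_PASSWORD=$(openssl rand -base64 24) # 32 base64 characters
echo "COOKIE_SIGNING_SALT=$COOKIE_SIGNING_SALT"
echo "PENTAGI_POSTGRES_PASSWORD=$PENTAGI_POSTGRES_PASSWORD"
```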
Step 6: Download Docker Compose File

Download the Docker Compose configuration:
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose.yml
Step 7: Launch PentAGI

Start all services with Docker Compose:
docker compose up -d
This will:
  • Pull the required Docker images
  • Start the PostgreSQL database with pgvector
  • Launch the PentAGI backend and frontend
  • Start the web scraper service
  • Initialize the system
Step 8: Access the Web Interface

Open your browser and navigate to:
https://localhost:8443
Your browser will show a security warning because PentAGI uses a self-signed SSL certificate by default. This is normal for local installations. Click “Advanced” and proceed to accept the certificate.
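The services can take a moment to come up after `docker compose up -d`. A small retry helper (hypothetical, not part of PentAGI) can wait for the web UI; `curl -k` skips verification of the self-signed certificate mentioned above:

```shell
# retry: run a command until it succeeds, up to N attempts with a 1s pause.
# Hypothetical helper, e.g. to wait for the web UI:
#   retry 30 curl -ksf -o /dev/null https://localhost:8443
retry() {
  attempts=$1; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    [ "$i" -ge "$attempts" ] && return 1
    sleep 1
  done
}
```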
Log in with the default credentials, then change the default password immediately after first login!

Verify Installation

After accessing the web interface, verify your installation:
  1. Check Service Status: All containers should be running:
    docker compose ps
    
  2. View Logs: Monitor the startup logs:
    docker compose logs -f pentagi
    
  3. Test LLM Connection: Create a new assistant in the web UI and send a test message
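Step 1 above can also be scripted. The sketch below assumes jq is installed and that your Docker Compose version emits one JSON object per line from `docker compose ps --format json` with `Service` and `State` fields (Compose v2 behavior; field names may differ in other versions):

```shell
# not_running: read `docker compose ps --format json` from stdin and print
# the names of services whose State is not "running".
not_running() {
  jq -r 'select(.State != "running") | .Service'
}

# Usage:
#   docker compose ps --format json | not_running
```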

Optional: Enable Advanced Features

Langfuse Analytics (LLM Observability)

Step 1: Download Langfuse Compose File

curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-langfuse.yml
Step 2: Configure Langfuse in .env

.env
LANGFUSE_BASE_URL=http://langfuse-web:3000
LANGFUSE_PUBLIC_KEY=your_public_key
LANGFUSE_SECRET_KEY=your_secret_key
Step 3: Start Langfuse

docker compose -f docker-compose.yml -f docker-compose-langfuse.yml up -d
Access Langfuse at: http://localhost:4000

Graphiti Knowledge Graph

Step 1: Download Graphiti Compose File

curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-graphiti.yml
Step 2: Enable Graphiti in .env

.env
GRAPHITI_ENABLED=true
GRAPHITI_TIMEOUT=30
GRAPHITI_URL=http://graphiti:8000
GRAPHITI_MODEL_NAME=gpt-5-mini

# Neo4j settings
NEO4J_USER=neo4j
NEO4J_DATABASE=neo4j
NEO4J_PASSWORD=your_secure_password
NEO4J_URI=bolt://neo4j:7687
Step 3: Start Graphiti

docker compose -f docker-compose.yml -f docker-compose-graphiti.yml up -d
Access Neo4j Browser at: http://localhost:7474

Observability Stack (Grafana, Prometheus, Jaeger)

Step 1: Download Observability Compose File

curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-observability.yml
Step 2: Enable Observability in .env

.env
OTEL_HOST=otelcol:8148
Step 3: Start Observability Stack

docker compose -f docker-compose.yml -f docker-compose-observability.yml up -d
Access Grafana at: http://localhost:3000

All Stacks Together

To run PentAGI with all features enabled:
docker compose -f docker-compose.yml \
  -f docker-compose-langfuse.yml \
  -f docker-compose-graphiti.yml \
  -f docker-compose-observability.yml \
  up -d
Create shell aliases for convenience:
alias pentagi="docker compose -f docker-compose.yml -f docker-compose-langfuse.yml -f docker-compose-graphiti.yml -f docker-compose-observability.yml"
alias pentagi-up="pentagi up -d"
alias pentagi-down="pentagi down"
alias pentagi-logs="pentagi logs -f"
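Aliases only take effect in interactive shells. Docker Compose also honors the `COMPOSE_FILE` environment variable (colon-separated by default, configurable via `COMPOSE_PATH_SEPARATOR`), which achieves the same effect and works in scripts too:

```shell
# Equivalent to the repeated -f flags above: list all four compose files
# in COMPOSE_FILE, separated by ':'.
export COMPOSE_FILE=docker-compose.yml:docker-compose-langfuse.yml:docker-compose-graphiti.yml:docker-compose-observability.yml

# With this set, a plain `docker compose up -d` brings up all four stacks.
```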

Common Issues

If you see an error about pentagi-network, observability-network, or langfuse-network:
  1. First run the main docker-compose.yml to create networks
  2. Then run the additional compose files
docker compose up -d
docker compose -f docker-compose-langfuse.yml up -d
If you see permission errors accessing Docker:
Option 1 (recommended for production):
sudo docker compose up -d
Option 2 (development environments):
sudo usermod -aG docker $USER
newgrp docker
Default AWS Bedrock rate limits are very restrictive (2 requests/minute for Claude Sonnet 4). Solutions:
  • Request quota increases through AWS Service Quotas console
  • Use provisioned throughput models
  • Switch to alternative models with higher quotas
  • Use a different LLM provider (OpenAI, Anthropic, Gemini)
If you see certificate verification errors:
  1. Place your CA certificate bundle in ./pentagi-ssl/ca-bundle.pem
  2. Configure in .env:
    EXTERNAL_SSL_CA_PATH=/opt/pentagi/ssl/ca-bundle.pem
    EXTERNAL_SSL_INSECURE=false
    
  3. Restart: docker compose restart pentagi
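Before restarting, it can help to sanity-check the bundle. The sketch below assumes openssl is installed; `inspect_ca` is a hypothetical helper, and the path matches step 1 above:

```shell
# inspect_ca: print the subject and expiry of the first certificate in a
# PEM bundle; fails loudly if the file is not a valid PEM certificate.
inspect_ca() {
  openssl x509 -in "$1" -noout -subject -enddate
}

# Usage: inspect_ca ./pentagi-ssl/ca-bundle.pem
```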

Next Steps

Now that PentAGI is running:

Create Your First Assistant

Set up an AI agent for penetration testing

Configure LLM Providers

Optimize model selection for different agents

Architecture Overview

Understand how PentAGI works under the hood

Testing Utilities

Validate your LLM configuration with ctester
Important Security Reminders:
  • Change default passwords immediately
  • Never expose PentAGI directly to the internet without proper authentication
  • Always obtain authorization before testing any systems
  • Review security settings in the .env file
