Overview
This guide walks you through manually installing PentAGI without using the interactive installer. Manual installation provides more control over the configuration process and is useful for advanced deployments or custom environments.
Prerequisites
Before starting the manual installation, ensure you have:
Docker and Docker Compose installed
Minimum 2 vCPU
Minimum 4GB RAM
20GB free disk space
Internet access for downloading images
At least one LLM provider API key (OpenAI, Anthropic, Gemini, Bedrock, or Ollama)
Installation Steps
Create working directory
mkdir pentagi && cd pentagi
Download environment template
Download the .env.example file as .env:

curl -o .env https://raw.githubusercontent.com/vxcontrol/pentagi/master/.env.example
Download provider configuration examples
These files provide templates for custom LLM configurations:

curl -o example.custom.provider.yml https://raw.githubusercontent.com/vxcontrol/pentagi/master/examples/configs/custom-openai.provider.yml
curl -o example.ollama.provider.yml https://raw.githubusercontent.com/vxcontrol/pentagi/master/examples/configs/ollama-llama318b.provider.yml
Configure LLM providers
Edit the .env file and add API keys for at least one LLM provider: OpenAI, Anthropic, Google Gemini, AWS Bedrock, Ollama (local), or a custom OpenAI-compatible endpoint. For example, for a custom OpenAI-compatible provider:

OPEN_AI_KEY=your_openai_key
OPEN_AI_SERVER_URL=https://api.openai.com/v1
You must configure at least one LLM provider for PentAGI to function. Without a configured provider, the system will not be able to perform AI-powered penetration testing.
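Before starting the stack, you can sanity-check that at least one provider key is actually set in .env. A minimal sketch; variable names other than OPEN_AI_KEY are assumptions, so match them to the keys in your .env.example:

```shell
# has_provider_key FILE: succeed if FILE defines a non-empty LLM key.
# Key names besides OPEN_AI_KEY are assumptions; adjust to your .env.example.
has_provider_key() {
  grep -qE '^(OPEN_AI_KEY|ANTHROPIC_API_KEY|GEMINI_API_KEY)=..*' "$1"
}

# Usage: has_provider_key .env && echo "LLM provider configured"
```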
Configure search engines (optional)
Add search engine API keys for enhanced information gathering:

# DuckDuckGo (no API key required)
DUCKDUCKGO_ENABLED=true
# Google Custom Search
GOOGLE_API_KEY=your_google_key
GOOGLE_CX_KEY=your_google_cx
# Tavily
TAVILY_API_KEY=your_tavily_key
# Traversaal
TRAVERSAAL_API_KEY=your_traversaal_key
# Perplexity
PERPLEXITY_API_KEY=your_perplexity_key
PERPLEXITY_MODEL=sonar-pro
# SearXNG (self-hosted)
SEARXNG_URL=http://your-searxng-instance:8080
Configure security settings
Update security-related environment variables with strong, random values, generated with openssl:

# Generate cookie signing salt (64 hex characters)
openssl rand -hex 32
# Generate encryption key (32 bytes in hex)
openssl rand -hex 32
# Generate database passwords
openssl rand -base64 32
Never use default passwords in production. Generate strong, unique passwords for all services.
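The three commands above can be combined into a single pass that captures each value in a shell variable, ready to paste into .env (the variable names here are illustrative, not required by PentAGI):

```shell
# Capture each secret in a variable, then paste the values into .env.
# openssl rand -hex 32 yields 64 hex characters (32 random bytes).
cookie_salt=$(openssl rand -hex 32)
encryption_key=$(openssl rand -hex 32)
db_password=$(openssl rand -base64 32)

printf 'salt:    %s\n' "$cookie_salt"
printf 'enc key: %s\n' "$encryption_key"
printf 'db pass: %s\n' "$db_password"
```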
Configure Graphiti knowledge graph (optional)
Enable the Graphiti knowledge graph for semantic memory:

# Enable Graphiti
GRAPHITI_ENABLED=true
GRAPHITI_TIMEOUT=30
GRAPHITI_URL=http://graphiti:8000
GRAPHITI_MODEL_NAME=gpt-5-mini
# Neo4j settings (required by Graphiti)
NEO4J_USER=neo4j
NEO4J_DATABASE=neo4j
NEO4J_PASSWORD=<random_password>
NEO4J_URI=bolt://neo4j:7687
# Graphiti requires an OpenAI API key
OPEN_AI_KEY=your_openai_key
Configure assistant settings (optional)
Set default behavior for AI assistants:

# Default value for agent delegation
ASSISTANT_USE_AGENTS=false
Remove inline comments (for IDE compatibility)
If using the .env file with VSCode or other IDEs, strip the inline comments:

perl -i -pe 's/\s+#.*$//' .env
This step removes inline comments that may cause issues with some environment file parsers.
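To see what the substitution does before touching your real .env, you can run it against a throwaway file first (the filename and contents are arbitrary):

```shell
# Demonstrate the comment-stripping on a sample file:
# inline comments (whitespace before '#') are removed,
# full-line comments starting at column 0 are kept.
printf 'KEY=value  # inline comment\n# full-line comment kept\nOTHER=1\n' > sample.env
perl -i -pe 's/\s+#.*$//' sample.env
cat sample.env
```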
Download Docker Compose configuration
Download the main docker-compose.yml file:

curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose.yml
Start PentAGI
Launch the PentAGI stack:

docker compose up -d

This will start the core services:
Frontend UI (React + TypeScript)
Backend API (Go + GraphQL)
PostgreSQL with pgvector
Task queue and AI agents
Accessing PentAGI
Once the containers are running, open https://localhost:8443 in your browser and log in with the default credentials. Change the default password immediately after your first login.
Optional Services
Langfuse (LLM Observability)
For advanced LLM monitoring and analytics:
Configure Langfuse environment variables
Add to your .env file:

# Langfuse database credentials
LANGFUSE_POSTGRES_USER=langfuse
LANGFUSE_POSTGRES_PASSWORD=<random_password>
LANGFUSE_CLICKHOUSE_USER=default
LANGFUSE_CLICKHOUSE_PASSWORD=<random_password>
LANGFUSE_REDIS_AUTH=<random_password>
# Encryption keys
LANGFUSE_SALT=<random_salt>
LANGFUSE_ENCRYPTION_KEY=<32_byte_hex>
LANGFUSE_NEXTAUTH_SECRET=<random_secret>
# Admin credentials
LANGFUSE_INIT_USER_EMAIL=[email protected]
LANGFUSE_INIT_USER_PASSWORD=<admin_password>
LANGFUSE_INIT_USER_NAME=Admin
# API keys (auto-generated)
LANGFUSE_INIT_PROJECT_PUBLIC_KEY=<public_key>
LANGFUSE_INIT_PROJECT_SECRET_KEY=<secret_key>
# S3 storage (MinIO)
LANGFUSE_S3_ACCESS_KEY_ID=<access_key>
LANGFUSE_S3_SECRET_ACCESS_KEY=<secret_key>
# Enable integration with PentAGI
LANGFUSE_BASE_URL=http://langfuse-web:3000
LANGFUSE_PUBLIC_KEY=<from_above>
LANGFUSE_SECRET_KEY=<from_above>
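The random placeholder values can be generated in one pass. This sketch writes them to a scratch file to copy from; the key names mirror the list above, while the project API keys and S3 credentials are left for you to fill in:

```shell
# Generate random Langfuse credentials and collect them in a scratch file.
{
  for var in LANGFUSE_POSTGRES_PASSWORD LANGFUSE_CLICKHOUSE_PASSWORD \
             LANGFUSE_REDIS_AUTH LANGFUSE_NEXTAUTH_SECRET; do
    printf '%s=%s\n' "$var" "$(openssl rand -base64 32)"
  done
  printf 'LANGFUSE_SALT=%s\n' "$(openssl rand -hex 32)"
  printf 'LANGFUSE_ENCRYPTION_KEY=%s\n' "$(openssl rand -hex 32)"
} > langfuse-secrets.env
```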
Download Langfuse compose file
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-langfuse.yml
Start Langfuse services
docker compose -f docker-compose.yml -f docker-compose-langfuse.yml up -d
Access Langfuse UI
Visit http://localhost:4000 and log in with your configured credentials.
Graphiti (Knowledge Graph)
For semantic relationship tracking:
Download Graphiti compose file
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-graphiti.yml
Start Graphiti services
docker compose -f docker-compose.yml -f docker-compose-graphiti.yml up -d
Verify Graphiti is running
docker compose logs -f graphiti
Neo4j browser available at: http://localhost:7474
Observability Stack
For comprehensive monitoring:
Enable OpenTelemetry in .env
Download observability compose file
curl -O https://raw.githubusercontent.com/vxcontrol/pentagi/master/docker-compose-observability.yml
Start observability services
docker compose -f docker-compose.yml -f docker-compose-observability.yml up -d
Access Grafana
Visit http://localhost:3000 for dashboards showing:
System metrics (VictoriaMetrics)
Distributed traces (Jaeger)
Log aggregation (Loki)
Run All Services
To start PentAGI with all optional services:
docker compose -f docker-compose.yml \
-f docker-compose-langfuse.yml \
-f docker-compose-graphiti.yml \
-f docker-compose-observability.yml \
up -d
Create shell aliases for easier management:

alias pentagi="docker compose -f docker-compose.yml -f docker-compose-langfuse.yml -f docker-compose-graphiti.yml -f docker-compose-observability.yml"
alias pentagi-up="pentagi up -d"
alias pentagi-down="pentagi down"
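An alternative to aliases is Docker Compose's COMPOSE_FILE environment variable, which takes a colon-separated list of compose files; after exporting it, plain docker compose commands act on the whole stack:

```shell
# With COMPOSE_FILE set, every `docker compose` invocation in this shell
# uses all four files, with no -f flags needed.
export COMPOSE_FILE=docker-compose.yml:docker-compose-langfuse.yml:docker-compose-graphiti.yml:docker-compose-observability.yml

# e.g.  docker compose up -d   /   docker compose down
```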
Network Requirements
If you encounter network errors, create the required Docker networks first:

docker network create pentagi-network
docker network create observability-network
docker network create langfuse-network
Then run the compose files again.
Verification
Verify your installation:
Check container status
docker compose ps
All containers should show "running" status.
Check logs for errors
docker compose logs -f pentagi
Access web interface
Open https://localhost:8443 in your browser and login.
Verify LLM provider connection
Create a test assistant in the UI to verify your configured LLM provider is working.
Troubleshooting
Check logs for the specific container:

docker compose logs <container_name>
Common issues:
Missing or invalid API keys
Port conflicts (check with netstat -tlnp)
Insufficient resources
Cannot connect to LLM provider
Verify:
API key is correct and active
No typos in environment variables
Network connectivity to provider’s API
Proxy settings if applicable (PROXY_URL)
Database connection errors
Check PostgreSQL status:

docker compose logs pgvector

Ensure the database credentials in your .env file match across services.
SSL Certificates
The default configuration uses self-signed certificates. To use custom certificates:
Place your certificates in a secure location
Update .env:
SERVER_SSL_CRT=/path/to/cert.pem
SERVER_SSL_KEY=/path/to/key.pem
Mount certificates in docker-compose.yml
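One way to mount the certificates without editing docker-compose.yml itself is a compose override file. The service name pentagi and the host paths below are assumptions, so align them with your actual compose file and SERVER_SSL_* values:

```yaml
# docker-compose.override.yml (picked up automatically by docker compose)
services:
  pentagi:                                    # assumed service name
    volumes:
      - ./certs/cert.pem:/path/to/cert.pem:ro   # host path : container path
      - ./certs/key.pem:/path/to/key.pem:ro
```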
Updating PentAGI
To update to the latest version:
# Pull latest images
docker compose pull
# Restart services
docker compose up -d
Next Steps
First Pentest: run your first penetration test
Custom Assistants: create custom AI assistants
Distributed Setup: configure a two-node deployment
Best Practices: security and ethical guidelines