Prerequisites
Before you begin, ensure you have:
- Docker and Docker Compose installed
- At least one LLM provider API key:
  - OpenAI API Key
  - Anthropic API Key
  - Google AI/Gemini API Key
  - AWS Bedrock Access
  - Or a locally running Ollama instance
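You can sanity-check the Docker prerequisite before continuing (this assumes the docker CLI is already on your PATH):

```shell
# Both commands should print a version string;
# if either fails, install Docker / the Compose plugin first
docker --version
docker compose version
```
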
For production deployments or security-sensitive environments, consider using a two-node architecture where worker operations are isolated on a separate server.
Quick Start with Docker Compose
Configure API Keys
Edit the .env file and add at least one LLM provider API key.
You must set at least one Language Model provider to use PentAGI. Additional API keys for search engines are optional but recommended for better results.
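As an illustration, a minimal .env might look like the sketch below. The variable names here are assumptions, not confirmed PentAGI keys; check the .env.example shipped with the project for the exact names.

```shell
# Illustrative .env fragment -- variable names are assumptions,
# verify them against the .env.example in your PentAGI checkout.
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
# Only one provider is required; leave the others unset.
```
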
Launch PentAGI
Start all services with Docker Compose. This will:
- Pull the required Docker images
- Start the PostgreSQL database with pgvector
- Launch the PentAGI backend and frontend
- Start the web scraper service
- Initialize the system
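The steps above are typically triggered with a single command, assuming you are in the directory containing docker-compose.yml:

```shell
# Pull images and start every service in the background
docker compose up -d
```

Running `docker compose up` without `-d` keeps logs in the foreground, which is useful on a first run.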
Access the Web Interface
Open your browser and navigate to the web interface. Default credentials:
- Email: [email protected]
- Password: admin

Change the default password immediately after first login!
Verify Installation
After accessing the web interface, verify your installation:
- Check Service Status: all containers should be running
- View Logs: monitor the startup logs
- Test LLM Connection: create a new assistant in the web UI and send a test message
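The first two checks map onto standard Compose commands (run from the directory containing docker-compose.yml):

```shell
# Check service status: every container should report "running" or "healthy"
docker compose ps

# Follow the startup logs for all services (Ctrl+C to stop following)
docker compose logs -f
```
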
Optional: Enable Advanced Features
Langfuse Analytics (LLM Observability)
Graphiti Knowledge Graph
Observability Stack (Grafana, Prometheus, Jaeger)
All Stacks Together
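Each optional stack ships as an additional Compose file that is layered on top of the base one. The file names below are assumptions; match them to the files in your checkout:

```shell
# Base services plus Langfuse analytics (file names are assumed)
docker compose -f docker-compose.yml -f docker-compose-langfuse.yml up -d

# Add the observability stack (Grafana, Prometheus, Jaeger) on top
docker compose -f docker-compose.yml \
  -f docker-compose-langfuse.yml \
  -f docker-compose-observability.yml up -d
```
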
To run PentAGI with all features enabled, start every compose stack together. Since the full commands get long, it is convenient to create shell aliases for them.
Common Issues
Network already exists error
If you see an error about pentagi-network, observability-network, or langfuse-network:
- First run the main docker-compose.yml to create the networks
- Then run the additional compose files
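This is an ordering issue: the base stack creates the shared networks that the optional stacks join. A sketch, assuming the stock file names:

```shell
# Inspect which of the shared networks already exist
docker network ls | grep -E 'pentagi|observability|langfuse'

# Start the base stack first so the shared networks get created
docker compose -f docker-compose.yml up -d

# Only then layer on the optional stacks that join those networks
docker compose -f docker-compose.yml -f docker-compose-langfuse.yml up -d
```
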
Permission denied on docker.sock
If you see permission errors accessing Docker, two fixes are available:
- Option 1 (recommended for production)
- Option 2 (development environments)
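The exact commands for each option are not reproduced here. As an assumption, the usual development-environment fix (corresponding to Option 2) is to add your user to the docker group:

```shell
# Development environments only (assumed to correspond to Option 2):
# add the current user to the docker group so the CLI can reach the socket
sudo usermod -aG docker "$USER"

# Re-evaluate group membership without logging out
newgrp docker

# Verify: this should now work without sudo
docker ps
```

For production, prefer a more scoped approach to socket access than broad group membership.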
Rate limit errors with AWS Bedrock
Default AWS Bedrock rate limits are very restrictive (2 requests/minute for Claude Sonnet 4). Solutions:
- Request quota increases through AWS Service Quotas console
- Use provisioned throughput models
- Switch to alternative models with higher quotas
- Use a different LLM provider (OpenAI, Anthropic, Gemini)
SSL/TLS certificate errors
If you see certificate verification errors:
- Place your CA certificate bundle in ./pentagi-ssl/ca-bundle.pem
- Point PentAGI at the bundle in .env
- Restart: docker compose restart pentagi
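The .env entry would resemble the fragment below. The variable name here is hypothetical, invented only for illustration; the real key is defined in PentAGI's .env.example.

```shell
# Hypothetical variable name -- look up the real key in .env.example.
# Points the backend at the custom CA bundle mounted from ./pentagi-ssl.
PENTAGI_SSL_CA_BUNDLE=./pentagi-ssl/ca-bundle.pem
```
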
Next Steps
Now that PentAGI is running:
- Create Your First Assistant: set up an AI agent for penetration testing
- Configure LLM Providers: optimize model selection for different agents
- Architecture Overview: understand how PentAGI works under the hood
- Testing Utilities: validate your LLM configuration with ctester