Welcome to Aurora
Aurora is an automated root-cause-analysis investigation tool that uses AI agents to help Site Reliability Engineers resolve incidents. Combining AI orchestration with cloud integrations, Aurora accelerates incident response by automatically investigating issues across your infrastructure.

Aurora works without any cloud provider accounts. The only external requirement is an LLM API key. Connectors for cloud providers and third-party services are optional and can be enabled later.
Key Features
AI-Powered Root Cause Analysis
Automated investigation using LangGraph agents that analyze logs, metrics, and infrastructure state to identify root causes.
Multi-Cloud Support
Built-in connectors for AWS, GCP, Azure, and more. Optional integrations allow Aurora to investigate across your entire infrastructure.
Self-Hosted & Secure
Run entirely on your infrastructure with HashiCorp Vault for secrets management. No external dependencies except your chosen LLM provider.
Real-Time Investigation
WebSocket-based chatbot interface provides live updates as agents investigate incidents and gather information.
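As a rough illustration of consuming live updates, a client could open a WebSocket to the backend and print investigation events as they arrive. The endpoint URL, the message schema (`action`, `incident_id`, `status`, `message`), and the use of the third-party `websockets` package are assumptions for this sketch, not Aurora's actual API:

```python
import asyncio
import json

def build_subscribe_message(incident_id: str) -> str:
    # Hypothetical payload asking the chatbot to stream updates for one incident.
    # The field names here are assumptions, not Aurora's real schema.
    return json.dumps({"action": "subscribe", "incident_id": incident_id})

async def follow_incident(url: str, incident_id: str) -> None:
    # Requires the third-party "websockets" package (pip install websockets).
    import websockets

    async with websockets.connect(url) as ws:
        await ws.send(build_subscribe_message(incident_id))
        async for raw in ws:  # each message is one agent update
            event = json.loads(raw)
            print(event.get("status"), event.get("message"))

# Example (endpoint path is hypothetical):
# asyncio.run(follow_incident("ws://localhost:5000/chat", "INC-123"))
```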
Architecture Overview
Aurora consists of several key components that work together:
- Python Backend: Flask REST API and WebSocket chatbot powered by LangGraph agents
- Next.js Frontend: Modern React interface for incident management and investigation
- PostgreSQL: Primary database for incident data and configurations
- Weaviate: Vector database for semantic search and knowledge retrieval
- Redis: Message queue for Celery background tasks
- HashiCorp Vault: Secure secrets storage for API keys and credentials
- SeaweedFS: S3-compatible object storage (Apache 2.0 licensed)
- Memgraph: Graph database for infrastructure topology
All services run in Docker containers for easy deployment. See the Quickstart guide to get up and running in minutes.
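To make the component list above concrete, a Docker Compose file for this stack might look like the excerpt below. This is a sketch only: the service names, build paths, images, and ports are assumptions, not Aurora's actual `docker-compose.yml`.

```yaml
# Illustrative excerpt — not Aurora's real compose file.
services:
  backend:
    build: ./backend        # Flask REST API + WebSocket chatbot (assumed path)
    ports:
      - "5000:5000"
    depends_on:
      - postgres
      - redis
  frontend:
    build: ./frontend       # Next.js UI (assumed path)
    ports:
      - "3000:3000"
  postgres:
    image: postgres:16      # incident data and configurations
  redis:
    image: redis:7          # Celery message queue
  weaviate:
    image: semitechnologies/weaviate:latest   # vector search
  vault:
    image: hashicorp/vault:latest             # secrets storage
```

In a layout like this, only the frontend and backend ports need to be exposed; the databases and Vault can stay on the internal Compose network.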
Repository Structure
Get Started
Quickstart
Get Aurora running locally in under 5 minutes with prebuilt images.
Installation
Detailed installation instructions for development and production deployments.
Configuration
Learn how to configure LLM providers, cloud connectors, and integrations.
API Reference
Complete API documentation for building custom integrations.
What’s Next?
After getting Aurora running, you can:
- Add Cloud Connectors: Connect AWS, GCP, Azure, or other providers to enable infrastructure investigation
- Configure Integrations: Set up Slack, PagerDuty, GitHub, and other third-party services
- Customize Agents: Configure agent behavior and recursion limits for your use case
- Deploy to Production: Use Docker Compose or Kubernetes for production deployments
Open Source
Aurora is licensed under the Apache License 2.0. We welcome contributions from the community!
- GitHub: arvo-ai/aurora
- Documentation: arvo-ai.github.io/aurora
- Issues: Report bugs and request features on GitHub
Ready to get started? Head to the Quickstart guide to deploy Aurora in minutes.