How Alert Ingestion Works
When an alert fires:

- Observability platform sends a webhook to Aurora
- Aurora creates an incident in the `incidents` table
- A Celery background task starts the RCA investigation
- Alert details are stored in source-specific tables (`datadog_events`, `grafana_alerts`, etc.)
- The LangGraph agent analyzes the alert and executes diagnostic tools
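The steps above can be sketched as a minimal handler. This is an illustrative model only — the function and table names (`handle_webhook`, `run_rca`, the fake in-memory DB) are assumptions, not Aurora's actual internals:

```python
# Illustrative sketch of the ingestion steps; names and storage are
# assumptions, not Aurora's actual implementation.
from dataclasses import dataclass, field

@dataclass
class FakeDB:
    incidents: list = field(default_factory=list)
    source_events: dict = field(default_factory=dict)

db = FakeDB()

def handle_webhook(source: str, payload: dict) -> int:
    """Mirror the flow: create incident, store raw event, queue RCA."""
    # Aurora creates a row in the `incidents` table.
    incident_id = len(db.incidents) + 1
    db.incidents.append({"id": incident_id,
                         "title": payload.get("alert_title", "unknown")})
    # Raw alert details go to a source-specific table
    # (datadog_events, grafana_alerts, ...).
    db.source_events.setdefault(f"{source}_events", []).append(payload)
    # A Celery task would start the LangGraph RCA here, e.g.:
    #     run_rca.delay(incident_id)
    return incident_id

print(handle_webhook("datadog", {"alert_title": "High CPU"}))  # → 1
```

The real pipeline hands the incident ID to a Celery worker so the webhook response returns immediately while the investigation runs in the background.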
Datadog
Create a webhook integration
In Datadog:
- Go to Integrations → Webhooks
- Add a new webhook:
- Name: Aurora RCA
- URL: `https://your-aurora-url/api/datadog/webhook`
- Payload: Default (Aurora parses standard Datadog format)
Add to monitor notifications
Edit your monitors to notify the webhook in the monitor message, or set up a global notification rule for all critical alerts.
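Datadog notifies a webhook when the monitor message @-mentions it as `@webhook-<name>`. For example (the monitor text and a hyphenated webhook name `aurora-rca` are illustrative — avoid spaces in the webhook name so the mention resolves):

```
{{#is_alert}}
CPU usage is high on {{host.name}}.
{{/is_alert}}
@webhook-aurora-rca
```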
Datadog Metadata
Aurora extracts:

- Alert ID: `alert_id` from payload
- Severity: Mapped from `alert_status` (Alert → high, Warn → medium)
- Service: Parsed from tags or `host`
- Source URL: Built using Datadog subdomain from OAuth settings
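The severity mapping described above can be sketched as follows (the fallback for unlisted statuses is an assumption, not Aurora's documented behavior):

```python
# Sketch of the Datadog alert_status → severity mapping described above.
def map_datadog_severity(alert_status: str) -> str:
    mapping = {"Alert": "high", "Warn": "medium"}
    return mapping.get(alert_status, "low")  # fallback is an assumption

print(map_datadog_severity("Alert"))  # → high
print(map_datadog_severity("Warn"))   # → medium
```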
Grafana
Add a webhook contact point
In Grafana:
- Go to Alerting → Contact points
- Create a new contact point:
- Type: Webhook
- URL: `https://your-aurora-url/api/grafana/webhook`
- HTTP Method: POST
Create a notification policy
Route alerts to the Aurora contact point:
- Match all alerts, or filter by label (e.g., `severity="critical"`)
Grafana Alert Format
Grafana sends alerts in the Alertmanager webhook format. Aurora uses the `fingerprint` field as the source alert ID.
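For reference, an abridged Alertmanager-style payload looks like this (the label and annotation values are illustrative):

```json
{
  "status": "firing",
  "alerts": [
    {
      "status": "firing",
      "labels": { "alertname": "HighErrorRate", "severity": "critical" },
      "annotations": { "summary": "Error rate above 5%" },
      "startsAt": "2024-01-01T12:00:00Z",
      "generatorURL": "https://your-grafana-url/alerting/grafana/...",
      "fingerprint": "a1b2c3d4e5f6"
    }
  ]
}
```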
PagerDuty
Aurora supports PagerDuty via OAuth for deeper integration.

Connect your PagerDuty account
- Go to Settings → Integrations → PagerDuty
- Click “Connect”
- Authorize Aurora to access incidents and escalation policies
Configure a webhook extension
In PagerDuty:
- Go to Services → Select a service → Integrations
- Add a new “Generic Webhook” extension
- Webhook URL: `https://your-aurora-url/api/pagerduty/webhook`
- Enable for incident triggers and updates
PagerDuty Features
Aurora consolidates all PagerDuty events (triggers and updates) for an incident into a single record.

Netdata
Netdata Alert Structure
Netdata provides rich context:

- Chart: Which metric triggered the alert
- Host: Affected system
- Value: Current metric value vs. threshold

Alert identifiers follow the format `{alert_name}:{host}:{chart}`.
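Assuming the Netdata payload exposes `name`, `host`, and `chart` keys (an assumption about field names, not Netdata's exact schema), the identifier can be built as:

```python
# Build the Netdata alert identifier {alert_name}:{host}:{chart}.
# Payload field names here are assumptions.
def netdata_alert_id(payload: dict) -> str:
    return f"{payload['name']}:{payload['host']}:{payload['chart']}"

print(netdata_alert_id({"name": "cpu_usage", "host": "web-01",
                        "chart": "system.cpu"}))  # → cpu_usage:web-01:system.cpu
```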
Dynatrace
Create a webhook notification
In Dynatrace:
- Go to Settings → Integration → Problem notifications
- Add a custom integration:
- Webhook URL: `https://your-aurora-url/api/dynatrace/webhook`
- Call webhook when: Problem opened
Splunk
Set up a webhook alert action
In Splunk:
- Create or edit a saved search
- Add a “Trigger Actions” → Webhook
- URL: `https://your-aurora-url/api/splunk/webhook`
- Include alert metadata in the payload
Custom Observability Tools
For tools not listed above, send alerts to Aurora's generic webhook endpoint.

Alert Correlation
Aurora automatically correlates related alerts:

- Service match: Same service name
- Time window: Within 5-15 minutes
- Semantic similarity: Embedded alert titles using Weaviate

Correlated alerts are stored in the `incident_alerts` table, not as separate incidents.
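A simplified version of the first two checks looks like this (the Weaviate semantic-similarity step is omitted; the 15-minute window matches the upper bound above, and the data shapes are assumptions):

```python
from datetime import datetime, timedelta

# Simplified correlation covering the service-match and time-window
# signals above; the Weaviate embedding similarity step is omitted.
def correlates(existing: dict, incoming: dict,
               window: timedelta = timedelta(minutes=15)) -> bool:
    same_service = existing["service"] == incoming["service"]
    close_in_time = abs(existing["fired_at"] - incoming["fired_at"]) <= window
    return same_service and close_in_time

a = {"service": "checkout", "fired_at": datetime(2024, 1, 1, 12, 0)}
b = {"service": "checkout", "fired_at": datetime(2024, 1, 1, 12, 10)}
c = {"service": "search",   "fired_at": datetime(2024, 1, 1, 12, 5)}
print(correlates(a, b))  # → True  (same service, within 15 min)
print(correlates(a, c))  # → False (different service)
```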
Alert Source URLs
Aurora generates deep links back to your observability platform.

Webhook Security
Enable rate limiting on the webhook endpoints.

Webhook Payloads
Aurora stores the full webhook payload for forensics.

Troubleshooting
Webhooks not received
Check:
- Aurora server is accessible from the internet (or observability tool network)
- Webhook URL is correct (include `/api/{source}/webhook`)
- Firewall allows inbound HTTPS on port 443
Incidents not created
Check the Aurora server logs for errors. Common issues:
- Invalid JSON payload
- Missing required fields (alert_title, severity, etc.)
- Database connection error
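A minimal payload satisfying the required fields might look like this (the field set is inferred from the errors above; `service` and `source_url` are assumptions, and the exact schema may differ):

```json
{
  "alert_title": "High CPU on web-01",
  "severity": "high",
  "service": "web",
  "source_url": "https://your-tool.example.com/alerts/123"
}
```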
RCA not starting
Check that the Celery worker is running. Verify:
- Redis is running (Celery broker)
- LLM API keys are configured
- Cloud provider credentials are valid
Wrong severity or service
Aurora infers severity and service from the alert payload. Adjust the mappings, or include explicit severity and service fields in the webhook payload.
Next Steps
First Investigation
Run your first incident investigation
Custom Connectors
Build integrations for proprietary tools