Endpoint
Query Parameters
All query parameters are optional except `format`. Supports the same filtering options as `/logs/api/entries`.

| Parameter | Description | Example |
| --- | --- | --- |
| `format` | Export format. Currently only `json` is supported. | `format=json` |
| `hook_id` | Filter by a specific GitHub delivery ID. | `hook_id=abc123-def456` |
| `pr_number` | Filter by pull request number. | `pr_number=123` |
| `repository` | Filter by repository name in `owner/repo` format. | `repository=my-org/my-repo` |
| `event_type` | Filter by GitHub event type. | `event_type=pull_request` |
| `github_user` | Filter by GitHub username. | `github_user=octocat` |
| `level` | Filter by log level: `DEBUG`, `INFO`, `WARNING`, or `ERROR`. | `level=ERROR` |
| `start_time` | Start time filter in ISO 8601 format. | `start_time=2025-01-30T00:00:00Z` |
| `end_time` | End time filter in ISO 8601 format. | `end_time=2025-01-30T23:59:59Z` |
| `search` | Full-text search in log messages. | `search=error` |
| `limit` | Maximum number of entries to export (1-50,000). | `limit=1000` |

Important: `limit` is capped at 50,000 entries to prevent excessive server load.

Response
Returns a `StreamingResponse` with a `Content-Disposition: attachment` header to trigger a file download.
Response Headers
- `Content-Type`: `application/json`
- `Content-Disposition`: `attachment; filename=webhook_logs_YYYYMMDD_HHMMSS.json`
JSON File Structure
- Metadata about the export
- Array of log entry objects (same structure as the `/logs/api/entries` response)

Examples
- Export all error logs
- Export logs for a specific PR
- Export logs for a repository within a time range
- Export logs matching a search term
- Export the complete webhook history
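The examples above reduce to different query strings against the same endpoint. The sketch below builds those query strings from the documented parameters; note that the base URL and the `/logs/api/export` path are assumptions for illustration (this page does not state the endpoint path), while the parameter names all come from the table above.

```python
from urllib.parse import urlencode

# Hypothetical base URL and endpoint path -- adjust to your deployment.
BASE = "https://example.com/logs/api/export"

def export_url(**params):
    """Build an export URL; `format` is the only required parameter."""
    params.setdefault("format", "json")
    return f"{BASE}?{urlencode(params)}"

# Export all error logs
print(export_url(level="ERROR"))
# Export logs for a specific PR
print(export_url(pr_number=123))
# Export logs for a repository within a time range
print(export_url(repository="my-org/my-repo",
                 start_time="2025-01-30T00:00:00Z",
                 end_time="2025-01-30T23:59:59Z"))
# Export logs matching a search term
print(export_url(search="error"))
# Export the complete webhook history (up to the 50,000-entry cap)
print(export_url(limit=50000))
```

Pass any combination of the documented filters; they compose the same way they do on `/logs/api/entries`.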
Export Limits
Maximum Export Size: 50,000 entries

The export endpoint enforces a maximum limit of 50,000 entries to prevent server overload and excessive memory usage. If you need to export more data:
- Use more specific filters to reduce the result set
- Export data in multiple batches using time ranges
- Use pagination with multiple export requests
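Batching by time range can be sketched as follows. The windowing helper is illustrative (not part of the API); it emits `start_time`/`end_time` pairs in the ISO 8601 format the endpoint expects, one pair per export request.

```python
from datetime import datetime, timedelta, timezone

def time_windows(start, end, step):
    """Yield (start_time, end_time) ISO 8601 pairs covering [start, end)."""
    cur = start
    while cur < end:
        nxt = min(cur + step, end)
        yield (cur.strftime("%Y-%m-%dT%H:%M:%SZ"),
               nxt.strftime("%Y-%m-%dT%H:%M:%SZ"))
        cur = nxt

day = datetime(2025, 1, 30, tzinfo=timezone.utc)
# Split one day into four 6-hour batches; each line is the query string
# for one export request, kept under the 50,000-entry cap.
for s, e in time_windows(day, day + timedelta(days=1), timedelta(hours=6)):
    print(f"format=json&start_time={s}&end_time={e}&limit=50000")
```

Shrink the window size if any single batch still exceeds the 50,000-entry limit.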
Processing Limits
- Without filters: processes up to `limit + 1000` entries
- With filters: processes up to `min(limit + 20000, 100000)` entries
- Maximum log files examined: the 25 most recent files
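The first two caps can be restated as a small helper, a sketch of the arithmetic above rather than an API exposed by the service:

```python
def processed_entry_cap(limit: int, has_filters: bool) -> int:
    """Upper bound on entries the server scans for one export request."""
    if has_filters:
        # Filtered queries scan extra entries to fill the result set,
        # but never more than 100,000 in total.
        return min(limit + 20_000, 100_000)
    return limit + 1_000

print(processed_entry_cap(1_000, has_filters=False))   # 2000
print(processed_entry_cap(50_000, has_filters=True))   # 70000
```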
Performance Considerations
- Streaming architecture: Memory-efficient processing for large exports
- Early filtering: Applies filters during log parsing to minimize memory usage
- Automatic cleanup: Releases processed data immediately after transmission
- Response time: Sub-second to several seconds depending on filter complexity and dataset size
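The same streaming idea applies on the client side: write the response body to disk chunk by chunk instead of buffering the whole export in memory. A minimal sketch, where the generator stands in for a real streamed HTTP body (e.g. the chunk iterator of your HTTP client):

```python
import os
import tempfile

def save_export(chunks, path):
    """Write an iterable of byte chunks to `path` without buffering them all."""
    total = 0
    with open(path, "wb") as f:
        for chunk in chunks:
            f.write(chunk)
            total += len(chunk)
    return total  # bytes written

# Stand-in for a streamed response body from the export endpoint.
fake_body = (b'{"chunk": %d}' % i for i in range(3))
path = os.path.join(tempfile.gettempdir(), "webhook_logs_demo.json")
print(save_export(fake_body, path))
```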
Error Responses
- Invalid parameters (e.g., an unsupported format)
- Result set exceeds the maximum export limit
- Export generation failed
Use Cases
- Offline analysis: Export logs for analysis with external tools
- Archiving: Create backups of webhook processing logs
- Compliance: Export logs for audit trails and compliance reporting
- Debugging: Download logs for detailed investigation of issues
- Data processing: Import logs into analytics platforms or databases
- Incident response: Export error logs for post-mortem analysis