Endpoint
Query Parameters
All query parameters are optional. Use them to filter and paginate results.

- hook_id: Filter by a specific GitHub delivery ID (the x-github-delivery header value). Example: hook_id=abc123-def456
- pr_number: Filter by pull request number. Example: pr_number=123
- repository: Filter by repository name in owner/repo format. Example: repository=my-org/my-repo
- event_type: Filter by GitHub event type (e.g., pull_request, check_run, push). Example: event_type=pull_request
- github_user: Filter by GitHub username (the api_user field). Example: github_user=octocat
- level: Filter by log level. Allowed values: DEBUG, INFO, WARNING, ERROR. Example: level=ERROR
- start_time: Start-time filter in ISO 8601 format. Example: start_time=2025-01-30T10:00:00Z
- end_time: End-time filter in ISO 8601 format. Example: end_time=2025-01-30T12:00:00Z
- search: Full-text search of log messages (case-insensitive). Example: search=signature%20verification
- limit: Maximum number of entries to return (1-10,000). Example: limit=50
- offset: Pagination offset (number of entries to skip). Example: offset=100

Response
The response includes:

- An array of log entry objects matching the applied filters.
- The number of log entries examined during this request. This may be an integer, or a string with a "+" suffix (e.g., "50000+") indicating that the streaming process reached its maximum processing limit and more entries exist.
- The minimum number of entries matching the current filters, calculated as len(entries) + offset. This is a definitive lower bound on the number of matches, useful for pagination calculations and for "showing X of at least Y results" messages.
- A statistical estimate of the total number of log entries across all log files (including rotated logs), giving context about overall dataset size for UI statistics and capacity planning. It is based on sampling the first 10 log files to balance accuracy with performance. Examples: "1.2M", "45.3K", "892"
- An echo of the requested limit parameter.
- An echo of the requested offset parameter.
- A flag indicating whether the streaming process examined all available logs (false) or stopped at the processing limit (true).

Examples
Get recent errors for a specific PR
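A query for this case can be built with Python's standard library. The /logs base path below is a placeholder, since this page does not show the endpoint URL:

```python
from urllib.parse import urlencode

BASE = "/logs"  # placeholder path; substitute the real endpoint

# Most recent error-level entries for pull request #123.
params = {"pr_number": 123, "level": "ERROR", "limit": 20}
print(f"{BASE}?{urlencode(params)}")
# /logs?pr_number=123&level=ERROR&limit=20
```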
Search for specific text in logs
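A sketch of building the search query (placeholder /logs path as above). Passing quote_via=quote makes urlencode percent-encode the space as %20, matching the search=signature%20verification form shown in the parameter list:

```python
from urllib.parse import urlencode, quote

BASE = "/logs"  # placeholder path; substitute the real endpoint

# Case-insensitive full-text search of log messages.
params = {"search": "signature verification", "limit": 50}
print(f"{BASE}?{urlencode(params, quote_via=quote)}")
# /logs?search=signature%20verification&limit=50
```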
Filter by repository and time range
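One way to combine the repository filter with a time window (placeholder /logs path). Note that urlencode percent-encodes the "/" and ":" characters; the server decodes them back:

```python
from urllib.parse import urlencode

BASE = "/logs"  # placeholder path; substitute the real endpoint

# All entries for one repository within a two-hour window.
params = {
    "repository": "my-org/my-repo",
    "start_time": "2025-01-30T10:00:00Z",
    "end_time": "2025-01-30T12:00:00Z",
}
print(f"{BASE}?{urlencode(params)}")
```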
Pagination example
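A paging sketch (placeholder /logs path): keep limit fixed and advance offset by the page size. In practice you would stop when a request returns fewer entries than limit:

```python
from urllib.parse import urlencode

BASE = "/logs"  # placeholder path; substitute the real endpoint

limit = 100
for page in range(3):
    # offset skips the entries already fetched on earlier pages
    params = {"repository": "my-org/my-repo", "limit": limit, "offset": page * limit}
    print(f"{BASE}?{urlencode(params)}")
```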
Track specific webhook delivery
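To follow a single delivery end to end, filter on its x-github-delivery value (placeholder /logs path):

```python
from urllib.parse import urlencode

BASE = "/logs"  # placeholder path; substitute the real endpoint

# Every log entry produced while processing one webhook delivery.
params = {"hook_id": "abc123-def456"}
print(f"{BASE}?{urlencode(params)}")
# /logs?hook_id=abc123-def456
```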
Performance Notes
- Memory-efficient streaming: Handles log files of any size with constant memory footprint
- Early filtering: Applies filters during parsing to reduce memory usage
- Sub-second responses: Optimized for fast queries even with large datasets
- Processing limits:
- Without filters: processes up to 20,000 entries
- With filters: processes up to 50,000 entries
- Maximum 25 most recent log files examined
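The pattern these notes describe can be sketched as a generator that filters while it parses and stops at a processing cap. This is an illustration only, not the service's actual code; the entry format and function names here are assumptions:

```python
from typing import Iterable, Iterator, Optional

MAX_UNFILTERED = 20_000  # cap when no filters are applied (per the notes above)
MAX_FILTERED = 50_000    # cap when filters are applied

def stream_entries(lines: Iterable[str], level: Optional[str] = None) -> Iterator[dict]:
    """Yield matching entries one at a time, never holding the full log in memory."""
    cap = MAX_FILTERED if level else MAX_UNFILTERED
    examined = 0
    for line in lines:
        if examined >= cap:
            break  # truncated: the caller would report a "+"-suffixed count
        examined += 1
        # Hypothetical line format: "<LEVEL> <message>"
        entry = {"level": line.split(" ", 1)[0], "message": line}
        if level and entry["level"] != level:
            continue  # early filtering: discard before accumulating anything
        yield entry
```

Because entries are yielded lazily and non-matches are dropped during parsing, memory use stays constant regardless of log size, and the cap bounds total work per request.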
Error Responses
Invalid parameters (e.g., limit out of range, negative offset)
File access errors or internal server errors
Use Cases
- Debugging: Find error logs for specific PRs or webhooks
- Monitoring: Track webhook processing for repositories
- Analysis: Search for specific events or patterns
- Auditing: Review historical webhook activity
- Troubleshooting: Investigate failures by filtering on error levels