
Endpoint

GET /logs/api/export
Exports filtered webhook logs as a downloadable JSON file with metadata. Uses memory-efficient streaming for large datasets.
Security Warning: This endpoint is NOT protected by authentication. Never expose it outside your trusted network: deploy it behind a reverse proxy with authentication, restrict access with firewall rules, and never expose log viewer ports directly to the internet. Exported files may contain GitHub tokens, user information, repository details, and sensitive webhook payloads.

Query Parameters

All query parameters except format are optional. The endpoint supports the same filtering options as /logs/api/entries.
format
string
required
Export format. Currently only json is supported.
Example: format=json
hook_id
string
Filter by specific GitHub delivery ID.
Example: hook_id=abc123-def456
pr_number
integer
Filter by pull request number.
Example: pr_number=123
repository
string
Filter by repository name in owner/repo format.
Example: repository=my-org/my-repo
event_type
string
Filter by GitHub event type.
Example: event_type=pull_request
github_user
string
Filter by GitHub username.
Example: github_user=octocat
level
string
Filter by log level: DEBUG, INFO, WARNING, ERROR.
Example: level=ERROR
start_time
string
Start time filter in ISO 8601 format.
Example: start_time=2025-01-30T00:00:00Z
end_time
string
End time filter in ISO 8601 format.
Example: end_time=2025-01-30T23:59:59Z
search
string
Full-text search in log messages.
Example: search=error
limit
integer
default: 10000
Maximum number of entries to export (1-50,000).
Important: The maximum limit is 50,000 entries to prevent excessive server load.
Example: limit=1000
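The parameter rules above can be enforced client-side before a request is sent. This is a hypothetical helper, not part of the API; it simply mirrors the documented constraints (json-only format, 1-50,000 limit, the four log levels):

```python
# Hypothetical client-side helper mirroring the documented parameter rules.
def build_export_params(level=None, limit=10000, fmt="json"):
    if fmt != "json":
        raise ValueError("Only 'json' export format is supported")
    # The server caps exports at 50,000 entries; clamp client-side too.
    limit = max(1, min(int(limit), 50000))
    params = {"format": fmt, "limit": limit}
    if level is not None:
        if level not in {"DEBUG", "INFO", "WARNING", "ERROR"}:
            raise ValueError(f"Invalid level: {level}")
        params["level"] = level
    return params

print(build_export_params(level="ERROR", limit=99999))
# {'format': 'json', 'limit': 50000, 'level': 'ERROR'}
```

The resulting dict can be passed directly as the `params` argument of an HTTP client call.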

Response

Returns a StreamingResponse with Content-Disposition: attachment header to trigger file download.

Response Headers

  • Content-Type: application/json
  • Content-Disposition: attachment; filename=webhook_logs_YYYYMMDD_HHMMSS.json
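If you want downloads to keep the server's timestamped filename rather than a hard-coded one, the name can be recovered from the Content-Disposition header. A minimal sketch, assuming the unquoted `filename=` form shown above:

```python
import re

# Recover the server-supplied filename from a Content-Disposition header.
def filename_from_disposition(header):
    match = re.search(r'filename=([^;]+)', header)
    return match.group(1).strip().strip('"') if match else None

header = "attachment; filename=webhook_logs_20250130_123456.json"
print(filename_from_disposition(header))
# webhook_logs_20250130_123456.json
```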

JSON File Structure

export_metadata
object
Metadata about the export
log_entries
array
Array of log entry objects (same structure as /logs/api/entries response)
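A minimal sketch of consuming an export file with this structure; the inline payload is abbreviated from the example response further down and stands in for a downloaded file:

```python
import json

# Abbreviated sample of an exported file (same shape as the full example).
export = json.loads("""
{
  "export_metadata": {"total_entries": 42, "export_format": "json"},
  "log_entries": [{"level": "ERROR", "message": "API rate limit exceeded"}]
}
""")

meta = export["export_metadata"]
print(f"{meta['total_entries']} entries exported as {meta['export_format']}")
# Filter the entries like any list of dicts, e.g. keep only errors:
errors = [e for e in export["log_entries"] if e["level"] == "ERROR"]
```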

Examples

Export all error logs

curl "http://localhost:5000/logs/api/export?format=json&level=ERROR" -o errors.json
{
  "export_metadata": {
    "generated_at": "2025-01-30T12:34:56.789000Z",
    "filters_applied": {
      "level": "ERROR",
      "limit": 10000
    },
    "total_entries": 42,
    "export_format": "json"
  },
  "log_entries": [
    {
      "timestamp": "2025-01-30T10:30:00.123000",
      "level": "ERROR",
      "logger_name": "PullRequestHandler",
      "message": "Failed to assign reviewers: API rate limit exceeded",
      "hook_id": "abc123-def456",
      "event_type": "pull_request",
      "repository": "my-org/my-repo",
      "pr_number": 123,
      "github_user": "octocat"
    }
  ]
}

Export logs for specific PR

curl "http://localhost:5000/logs/api/export?format=json&pr_number=123&limit=5000" -o pr-123-logs.json

Export logs for repository within time range

curl "http://localhost:5000/logs/api/export?format=json&repository=my-org/my-repo&start_time=2025-01-30T00:00:00Z&end_time=2025-01-30T23:59:59Z" -o daily-logs.json

Export logs matching search term

curl "http://localhost:5000/logs/api/export?format=json&search=rate%20limit&limit=1000" -o rate-limit-logs.json

Export complete history for a specific webhook delivery

curl "http://localhost:5000/logs/api/export?format=json&hook_id=abc123-def456" -o webhook-abc123.json

Export Limits

Maximum Export Size: 50,000 entries
The export endpoint enforces a maximum limit of 50,000 entries to prevent server overload and excessive memory usage. If you need to export more data:
  1. Use more specific filters to reduce the result set
  2. Export data in multiple batches using time ranges
  3. Use pagination with multiple export requests
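Option 2 above can be sketched as follows: split the overall range into one-day windows and issue one export request per window. The helper name is illustrative, not part of the API:

```python
from datetime import datetime, timedelta

# Split [start, end) into one-day windows, each formatted as the
# ISO 8601 strings expected by start_time/end_time.
def daily_windows(start, end):
    windows = []
    day = start
    while day < end:
        nxt = min(day + timedelta(days=1), end)
        windows.append((day.strftime("%Y-%m-%dT%H:%M:%SZ"),
                        nxt.strftime("%Y-%m-%dT%H:%M:%SZ")))
        day = nxt
    return windows

windows = daily_windows(datetime(2025, 1, 28), datetime(2025, 1, 31))
print(len(windows))  # 3 one-day windows
```

Each (start, end) pair then becomes the start_time and end_time of a separate /logs/api/export request.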

Processing Limits

  • Without filters: Processes up to limit + 1000 entries
  • With filters: Processes up to min(limit + 20000, 100000) entries
  • Maximum log files examined: 25 most recent files

Performance Considerations

  • Streaming architecture: Memory-efficient processing for large exports
  • Early filtering: Applies filters during log parsing to minimize memory usage
  • Automatic cleanup: Releases processed data immediately after transmission
  • Response time: Sub-second to several seconds depending on filter complexity and dataset size
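To match the endpoint's streaming architecture on the client side, download the file in chunks rather than buffering the whole body in memory. A sketch using the `requests` library (function name and chunk size are illustrative):

```python
import requests

# Stream an export to disk in chunks so large files are never held
# fully in memory on the client.
def stream_export(url, params, dest, chunk_size=8192):
    with requests.get(url, params=params, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(dest, "wb") as f:
            for chunk in resp.iter_content(chunk_size=chunk_size):
                f.write(chunk)
    return dest

# Example (requires a running log viewer):
# stream_export("http://localhost:5000/logs/api/export",
#               {"format": "json", "level": "ERROR"}, "errors.json")
```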

Error Responses

400 Bad Request
error
Invalid parameters (e.g., unsupported format)
{
  "detail": "Invalid format: csv. Only 'json' is supported."
}
413 Payload Too Large
error
Result set exceeds maximum export limit
{
  "detail": "Result set too large (max 50000 entries)"
}
500 Internal Server Error
error
Export generation failed
{
  "detail": "Export generation failed"
}
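Clients can map these status codes to concrete recovery actions. The helper below is a hypothetical sketch (its name and messages are illustrative, not part of the API):

```python
# Map the documented export error responses to client-side actions.
def handle_export_error(status, detail=""):
    if status == 400:
        return f"fix parameters: {detail}"
    if status == 413:
        return "narrow filters or reduce limit (max 50000 entries)"
    if status == 500:
        return "retry later; export generation failed server-side"
    return f"unexpected status {status}"

print(handle_export_error(413))
```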

Use Cases

  • Offline analysis: Export logs for analysis with external tools
  • Archiving: Create backups of webhook processing logs
  • Compliance: Export logs for audit trails and compliance reporting
  • Debugging: Download logs for detailed investigation of issues
  • Data processing: Import logs into analytics platforms or databases
  • Incident response: Export error logs for post-mortem analysis

Integration Examples

Python

import requests

# Export error logs
response = requests.get(
    "http://localhost:5000/logs/api/export",
    params={
        "format": "json",
        "level": "ERROR",
        "limit": 1000,
    },
)
response.raise_for_status()  # fail fast on 4xx/5xx errors

with open("errors.json", "wb") as f:
    f.write(response.content)

JavaScript/Node.js

const fs = require('fs');
const http = require('http'); // the example URL uses plain HTTP, not HTTPS

const url = 'http://localhost:5000/logs/api/export?format=json&pr_number=123';

http.get(url, (response) => {
  const fileStream = fs.createWriteStream('pr-logs.json');
  response.pipe(fileStream);
  fileStream.on('finish', () => {
    console.log('Export complete');
  });
});

Shell Script

#!/bin/bash

# Export daily logs for all repositories
DATE=$(date +%Y-%m-%d)
START="${DATE}T00:00:00Z"
END="${DATE}T23:59:59Z"

curl "http://localhost:5000/logs/api/export?format=json&start_time=${START}&end_time=${END}" \
  -o "logs-${DATE}.json"

echo "Exported logs for ${DATE}"
