Overview
The log viewer provides a powerful web-based interface for tracking webhook events, debugging issues, and monitoring system health. It features a memory-optimized streaming architecture capable of handling enterprise-scale log volumes.
Security Warning
🚨 CRITICAL SECURITY NOTICE: The log viewer endpoints (/logs/*) are NOT protected by authentication or authorization. They expose potentially sensitive webhook data and should NEVER be exposed outside your local network or trusted environment.
Required Security Measures
- ✅ Deploy behind a reverse proxy with authentication (e.g., nginx with basic auth)
- ✅ Use firewall rules to restrict access to trusted IP ranges only
- ✅ Never expose log viewer ports directly to the internet
- ✅ Monitor access to log endpoints in your infrastructure logs
- ✅ Consider VPN-only access for maximum security
Data Exposure Risk
Log files may contain:
- 🔑 GitHub Personal Access Tokens and API credentials
- 👤 User information and GitHub usernames
- 📋 Repository details and webhook payloads
- 🔒 Internal system information and error details
Network-Level Security
Deploy log viewer endpoints only on trusted networks:
- VPN Access: Deploy behind corporate VPN for internal-only access
- Reverse Proxy Authentication: Use nginx/Apache with HTTP Basic Auth
- Firewall Rules: Restrict access to webhook server port to specific IP ranges
- Network Segmentation: Deploy in isolated network segments
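As a minimal sketch of the reverse-proxy approach, assuming nginx, a /logs/ path prefix, and an upstream on port 8080 (the port, htpasswd path, and upstream address are all hypothetical, not taken from this document):

```nginx
# Hypothetical nginx fragment: HTTP Basic Auth in front of the log viewer.
location /logs/ {
    auth_basic           "Log Viewer";
    auth_basic_user_file /etc/nginx/.htpasswd;   # create with: htpasswd -c /etc/nginx/.htpasswd user
    proxy_pass           http://127.0.0.1:8080;  # assumed upstream port
    proxy_http_version   1.1;
    proxy_set_header     Upgrade $http_upgrade;  # allow WebSocket upgrades
    proxy_set_header     Connection "upgrade";
}
```

The WebSocket upgrade headers matter here: without them, the real-time streaming described below will fail behind the proxy.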
Performance & Scalability
Memory-Optimized Streaming
The log viewer uses advanced streaming and chunked processing techniques:
- Constant Memory Usage: Handles log files of any size with a consistent memory footprint
- Early Filtering: Reduces data transfer by filtering at the source before transmission
- Streaming Processing: Real-time log processing without loading entire files into memory
- 90% Memory Reduction: Optimized for enterprise environments with gigabytes of log data
- Sub-second Response Times: Fast query responses even with large datasets
Performance Benchmarks
The memory optimization work has achieved:
- 90% reduction in memory usage compared to bulk loading
- Sub-second response times for filtered queries on multi-GB log files
- Constant memory footprint regardless of log file size
- Real-time streaming with less than 100ms latency for new log entries
Technical Architecture
Streaming-First Design:
- Streaming Parser: Reads log files line-by-line instead of loading entire files
- Early Filtering: Applies search criteria during parsing to reduce memory usage
- Chunked Responses: Delivers results in small batches for responsive UI
- Automatic Cleanup: Releases processed data immediately after transmission
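The design above can be sketched roughly as follows. This is illustrative only: the function and field names are assumptions, and one JSON object per line is assumed for simplicity (the on-disk format described later is pretty-printed, so the actual parser is necessarily more involved):

```python
# Illustrative sketch of the streaming-first design; not the project's
# actual implementation. Assumes one JSON object per line.
import json
from typing import Iterator, Optional

def stream_entries(path: str, level: Optional[str] = None) -> Iterator[dict]:
    """Yield log entries one at a time, applying filters during the scan
    so memory use stays constant regardless of file size."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:  # line-by-line: the whole file is never in memory
            line = line.strip()
            if not line:
                continue
            try:
                entry = json.loads(line)
            except json.JSONDecodeError:
                continue  # skip corrupted entries instead of aborting
            if level is None or entry.get("level") == level:
                yield entry  # early filtering: non-matches are dropped here
```

Because the generator yields entries as it goes, the caller can stop consuming at any point (e.g., after filling one response chunk) and no further file I/O or parsing happens.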
Accessing the Log Viewer
Enable Log Viewer
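The exact setup steps are not shown in this section, but the troubleshooting section below suggests the viewer is gated by an environment variable. A hedged guess at the relevant setting, e.g. in your environment or .env file:

```
ENABLE_LOG_SERVER=true
```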
Web Interface
Core Features
- 🔍 Real-time log streaming via WebSocket connections with intelligent buffering
- 📊 Advanced filtering by hook ID, PR number, repository, user, log level, and text search
- 🎨 Dark/light theme support with automatic preference saving
- 📈 PR flow visualization showing webhook processing stages and timing
- 📥 JSON export functionality for log analysis and external processing
- 🎯 Color-coded log levels for quick visual identification
- ⚡ Progressive loading with pagination for large datasets
- 🔄 Auto-refresh with configurable intervals
- 🎛️ Advanced query builder for complex log searches
Web Interface Features
Filtering Controls
Available Filters:
- Hook ID: GitHub delivery ID for tracking specific webhook calls
- PR Number: Filter by pull request number
- Repository: Filter by repository name (org/repo format)
- User: Filter by GitHub username
- Log Level: Filter by severity level (DEBUG, INFO, WARNING, ERROR)
- Search: Free text search across log messages
- Time Range: Filter by start and end time (ISO 8601 format)
Real-time Features
- Live Updates: WebSocket connection for real-time log streaming
- Auto-refresh: Historical logs refresh when filters change
- Connection Status: Visual indicator for WebSocket connection status
- Progressive Loading: Smooth scrolling with automatic pagination
Theme Support
- Dark/Light Modes: Toggle between themes with automatic preference saving
- Responsive Design: Works on desktop and mobile devices
- Keyboard Shortcuts: Quick access to common functions
Log Level Color Coding
The web interface uses intuitive color coding:
- 🟢 INFO (Green): Successful operations and informational messages
- 🟡 WARNING (Yellow): Warning messages that need attention
- 🔴 ERROR (Red): Error messages requiring immediate action
- ⚪ DEBUG (Gray): Technical debug information
API Endpoints
Get Historical Log Entries
Query parameters:
- hook_id (string): Filter by GitHub delivery ID
- pr_number (integer): Filter by pull request number
- repository (string): Filter by repository name (e.g., “org/repo”)
- event_type (string): Filter by GitHub event type
- github_user (string): Filter by GitHub username
- level (string): Filter by log level (DEBUG, INFO, WARNING, ERROR)
- start_time (string): Start time filter (ISO 8601 format)
- end_time (string): End time filter (ISO 8601 format)
- search (string): Free text search in log messages
- limit (integer): Maximum entries to return (1-1000, default: 100)
- offset (integer): Pagination offset (default: 0)
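A hedged example of building a filtered query with Python's standard library. The /logs/api/entries path is taken from the export section below; the host and port are assumptions:

```python
# Hypothetical helper: builds a filtered query URL for the historical-entries
# endpoint. The base URL (host/port) is an assumption.
import urllib.parse

def build_entries_url(base: str = "http://localhost:8080", **filters) -> str:
    """Encode non-empty filter parameters into a /logs/api/entries URL."""
    query = urllib.parse.urlencode(
        {k: v for k, v in filters.items() if v is not None}
    )
    return f"{base}/logs/api/entries?{query}"

url = build_entries_url(repository="org/repo", level="ERROR", limit=50)
# Against a live server: json.load(urllib.request.urlopen(url))
```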
Export Logs
Accepts the same query parameters as /logs/api/entries, plus:
- format (string): Export format; only “json” is supported
- limit (integer): Maximum entries to export (max 50,000; default: 10,000)
WebSocket Real-time Streaming
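A sketch of a streaming client, assuming a /logs/ws endpoint, JSON-encoded messages, and the third-party websockets package — none of which are confirmed by this document:

```python
# Sketch only: the endpoint path, message shape, and field names are
# assumptions, not confirmed by this document.
import asyncio
import json

def parse_entry(raw: str) -> dict:
    """Decode one streamed message, assumed to be a JSON log entry."""
    return json.loads(raw)

async def stream_logs(url: str = "ws://localhost:8080/logs/ws") -> None:
    # Requires the third-party 'websockets' package (pip install websockets).
    import websockets
    async with websockets.connect(url) as ws:
        async for raw in ws:  # one message per log entry, as they arrive
            entry = parse_entry(raw)
            print(entry.get("level"), entry.get("message"))

# Against a live server: asyncio.run(stream_logs())
```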
PR Flow Visualization
identifier: Hook ID (e.g., “abc123”) or PR number (e.g., “123”)
Usage Examples
Monitor Specific PR
Track Webhook Processing
Debug Error Issues
Monitor Repository Activity
Search for Specific Errors
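Each of the scenarios above can be expressed as a filter combination against the entries endpoint. A hedged sketch — the host, port, and endpoint path are assumptions, and the sample identifiers mirror the PR-flow examples earlier in this document:

```python
# Hypothetical filter combinations for the usage scenarios above.
import urllib.parse

BASE = "http://localhost:8080/logs/api/entries"  # assumed host/port

EXAMPLES = {
    "monitor_specific_pr": {"pr_number": 123},
    "track_webhook_processing": {"hook_id": "abc123"},
    "debug_error_issues": {"level": "ERROR", "limit": 50},
    "monitor_repository_activity": {"repository": "org/repo"},
    "search_specific_errors": {"level": "ERROR", "search": "timeout"},
}

def example_url(name: str) -> str:
    """Return the query URL for one of the named scenarios."""
    return BASE + "?" + urllib.parse.urlencode(EXAMPLES[name])
```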
Troubleshooting
WebSocket Connection Issues
Problem: WebSocket won’t connect
Solutions:
- Check firewall rules for WebSocket traffic
- Verify server is accessible on specified port
- Ensure WebSocket upgrades are allowed by reverse proxy
- Check browser console for connection errors
Missing Log Data
Problem: Logs not appearing in viewer
Solutions:
- Verify log file permissions and paths
- Check if log directory exists and is writable
- Ensure log parser patterns match your log format
- Verify ENABLE_LOG_SERVER is set to true
Performance Issues
Problem: Slow query responses or high memory usage
Solutions:
- Large Result Sets: Narrow results using specific time ranges or repositories
- Memory Usage: The streaming architecture automatically handles large datasets efficiently
- Query Optimization: Use specific filters (hook_id, pr_number) for fastest responses
- File Size Management: Consider log file rotation for easier management
- Network Latency: Use pagination for mobile or slow connections
Log File Format Issues
Problem: Logs not parsing correctly
Solutions:
- Verify log file format matches expected JSON structure
- Check for corrupted log entries
- Ensure UTF-8 encoding
- Review log rotation configuration
Log File Format
Location: {config.data_dir}/logs/webhooks_YYYY-MM-DD.json
Format: Pretty-printed JSON (2-space indentation)
Rotation: Daily based on UTC date
Log Entry Structure:
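A sample entry might look like the following. The field names are inferred from the filter parameters documented above, and all values are illustrative:

```json
{
  "timestamp": "2024-01-15T12:34:56Z",
  "level": "INFO",
  "hook_id": "abc123",
  "event_type": "pull_request",
  "repository": "org/repo",
  "pr_number": 123,
  "github_user": "octocat",
  "message": "Webhook processed successfully"
}
```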
Best Practices
Security
- Never expose to internet: Deploy behind VPN or with authentication
- Monitor access: Track who accesses log viewer endpoints
- Rotate logs: Implement log retention policies
- Encrypt transit: Use TLS/SSL for all connections
- Audit regularly: Review access logs and security configurations
Performance
- Use specific filters: Narrow down results with precise filters
- Limit result sets: Use pagination for large datasets
- Enable log rotation: Keep individual log files manageable
- Monitor resource usage: Track memory and CPU usage
- Optimize queries: Use hook_id or pr_number for fastest lookups
Operational
- Regular backups: Backup log files for compliance and auditing
- Monitor disk space: Ensure adequate storage for logs
- Set retention policies: Define how long to keep logs
- Document procedures: Create runbooks for common tasks
- Test recovery: Verify log recovery procedures work