POST /api/export_filtered
Export filtered timeline data in CSV, XLSX, or JSON format with forensic-grade formatting.

Request Body
- CSV filename to export
- Output format
- Global search filter
- Column-specific filters (e.g., `{"EventID":"4624"}`)
- Export only selected row IDs
- Time range start (ISO 8601)
- Time range end (ISO 8601)
- Sort column
- Sort direction
- Export only specified columns (default: all columns)
- Original artifact filename (for output naming)
- Generate AI-optimized context export (JSON summary instead of raw data)
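A request body for this endpoint might be assembled as in the sketch below. The field names shown (`filename`, `format`, `search`, `column_filters`, `start_time`, `end_time`) are illustrative placeholders, not confirmed schema names; only `ai_optimized` appears in this document.

```python
import json

# Hypothetical request body for POST /api/export_filtered.
# Field names are illustrative -- consult the actual API schema.
payload = {
    "filename": "Security.csv",              # CSV filename to export
    "format": "xlsx",                        # output format: csv / xlsx / json
    "search": "4624",                        # global search filter
    "column_filters": {"EventID": "4624"},   # column-specific filters
    "start_time": "2024-01-01T00:00:00Z",    # ISO 8601 time range start
    "end_time": "2024-01-02T00:00:00Z",      # ISO 8601 time range end
    "ai_optimized": False,                   # True -> AI context summary instead of raw data
}
body = json.dumps(payload)
```

The serialized `body` would then be POSTed with a `Content-Type: application/json` header.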
Response Schema
- Download endpoint: `/download/(unknown)`
- Generated filename (e.g., `Export_Security_1704067200.csv`)

CSV Export Features
- UTF-8 BOM: Prepended for Excel compatibility
- Quoted fields: All fields quoted with `quote_style="always"`
- Hex preservation: `0x...` values wrapped in Excel formula `="0x..."`
- Text formatting: Hash, GUID, and SID columns forced to text format

CSV exports preserve hex values (e.g., `0x00000030`) without Excel auto-converting them to decimal.

XLSX Export Features
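A minimal sketch of the hex-preservation trick with the standard library: every field is quoted (mirroring `quote_style="always"`) and hex literals are wrapped in an Excel formula. The `protect` helper is hypothetical, not part of the API.

```python
import csv
import io
import re

HEX = re.compile(r"^0x[0-9A-Fa-f]+$")

def protect(value: str) -> str:
    # Hypothetical helper: wrap hex literals in an Excel formula
    # so Excel displays them as text instead of converting them.
    return f'="{value}"' if HEX.match(value) else value

buf = io.StringIO()
# In the real export the file would be opened with encoding="utf-8-sig"
# to prepend the UTF-8 BOM for Excel compatibility.
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)  # quote every field
writer.writerow(["EventID", "Keywords"])
writer.writerow([protect("4624"), protect("0x00000030")])
```

The emitted row contains `="0x00000030"` (with CSV-escaped quotes), which Excel evaluates back to the literal hex string.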
- xlsxwriter with `strings_to_numbers: False`
- Text format: `@` number format applied to hex/hash/GUID columns
- `write_string()`: All cells written as strings (prevents auto-conversion)
- Header formatting: Bold + colored background
- Row limit: 100,000 rows (protection against memory exhaustion)
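The XLSX safeguards above can be sketched with xlsxwriter directly; the worksheet name, column widths, and sample values are illustrative, not taken from the actual implementation.

```python
import os
import tempfile

import xlsxwriter

path = os.path.join(tempfile.mkdtemp(), "export.xlsx")
# strings_to_numbers is False by default; stated explicitly for clarity.
wb = xlsxwriter.Workbook(path, {"strings_to_numbers": False})
ws = wb.add_worksheet("Timeline")

header = wb.add_format({"bold": True, "bg_color": "#DDEBF7"})  # bold + colored header
text = wb.add_format({"num_format": "@"})                      # '@' forces text format

ws.set_column(0, 1, 24, text)  # text format on hex/hash columns
ws.write_row(0, 0, ["Keywords", "SHA256"], header)
# write_string() prevents any value from being coerced to a number
ws.write_string(1, 0, "0x00000030")
ws.write_string(1, 1, "e3b0c44298fc1c149afbf4c8996fb924...")
wb.close()
```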
JSON Export Features
- Standard array format: `[{...}, {...}]`
- SOAR-compatible: Ingestible by Splunk SOAR, Cortex XSOAR
- UTF-8 encoding: No BOM (pure JSON)
- Complex types: Lists/Structs cast to string
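A standard-library sketch of those JSON rules: complex values (lists/structs) are stringified, and the output is UTF-8 JSON with no BOM. The sample rows are illustrative.

```python
import json

rows = [
    {"EventID": "4624", "Keywords": "0x00000030", "Tags": ["logon", "success"]},
]
# Cast complex types (lists/dicts) to strings before serializing
flat = [
    {k: (str(v) if isinstance(v, (list, dict)) else v) for k, v in r.items()}
    for r in rows
]
doc = json.dumps(flat, ensure_ascii=False)  # pure JSON, no BOM
```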
AI-Optimized Context Export
When `ai_optimized: true`, the endpoint generates a forensic intelligence summary instead of raw data:
- Timeline summary (peaks, gaps, time range)
- Top users, IPs, processes, event IDs
- Suspicious patterns detected
- YARA matches
- Network connections
- Session profiles
- Risk justification
(Output is always JSON, regardless of the `format` parameter.)
Examples
Response Examples
POST /api/export/html
Generate a standalone HTML forensic report with embedded Chart.js visualizations.

Request Body
Same as /api/export_filtered (supports all filtering parameters).
Features
- Standalone HTML: Single-file report (no external dependencies)
- Embedded charts: Chart.js histograms and distribution charts
- Forensic dashboard: Risk level, top tactic, primary identity, EPS
- Sigma detections: Rule matches with evidence tables
- YARA hits: Malware pattern matches
- MITRE kill chain: ATT&CK tactic mapping
- Context sections: Timeline, hunting, identity analysis
- Print-optimized CSS: Auto-expand details blocks on print
Response
Example
POST /api/export/pdf
Generate a PDF forensic report using a 4-method fallback chain:
- WeasyPrint (best quality, requires GTK/Pango)
- Playwright (headless Chromium, cross-platform)
- xhtml2pdf (pure Python, moderate quality)
- wkhtmltopdf (subprocess, if installed)
Request Body
Same as /api/export_filtered.
Response
PDF Generation Process
- Generate HTML report via /api/export/html
- Attempt conversion with each method (fallback on failure)
- Validate output (reject PDFs < 1KB)
- Return download URL + method used
Playwright waits 4s after page load for Chart.js animations to complete.
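The fallback chain and the under-1 KB validation step might be structured as in this sketch; the converter callables are placeholders standing in for the WeasyPrint, Playwright, xhtml2pdf, and wkhtmltopdf backends.

```python
import os

MIN_PDF_BYTES = 1024  # reject suspiciously small PDFs (< 1 KB)

def render_pdf(html: str, out_path: str, methods) -> str:
    """Try each converter in order; return the name of the first that works.

    `methods` is a list of (name, callable) pairs. Each callable is a
    placeholder for one of the real backends (e.g. the Playwright backend
    would wait after page load for Chart.js animations to finish).
    """
    errors = []
    for name, convert in methods:
        try:
            convert(html, out_path)
            if os.path.getsize(out_path) >= MIN_PDF_BYTES:
                return name  # valid output: report which method succeeded
            errors.append(f"{name}: output below {MIN_PDF_BYTES} bytes")
        except Exception as exc:  # fall through to the next method
            errors.append(f"{name}: {exc}")
    raise RuntimeError("all PDF methods failed: " + "; ".join(errors))
```

The caller would return the download URL together with the method name, matching the documented response.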
Print CSS Optimizations
- Color preservation: Risk badges maintain severity colors (red/orange/yellow/green)
- Chart legend: Static HTML color legend for histogram (Peak/Above mean/Normal)
- Section backgrounds: Light backgrounds with dark text for print readability
- Page breaks: `page-break-inside: avoid` on cards/sections
Example
POST /api/export/forensic-summary
Export forensic context modal data as a formatted XLSX report containing all 12+ analysis sections.

Request Body
- Source filename
- Full forensic report object from /api/forensic_report

XLSX Sections
- Header Metadata: File, risk level, primary identity, total records, EPS
- Timeline Analysis: Peaks, gaps, time range
- Sanitized Forensic Summary: Top IPs, users, hosts, directories, violations
- Chronos Hunter Summary: Suspicious patterns, network destinations, logon events
- Identity & Assets: Top processes, rare executions, rare paths
- Sigma Rule Detections: With evidence samples (up to 10 rows per rule)
- YARA Detections: Malware signatures
- MITRE Kill Chain: ATT&CK tactics with threat levels
- Cross-Source Correlation: Pivot chains with first/last seen
- Session Profiles: IP/user sessions with dwell time
- Risk Justification: Bulleted list of risk factors
Formatting
- Title format: Bold, 14pt, blue
- Section headers: Bold, white text on blue background
- Table headers: Bold on light gray background
- Severity colors: Critical=red, High=orange, Medium=yellow, Low=green
- Text wrapping: Enabled for long values
- Column widths: Pre-configured (22, 42, 28, 16, 55 characters)
Example
Response
POST /api/export/split-zip
Export filtered data as multiple ZIP archives split by a size limit (for AI analysis under token constraints).

Request Body
- Source CSV filename from ingestion
- Export format: `"csv"`
- Maximum size per ZIP file in MB (e.g., 50, 99)
- Global search filter
- Column-specific filters
- Row IDs to include (from row selection)
- ISO 8601 start timestamp
- ISO 8601 end timestamp
Response
Use Case
Split large datasets into multiple archives that fit within AI token limits. Files are named sequentially (`part_001.csv`, `part_002.csv`, etc.) so they can be analyzed one part at a time by AI systems.
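A simplified sketch of size-based splitting with the standard library. Real code would also account for ZIP compression ratios and stream rows rather than holding everything in memory; the function name and signature are illustrative.

```python
import io
import zipfile

def split_to_zips(rows, header, max_bytes):
    """Split CSV rows into size-limited ZIP parts (part_001.csv, part_002.csv, ...)."""
    parts, chunk, size = [], [], 0
    for row in rows:
        line = ",".join(row) + "\n"
        # Start a new part when adding this row would exceed the limit
        if chunk and size + len(line) > max_bytes:
            parts.append(chunk)
            chunk, size = [], 0
        chunk.append(line)
        size += len(line)
    if chunk:
        parts.append(chunk)

    archives = []
    for i, lines in enumerate(parts, 1):
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
            # Each part carries the CSV header so it is independently parseable
            zf.writestr(f"part_{i:03d}.csv", header + "".join(lines))
        archives.append(buf.getvalue())
    return archives
```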
GET /download/
Download an exported file with automatic cleanup (10s delay after serving).

Path Parameters
Filename from export response
Response
Binary file stream with download headers set.

Automatic Cleanup
Files are deleted 10 seconds after serving, to ensure the browser download completes.

Forensic Integrity Preservation
Hex Value Protection
Problem: Excel auto-converts `0x00000030` to decimal 30.
Solution:
- CSV: Wrap hex in an Excel formula: `="0x00000030"`
- XLSX: Force text format with `num_format='@'` + `write_string()`
Hash Preservation
SHA256/MD5 hashes, GUIDs, and SIDs are:
- Cast to `pl.Utf8` before export
- Written with `write_string()` in XLSX
- Given column-level text format
Timestamp Integrity
All timestamps are preserved in ISO 8601 format:
- No fabrication (MFT uses real FILETIME values)
- No timezone conversion (preserves original)
- Millisecond precision maintained
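Millisecond-precision ISO 8601 output is straightforward with the standard library; the helper name is illustrative.

```python
from datetime import datetime, timezone

def iso_millis(ts: datetime) -> str:
    # ISO 8601 with millisecond precision; the original timezone is untouched
    return ts.isoformat(timespec="milliseconds")

t = datetime(2024, 1, 1, 0, 0, 0, 123456, tzinfo=timezone.utc)
print(iso_millis(t))  # 2024-01-01T00:00:00.123+00:00
```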
Column Filtering
Internal analysis columns (e.g., `Validated_EventID`, `_epoch_tmp`, `__ts_sort`, `__bucket`) are excluded from exports.
Performance Benchmarks
| Dataset Size | Format | Export Time | File Size |
|---|---|---|---|
| 42,000 rows | CSV | 2.3s | 18 MB |
| 42,000 rows | XLSX | 12.5s | 8.2 MB |
| 42,000 rows | JSON | 3.1s | 22 MB |
| 42,000 rows | HTML | 8.7s | 2.4 MB |
| 42,000 rows | PDF (Playwright) | 14.2s | 1.8 MB |
Error Handling
Dataset Too Large (XLSX)
PDF Generation Failure
If all 4 methods fail, an error response is returned.

Memory Exhaustion
If the export runs out of memory, an error response is returned.

Next Steps
API Overview
API architecture and common patterns
Upload Endpoint
Stream forensic artifacts with chain of custody
Analysis Endpoints
Query timelines and generate forensic reports