TechEmpower Benchmarks
TechEmpower provides a comprehensive performance comparison of web application frameworks executing fundamental tasks such as JSON serialization, database access, and server-side template rendering. Each framework operates under a realistic production configuration, with results recorded on both cloud instances and physical hardware. Test implementations are community-contributed and maintained in the FrameworkBenchmarks repository.
Test Environment
Hardware Specifications
- CPU: 56 Cores Intel(R) Xeon(R) Gold 6330 @ 2.00GHz
- Hardware: Three homogeneous ProLiant DL360 Gen10 Plus servers
- RAM: 64GB
- Storage: Enterprise SSD
- OS: Ubuntu
- Network: Mellanox Technologies MT28908 Family ConnectX-6 40Gbps Ethernet
- Fiber Version: v3.0.0
Benchmark Results
Plaintext
The Plaintext test measures basic request routing and demonstrates the capacity of high-performance platforms. Requests are pipelined, and the tiny response body demands high throughput to saturate the benchmark’s gigabit Ethernet. See Plaintext requirements.
Fiber
11,987,976 responses per second
Average latency: 1.0 ms
Express
1,204,969 responses per second
Average latency: 8.8 ms
In the plaintext benchmark, Fiber handles roughly ten times as many requests per second as Express while maintaining significantly lower latency.
JSON Serialization
The JSON test measures JSON serialization performance, a common operation in modern web APIs.
Fiber
2,363,294 responses per second
Average latency: 0.2 ms
Express
949,717 responses per second
Average latency: 0.5 ms
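The JSON test serializes a small object, `{"message":"Hello, World!"}`, freshly on every request. A minimal sketch of that per-request work with Go's standard encoding/json (the benchmarked implementations may use faster codecs; the `message` type and `renderJSON` name are illustrative):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// message mirrors the object the TechEmpower JSON test serializes on
// every request: {"message":"Hello, World!"}.
type message struct {
	Message string `json:"message"`
}

// renderJSON performs the per-request serialization the benchmark
// measures; the spec requires the object to be serialized each time,
// not cached.
func renderJSON() ([]byte, error) {
	return json.Marshal(message{Message: "Hello, World!"})
}

func main() {
	b, err := renderJSON()
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b)) // {"message":"Hello, World!"}
}
```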
Single Query
The Single Query test exercises the framework’s object-relational mapping (ORM) or database access layer with a single database query per request.
Fiber
953,016 responses per second
Average latency: 0.6 ms
Express
441,543 responses per second
Average latency: 1.3 ms
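Per the TechEmpower specification, each request in this test picks a random id in [1, 10000] and fetches one row from a World table. The sketch below models that access pattern with an in-memory map standing in for the real relational database; `worlds` and `singleQuery` are hypothetical names, not the benchmark implementation.

```go
package main

import (
	"fmt"
	"math/rand"
)

// worlds stands in for the World table (10,000 rows of random numbers);
// the real test queries a relational database over the network.
var worlds = map[int]int{}

func init() {
	for id := 1; id <= 10000; id++ {
		worlds[id] = rand.Intn(10000) + 1
	}
}

// singleQuery picks a random id and fetches one row, the per-request
// access pattern the Single Query test measures.
func singleQuery() (id, randomNumber int) {
	id = rand.Intn(10000) + 1
	return id, worlds[id]
}

func main() {
	id, n := singleQuery()
	fmt.Printf(`{"id":%d,"randomNumber":%d}`+"\n", id, n)
}
```

With real database round-trips in the loop, driver efficiency and connection pooling matter as much as the framework itself.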
Multiple Queries
The Multiple Queries test performs multiple database queries per request, testing how frameworks handle concurrent database access.
Fiber
54,002 responses per second
Average latency: 9.4 ms
Express
85,011 responses per second
Average latency: 6.0 ms
Data Updates
The Data Updates test performs database queries and updates, representing a more complete CRUD operation pattern.
Fiber
29,984 responses per second
Average latency: 16.9 ms
Express
54,887 responses per second
Average latency: 9.2 ms
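Each request in this test reads rows, assigns each a new random number, and writes the change back. The sketch below shows that read-modify-write cycle against an in-memory map standing in for the database; the `world` type and `updateRows` helper are illustrative names, not the benchmark code.

```go
package main

import "fmt"

// world models one record of the benchmark's World table (an in-memory
// map stands in for the real database here).
type world struct{ ID, RandomNumber int }

// updateRows performs the read-modify-write cycle the Data Updates test
// measures: fetch each row, assign a new randomNumber, write it back.
func updateRows(db map[int]world, ids []int, newValues []int) {
	for i, id := range ids {
		row := db[id]                   // read
		row.RandomNumber = newValues[i] // modify
		db[id] = row                    // write back
	}
}

func main() {
	db := map[int]world{1: {1, 42}, 2: {2, 7}}
	updateRows(db, []int{1, 2}, []int{100, 200})
	fmt.Println(db[1].RandomNumber, db[2].RandomNumber) // 100 200
}
```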
Performance Analysis
Where Fiber Excels
High-Throughput Scenarios
Fiber demonstrates exceptional performance in high-throughput scenarios:
- Plaintext: 10x faster than Express (11.9M vs 1.2M req/s)
- JSON Serialization: 2.5x faster than Express (2.4M vs 950K req/s)
- Single Query: 2.2x faster than Express (953K vs 442K req/s)
These characteristics make Fiber well suited for:
- High-traffic APIs
- Microservices handling many requests
- Real-time applications
- Gateway and proxy services
Low Latency Operations
Fiber maintains consistently low latency across most operations:
- Plaintext: 1.0ms average latency
- JSON: 0.2ms average latency
- Single Query: 0.6ms average latency
This low latency benefits:
- User-facing applications
- API gateways
- Service mesh communications
Memory Efficiency
Fiber’s zero-allocation router and efficient memory management contribute to:
- Lower memory footprint
- Better garbage collection performance
- Higher request throughput under memory pressure
- Reduced infrastructure costs
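The general technique behind zero-allocation hot paths is object reuse: instead of allocating fresh buffers and contexts per request, they are recycled through a pool. The sketch below illustrates that pattern with Go's sync.Pool; it is a generic example of the technique, not Fiber's actual router code (Fiber builds on fasthttp, which pools request contexts this way).

```go
package main

import (
	"bytes"
	"fmt"
	"sync"
)

// bufPool recycles buffers across requests instead of allocating one
// per request, keeping steady-state allocations (and GC pressure) low.
var bufPool = sync.Pool{
	New: func() any { return new(bytes.Buffer) },
}

// render writes a response body using a pooled buffer and returns it to
// the pool when done.
func render(name string) string {
	buf := bufPool.Get().(*bytes.Buffer)
	defer bufPool.Put(buf)
	buf.Reset() // pooled objects keep old contents; always reset first
	buf.WriteString("Hello, ")
	buf.WriteString(name)
	return buf.String()
}

func main() {
	fmt.Println(render("World")) // Hello, World
}
```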
Benchmark Methodology
Test Categories
TechEmpower benchmarks test several key aspects of web framework performance:
- JSON Serialization: Tests the framework’s ability to serialize objects to JSON
- Single Database Query: Tests database access with a single query per request
- Multiple Database Queries: Tests handling of multiple sequential database queries
- Database Updates: Tests reading, modifying, and updating database records
- Plaintext: Tests basic request routing and response handling
- Fortunes: Tests template rendering and HTML escaping
Why These Tests Matter
Real-World Patterns
Tests reflect common web application patterns: JSON APIs, database access, and content rendering.
Standardized Comparison
All frameworks run identical operations under the same hardware conditions for fair comparison.
Production Configuration
Frameworks are configured as they would be in production environments.
Community Validated
Test implementations are reviewed and maintained by the community.
Interpreting Results
Considerations
- Database tests: Performance depends heavily on database configuration and network latency
- Real applications: Include business logic, validation, and other processing not captured in benchmarks
- Scalability: Both vertical (single machine) and horizontal (multiple machines) scaling characteristics matter
- Development velocity: Framework productivity, ecosystem, and maintainability are crucial factors beyond raw performance