Basic Benchmarking
Writing Your First Benchmark
math_bench.ts
Benchmark File Naming
Deno automatically discovers benchmark files matching:

- `*_bench.ts`, `*_bench.tsx`
- `*_bench.js`, `*_bench.jsx`
- `*.bench.ts`, `*.bench.tsx`
- `*.bench.js`, `*.bench.jsx`

Benchmark Structure
Basic Format
With Options
Async Benchmarks
Benchmark Groups
Compare different implementations:

Baseline and Comparison
Warmup
Enable warmup iterations to get more stable results:

Permissions
Specify permissions for benchmarks:

Ignoring Benchmarks
Explicit Timers
Manually control timing for setup/teardown:

Configuration
Configure in deno.json
deno.json
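A sketch of the configuration; the field names assume the current `deno.json` schema, where `bench.include` and `bench.exclude` select benchmark files:

```json
{
  "bench": {
    "include": ["benchmarks/"],
    "exclude": ["benchmarks/wip/"]
  }
}
```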
Named Permissions
deno.json
Running Benchmarks
Basic Commands
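Typical invocations look like the following; the file name is illustrative:

```shell
# Run every discovered benchmark file
deno bench

# Run a specific file
deno bench math_bench.ts

# Run only benchmarks whose name matches a filter
deno bench --filter "fib"

# Grant permissions the benchmarks need
deno bench --allow-read
```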
Output Formats
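By default results print as a human-readable table; recent Deno versions also support machine-readable output via `--json`:

```shell
# Human-readable table (default)
deno bench

# Machine-readable JSON (assumes the --json flag in recent Deno)
deno bench --json
```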
Real-World Examples
Array Operations
array_bench.ts
JSON Parsing
json_bench.ts
String Operations
string_bench.ts
Async Operations
async_bench.ts
Comparing Implementations
implementation_bench.ts
Tracking Performance
Save Results
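One way to save results for later comparison, assuming the `--json` flag in recent Deno; the output file name is illustrative:

```shell
# Capture machine-readable results for later comparison
deno bench --json > bench-results.json
```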
CI Integration
.github/workflows/bench.yml
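A sketch of a workflow that runs benchmarks on each push; it assumes the `denoland/setup-deno` action, and the version pins are illustrative:

```yaml
name: bench

on: [push]

jobs:
  bench:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: denoland/setup-deno@v2
        with:
          deno-version: v2.x
      - run: deno bench
```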
Best Practices
Isolate benchmarks
Each benchmark should test one specific thing
Use groups
Group related benchmarks for easy comparison
Set baselines
Mark a baseline to compare other implementations against
Warmup when needed
Enable warmup for more stable results
Meaningful data sizes
Use realistic data sizes that match production
Run multiple times
Deno automatically runs benchmarks multiple times for accuracy
Configuration Example
deno.json
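A sketch combining the practices above into one configuration; field names assume the current `deno.json` schema, and the paths are illustrative:

```json
{
  "bench": {
    "include": ["**/*_bench.ts"],
    "exclude": ["vendor/"]
  },
  "tasks": {
    "bench": "deno bench"
  }
}
```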
Understanding Results
Benchmark output shows:

- Iterations: Number of times the benchmark ran
- Time/iteration: Average time per iteration
- Throughput: Operations per second
- Min/Max/Avg: Statistical summary
- Percentiles: P75, P99, P995, P999 for distribution analysis