The Challenge
Dart tools sometimes vary in their behavior deliberately. Understanding these variations is crucial for writing effective tests.

Example: Number Equality
- On the web (dart2js and DDC): Numbers are represented as JavaScript numbers, which don’t distinguish between integers and floating-point numbers of the same value. Such a test “fails” there, but that is the intended behavior
- On the VM: Integers and doubles are distinct types, so the test passes
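A sketch of what such a number-equality test might look like, assuming the SDK's package:expect helpers:

```dart
import 'package:expect/expect.dart';

void main() {
  // On the VM, 1 and 1.0 are distinct objects with distinct runtime
  // types (int vs. double), so they are not identical. On the web,
  // both are represented as the same JavaScript number, so
  // identical(1, 1.0) is true there and this check fails by design.
  Expect.isFalse(identical(1, 1.0));
}
```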
Configuration Variations
Other times, behavior varies across different configurations of a single tool:
- A test of the assert() statement will produce different outcomes in debug versus release mode
- NNBD features require null safety to be enabled
- Some features may only be available in specific runtime modes
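For example, here is a minimal program whose outcome depends on whether assertions are enabled:

```dart
void main() {
  // With assertions enabled (debug mode, or `dart --enable-asserts`),
  // this line throws an AssertionError and the print below is never
  // reached. In release/product mode the assert is compiled away and
  // the program runs to completion.
  assert(false, 'asserts are enabled');
  print('asserts are disabled');
}
```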
Managing Test Variations
We need a mechanism to define which tests are meaningful for which configurations.

Legacy Approach: Status Files

Currently, we mostly express this in status files with entries that mark tests as SkipByDesign on some configurations:
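For example, an entry like the following (the test name is invented for illustration) skips a test on the dart2js configurations:

```
[ $compiler == dart2js ]
number_identity_test: SkipByDesign # Web numbers don't distinguish int and double.
```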
We’d like to move away from status files over time in favor of the requirements system.
Test Requirements System
The newer “requirements” system provides a better way to manage test variations. It was initially created for testing NNBD but can be extended for other needs.

How It Works
Add requirements to a test
A test can have a comment line starting with Requirements= followed by a comma-separated list of identifiers. Each identifier is the name of a “feature” that a Dart tool may support.

Test runner checks features
For any given configuration, the test runner knows which features that configuration supports.
Available Features
The full set of features is defined in the Dart SDK repository: View feature definitions

Example Usage
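A test with a single requirement lists one feature identifier after Requirements=. A sketch, using the nnbd feature from the NNBD example later on this page and package:expect:

```dart
// Requirements=nnbd

import 'package:expect/expect.dart';

void main() {
  // Uses null safety syntax, so this test is only meaningful on
  // configurations whose feature set includes "nnbd".
  int? maybe;
  Expect.isNull(maybe);
}
```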
Single requirement: a test that needs only one feature lists just that identifier on its Requirements= line.

Writing Tests for Multiple Configurations
When writing tests, consider:

1. Platform Differences
Web Platforms
dart2js and DDC use JavaScript number representation
VM Platforms
Native VM has distinct integer and double types
2. Mode Variations
- Debug mode: Assertions are checked, additional runtime checks
- Release mode: Assertions disabled, optimizations enabled
- Production mode: Maximum optimizations
3. Language Features
- NNBD: Requires null safety to be enabled
- Strong mode: Type checking variations
- Experimental features: May require specific flags
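Experiment flags are typically passed to the tools via an option comment in the test file. The marker shown below (SharedOptions=) and the experiment name are illustrative assumptions; substitute the real experiment a test depends on:

```dart
// SharedOptions=--enable-experiment=some-experiment

// "some-experiment" above is a placeholder for the actual experiment
// flag this test requires.
```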
Best Practices
Use requirements instead of status files
When possible, use the Requirements= comment instead of adding entries to status files. This makes the test’s constraints explicit and self-documenting.

Document why a test is platform-specific
If a test only makes sense on certain platforms, add a comment explaining why:
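For example, a hypothetical VM-only test might carry a comment like this:

```dart
// This test is VM-only: it relies on int and double being distinct
// runtime types, which is not true on the web, where both map to the
// same JavaScript number representation.
```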
Test the intended behavior
Write tests that verify the intended behavior for each configuration, not tests that will fail by design on some platforms.
Reuse tests when possible
If a test can run on multiple configurations without modification, don’t add unnecessary requirements. The goal is maximum test reuse.
Example: NNBD Test
Here’s a complete example of a test using the requirements system. Such a test will:
- Run on configurations with NNBD support
- Be automatically skipped on configurations without NNBD support
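A sketch of what that complete test might look like (assuming the nnbd feature name and package:expect):

```dart
// Requirements=nnbd

import 'package:expect/expect.dart';

String describe(int value) => 'value: $value';

void main() {
  // Under null safety, `int?` admits null while `int` does not, and a
  // nullable value must be checked before use.
  int? maybe = int.tryParse('not a number'); // tryParse returns null here.
  Expect.isNull(maybe);
  Expect.equals('value: 7', describe(7));
}
```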
Migration Path
As we move forward:
- New tests: Use the Requirements= system from the start
- Existing tests: Gradually migrate from status files to requirements
- Status files: Will be phased out over time
The requirements system provides better maintainability and makes test constraints explicit in the test file itself.