Overview
Most tests in Node.js are JavaScript programs that exercise functionality and verify it behaves as expected. Tests should exit with code 0 on success.
When to Add Tests
Add tests when:
Adding new functionality
Fixing regressions and bugs
Expanding test coverage
Test Structure
Basic Test Example
'use strict';
const common = require('../common');
const fixtures = require('../common/fixtures');

// This test ensures that the http-parser can handle UTF-8 characters
// in the http header.

const assert = require('node:assert');
const http = require('node:http');

const server = http.createServer(common.mustCall((req, res) => {
  res.end('ok');
}));
server.listen(0, () => {
  http.get({
    port: server.address().port,
    headers: { 'Test': 'Düsseldorf' },
  }, common.mustCall((res) => {
    assert.strictEqual(res.statusCode, 200);
    server.close();
  }));
});
Key Components
Strict Mode
All tests should be in strict mode unless the test specifically requires non-strict mode.
Common Module
const common = require('../common');
The common module provides useful test utilities. Always include it before other modules, even if you don’t use its functions directly, as it includes checks for variable leaks.
Test Description
// This test ensures that the http-parser can handle UTF-8 characters
// in the http header.
Start tests with a comment explaining what functionality is being tested.
Module Imports
const assert = require('node:assert');
const http = require('node:http');
Require statements should be sorted in ASCII order (digits, uppercase, _, lowercase).
Writing Good Tests
Keep Tests Isolated
When adding a new test, create a new file rather than appending to an existing one. Isolated tests are easier to debug when they fail.
Add Context
Include comments explaining:
What the test is trying to verify
Why certain approaches are used
Any edge cases being covered
Be Minimal
Write tests that are as minimal as possible while still testing the functionality. This makes debugging easier when tests fail.
Common Module API
common.mustCall
Ensures a callback is called the expected number of times:
const server = http.createServer(common.mustCall((req, res) => {
  res.end();
})).listen(0, common.mustCall(() => {
  http.get(options, common.mustCall((res) => {
    res.resume();
    server.close();
  }));
}));
Don’t use common.mustCall() for callbacks with error parameters. Use common.mustSucceed() instead to properly handle errors.
common.mustSucceed
For callbacks that receive an error as the first argument:
fs.readFile('file.txt', common.mustSucceed((data) => {
  // This will only be called if err is null/undefined
  console.log(data);
}));
common.platformTimeout
Adjusts timeouts for slower platforms:
const timer = setTimeout(fail, common.platformTimeout(4000));
Countdown Module
For tests requiring multiple async operations to complete:
const Countdown = require('../common/countdown');
const countdown = new Countdown(2, () => {
  console.log('Both operations completed');
});
countdown.dec();
countdown.dec(); // Callback invoked
Testing Promises
Wrap promise handlers in common.mustCall() to ensure they execute:
const common = require('../common');
const assert = require('node:assert');
const fs = require('node:fs').promises;

fs.readFile('test-file').then(
  common.mustCall((content) => {
    assert.strictEqual(content.toString(), 'expected');
  })
);
Assertions
Use Strict Versions
// Preferred
assert.strictEqual(actual, expected);
assert.deepStrictEqual(actual, expected);

// Avoid
assert.equal(actual, expected);
assert.deepEqual(actual, expected);
Full Error Messages
// Good - full regex
assert.throws(
  () => { throw new Error('Wrong value'); },
  /^Error: Wrong value$/
);

// Bad - partial match
assert.throws(
  () => { throw new Error('Wrong value'); },
  /Wrong value/
);
Error Codes
For internal errors, check only the code:
assert.throws(
  () => { throw new ERR_FS_FILE_TOO_LARGE(`${sizeKiB} Kb`); },
  { code: 'ERR_FS_FILE_TOO_LARGE' }
);
Running Tests
Run All Tests
# Quick verification
make test-only
# Full test suite (before PRs)
make -j4 test
Run Specific Test File
tools/test.py test/parallel/test-stream2-transform.js
# Or directly with node
./node test/parallel/test-stream2-transform.js
Run by Subsystem
tools/test.py child-process
Run by Pattern
# All tests starting with 'test-stream-'
tools/test.py "test/parallel/test-stream-*"
# All inspector tests across directories
tools/test.py "*/test-inspector-*"
Test Options
# See all available options
tools/test.py --help
Test Flags
Some tests require specific Node.js flags:
'use strict';
// Flags: --expose-internals
require('../common');
const assert = require('node:assert');
const freelist = require('node:internal/freelist');
Console Output
Console output can be useful for debugging but should be used judiciously:
Output is suppressed unless the test fails
Helps with debugging timeouts in CI
Avoid excessive output, especially in loops
Add a comment indicating whether a console.log() call is part of the test or leftover debug code
ES.Next Features
When writing tests, prefer modern JavaScript features:
// Prefer const/let over var
const server = http.createServer();
let counter = 0;

// Use template literals
const message = `Server running on port ${port}`;

// Use arrow functions when appropriate
server.on('request', (req, res) => {
  res.end('ok');
});
Test Naming
Test files use kebab-case:
test-[module]-[method]-[description].js
Examples:
test-process-before-exit.js
test-process-before-exit-arrow-functions.js
test-http-server-utf8-headers.js
C++ Unit Tests
Adding C++ Tests
Place tests in test/cctest/:
#include "gtest/gtest.h"
#include "node_test_fixture.h"
#include "env.h"
#include "node.h"
#include "v8.h"

static bool called_cb = false;
static void at_exit_callback(void* arg);

class EnvTest : public NodeTestFixture { };

TEST_F(EnvTest, RunAtExit) {
  v8::HandleScope handle_scope(isolate_);
  v8::Local<v8::Context> context = v8::Context::New(isolate_);
  node::IsolateData* isolateData =
      node::CreateIsolateData(isolate_, uv_default_loop());
  Argv argv{"node", "-e", ";"};
  auto env = node::CreateEnvironment(
      isolateData, context, 1, *argv, 2, *argv);
  node::AtExit(env, at_exit_callback);
  node::RunAtExit(env);
  EXPECT_TRUE(called_cb);
}

static void at_exit_callback(void* arg) {
  called_cb = true;
}
Running C++ Tests
# Run all C++ tests
make cctest
# Run specific test
make cctest GTEST_FILTER=EnvironmentTest.AtExitWithArgument
# Run with gtest directly
out/Release/cctest --gtest_filter=EnvironmentTest.AtExit*
Test Coverage
Generate Coverage Report
./configure --coverage
make coverage
# View reports
# JavaScript: coverage/index.html
# C++: coverage/cxxcoverage.html
Coverage for Specific Tests
make coverage-clean
NODE_V8_COVERAGE=coverage/tmp tools/test.py test/parallel/test-stream2-transform.js
make coverage-report-js
Best Practices
Avoid Timers
Timers are a common source of test flakiness. If you must use one, scale it with common.platformTimeout() so slower platforms get longer timeouts:
const timer = setTimeout(fail, common.platformTimeout(4000));
Port Numbers
Always use port 0 to let the OS assign a random port:
// Good
server.listen(0, () => {
  const port = server.address().port;
  // Use port...
});

// Bad - can cause conflicts in parallel tests
server.listen(8080);
Clean Up Resources
Always close servers, timers, and other resources:
server.listen(0, () => {
  http.get(options, (res) => {
    res.resume();
    server.close(); // Important!
  });
});