Quick Start

Run the full test suite with linting:
make test
This command runs:
  1. Linting - Ruff code quality checks and formatting validation
  2. Type checking - mypy static type analysis
  3. Unit tests - pytest with parallel execution

Test Framework

The GOV.UK Notify API test suite uses:
  • pytest - Test framework
  • pytest-xdist - Parallel test execution
  • pytest-mock - Mocking utilities
  • pytest-env - Environment variable management
  • pytest-testmon - Test monitoring for faster re-runs
  • moto - AWS service mocking
  • freezegun - Time/date mocking
  • requests-mock - HTTP request mocking

Running Tests

Full Test Suite

make test
This runs:
  • ruff check . - Linting
  • ruff format --check . - Format checking
  • mypy - Type checking
  • pytest -n logical --maxfail=10 - Tests with parallel execution

Unit Tests Only

pytest

Parallel Execution

Tests run in parallel using pytest-xdist:
pytest -n logical --maxfail=10
  • -n logical - Starts one worker process per logical CPU core
  • --maxfail=10 - Stops the run after 10 failures to save time

Watch Mode

Automatically re-run tests when files change:
make watch-tests
This uses:
  • ptw (pytest-watch) - File watcher
  • pytest-testmon - Smart test selection (only affected tests)

Specific Tests

# Run tests in a specific file
pytest tests/app/test_example.py

# Run a specific test class
pytest tests/app/test_example.py::TestClassName

# Run a specific test function
pytest tests/app/test_example.py::test_function_name

# Run tests matching a pattern
pytest -k "test_pattern"

Test Configuration

Test configuration is defined in pytest.ini:
[pytest]
testpaths = tests
env =
    NOTIFY_ENVIRONMENT=test
    MMG_API_KEY=mmg-secret-key
    FIRETEXT_API_KEY=Firetext
    NOTIFICATION_QUEUE_PREFIX=testing
    REDIS_ENABLED=0
addopts = -p no:warnings
xfail_strict = true

Environment Variables

Tests automatically set:
  • NOTIFY_ENVIRONMENT=test
  • REDIS_ENABLED=0 (Redis disabled by default in tests)
  • Test API keys for SMS providers

Test Options

  • -p no:warnings - Suppresses warnings
  • xfail_strict = true - Marks unexpectedly passing xfail tests as failures
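With xfail_strict enabled, an xfail-marked test that unexpectedly passes is reported as a failure rather than XPASS. A minimal sketch (the test name and reason are illustrative):

```python
import pytest

@pytest.mark.xfail(reason="illustrative known bug")
def test_known_bug():
    # Expected to fail; under xfail_strict = true an unexpected
    # pass would itself be reported as a failure
    assert 1 == 2
```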

Test Databases

The test suite creates a separate database for each pytest-xdist worker, so parallel tests do not interfere with each other.

Database Naming

When running with -n logical, the test setup creates:
  • test_notification_api_master - Master database
  • test_notification_api_gw0, gw1, gw2, etc. - Worker databases
The number of worker databases matches your logical CPU count.
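A per-worker database URI can be derived from the xdist worker id, which pytest-xdist exposes via the PYTEST_XDIST_WORKER environment variable. A minimal sketch (the function name and base URI are illustrative, not the project's actual conftest):

```python
import os

def worker_database_uri(base_uri="postgresql://localhost/test_notification_api"):
    # pytest-xdist sets PYTEST_XDIST_WORKER to gw0, gw1, ... in each worker;
    # the variable is unset when tests run without -n
    worker_id = os.environ.get("PYTEST_XDIST_WORKER")
    if worker_id is None:
        return base_uri  # single-process run uses the base database
    return f"{base_uri}_{worker_id}"
```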

Clean Up Test Databases

# Drop all test databases
make drop-test-dbs

# Drop test databases in Docker
make drop-test-dbs-in-docker

Linting

Run linting and formatting checks:
make lint
This runs:

Ruff

# Check for issues
ruff check .

# Auto-fix issues
ruff check --fix .

# Check formatting
ruff format --check .

# Apply formatting
ruff format .

mypy

# Run type checking
mypy

Code Quality Tools

Pre-commit Hooks

Pre-commit hooks automatically run on every commit:
# From .pre-commit-config.yaml
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
  hooks:
  - trailing-whitespace
  - end-of-file-fixer
  - check-yaml
  - debug-statements
- repo: https://github.com/charliermarsh/ruff-pre-commit
  hooks:
  - ruff (with --fix)
  - ruff-format
Run manually on all files:
pre-commit run --all-files

Type Hints

The project uses mypy for static type checking. Type stubs are included for:
  • types-cachetools
  • types-pycurl
  • types-python-dateutil
  • types-requests

Testing Utilities

AWS Service Mocking

Use moto to mock AWS services:
import boto3
from moto import mock_s3

@mock_s3
def test_s3_operation():
    # All boto3 calls inside the decorated test hit moto's in-memory S3
    s3 = boto3.client("s3", region_name="us-east-1")
    s3.create_bucket(Bucket="test-bucket")
    assert "test-bucket" in [b["Name"] for b in s3.list_buckets()["Buckets"]]

Time/Date Mocking

Use freezegun for time-based tests:
from datetime import datetime

from freezegun import freeze_time

@freeze_time("2024-01-15 12:00:00")
def test_time_sensitive_operation():
    # datetime.utcnow() returns the frozen time inside this test
    assert datetime.utcnow() == datetime(2024, 1, 15, 12, 0)

HTTP Request Mocking

Use requests-mock for HTTP requests:
import requests

def test_api_call(requests_mock):
    # requests_mock is a fixture provided by the requests-mock plugin
    requests_mock.get('https://api.example.com', json={'status': 'ok'})
    response = requests.get('https://api.example.com')
    assert response.json() == {'status': 'ok'}

Flaky Test Handling

Use the flaky decorator for tests that occasionally fail:
from flaky import flaky

@flaky(max_runs=3, min_passes=2)
def test_occasionally_flaky():
    # Re-runs up to 3 times; the test passes if at least 2 runs succeed
    pass

Docker Testing

The repository includes Docker images for testing:

Test Image

# Build test image
docker build -f docker/Dockerfile --target test -t notifications-api .

# Run tests in container
docker run notifications-api make test

Concourse Test Image

The concourse_tests target includes PostgreSQL 15 for CI/CD:
FROM production AS concourse_tests
# Includes PostgreSQL 15
# Sets SQLALCHEMY_DATABASE_URI to localhost

Writing Tests

Test Location

Tests should mirror the application structure:
app/
├── v2/
│   └── notifications/
│       └── post_notifications.py
tests/
├── app/
│   └── v2/
│       └── notifications/
│           └── test_post_notifications.py

Test Naming

  • Test files: test_*.py or *_test.py
  • Test functions: test_*
  • Test classes: Test*

Best Practices

  1. Isolate tests - Each test should be independent
  2. Use fixtures - Leverage pytest fixtures for setup/teardown
  3. Mock external services - Don’t make real API calls
  4. Clear assertions - Use descriptive assertion messages
  5. Test edge cases - Don’t just test the happy path
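The first few practices can be sketched in a single test (names and data are illustrative):

```python
import pytest

@pytest.fixture
def sample_notification():
    # Fixture handles setup; each test receives a fresh, independent dict
    return {"id": "abc123", "status": "created"}

def test_notification_starts_as_created(sample_notification):
    # Descriptive assertion message makes failures easy to diagnose
    assert sample_notification["status"] == "created", (
        "new notifications should start in the 'created' state"
    )
```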

Continuous Integration

Tests run automatically in Concourse CI/CD pipeline:
  • Uses the concourse_tests Docker image
  • PostgreSQL 15 runs in the container
  • All test databases are created/destroyed automatically

Performance

Test Execution Time

To identify slow tests:
pytest --durations=10

Database Performance

Tests run with one pytest-xdist worker per logical CPU core, and each worker uses its own database, which minimizes conflicts and improves parallel execution performance.

Troubleshooting

Tests Hanging

If tests hang, check for:
  • Database connection issues
  • Leftover test databases
make drop-test-dbs

Import Errors

Ensure dependencies are installed:
make bootstrap

Database Errors

Ensure PostgreSQL is running:
# macOS with Postgres.app
# Check that Postgres.app is running

# Or via Homebrew
brew services list
