Home Assistant has an extensive test suite to ensure code quality and prevent regressions. All contributions must include tests.

Running Tests

Prerequisites

Ensure you have installed test dependencies:
uv pip install -r requirements_test_all.txt

Run All Tests

pytest tests/

Run Specific Tests

# Run tests for a specific component
pytest tests/components/light/

# Run a specific test file
pytest tests/components/light/test_init.py

# Run a specific test function
pytest tests/components/light/test_init.py::test_light_turn_on

# Run tests matching a pattern
pytest -k "test_light"

Useful pytest Options

# Run with verbose output
pytest -v

# Stop on first failure
pytest -x

# Show print statements
pytest -s

# Run tests in parallel (faster)
pytest -n auto

# Only run tests that failed last time
pytest --lf

# Show test durations
pytest --durations=10

Test Configuration

Test configuration is defined in pyproject.toml:
[tool.pytest.ini_options]
testpaths = ["tests"]
norecursedirs = [".git", "testing_config"]
asyncio_mode = "auto"
asyncio_default_fixture_loop_scope = "function"
log_format = "%(asctime)s.%(msecs)03d %(levelname)-8s %(threadName)s %(name)s:%(filename)s:%(lineno)s %(message)s"
log_date_format = "%Y-%m-%d %H:%M:%S"
asyncio_debug = true

Key Settings

  • testpaths - Directory containing tests
  • asyncio_mode = "auto" - Automatically detect and run async tests
  • asyncio_debug = true - Enable asyncio debug mode to catch issues
  • filterwarnings - Configure which warnings to show or ignore
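The filterwarnings setting uses standard pytest warning-filter syntax. An illustrative fragment (not copied from Home Assistant's actual pyproject.toml; the module name is a placeholder):

```toml
[tool.pytest.ini_options]
filterwarnings = [
    # Escalate any unexpected warning into a test failure
    "error",
    # Silence a known deprecation from a third-party dependency
    # ("some_third_party_lib" is a placeholder, not a real package)
    "ignore::DeprecationWarning:some_third_party_lib.*",
]
```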

Test Plugins

Home Assistant’s test suite uses several pytest plugins:
  • pytest-asyncio (1.3.0) - Support for async test functions
  • pytest-aiohttp (1.1.0) - Fixtures for testing aiohttp applications
  • pytest-cov (7.0.0) - Code coverage measurement
  • pytest-freezer (0.4.9) - Freeze time in tests
  • pytest-socket (0.7.0) - Disable or restrict socket calls
  • pytest-timeout (2.4.0) - Set timeouts for tests
  • pytest-sugar (1.0.0) - Better test output formatting

Writing Tests

Test Structure

Tests are organized to mirror the source code structure:
tests/
├── components/
│   └── light/
│       ├── __init__.py
│       ├── test_init.py
│       ├── test_config_flow.py
│       └── test_device_trigger.py
├── helpers/
│   └── test_entity.py
└── util/
    └── test_dt.py

Basic Test Example

"""Tests for the light component."""
from homeassistant.components import light
from homeassistant.core import HomeAssistant
from homeassistant.setup import async_setup_component


async def test_light_turn_on(hass: HomeAssistant) -> None:
    """Test turning on a light."""
    # Setup
    await async_setup_component(hass, light.DOMAIN, {})
    
    # Action
    await hass.services.async_call(
        light.DOMAIN,
        "turn_on",
        {"entity_id": "light.test"},
        blocking=True,
    )
    
    # Assert
    state = hass.states.get("light.test")
    assert state.state == "on"

Common Fixtures

Home Assistant provides useful fixtures:
  • hass - Home Assistant instance
  • aiohttp_client - Test client for HTTP requests
  • hass_ws_client - WebSocket client
  • snapshot - Snapshot testing for data structures

Async Tests

Most Home Assistant code is async, so tests should be too:
async def test_async_function(hass: HomeAssistant) -> None:
    """Test an async function."""
    result = await my_async_function()
    assert result == expected_value
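With asyncio_mode = "auto", pytest drives each `async def test_*` coroutine on an event loop for you. Outside pytest, you would have to do this yourself; a stdlib-only sketch of what the plugin spares you (my_async_function is a placeholder for code under test):

```python
import asyncio


# Placeholder coroutine standing in for real async code under test
async def my_async_function() -> int:
    await asyncio.sleep(0)  # yield to the event loop once
    return 42


# Without pytest-asyncio, a coroutine must be driven explicitly;
# asyncio_mode = "auto" does the equivalent for every async test.
result = asyncio.run(my_async_function())
```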

Freezing Time

Use freezegun or pytest-freezer to control time in tests:
from freezegun import freeze_time

@freeze_time("2026-03-10 12:00:00")
async def test_time_dependent_behavior(hass: HomeAssistant) -> None:
    """Test behavior at a specific time."""
    # Your test code

Mocking

Use unittest.mock for mocking:
from unittest.mock import patch, Mock

async def test_with_mock(hass: HomeAssistant) -> None:
    """Test with mocked external dependency."""
    with patch("homeassistant.components.light.async_get_data") as mock_get:
        mock_get.return_value = {"brightness": 255}
        # Your test code
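The same patterns work for both the happy path and error injection. A self-contained sketch (fetch_brightness and get_data are hypothetical names, not Home Assistant's API):

```python
from unittest.mock import Mock


# Hypothetical code under test that calls an external client
def fetch_brightness(client) -> int:
    return client.get_data()["brightness"]


# Happy path: return_value supplies canned data
mock_client = Mock()
mock_client.get_data.return_value = {"brightness": 255}
brightness = fetch_brightness(mock_client)
mock_client.get_data.assert_called_once()  # verify the dependency was used

# Error path: side_effect makes the next call raise instead of return
mock_client.get_data.side_effect = TimeoutError("device unreachable")
timed_out = False
try:
    fetch_brightness(mock_client)
except TimeoutError:
    timed_out = True
```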

Testing Config Flows

from homeassistant import config_entries
from homeassistant.data_entry_flow import FlowResultType

async def test_config_flow(hass: HomeAssistant) -> None:
    """Test the config flow."""
    result = await hass.config_entries.flow.async_init(
        "my_integration",
        context={"source": config_entries.SOURCE_USER},
    )
    
    assert result["type"] == FlowResultType.FORM
    assert result["step_id"] == "user"

Code Coverage

Home Assistant uses coverage.py to measure test coverage.

Run Tests with Coverage

# Run tests and generate coverage report
pytest --cov=homeassistant --cov-report=html

# View HTML report
open htmlcov/index.html

Coverage Configuration

Coverage is configured in pyproject.toml:
[tool.coverage.run]
source = ["homeassistant"]

[tool.coverage.report]
exclude_lines = [
  "pragma: no cover",
  "def __repr__",
  "raise AssertionError",
  "raise NotImplementedError",
  "if TYPE_CHECKING:",
  "@overload",
]
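In practice, the exclude_lines patterns let you mark deliberate dead ends so they do not drag the percentage down. A small illustration (toggle and _never_called are made-up names):

```python
def _never_called() -> None:  # pragma: no cover
    # The marker above matches exclude_lines, so this defensive
    # branch is omitted from the coverage report entirely.
    raise AssertionError("unreachable")


def toggle(state: str) -> str:
    if state not in ("on", "off"):
        _never_called()
    return "off" if state == "on" else "on"
```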

Coverage Requirements

  • New code should have high test coverage (aim for 90%+)
  • All public APIs must be tested
  • Edge cases and error conditions should be covered

Test Best Practices

Do’s

  • Write tests first - Consider TDD (Test-Driven Development)
  • Test behavior, not implementation - Tests should verify what code does, not how it does it
  • Use descriptive names - Test names should clearly describe what they test
  • Keep tests isolated - Each test should be independent
  • Test error cases - Don’t just test the happy path
  • Use fixtures appropriately - Leverage pytest fixtures for setup/teardown

Don’ts

  • Don’t test external services - Mock external dependencies
  • Don’t write flaky tests - Tests should be deterministic
  • Don’t share state - Avoid global variables or shared mutable objects
  • Don’t test private methods - Test public interfaces instead
  • Don’t make network calls - Use mocks or pytest-socket will catch them

Debugging Tests

Using pdb

async def test_debug_example(hass: HomeAssistant) -> None:
    """Test with debugger."""
    breakpoint()  # or: import pdb; pdb.set_trace()
    # Test code

Logging in Tests

import logging

_LOGGER = logging.getLogger(__name__)

async def test_with_logging(hass: HomeAssistant) -> None:
    """Test with logging."""
    _LOGGER.debug("Debug message")
    # Test code
Run pytest with -s to see log output:
pytest -s tests/components/light/test_init.py
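Pytest's built-in caplog fixture can also capture log records for assertions. A stdlib-only sketch of the same idea, attaching a handler and inspecting what was emitted:

```python
import logging
from io import StringIO

# Attach an in-memory handler, run the code, inspect the output --
# roughly what pytest's `caplog` fixture automates for you.
stream = StringIO()
handler = logging.StreamHandler(stream)
logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)

logger.debug("Debug message")

captured = stream.getvalue()
logger.removeHandler(handler)
```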

Continuous Integration

Home Assistant uses GitHub Actions for CI. Tests run automatically on:
  • Pull requests
  • Commits to the dev branch
  • Commits to the master branch
All tests must pass before a PR can be merged.

Component-Specific Testing

Testing Integrations

Integrations typically need tests for:
  • Config flow - Setup and configuration
  • Entity creation - Entities are created correctly
  • State updates - Entities reflect correct states
  • Services - Service calls work as expected
  • Unloading - Clean shutdown and resource cleanup

Testing Helpers

Helper modules should have:
  • Unit tests for all public functions
  • Tests for edge cases
  • Tests for error handling
