Overview
Testing is a critical part of contributing to Terraform. All pull requests must pass tests before being merged, and most changes require updates to the test suite.
Running Tests
Unit Tests
Run the full unit test suite from the root of the repository. The unit test suite is self-contained, using mocks and local files to ensure it can run offline and isn’t broken by external system changes.
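From a checkout of the repository, the whole suite runs with the standard Go tooling:

```shell
# Run unit tests for every package in the repository.
go test ./...
```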
Test Specific Packages
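Packages are addressed by path; the path here is illustrative — substitute the package you are changing:

```shell
# Test one package and everything beneath it.
go test ./internal/command/...
```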
For faster iteration, test only the package you’re working on.
Verbose Output
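The -v flag makes go test print each test’s name and log output as it runs (the package path is illustrative):

```shell
# Verbose output for one package.
go test -v ./internal/command/
```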
Detailed test output is especially useful when diagnosing failures.
Run Specific Tests
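The -run flag takes a regular expression matched against test names; the test and package names here are illustrative:

```shell
# Run only tests whose names match TestPlan.
go test -run TestPlan ./internal/command/
```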
Running a single test by name is the fastest way to iterate on a specific failure.
Acceptance Tests
Acceptance tests interact with external services and are disabled by default.
Enable Acceptance Tests
Set the TF_ACC environment variable.
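For example, scoped to a single invocation and package (package path illustrative):

```shell
# TF_ACC=1 enables acceptance tests for this run only.
TF_ACC=1 go test -v ./internal/command/
```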
Best Practices for Acceptance Tests
- Focus on specific packages: Only enable for the package you’re testing
- Run unchanged tests first: Verify existing tests pass before making changes
- Expect drift: External systems may cause test failures unrelated to your changes
- Handle failures: Investigate whether failures are due to your changes or external drift
Example Workflow
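A typical acceptance-testing workflow following the practices above might look like this sketch (the package path and test name are illustrative):

```shell
# 1. Verify existing acceptance tests pass before touching anything.
TF_ACC=1 go test ./internal/command/
# 2. Make your changes, then re-run the same package's tests.
TF_ACC=1 go test ./internal/command/
# 3. If something fails, narrow it to one test to decide whether the
#    cause is your change or external drift.
TF_ACC=1 go test -v -run TestInit ./internal/command/
```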
Equivalence Tests
Equivalence tests are E2E tests that verify Terraform command output doesn’t change unexpectedly.
What Are Equivalence Tests?
These tests compare the output of Terraform commands before and after code changes. They help catch unintended behavioral changes.
Running Equivalence Tests
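A manual run might look like the sketch below. The binary name and flag names are assumptions based on the terraform-equivalence-testing framework and may differ between versions, so consult that repository’s README before relying on them:

```shell
# Illustrative only: compare current command output against stored golden files.
equivalence-testing diff \
  --binary=/path/to/terraform \
  --goldens=testing/equivalence-tests/outputs \
  --tests=testing/equivalence-tests/tests
```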
Equivalence tests use the terraform-equivalence-testing framework.
Automated Equivalence Testing
Equivalence tests run automatically in CI:
- On PR open: Runs diff and comments with results
- On PR close: Runs update and opens PR with new reference outputs
If no changes are detected, the equivalence testing process is invisible to PR authors.
Writing New Equivalence Tests
Add new test cases to the testing/equivalence-tests/tests directory. Each test should be in a separate directory following the framework’s guidelines.
Test Requirements for PRs
Before submitting a pull request:
1. Verify Existing Tests Pass
2. Add Tests for New Functionality
New features require:
- Unit tests for core functionality
- Integration tests where appropriate
- Acceptance tests if interacting with external services
3. Update Tests for Changed Behavior
If your changes affect existing functionality:
- Update affected tests to match new behavior
- Ensure test descriptions reflect the new expectations
- Document why the change in behavior is correct
Code Coverage
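Go’s toolchain can write a coverage profile and render it as an annotated HTML report:

```shell
# Write a coverage profile for all packages, then open it in a browser.
go test -coverprofile=coverage.out ./...
go tool cover -html=coverage.out
```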
Coverage reports help you spot untested code paths before review.
Benchmarking
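Benchmarks also run through go test; -bench selects them by regular expression (the package path is illustrative):

```shell
# Run every benchmark in the package and report allocations as well.
go test -bench=. -benchmem ./internal/command/
```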
Run benchmark tests before and after performance-sensitive changes.
Common Testing Patterns
Table-Driven Tests
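A table-driven test lists its cases as data and runs each as a subtest via t.Run. This self-contained sketch uses a stand-in parseCount helper rather than real Terraform code:

```go
package command_test

import (
	"strconv"
	"testing"
)

// parseCount stands in for whatever function is under test.
func parseCount(s string) (int, error) {
	return strconv.Atoi(s)
}

func TestParseCount(t *testing.T) {
	tests := []struct {
		name    string
		input   string
		want    int
		wantErr bool
	}{
		{"simple integer", "3", 3, false},
		{"not a number", "abc", 0, true},
		{"empty input", "", 0, true},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got, err := parseCount(tt.input)
			if (err != nil) != tt.wantErr {
				t.Fatalf("error = %v, wantErr = %t", err, tt.wantErr)
			}
			if got != tt.want {
				t.Errorf("got %d, want %d", got, tt.want)
			}
		})
	}
}
```

Adding a case is a one-line change, and each case fails independently with its name attached.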
Terraform uses table-driven tests extensively.
Using Test Fixtures
Test data is stored in testdata directories.
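The go tool never compiles anything under a directory named testdata, so fixtures placed there are read only by tests. A test might load one like this (the fixture filename is illustrative):

```go
package command_test

import (
	"os"
	"path/filepath"
	"testing"
)

func TestLoadFixture(t *testing.T) {
	// Tests run with the package directory as the working directory,
	// so relative testdata paths resolve correctly.
	data, err := os.ReadFile(filepath.Join("testdata", "basic.tf"))
	if err != nil {
		t.Fatalf("reading fixture: %s", err)
	}
	if len(data) == 0 {
		t.Fatal("fixture file is empty")
	}
}
```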
Debugging Tests
See the Debugging guide for details on debugging tests in VS Code and other IDEs, including a quick VS Code example.
Continuous Integration
PR Checks
When you open a PR, the following checks run:
- CLA: Sign the Contributor License Agreement (first-time contributors)
- Unit tests: All tests must pass
- Acceptance tests: Tests that interact with external services
- Change files: Verify changelog entries or the no-changelog-needed label
- Vercel: Internal tool (doesn’t work for external contributors)
Test Parallelization
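Two standard go test flags control parallelism:

```shell
# -p 1 builds and tests one package at a time; -parallel 1 keeps tests
# that call t.Parallel from running concurrently within a package.
go test -p 1 -parallel 1 ./...
```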
Tests run in parallel by default, but parallelism can be disabled when needed.
Mock Testing
Terraform uses go-mock for generating mocks, driven by //go:generate directives in the source.
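A //go:generate directive typically sits next to the interface it mocks. Everything in this sketch (file names, package, interface) is illustrative, and newer codebases may use the go.uber.org/mock fork of golang/mock:

```go
// client.go — running `go generate ./...` executes the directive below
// and regenerates mock_client.go from this file.
package provider

//go:generate go run github.com/golang/mock/mockgen -source=client.go -destination=mock_client.go -package=provider

// Client is an illustrative interface that tests need to mock.
type Client interface {
	Get(id string) (string, error)
}
```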
Performance Testing
For performance-sensitive changes:
- Benchmark before and after your changes
- Compare results to ensure no regression
- Include benchmark results in PR description
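One common way to do the comparison is with the benchstat tool from golang.org/x/perf; the package path and repeat count below are illustrative:

```shell
# Baseline on main (-count repeats each benchmark for statistical stability).
go test -bench=. -count=10 ./internal/command/ > old.txt
# Same benchmarks again on your branch.
go test -bench=. -count=10 ./internal/command/ > new.txt
# Summarize the difference, with confidence intervals.
go run golang.org/x/perf/cmd/benchstat@latest old.txt new.txt
```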
Next Steps
Debugging
Learn how to debug tests and Terraform operations
Code Style
Follow code style requirements for tests
Pull Requests
Submit your tested changes for review