Overview
Adapt follows a pragmatic testing approach: add tests where they provide value for preventing regressions, especially during refactoring or bug fixes. We use table-driven tests, sqlmock for database operations, and interface-based mocking for dependencies.
Running Tests
Quick Commands
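The everyday commands are the standard Go toolchain invocations; package paths below are illustrative, not actual Adapt paths.

```shell
# Run all tests in the module
go test ./...

# Run tests for a single package (path is illustrative)
go test ./internal/jobs/...

# Run one test by name
go test -run TestCreateJob ./...

# Run tests with a coverage summary
go test -cover ./...
```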
CI/CD Pipeline
GitHub Actions runs comprehensive checks on every push/PR:
- Linting - golangci-lint v2.9.0 with Go 1.26 support
- Unit Tests - All package tests
- Integration Tests - Database and API tests
- Coverage Reporting - Reports to Codecov
Testing Patterns
Table-Driven Tests
Table-driven tests are the recommended pattern for covering multiple scenarios in one test function.
Interface-Based Testing
Use interface mocks for API endpoints that receive their dependencies via injection.
Sqlmock Testing
Use sqlmock for functions that issue direct SQL queries.
Database Testing
Exercise database operations against a real test database.
Test File Structure
Organise tests alongside the implementation:
- test_mocks.go - shared mocks and utilities per package
- *_test.go - focused tests per file or endpoint group
- Use sqlmock for direct SQL query testing
- Use interface mocks for dependency injection testing
Coverage Approach
We follow a pragmatic coverage strategy rather than chasing a coverage number.
When to Add Tests
New Functions
Add tests where they provide value for preventing regressions:
- Complex business logic
- Critical user workflows
- Edge cases and error handling
- Functions likely to change
Existing Functions
Add tests opportunistically:
- During refactoring
- When fixing bugs
- When adding new features
- When reducing complexity
What to Test
High Priority:
- Job creation and lifecycle management
- Task claiming and processing logic
- Database operations (CRUD, transactions)
- Authentication and authorization
- Critical business logic
- Error handling and recovery
Low Priority:
- Simple getter/setter functions
- Trivial utility functions
- UI rendering code
- Third-party library wrappers
Extract + Test + Commit Pattern
When refactoring large functions, follow this proven methodology:
Analyse Structure
Map distinct responsibilities within the function:
- Identify logical sections
- Find reusable patterns
- Spot potential extractions
Commit Steps
Commit each extraction separately:
feat: extract validateJobOptions function
test: add tests for validateJobOptions
refactor: use validateJobOptions in CreateJob
Recent Refactoring Success
5 monster functions eliminated:
- getJobTasks: 216 → 56 lines (74% reduction)
- CreateJob: 232 → 42 lines (82% reduction)
- setupJobURLDiscovery: 108 → 17 lines (84% reduction)
- setupSchema: 216 → 27 lines (87% reduction)
- WarmURL: 377 → 68 lines (82% reduction)
Manual Testing
API Testing
Use HTTP clients to exercise the API endpoints.
Job Queue Testing
Exercise the job queue system end to end.
Integration Tests
Integration tests require a test database.
Debugging Tests
Verbose Output
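The standard flag for this is `-v`:

```shell
# Print each test as it runs, including subtests and log output
go test -v ./...
```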
Race Detection
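Go's built-in race detector catches concurrent access bugs:

```shell
# Run tests with the race detector enabled
go test -race ./...
```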
Memory Profiling
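Heap profiles can be captured with the standard test flags and inspected with pprof:

```shell
# Write a heap profile, then inspect it interactively
go test -memprofile=mem.prof ./...
go tool pprof mem.prof
```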
Best Practices
Test Isolation
- Each test should be independent
- Use table-driven tests for multiple scenarios
- Clean up resources in defer statements
- Don’t rely on test execution order
Mock Management
- Share common mocks in test_mocks.go
- Use interface-based mocking for flexibility
- Verify all mock expectations are met
- Keep mocks simple and focused
Error Testing
- Test both success and failure paths
- Verify error messages and types
- Test edge cases and boundary conditions
- Include timeout and cancellation scenarios
Coverage Mindset
- Don’t chase 100% coverage
- Focus on high-value tests
- Add tests during refactoring
- Test critical paths thoroughly
Troubleshooting
Tests Hanging
Check for:
- Unclosed database connections
- Missing context cancellation
- Goroutine leaks
- Deadlocks in concurrent code
Flaky Tests
Common causes:
- Race conditions (use the -race flag)
- Timing dependencies (use proper synchronisation)
- Shared state between tests
- External dependencies (mock them)
Mock Errors
If mock expectations fail:
- Verify exact arguments match
- Check call order for ordered expectations
- Ensure all expected calls are made
- Review mock setup in test
Next Steps
- Read Architecture to understand what to test
- Review Database Schema for database testing
- Check Local Setup for environment configuration