Contract testing ensures your mock APIs stay synchronized with real implementations. Apicentric validates request/response schemas, status codes, and data formats, catching breaking changes before they reach production.
## Why contract testing

- **Prevent breaking changes**: Catch API mismatches before deployment
- **Documentation validation**: Ensure specs match reality
- **Team alignment**: Keep frontend and backend in sync
- **Confident refactoring**: Know when changes break contracts
- **CI/CD integration**: Automate validation in pipelines
## How it works

1. Define your expected API behavior in YAML.
2. Register the service as a contract to test against.
3. Apicentric compares mock responses with real API responses.
4. Get detailed HTML reports showing any differences.
## Setting up contract testing

### Create a service definition

First, create a mock API:
```yaml
name: users-api
version: "1.0"

server:
  port: 9000
  base_path: /api/v1
  proxy_base_url: https://api.example.com

endpoints:
  - method: GET
    path: /users
    description: List all users
    responses:
      200:
        content_type: application/json
        body: |
          [
            {"id": 1, "name": "Alice", "email": "alice@example.com"},
            {"id": 2, "name": "Bob", "email": "bob@example.com"}
          ]

  - method: GET
    path: /users/{id}
    description: Get user by ID
    responses:
      200:
        content_type: application/json
        body: |
          {
            "id": "{{params.id}}",
            "name": "Alice",
            "email": "alice@example.com"
          }
      404:
        content_type: application/json
        body: |
          {
            "error": "User not found"
          }
```
### Register the contract

Register your service as a testable contract:

```bash
apicentric contract register \
  --name users-api \
  --service services/users-api.yaml
```
This stores the contract in Apicentric’s database for future testing.
### Run a contract test

Compare your mock against the real API:

```bash
apicentric contract test \
  --contract users-api \
  --base-url https://api.example.com
```
Apicentric will:

1. Call each endpoint in your mock
2. Call the same endpoint on the real API
3. Compare responses (status, headers, body)
4. Generate a detailed HTML report
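The comparison step can be pictured with a short sketch. This is an illustration of the idea only, not Apicentric's actual implementation; the `compare_responses` helper and its response dictionaries are hypothetical:

```python
# Illustration of the comparison idea: given a mock response and a real
# response, collect mismatches in status, content type, and body.
# This is NOT Apicentric's code; all names here are hypothetical.

def compare_responses(mock: dict, real: dict) -> list:
    """Return human-readable mismatches; an empty list means a match."""
    mismatches = []
    if mock["status"] != real["status"]:
        mismatches.append(f"Status: {real['status']} (expected {mock['status']})")
    if mock["headers"].get("Content-Type") != real["headers"].get("Content-Type"):
        mismatches.append("Content-Type differs")
    if mock["body"] != real["body"]:
        mismatches.append("Body mismatch")
    return mismatches

mock = {"status": 404, "headers": {"Content-Type": "application/json"},
        "body": {"error": "User not found"}}
real = {"status": 200, "headers": {"Content-Type": "application/json"},
        "body": {"id": 999, "name": "Ghost"}}
for line in compare_responses(mock, real):
    print(line)  # "Status: 200 (expected 404)" then "Body mismatch"
```

A mismatch like this one is exactly what shows up in the HTML report as a failed endpoint.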
### View the report

Open the generated report:

```bash
open cypress/reports/contract-test-report.html
```

The report shows:

- ✅ Matching endpoints
- ❌ Mismatched responses
- 📊 Detailed diffs
- 🔍 Missing endpoints
## Configuration

Create `apicentric.json` to configure contract testing:

```json
{
  "base_url": "https://api.example.com",
  "default_timeout": 30000,
  "reports_dir": "cypress/reports",
  "simulator": {
    "enabled": true,
    "services_dir": "services",
    "db_path": "apicentric.db"
  },
  "execution": {
    "mode": "development",
    "continue_on_failure": true,
    "verbose": false
  }
}
```
## Advanced contract testing

### Testing specific endpoints

Test only certain endpoints:

```bash
apicentric contract test \
  --contract users-api \
  --endpoints "GET /users,GET /users/{id}"
```
### Testing with authentication

Add authentication headers:

```yaml
endpoints:
  - method: GET
    path: /profile
    header_match:
      Authorization: "Bearer *"
    responses:
      200:
        content_type: application/json
        body: |
          {
            "id": 1,
            "name": "Current User",
            "email": "user@example.com"
          }
```

Provide the token when testing:

```bash
export AUTH_TOKEN="your-token-here"
apicentric contract test \
  --contract users-api \
  --header "Authorization: Bearer $AUTH_TOKEN"
```
### Testing multiple scenarios

Use scenarios to test different states:

```yaml
scenarios:
  - name: empty_list
    description: No users exist
    endpoints: ["GET /users"]
    response:
      status: 200
      body: '[]'

  - name: server_error
    description: Server error scenario
    response:
      status: 500
      body: '{"error": "Internal server error"}'
```

Test a specific scenario:

```bash
apicentric contract test \
  --contract users-api \
  --scenario empty_list
```
## Continuous integration

### GitHub Actions

Automate contract testing on every PR:

`.github/workflows/contract-tests.yml`:

```yaml
name: Contract Tests

on:
  pull_request:
    paths:
      - 'services/**/*.yaml'
  schedule:
    - cron: '0 8 * * *' # Daily at 8am

jobs:
  contract-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Install Apicentric
        run: |
          curl -fsSL https://raw.githubusercontent.com/pmaojo/apicentric/main/scripts/install.sh | sh

      - name: Validate services
        run: |
          apicentric simulator validate --path services --recursive

      - name: Register contract
        run: |
          apicentric contract register \
            --name users-api \
            --service services/users-api.yaml

      - name: Run contract tests
        env:
          API_BASE_URL: ${{ secrets.API_BASE_URL }}
          API_TOKEN: ${{ secrets.API_TOKEN }}
        run: |
          apicentric contract test \
            --contract users-api \
            --base-url "$API_BASE_URL" \
            --header "Authorization: Bearer $API_TOKEN" \
            --verbose

      - name: Upload report
        if: always()
        uses: actions/upload-artifact@v3
        with:
          name: contract-test-report
          path: cypress/reports/

      - name: Comment PR with results
        if: failure()
        uses: actions/github-script@v6
        with:
          script: |
            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: '❌ Contract tests failed! Check the report for details.'
            })
```
### GitLab CI

```yaml
stages:
  - validate
  - test
  - report

validate-services:
  stage: validate
  script:
    - curl -fsSL https://raw.githubusercontent.com/pmaojo/apicentric/main/scripts/install.sh | sh
    - apicentric simulator validate --path services --recursive

contract-tests:
  stage: test
  script:
    # Each GitLab job starts from a fresh environment, so install here too
    - curl -fsSL https://raw.githubusercontent.com/pmaojo/apicentric/main/scripts/install.sh | sh
    - apicentric contract register --name users-api --service services/users-api.yaml
    - apicentric contract test --contract users-api --base-url "$API_BASE_URL"
  artifacts:
    when: always
    paths:
      - cypress/reports/
    expire_in: 1 week
```
## Testing strategies

### Development workflow

1. **Create mock first**: Define expected API behavior
2. **Develop against mock**: Build the frontend against a working mock
3. **Backend implements**: The backend team implements the real API
4. **Run contract test**: Validate that the mock matches the implementation
5. **Fix mismatches**: Update the mock or the real API to align
### Monitoring production

Schedule regular contract tests to detect API drift:

```bash
# Daily cron job
0 8 * * * cd /path/to/project && apicentric contract test --contract users-api
```
### Pre-deployment validation

Run contract tests before deploying API changes:

```bash
#!/bin/bash
# pre-deploy.sh
echo "Running contract tests..."
apicentric contract test \
  --contract users-api \
  --base-url https://staging-api.example.com

if [ $? -eq 0 ]; then
  echo "✅ Contract tests passed. Safe to deploy."
  exit 0
else
  echo "❌ Contract tests failed. Fix issues before deploying."
  exit 1
fi
```
## Understanding test results

### Successful match

```text
✅ GET /users
   Status: 200 (expected 200)
   Content-Type: application/json
   Body matches schema ✓
```

### Schema mismatch

```text
❌ GET /users/{id}
   Status: 200 (expected 200)
   Body mismatch:
   - Mock has field 'email'
   + Real API missing field 'email'
   + Real API has extra field 'username'
```
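A field-level diff like this can be sketched in a few lines. The `diff_fields` helper below is a hypothetical illustration of how missing and extra fields get reported, not Apicentric's implementation:

```python
# Illustration only: report fields the mock expects but the real API omits,
# and fields the real API returns that the mock does not declare.
# The helper and sample bodies are hypothetical, not Apicentric's code.

def diff_fields(mock_body: dict, real_body: dict) -> list:
    lines = []
    for field in sorted(mock_body.keys() - real_body.keys()):
        lines.append(f"- Mock has field '{field}', missing from real API")
    for field in sorted(real_body.keys() - mock_body.keys()):
        lines.append(f"+ Real API has extra field '{field}'")
    return lines

mock_user = {"id": 1, "name": "Alice", "email": "alice@example.com"}
real_user = {"id": 1, "name": "Alice", "username": "alice"}
for line in diff_fields(mock_user, real_user):
    print(line)
# - Mock has field 'email', missing from real API
# + Real API has extra field 'username'
```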
### Status code mismatch

```text
❌ GET /users/999
   Status: 200 (expected 404)
   Expected not found error, got success response
```

### Missing endpoint

```text
⚠️ POST /users
   Endpoint exists in mock but not found in real API
   Real API returned 404 Not Found
```
## Best practices

Mock realistic responses that match production data structures:

```yaml
# Good
body: |
  {
    "id": 1,
    "created_at": "2024-01-15T10:30:00Z",
    "email": "user@example.com"
  }

# Bad
body: '{"data": "test"}'
```
Include error responses in your contracts:

```yaml
responses:
  200:
    body: '{"success": true}'
  400:
    body: '{"error": "Bad request"}'
  401:
    body: '{"error": "Unauthorized"}'
  404:
    body: '{"error": "Not found"}'
  500:
    body: '{"error": "Server error"}'
```
Track contract versions alongside API versions:

```yaml
name: users-api
version: "2.0.0"
```
Document breaking changes by using endpoint descriptions to explain them:

```yaml
endpoints:
  - method: GET
    path: /users
    description: |
      v2.0: Added 'role' field to response
      v2.0: Removed 'permissions' array (moved to /users/{id}/permissions)
```
Schedule tests to catch API drift:

- Daily for critical APIs
- Weekly for stable APIs
- On every deployment
## Troubleshooting

### Contract test fails with timeout

Increase the timeout in your configuration:

```json
{
  "default_timeout": 60000
}
```
### Real API requires authentication

Add headers to your test command:

```bash
apicentric contract test \
  --contract users-api \
  --header "Authorization: Bearer $TOKEN" \
  --header "X-API-Key: $API_KEY"
```
### Response body has dynamic fields

Use wildcards or ignore specific fields:

```yaml
responses:
  200:
    body: |
      {
        "id": "{{params.id}}",
        "timestamp": "*",
        "data": {}
      }
```
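Wildcard matching can be pictured like this: a `"*"` in the expected body accepts any actual value at that position, while everything else must match exactly. This is a sketch of the idea with a hypothetical `matches` helper, not Apicentric's matcher:

```python
# Illustration: tolerant body comparison where "*" in the expected (mock)
# value accepts any actual value. Hypothetical helper, not Apicentric's code.

def matches(expected, actual) -> bool:
    if expected == "*":  # wildcard accepts anything
        return True
    if isinstance(expected, dict) and isinstance(actual, dict):
        return expected.keys() == actual.keys() and all(
            matches(value, actual[key]) for key, value in expected.items()
        )
    return expected == actual

expected = {"id": "42", "timestamp": "*", "data": {}}
actual = {"id": "42", "timestamp": "2024-01-15T10:30:00Z", "data": {}}
print(matches(expected, actual))  # → True
```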
> **Warning:** Contract tests call real APIs. Ensure you’re testing against staging or development environments to avoid affecting production data.
## Next steps

- **Digital twin setup**: Create IoT device simulations
- **Request validation**: Add schema validation to contracts
- **Dockerizing services**: Run contract tests in containers
- **Export specs**: Generate documentation from contracts