COR-Matrix Integration
COR-Matrix (Code Origin Retention Matrix) is an advanced analytics feature that tracks AI-generated code retention patterns in your codebase. It helps you understand how much AI-generated code persists over time and analyze the quality and utility of AI contributions.
Overview
COR-Matrix provides:
- Code Origin Tracking: Identify which code was AI-generated vs human-written
- Retention Analysis: Track how long AI-generated code remains in the codebase
- Quality Metrics: Analyze patterns in code modifications and refactoring
- Team Insights: Understand AI adoption and effectiveness across your team
How It Works
Code Origin Tagging
When HAI Build generates or modifies code:
- Each change is tagged with metadata (timestamp, task ID, file path)
- Tags are sent to the COR-Matrix service
- The service tracks the code’s lifecycle
- Analytics are generated on retention patterns
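The tag metadata sent to the service might look like the following; the field names here are illustrative, since the actual schema is defined by the COR-Matrix service:

```json
{
  "taskId": "task-123",
  "filePath": "src/components/Login.tsx",
  "timestamp": "2024-05-01T12:00:00Z",
  "linesAdded": 42,
  "origin": "ai-generated"
}
```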
Retention Tracking
COR-Matrix monitors:
- Initial Generation: When AI creates new code
- Modifications: When humans edit AI-generated code
- Deletions: When AI-generated code is removed
- Persistence: How long code remains unchanged
Analytics Dashboard
Visualize:
- Retention rates over time
- Most stable AI-generated components
- Frequently modified sections
- Team adoption metrics
Configuration
Setup via .hai.config
Create or edit .hai.config in your workspace root.
Configuration Parameters
| Parameter | Description | Required |
|---|---|---|
| cormatrix.baseURL | COR-Matrix service endpoint | Yes |
| cormatrix.token | Authentication token | Yes |
| cormatrix.workspaceId | Unique workspace identifier | Yes |
Example Configuration
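A minimal configuration using the three required parameters, assuming a JSON layout (the exact format of .hai.config may differ in your setup):

```json
{
  "cormatrix": {
    "baseURL": "https://cormatrix.example.com",
    "token": "${CORMATRIX_TOKEN}",
    "workspaceId": "my-team-workspace"
  }
}
```

Referencing the token through an environment variable, as sketched here, keeps the credential itself out of the file.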
Security Considerations
Protecting Sensitive Data
Add .hai.config to .gitignore so tokens are never committed:
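For instance, a minimal .gitignore entry:

```gitignore
# Contains COR-Matrix credentials: do not commit
.hai.config
```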
Environment-Specific Configuration
Use different configs for different environments: a local .hai.config for development, and platform-managed secrets for CI/CD pipelines.
Token Management
Rotate tokens regularly:
- Generate a new token in the COR-Matrix dashboard
- Update .hai.config locally
- Update secrets in your CI/CD platform
- Revoke the old token
Data Collected
Code Change Events
For each AI-generated change, a tagging event is recorded.
Modification Events
When AI-generated code is modified, a follow-up event is recorded.
Privacy
COR-Matrix collects:
- ✅ File paths and names
- ✅ Line counts (additions, deletions, modifications)
- ✅ Timestamps
- ✅ Task IDs
- ✅ User IDs (hashed)
- ❌ Actual code content (NOT collected)
- ❌ Sensitive data (NOT collected)
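On the client side, this metadata-only policy can be enforced with an allowlist before any event is sent. The sketch below illustrates the idea; the field names and the hashing choice are assumptions, not the actual HAI Build implementation:

```python
import hashlib

# Fields permitted to leave the client (hypothetical names,
# mirroring the metadata categories listed above).
ALLOWED_FIELDS = {"file_path", "lines_added", "lines_deleted",
                  "lines_modified", "timestamp", "task_id", "user_id"}

def sanitize_event(event: dict) -> dict:
    """Drop any field not on the allowlist and hash the user ID,
    so code content and sensitive data can never be transmitted."""
    clean = {k: v for k, v in event.items() if k in ALLOWED_FIELDS}
    if "user_id" in clean:
        clean["user_id"] = hashlib.sha256(
            str(clean["user_id"]).encode()).hexdigest()
    return clean

event = {
    "file_path": "src/app.ts",
    "lines_added": 42,
    "timestamp": "2024-05-01T12:00:00Z",
    "task_id": "task-123",
    "user_id": "alice",
    "code_snippet": "const secret = '...';",  # must never be sent
}
print(sanitize_event(event))  # code_snippet is stripped, user_id is hashed
```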
Analyzing Results
Retention Metrics
High Retention (Good)
- AI-generated code remains unchanged for extended periods
- Indicates high-quality, production-ready output
- Less rework needed
Low Retention (Needs Attention)
- Frequent modifications to AI-generated code
- May indicate prompt engineering improvements needed
- Could suggest expert guidelines need refinement
Example Insights
Strong Performance: high retention on AI-generated components signals that prompts and experts are working well.
Dashboard Access
Access your COR-Matrix analytics:
- Navigate to your COR-Matrix instance
- Log in with your credentials
- Select your workspace
- View dashboards:
- Retention Overview: High-level metrics
- File Analysis: Per-file retention rates
- Team Performance: User-level insights
- Model Comparison: AI model effectiveness
Key Metrics
- Overall Retention Rate: % of AI code unchanged after N days
- Average Modification Time: How quickly AI code is edited
- Top Retained Files: Most stable AI contributions
- Refactor Hotspots: Frequently modified files
- AI vs Human Code Ratio: Codebase composition
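As a concrete reading of the first metric, the Overall Retention Rate can be computed from generation and modification timestamps. The record shape below is hypothetical, not the service's actual export format:

```python
from datetime import datetime, timedelta

def overall_retention_rate(events, n_days=30, now=None):
    """Fraction of AI-generated lines still unchanged n_days after
    generation. Each event has a generation timestamp, a line count,
    and optionally the timestamp of the first modification."""
    now = now or datetime.now()
    retained = total = 0
    for e in events:
        cutoff = e["generated_at"] + timedelta(days=n_days)
        if cutoff > now:          # too recent to evaluate yet
            continue
        total += e["lines"]
        modified_at = e.get("modified_at")
        if modified_at is None or modified_at > cutoff:
            retained += e["lines"]
    return retained / total if total else 0.0

now = datetime(2024, 6, 1)
events = [
    {"generated_at": datetime(2024, 4, 1), "lines": 100},   # untouched
    {"generated_at": datetime(2024, 4, 1), "lines": 50,
     "modified_at": datetime(2024, 4, 10)},                 # edited early
    {"generated_at": datetime(2024, 5, 20), "lines": 200},  # too recent
]
print(round(overall_retention_rate(events, 30, now), 3))  # → 0.667
```

Recent generations are excluded so young code is not counted as "retained" before the N-day window has elapsed.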
Use Cases
1. Quality Assurance
Identify which types of code generation work well.
2. Expert Refinement
Use retention data to improve custom experts.
3. Team Adoption
Track how teams use AI assistance.
4. Model Comparison
Compare the effectiveness of different AI models.
5. Continuous Improvement
Follow an iterative enhancement workflow: generate, measure retention, refine, repeat.
Integration with CI/CD
Pre-Commit Hooks
Track retention metrics before commits land.
GitHub Actions
Automate tracking in CI pipelines.
Advanced Configuration
Custom Event Types
Extend tracking with custom events.
Filtering
Exclude files from tracking.
Batching
Batch events to reduce network overhead.
Troubleshooting
Events Not Appearing
- Check Configuration: confirm .hai.config exists in the workspace root and contains all required cormatrix.* parameters
- Verify Token: make sure the token is valid and has not been revoked
- Check Logs: review HAI Build logs for failed event submissions
Authentication Errors
Regenerate the token in the COR-Matrix dashboard and update .hai.config.
Network Issues
Confirm that baseURL is correct and the service is accessible from your network.
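The configuration check can be scripted. This sketch assumes a JSON-formatted .hai.config holding the three required parameters from the table above; the real file format may differ:

```python
import json
from pathlib import Path

REQUIRED_KEYS = ["cormatrix.baseURL", "cormatrix.token", "cormatrix.workspaceId"]

def check_config(path=".hai.config"):
    """Return a list of problems found; empty means the config looks OK."""
    p = Path(path)
    if not p.exists():
        return [f"{path} not found in workspace root"]
    try:
        cfg = json.loads(p.read_text())
    except json.JSONDecodeError as e:
        return [f"{path} is not valid JSON: {e}"]
    section = cfg.get("cormatrix", {})
    problems = []
    for key in REQUIRED_KEYS:
        short = key.split(".", 1)[1]
        if not section.get(short):
            problems.append(f"missing or empty: {key}")
    return problems

# Example: write a config that is missing the token, then check it.
Path("/tmp/.hai.config").write_text(json.dumps(
    {"cormatrix": {"baseURL": "https://cormatrix.example.com",
                   "workspaceId": "ws-123"}}))
print(check_config("/tmp/.hai.config"))  # reports the missing token
```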
Best Practices
1. Regular Review
Schedule weekly/monthly reviews of retention metrics:
- Identify trends
- Spot areas for improvement
- Celebrate wins
2. Team Training
Use retention data to train the team:
- Share high-retention examples
- Discuss low-retention patterns
- Refine prompting strategies
3. Iterative Refinement
Continuously improve:
- Generate code
- Measure retention
- Identify gaps
- Update experts/prompts
- Repeat
4. Privacy First
Never send:
- Proprietary code
- Customer data
- Credentials
- PII
5. Version Control
Track config changes in version control (excluding secrets).
FAQ
What data is sent to COR-Matrix?
Only metadata: file paths, line counts, timestamps, task IDs. No actual code content is transmitted.
Can I self-host COR-Matrix?
Yes, COR-Matrix can be deployed on-premises. Contact your organization’s platform team for setup.
How long is data retained?
Configurable per instance. The default is 1 year of retention history.
Does this slow down HAI Build?
No. Events are sent asynchronously and don’t block code generation.
Can I disable COR-Matrix?
Yes. Simply remove or comment out the cormatrix.* configuration in .hai.config.
Next Steps
Custom Experts
Use retention data to refine your experts
CLI Usage
Automate tracking with CLI workflows