Overview
The /arckit.evaluate command creates objective vendor evaluation frameworks and scores vendor proposals against requirements.
Command Usage
Examples
Three Evaluation Modes
- Create Framework
- Score Vendor
- Compare Vendors
Generate evaluation criteria before receiving proposals.
When: After SOW/DOS published, before proposal deadline.
Output: ARC-{PROJECT_ID}-EVAL-v1.0.md
Prerequisites
Mandatory:
- Requirements (ARC-{PROJECT_ID}-REQ-v1.0.md) - Evaluate vendors against requirements
- Architecture Principles (ARC-000-PRIN-v1.0.md) - Technology standards, compliance
Optional:
- Statement of Work (ARC-{PROJECT_ID}-SOW-v1.0.md) - Pre-defined evaluation criteria
- DOS Requirements (ARC-{PROJECT_ID}-DOS-v1.0.md) - DOS evaluation approach
- Technology Research (ARC-{PROJECT_ID}-RSCH-v1.0.md) - Market landscape, vendor options
- G-Cloud Search (ARC-{PROJECT_ID}-GCLD-v1.0.md) - Shortlisted services
- Vendor Proposals in projects/{project}/vendors/{vendor}/ - For scoring mode
- Stakeholder Analysis (ARC-{PROJECT_ID}-STKE-v1.0.md) - Evaluation panel composition
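A project prepared for this command might be laid out as follows (illustrative sketch; only the document file names and the `vendors/{vendor}/` path above are prescribed — the proposal file names shown are assumptions):

```
projects/{project}/
├── ARC-{PROJECT_ID}-REQ-v1.0.md        # mandatory: requirements
├── ARC-{PROJECT_ID}-SOW-v1.0.md        # optional: statement of work
├── ARC-{PROJECT_ID}-EVAL-v1.0.md       # output of this command
└── vendors/
    ├── vendor-a/
    │   └── proposal.pdf                # read in scoring mode, if present
    └── vendor-b/
        └── proposal.pdf
```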
Mode 1: Create Evaluation Framework
Mandatory Qualifications (Pass/Fail)
Vendors must meet ALL mandatory qualifications or be disqualified:
Certifications:
- Industry-specific (PCI-DSS for payments, ISO 27001 for security)
- Cloud provider certifications (AWS/Azure/GCP partner status)
- Security clearances (UK public sector: SC, DV)
Experience:
- Minimum years in relevant domain (e.g., 5+ years financial services)
- Similar project references (minimum 2-3)
- Technology stack expertise
Financial Stability:
- Minimum company age
- Professional indemnity insurance
- Financial statements (if required)
References:
- Minimum 2-3 client references from similar projects
- Contactable references
- Recent projects (within 2 years)
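The pass/fail gate can be sketched as a simple all-or-nothing check. This is illustrative only: the qualification keys and the evidence dictionary shape are assumptions for the sketch, not ArcKit's actual data model.

```python
# Illustrative mandatory-qualification gate: missing ANY item = disqualified.
# Qualification keys below are placeholders, not ArcKit's real schema.

MANDATORY_QUALIFICATIONS = [
    "iso_27001",            # industry-specific certification
    "min_2_references",     # 2-3 similar project references
    "indemnity_insurance",  # professional indemnity insurance
]

def passes_mandatory_gate(evidence: dict[str, bool]) -> tuple[bool, list[str]]:
    """Return (passed, missing): passed is True only if every item has evidence."""
    missing = [q for q in MANDATORY_QUALIFICATIONS if not evidence.get(q, False)]
    return (not missing, missing)

passed, missing = passes_mandatory_gate(
    {"iso_27001": True, "min_2_references": True, "indemnity_insurance": False}
)
# Disqualified: no exceptions, even with only one item (indemnity_insurance) missing.
```

The gate runs before any detailed scoring, so evaluators never see weighted scores for a disqualified vendor.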
Scoring Criteria (100 Points Total)
Standard weightings (customizable):
Example Framework Output
Mode 2: Score Individual Vendor
Scoring Process
- Vendor directory created: projects/{project}/vendors/{vendor-name}/
- Proposal documents read (if available in directory)
- Interactive scoring:
  - Ask for proposal highlights
  - Ask for concerns/gaps
- Score each category against framework
- Detailed justification for each score
- Requirement traceability (link to BR/FR/NFR IDs)
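Scoring each category against the 100-point framework reduces to a weighted sum. A minimal sketch, assuming placeholder category names and weights (the standard weightings are customizable and not reproduced here), with per-category scores on a 0-10 scale:

```python
# Illustrative weighted scoring against a 100-point framework.
# Category names and weights are placeholder assumptions; adjust to the
# framework generated in ARC-{PROJECT_ID}-EVAL-v1.0.md.

WEIGHTS = {                 # weights must sum to 100
    "technical_fit": 40,
    "delivery_approach": 25,
    "commercial": 25,
    "social_value": 10,     # UK public sector: 10% social value weighting
}

def weighted_total(scores_out_of_10: dict[str, float]) -> float:
    """Convert per-category 0-10 scores into a 0-100 weighted total."""
    assert sum(WEIGHTS.values()) == 100, "weightings must total 100 points"
    return sum(WEIGHTS[c] * scores_out_of_10[c] / 10 for c in WEIGHTS)

total = weighted_total(
    {"technical_fit": 8, "delivery_approach": 7, "commercial": 6, "social_value": 9}
)
# 40*0.8 + 25*0.7 + 25*0.6 + 10*0.9 = 32 + 17.5 + 15 + 9 = 73.5
```

Each raw category score would still carry its written justification and BR/FR/NFR references; the arithmetic above only aggregates them.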
Example Scoring Output
Mode 3: Compare Vendors
Comparison Matrix
Side-by-side comparison table:
Evaluation Best Practices
Objectivity
- ✅ Documented criteria before receiving proposals
- ✅ Specific justification for each score (no arbitrary numbers)
- ✅ Requirement traceability (link to BR/FR/NFR IDs)
- ✅ Multiple evaluators (reduce bias)
- ✅ Audit trail (meeting minutes, scoring sheets)
Mandatory Qualifications
- ✅ Pass/fail (missing any = disqualified)
- ✅ Checked first (before detailed scoring)
- ✅ Evidence required (certificates, references)
- ✅ No exceptions (maintains fairness)
Scoring Transparency
- ✅ Justification required for each score
- ✅ Reference requirement IDs (BR-001, NFR-P-001)
- ✅ Strengths and weaknesses documented
- ✅ Risk assessment per vendor
- ✅ Interview notes if conducted
UK Public Sector Specific
Social Value (10% weighting):
- Apprenticeships and skills development
- SME subcontracting commitments
- Environmental sustainability
- Local economic impact
Evaluation Panel:
- Technical authority (RACI: Accountable)
- Commercial lead (RACI: Accountable)
- User representative (RACI: Consulted)
- Finance (RACI: Informed)
Governance:
- Evaluation report for approving authority
- Decision rationale documented
- Conflicts of interest declared
- Award notice published (Contracts Finder)
Related Commands
- /arckit.sow - Generate Statement of Work with evaluation criteria
- /arckit.dos - DOS procurement with evaluation framework
- /arckit.gcloud-search - G-Cloud service search
- /arckit.hld-review - Review vendor HLD
- /arckit.dld-review - Review vendor DLD
- /arckit.traceability - Validate requirements coverage