Service Assessment

Prepare for GDS Service Standard assessments by analyzing evidence against the 14-point standard, identifying gaps, and generating a comprehensive readiness report.

Command

arckit service-assessment "<PROJECT_ID> <PHASE>" [DATE=YYYY-MM-DD]

Arguments

  • PHASE (required): alpha, beta, or live - The assessment phase to prepare for
  • DATE (optional): YYYY-MM-DD - Planned assessment date for timeline calculations

Examples

arckit service-assessment "001 Alpha"
arckit service-assessment "001 Beta" DATE=2025-12-15

Purpose

Generate a comprehensive GDS Service Standard assessment preparation report that:
  1. Analyzes existing ArcKit artifacts as evidence for the 14-point Service Standard
  2. Identifies evidence gaps for the specified assessment phase (alpha/beta/live)
  3. Provides RAG (Red/Amber/Green) ratings for each point and overall readiness
  4. Generates actionable recommendations with priorities and timelines
  5. Includes assessment day preparation guidance

The 14-Point Service Standard

Section 1: Meeting Users’ Needs

  1. Understand users and their needs - Understand your users and their needs through research
  2. Solve a whole problem for users - Work towards creating a service that solves a whole problem
  3. Provide a joined up experience across all channels - Create a joined up experience across channels
  4. Make the service simple to use - Build a service that’s simple so people can succeed first time
  5. Make sure everyone can use the service - Ensure the service is accessible to everyone, including disabled people

Section 2: Providing a Good Service

  6. Have a multidisciplinary team - Put in place a sustainable multidisciplinary team
  7. Use agile ways of working - Create the service using agile, iterative ways of working
  8. Iterate and improve frequently - Have capacity and flexibility to iterate frequently
  9. Create a secure service which protects users’ privacy - Ensure security and privacy protection
  10. Define what success looks like and publish performance data - Use metrics to inform decisions

Section 3: Using the Right Technology

  11. Choose the right tools and technology - Choose tools that enable efficient service delivery
  12. Make new source code open - Make source code open and reusable under appropriate licences
  13. Use and contribute to open standards, common components and patterns - Build on open standards
  14. Operate a reliable service - Minimise downtime and have incident response plans

Output

Generates ARC-{PROJECT_ID}-SVCASS-v{VERSION}.md with:
  • Executive summary with overall readiness score and critical gaps
  • Detailed assessment for all 14 Service Standard points
  • Evidence inventory mapping artifacts to standard points
  • Assessment preparation checklist with priorities
  • Assessment day preparation guidance
  • Documentation to share with the assessment panel
  • Recommended attendees and show-and-tell structure
  • Tips for success
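
The filename pattern above can be sketched as a small helper (a hypothetical illustration; the tool's internal naming logic may differ):

```python
def output_filename(project_id: str, version: str) -> str:
    """Build the report filename following ARC-{PROJECT_ID}-SVCASS-v{VERSION}.md."""
    return f"ARC-{project_id}-SVCASS-v{version}.md"

print(output_filename("001", "1.0"))  # ARC-001-SVCASS-v1.0.md
```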

Assessment Process

The command:
  1. Reads existing ArcKit artifacts (STKE, REQ, RISK, PLAN, DIAG, etc.)
  2. Maps evidence to each of the 14 Service Standard points
  3. Applies phase-appropriate criteria (alpha/beta/live)
  4. Assigns RAG ratings based on evidence found
  5. Identifies gaps and generates specific recommendations
  6. Provides assessment day guidance
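
Step 2 (mapping evidence to Service Standard points) might look like the following sketch. The artifact-to-point table here is an illustrative assumption, not the tool's actual mapping:

```python
# Illustrative mapping from Service Standard point number to the ArcKit
# artifact prefixes that can evidence it (assumed for this sketch).
EVIDENCE_MAP = {
    1: ["STKE", "REQ"],            # Understand users and their needs
    9: ["RISK", "SECD", "DPIA"],   # Secure service, privacy protection
    14: ["DIAG", "PLAN"],          # Operate a reliable service
}

def map_evidence(found_artifacts):
    """For each point, list which of its expected artifacts were found."""
    found = set(found_artifacts)
    return {point: sorted(found & set(expected))
            for point, expected in EVIDENCE_MAP.items()}

print(map_evidence(["STKE", "RISK", "PLAN"]))
# {1: ['STKE'], 9: ['RISK'], 14: ['PLAN']}
```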

RAG Ratings

  • 🟢 Green (Ready): All critical evidence found, team ready to present
  • 🟡 Amber (Partial): Some evidence found but gaps remain
  • 🔴 Red (Not Ready): Critical evidence missing, significant work required

Overall Readiness:
  • 🟢 Green (Ready): 12+ points Green, max 2 Amber, 0 Red
  • 🟡 Amber (Nearly Ready): 10+ points Green/Amber, max 2 Red
  • 🔴 Red (Not Ready): More than 2 Red points or fewer than 10 Green/Amber
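
The overall-readiness roll-up rules above can be expressed as a short function (a sketch; the rating names are illustrative):

```python
from collections import Counter

def overall_readiness(ratings):
    """Roll up the 14 per-point RAG ratings into an overall rating.

    `ratings` is a list of 14 strings: "green", "amber", or "red".
    Thresholds follow the documented rules.
    """
    counts = Counter(ratings)
    green, amber, red = counts["green"], counts["amber"], counts["red"]
    if green >= 12 and amber <= 2 and red == 0:
        return "green"   # Ready
    if green + amber >= 10 and red <= 2:
        return "amber"   # Nearly Ready
    return "red"         # Not Ready

print(overall_readiness(["green"] * 12 + ["amber"] * 2))  # green
```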

Phase-Appropriate Criteria

Alpha Assessment

  • Lower bar for operational evidence (monitoring, performance data)
  • Higher bar for user research and prototyping
  • Critical: User testing, team composition, technology viability

Beta Assessment

  • Higher bar for everything
  • Critical: Working service, security testing, accessibility compliance, performance monitoring
  • All 14 points must be addressed substantively

Live Assessment

  • Highest bar, operational excellence expected
  • Critical: Published performance data, user satisfaction, continuous improvement
  • Operational maturity demonstrated
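
The phase-appropriate criteria above could be encoded as a lookup table like this (keys and entries are assumptions for illustration, not the tool's actual configuration):

```python
# Hypothetical encoding of the phase-appropriate criteria described above.
PHASE_CRITERIA = {
    "alpha": {
        "critical": ["user testing", "team composition", "technology viability"],
        "operational_evidence_required": False,  # lower bar for monitoring/perf data
    },
    "beta": {
        "critical": ["working service", "security testing",
                     "accessibility compliance", "performance monitoring"],
        "operational_evidence_required": True,
    },
    "live": {
        "critical": ["published performance data", "user satisfaction",
                     "continuous improvement"],
        "operational_evidence_required": True,
    },
}

def critical_criteria(phase):
    """Return the critical evidence areas for the given assessment phase."""
    return PHASE_CRITERIA[phase]["critical"]
```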

Prerequisites

MANDATORY (warn if missing):
  • PRIN (Architecture Principles) - Technology standards, compliance requirements
  • REQ (Requirements) - User stories, NFRs, accessibility requirements
RECOMMENDED (read if available):
  • STKE (Stakeholder Analysis) - User needs, personas
  • RISK (Risk Register) - Security risks, mitigation strategies
  • PLAN (Project Plan) - Phases, timeline, team structure
  • SOBC (Business Case) - Benefits, success metrics
  • DATA (Data Model) - GDPR compliance, data governance
  • DIAG (Architecture Diagrams) - C4, deployment
  • SECD (Secure by Design) - Security assessment
  • DPIA (Data Protection Impact Assessment) - Privacy protection evidence
RELATED COMMANDS:
  • arckit tcop - Technology Code of Practice assessment (overlaps with points 11 and 13)
  • arckit analyze - Comprehensive governance quality analysis
  • arckit traceability - Requirements traceability matrix
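
The mandatory/recommended split above could be enforced with a small pre-flight check. This is a sketch: the ARC-{ID}-{PREFIX}-v{N}.md naming convention assumed here is illustrative.

```python
from pathlib import Path

MANDATORY = ["PRIN", "REQ"]    # warn if missing
RECOMMENDED = ["STKE", "RISK", "PLAN", "SOBC", "DATA", "DIAG", "SECD", "DPIA"]

def check_prerequisites(artifact_dir):
    """Return (missing mandatory prefixes, recommended prefixes found).

    Assumes artifacts are named with their prefix embedded,
    e.g. ARC-001-PRIN-v1.0.md (an assumption for this sketch).
    """
    names = [p.name for p in Path(artifact_dir).glob("*.md")]
    missing = [a for a in MANDATORY if not any(f"-{a}-" in n for n in names)]
    present = [a for a in RECOMMENDED if any(f"-{a}-" in n for n in names)]
    return missing, present
```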
