Overview

The arckit risk command creates a comprehensive risk register following the UK Government Orange Book (2023) risk management framework, using its methodology to identify, assess, and plan responses to project risks.

Command Syntax

arckit risk "<project ID or category>"

Arguments

  • category (string, required): Project ID or risk category, e.g. 001, procurement risks, security assessment

What It Creates

Generates a comprehensive Orange Book-compliant risk register:
  • File: projects/{NNN}-{project-name}/ARC-{PROJECT_ID}-RISK-v1.0.md
  • Document ID: ARC-{PROJECT_ID}-RISK-v1.0
  • Content: Risk register with inherent/residual assessments, 4Ts responses, and action plan
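The file naming convention above can be sketched as follows. This is illustrative only; build_register_path is a hypothetical helper, not part of arckit, and the exact slug rules are an assumption.

```python
def build_register_path(project_id: str, project_name: str, version: str = "1.0") -> str:
    """Sketch of the output path convention; the slugification rule
    (lowercase, spaces to hyphens) is an assumption."""
    slug = project_name.lower().replace(" ", "-")
    return f"projects/{project_id}-{slug}/ARC-{project_id}-RISK-v{version}.md"

# build_register_path("001", "Payments Platform")
# returns "projects/001-payments-platform/ARC-001-RISK-v1.0.md"
```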

Prerequisites

MANDATORY: Must run arckit stakeholders first. Every risk MUST have an owner from the stakeholder RACI matrix.
RECOMMENDED: Also run arckit principles to extract technology standards and compliance requirements that create risks.

Orange Book Framework

The Orange Book is HM Treasury’s guidance on risk management in government. The 2023 update provides:
  • Part I: 5 Risk Management Principles (Governance, Integration, Collaboration, Risk Processes, Continual Improvement)
  • Part II: Risk Control Framework (4-pillar “house” structure)
  • 4Ts Risk Response Framework: Tolerate, Treat, Transfer, Terminate
  • Risk Assessment Methodology: Likelihood × Impact for Inherent and Residual risk
  • Risk Appetite: The amount of risk an organization is prepared to accept or tolerate

Risk Register Structure

Risk Categories

  • STRATEGIC Risks: Risks to strategic objectives, competitive position, policy changes, stakeholder drivers under threat
  • OPERATIONAL Risks: Risks to operations, service delivery, resource availability, skills gaps, dependencies
  • FINANCIAL Risks: Budget overruns, funding shortfalls, ROI not achieved, cost escalation
  • COMPLIANCE/REGULATORY Risks: Non-compliance with laws, regulations, policies, audit findings, regulatory penalties
  • REPUTATIONAL Risks: Damage to reputation, stakeholder confidence, public perception, media scrutiny
  • TECHNOLOGY Risks: Technical failure, cyber security, legacy system issues, vendor lock-in, technology obsolescence

Risk Assessment Scales

Inherent Likelihood (BEFORE controls, 1-5 scale):
  • 1 - Rare: < 5% probability, highly unlikely
  • 2 - Unlikely: 5-25% probability, could happen but probably won’t
  • 3 - Possible: 25-50% probability, reasonable chance
  • 4 - Likely: 50-75% probability, more likely to happen than not
  • 5 - Almost Certain: > 75% probability, expected to occur
Inherent Impact (BEFORE controls, 1-5 scale):
  • 1 - Negligible: Minimal impact, easily absorbed, < 5% variance
  • 2 - Minor: Minor impact, manageable within reserves, 5-10% variance
  • 3 - Moderate: Significant impact, requires management effort, 10-20% variance
  • 4 - Major: Severe impact, threatens objectives, 20-40% variance
  • 5 - Catastrophic: Existential threat, project failure, > 40% variance
Risk Score = Likelihood × Impact (1-25):
  • 1-5: Low (Green)
  • 6-12: Medium (Yellow)
  • 13-19: High (Orange)
  • 20-25: Critical (Red)
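The scoring rule above (Likelihood × Impact, then a RAG band) can be sketched in a few lines. The function names are illustrative, not arckit's API:

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Orange Book score: Likelihood (1-5) x Impact (1-5) = 1-25."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("likelihood and impact must each be 1-5")
    return likelihood * impact

def risk_band(score: int) -> str:
    """Map a 1-25 score onto the RAG bands listed above."""
    if score <= 5:
        return "Low (Green)"
    if score <= 12:
        return "Medium (Yellow)"
    if score <= 19:
        return "High (Orange)"
    return "Critical (Red)"

# risk_band(risk_score(4, 5))  # returns "Critical (Red)"
```

The same functions apply to both inherent scores (before controls) and residual scores (after controls).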

4Ts Risk Response Framework

Select ONE primary response for each risk:
  • Tolerate. When to use: Low residual risk score (1-5), within appetite, or cost of mitigation exceeds benefit. Example: “Minor UI inconsistency - aesthetic only, no functional impact”
  • Treat. When to use: Medium/High risk that can be reduced through actions. Example: “Implement automated testing to reduce defect risk”
  • Transfer. When to use: Low likelihood/high impact risk that can be insured or contracted out. Example: “Purchase cyber insurance for breach liability”
  • Terminate. When to use: High likelihood/high impact risk that exceeds appetite and cannot be mitigated. Example: “Cancel high-risk vendor contract, source alternative”
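The selection guidance above can be condensed into a simplified decision sketch. In practice the choice needs judgment, not just a score; this is only an illustration of the ordering, and the function and flag names are assumptions:

```python
def primary_response(residual_score: int, within_appetite: bool,
                     can_reduce: bool, can_transfer: bool) -> str:
    """Simplified sketch of the 4Ts guidance above: Tolerate low,
    in-appetite risks; otherwise Treat if reducible, Transfer if
    insurable/contractable, else Terminate."""
    if residual_score <= 5 and within_appetite:
        return "Tolerate"
    if can_reduce:
        return "Treat"
    if can_transfer:
        return "Transfer"
    return "Terminate"
```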

Real-World Examples

Example 1: Technology Modernization

Common risk pattern:
  • TECHNOLOGY: Legacy system failure during migration (High)
  • OPERATIONAL: Skills gap in new technology (Medium)
  • FINANCIAL: Cloud costs exceed estimates (Medium)
  • REPUTATIONAL: Service outage during cutover (High)

Example 2: New Digital Service

Common risk pattern:
  • STRATEGIC: User adoption below target (High)
  • TECHNOLOGY: Scalability limitations at peak (High)
  • COMPLIANCE: GDPR/Accessibility non-compliance (Critical)
  • OPERATIONAL: Support team not ready for go-live (Medium)

Example 3: Vendor Procurement

Common risk pattern:
  • FINANCIAL: Vendor pricing increases post-contract (Medium)
  • OPERATIONAL: Vendor delivery delays (Medium)
  • TECHNOLOGY: Vendor lock-in limits future options (High)
  • REPUTATIONAL: Vendor security breach affects reputation (High)

UK Government-Specific Risks

For UK Government/public sector projects, include:

STRATEGIC

  • Policy/ministerial direction change mid-project
  • Manifesto commitment not delivered
  • Machinery of government changes

COMPLIANCE/REGULATORY

  • Spending controls (HMT approval delays)
  • NAO audit findings
  • PAC scrutiny and recommendations
  • FOI requests reveal sensitive information
  • Judicial review of procurement

REPUTATIONAL

  • Parliamentary questions and media scrutiny
  • Citizen complaints and service failures
  • Social media backlash
  • Select Committee inquiry

OPERATIONAL

  • GDS Service Assessment failure
  • CDDO digital spend control rejection
  • Civil service headcount restrictions
  • Security clearance delays

Risk Register Output

The generated risk register includes:

A. Executive Summary

  • Total risks identified by category
  • Risk profile distribution (Critical/High/Medium/Low)
  • Risks exceeding organizational appetite
  • Overall risk profile assessment
  • Key risks requiring immediate attention

B. Risk Matrix Visualization

  • Inherent Risk Matrix (before controls): 5×5 ASCII matrix showing Likelihood × Impact
  • Residual Risk Matrix (after controls): Shows risk movement after mitigations
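A 5×5 ASCII matrix of this kind might be rendered as below. This is a minimal sketch, not arckit's actual renderer; it shows a count of risks in each Likelihood × Impact cell:

```python
def ascii_matrix(risks):
    """Render a 5x5 Likelihood x Impact grid; each cell holds the
    count of risks at that (likelihood, impact) position.
    risks: iterable of (likelihood, impact) pairs, each 1-5."""
    grid = [[0] * 5 for _ in range(5)]
    for likelihood, impact in risks:
        grid[likelihood - 1][impact - 1] += 1
    lines = []
    for l in range(5, 0, -1):  # likelihood 5 at the top row
        row = " ".join(f"{grid[l - 1][i]:>2}" for i in range(5))
        lines.append(f"L{l} | {row}")
    lines.append("   +" + "-" * 15)
    lines.append("     I1 I2 I3 I4 I5")
    return "\n".join(lines)
```

Rendering the same risks twice, once with inherent and once with residual positions, shows the movement produced by controls.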

C. Top 10 Risks

Ranked table showing: ID, Title, Category, Residual Score, Owner, Status, Response

D. Risk Register (Detailed Table)

Full table with:
  • Risk ID, Category, Title, Description
  • Inherent Likelihood/Impact/Score
  • Current Controls and Effectiveness
  • Residual Likelihood/Impact/Score
  • 4Ts Response
  • Owner, Status, Actions, Target Date
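The columns above map naturally onto a record type. A sketch, with field names that are illustrative rather than arckit's actual schema:

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """One row of the detailed register; names are illustrative."""
    risk_id: str               # e.g. "R-001"
    category: str              # STRATEGIC, OPERATIONAL, ...
    title: str
    description: str
    inherent_likelihood: int   # 1-5, before controls
    inherent_impact: int       # 1-5, before controls
    current_controls: str
    control_effectiveness: str
    residual_likelihood: int   # 1-5, after controls
    residual_impact: int       # 1-5, after controls
    response: str              # one of the 4Ts
    owner: str                 # Accountable role from the RACI matrix
    status: str
    actions: str
    target_date: str

    @property
    def residual_score(self) -> int:
        return self.residual_likelihood * self.residual_impact
```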

E. Risk by Category Analysis

For each category: number of risks, average scores, control effectiveness, key themes

F. Risk Ownership Matrix

Which stakeholder owns which risks (from RACI matrix)

G. 4Ts Response Summary

Distribution: Tolerate X%, Treat Y%, Transfer Z%, Terminate W%
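Such a distribution is a straightforward count-and-percentage over the register's response column; a sketch (the function name is an assumption):

```python
from collections import Counter

def response_distribution(responses):
    """Percentage of risks per 4Ts response, rounded to whole
    percent, for a summary like section G above."""
    counts = Counter(responses)
    total = len(responses)
    return {t: round(100 * counts[t] / total)
            for t in ("Tolerate", "Treat", "Transfer", "Terminate")}
```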

H. Risk Appetite Compliance

(If organizational appetite exists): Risks within/exceeding appetite by category

I. Action Plan

Prioritized list of risk mitigation actions with owners and due dates

J. Monitoring and Review Framework

  • Review frequency (Monthly for Critical/High, Quarterly for Medium/Low)
  • Escalation criteria
  • Reporting requirements
  • Next review date
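The review cadence above reduces to a simple mapping from severity band to frequency, sketched here for illustration:

```python
def review_frequency(band: str) -> str:
    """Cadence per the framework above: monthly reviews for
    Critical/High risks, quarterly for Medium/Low."""
    return "Monthly" if band in ("Critical", "High") else "Quarterly"
```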

K. Integration with SOBC

Notes which sections of the SOBC use this risk register (Strategic Case, Economic Case, Management Case Part E)

Command Handoffs

Feed Risk Register into SOBC

arckit sobc
Feed the risk register into the Strategic Outline Business Case's Management Case (Part E).

Create Risk-Driven Requirements

arckit requirements
Create requirements that mitigate identified risks.

Validate Security Controls

arckit secure
Validate security controls against identified risks.

Orange Book Compliance Checklist

Ensure the risk register demonstrates Orange Book compliance:
  • Governance and Leadership: Risk owners assigned from senior stakeholders
  • Integration: Risks linked to objectives, stakeholders, and business case
  • Collaboration: Risks sourced from stakeholder concerns and expert judgment
  • Risk Processes: Systematic identification, assessment, response, monitoring
  • Continual Improvement: Review framework and action plan for ongoing management

Important Notes

If stakeholder analysis doesn’t exist, the command will STOP and warn: “Risk register requires stakeholder analysis to identify risk owners and affected parties. Please run arckit stakeholders first.”

Key Concepts

  • Every risk MUST have an owner: From stakeholder RACI matrix (Accountable role = Risk Owner)
  • Inherent vs Residual: Inherent is BEFORE controls, Residual is AFTER controls
  • Risk Appetite: The amount of risk the organization is prepared to accept or tolerate
  • 4Ts Framework: Every risk needs ONE primary response (Tolerate/Treat/Transfer/Terminate)
  • Traceability: Every risk should link back to stakeholder concerns or strategic objectives
