
Overview

The dpia command generates Data Protection Impact Assessments (DPIAs) following UK GDPR Article 35 requirements and ICO guidance. A DPIA is a legal requirement for processing that is likely to result in a high risk to individuals’ rights and freedoms. It systematically assesses privacy risks, evaluates necessity and proportionality, and identifies mitigations.

Command Syntax

arckit dpia <project-id-or-activity>
Example:
arckit dpia 001
arckit dpia "biometric tracking"
arckit dpia "NHS appointment booking system"

When is a DPIA Required?

UK GDPR Article 35 - Legal Requirement
A DPIA is mandatory when processing is likely to result in high risk to individuals, including:
  • Systematic monitoring (e.g., CCTV, location tracking, behavioral profiling)
  • Large-scale processing of special category data (health, biometric, ethnic, political)
  • Automated decision-making with legal or significant effects (credit scoring, recruitment)
  • Processing children’s data at scale
  • Innovative technology use (AI, biometrics, blockchain)
  • Matching/combining datasets from different sources
  • Processing that prevents individuals exercising their rights
Failure to conduct a DPIA when required can result in ICO enforcement action (fines up to £17.5 million or 4% of global turnover).

ICO 9-Criteria Screening

The command automatically assesses your data model against the ICO's nine screening criteria:
| # | Criterion | Description | Example |
|---|-----------|-------------|---------|
| 1 | Evaluation or scoring | Profiling, credit scoring, risk assessment | AI-powered credit scoring, employee performance rating |
| 2 | Automated decision-making | Decisions with legal/significant effect | Automated loan rejection, algorithmic recruitment |
| 3 | Systematic monitoring | Continuous tracking, surveillance | CCTV, location tracking, web analytics |
| 4 | Sensitive data | Special category data (Article 9) | Health records, biometric data, ethnicity |
| 5 | Large scale | >5000 data subjects or national scope | NHS patient database, national census |
| 6 | Matching datasets | Combining data from multiple sources | Linking NHS + social care + police records |
| 7 | Vulnerable subjects | Children, elderly, disabled, patients | School pupil tracking, elderly care monitoring |
| 8 | Innovative technology | New/emerging tech | AI, blockchain, facial recognition |
| 9 | Prevents rights exercise | No mechanism for SAR/deletion/portability | Legacy system with no data export feature |
Decision Rules:
  • 2+ criteria met: DPIA REQUIRED (UK GDPR Article 35)
  • 1 criterion met: DPIA RECOMMENDED (good practice)
  • 0 criteria met: DPIA NOT REQUIRED (but consider Data Privacy Notice)
Example Screening Output:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
📋 DPIA Screening Results (ICO 9 Criteria)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

[X] Criterion 4: Sensitive data (Health, Ethnicity)
[X] Criterion 7: Vulnerable subjects (Children identified)
[ ] Criterion 1: Evaluation/scoring (Not detected)
[ ] Criterion 2: Automated decision-making (Not detected)
...

**Screening Score**: 2/9 criteria met
**Decision**: ✅ DPIA REQUIRED under UK GDPR Article 35

Proceeding to generate full DPIA...
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
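The decision rules above can be sketched in a few lines. This is an illustrative model, not ArcKit's actual implementation; the criterion names and the `screen()` helper are assumptions.

```python
# Hypothetical sketch of the ICO 9-criteria decision rules described above.
# Criterion names and the screen() helper are illustrative, not ArcKit's API.
CRITERIA = {
    "Evaluation or scoring", "Automated decision-making",
    "Systematic monitoring", "Sensitive data", "Large scale",
    "Matching datasets", "Vulnerable subjects",
    "Innovative technology", "Prevents rights exercise",
}

def screen(met: set[str]) -> str:
    """Apply the decision rules: 2+ met -> required, 1 -> recommended, 0 -> not required."""
    score = len(met & CRITERIA)
    if score >= 2:
        return "DPIA REQUIRED"
    if score == 1:
        return "DPIA RECOMMENDED"
    return "DPIA NOT REQUIRED"

print(screen({"Sensitive data", "Vulnerable subjects"}))  # DPIA REQUIRED
```

The example screening output above corresponds to the first case: two criteria met, so the DPIA is required under Article 35.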

Prerequisites

Mandatory

  • Data Model (DATA): Must identify all entities with PII/special category data
    • Command: arckit data-model <project>
    • The tool will STOP and warn if missing
    • Why: A DPIA requires a data model to identify personal data processing
  • Architecture Principles (PRIN): Privacy by Design principles, data minimization
  • Requirements (REQ): Data requirements (DR-xxx), security (NFR-SEC), compliance (NFR-C)
  • Stakeholder Analysis (STKE): Data subject categories, vulnerable groups, RACI (Data Controller, DPO)

Optional

  • Risk Register (RISK): Extract existing data protection risks
  • Secure by Design (SECD): Extract security controls as DPIA mitigations
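The "STOP and warn" behaviour for missing prerequisites could look something like the sketch below. The artifact codes are taken from the list above, but the file-naming pattern and directory layout are assumptions, not ArcKit's real layout.

```python
# Illustrative prerequisite check, mirroring the behaviour described above.
# The "*-CODE-*.md" naming pattern is an assumption, not ArcKit's real layout.
from pathlib import Path

MANDATORY = {
    "DATA": "data model",
    "PRIN": "architecture principles",
    "REQ": "requirements",
    "STKE": "stakeholder analysis",
}

def check_prerequisites(project_dir: Path) -> list[str]:
    """Return the mandatory artifacts missing from the project directory."""
    return [name for code, name in MANDATORY.items()
            if not list(project_dir.glob(f"*-{code}-*.md"))]

missing = check_prerequisites(Path("projects/001-nhs-appointment-booking"))
if missing:
    print("STOP: missing prerequisites:", ", ".join(missing))
```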

Workflow

1. Create Data Model First

arckit requirements "NHS appointment booking"
arckit data-model "NHS appointment booking"

2. Generate DPIA

arckit dpia "NHS appointment booking"
The command will:
  1. Screen: Run ICO 9-criteria screening
  2. Assess: Identify risks to individuals (confidentiality, integrity, availability)
  3. Mitigate: Propose technical, organizational, procedural controls
  4. Document: Generate full DPIA document

3. Review Output

The command creates:
  • File: projects/001-nhs-appointment-booking/ARC-001-DPIA-v1.0.md
  • Classification: OFFICIAL-SENSITIVE (contains privacy risk analysis)
  • Summary: Shows screening score, risks identified, ICO consultation requirement

4. Next Steps (Handoffs)

Risk Register

Add DPIA risks to project risk register

Secure by Design

Extract security controls as DPIA mitigations

AI Playbook

If AI/ML: Integrate algorithmic bias assessment

ICO Consultation

If residual high risks: Contact ICO before processing

DPIA Risk Assessment

Risk Categories

DPIA risks focus on the impact on individuals (not organizational risks):
Confidentiality Risks:
  • Data breach (unauthorized access, exfiltration)
  • Insider threat (employees accessing data inappropriately)
  • Third-party processor breach (cloud provider, SaaS vendor)
Integrity Risks:
  • Data corruption (inaccurate profiling, incorrect medical records)
  • Unauthorized modification (tampering with transaction history)
  • Data quality issues (incomplete records affecting decisions)
Availability Risks:
  • Inability to access data (SAR requests fail)
  • Inability to port data (no export mechanism)
  • Inability to delete data (no erasure mechanism)

Risk Scoring Matrix

Likelihood:
  • Remote: Unlikely to occur (e.g., nation-state attack)
  • Possible: Could occur under certain circumstances (e.g., phishing attack)
  • Probable: Likely to occur (e.g., insider threat without access controls)
Severity (Impact on Individuals):
  • Minimal: Minor inconvenience (e.g., spam emails)
  • Significant: Distress, financial loss (e.g., identity theft, discrimination)
  • Severe: Physical harm, serious psychological distress (e.g., medical misdiagnosis, stalking)
Overall Risk:
  • Low (🟢): Remote + Minimal, Possible + Minimal
  • Medium (🟠): Remote + Significant, Possible + Significant, Probable + Minimal
  • High (🔴): Remote + Severe, Possible + Severe, Probable + Significant/Severe
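The matrix above reduces to a small decision function. The sketch below is a minimal model of those combinations; the function name and string levels are illustrative, not part of ArcKit's output.

```python
# A minimal sketch of the likelihood/severity matrix above.
# Levels and return values are illustrative, not ArcKit's actual output.
def overall_risk(likelihood: str, severity: str) -> str:
    """Combine likelihood and severity per the matrix above."""
    if severity == "Severe":
        return "HIGH"  # any likelihood with Severe impact scores High
    if severity == "Significant":
        return "HIGH" if likelihood == "Probable" else "MEDIUM"
    return "MEDIUM" if likelihood == "Probable" else "LOW"  # Minimal severity

print(overall_risk("Possible", "Severe"))  # HIGH
```

Note the asymmetry: severity dominates. Even a Remote likelihood of Severe harm to individuals scores High, because the assessment is about harm to people, not probability-weighted organizational exposure.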
Example Risk:
### DPIA-003: Unauthorized Profiling of Children

**Risk Description:** AI algorithm profiles children's learning disabilities without parental consent.

**Data Subjects Affected:** School pupils (ages 5-16), vulnerable group

**Likelihood:** Possible (if AI is deployed without consent mechanism)

**Severity:** Severe (discrimination, psychological harm to children)

**Overall Risk:** 🔴 HIGH

**Mitigations:**
- Implement parental consent workflow (GDPR Article 8)
- Age verification mechanism (verify age <16)
- Human review of all AI-generated learning disability flags
- Child-friendly privacy notice (plain language, icons)
- Best interests assessment (document benefit to child)

**Residual Risk:** 🟠 MEDIUM (after mitigations)

**ICO Consultation:** Not required (residual risk reduced to MEDIUM)

ICO Prior Consultation

When is ICO Prior Consultation Required?
Under UK GDPR Article 36, you must consult the ICO before processing if:
  • A DPIA identifies a high residual risk even after mitigations
  • You cannot implement sufficient mitigations to reduce the risk
How to Consult:
  1. Contact ICO: https://ico.org.uk/make-a-complaint/your-personal-information-concerns/
  2. Provide: DPIA document, risk assessment, proposed mitigations
  3. Wait for ICO advice (statutory deadline: 8 weeks)
  4. Do NOT begin processing until ICO consultation completes
Failure to consult when required: ICO enforcement action, fines

GDPR Compliance Sections

Necessity and Proportionality

Necessity Test: For each processing purpose, justify why it’s necessary. Example:
Purpose: Process health data to recommend treatment options.
Necessity: Treatment recommendations require analyzing patient medical history (diagnoses, medications, allergies). This is necessary for safe, effective care.
Proportionality: We collect only medical history directly relevant to current symptoms. We do NOT collect unrelated data (e.g., employment history, political views).
Legal Basis (Article 6):
  • Consent: User explicitly consents (e.g., marketing emails)
  • Contract: Necessary for contract performance (e.g., payment processing)
  • Legal Obligation: Required by law (e.g., tax records retention)
  • Vital Interests: Necessary to protect life (e.g., emergency medical treatment)
  • Public Task: Necessary for public interest (e.g., NHS patient care)
  • Legitimate Interest: Necessary for legitimate interests (e.g., fraud detection)
Special Category Conditions (Article 9): If processing health, biometric, ethnic, political, religious, or genetic data, document:
  • Explicit Consent: User gives explicit consent (not just opt-in checkbox)
  • Employment: Necessary for employment rights/obligations
  • Vital Interests: Necessary to protect life (unconscious patient)
  • Medical Purposes: Necessary for healthcare (diagnosis, treatment)
  • Public Interest: Necessary for public health (epidemics, disease surveillance)

Data Subject Rights

Children’s Data (Article 8)

If processing children’s data (under 18 in the UK):
Enhanced Protections for Children
Children have reduced capacity to understand risks and consent. You must:
  • Age Verification: Verify age (self-declaration, ID check, age estimation)
  • Parental Consent: Obtain consent from parent/guardian if child <13
  • Best Interests Assessment: Document how processing serves child’s best interests
  • Child-Friendly Privacy Notice: Plain language, icons, short sentences
  • Minimize Data: Collect only what’s necessary (avoid profiling, targeting)
  • Security: Enhanced security controls (prevent grooming, bullying)
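The parental-consent requirement above can be gated with a simple age check. This is a hedged sketch: the function name is hypothetical, and the 13-year threshold reflects the UK's implementation of GDPR Article 8 for information society services (the Article 8 default is 16).

```python
# Sketch of a parental-consent gate for the requirements above.
# UK_CONSENT_AGE = 13 per the UK implementation of GDPR Article 8;
# consent_route() is a hypothetical helper, not a real ArcKit API.
from datetime import date

UK_CONSENT_AGE = 13

def consent_route(date_of_birth: date, today: date) -> str:
    """Decide whose consent to collect for an information society service."""
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day))
    return "child" if age >= UK_CONSENT_AGE else "parent_or_guardian"

print(consent_route(date(2015, 6, 1), date(2026, 3, 4)))  # parent_or_guardian
```

In practice the date of birth itself must come from an age-verification step (self-declaration, ID check, or age estimation, as listed above), which is the harder problem.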

International Transfers

If transferring data outside the UK:
Adequate Countries (no additional safeguards needed):
  • EU/EEA countries (adequacy decision)
  • Countries covered by UK adequacy regulations (e.g., Israel, New Zealand)
Non-Adequate Countries (require safeguards):
  • Standard Contractual Clauses (SCCs): EU Commission-approved contracts
  • Binding Corporate Rules (BCRs): Internal group data transfer rules
  • Certification Schemes: e.g., EU-US Data Privacy Framework (if applicable)
High-Risk Countries:
  • Avoid transfers to countries with mass surveillance laws (e.g., China, Russia) unless essential and with strong encryption
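The transfer rules above amount to a destination lookup. The sketch below is illustrative only: the country sets are examples, and you should check the current UK adequacy regulations rather than relying on any hard-coded list.

```python
# Illustrative mapping of the transfer rules above. The country sets are
# examples only; consult current UK adequacy regulations, not this list.
ADEQUATE = {"France", "Germany", "Ireland", "Israel", "New Zealand"}
HIGH_RISK = {"China", "Russia"}

def transfer_safeguard(country: str) -> str:
    """Pick the safeguard a transfer to this destination would need."""
    if country in ADEQUATE:
        return "None required (adequacy)"
    if country in HIGH_RISK:
        return "Avoid unless essential; strong encryption plus SCCs/BCRs"
    return "SCCs, BCRs, or an approved certification scheme"

print(transfer_safeguard("United States"))  # SCCs, BCRs, or an approved certification scheme
```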

Document Structure

The generated DPIA includes:
# DPIA Document

## Document Control
- Document ID: ARC-001-DPIA-v1.0
- Classification: OFFICIAL-SENSITIVE
- Assessment Date: 2026-03-04
- Next Review Date: 2027-03-04 (12 months)

## Section 1: Need for DPIA
- ICO 9-criteria screening results
- Legal requirement justification

## Section 2: Description of Processing
- Project context, processing purposes
- Nature, scope, context of processing
- Data categories (PII, special category)
- Data subjects (including vulnerable groups)
- Retention periods

## Section 3: Consultation
- Internal stakeholders (Data Controller, DPO, IT Security)
- External stakeholders (data subjects, processors)

## Section 4: Necessity and Proportionality
- Legal basis (Article 6)
- Special category conditions (Article 9)
- Necessity and proportionality tests
- Data minimization assessment

## Section 5: Risk Assessment
- Risks to individuals (DPIA-001, DPIA-002, etc.)
- Likelihood, severity, overall risk scores
- Confidentiality, integrity, availability risks

## Section 6: Mitigations
- Technical controls (encryption, pseudonymization)
- Organizational controls (policies, training)
- Procedural controls (breach notification, audit trails)
- Residual risk after mitigations

## Section 7: ICO Consultation
- Required if residual high risks remain

## Section 8: Sign-off and Approval
- Data Controller signature
- DPO signature
- Senior Responsible Owner signature

## Section 9: Review and Monitoring
- Review triggers (12 months, major changes, breaches)

## Section 10: Traceability
- Links to data model, requirements, stakeholders

## Section 11: Data Subject Rights
- Implementation of SAR, rectification, erasure, portability

## Section 12: International Transfers
- Safeguards (SCCs, BCRs, adequacy decisions)

## Section 13: Children's Data
- Age verification, parental consent, best interests

## Section 14: AI/Algorithmic Processing
- Algorithmic bias, explainability, human oversight

## Section 15: Summary and Action Plan
- Total risks, key mitigations, ICO consultation
- Action plan with owners and deadlines

Real-World Example

Project: NHS Appointment Booking System (Project 003)
ICO Screening: 3/9 criteria met → DPIA REQUIRED
  • ✅ Criterion 4: Sensitive data (Health records, NHS numbers)
  • ✅ Criterion 5: Large scale (>500,000 patients nationally)
  • ✅ Criterion 7: Vulnerable subjects (Elderly patients, disabled patients)
Processing Overview:
  • Data Subjects: NHS patients (all ages, including children and elderly)
  • Personal Data: 5 entities with PII (Patient, Appointment, MedicalHistory)
  • Special Category Data: YES (Health data - diagnoses, medications, allergies)
  • Legal Basis: Public Task (Article 6(1)(e)) - NHS statutory duty to provide care
  • Article 9 Condition: Medical Purposes (Article 9(2)(h)) - healthcare provision
  • Retention Period: 8 years after last interaction (NHS records retention policy)
Risk Assessment:
  • Total Risks: 8 identified
    • 🔴 High: 2 (unauthorized access to health records, data breach affecting vulnerable patients)
    • 🟠 Medium: 4 (data quality issues, inability to delete records, third-party breach)
    • 🟢 Low: 2 (minor availability issues, email delivery failures)
Key Risks:
  1. DPIA-001: Unauthorized access to patient health records (🔴 HIGH)
    • Likelihood: Possible (phishing, insider threat)
    • Severity: Severe (medical confidentiality breach, discrimination, psychological harm)
    • Mitigations: Role-based access control, audit logging, MFA for staff, data encryption (AES-256)
    • Residual Risk: 🟠 MEDIUM
  2. DPIA-002: Data breach affecting vulnerable elderly patients (🔴 HIGH)
    • Likelihood: Possible (targeted attack on elderly demographic)
    • Severity: Severe (elderly more susceptible to fraud, identity theft)
    • Mitigations: Enhanced security for vulnerable patient records, breach notification within 72 hours, ICO reporting
    • Residual Risk: 🟠 MEDIUM
Mitigations Proposed: 12 technical, organizational, procedural controls
  • Encryption at rest (AES-256) and in transit (TLS 1.3)
  • Role-based access control (RBAC) with least privilege
  • Multi-factor authentication (MFA) for all staff
  • Audit logging (all access to patient records)
  • Data minimization (only collect necessary health data)
  • Pseudonymization for analytics (anonymize NHS numbers)
  • Breach notification process (72-hour ICO reporting)
  • Staff training (GDPR, data protection, phishing awareness)
ICO Prior Consultation: NOT REQUIRED (all residual risks reduced to MEDIUM or LOW)
Data Subject Rights:
  • ✅ Implemented: SAR (export patient data), rectification (update records), portability (JSON export)
  • ❌ Not Implemented: Erasure (NHS retention policy prevents deletion before 8 years)
    • Mitigation: Document legal basis for retention (NHS statutory duty, medical records retention)
Next Steps:
  1. Obtain Data Controller signature (NHS Trust Chief Executive)
  2. Obtain DPO signature (NHS Trust Data Protection Officer)
  3. Implement recommended mitigations (encryption, RBAC, MFA)
  4. Establish 12-month review cycle
  5. Train staff on GDPR and data protection
  6. Implement breach notification process

Tips & Best Practices

Start Early
Conduct the DPIA during the design phase, not after the system is built. This allows you to design privacy in from the start (Privacy by Design) and avoid costly retrofitting.
A DPIA is NOT a Risk Register
DPIA risks focus on impact on individuals (privacy harm, discrimination, physical harm), NOT organizational risks (financial loss, reputational damage). These are separate assessments.
Review Regularly
DPIAs must be reviewed:
  • Every 12 months (recommended)
  • When new processing activities are added
  • When data protection risks change
  • When ICO guidance is updated
  • After a data breach occurs

Quality Checks

Before generating the document, ArcKit validates:

Data Model

Prerequisite: Run before DPIA to identify PII

Requirements

Input: Extract DR-xxx, NFR-SEC, compliance requirements

Risk Register

Integration: Add DPIA risks to risk register

Secure by Design

Integration: Extract security controls as mitigations

AI Playbook

Integration: Algorithmic bias assessment for AI/ML

Stakeholders

Input: Data subject categories, vulnerable groups
