## Key Fields in Issues

Every issue returned by the tools includes structured fields.

### Field Descriptions
#### ruleId
Unique identifier for the accessibility rule
Examples by tool:

- axe-core: `image-alt`, `color-contrast`, `aria-valid-attr`
- Pa11y: `WCAG2AA.Principle1.Guideline1_1.1_1_1.H37`
- Lighthouse: `image-alt`, `color-contrast`, `button-name`

Use it to:

- Group similar issues across the site
- Exclude specific rules if needed
- Look up detailed rule documentation
#### severity

Issue severity from the tool’s perspective:
- critical: Blocks users entirely from functionality
- serious: Significantly hinders users, major barriers
- moderate: Some difficulty, workarounds exist
- minor: Annoyance but doesn’t prevent access
Use the `priority` field for business decisions.

#### wcag
WCAG success criterion mapping
- `criterion`: Specific WCAG guideline (1.1.1, 2.4.4, etc.)
- `level`: Conformance level (A, AA, AAA)
- `principle`: One of the 4 WCAG principles

Use it to:

- Understand which WCAG requirement is violated
- Group issues by principle or level
- Link to official WCAG documentation
#### priority

Business-oriented priority (enriched by AccessibilityHub), based on user impact and WCAG level:
- critical: Blocks essential tasks, Level A violations
- high: Significant barriers, Level AA violations
- medium: Moderate impact, some workarounds
- low: Minor issues, Level AAA or edge cases
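The mapping above can be sketched as a simple rule. This is a hypothetical simplification for illustration, not AccessibilityHub's actual enrichment logic, which also weighs user impact:

```python
def business_priority(wcag_level: str, severity: str) -> str:
    """Rough heuristic mapping WCAG level + tool severity to a
    business priority (illustrative only, not the tool's real code)."""
    if wcag_level == "A" and severity in ("critical", "serious"):
        return "critical"
    if wcag_level == "AA" and severity in ("critical", "serious"):
        return "high"
    if wcag_level == "AAA" or severity == "minor":
        return "low"
    return "medium"

print(business_priority("A", "critical"))  # critical
```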
#### remediationEffort

Estimated time to fix (enriched by AccessibilityHub):
- low: < 30 minutes per issue
- medium: 1-4 hours per issue
- high: Days or sprints
#### affectedUsers

User groups impacted (enriched by AccessibilityHub):

- `screen-reader`: JAWS, NVDA, VoiceOver users
- `keyboard-only`: Users who can’t use a mouse
- `low-vision`: Screen magnifier users
- `color-blind`: Color vision deficiencies
- `cognitive`: Learning/memory disabilities
- `motor-impaired`: Limited dexterity
#### humanContext

Human-readable explanation (enriched by AccessibilityHub). Includes:
- WCAG criterion name and explanation
- Real-world impact example
- Why it matters for accessibility
#### suggestedActions

Actionable fix steps (enriched by AccessibilityHub). Concrete steps with code examples:
- What to change
- How to change it
- Code snippets when applicable
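Putting the fields together, a hypothetical enriched issue might look like this (field values are illustrative, not real tool output):

```python
# Hypothetical enriched issue combining the fields described above.
issue = {
    "ruleId": "image-alt",
    "severity": "critical",
    "wcag": {"criterion": "1.1.1", "level": "A", "principle": "Perceivable"},
    "priority": "critical",
    "remediationEffort": "low",
    "affectedUsers": ["screen-reader", "low-vision"],
    "humanContext": "Images without alt text are invisible to screen readers.",
    "suggestedActions": ['Add alt="..." describing the image content'],
}

print(sorted(issue))
```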
## How to Prioritize

### Prioritization Matrix

The most effective way to prioritize is by combining priority and effort:

| Priority | Low Effort | Medium Effort | High Effort |
|---|---|---|---|
| Critical | 🔥 Fix TODAY: drop everything | 📅 This sprint: high priority | 📅 Plan sprint: needs estimation |
| High | ✅ Quick wins: do soon | 📅 This sprint: medium priority | 📅 Next sprint: needs planning |
| Medium | 📝 Easy fix: when available | 📝 Backlog: plan eventually | 📝 Backlog: evaluate vs. impact |
| Low | 📝 Nice to have: low priority | 📝 Backlog: consider deferring | ❌ Defer: not worth the effort |
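The matrix above can be expressed as a lookup keyed on the enriched `priority` and `remediationEffort` fields (a sketch with simplified action labels, not part of the tool):

```python
# Prioritization matrix as a lookup table (labels simplified).
MATRIX = {
    ("critical", "low"): "fix-today",
    ("critical", "medium"): "this-sprint",
    ("critical", "high"): "plan-sprint",
    ("high", "low"): "quick-win",
    ("high", "medium"): "this-sprint",
    ("high", "high"): "next-sprint",
    ("medium", "low"): "easy-fix",
    ("medium", "medium"): "backlog",
    ("medium", "high"): "backlog",
    ("low", "low"): "nice-to-have",
    ("low", "medium"): "backlog",
    ("low", "high"): "defer",
}

def triage(issue: dict) -> str:
    """Map an issue's priority and remediationEffort to an action bucket."""
    return MATRIX[(issue["priority"], issue["remediationEffort"])]

print(triage({"priority": "critical", "remediationEffort": "low"}))  # fix-today
```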
Triage Example
Prompt:Tips for Prioritization
#### Use the matrix
Always prioritize by combining priority and effort:
- Critical + Low effort = Fix TODAY
- Critical + Medium/High effort = Plan for sprint
- High + Low effort = Quick wins
- Medium/Low + High effort = Evaluate carefully
#### Validate duplicates
If `deduplicatedCount > issueCount` in `analyze-mixed` results:

- Check `individualResults` to see which tool found each issue
- Higher confidence if multiple tools report the same issue
- Lower confidence if only one tool reports it (possible false positive)
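Cross-tool agreement can be counted with a few lines. The field names (`ruleId`, `selector`, and the shape of `individualResults`) are assumptions based on the descriptions above, not a guaranteed schema:

```python
from collections import defaultdict

def cross_tool_confidence(individual_results: dict) -> dict:
    """Count how many tools reported each (ruleId, selector) pair.
    More tools agreeing on an issue means higher confidence."""
    counts = defaultdict(set)
    for tool, issues in individual_results.items():
        for issue in issues:
            counts[(issue["ruleId"], issue["selector"])].add(tool)
    return {key: len(tools) for key, tools in counts.items()}

results = {
    "axe-core": [{"ruleId": "image-alt", "selector": "img.logo"}],
    "pa11y": [{"ruleId": "image-alt", "selector": "img.logo"}],
    "lighthouse": [],
}
print(cross_tool_confidence(results))  # {('image-alt', 'img.logo'): 2}
```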
#### Review affected users
Prioritize issues that affect:
- screen-reader and keyboard-only users (most common AT users)
- User groups that represent your actual user base
- Critical user journeys (checkout, signup, login)
#### Leverage humanContext
Read the real-world examples to understand:
- Actual impact on users
- Whether it’s a blocker or annoyance
- Whether workarounds exist
## Advanced Tips
### 1. SPA Analysis with Lazy Loading
For single-page applications with dynamic content, make sure lazily loaded content has rendered before running the analysis.

### 2. Mobile Viewport Analysis
Test responsive layouts at common viewport sizes:

- iPhone SE: 375x667
- iPhone 12/13: 390x844
- iPad: 768x1024
- Android: 360x640
### 3. Exclude Specific Rules
Excluding rules is useful for known, documented false positives.

### 4. Contrast Analysis with AAA Level
For maximum accessibility, run contrast analysis against the stricter AAA thresholds (7:1 for body text).

### 5. Specific Section Contrast
Analyze only part of the page. Common section selectors:

- Header: `.header`, `header`, `[role="banner"]`
- Main content: `.main-content`, `main`, `[role="main"]`
- Footer: `.footer`, `footer`, `[role="contentinfo"]`
- Modal: `.modal`, `[role="dialog"]`
### 6. APCA Analysis (WCAG 3.0 Draft)
For projects preparing for WCAG 3.0, the two contrast models compare as follows.

**WCAG 2.1 (Current)**

- Algorithm: relative luminance ratio
- Format: 4.5:1, 7:1 (ratios)
- Pros: current legal standard; well understood; widely supported
- Cons: not perceptually uniform; doesn’t account for polarity; can miss readability issues

**APCA (WCAG 3.0 Draft)**

Thresholds are expressed as Lc (lightness contrast) values:

- 75 Lc: body text (16px)
- 60 Lc: large text (24px)
- 45 Lc: non-text UI components
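The WCAG 2.1 ratio is straightforward to compute yourself. This sketch implements the relative-luminance formula from the WCAG 2.1 definition; it mirrors the standard, not the analysis tool's internal code:

```python
def _linearize(c: float) -> float:
    """Linearize one sRGB channel (0-1) per the WCAG 2.1 definition."""
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    """Relative luminance of an (R, G, B) color with 0-255 channels."""
    r, g, b = (_linearize(v / 255) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """WCAG 2.1 contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Black on white yields the maximum ratio of 21:1; the AA threshold for body text is 4.5:1, AAA is 7:1.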
## Tool Comparison

### Same Issue, Different Tools

axe-core, Pa11y, and Lighthouse each report the same underlying problem in their own format. axe-core’s report, for example, offers:

- Compact selectors
- Clear impact levels
- HTML snippet included
- Low false positive rate
### Tool Feature Comparison
| Feature | axe-core | Pa11y | Lighthouse | Contrast |
|---|---|---|---|---|
| Speed | ~2-3s | ~2s | ~5-10s | ~1-2s |
| Selector format | Compact | Full path | Compact | Compact |
| Severity levels | 4 levels | 3 types | Binary | 4 levels |
| HTML snippet | ✅ | ✅ | ✅ | ✅ |
| Confidence score | ✅ | ✅ | - | Always 1.0 |
| False positives | Very low | Moderate | Low | Very low |
| Scoring | - | - | 0-100 | - |
| Fix suggestions | - | - | Limited | ✅ Colors |
| WCAG 3.0 (APCA) | - | - | - | ✅ |
## Frequently Asked Questions
### Which tool should I use in CI/CD?
For deployed web apps, use `analyze-mixed` for maximum coverage:

- Runs axe-core, Pa11y, and Lighthouse in parallel
- Deduplicates common issues
- Provides a Lighthouse score for tracking

For focused color checks, use `analyze-contrast` with fix suggestions:

- Fast and focused
- Provides actionable color fixes
- Can run on specific components
### How do I handle false positives?
- Verify with a second tool: run `analyze-mixed` and check whether multiple tools report it
- Check the confidence score: scores < 0.8 may be false positives
- Verify manually: test with actual assistive technology
- Exclude if confirmed: use the `excludeRules` option and document why
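Low-confidence findings can also be filtered before triage. This sketch assumes each issue carries the `confidence` field mentioned in the feature comparison (the field name is an assumption):

```python
def filter_confident(issues: list, threshold: float = 0.8) -> list:
    """Keep issues at or above a confidence threshold; issues without
    a confidence score are treated as confident (assumed default 1.0)."""
    return [i for i in issues if i.get("confidence", 1.0) >= threshold]

issues = [
    {"ruleId": "color-contrast", "confidence": 0.95},
    {"ruleId": "aria-valid-attr", "confidence": 0.6},  # likely false positive
]
print([i["ruleId"] for i in filter_confident(issues)])  # ['color-contrast']
```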
### Can I analyze authenticated pages?
Not directly; the tools analyze public pages without authentication. Workarounds:

- Capture HTML: save the authenticated page’s HTML and analyze it with the `html` parameter
- Test environment: configure a test environment with an auth bypass
- Component testing: analyze components in isolation
### What are MCP Resources used for?
Resources provide reference data to complement analysis tools:
- WCAG criteria lookup: `wcag://criteria/{criterion}`
- Contrast thresholds: `contrast://thresholds/wcag21` or `contrast://thresholds/apca`
- Lighthouse audits: `lighthouse://audits`
### What do severity levels mean?
| Severity | User Impact |
|---|---|
| critical | Completely blocks users from tasks |
| serious | Major barriers, significantly hinders access |
| moderate | Some difficulty, but workarounds exist |
| minor | Annoyance that doesn’t prevent access |
Use the `priority` field for business decisions, not `severity`; priority considers WCAG level and user impact.

### How often should I run accessibility checks?
| Scenario | Frequency | Tool |
|---|---|---|
| During development | Every feature branch | analyze-with-axe (fast) |
| Pre-deployment | Every release | analyze-mixed (comprehensive) |
| Production monitoring | Weekly or per sprint | analyze-mixed + Lighthouse |
| After major redesigns | Once | full-accessibility-audit prompt |
| Continuous tracking | Every commit | analyze-with-axe in CI |
## Related

- **Workflows**: Recommended workflows for common tasks
- **Effective Prompts**: Tips for better prompts
- **Enriched Context**: Understanding enriched human context
- **WCAG Criteria Resource**: Browse WCAG reference data