Web accessibility testing ensures that websites and applications are usable by everyone, including people with disabilities. AccessibilityHub integrates three powerful testing tools to provide comprehensive accessibility analysis.

Why Accessibility Testing Matters

Accessibility testing is essential because:

Legal Compliance

Many countries require WCAG compliance for public-facing websites (e.g. the ADA in the US, the EAA in the EU)

Wider Audience

An estimated 15% of the world's population has some form of disability

Better UX

Accessible sites benefit all users with clearer navigation and structure

SEO Benefits

Semantic HTML and proper structure improve search engine rankings

Testing Tools

AccessibilityHub integrates three industry-standard testing tools, each with different strengths:

axe-core

Fast, accurate, and widely adopted accessibility testing engine by Deque Systems
Strengths:
  • Very low false positive rate
  • Fast execution (~2-3 seconds)
  • Industry standard used by Microsoft, Google, and government agencies
  • Excellent WCAG coverage
Best for:
  • CI/CD pipelines
  • Automated testing
  • Quick validation during development
Example issues detected:
  • Missing alt text on images
  • Color contrast violations
  • Missing form labels
  • Invalid ARIA usage
{
  "tool": "analyze-with-axe",
  "url": "https://example.com"
}
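To make the first class of issue concrete, here is a toy sketch of the kind of check axe-core automates: scanning markup for `<img>` elements that lack an `alt` attribute. This uses only the Python standard library and is an illustration, not axe-core's actual implementation.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute entirely."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.violations.append(attrs.get("src", "<no src>"))

checker = MissingAltChecker()
checker.feed('<img src="logo.png"><img src="hero.jpg" alt="Team photo">')
print(checker.violations)  # ['logo.png']
```

Note that an empty `alt=""` is valid for decorative images, so the check above correctly accepts it and flags only images where the attribute is absent.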

Pa11y

Accessibility testing tool built on HTML CodeSniffer
Strengths:
  • Comprehensive HTML structure validation
  • Detailed selector information
  • Good for catching HTML semantic issues
  • Moderate execution speed (~2 seconds)
Best for:
  • Validating HTML semantics
  • Catching structural issues
  • Cross-validation with axe-core
Example issues detected:
  • Improper heading hierarchy
  • Missing landmarks
  • Table structure problems
  • Document structure violations
{
  "tool": "analyze-with-pa11y",
  "url": "https://example.com"
}
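The heading-hierarchy issue above is simple to illustrate: a page that jumps from h1 to h3 skips a level, which confuses screen-reader navigation. A minimal sketch of such a check (not Pa11y's actual rule engine):

```python
from html.parser import HTMLParser

class HeadingOrderChecker(HTMLParser):
    """Flags heading levels that jump by more than one (e.g. h1 -> h3)."""
    def __init__(self):
        super().__init__()
        self.last_level = 0
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if self.last_level and level > self.last_level + 1:
                self.violations.append(f"h{self.last_level} -> h{level}")
            self.last_level = level

checker = HeadingOrderChecker()
checker.feed("<h1>Title</h1><h3>Skipped h2</h3>")
print(checker.violations)  # ['h1 -> h3']
```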

Lighthouse

Google’s automated testing tool providing accessibility scores from 0-100
Strengths:
  • Provides overall accessibility score (0-100)
  • Detailed audit results with WCAG mappings
  • Part of Chrome DevTools ecosystem
  • Industry-recognized scoring standard
Best for:
  • High-level accessibility metrics
  • Tracking improvements over time
  • Pre-deployment validation
  • Stakeholder reporting
Example metrics:
  • Overall accessibility score: 87/100
  • Passed audits: 42
  • Failed audits: 8
  • Individual audit scores and impact
{
  "tool": "analyze-with-lighthouse",
  "url": "https://example.com"
}
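Lighthouse derives its 0-100 score as a weighted average of individual audit results. The sketch below is a simplified model of that idea (the real audit weights and scoring curve differ):

```python
def accessibility_score(audits):
    """Weighted average of binary audit results, scaled to 0-100.

    `audits` is a list of (passed: bool, weight: int) pairs.
    Simplified model; Lighthouse's actual audit weights differ.
    """
    total = sum(weight for _, weight in audits)
    earned = sum(weight for passed, weight in audits if passed)
    return round(100 * earned / total)

audits = [(True, 10), (True, 7), (False, 3), (True, 5)]
print(accessibility_score(audits))  # 88
```

This is why one failed high-weight audit (such as color contrast) can drop the score far more than several minor failures.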

Color Contrast Analysis

Contrast Analyzer

Specialized tool for color contrast validation with fix suggestions
Strengths:
  • Calculates exact contrast ratios
  • Suggests compliant color alternatives
  • Supports WCAG 2.1 and APCA (WCAG 3.0 draft)
  • Very fast (~1-2 seconds)
Best for:
  • Design system validation
  • Color palette compliance
  • Quick contrast checks
  • Preparing for WCAG 3.0
Example output:
  • Current ratio: 3.2:1 (fails WCAG AA)
  • Required ratio: 4.5:1
  • Suggested fix: Darken text to #333333 (ratio 5.1:1)
{
  "tool": "analyze-contrast",
  "url": "https://example.com",
  "options": {
    "wcagLevel": "AA",
    "suggestFixes": true
  }
}
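The ratios the analyzer reports follow directly from the WCAG 2.1 definitions: each color's sRGB relative luminance is computed, and the contrast ratio is (L_lighter + 0.05) / (L_darker + 0.05), ranging from 1:1 to 21:1. A self-contained implementation of those formulas:

```python
def relative_luminance(hex_color):
    """sRGB relative luminance per the WCAG 2.1 definition."""
    def channel(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg, bg):
    """(L_lighter + 0.05) / (L_darker + 0.05); ranges from 1:1 to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

ratio = contrast_ratio("#777777", "#ffffff")
print(f"{ratio:.2f}:1")  # 4.48:1 — just fails WCAG AA's 4.5:1 for normal text
```

Black on white yields the maximum 21:1; mid-gray `#777777` on white lands at about 4.48:1, narrowly failing the AA threshold for normal text.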

Combined Analysis

For the most comprehensive results, use the analyze-mixed tool to run multiple tools in parallel:

analyze-mixed

Run axe-core, Pa11y, and Lighthouse in parallel and get deduplicated, unified results
Benefits:
  • Maximum issue coverage
  • Automatic deduplication
  • Single unified report
  • Parallel execution for speed
{
  "tool": "analyze-mixed",
  "url": "https://example.com"
}
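Conceptually, a combined run launches each tool concurrently and then merges findings, treating two reports of the same rule at the same selector as one issue. A sketch of that pattern, with simple stand-in runner functions in place of the real tool integrations:

```python
import asyncio

# Stand-ins for the real tool runners; each returns issues as dicts.
async def run_axe(url):
    return [{"rule": "image-alt", "selector": "img.logo"},
            {"rule": "color-contrast", "selector": ".nav a"}]

async def run_pa11y(url):
    return [{"rule": "image-alt", "selector": "img.logo"},
            {"rule": "heading-order", "selector": "h3.intro"}]

async def run_lighthouse(url):
    return [{"rule": "color-contrast", "selector": ".nav a"}]

async def analyze_mixed(url):
    """Run all tools concurrently, then deduplicate by (rule, selector)."""
    results = await asyncio.gather(run_axe(url), run_pa11y(url), run_lighthouse(url))
    seen, unified = set(), []
    for issues in results:
        for issue in issues:
            key = (issue["rule"], issue["selector"])
            if key not in seen:
                seen.add(key)
                unified.append(issue)
    return unified

issues = asyncio.run(analyze_mixed("https://example.com"))
print(len(issues))  # 3 unique issues from 5 raw findings
```

Keying on (rule, selector) is one reasonable deduplication strategy; the real implementation may normalize rule IDs across tools, since each tool names equivalent rules differently.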

Tool Comparison

| Feature         | axe-core  | Pa11y           | Lighthouse | Contrast    |
|-----------------|-----------|-----------------|------------|-------------|
| Speed           | ~2-3s     | ~2s             | ~5-10s     | ~1-2s       |
| False Positives | Very Low  | Moderate        | Low        | Very Low    |
| WCAG Coverage   | Excellent | Good            | Good       | Specialized |
| Scoring         | No        | No              | ✅ (0-100) | No          |
| Fix Suggestions | Limited   | No              | No         | ✅ Colors   |
| Best Use        | CI/CD     | HTML validation | Metrics    | Design QA   |

WCAG Guidelines

All tools test against the Web Content Accessibility Guidelines (WCAG), which define three conformance levels:
Level A (minimum accessibility level):
  • Basic keyboard navigation
  • Text alternatives for images
  • Captions for audio content
  • No keyboard traps
Failing Level A means some users cannot access content at all.
Level AA (the most common legal and contractual target):
  • Color contrast: 4.5:1 for normal text, 3:1 for large text
  • Visible keyboard focus
  • Captions for live audio
  • Consistent navigation and identification
Level AAA (highest accessibility level):
  • Color contrast: 7:1 for normal text, 4.5:1 for large text
  • Sign language interpretation
  • Extended audio descriptions
  • Low background noise in audio
AAA is not required for full site compliance but recommended for critical content.

Testing Workflow

1

Quick Check

Use analyze-with-axe or analyze-mixed for a fast initial scan:
Analyze the accessibility of https://my-site.com
2

Review Issues

Prioritize issues by severity and remediation effort. Focus on:
  • Critical + Low effort = Fix today
  • Serious + Low effort = Quick wins
3

Contrast Check

Run dedicated contrast analysis for design QA:
Check the color contrast of https://my-site.com
4

Score Tracking

Use Lighthouse for measurable progress:
Get the Lighthouse accessibility score for https://my-site.com
5

Fix and Retest

Apply fixes and run analysis again to verify improvements.
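The prioritization rule from step 2 (severity first, then remediation effort) can be sketched as a simple sort. The `severity` and `effort` field names here are illustrative, not the exact schema the tools emit:

```python
SEVERITY_RANK = {"critical": 0, "serious": 1, "moderate": 2, "minor": 3}
EFFORT_RANK = {"low": 0, "medium": 1, "high": 2}

def prioritize(issues):
    """Order issues so critical, low-effort fixes come first."""
    return sorted(issues, key=lambda i: (SEVERITY_RANK[i["severity"]],
                                         EFFORT_RANK[i["effort"]]))

issues = [
    {"id": "heading-order", "severity": "moderate", "effort": "medium"},
    {"id": "image-alt", "severity": "critical", "effort": "low"},
    {"id": "color-contrast", "severity": "serious", "effort": "low"},
]
for issue in prioritize(issues):
    print(issue["id"])
# image-alt, color-contrast, heading-order
```

The same ordering works as a CI gate: fail the build if anything critical remains after sorting.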

Limitations of Automated Testing

Automated tools typically detect only 30-40% of accessibility issues. Manual testing and user feedback are essential.
Cannot be detected automatically:
  • Logical heading order and content hierarchy
  • Quality of alt text (tools detect only whether it is missing)
  • Keyboard navigation flow and usability
  • Screen reader announcement order
  • Clarity of error messages
  • Cognitive load and understandability
Best practice: Combine automated testing with:
  • Manual keyboard testing
  • Screen reader testing (NVDA, JAWS, VoiceOver)
  • User testing with people with disabilities
  • Accessibility audits by experts

Enriched Context

Learn about the human context added to each issue

Workflows

Common testing workflows and use cases

Interpreting Results

How to prioritize and fix issues

Tools Reference

Detailed tool documentation
