Accessibility Testing Tools:
Automated vs Manual Testing
Effective accessibility testing requires both automated tools and manual evaluation. Understanding when to use each approach ensures comprehensive WCAG compliance.
The Testing Landscape
What Can Be Tested?
Automated Testing: 30-40% of WCAG issues
Manual Testing: 60-70% of WCAG issues
User Testing: Real-world validation
Why the Difference?
Automated tools check technical compliance
Manual testing evaluates usability and context
User testing validates real-world experience
Automated Testing Tools
Popular Tools
Browser Extensions:
WAVE (WebAIM) - Visual feedback
axe DevTools - Detailed issue reporting
Lighthouse (Chrome) - Built-in auditing
IBM Equal Access - Enterprise-focused
ANDI (SSA) - Screen reader simulation
Command Line/CI:
Pa11y - Automated testing pipeline
axe-core - API for custom integration
Lighthouse CI - Continuous integration
jest-axe - Jest testing integration
Cloud Services:
Deque Axe Monitor - Site-wide scanning
Siteimprove - Enterprise platform
Monsido - Continuous monitoring
Silktide - All-in-one solution
Automated Tool Capabilities
What Automated Tools Can Find:
✓ Missing alt text
✓ Insufficient color contrast
✓ Missing form labels
✓ Invalid ARIA usage
✓ Missing page titles
✓ Duplicate IDs
✓ Missing language attribute
✓ Incorrect heading hierarchy
✓ Missing skip links
What They Cannot Find:
✗ Whether alt text is accurate
✗ Whether tab order makes sense
✗ Whether content is understandable
✗ Whether error messages are helpful
✗ Whether keyboard navigation is logical
✗ Whether focus order matches visual order
✗ Whether ARIA is used correctly in context
✗ Whether video captions are accurate
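The split between the two lists comes down to what a machine can verify without human judgment. A minimal sketch makes the boundary concrete: the helper below (a hypothetical illustration, not a real axe-core API) can flag images whose alt attribute is missing entirely, but it has no way to judge whether the alt text that is present actually describes the image.

```javascript
// Automated check: flag images with no alt attribute at all.
// It cannot tell good alt text from bad alt text.
function findMissingAlt(images) {
  // images: array of { src, alt } objects extracted from the page
  return images.filter((img) => typeof img.alt !== 'string');
}

const images = [
  { src: 'chart.png', alt: 'Q3 revenue by region' }, // passes; a human must verify accuracy
  { src: 'logo.png', alt: '' },                      // passes; empty alt marks decorative images
  { src: 'hero.jpg' },                               // flagged: alt attribute missing entirely
];

console.log(findMissingAlt(images).map((img) => img.src)); // → [ 'hero.jpg' ]
```

Note that the second image also passes: an empty alt is valid for decorative images, which is exactly the kind of context-dependent call only a human reviewer can make.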
Setting Up Automated Testing
Quick Start with axe-core
// Install
npm install --save-dev @axe-core/puppeteer puppeteer

// Basic usage
const { AxePuppeteer } = require('@axe-core/puppeteer');
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://yoursite.com');
  const results = await new AxePuppeteer(page).analyze();
  console.log('Violations:', results.violations);
  await browser.close();
})();
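In CI you usually want more than a console log: the build should fail when problems are found. Each axe-core violation carries an impact level (minor, moderate, serious, or critical), and a common policy is to block only on the most severe ones. A sketch of such a gating helper, using hypothetical violation data for illustration:

```javascript
// Fail the build only on the most severe axe-core findings.
const BLOCKING = new Set(['serious', 'critical']);

function hasBlockingViolations(violations) {
  return violations.some((v) => BLOCKING.has(v.impact));
}

// Hypothetical results for illustration:
const violations = [
  { id: 'color-contrast', impact: 'serious' },
  { id: 'region', impact: 'moderate' },
];

if (hasBlockingViolations(violations)) {
  console.error('Blocking accessibility violations found');
  // process.exitCode = 1; // uncomment inside a real CI script
}
```

Less severe findings still deserve triage; the gate just keeps them from blocking every merge while you work through the backlog.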
CI/CD Integration
# GitHub Actions example
name: Accessibility Tests
on: [push, pull_request]
jobs:
  a11y-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Run Pa11y
        run: |
          npm install -g pa11y-ci
          pa11y-ci --sitemap https://yoursite.com/sitemap.xml
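Instead of passing everything on the command line, pa11y-ci can also read a JSON config from a `.pa11yci` file in the project root. A minimal sketch (the URLs are placeholders; adjust the timeout and standard to your needs):

```json
{
  "defaults": {
    "timeout": 30000,
    "standard": "WCAG2AA"
  },
  "urls": [
    "https://yoursite.com/",
    "https://yoursite.com/contact"
  ]
}
```

Keeping the config in the repository means local runs and CI runs test the same pages with the same settings.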
Jest Integration
import { render } from '@testing-library/react';
import { axe, toHaveNoViolations } from 'jest-axe';
expect.extend(toHaveNoViolations);
test('should not have accessibility violations', async () => {
const { container } = render(<YourComponent />);
const results = await axe(container);
expect(results).toHaveNoViolations();
});
Manual Testing Methods
Essential Manual Tests
1. Keyboard Navigation
Test every interactive element:
Tab - Move forward
Shift+Tab - Move backward
Enter - Activate links/buttons
Space - Toggle checkboxes/buttons
Arrow keys - Navigate menus/radios
Escape - Close dialogs
Checklist:
All interactive elements reachable
Visible focus indicator
Logical tab order
No keyboard traps
Skip links work
Modal/dialog focus management
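The "logical tab order" item above can be reasoned about mechanically: browsers visit elements with a positive tabindex first, in ascending tabindex order, and only then the elements with tabindex 0 (or naturally focusable elements) in DOM order. A simplified sketch of that rule shows why positive tabindex values are a common source of confusing tab order:

```javascript
// Simplified model of browser tab order: positive tabindex values first
// (ascending), then tabindex 0 / naturally focusable elements in DOM order.
// tabindex -1 removes an element from the tab sequence entirely.
function tabOrder(elements) {
  // elements: array of { id, tabindex } in DOM order
  const positive = elements
    .filter((el) => el.tabindex > 0)
    .sort((a, b) => a.tabindex - b.tabindex);
  const natural = elements.filter((el) => el.tabindex === 0);
  return [...positive, ...natural].map((el) => el.id);
}

const domOrder = [
  { id: 'search', tabindex: 2 },         // positive tabindex jumps the queue
  { id: 'nav-link', tabindex: 0 },
  { id: 'logo-link', tabindex: 1 },
  { id: 'hidden-widget', tabindex: -1 }, // focusable only via script
];

console.log(tabOrder(domOrder)); // → [ 'logo-link', 'search', 'nav-link' ]
```

Because the visual layout here starts with the logo, then search, then navigation, the positive tabindex values happen to work, but any later DOM change can silently break the sequence. Sticking to tabindex 0 and a sensible DOM order avoids the problem.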
2. Screen Reader Testing
NVDA (Windows - Free):
NVDA+N - NVDA menu
Insert+Down - Read all
Up/Down arrows - Read by line
H - Next heading
K - Next link
F - Next form field
B - Next button
JAWS (Windows - Paid):
Similar commands to NVDA
VoiceOver (Mac - Built-in):
Cmd+F5 - Turn on/off
VO+A - Read all
VO+Right/Left - Navigate
VO+Space - Activate
VO+U - Rotor menu
Testing Checklist:
All content read aloud
Alt text makes sense
Form labels announced
Buttons/links identified
Error messages announced
Heading structure logical
Tables navigable
3. Zoom and Reflow
Browser zoom to 200% (for WCAG 2.1 reflow, also test 400%, which on a 1280px window is equivalent to a 320 CSS px wide viewport):
- [ ] All content visible
- [ ] No horizontal scrolling
- [ ] Text remains readable
- [ ] No content overlap
- [ ] Interactive elements still usable
4. Color and Contrast
Tests:
- [ ] Content understandable without color
- [ ] Links distinguishable without color
- [ ] Error states clear without color
- [ ] Use contrast checker tools
- [ ] Test with grayscale mode
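The contrast checker tools mentioned above all compute the same two formulas from the WCAG definition: relative luminance of each color, then the ratio between the lighter and darker value. A sketch of that calculation (sRGB channel values 0 to 255):

```javascript
// WCAG relative luminance: linearize each sRGB channel, then weight.
function channel(c) {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function luminance([r, g, b]) {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), ranging 1:1 to 21:1.
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Black on white: the maximum possible ratio of 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // → 21.0
```

WCAG AA requires at least 4.5:1 for normal-size text and 3:1 for large text; AAA raises those to 7:1 and 4.5:1.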
5. Content Quality
Review:
- [ ] Alt text describes images accurately
- [ ] Link text is descriptive
- [ ] Headings are meaningful
- [ ] Error messages are helpful
- [ ] Instructions are clear
- [ ] Content is understandable
Screen Reader Testing Guide
Platform-Specific Testing
Windows + NVDA (Free):
Download from nvaccess.org
Install and restart
Launch with Ctrl+Alt+N
Navigate your site
Listen for issues
Mac + VoiceOver (Built-in):
Cmd+F5 to enable
Practice in VoiceOver training
Navigate your site
Cmd+F5 to disable
iOS + VoiceOver:
Settings > Accessibility > VoiceOver
Triple-click home/side button shortcut
Swipe to navigate
Double-tap to activate
Android + TalkBack:
Settings > Accessibility > TalkBack
Enable TalkBack
Swipe to navigate
Double-tap to activate
Testing Strategy
Comprehensive Approach
Development Phase:
1. Linting - ESLint jsx-a11y plugin
2. Unit tests - jest-axe
3. Component testing - Automated checks
4. Developer keyboard testing
Pre-Release:
1. Automated scan - axe DevTools
2. Manual keyboard testing
3. Screen reader testing (NVDA/VoiceOver)
4. Color contrast verification
5. Zoom/reflow testing
Post-Release:
1. Monitoring - Ongoing automated scans
2. User feedback - Bug reports
3. Periodic manual audits
4. User testing with disabled users
Tool Comparison
Tool Feature Matrix
axe DevTools:
✓ Accurate, low false positives
✓ Detailed guidance
✓ Free browser extension
✓ Pro version with advanced features
Best for: Development and QA
WAVE:
✓ Visual feedback
✓ Free
✓ Easy to understand
✓ Browser extension or API
Best for: Beginners and quick checks
Lighthouse:
✓ Built into Chrome
✓ Performance + accessibility
✓ Free
✓ CI integration
Best for: Overall site quality
Pa11y:
✓ Command line
✓ CI/CD integration
✓ Sitemap scanning
✓ Free and open source
Best for: Automation and pipelines
Siteimprove/Enterprise Tools:
✓ Site-wide monitoring
✓ Compliance reporting
✓ Issue tracking
✗ Paid service
Best for: Enterprise and governance
Common Pitfalls
Testing Mistakes to Avoid
Over-Reliance on Automation:
Bad: "Automated tests passed, we're accessible!"
Good: "Automated tests passed, now let's do manual testing"
Testing Only Homepage:
Bad: Test homepage and call it done
Good: Test representative pages and user flows
Ignoring False Positives:
Bad: Dismiss all automated findings
Good: Investigate and document why certain items are not issues
Testing Without Context:
Bad: Check each page in isolation
Good: Test complete user journeys
Never Using Real Screen Readers:
Bad: Rely only on automated simulation
Good: Test with NVDA, JAWS, or VoiceOver
User Testing
Testing with Real Users
Recruiting:
Disability organizations
User testing platforms
Local communities
Accessibility advocates
Test Scenarios:
Complete a purchase
Fill out a contact form
Navigate to specific content
Use search functionality
Create an account
What to Observe:
Where do users get stuck?
What causes confusion?
What do they skip?
What takes too long?
What works well?
Compensation:
Pay users fairly for their time
Typical rate: $50-100/hour
Provide flexible scheduling
Offer remote options
Building a Testing Workflow
Continuous Testing Process
Daily:
Linting during development
Unit test runs
Local accessibility checks
Per Pull Request:
Automated CI tests
Component-level checks
Developer keyboard test
Weekly:
Automated site scans
New feature manual tests
Issue triage
Monthly:
Comprehensive manual audit
Screen reader testing
User testing session
Quarterly:
Third-party audit
WCAG compliance review
Strategy assessment
Conclusion
Effective accessibility testing combines automated tools, manual evaluation, and user testing. Automated tools catch technical issues quickly, manual testing validates context and usability, and user testing ensures real-world accessibility.
Recommended Approach:
Start with automated tools (30-40% coverage)
Add keyboard and screen reader testing (60-70% coverage)
Validate with user testing (real-world experience)
Monitor continuously
No single tool or method catches everything; layered testing is essential for true accessibility.
