Accessibility Testing Tools: Automated vs Manual Testing

Compare automated and manual accessibility testing. Learn which tools to use, their limitations, and how to build an effective testing strategy.

Written by: Vexnexa Admin

Effective accessibility testing requires both automated tools and manual evaluation. Understanding when to use each approach ensures comprehensive WCAG compliance.

The Testing Landscape

πŸ” What Can Be Tested?

Automated Testing: 30-40% of WCAG issues
Manual Testing: 60-70% of WCAG issues
User Testing: Real-world validation

Why the Difference?

  • Automated tools check technical compliance

  • Manual testing evaluates usability and context

  • User testing validates real-world experience

Automated Testing Tools

πŸ€– Popular Tools

Browser Extensions:

  • WAVE (WebAIM) - Visual feedback

  • axe DevTools - Detailed issue reporting

  • Lighthouse (Chrome) - Built-in auditing

  • IBM Equal Access - Enterprise-focused

  • ANDI (SSA) - Screen reader simulation

Command Line/CI:

  • Pa11y - Automated testing pipeline

  • axe-core - API for custom integration

  • Lighthouse CI - Continuous integration

  • jest-axe - Jest testing integration

Cloud Services:

  • Deque Axe Monitor - Site-wide scanning

  • Siteimprove - Enterprise platform

  • Monsido - Continuous monitoring

  • Silktide - All-in-one solution

Automated Tool Capabilities

What Automated Tools Can Find:

βœ“ Missing alt text
βœ“ Insufficient color contrast
βœ“ Missing form labels
βœ“ Invalid ARIA usage
βœ“ Missing page titles
βœ“ Duplicate IDs
βœ“ Missing language attribute
βœ“ Incorrect heading hierarchy
βœ“ Missing skip links
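
Checks like heading hierarchy are purely mechanical, which is why tools can automate them. A minimal sketch of such a rule (a hypothetical helper, not taken from any of the tools above):

```javascript
// Flag skipped heading levels (e.g. an h2 followed by an h4) —
// the kind of mechanical rule automated tools can verify.
// `levels` is the sequence of heading levels as they appear in the page.
function checkHeadingHierarchy(levels) {
  const issues = [];
  let prev = 0;
  for (const level of levels) {
    if (prev > 0 && level > prev + 1) {
      issues.push(`Skipped from h${prev} to h${level}`);
    }
    prev = level;
  }
  return issues;
}
```

Moving back up the hierarchy (h3 to h2) is fine; only skipping down levels is flagged.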

What They Cannot Find:

❌ Whether alt text is accurate
❌ Whether tab order makes sense
❌ Whether content is understandable
❌ Whether error messages are helpful
❌ Whether keyboard navigation is logical
❌ Whether focus order matches visual order
❌ Whether ARIA is used correctly in context
❌ Whether video captions are accurate

Setting Up Automated Testing

πŸ› οΈ Quick Start with axe DevTools

// Install (the Puppeteer integration pulls in axe-core)
npm install --save-dev @axe-core/puppeteer puppeteer

// Basic usage
const { AxePuppeteer } = require('@axe-core/puppeteer');
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://yoursite.com');

  const results = await new AxePuppeteer(page).analyze();

  console.log('Violations:', results.violations);

  await browser.close();
})();
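
To fail a build on regressions, the `results.violations` array can be reduced to counts per impact level. A minimal sketch (the `summarizeViolations` helper is hypothetical, not part of axe-core):

```javascript
// Hypothetical helper: count violating nodes per impact level
// ('critical', 'serious', 'moderate', 'minor') from an axe result.
function summarizeViolations(violations) {
  const counts = {};
  for (const v of violations) {
    counts[v.impact] = (counts[v.impact] || 0) + v.nodes.length;
  }
  return counts;
}

// e.g. exit non-zero when any critical issues were found:
// if (summarizeViolations(results.violations).critical) process.exit(1);
```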

CI/CD Integration

# GitHub Actions example
name: Accessibility Tests

on: [push, pull_request]

jobs:
  a11y-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Pa11y
        run: |
          npm install -g pa11y-ci
          pa11y-ci --sitemap https://yoursite.com/sitemap.xml
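
Instead of passing a sitemap on the command line, pa11y-ci can read its options from a `.pa11yci` config file in the repository root. A minimal sketch (the URLs are placeholders):

```json
{
  "defaults": {
    "timeout": 30000,
    "standard": "WCAG2AA"
  },
  "urls": [
    "https://yoursite.com/",
    "https://yoursite.com/contact"
  ]
}
```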

Jest Integration

import { render } from '@testing-library/react';
import { axe, toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);

test('should not have accessibility violations', async () => {
  const { container } = render(<YourComponent />);
  const results = await axe(container);
  expect(results).toHaveNoViolations();
});

Manual Testing Methods

πŸ‘€ Essential Manual Tests

1. Keyboard Navigation

Test every interactive element:

Tab - Move forward
Shift+Tab - Move backward
Enter - Activate links/buttons
Space - Toggle checkboxes/buttons
Arrow keys - Navigate menus/radios
Escape - Close dialogs

Checklist:

  • All interactive elements reachable

  • Visible focus indicator

  • Logical tab order

  • No keyboard traps

  • Skip links work

  • Modal/dialog focus management
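
Modal focus management means Tab wraps within the dialog rather than escaping to the page behind it. The wrap-around logic of such a focus trap is just modular arithmetic (a sketch, not from any particular library):

```javascript
// Given the currently focused position among `total` focusable
// elements in a dialog, return the next position for Tab or
// Shift+Tab, wrapping at both ends (the core of a focus trap).
function nextFocusIndex(current, total, shiftKey) {
  return shiftKey
    ? (current - 1 + total) % total
    : (current + 1) % total;
}
```

In a real dialog you would call this from a keydown handler and focus the element at the returned index.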

2. Screen Reader Testing

NVDA (Windows - Free):

NVDA+N - NVDA menu
Insert+Down - Read all
Up/Down arrows - Read by line
H - Next heading
K - Next link
F - Next form field
B - Next button

JAWS (Windows - Paid):
Similar commands to NVDA

VoiceOver (Mac - Built-in):

Cmd+F5 - Turn on/off
VO+A - Read all
VO+Right/Left - Navigate
VO+Space - Activate
VO+U - Rotor menu

Testing Checklist:

  • All content read aloud

  • Alt text makes sense

  • Form labels announced

  • Buttons/links identified

  • Error messages announced

  • Heading structure logical

  • Tables navigable

3. Zoom and Reflow

Browser zoom to 200%:
- [ ] All content visible
- [ ] No horizontal scrolling
- [ ] Text remains readable
- [ ] No content overlap
- [ ] Interactive elements still usable

4. Color and Contrast

Tests:
- [ ] Content understandable without color
- [ ] Links distinguishable without color
- [ ] Error states clear without color
- [ ] Use contrast checker tools
- [ ] Test with grayscale mode
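
Contrast checkers all implement the same WCAG 2.x formula: relative luminance of each color, then the ratio of the lighter to the darker. A sketch of that calculation for sRGB colors:

```javascript
// WCAG 2.x relative luminance of an sRGB color given as [r, g, b] (0-255)
function luminance([r, g, b]) {
  const lin = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio (lighter + 0.05) / (darker + 0.05), from 1:1 up to 21:1
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Black on white is 21:1; WCAG AA requires 4.5:1 for normal text
```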

5. Content Quality

Review:
- [ ] Alt text describes images accurately
- [ ] Link text is descriptive
- [ ] Headings are meaningful
- [ ] Error messages are helpful
- [ ] Instructions are clear
- [ ] Content is understandable

Screen Reader Testing Guide

🎯 Platform-Specific Testing

Windows + NVDA (Free):

  1. Download from nvaccess.org

  2. Install and restart

  3. Launch with Ctrl+Alt+N

  4. Navigate your site

  5. Listen for issues

Mac + VoiceOver (Built-in):

  1. Cmd+F5 to enable

  2. Practice in VoiceOver training

  3. Navigate your site

  4. Cmd+F5 to disable

iOS + VoiceOver:

  1. Settings > Accessibility > VoiceOver

  2. Triple-click home/side button shortcut

  3. Swipe to navigate

  4. Double-tap to activate

Android + TalkBack:

  1. Settings > Accessibility > TalkBack

  2. Enable TalkBack

  3. Swipe to navigate

  4. Double-tap to activate

Testing Strategy

πŸ“‹ Comprehensive Approach

Development Phase:

1. Linting - ESLint jsx-a11y plugin
2. Unit tests - jest-axe
3. Component testing - Automated checks
4. Developer keyboard testing
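
The linting step above assumes a React codebase; eslint-plugin-jsx-a11y ships a recommended preset that can be enabled in a few lines (a minimal sketch of the config):

```javascript
// .eslintrc.js — assumes eslint-plugin-jsx-a11y is installed
module.exports = {
  plugins: ['jsx-a11y'],
  extends: ['plugin:jsx-a11y/recommended'],
};
```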

Pre-Release:

1. Automated scan - axe DevTools
2. Manual keyboard testing
3. Screen reader testing (NVDA/VoiceOver)
4. Color contrast verification
5. Zoom/reflow testing

Post-Release:

1. Monitoring - Ongoing automated scans
2. User feedback - Bug reports
3. Periodic manual audits
4. User testing with disabled users

Tool Comparison

πŸ”§ Tool Feature Matrix

axe DevTools:

  • βœ“ Accurate, low false positives

  • βœ“ Detailed guidance

  • βœ“ Free browser extension

  • βœ“ Pro version with advanced features

  • Best for: Development and QA

WAVE:

  • βœ“ Visual feedback

  • βœ“ Free

  • βœ“ Easy to understand

  • βœ“ Browser extension or API

  • Best for: Beginners and quick checks

Lighthouse:

  • βœ“ Built into Chrome

  • βœ“ Performance + accessibility

  • βœ“ Free

  • βœ“ CI integration

  • Best for: Overall site quality

Pa11y:

  • βœ“ Command line

  • βœ“ CI/CD integration

  • βœ“ Sitemap scanning

  • βœ“ Free and open source

  • Best for: Automation and pipelines

Siteimprove/Enterprise Tools:

  • βœ“ Site-wide monitoring

  • βœ“ Compliance reporting

  • βœ“ Issue tracking

  • βœ“ Paid service

  • Best for: Enterprise and governance

Common Pitfalls

❌ Testing Mistakes to Avoid

Over-Reliance on Automation:

Bad: "Automated tests passed, we're accessible!"
Good: "Automated tests passed, now let's do manual testing"

Testing Only Homepage:

Bad: Test homepage and call it done
Good: Test representative pages and user flows

Ignoring False Positives:

Bad: Dismiss all automated findings
Good: Investigate and document why certain items are not issues

Testing Without Context:

Bad: Check each page in isolation
Good: Test complete user journeys

Never Using Real Screen Readers:

Bad: Rely only on automated simulation
Good: Test with NVDA, JAWS, or VoiceOver

User Testing

πŸ‘₯ Testing with Real Users

Recruiting:

  • Disability organizations

  • User testing platforms

  • Local communities

  • Accessibility advocates

Test Scenarios:

  • Complete a purchase

  • Fill out a contact form

  • Navigate to specific content

  • Use search functionality

  • Create an account

What to Observe:

  • Where do users get stuck?

  • What causes confusion?

  • What do they skip?

  • What takes too long?

  • What works well?

Compensation:

  • Pay users fairly for their time

  • Typical rate: $50-100/hour

  • Provide flexible scheduling

  • Offer remote options

Building a Testing Workflow

πŸ”„ Continuous Testing Process

Daily:

  • Linting during development

  • Unit test runs

  • Local accessibility checks

Per Pull Request:

  • Automated CI tests

  • Component-level checks

  • Developer keyboard test

Weekly:

  • Automated site scans

  • New feature manual tests

  • Issue triage

Monthly:

  • Comprehensive manual audit

  • Screen reader testing

  • User testing session

Quarterly:

  • Third-party audit

  • WCAG compliance review

  • Strategy assessment

Conclusion

Effective accessibility testing combines automated tools, manual evaluation, and user testing. Automated tools catch technical issues quickly, manual testing validates context and usability, and user testing ensures real-world accessibility.

Recommended Approach:

  1. Start with automated tools (30-40% coverage)

  2. Add keyboard and screen reader testing (60-70% coverage)

  3. Validate with user testing (real-world experience)

  4. Monitor continuously

No single tool or method catches everything - layered testing is essential for true accessibility.
