Part 6 of 6

Why Testing Matters

You can follow all the guidelines in this guide, but the only way to know if your site is actually accessible is to test it. A combination of automated tools and manual testing gives you the best coverage.

Here's the reality: according to WebAIM's analysis of automated testing tools, they can catch around 20–50% of accessibility issues (the exact figure varies by site type and tool combination). The rest require human judgment—like whether alt text is meaningful, whether the tab order makes sense, or whether the site is usable with a screen reader.

So you need both: automated tools for efficiency, and manual testing for thoroughness.

Automated Testing Tools

Run these early and often. They're fast, free (mostly), and catch common mistakes.

WAVE (Web Accessibility Evaluation Tool)

What it is: Browser extension and online service from WebAIM

What it does:

  • Visually annotates your page with icons showing issues
  • Identifies missing alt text, low contrast, missing form labels, ARIA errors
  • Shows document structure (headings, landmarks)
  • Color-codes issues (errors vs. alerts vs. features)

How to use:

  1. Install the WAVE browser extension
  2. Navigate to your page
  3. Click the WAVE icon
  4. Review the sidebar for errors and warnings

Best for: Quick visual overview of accessibility issues

axe DevTools

What it is: Browser extension from Deque Systems

What it does:

  • Integrates with browser DevTools
  • Runs automated WCAG checks
  • Provides detailed issue descriptions and remediation guidance
  • Can be integrated into CI/CD pipelines (axe-core)

How to use:

  1. Install axe DevTools extension
  2. Open DevTools (F12)
  3. Go to the "axe DevTools" tab
  4. Click "Scan ALL of my page"
  5. Review issues sorted by severity

Best for: Detailed technical analysis during development

Lighthouse

What it is: Built into Chrome DevTools

What it does:

  • Runs automated checks for performance, SEO, and accessibility
  • Gives a scored report (0–100)
  • Identifies opportunities for improvement

How to use:

  1. Open Chrome DevTools
  2. Go to "Lighthouse" tab
  3. Select "Accessibility" category
  4. Click "Analyze page load"

Best for: Quick snapshot and tracking improvement over time

Other Useful Tools

  • Accessibility Insights (Microsoft): Automated checks + guided manual tests
  • HTML Validator: Catches markup errors that can cause accessibility issues
  • Contrast Checkers: WebAIM, Colorable, Who Can Use
  • Browser DevTools: Most browsers now show contrast ratios in color pickers

Running axe-core in Your CI/CD Pipeline

Integrating automated accessibility checks into CI prevents regressions from shipping unnoticed. Here's a GitHub Actions example that scans your built site with @axe-core/cli on every push:

# .github/workflows/accessibility.yml
name: Accessibility check

on: [push, pull_request]

jobs:
  a11y:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install dependencies
        run: npm ci

      - name: Build site
        run: npm run build  # or: bundle exec jekyll build

      - name: Serve built site
        run: |
          npx serve _site -l 8080 &
          sleep 3  # give server time to start

      - name: Run axe-core accessibility scan
        run: npx @axe-core/cli http://localhost:8080 --exit

The --exit flag makes axe return a non-zero exit code when violations are found, failing the CI job. You can scope the scan to specific pages by listing multiple URLs, or use --include / --exclude to target specific selectors. For React or Next.js apps, consider jest-axe for component-level testing in your unit test suite instead.

Manual Testing Techniques

These tests catch issues automation misses.

1. Keyboard-Only Test

How: Put away your mouse and navigate using only the keyboard.

What to check:

  • Can you reach every interactive element with Tab?
  • Is focus visible at all times?
  • Does the tab order make sense?
  • Can you activate all buttons, links, and controls with Enter/Space?
  • Can you open and close dropdowns, modals, and menus?
  • Can you escape from modals and widgets?
  • Does the skip link appear and work?

Time needed: 5–10 minutes per page

2. Screen Reader Test

How: Use NVDA (Windows) or VoiceOver (Mac) to navigate your site.

What to check:

  • Is all content announced?
  • Are headings identified correctly (and in the right order)?
  • Are images described with alt text?
  • Are form labels announced with their inputs?
  • Are button purposes clear?
  • Are state changes announced (expanded/collapsed, checked/unchecked)?
  • Is the reading order logical?
  • Can you navigate by headings, landmarks, and links?

Time needed: 15–20 minutes per page (when learning)

Tip: You don't need to be an expert screen reader user—even basic testing reveals major issues.

3. Zoom and Reflow Test

How: Zoom to 200% and 400% in your browser (Ctrl/Cmd +)

What to check:

  • Is text still readable?
  • Does content reflow or does it require horizontal scrolling?
  • Are buttons and controls still usable?
  • Is anything cut off or overlapping?
  • Does the site work at 320px viewport width (mobile)?

Time needed: 5 minutes per page

4. Color and Contrast Tests

How: Use various tools and simulations

What to check:

  • Contrast check: All text meets 4.5:1 (or 3:1 for large text)
  • Grayscale view: Can you tell what's interactive without color?
  • Color blindness simulation: Is information still clear? (Use the Color Oracle desktop app or a browser extension like Colorblinding)
  • High Contrast Mode (Windows): Test with Windows High Contrast enabled

Time needed: 10 minutes initially, then check new components
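The ratios the checkers report come from WCAG's relative luminance formula, so you can also compute them yourself. A minimal sketch in plain JavaScript (hex colors only, no library assumed):

```javascript
// Relative luminance of a hex color like "#1a1a1a", per the WCAG 2.x definition.
function relativeLuminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearize the sRGB channel value
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio between foreground and background: (lighter + 0.05) / (darker + 0.05)
function contrastRatio(fg, bg) {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio('#000000', '#ffffff').toFixed(2)); // 21.00 (maximum possible)
console.log(contrastRatio('#777777', '#ffffff').toFixed(2)); // 4.48 (fails AA for normal text)
```

Remember the thresholds: 4.5:1 for normal text, 3:1 for large text and UI components.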

5. Mobile and Touch Testing

What to check:

  • Are touch targets at least 24x24px (the WCAG 2.2 AA minimum; 44x44px is the stricter AAA recommendation)?
  • Is there enough spacing between interactive elements?
  • Can you use the site with just touch (no hover states that break functionality)?
  • Do complex gestures have simpler alternatives?

Building Accessibility Into Your Workflow

Don't treat accessibility as something you check at the end. Bake it into every stage.

During Design

  • Choose color palettes that meet contrast requirements
  • Include focus states in design mockups
  • Design for keyboard navigation
  • Plan for mobile and zoomed views
  • Use accessible design system components
  • Annotate designs with accessibility notes (focus order, ARIA roles)

During Development

  • Use semantic HTML by default
  • Run linters (ESLint with jsx-a11y for React)
  • Test components with keyboard as you build them
  • Run WAVE or axe on every page/component
  • Write meaningful alt text and labels
  • Test with browser zoom frequently
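For the linting step above, a minimal sketch of an ESLint configuration, assuming eslint and eslint-plugin-jsx-a11y are already installed (the alt-text override is just an example of tightening one rule):

```json
{
  "plugins": ["jsx-a11y"],
  "extends": ["plugin:jsx-a11y/recommended"],
  "rules": {
    "jsx-a11y/alt-text": "error"
  }
}
```

The recommended preset catches common JSX mistakes (missing alt text, unlabeled controls, invalid ARIA) at build time, before they ever reach a browser.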

During QA

  • Include accessibility in definition of done
  • Run full automated scan
  • Do keyboard test on key user flows
  • Spot-check with screen reader
  • Verify contrast on all new UI
  • Test on actual mobile devices

Before Launch

  • Comprehensive automated scan
  • Manual testing of critical user flows
  • Screen reader testing of main pages
  • Consider expert audit (if budget allows)
  • Gather feedback from users with disabilities (if possible)

After Launch

  • Monitor for accessibility regressions (automated tests in CI)
  • Provide accessibility feedback mechanism
  • Conduct periodic audits
  • Test new features thoroughly
  • Stay updated on WCAG changes

Creating an Accessibility Testing Checklist

Here's a quick checklist you can use for any page:

Quick Check (10 minutes)

  • ☐ Run automated tool (WAVE or axe)
  • ☐ Tab through page—can you reach and use everything?
  • ☐ Is focus visible everywhere?
  • ☐ Check main text contrast

Thorough Check (30–45 minutes)

  • ☐ Run multiple automated tools
  • ☐ Full keyboard navigation test
  • ☐ Screen reader test (headings, landmarks, forms)
  • ☐ Zoom to 200% and 400%
  • ☐ Check all text and UI component contrast
  • ☐ Verify all images have appropriate alt text
  • ☐ Test forms for labels and error messages
  • ☐ Test any dynamic content (modals, dropdowns, etc.)
  • ☐ Grayscale test
  • ☐ Mobile/touch test

Common Issues and How to Fix Them

Missing Alt Text

Error: Image has no alt attribute

Fix: Add alt="description" or alt="" for decorative images

Low Contrast

Error: Text contrast ratio 2.8:1 (needs 4.5:1)

Fix: Use darker text or lighter background to reach 4.5:1

Missing Form Label

Error: Form input has no associated label

Fix: Add <label for="inputID"> or aria-label

Empty Link Text

Error: Link has no text content

Fix: Add descriptive text or aria-label="description"

Missing Page Title

Error: Document has no <title>

Fix: Add <title>Descriptive Page Title</title> in <head>

Skip Link Missing

Error: No way to bypass navigation

Fix: Add skip link as first focusable element
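Several of these fixes combined in one hypothetical page fragment (the file name, IDs, and class name are illustrative, not required):

```html
<!-- Skip link: first focusable element on the page -->
<a class="skip-link" href="#main">Skip to main content</a>

<main id="main">
  <!-- Informative image gets a description; decorative images get alt="" -->
  <img src="chart.png" alt="Bar chart of monthly orders, peaking in December">

  <!-- Label programmatically associated with its input -->
  <label for="email">Email address</label>
  <input id="email" type="email" name="email">

  <!-- Icon-only link given an accessible name -->
  <a href="/settings" aria-label="Account settings">⚙</a>
</main>
```

Pair the skip link with CSS that keeps it off-screen until it receives focus, and remember the page still needs a descriptive <title> in its <head>.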

Learning from Real Users

Automated tools and developer testing catch technical failures, but they can't tell you whether a person with a disability can actually accomplish their goals on your site. Testing with real users closes that gap and consistently surfaces issues no audit would find.

Recruiting Participants

The hardest part is finding participants. A few routes that work:

  • Specialist recruitment services: Fable and Access Works maintain panels of participants with disabilities who are experienced in providing usability feedback. They handle recruitment, scheduling, and incentive logistics.
  • Disability-led organisations and user groups: Local blindness societies, deaf associations, and neurodiversity networks often have members willing to participate in research. Contact them early and build a relationship rather than a one-off transaction.
  • Your own users: If your product already has users with disabilities, reach out directly through your accessibility feedback channel (you should have one).

Aim for 3–5 participants per assistive technology type for qualitative research. Even a single session with a screen reader user will surface more actionable issues than a day of automated testing.

What to Observe

Give participants realistic tasks — not "find the contact page" but "you need to update your delivery address before your order ships tomorrow." Watch for:

  • Moments where they pause, backtrack, or express frustration
  • Strategies they use to work around problems (workarounds indicate real pain)
  • Anything they don't notice or misinterpret
  • How long task completion takes compared to a non-disabled user

Don't prompt or help during tasks — the point is to see where the design fails, not to demonstrate that you can rescue it.

Compensating Participants Fairly

People with disabilities are professionals providing skilled labour. Pay them the same rate you would any UX research participant — typically £50–£100 (or local equivalent) for a one-hour session. Underpaying or asking for "volunteer" contributions devalues their expertise and makes future recruitment harder for the whole industry.

Continuous Feedback

Beyond formal research, add an accessible feedback mechanism to your site — a clearly labelled "Accessibility feedback" link in the footer pointing to a simple contact form. Some of your most valuable accessibility bug reports will come from users encountering real barriers.

Staying Current

Accessibility is an ongoing commitment:

  • WCAG evolves: WCAG 2.2 is current (2023), WCAG 3.0 is in development
  • Tech changes: New HTML elements, ARIA patterns, and assistive tech features emerge
  • Laws update: Legal requirements change and expand

Resources to follow:

  • WebAIM newsletter and blog
  • W3C WAI updates
  • Deque blog
  • A11y Project
  • Accessibility community on Twitter/Mastodon

Key Takeaways

  • Use both automated tools (fast, catches common issues) and manual testing (thorough, catches context issues).
  • Essential tools: WAVE, axe DevTools, Lighthouse, contrast checkers.
  • Essential manual tests: keyboard-only navigation, screen reader, zoom/reflow, contrast.
  • Build accessibility into every stage: design, development, QA, and post-launch.
  • Create and use a testing checklist for consistency.
  • Test with real users with disabilities when possible.
  • Make accessibility part of your definition of done, not an afterthought.
  • Stay current with WCAG updates and best practices.

Final Thoughts

Building an accessible website might seem complex at first, but by breaking it down into principles (Perceivable, Operable, Understandable, Robust) and following the practical guidelines we've covered, it becomes a natural part of quality web development.

Remember:

  • Accessibility is not a checkbox—it's an ongoing commitment to inclusive design
  • Start with the basics (semantic HTML, keyboard access, color contrast) and build from there
  • Test early and often with both tools and real users
  • Every improvement you make helps real people access information and services

By prioritizing accessibility from the start and treating it as a fundamental aspect of quality, you help move the web toward equal access for all. Your effort means someone with a disability can independently use your site as easily as any other user—which is the ultimate measure of success.

An accessible web is better for everyone: it improves usability, SEO, mobile experience, and overall satisfaction. Each step you take makes a real difference in people's ability to access information and services.

Now go build something accessible! The web needs more developers and designers who care about making it work for everyone. Thank you for taking the time to learn about accessibility—your users will appreciate it.

Additional Resources