Testing with assistive technology: A practical guide to screen reader testing
You've run your automated tests. Axe says zero violations. WAVE shows no errors. Your site passes every checker you can throw at it.
None of that tells you whether a blind person can actually use your site.
Automated tools catch about 30 percent of accessibility issues. They can detect missing alt text and low contrast. They cannot tell you whether the focus order makes sense, whether form error messages are actually helpful, or whether a screen reader user can complete a purchase without getting stuck.
Screen reader testing is the only way to find out. Here's what you need to know to do it right.
Why screen reader testing matters
Screen readers convert digital text into synthesized speech or braille. Users who are blind or have low vision rely on them to navigate the web. JAWS, NVDA, and VoiceOver are the primary tools.
When you test with a screen reader, you're not just checking for technical compliance. You're experiencing your site the way millions of users do. The WebAIM screen reader survey found that 68 percent of users encounter keyboard traps or inaccessible interfaces monthly. That's not a niche problem.
Screen reader testing is specifically required to validate several WCAG success criteria that automated tools cannot assess:
- 1.1.1 Non-text Content: Does the alt text actually describe what's in the image?
- 1.3.1 Info and Relationships: Are headings announced as headings? Are lists announced as lists?
- 2.4.4 Link Purpose: Does the link text make sense out of context?
- 2.4.6 Headings and Labels: Do headings actually describe the content that follows?
- 3.3.1 Error Identification: Are error messages announced and understandable?
- 4.1.2 Name, Role, Value: Do custom controls announce what they are and what they do?
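The last criterion is the one custom widgets fail most often. As an illustrative sketch, a custom toggle built from a div must declare its name, role, and value explicitly, while a native control exposes all three for free:

```html
<!-- A native checkbox exposes name, role, and value automatically -->
<label><input type="checkbox" checked> Email notifications</label>

<!-- A custom toggle must declare all three explicitly.
     tabindex="0" makes it focusable; a real implementation also
     needs Space/Enter key handling in script. -->
<div role="switch"
     aria-checked="true"
     tabindex="0"
     aria-label="Email notifications">
</div>
```

A screen reader should announce each as something like "Email notifications, checkbox, checked" or "Email notifications, switch, on." If the role, state, or accessible name is missing, the control may be announced as nothing at all.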
Choosing your screen readers
You don't need to test with every screen reader on the market. But you need more than one.
VoiceOver comes built into every Mac and iOS device. It's free and requires no installation. VoiceOver works best with Safari; testing it with other browsers can produce unreliable results. To start VoiceOver, press Command + F5.
NVDA is a free, open-source screen reader for Windows. It runs in Firefox, Chrome, and Edge. It's the most common choice for testing because it's free and widely used. To start NVDA, press Control + Alt + N.
JAWS is the most feature-rich screen reader, but it's paid software: a license costs about $1,000. JAWS has been on the market for over 20 years and includes features the others don't, such as custom dictionary entries that adjust how text is announced. JAWS offers a 40-minute demo mode that's sufficient for testing.
Most of the time, correctly coded content works across different combinations. But discrepancies do occur. If you encounter an issue in one testing environment, have another team member double-check it with a different browser and assistive technology to confirm the result is repeatable.
VoiceOver also behaves differently from JAWS and NVDA: it can sometimes detect inaccessible code and attempt to repair it on the fly, which can mask problems that other screen readers will expose. That's why testing with a single screen reader isn't enough. You need to confirm that your code communicates the same information across different assistive technologies, browsers, and operating systems.
Setting up for testing
Before you start testing, prepare your environment.
If you're on a Mac, use Safari with VoiceOver. If you're on Windows, use Firefox or Chrome with NVDA or JAWS. Always start the screen reader before opening the browser to ensure proper connections are established.
Turn off your monitor if possible, or cover it. You need to rely on what you hear, not what you see. What you see and what the screen reader announces can be very different.
If NVDA's mouse tracking interferes with testing, turn it off with Insert + M. You can adjust the speech rate under Preferences > Settings > Speech in NVDA.
Testing sequence: a structured approach
Professional accessibility testing follows a sequence. Here's what to do, in order.
1. Listen to the entire page
Start the screen reader and let it read the entire page from top to bottom.
In NVDA, press Insert + Down Arrow to start reading continuously. Press Control to stop.
Listen for:
- Does the page title accurately describe the page?
- Is any content skipped that shouldn't be?
- Is content read in a logical order that matches the visual layout?
- Are images announced with meaningful alt text, or just filenames?
- Are navigation regions and main content announced properly?
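The last check depends on landmarks. As a minimal sketch, markup like this gives a screen reader distinct banner, navigation, main, and footer regions to announce:

```html
<header>Site banner: logo, search</header>
<!-- aria-label distinguishes this nav if the page has more than one -->
<nav aria-label="Primary">Main navigation links</nav>
<main>
  <h1>Page title</h1>
  Page content
</main>
<footer>Site footer</footer>
```

If everything on the page lives in anonymous div elements instead, the screen reader has no regions to report and users lose a major navigation shortcut.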
2. Navigate by headings
Screen reader users often navigate by headings to get an overview of page structure and jump to relevant sections.
In NVDA, press H to move to the next heading, Shift + H for the previous one. Open the Elements List with Insert + F7 and filter by headings.
In VoiceOver, press Control + Option + Command + H for the next heading.
Check that:
- Headings exist and are used appropriately
- Heading hierarchy is logical and doesn't skip levels (H1 to H2 to H3, not H1 to H3)
- Headings actually describe the content that follows
- There's exactly one H1 per page that matches the page title
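As an illustrative sketch, the outline a screen reader user hears maps directly to the heading elements in the markup (headings and page name are hypothetical):

```html
<h1>Checkout</h1>           <!-- exactly one H1, matching the page title -->
<h2>Shipping address</h2>
<h2>Payment</h2>
<h3>Card details</h3>       <!-- H3 follows an H2: no skipped level -->
<h2>Order summary</h2>
```

Note that heading elements are siblings in the markup; the hierarchy comes entirely from their levels, which is exactly what the screen reader's headings list reflects.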
3. Navigate by links
Users often pull up a list of all links on a page to navigate quickly.
In NVDA, press K for the next link, Shift + K for the previous one. Open the Elements List with Insert + F7 and filter by links.
Check that:
- Every link has discernible text
- Link text makes sense out of context (no "click here" or "read more" without context)
- The purpose of each link is clear from its text alone
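For example, a vague link can be made self-describing either by rewriting its visible text or by extending its accessible name (URLs and wording here are illustrative):

```html
<!-- Fails out of context: the links list just reads "Read more" -->
<a href="/pricing">Read more</a>

<!-- Better: the text alone identifies the destination -->
<a href="/pricing">Read more about pricing plans</a>

<!-- Alternative: keep the short visible text but extend the accessible
     name. The accessible name should still begin with the visible text
     (WCAG 2.5.3 Label in Name). -->
<a href="/pricing" aria-label="Read more about pricing plans">Read more</a>
```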
4. Test keyboard navigation
Screen reader testing is inseparable from keyboard testing. Screen readers rely entirely on keyboard commands.
Tab through every interactive element on the page.
Check that:
- All links and functionality are keyboard accessible
- Focus order is logical and matches visual layout
- Focus is always visible
- No keyboard traps exist—you can navigate into and out of every component
- Skip links work and are visible when focused
- Focused elements are never completely hidden by sticky headers, chat widgets, or overlays
- Dropdown menus, modals, and other components work with standard keyboard patterns (arrow keys, Enter, Space, Escape)
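Skip links are easy to verify once you know the usual pattern. A sketch of a common implementation (class name and CSS are illustrative): the link is the first focusable element, visually hidden until it receives keyboard focus:

```html
<!-- First focusable element in the document -->
<a class="skip-link" href="#main">Skip to main content</a>

<!-- tabindex="-1" lets the target take focus when the link is activated -->
<main id="main" tabindex="-1">Page content</main>

<style>
  /* Moved off-screen until it receives keyboard focus */
  .skip-link { position: absolute; left: -10000px; }
  .skip-link:focus { position: static; left: auto; }
</style>
```

Press Tab once on a fresh page load: the skip link should appear, and activating it should move focus past the navigation into the main content.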
5. Test forms
Forms are where most accessibility failures occur. Test them thoroughly.
Navigate to each form field using Tab.
Check that:
- Every form field has a visible label that's programmatically associated (using for/id or aria-labelledby)
- Required fields are clearly identified both visually and to screen readers
- Instructions are easy to find and understand
- Placeholder text is not used as a substitute for labels
- Autocomplete attributes are used where appropriate
- The accessible name of each control includes its visual label
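As an illustrative sketch (field names are hypothetical), a text input that satisfies this checklist:

```html
<label for="email">Email address (required)</label>
<!-- The label is tied to the input via for/id, so focusing the field
     announces roughly "Email address (required), edit text" -->
<input id="email"
       name="email"
       type="email"
       autocomplete="email"
       required
       placeholder="name@example.com">
<!-- The placeholder is only a hint; the label carries the name -->
```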
Now test error handling. Submit the form with errors: leave required fields blank, enter invalid data.
Check that:
- Error messages are announced to screen readers
- Error messages are clearly visible near the relevant input
- Errors are specific and actionable, not just "invalid input"
- Focus moves to error messages or confirmation messages after submission
- Success messages are informative, not just visual cues
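One common pattern that satisfies these checks, sketched with hypothetical field names: mark the field invalid and tie the message to it with aria-describedby:

```html
<label for="zip">ZIP code</label>
<input id="zip"
       type="text"
       inputmode="numeric"
       aria-invalid="true"
       aria-describedby="zip-error">
<!-- Announced after the label whenever the field receives focus -->
<span id="zip-error">Enter a 5-digit ZIP code, for example 90210.</span>
```

On submission, scripts typically move focus to the first invalid field or to an error summary at the top of the form so the failure is announced immediately.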
6. Test tables
If your site uses data tables, they need proper markup.
In NVDA, press T to move to the next table.
Navigate table cells with Control + Alt + Arrow keys.
Check that:
- Table headers use <th> with appropriate scope attributes
- The screen reader announces column and row headers before cell content
- Cell coordinates and content are announced correctly
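A minimal sketch of the markup this requires (data is illustrative); with scope set, moving to a data cell announces the relevant headers before its content:

```html
<table>
  <caption>Quarterly revenue</caption>
  <tr>
    <th scope="col">Quarter</th>
    <th scope="col">Revenue</th>
  </tr>
  <tr>
    <th scope="row">Q1</th>
    <td>1.2M</td> <!-- announced roughly as "Q1, Revenue, 1.2M" -->
  </tr>
</table>
```

Without the th elements and scope attributes, the screen reader reads bare cell values and users lose track of what each number means.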
7. Test dynamic content
Modern sites update content without page reloads. These updates must be announced.
Check that:
- Loading indicators are announced to screen readers
- Dynamic content changes are announced via ARIA live regions without disrupting focus
- Status messages are announced without taking focus
- Client-side routing updates the page title and moves focus appropriately
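Status messages that announce without moving focus are typically implemented as ARIA live regions, roughly like this sketch:

```html
<!-- Present in the DOM at page load so assistive technology registers it;
     role="status" implies aria-live="polite" -->
<div role="status">
  <!-- Inject text such as "3 results found" here after an update;
       it is announced without interrupting or moving focus -->
</div>

<!-- For urgent messages, role="alert" (aria-live="assertive")
     interrupts the current speech instead -->
```

A common failure: creating the live region at the same moment the message is injected, so the screen reader never registers it and nothing is announced.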
Testing WCAG 2.2 criteria with screen readers
WCAG 2.2 added several criteria that require screen reader testing.
3.3.7 Redundant Entry: Test multi-step processes. Are you asked to re-enter information you already provided? Information previously entered should be auto-populated or available for selection.
3.2.6 Consistent Help: If help mechanisms appear on multiple pages, do they appear in the same relative order? Navigate to several pages and verify that help links, chat widgets, and contact information are consistently positioned.
3.3.8 Accessible Authentication: Test your login process. Does it rely on cognitive function tests like CAPTCHAs or puzzles? If so, are accessible alternatives provided?
2.5.3 Label in Name (carried over from WCAG 2.1): For controls with visible labels and programmatic names, does the programmatic name include the visible label text? Test by focusing controls and listening to what's announced.
Common issues you'll find
Screen reader testing reveals problems automated tools miss.
Missing or inadequate alt text. Automated tools detect missing alt attributes. They cannot tell you whether "image123.jpg" or "product" is useful alt text. Listen to what's announced. Is it actually descriptive?
Heading structure problems. Automated tools detect heading tags. They cannot tell you whether the heading hierarchy makes sense or whether headings actually describe the content. Listen to the headings list. Does it provide a useful outline of the page?
Link text that's not descriptive. "Click here" and "read more" pass automated checks. They fail for screen reader users who navigate by links. Listen to the links list. Do you know where each link goes?
Form labeling issues. Automated tools detect missing labels. They cannot tell you whether the label is actually associated with the right field, whether instructions are clear, or whether error messages are helpful. Fill out forms with errors and listen to what happens.
Inaccessible custom components. Automated tools flag custom controls. They cannot tell you whether the ARIA implementation actually works. Test dropdowns, modals, tabs, and sliders thoroughly.
Focus management failures. Automated tools detect missing focus indicators. They cannot tell you whether focus moves logically, whether modals trap focus correctly, or whether focus returns to the right place after closing something. Tab through every interaction.
The limitations of screen reader testing
Screen reader testing is essential, but it has limitations.
First, screen readers behave differently across browsers and operating systems. Something that works in VoiceOver on Safari might fail in NVDA on Chrome. Testing with multiple combinations is necessary.
Second, screen reader proficiency takes time. If you're new to screen readers, you'll struggle at first. The speech is fast. The commands are complex. Practice before conducting formal testing.
Third, screen reader testing cannot replace testing with actual users who are blind. As a sighted tester, you'll never fully replicate the experience of someone who relies on screen readers daily. User testing with people who have disabilities is irreplaceable.
Fourth, screen reader testing addresses only a subset of WCAG criteria. You still need to test color contrast, zoom responsiveness, mobile interactions, and cognitive accessibility.
Integrating screen reader testing into development
The most effective approach is to test early and often.
The EBSCOed Accessibility Definition of Done requires VoiceOver validation during development for every feature before merging to production. Developers test with VoiceOver on Mac using Safari. Comprehensive multi-screen reader testing with JAWS, NVDA, and mobile screen readers happens during third-party audits.
This approach catches issues when they're cheapest to fix—during development, not after launch.
Quick reference: screen reader commands
VoiceOver (Mac)
| Task | Command |
|---|---|
| Start/Stop | Command + F5 |
| Pause/Resume | Control |
| Next item | Control + Option + Right Arrow |
| Previous item | Control + Option + Left Arrow |
| Next heading | Control + Option + Command + H |
| Next link | Control + Option + Command + L |
| Click/activate | Control + Option + Spacebar |
NVDA (Windows)
| Task | Command |
|---|---|
| Start | Control + Alt + N |
| Stop | Insert + Q |
| Start continuous reading | Insert + Down Arrow |
| Stop reading | Control |
| Next heading | H |
| Next link | K |
| Next form field | F |
| Next graphic | G |
| Open elements list | Insert + F7 |
| Next table | T |
| Navigate table cells | Control + Alt + Arrow keys |
| Toggle mouse tracking | Insert + M |
JAWS (Windows)
| Task | Command |
|---|---|
| Start continuous reading | Insert + Down Arrow |
| Stop reading | Control |
| Next heading | H |
| Next unvisited link | U |
| Next landmark | R |
| Navigate table cells | Control + Alt + Arrow keys |
The bottom line
Automated testing is a starting point. Screen reader testing is how you know your site actually works.
The Government of British Columbia's testing guide recommends a sequence: automated evaluation first, then manual keyboard testing, then screen reader testing, then magnification and mobile testing, and finally user testing with people with disabilities.
That's the right order. Automated tools catch what they can. Keyboard testing ensures basic operability. Screen reader testing reveals whether the experience actually makes sense for users who can't see the screen.
If you're new to screen reader testing, start with VoiceOver on Mac or NVDA on Windows. Learn the basic navigation commands. Test your own site. You'll find things you never knew were broken.
Then fix them. That's how accessibility happens.