Screen Reader Testing: NVDA vs. JAWS Findings


Screen readers are how blind and visually impaired people use the web. If your site doesn't work with them, it doesn't work for about 7 million Americans, according to the American Foundation for the Blind.

The two dominant screen readers on Windows are NVDA and JAWS. WebAIM's most recent survey data shows JAWS at 40.5% usage and NVDA at 37.7% among screen reader users. VoiceOver on macOS and iOS accounts for most of the rest.

If you're testing website accessibility, you need to understand how these tools differ. They interpret your code differently. They handle errors differently. They give different experiences to users.

This matters because testing with only one screen reader can miss problems that users of the other will encounter. A site that works perfectly in NVDA might be unusable in JAWS, and vice versa.

The fundamental difference in philosophy

NVDA is free and open source. JAWS costs between $90 and $1,475 per year for a single-user license. But the price difference isn't the main distinction.

The deeper difference is how they interpret markup.

NVDA reads strictly from the DOM and accessibility tree. What you code is what you get. If your HTML is wrong or missing attributes, NVDA exposes those problems directly. It doesn't try to guess what you meant.

JAWS uses heuristics. It tries to compensate for poor markup. If a label is missing, JAWS might infer one from context. If code is messy, JAWS attempts to smooth it over.

This creates a trade-off. NVDA is better for catching code issues during development. Its strictness means if something works in NVDA, you can be confident your markup is correct. JAWS is better for understanding how users might actually experience your site, because it attempts to work around imperfections the way a human might.

But that same heuristic approach can mask problems. If JAWS fixes your bad code, you might never know your markup is broken until a user tries your site with NVDA.
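A minimal sketch of how this plays out in markup (a hypothetical snippet; exact announcements vary by screen reader version and browser):

```html
<!-- Unassociated text label: NVDA typically announces just "edit",
     while JAWS may infer "Email" from the nearby text. -->
Email: <input type="text">

<!-- Explicit association: both screen readers reliably announce the label. -->
<label for="email">Email</label>
<input type="text" id="email">
```

If only JAWS is used during testing, the first version can sound fine even though the accessible name is missing from the code.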

What the market share actually means

NVDA has grown significantly. A February 2026 WebAIM survey found that 72% of respondents selected NVDA as their most commonly used desktop or laptop screen reader. JAWS was at 60.5% in some measurements. The numbers overlap because users often have multiple screen readers installed and switch depending on context.

Blindaccessjournal.com ran a seven-day experiment in February 2026 in which a longtime JAWS user switched to NVDA exclusively. The writer, who remained anonymous, documented the transition in detail.

What they found: NVDA felt faster and more responsive in many applications. The add-on ecosystem, particularly Vision Assistant Pro, provided capabilities JAWS doesn't have. But the learning curve was steep. Muscle memory fought them constantly. Browse Mode and Focus Mode switching caused repeated confusion.

That personal account matches the broader data. NVDA is gaining ground because it's free, actively developed, and frequently updated. JAWS maintains its enterprise footprint because of advanced scripting capabilities and professional support.

How they handle navigation modes

Both screen readers switch between different modes for different tasks. Understanding these modes is essential for testing, because elements must work correctly in every mode.

Browse mode (the Virtual PC Cursor in JAWS) lets users navigate through content with the arrow keys. Single-letter shortcut keys jump to specific element types: H for headings, B for buttons, L for lists.

Focus mode (called Forms Mode in JAWS) passes keystrokes directly to the browser so users can interact with form fields and custom widgets.

The JAWS user who switched to NVDA struggled with this constantly. In their day two journal entry: "After submitting a prompt to Gemini and hearing its reply, I pressed H to navigate to the heading where the response started. NVDA just said 'h' and sat there. I was still in Focus Mode. Insert+Space toggled Browse Mode on and then everything worked — but I had to consciously remember to do that."

JAWS handles this transition more automatically with its Semi Auto Forms Mode. NVDA requires the user to know which mode they're in and switch when needed.

Command structure differences

The modifier keys are different. JAWS uses the Insert key (sometimes called the JAWS key). NVDA also uses Insert by default but allows extensive remapping through its Input Gestures system.

Common commands:

 

Action                     | NVDA                     | JAWS
Start reading continuously | Insert + Down Arrow      | Insert + Down Arrow
Stop speech                | Ctrl                     | Ctrl
Next heading               | H                        | H
Next landmark              | D                        | R
List all headings          | NVDA + F7                | Insert + F6
List all links             | NVDA + F7, then Alt + L  | Insert + F7
Toggle modes               | NVDA + Space             | Insert + Z

The JAWS user who switched to NVDA found the differences painful. "The NVDA find command in Browse Mode is Control+NVDA+F — not Control+F — which felt deeply wrong. I added Control+F, F3, and Shift+F3 under Preferences > Input Gestures."

They also nearly had a heart attack when Insert+Q, which they expected to announce the active application, instead exited NVDA entirely. They enabled exit confirmation immediately.

What the test results show

PowerMapper ran extensive screen reader compatibility tests on form labeling, images, and links in December 2025. Their results show where each tool excels.

JAWS with Chrome: 100% reliability across tested interaction modes. This combination performed perfectly on labeling tests.

NVDA with Chrome: Also 100% reliability. Both tools at their best with modern browsers.

NVDA with Firefox: 100% reliability. NVDA has historically been optimized for Firefox, and these results confirm that combination remains strong.

JAWS with Firefox: 94% reliability. Not a failure, but slightly less consistent than Chrome.

JAWS with Edge: 98% reliability. Still excellent but not perfect.

NVDA with Internet Explorer: 55% reliability. This matters less as IE usage declines, but it shows how older browsers break accessibility.

The test results also revealed that some combinations work better for specific tasks. For labeling forms and images, JAWS Chrome and NVDA Chrome both scored 100%. For older browser combinations, reliability dropped significantly.

The ARIA handling difference

A WebAIM mailing list discussion from August 2025 highlighted a specific bug affecting both screen readers. Steve Green, Managing Director of Test Partners Ltd, posted about a strange interaction.

The code was simple:

```html
<a href="#" aria-label="foo">
  <h3>bar</h3>
</a>
```

You might expect a screen reader to announce a link called "foo" containing a heading called "bar". Or possibly a heading/link called "foo".

Instead, both JAWS and NVDA just announced a link called "foo" and ignored the heading entirely. Remove the aria-label, and they announced a heading/link called "bar" as expected.

Green noted the accessibility tree looked correct. The <a> element had an accessible name of "foo". The heading contents were "bar". This suggested both screen readers were interpreting valid code incorrectly. He was considering reporting the issue to both vendors.

This matters because it shows that even when your code is correct, screen readers can still mishandle it. And when both major tools make the same mistake, it's harder to know whether the problem is your code or theirs.
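One way to sidestep this particular pattern (a suggested workaround, not one proposed in the thread) is to nest the link inside the heading and let the visible text supply the accessible name:

```html
<!-- Heading wraps the link; "bar" is both the heading text and the link name. -->
<h3><a href="#">bar</a></h3>
```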

The academic evidence

A 2024 study published in the British Journal of Visual Impairment compared the effects of JAWS and NVDA on the academic performance of visually impaired students.

The study enrolled 50 severely visually impaired students, split into two age-matched groups. Group A received NVDA. Group B received JAWS. Researchers measured reading acuity, maximum reading speed, and critical print size using the MNREAD acuity chart. They also gathered qualitative data through questionnaires.

Results showed NVDA exhibited better outcomes than JAWS in terms of improved academic activity. The researchers concluded NVDA caters better to individual needs, effectively supports learning processes, and demonstrates higher user appreciation.

This is one study with 50 participants, so it's not definitive. But it suggests that for educational contexts, NVDA's approach may work better for students.

Real-world testing: one user's experience

The blindaccessjournal.com immersion journal provides the most detailed recent comparison from an actual user.

What NVDA does well:

Speed and responsiveness. The user found NVDA frequently felt faster than JAWS, especially in specialized applications like amateur radio logging software.

Deep customizability. The Input Gestures system makes remapping commands relatively easy. The Punctuation/Pronunciation settings give granular control over how symbols are spoken.

The add-on ecosystem. Vision Assistant Pro impressed the user. Pressing NVDA+Alt+V, then O, generated an on-screen description of a web page's layout within seconds. Cross-checking confirmed its accuracy.

Object navigation. Once the user understood NVDA's object model, navigating legacy and non-standard interfaces became manageable. This was particularly useful for applications that don't rely on standard accessibility APIs.

Cost. NVDA is free, actively developed, and open source. The value proposition is extraordinary.

Where NVDA struggles:

Silent focus changes. NVDA occasionally goes silent after closing applications or switching browser tabs. The user found this disorienting and suspected it might be a bug worth filing.

PDF handling. Poorly tagged PDFs expose every formatting flaw without mercy. JAWS historically does more smoothing and preprocessing before errors reach the user.

Missing features. NVDA lacks an equivalent to JAWS's Shift+Insert+F1, which gives detailed browser-level view of an element's tags, attributes, roles, and IDs. This is invaluable for accessibility work, and the user couldn't find a satisfactory equivalent.

Different speech rates per context. In JAWS, the user could run character-level echo at a much higher speech rate than general content. NVDA doesn't appear to support different speech rates per context, so typed characters come through at the same rate as everything else.

The 32-bit revelation

One discovery surprised the JAWS user switching to NVDA: NVDA 2025.3.3, the current stable release, is 32-bit. They had assumed for years they were running a 64-bit screen reader.

This came up when they tried to install a 64-bit version of the Eloquence speech synthesizer. The installation seemed successful, but NVDA kept using Windows OneCore voices. The community pointed out the 32-bit issue. The 64-bit Eloquence addon requires a 64-bit NVDA, which only exists in the 2026 beta builds.

The user grabbed the beta, installed everything, and finally got Eloquence working on NVDA. The 64-bit upgrade is coming in the official 2026.1 release.

Browser combinations matter

Different screen readers work better with different browsers. This isn't just preference. It affects how content is interpreted.

Infor's documentation notes that screen reader vendors historically recommended specific browsers. Freedom Scientific recommended JAWS with Internet Explorer. NV Access recommended NVDA with Firefox.

Those recommendations have evolved as browsers improved. Today, both tools support most modern browsers. But combinations still yield varying results because browsers implement accessibility APIs differently and have varying support for WAI-ARIA.

For testing, this means you need to check multiple combinations. A form that works in JAWS with Chrome might behave differently in JAWS with Firefox. The PowerMapper tests showed JAWS Firefox at 94% reliability compared to JAWS Chrome at 100%.

What this means for testing

If you're testing website accessibility, you need both tools. Here's why.

NVDA catches code problems. Its strict interpretation means if your markup is wrong, you'll hear it wrong. That's valuable for development. You want to know when your code doesn't meet standards.

JAWS shows user experience. Its heuristic approach approximates how real users experience sites, which often includes working around imperfect code. If JAWS struggles with your site, real users probably will too.

The WebAIM mailing list post about the aria-label bug is instructive. Both tools failed the same way, which suggests either the code is problematic in a way that wasn't obvious, or both vendors made similar assumptions. If you'd tested with only one, you might have assumed it was an isolated issue. Testing with both confirmed it was consistent across the major Windows screen readers.

The overlay problem

Some testing guides recommend overlays or automated tools as substitutes for screen reader testing. They're not substitutes.

Automated testing tools catch approximately 30-40% of accessibility issues, according to TestParty. They can't tell you whether a screen reader user can actually navigate your site. They can't evaluate whether your link text makes sense out of context. They can't detect if your heading order is logical when read aloud.

Screen reader testing is the only way to find those problems. It's also the only way to understand how real users experience your content.

The Viget guide puts it well: "Testing with a screen reader is more than just finding errors; it's seeing the users as real people and having empathy in how they experience a site."

The learning curve trade-off

NVDA is easier to learn. Its interface is more intuitive. Commands are more consistent. The learning curve is gentler.

JAWS requires more training. Its complexity and multiple command sets mean a steeper climb. But it offers professional support to ease the process.

For testing purposes, you don't need to master every feature. You need to know how to navigate, how to find elements, and how to interpret what you hear. Both tools provide that with a few hours of practice.

The user who switched from JAWS noted that by day six of their immersion week, they were using the computer heavily and NVDA just worked. No major incidents. No emergency remappings. They noticed they were thinking of reaching for JAWS less and less.

Command reference for testing

NVDA essential commands:

  • Start NVDA: Ctrl + Alt + N
  • Stop speech: Ctrl
  • Read next item: Down Arrow
  • Next heading: H
  • Next link: K
  • Next landmark: D
  • Next form field: F
  • Next button: B
  • List all headings: NVDA + F7
  • Toggle modes: NVDA + Space

JAWS essential commands:

  • Start reading continuously: Insert + Down Arrow
  • Stop speech: Ctrl
  • Next heading: H
  • Next landmark: R
  • Next form field: F
  • Next button: B
  • List headings: Insert + F6
  • List links: Insert + F7
  • List form fields: Insert + F5
  • Virtual Cursor on/off: Insert + Z

What to test

Screen reader testing should cover:

Page structure: Navigate by headings and landmarks. Is the hierarchy logical? Can you skip to main content? Are regions properly labeled?
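A page that passes these structure checks usually has a skeleton along these lines (a generic sketch, not code from the article):

```html
<body>
  <a href="#main">Skip to main content</a>
  <header>Site header</header>
  <nav aria-label="Primary">Navigation links</nav>
  <main id="main">
    <h1>Page title</h1>
    <h2>First section</h2>
  </main>
  <footer>Site footer</footer>
</body>
```

Navigating by landmarks (D in NVDA, R in JAWS) should land on each labeled region, and the headings list should show a logical hierarchy.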

Links and buttons: Tab through interactive elements. Is link text descriptive? Do buttons announce their purpose? Are decorative images hidden from screen readers?
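For example (hypothetical markup), the first link below fails the out-of-context test while the second passes, and the empty alt hides a purely decorative image:

```html
<!-- Announced in a links list as just "click here" — meaningless out of context. -->
<a href="/report.pdf">Click here</a>

<!-- Descriptive link text, plus a decorative image hidden via empty alt. -->
<a href="/report.pdf">Download the 2025 annual report (PDF)</a>
<img src="divider.png" alt="">
```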

Forms: Navigate to each field. Are labels announced? Are required fields indicated? Are error messages clear and announced?
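A form field that satisfies all three checks might look like this (illustrative markup only):

```html
<label for="zip">ZIP code</label>
<input id="zip" type="text" required
       aria-invalid="true" aria-describedby="zip-error">
<!-- The error is tied to the field, so it is read when the field gets focus. -->
<p id="zip-error">Enter a five-digit ZIP code.</p>
```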

Dynamic content: Trigger updates. Do ARIA live regions announce changes? Do modal dialogs trap focus? Are loading states communicated?
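Two common building blocks to exercise here (generic examples, not from the article):

```html
<!-- Polite live region: text injected here is announced without moving focus. -->
<div id="status" aria-live="polite"></div>

<!-- Modal dialog: aria-modal tells assistive tech the background is inert;
     the script behind it must still trap keyboard focus. -->
<div role="dialog" aria-modal="true" aria-labelledby="dlg-title">
  <h2 id="dlg-title">Confirm deletion</h2>
</div>
```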

Tables: Navigate with table navigation commands. Are headers associated with data cells? Is the table structure logical?
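A table that announces correctly marks its headers with scope. For example (illustrative, using numbers from the PowerMapper section above):

```html
<table>
  <caption>Form labeling reliability by combination</caption>
  <tr><th scope="col">Screen reader</th><th scope="col">Browser</th><th scope="col">Score</th></tr>
  <tr><th scope="row">NVDA</th><td>Chrome</td><td>100%</td></tr>
  <tr><th scope="row">JAWS</th><td>Firefox</td><td>94%</td></tr>
</table>
```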

The Viget guide suggests turning on your preferred screen reader and having it read through the entire page. Does it read all content in a logical order matching the visual layout? Is any content skipped that shouldn't be?

The bottom line

NVDA and JAWS are different tools with different philosophies. NVDA is strict, code-focused, and free. JAWS is heuristic, user-focused, and expensive.

Neither is better overall. They're better for different purposes.

For development testing, start with NVDA. Its strict interpretation will expose your markup problems. Once those are fixed, test with JAWS to see how real users might experience your site, especially if they're in enterprise environments where JAWS is standard.

For enterprise or government projects where you know users rely on JAWS, test with JAWS first. For public websites, test with both.

The user who switched from JAWS to NVDA for a week ended up keeping NVDA as their primary screen reader. But they didn't abandon JAWS entirely. Both tools have their place.

That's the right approach. Use both. Test both. Your users do.

Frequently Asked Questions

Which screen reader should you test with?

Use both. Start with NVDA because it's free and its strict interpretation will expose your markup problems. Once those are fixed, test with JAWS to see how users in enterprise environments experience your site. For public websites, test with both. For enterprise or government projects where users rely on JAWS, test with JAWS first.

What is the core difference between NVDA and JAWS?

NVDA reads strictly from your code. If your HTML is wrong, you'll hear it wrong. JAWS uses heuristics to guess what you meant and tries to work around imperfections. NVDA is better for development testing. JAWS better approximates real user experience. NVDA is free. JAWS costs money. NVDA feels faster to many users. JAWS has more advanced scripting and professional support.

Which browser and screen reader combinations are most reliable?

PowerMapper testing showed JAWS with Chrome and NVDA with Chrome both at 100% reliability. NVDA with Firefox also at 100%. JAWS with Firefox at 94%. JAWS with Edge at 98%. NVDA with Internet Explorer at 55%. You need to test multiple combinations because a site that works in one may fail in another.

What are the essential testing commands?

For NVDA: H for next heading, K for next link, D for next landmark, F for next form field, B for next button, NVDA+F7 for the elements list, NVDA+Space to toggle modes. For JAWS: the same single-letter keys work, Insert+F6 for the headings list, Insert+F7 for the links list, Insert+F5 for form fields, Insert+Z to toggle modes. Both use Insert as the modifier key by default.

Can automated tools replace screen reader testing?

No. Automated tools catch approximately 30-40% of issues. They can't tell you whether a screen reader user can actually navigate your site. They can't evaluate whether link text makes sense out of context. They can't detect if heading order is logical when read aloud. Screen reader testing is the only way to find those problems.

What happened when a longtime JAWS user switched to NVDA?

A longtime JAWS user switched to NVDA for seven days in February 2026 and documented the experience. They found NVDA faster and more responsive. They struggled with mode switching because NVDA requires manually toggling between Browse and Focus modes while JAWS handles this more automatically. They missed JAWS's Shift+Insert+F1 command for detailed element inspection. They discovered NVDA's 32-bit limitation when trying to install a 64-bit speech synthesizer. By day six, they were using NVDA comfortably.

What does academic research say?

The 2024 British Journal of Visual Impairment study enrolled 50 severely visually impaired students. Half used NVDA, half used JAWS. Researchers measured reading acuity, maximum reading speed, and critical print size. They also collected qualitative data. Results showed NVDA exhibited better outcomes than JAWS in terms of improved academic activity. The researchers concluded NVDA caters better to individual needs and demonstrates higher user appreciation.

What was the aria-label bug both screen readers shared?

A WebAIM mailing list post documented a bug where both tools mishandled the same code. An <a href="#" aria-label="foo"><h3>bar</h3></a> link was announced as "foo" with the heading content ignored. The accessibility tree looked correct. This suggests both vendors made similar assumptions or the code has a non-obvious problem. Testing with both tools confirmed the issue was consistent.

Where does NVDA fall short?

NVDA sometimes goes silent after closing applications or switching browser tabs. Its PDF handling exposes every formatting flaw without mercy. It lacks an equivalent to JAWS's Shift+Insert+F1 for detailed element inspection. It doesn't support different speech rates per context, so typed characters come through at the same speed as content. The current stable release is 32-bit, though 64-bit beta versions exist.

Where does JAWS fall short?

JAWS costs money, which can be a barrier. Its heuristic approach can mask code problems, so developers may not realize their markup is broken. It scored slightly lower in Firefox testing at 94% reliability compared to 100% in Chrome. It has a steeper learning curve due to more features and complexity.

What should screen reader testing cover?

Test page structure by navigating headings and landmarks. Is hierarchy logical? Test links and buttons by tabbing through interactive elements. Is link text descriptive? Test forms by navigating each field. Are labels announced? Test dynamic content. Do live regions announce changes? Test tables with table navigation commands. Are headers associated with data cells? Have the screen reader read the entire page. Does content read in logical order matching visual layout?

Which screen reader is more widely used?

WebAIM's February 2026 survey found 72% of respondents selected NVDA as their most commonly used desktop screen reader. JAWS was at 60.5% in some measurements. Numbers overlap because users often have multiple screen readers installed. NVDA is growing because it's free, actively developed, and frequently updated. JAWS maintains its enterprise footprint due to advanced scripting and professional support.

Is one screen reader better than the other?

No. They're different tools for different purposes. NVDA is better for development testing because its strict interpretation exposes code problems. JAWS is better for understanding real user experience because its heuristic approach approximates how users actually encounter sites. For comprehensive accessibility testing, you need both.


Janeth
