Where ADA website compliance should be vs. where it usually sits
ADA website compliance has a floor and a ceiling. Most businesses live somewhere in the middle and call it “good enough.” Courts don’t.
The gap between what accessibility standards require and what most vendors deliver is wide, measurable, and well documented in lawsuits, audits, and settlements from the last few years. This article lays that gap out plainly. No abstractions. No promises. Just numbers, practices, and consequences.
The comparison here is simple:
- Where ADA website compliance should be, based on how courts, regulators, and plaintiffs actually evaluate sites
- Where the average “ADA compliant” benchmark lands in real-world audits
- Where our internal benchmark sits relative to both
This isn’t a pitch. It’s an accounting.
The legal baseline everyone pretends is flexible
ADA website cases under Title III almost always rely on the Web Content Accessibility Guidelines (WCAG) 2.1 Level AA as the measuring stick.
That standard is not law. Courts still accept it.
The U.S. Department of Justice has referenced WCAG 2.0 and 2.1 repeatedly in consent decrees and enforcement actions going back more than a decade. Judges treat it as settled ground.
When a complaint says a site fails WCAG 2.1 AA, defendants rarely argue the benchmark. They argue standing, scope, or timing. The standard itself stays intact.
That’s where compliance should start.
What “ADA compliant” usually means in practice
In most audits conducted between 2023 and 2025, “ADA compliant” means one of three things:
- An automated scan passes without critical errors
- A site scores high on Lighthouse accessibility
- A plugin or overlay is installed
None of these meet the legal or functional standard courts apply.
Automated tools catch roughly 25 to 35 percent of WCAG failures. Lighthouse checks even fewer. Overlays don’t change the underlying code.
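For reference, here is roughly what a “passing scan” consists of. This is a minimal sketch assuming Playwright with the @axe-core/playwright package; the URL is a placeholder. A clean run covers only the machine-detectable subset of WCAG.

```ts
import { chromium } from "playwright";
import AxeBuilder from "@axe-core/playwright";

// Run axe-core's WCAG 2.0/2.1 A and AA rules against a single page.
// A clean result here is the entire basis of many "compliant" claims.
async function scan(url: string): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);

  const results = await new AxeBuilder({ page })
    .withTags(["wcag2a", "wcag2aa", "wcag21aa"]) // restrict to WCAG rules
    .analyze();

  console.log(`${results.violations.length} machine-detectable violation(s)`);
  await browser.close();
}

scan("https://example.com"); // placeholder URL
```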
Still, these benchmarks dominate vendor claims.
The average benchmark across commercial sites
Based on aggregated audits across retail, professional services, healthcare, automotive, and local government sites between 2024 and 2025, the average accessibility posture looks like this:
- Automated scan passes: yes
- Manual keyboard testing: partial
- Screen reader testing: minimal
- Third-party widgets tested: rarely
- Ongoing monitoring: no
In raw numbers, the average site that claims ADA compliance still fails between 18 and 42 WCAG 2.1 Level AA success criteria, depending on complexity.
Most of those failures cluster in the same areas:
- Keyboard access
- Focus management
- Form error handling
- Contrast
- Dynamic content announcements
These are not obscure issues. They’re the same ones cited in complaints.
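Two of those, form error handling and dynamic content announcements, usually come down to a few missing attributes. A minimal sketch, assuming the page already contains a status element; the element IDs are illustrative.

```ts
// Announce a validation failure instead of letting it happen silently.
// Assumes the markup contains <div id="form-status" role="alert"></div>;
// role="alert" is an implicit assertive live region, so assigning new
// text to it is announced by screen readers automatically.
function announceError(message: string, field: HTMLInputElement): void {
  const status = document.getElementById("form-status");
  if (status) status.textContent = message;

  field.setAttribute("aria-invalid", "true"); // flag the failing field
  field.setAttribute("aria-describedby", "form-status"); // tie message to it
  field.focus(); // move keyboard focus to the point of correction
}
```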
Where compliance should be if lawsuits are the metric
Courts don’t evaluate compliance the way tools do. They evaluate it through user experience narratives.
A site that should be considered compliant in 2025 typically meets these conditions:
- A keyboard-only user can complete primary tasks
- A screen reader user receives equivalent information and feedback
- Errors are announced, not silent
- Visual information is not the only way content is conveyed
- Third-party tools do not block access
This requires manual testing. It requires regression monitoring. It requires ownership.
There is no shortcut.
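As an example of what the manual layer looks like when made repeatable, here is a keyboard-only walkthrough of one task, sketched with Playwright’s test runner. The URL and selectors are stand-ins for a real scheduling flow.

```ts
import { test, expect } from "@playwright/test";

// Complete a primary task with the keyboard alone: no click events,
// no mouse. If the flow stalls, a keyboard-only user is blocked.
test("scheduling form works without a mouse", async ({ page }) => {
  await page.goto("https://example.com/schedule"); // placeholder URL

  await page.keyboard.press("Tab");   // reach the first form field
  await page.keyboard.type("Jane Doe");
  await page.keyboard.press("Tab");   // advance to the submit control
  await page.keyboard.press("Enter"); // activate it without a click

  // Success must be perceivable: announced or focused, never silent.
  await expect(page.locator("[role='alert']")).toBeVisible();
});
```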
The gap between “passes a scan” and “survives a lawsuit”
This gap is where most defendants lose.
Automated benchmarks reward surface fixes. Lawsuits expose workflow failures.
A site can pass 100 percent of automated checks and still block:
- Scheduling
- Checkout
- Account creation
- Form submission
Plaintiffs establish standing through those failures, not through missing alt text alone.
Our benchmark: what we measure that others don’t
Our internal benchmark was built backward from lawsuits, not from tools.
Instead of asking “does this pass WCAG checks?”, we asked “what failures show up in complaints, affidavits, and settlements?”
That led to a stricter baseline.
Manual interaction coverage
Our benchmark requires full manual testing of:
- Keyboard-only navigation
- Screen reader workflows using NVDA and VoiceOver
- Error states, not just happy paths
- Focus order through dynamic components
Most competitors stop after spot checks. We don’t.
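“Focus order through dynamic components” as a repeatable test rather than a spot check looks something like the sketch below. The trigger selector and dialog structure are assumptions about a typical scheduling widget.

```ts
import { test, expect } from "@playwright/test";

// Focus must move into a dialog when it opens and return to the trigger
// when it closes. A failure in either direction is a keyboard trap or
// a focus loss.
test("scheduling dialog manages focus", async ({ page }) => {
  await page.goto("https://example.com"); // placeholder URL
  const trigger = page.locator("#open-scheduler"); // hypothetical trigger

  await trigger.click();
  await expect(page.locator("[role='dialog'] button").first()).toBeFocused();

  await page.keyboard.press("Escape");
  await expect(trigger).toBeFocused(); // focus restored on close
});
```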
Third-party accountability
Most vendors disclaim responsibility for third-party tools. Plaintiffs don’t accept that.
Our benchmark includes:
- Inventory systems
- Scheduling widgets
- Chat tools
- Payment and intake forms
If it’s on the page, it’s in scope.
Regression monitoring
Accessibility breaks after launch more than at launch.
Our benchmark includes scheduled rescans and manual retesting after:
- CMS updates
- Plugin updates
- Theme changes
The average benchmark treats compliance as a one-time event.
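In CI terms, that monitoring can be as simple as the sketch below, reusing the same Playwright and axe-core toolchain assumed earlier; the path list is illustrative. An automated rescan only gates the machine-detectable subset, so it schedules manual retesting rather than replacing it.

```ts
import { chromium } from "playwright";
import AxeBuilder from "@axe-core/playwright";

// Rescan key pages after every deploy so CMS, plugin, and theme updates
// cannot ship regressions unnoticed. Paths are illustrative.
const paths = ["/", "/schedule", "/checkout", "/contact"];

async function rescan(baseUrl: string): Promise<number> {
  const browser = await chromium.launch();
  let total = 0;
  for (const path of paths) {
    const page = await browser.newPage();
    await page.goto(baseUrl + path);
    const { violations } = await new AxeBuilder({ page })
      .withTags(["wcag2a", "wcag2aa", "wcag21aa"])
      .analyze();
    total += violations.length;
    await page.close();
  }
  await browser.close();
  return total;
}

rescan("https://example.com").then((count) => {
  if (count > 0) process.exit(1); // fail the build and flag manual retest
});
```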
A concrete comparison from a real audit
In late 2024, a regional service company hired two vendors in parallel. One provided a standard “ADA compliance package.” The other followed our benchmark.
The standard package reported:
- Lighthouse score: 98
- Automated errors: 0 critical
- Compliance status: “Pass”
Our audit found:
- Keyboard trap in scheduling modal
- Error messages not announced
- Focus loss after form submission
- Low-contrast placeholder text
The site settled an ADA claim in mid-2025 for $8,750.
The standard report didn’t help. The deeper benchmark would have.
Where competitors usually draw the line
Most accessibility vendors stop where liability begins.
They exclude:
- PDFs unless requested
- Videos unless flagged
- Edge cases in workflows
- Content created after the audit
That keeps costs down. It also keeps risk high.
Competitor benchmarks are optimized for sales, not defense.
The cost difference that drives behavior
Average pricing in 2025 looked like this:
- Automated scan + report: $500 to $1,500
- “Compliance certification” packages: $2,000 to $4,000
- Manual audit with remediation guidance: $6,000 to $12,000
Our benchmark sits in the third category.
That price gap explains why most businesses choose the first two until they’re sued.
A criticism worth stating plainly
Our benchmark is not cheap. It’s not fast. It’s not minimal.
It also doesn’t guarantee immunity.
No benchmark does.
The trade-off is effort versus exposure. Businesses decide where to land.
Why “average compliant” keeps failing in court
Judges don’t care how common a practice is. They care whether access existed.
“Industry standard” defenses fail regularly because accessibility isn’t evaluated by comparison. It’s evaluated by experience.
If a blind user couldn’t complete a task, the benchmark doesn’t matter.
Where benchmarks fall apart under cross-examination
Defense attorneys often introduce audit reports. Plaintiff attorneys introduce affidavits.
Affidavits usually win.
A benchmark that doesn’t include real user testing collapses under oath.
Our benchmark vs. competitors in plain terms
Competitor benchmarks aim to:
- Pass tools
- Produce documentation
- Minimize cost
Our benchmark aims to:
- Remove standing
- Reduce allegations
- Survive scrutiny
Those goals lead to different work.
The data behind the stricter approach
Across clients audited under our benchmark between 2023 and 2025:
- Pre-audit WCAG failures averaged 31
- Post-remediation failures averaged under 5
- Post-remediation lawsuits: zero during monitored periods
That last number has a limitation. Monitoring periods vary. Lawsuits can still happen later.
It’s still data most vendors can’t show.
Why competitors avoid publishing benchmarks
Benchmarks create accountability.
Once published, they can be compared against outcomes. Most vendors prefer ambiguity.
We chose specificity instead.
Accessibility as an ongoing condition, not a score
The average benchmark treats accessibility like a finish line.
Courts treat it like a snapshot, judged as the site stood on the day the plaintiff visited.
Our benchmark treats it like maintenance.
That difference explains most of the lawsuit gap.
A final example that shows the contrast
In early 2025, a healthcare provider updated its site after receiving a demand letter. The vendor delivered a clean report. The site passed automated checks.
Three months later, a patient filed suit over inaccessible intake forms introduced after the audit.
The benchmark didn’t account for change.
Ours does.
Where ADA compliance actually lands when measured honestly
There is where compliance should be, defined by access and legal outcomes.
There is where it usually sits, defined by tools and marketing.
And there is the gap between them, where lawsuits live.
Benchmarks don’t close that gap by being common.
They close it by being complete.