A buyer asks for an accessibility audit. Three vendors quote the work. One quotes $400 and delivers a six-page PDF from an automated scan. One quotes $4,000 and delivers a 60-page report with screen recordings and a remediation roadmap. A website accessibility audit company that sells the $400 version is not selling an audit — it's selling paper. And in a demand letter, paper that didn't catch the real violations is worse than having no audit at all.
The gap matters because automated scanners catch a known, measurable slice of WCAG failures. The rest — the majority — only surface under manual testing with assistive technology. If an audit report doesn't distinguish between what was scanned and what was tested by a human, the report is not a defensible document.
Why Automated Scans Catch Only a Third of Failures
Deque Systems, maker of axe, the most widely used accessibility scanner, has published the limit directly: automated testing can detect roughly 57% of WCAG issues under ideal conditions, and in real-world audits closer to 30–40% of actual barriers. The other 60–70% require human judgment.
This is because most WCAG success criteria require understanding context, not just code. A scanner can verify an image has alt text. It cannot verify the alt text meaningfully describes the image. A scanner can confirm form fields have labels. It cannot confirm a screen reader user can complete the form in a reasonable number of keystrokes. A scanner can find headings. It cannot tell if the heading order makes the page understandable.
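The presence-versus-meaning gap is easy to demonstrate. Here is a minimal sketch of the kind of check a scanner performs, using only Python's standard-library HTML parser; the page markup is hypothetical:

```python
from html.parser import HTMLParser

class AltTextCheck(HTMLParser):
    """Flags <img> tags with no alt attribute at all. This is the most a
    scanner-style check can verify: it cannot judge whether alt text
    that IS present actually describes the image."""
    def __init__(self):
        super().__init__()
        self.failures = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.failures.append(dict(attrs).get("src", "<no src>"))

page = """
<img src="chart.png" alt="Q3 revenue by region, bar chart">
<img src="hero.jpg" alt="image">   <!-- passes the scan, useless to a user -->
<img src="logo.png">               <!-- the only failure a scanner sees -->
"""

checker = AltTextCheck()
checker.feed(page)
print(checker.failures)  # -> ['logo.png']
```

The second image passes the automated check and still tells a screen reader user nothing. Only a human catches it.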
Put plainly: a site can pass every automated scan and still be unusable for a blind user. Several of the highest-profile ADA lawsuits in the past three years targeted sites that had passing axe reports on file. The plaintiffs won.
The Three-Tier Methodology a Real Audit Uses
A defensible audit stacks three layers of testing. Each layer finds issues the others miss. Skipping any of them leaves a known blind spot in the report.
Tier 1: Automated Scanning
Every page in scope gets scanned with a tool like axe-core, WAVE, or Lighthouse. The scan catches the objective failures — missing alt text, insufficient color contrast, empty form labels, missing lang attributes, duplicate IDs. These are cheap to fix and must be cleared first because they block the more nuanced testing that follows.
Typical finding rate on an unaudited mid-size site: 200–800 automated issues across 20 sampled pages. Most consolidate into 10–30 unique fixes applied site-wide through the template layer.
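The consolidation step works because scanner findings carry a rule id: grouping page-level hits by rule collapses hundreds of raw findings into a handful of fixes, most of them applied once in a shared template. A sketch with hypothetical finding data:

```python
from collections import defaultdict

# Hypothetical scanner output: (rule_id, page_url) pairs, one per hit.
findings = [
    ("image-alt", "/"), ("image-alt", "/products"), ("image-alt", "/contact"),
    ("color-contrast", "/"), ("color-contrast", "/products"),
    ("label", "/contact"),
]

# Group page-level hits by rule: each group is one unique fix, usually
# made once in the template rather than page by page.
fixes = defaultdict(set)
for rule, page in findings:
    fixes[rule].add(page)

for rule, pages in sorted(fixes.items()):
    print(f"{rule}: 1 fix covers {len(pages)} page(s)")
# 6 raw findings collapse into 3 unique fixes
```

The same shape holds at scale: a few hundred hits, a few dozen fixes.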
Tier 2: Manual Keyboard and Code Review
A trained auditor walks the sampled pages using only a keyboard — no mouse. Tab through every interactive element. Check for visible focus indicators. Confirm logical tab order. Test that modals trap focus and release it correctly. Verify Esc closes overlays. Confirm skip-to-content links work. Inspect the DOM for correct landmark regions and heading hierarchy.
This layer catches issues the scanner can't see: focus traps, unreachable menu items, click-only interactions, non-semantic buttons built from divs. These are the failures that most often trigger complaints from real users.
Tier 3: Assistive Technology Testing
The deepest layer. An auditor — ideally one who uses assistive technology daily — works through key user flows with NVDA or JAWS on Windows, and VoiceOver on macOS and iOS. They test the homepage, primary navigation, contact or checkout flow, search, and any custom widgets. They record the session.
This is where the audit catches the failures that matter most in court: a cart step that announces nothing when the quantity updates, an error message that appears visually but never reaches the screen reader, a product filter that resets focus to the top of the page on every selection. None of these show up in a scan. All of them show up in a lawsuit.
A demand letter doesn't ask "did your scanner pass?" It lists specific, experienced barriers — usually 8 to 15 real failures a plaintiff hit with their screen reader. A defensible audit is one that would have found those same barriers before the plaintiff did. Scan-only audits do not find them.
What a Real Audit Deliverable Contains
A cut-rate vendor hands over a PDF of scanner output. A serious vendor hands over four artifacts, each with a specific job.
1. The Issue Inventory
Every finding listed with: the WCAG success criterion it violates, the conformance level (A, AA, or AAA), the exact URL and element, a screenshot, a severity rating (blocker, major, minor), and a plain-English description of the user impact. A real inventory for a 30-page site typically runs 80–250 unique issues after deduplication.
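For teams tracking findings in their own tooling, one inventory row maps naturally to a small record. A sketch of that shape; the field names are illustrative, mirroring the fields listed above rather than any vendor's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """One row of the issue inventory. Field names are hypothetical,
    chosen to match the fields a real inventory lists."""
    wcag_criterion: str   # e.g. "4.1.2 Name, Role, Value"
    level: str            # conformance level: "A", "AA", or "AAA"
    url: str              # exact page where the failure occurs
    element: str          # CSS selector or XPath of the failing node
    severity: str         # "blocker", "major", or "minor"
    user_impact: str      # plain-English description of the barrier
    screenshot: str = ""  # path to captured evidence

issue = Finding(
    wcag_criterion="4.1.2 Name, Role, Value",
    level="A",
    url="/checkout",
    element="div.add-to-cart",
    severity="blocker",
    user_impact="Button built from a div: invisible to screen readers "
                "and unreachable by keyboard.",
)
print(issue.severity)  # -> blocker
```

A record this explicit is what makes deduplication and sprint planning mechanical instead of manual.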
2. The Remediation Plan
Each issue grouped into engineering sprints with effort estimates and a suggested fix approach. The plan should separate template-level fixes (applied once, resolve across the site) from content-level fixes (per page). This distinction is what makes an audit actionable instead of overwhelming.
3. Video Evidence of Assistive Technology Failures
Short screen recordings of the most severe barriers as a screen reader experiences them. Nothing closes the gap between a developer and the user experience faster than 30 seconds of NVDA reading a broken form out loud. This is also the single most useful artifact to hand to leadership when asking for remediation budget.
4. The VPAT or Accessibility Conformance Report
For enterprises selling to government, education, or regulated industries, the Voluntary Product Accessibility Template is the document procurement teams ask for. It maps the site against every WCAG success criterion with a conformance level. A good VPAT is honest about partial conformance rather than overstating it — overstating is how vendors lose contracts when the buyer's own team tests the site.
Who Should Be Doing the Audit
A resume with "10 years of web development" is not an accessibility credential. The auditor's experience is the audit's quality. Ask three questions before signing.
- Are any of your auditors certified? The IAAP offers three relevant credentials: CPACC (foundational), WAS (technical), and CPWA (awarded for holding both). A vendor should have at least one WAS-certified auditor on the account.
- Do any of your auditors use assistive technology daily? This is the highest-signal question. Auditors who use screen readers as primary navigation catch issues sighted auditors consistently miss.
- How many pages do you sample, and how are they chosen? A proper sample is a representative template set — homepage, product/service templates, form pages, search, error states, authenticated states — not a random 5-URL grab.
Scope, Timeline, and What to Expect to Pay
Scoping varies with site size, but the shape is consistent. A small business site (fewer than 50 unique templates) runs 2–3 weeks of audit work and typically prices at $5,000–$12,000 for a full three-tier audit. A mid-market site with custom applications runs 4–6 weeks and $15,000–$40,000. Enterprise sites with authenticated flows, ecommerce, and multiple subdomains run 6–12 weeks and $40,000+.
Remediation is a separate line. A rule of thumb: remediation typically costs 2–4x the audit itself, spread across 3–9 months of engineering work. Audits without a remediation plan attached are a receipt, not a solution.
An audit that ends at the PDF delivery is the start of a lawsuit defense, not the end of it. The report is only valuable if someone actually fixes the issues listed in it.
Where a Serious Website Accessibility Audit Company Earns the Fee
A serious website accessibility audit company does three things most vendors skip: it runs the three-tier methodology end to end instead of selling scan output with a cover page, it delivers a remediation plan with effort estimates rather than a bare issue list, and it sends auditors who use assistive technology every day rather than consultants who learned WCAG from a slide deck. Revenue Group audits against that bar because that's the bar courts and procurement teams actually apply. If the last audit on file was a scanner PDF, the site is protected about as well as if there were no audit at all — and the next demand letter will surface the barriers the scan missed.
Find the Barriers Before a Plaintiff Does
Free pre-audit. We run a scan plus a keyboard and screen reader spot-check on your top five pages, so you see the gap before you commit to a full audit.
Get My Free Audit →