Accessibility audit gap analysis (template)
A practical accessibility gap-analysis template for EN 301 549 and WCAG 2.1 AA — section coverage, evidence, severity ranking, remediation plan.
This template helps you run a structured gap analysis between your current ICT product and EN 301 549 / WCAG 2.1 AA. Use it for an initial conformance baseline, before procuring third-party audits, or to scope remediation sprints. The Excel workbook contains worksheets per EN 301 549 section plus a remediation backlog generator.
When to use a gap analysis
- Pre-launch. Before a product enters the market in EU jurisdictions governed by the European Accessibility Act (applies from 28 June 2025).
- Vendor procurement. When a procurer is comparing vendor products against an EN 301 549 reference.
- Pre-audit. Before commissioning a paid third-party accessibility audit, to spend that audit budget on edge cases instead of basics.
- Post-finding. After a regulator or customer complaint, to scope the full remediation rather than fix only the reported issue.
Scope decisions
Before opening the workbook, decide three things:
- Surface in scope. Web app, marketing site, native iOS, native Android, downloadable PDF documentation, kiosk, set-top box, hardware peripheral, support documentation.
- Conformance target. WCAG 2.1 AA via EN 301 549 section 9 is the default for web. Add section 10 for non-web documents, section 11 for non-web software, section 5 for generic requirements (for example, 5.3 biometrics) where applicable, and section 12 for documentation and support services.
- Evidence depth. Spot-check (one example per criterion), full sample, or representative sample (top 20 user journeys).
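The three decisions above can be recorded up front as a small scope record before any rows are filled in. This is an illustrative sketch, not a format the workbook prescribes:

```python
# Hypothetical scope record; the key names are illustrative only.
SCOPE = {
    "surfaces": ["web app", "downloadable PDF documentation"],
    "conformance_target": [
        "EN 301 549 section 9 (WCAG 2.1 AA)",
        "EN 301 549 section 12 (documentation)",
    ],
    "evidence_depth": "representative sample",  # or "spot-check" / "full sample"
}
```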
Workbook structure
Sheet 1, Cover
Project metadata: name, scope, target standard and version, audit window, auditor names and roles, sign-off.
Sheet 2, Section coverage
One row per EN 301 549 clause that applies to the scope. Columns:
- Clause (e.g., 9.1.1.1, 11.5.2.5).
- WCAG mapping (e.g., WCAG 2.1 SC 1.1.1).
- Applicable (Yes / No / Partially / Not in scope).
- Test method (Automated, Manual, Assistive-tech, Document review).
- Result (Pass, Fail, Not testable, Not applicable).
- Evidence file (link, screenshot, video).
- Severity (Critical, High, Medium, Low), only meaningful for Fail rows.
- Owner.
- Estimated remediation effort (S, M, L, XL).
- Target close date.
- Status (Open, In progress, Verified, Accepted risk).
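The column scheme above can be sketched as a data structure with row-level validation, which is useful if you later export the workbook for scripted checks. Field names here are illustrative, not part of the workbook:

```python
from dataclasses import dataclass
from typing import Optional

SEVERITIES = {"Critical", "High", "Medium", "Low"}
RESULTS = {"Pass", "Fail", "Not testable", "Not applicable"}

@dataclass
class CoverageRow:
    clause: str                      # e.g. "9.1.1.1"
    wcag_mapping: str                # e.g. "WCAG 2.1 SC 1.1.1"
    applicable: str                  # Yes / No / Partially / Not in scope
    test_method: str                 # Automated / Manual / Assistive-tech / Document review
    result: str
    evidence: Optional[str] = None   # link, screenshot, video
    severity: Optional[str] = None   # only meaningful when result == "Fail"
    owner: Optional[str] = None
    effort: Optional[str] = None     # S / M / L / XL
    status: str = "Open"

    def validate(self) -> list:
        """Return row-level problems; an empty list means the row is auditable."""
        problems = []
        if self.result not in RESULTS:
            problems.append(f"unknown result: {self.result}")
        if self.result == "Fail":
            if self.severity not in SEVERITIES:
                problems.append("Fail rows need a severity")
            if not self.evidence:
                problems.append("Fail rows need an evidence link")
        return problems
```

The validation mirrors two rules stated in the sheet: severity is only meaningful for Fail rows, and every finding needs an evidence link.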
Sheet 3, Severity guidance
A reproducible rubric so different auditors grade the same way.
| Severity | Definition |
|---|---|
| Critical | Blocks task completion for any user with a covered disability. |
| High | Forces a workaround; affects multiple user types or core flows. |
| Medium | Degrades experience but task completes; affects narrow user type or non-core flow. |
| Low | Cosmetic, advisory, or affects an edge case. |
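The rubric can be approximated as a small decision function, which helps different auditors grade consistently. The three boolean inputs are illustrative names for the questions an auditor answers; this is a sketch of the rubric, not a replacement for it:

```python
def grade_severity(blocks_task: bool, forces_workaround: bool,
                   core_flow_or_multiple_users: bool) -> str:
    """Approximate the severity rubric as three yes/no questions."""
    if blocks_task:
        return "Critical"   # task cannot be completed at all
    if forces_workaround and core_flow_or_multiple_users:
        return "High"       # workaround forced on a core flow or many user types
    if forces_workaround or core_flow_or_multiple_users:
        return "Medium"     # degraded but completable; narrow or non-core impact
    return "Low"            # cosmetic, advisory, or edge case
```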
Sheet 4, Backlog generator
Filters and groups the failing rows into a remediation backlog with proposed milestones, blocked-by dependencies, and effort totals. The backlog is the artefact that goes into engineering planning.
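A minimal sketch of what the backlog generator does, assuming rows are exported as dicts keyed by the Sheet 2 column names (the field names and effort weights here are illustrative):

```python
from collections import defaultdict

EFFORT_POINTS = {"S": 1, "M": 3, "L": 8, "XL": 20}   # illustrative weights
SEVERITY_ORDER = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}

def build_backlog(rows):
    """Filter unresolved failing rows, group by owner, and total effort per group."""
    fails = [r for r in rows
             if r["result"] == "Fail" and r["status"] != "Verified"]
    groups = defaultdict(list)
    for r in fails:
        groups[r.get("owner") or "unassigned"].append(r)
    return {
        owner: {
            "items": sorted(items, key=lambda r: SEVERITY_ORDER[r["severity"]]),
            "effort_points": sum(EFFORT_POINTS.get(r.get("effort", "M"), 3)
                                 for r in items),
        }
        for owner, items in groups.items()
    }
```

The effort totals give engineering planning a rough size per owner; the per-group sort puts Critical findings at the top of each slice.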
Sheet 5, Statement input
Fields needed for the EU model accessibility statement: scope of conformance claim, non-conformities, disproportionate-burden invocations, contact point, enforcement procedure.
Test method guidance
- Automated. axe-core, WAVE, Lighthouse accessibility, Pa11y, Tenon. These cover roughly 30 to 40 per cent of WCAG criteria; never claim conformance from automated tests alone.
- Manual. Heuristic walk-through against the WCAG techniques. Most effective for cognitive criteria (3.x), labelling, focus order, error identification.
- Assistive technology. NVDA + Firefox, JAWS + Chrome, VoiceOver + Safari (macOS and iOS), TalkBack + Chrome (Android). Test each AT-paired browser combination for the surface in scope; do not generalise.
- Document review. For policy, procurement language, training and documentation criteria.
Severity-to-priority mapping
A useful default for the remediation backlog:
| Severity | Default priority | Default SLA |
|---|---|---|
| Critical | P0 | Hotfix; before next release |
| High | P1 | Within next sprint cycle |
| Medium | P2 | Within current quarter |
| Low | P3 | Backlog; review annually |
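The mapping above can be applied in bulk to produce a one-line triage summary of failing rows. A sketch, assuming the same dict-per-row export as before:

```python
from collections import Counter

# Default severity-to-priority mapping from the table above.
PRIORITY = {"Critical": "P0", "High": "P1", "Medium": "P2", "Low": "P3"}

def triage_summary(rows):
    """Count failing rows per default priority for a quick status report."""
    counts = Counter(PRIORITY[r["severity"]]
                     for r in rows if r["result"] == "Fail")
    return dict(counts)
```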
What “accepted risk” means
You may temporarily accept a non-conformance only if you document:
- Why remediation is disproportionate (cost, scope, technology), with evidence.
- Compensating measures (alternative path for affected users).
- Review date and trigger.
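The documentation requirement can even be enforced mechanically: refuse to record an accepted risk unless every required element is present. Field names are illustrative:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AcceptedRisk:
    clause: str
    justification: str          # why remediation is disproportionate, with evidence
    compensating_measure: str   # the alternative path for affected users
    review_date: date           # when the acceptance must be revisited
    review_trigger: str         # e.g. "platform rewrite", "next major release"

    def __post_init__(self):
        # An acceptance missing any required element is not recordable.
        for field in ("justification", "compensating_measure", "review_trigger"):
            if not getattr(self, field).strip():
                raise ValueError(f"accepted risk requires a non-empty {field}")
```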
This is the “disproportionate burden” mechanism in EAA Article 14: it is auditable, time-limited, and not a default.
Common pitfalls
- Skipping section 4 (functional performance). Section 4 sets outcomes; if you only test the technical clauses you can pass them and still fail the outcome.
- Treating automated scans as conformance evidence. Automated tools miss semantic and cognitive issues entirely.
- One-time audit without surveillance. Conformance is a property of the running product, not a snapshot.
- No evidence link per row. Without evidence, the gap analysis is not auditable.
Output deliverables
When you close the gap analysis, the artefacts are:
- The completed workbook (one file per release scope).
- A remediation backlog inside engineering’s tracker.
- A draft accessibility statement reflecting the current conformance position.
- A risk-and-opportunity entry tying accessibility into the QMS.