
Catching Accessibility Issues Before Launch: A Practical Workflow for Agencies

SnapFeed Team
Accessibility testing integrated into web agency review workflow

Accessibility lawsuits against websites increased 300% between 2018 and 2024. More importantly, 1 in 4 adults in the US lives with a disability that affects how they use the web. Accessibility isn’t a checkbox — it’s core to good web development.

But accessibility is also where feedback workflows typically break down. Issues are discovered late (post-launch), reported vaguely (“the site doesn’t work with my screen reader”), and hard to reproduce without specialized tools.

Here’s how to build accessibility testing into your review process from the start.

Why Accessibility Fails Late in Projects

Most agencies treat accessibility as a final-stage audit. The problems with this:

  1. Fixes are expensive late: An inaccessible component discovered in the review phase might require architectural changes, not just CSS tweaks
  2. Clients don’t know to test it: Without guidance, clients rarely test with assistive technologies
  3. Developers aren’t prompted: Without a structured review, developers may not think to test keyboard navigation or screen reader compatibility

The fix: make accessibility a first-class concern in every review round.

Stage 1: Automated Scanning During Development

Before any client review, run automated accessibility scans on staging.

Tools:

  • axe DevTools (browser extension): Catches ~57% of WCAG 2.1 AA issues automatically
  • Lighthouse (Chrome DevTools → Lighthouse → Accessibility): Good overview score + specific issues
  • WAVE (browser extension): Visual overlay showing accessibility errors and warnings

Process: Run these on every major page before opening for client review. Fix all critical and serious errors first.

These tools won’t catch everything (contrast on dynamic content, focus management in modals, screen reader announcements), but they eliminate the low-hanging fruit quickly.
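To see what this low-hanging fruit looks like, here is a minimal sketch of one rule these scanners all implement: flagging `<img>` tags that have no `alt` attribute at all. Real tools like axe run hundreds of rules; this illustrates the category using only Python's standard library.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flag <img> tags that lack an alt attribute entirely."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.violations.append(attrs.get("src", "<unknown>"))

checker = MissingAltChecker()
checker.feed('<img src="hero.png"><img src="logo.png" alt="SnapFeed logo">')
print(checker.violations)  # ['hero.png'] — the first image has no alt attribute
```

Note that the check looks for a *missing* attribute, not an empty one: `alt=""` is the correct markup for decorative images, so emptiness alone is not a violation.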

Stage 2: Manual Keyboard Testing

Automated tools miss most keyboard navigation issues. Manual testing takes 20–30 minutes per page but catches critical failures:

The keyboard test:

  1. Set your mouse aside and keep your hands off the trackpad
  2. Navigate to the page
  3. Press Tab — can you reach every interactive element?
  4. Is the focus indicator visible at all times?
  5. Can you activate buttons with Enter or Space, links with Enter, and dropdowns with the arrow keys?
  6. Can you close modals and dialogs with Escape?
  7. Does focus return to a logical position after closing modals?

Document failures as feedback items in SnapFeed, tagged with #accessibility and #keyboard.

Stage 3: Screen Reader Testing

At minimum, test with one screen reader on one browser:

  • macOS/iOS: VoiceOver (free, built in) — test in Safari
  • Windows: NVDA (free) — test in Firefox or Chrome
  • Windows enterprise: JAWS — if your client’s audience uses it

What to test:

  • Page title is announced correctly
  • Heading structure is logical (H1 → H2 → H3)
  • Images have meaningful alt text
  • Form labels are properly associated
  • Error messages are announced
  • Dynamic content updates (AJAX) are announced
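The heading-structure item in particular can be pre-checked before a manual screen reader pass. A hedged sketch: extract the h1–h6 tags from a page and flag any jump of more than one level (e.g. an h2 followed directly by an h4), which produces a confusing outline in a screen reader's headings list.

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect heading levels (1–6) in document order."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def skipped_levels(html):
    """Return (from_level, to_level) pairs where the outline jumps down too far."""
    parser = HeadingOutline()
    parser.feed(html)
    return [(a, b) for a, b in zip(parser.levels, parser.levels[1:]) if b - a > 1]

print(skipped_levels("<h1>Title</h1><h2>Section</h2><h4>Oops</h4>"))  # [(2, 4)]
```

This only checks ordering; whether each heading's *text* makes sense out of context still needs a human with the screen reader's headings list open.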

Stage 4: Client Accessibility Review

Include accessibility checks in your client review process. Give clients specific tasks:

Round 1 (Design): “Please check that these images have alt text descriptions you’re happy with. Submit feedback on the staging site if any alt text needs updating.”

Round 2 (Development): “Please try to navigate this form using only your keyboard (no mouse). Submit any issues via the feedback widget.”

Pre-launch: “If you or anyone on your team uses a screen reader, please test the site and submit feedback before launch.”

Most clients won’t test with assistive tech — but the ones who need to will. And asking creates accountability.

Using SnapFeed for Accessibility Feedback

When you or your team submit accessibility feedback via SnapFeed, use consistent tagging:

  • #accessibility on all accessibility-related items
  • #keyboard for keyboard navigation failures
  • #screen-reader for screen reader issues
  • #contrast for color contrast failures
  • #wcag-a, #wcag-aa to indicate compliance level affected

This lets you filter and report on accessibility specifically at the end of a project.
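The filtering itself is straightforward once tags are consistent. A sketch with a hypothetical data shape (SnapFeed's real export format may differ; the `feedback` structure below is illustrative only):

```python
# Hypothetical feedback export — a list of items, each carrying its tags.
feedback = [
    {"id": 1, "text": "Focus lost after closing modal", "tags": ["accessibility", "keyboard"]},
    {"id": 2, "text": "Hero copy typo", "tags": ["content"]},
    {"id": 3, "text": "Button contrast 2.8:1", "tags": ["accessibility", "contrast", "wcag-aa"]},
]

def by_tag(items, tag):
    """Return all feedback items carrying the given tag."""
    return [item for item in items if tag in item["tags"]]

print([item["id"] for item in by_tag(feedback, "accessibility")])  # [1, 3]
```

With inconsistent tagging ("a11y" on some items, "accessibility" on others) this report silently misses items, which is why agreeing on the tag list up front matters.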

The Accessibility Review Checklist

Include this in your pre-launch review:

Perceivable

  • All images have descriptive alt text (or empty alt if decorative)
  • Videos have captions or a transcript
  • Color is not the only way information is conveyed
  • Text meets WCAG AA contrast ratios (4.5:1 for body, 3:1 for large text)
  • Text can be resized to 200% without losing content
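The contrast ratios above come from WCAG 2.1's relative-luminance formula, which you can compute directly to spot-check a color pair without a browser extension:

```python
def relative_luminance(hex_color):
    """sRGB relative luminance per WCAG 2.1."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))
    def linearize(c):
        # Undo sRGB gamma encoding (thresholds from the WCAG definition).
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

def contrast_ratio(fg, bg):
    """(L_lighter + 0.05) / (L_darker + 0.05), always >= 1."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio("#767676", "#ffffff"), 2))  # 4.54 — just passes AA for body text
```

`#767676` on white is a useful landmark: at roughly 4.54:1 it is about the lightest gray that still passes AA for body text.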

Operable

  • All functionality is keyboard accessible
  • No keyboard traps
  • Focus indicator is visible on all interactive elements
  • No content flashes more than 3 times per second

Understandable

  • Page language is set in HTML (<html lang="en">)
  • Form fields have labels
  • Error messages are descriptive and helpful
  • Navigation is consistent across pages
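The "form fields have labels" item can also be partially automated. A rough sketch: collect every `label[for]` target and every visible `input` id, and report inputs with no associated label. (This deliberately ignores wrapping `<label>` elements and `aria-label`, which also count as labels, so treat hits as candidates to verify, not confirmed failures.)

```python
from html.parser import HTMLParser

class LabelChecker(HTMLParser):
    """Collect label[for] targets and ids of non-hidden inputs."""
    def __init__(self):
        super().__init__()
        self.label_targets = set()
        self.input_ids = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "label" and "for" in attrs:
            self.label_targets.add(attrs["for"])
        elif tag == "input" and attrs.get("type") != "hidden":
            self.input_ids.append(attrs.get("id"))

def unlabeled_inputs(html):
    checker = LabelChecker()
    checker.feed(html)
    return [i for i in checker.input_ids if i not in checker.label_targets]

html = '<label for="email">Email</label><input id="email" type="email"><input id="phone" type="tel">'
print(unlabeled_inputs(html))  # ['phone']
```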

Robust

  • HTML is valid (W3C validator)
  • ARIA is used correctly (roles, states, properties)
  • Custom components work with screen readers

Communicating Accessibility to Clients

Some clients push back on accessibility requirements because they see it as added cost. Frame it correctly:

Legal risk: “Accessibility lawsuits are real. The cost of a lawsuit is orders of magnitude higher than the cost of building it right.”

Market reach: “Roughly 15–20% of your potential users have some form of disability. Inaccessible sites exclude them.”

SEO benefit: “Semantic HTML, good alt text, and logical structure all help search engine rankings.”

Future cost: “Retrofitting an inaccessible site costs 5–10x more than building accessibly from the start.”


Track accessibility feedback alongside all client feedback in SnapFeed. Start free.