
A11y Testing

Set up automated accessibility testing

Works with OpenClaude

You are an accessibility engineer. The user wants to set up automated accessibility testing in their web project using industry-standard tools.

What to check first

  • Run npm list axe-core jest-axe @testing-library/react to verify the test dependencies are installed
  • Check if your project has a jest.config.js or vitest.config.js file
  • Confirm you have React Testing Library or similar DOM testing utilities available

Steps

  1. Install axe-core and jest-axe: npm install --save-dev axe-core jest-axe
  2. Import jest-axe in your test setup file or individual test files: import 'jest-axe/extend-expect'
  3. Render your component using React Testing Library's render() function
  4. Get the container from the render result and pass it to axe() for scanning
  5. Use expect(results).toHaveNoViolations() to assert accessibility compliance
  6. Add WCAG 2.1 level AA rule configuration via axe options object for stricter checks
  7. Create a custom test utility function that wraps axe scanning for reusable assertions
  8. Run the suite with npm test so the a11y assertions gate every change; add --coverage if you also want code-coverage reporting over time

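Step 6's options object can be sketched as below. The wcag* tag names are standard axe-core rule tags; the wcag21aaOptions variable name is just for illustration.

```javascript
// Sketch of an axe-core options object that restricts a scan to WCAG 2.1 AA.
// 'wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa' are axe-core rule tags.
const wcag21aaOptions = {
  runOnly: {
    type: 'tag',
    values: ['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'],
  },
};

// Usage inside a test (requires jest-axe):
//   const results = await axe(container, wcag21aaOptions);
```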
Code

import { render } from '@testing-library/react';
import { axe, toHaveNoViolations } from 'jest-axe';
import MyButton from './MyButton';

expect.extend(toHaveNoViolations);

describe('MyButton Accessibility', () => {
  it('should not have any accessibility violations', async () => {
    const { container } = render(<MyButton>Click me</MyButton>);
    const results = await axe(container);
    expect(results).toHaveNoViolations();
  });

  it('should have proper ARIA labels on interactive elements', async () => {
    const { container } = render(
      <MyButton aria-label="Submit form">Submit</MyButton>
    );
    const results = await axe(container, {
      rules: { 'button-name': { enabled: true } },
    });
    expect(results).toHaveNoViolations();
  });

  it('should maintain color contrast ratios', async () => {
    const { container } = render(
      <MyButton style={{ color: '#ffffff', backgroundColor: '#0000ff' }}>
        Contrast Test
      </MyButton>
    );
    const results = await axe(container, {
      rules: { 'color-contrast': { enabled: true } },
    });
    expect(results).toHaveNoViolations();
  });

  it('should have keyboard navigation support', async () => {
    const { container, getByRole } = render(<MyButton>Focus me</MyButton>);
    const button = getByRole('button');
    button.focus();
    expect(document.activeElement).toBe(button);
    const results = await axe(container);
    expect(results).toHaveNoViolations();
  });
});
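Step 7 suggests wrapping axe scanning in a reusable utility. One hypothetical building block for that is a formatter that turns axe's results object into a readable failure message. formatViolations is an assumed name; the mocked object below mirrors the { violations: [{ id, impact, help, nodes }] } shape that axe-core returns.

```javascript
// Hypothetical helper: turn an axe-core results object into a readable report.
function formatViolations(results) {
  if (results.violations.length === 0) return 'No accessibility violations';
  return results.violations
    .map(v => `[${v.impact}] ${v.id}: ${v.help} (${v.nodes.length} node${v.nodes.length === 1 ? '' : 's'})`)
    .join('\n');
}

// Example with a mocked axe result:
const sample = {
  violations: [
    { id: 'button-name', impact: 'critical', help: 'Buttons must have discernible text', nodes: [{}] },
  ],
};
console.log(formatViolations(sample));
// [critical] button-name: Buttons must have discernible text (1 node)
```

A utility like this keeps failure output consistent across test files when many components share the same axe wrapper.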

Common Pitfalls

  • Auto-generated alt text from filenames — always describe the actual image content, not the filename
  • Using aria-hidden="true" on focusable elements — the element will still receive focus but be invisible to screen readers, breaking keyboard navigation
  • Color contrast ratios that pass on the design file but fail in production due to anti-aliasing or font weight differences
  • Adding ARIA labels to elements that already have semantic HTML — this often confuses screen readers more than it helps
  • Skipping the lang attribute on the <html> element — screen readers won't pronounce content correctly without it

When NOT to Use This Skill

  • When your component is purely decorative and not part of the user-interactive flow
  • When you're prototyping and the design will change significantly — wait until the design stabilizes
  • On third-party embeds where you can't modify the markup (use a wrapper-level fix instead)

How to Verify It Worked

  • Run axe DevTools browser extension on the page — should show 0 violations
  • Test with a screen reader (VoiceOver on Mac, NVDA on Windows) — every interactive element should be announced clearly
  • Navigate the entire flow using only the Tab key — you should be able to reach and activate every interactive element
  • Check Lighthouse accessibility score — should be 95+ for production

Production Considerations

  • Add accessibility tests to your CI pipeline so regressions don't ship — fail the build on critical violations
  • Real users with disabilities navigate differently than automated tools — schedule manual testing with disabled users at least once per quarter
  • WCAG 2.1 AA is the de facto legal baseline in most jurisdictions (ADA case law in the US, the European Accessibility Act in the EU); AAA is aspirational, not required
  • Document your accessibility decisions in a public a11y statement; this is mandatory for EU public-sector sites and a strong good-faith signal under the ADA in the US
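One hedged sketch of the CI gate above is a severity filter that fails the build only on serious or critical findings. shouldFailBuild and BLOCKING are illustrative names; the impact strings are the four severity levels axe-core reports.

```javascript
// Sketch of a CI severity gate over an axe-core results object.
// axe-core reports impact as 'minor' | 'moderate' | 'serious' | 'critical'.
const BLOCKING = new Set(['critical', 'serious']);

function shouldFailBuild(results) {
  return results.violations.some(v => BLOCKING.has(v.impact));
}

// Examples with mocked results:
console.log(shouldFailBuild({ violations: [{ impact: 'moderate' }] })); // false
console.log(shouldFailBuild({ violations: [{ impact: 'critical' }] })); // true
```

Logging the non-blocking findings instead of swallowing them keeps a paper trail for the quarterly manual testing.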

Quick Info

Difficulty: intermediate
Version: 1.0.0
Author: Claude Skills Hub
Tags: accessibility, testing, automation

Install command:

curl -o ~/.claude/skills/a11y-testing.md https://claude-skills-hub.vercel.app/skills/accessibility/a11y-testing.md
