How to Add Accessibility Checks to Your CI/CD Pipeline
Stop shipping accessibility regressions. Add automated WCAG checks to your GitHub Actions workflow in 5 minutes. Free, open source, with PR comments.
You write tests for your API. You lint your code. You check for security vulnerabilities in your dependencies. But you're probably not checking for accessibility regressions.
Every deploy is a chance to break something for the 1.3 billion people with disabilities worldwide. A missing alt attribute, a contrast change that drops below AA, a form field that loses its label — these slip through code review because they're invisible to sighted developers.
Here's how to catch them automatically.
The Problem: Accessibility Regressions Are Silent
Unlike a broken API endpoint or a failed test, accessibility regressions don't trigger errors. They don't crash the app. They don't show up in your monitoring. They just quietly exclude users.
Common regressions that slip through:
- Designer changes a color → contrast ratio drops below 4.5:1
- Component refactor removes an `aria-label`
- New feature adds interactive elements without keyboard support
- Image alt text gets lost in a CMS migration
By the time anyone notices, it's in production. Maybe a customer complains. Maybe a lawyer notices.
The Solution: Shift Left
The accessibility industry loves the phrase "shift left" — catch issues earlier in the development process. The earlier you catch a regression, the cheaper it is to fix.
| Stage | Cost to Fix | Detection |
|---|---|---|
| Design | Low | Manual review |
| Development | Low | Linter, local scan |
| Pull Request | Low | Automated CI check |
| Staging | Medium | QA testing |
| Production | High | User complaints, audits |
| Lawsuit | Very high | Legal |
The pull request is the sweet spot. The code is written but not shipped. The context is fresh. The fix is a commit away.
Option 1: GitHub Action (Easiest)
We built a GitHub Action that runs 39 WCAG checks against any URL and posts the results directly on your PR.
Setup (2 minutes)
Create `.github/workflows/accessibility.yml`:

```yaml
name: Accessibility Check

on:
  pull_request:
    branches: [main]

jobs:
  accessibility:
    runs-on: ubuntu-latest
    steps:
      - uses: PrimeStark/accessiguard-action@v1
        with:
          url: 'https://your-staging-url.com'
          threshold: 80
```
That's it. Every PR now gets an accessibility report as a comment:
- ✅ Score above threshold → green check
- ❌ Score below threshold → build fails, PR blocked
With Vercel Preview Deployments
If you deploy previews on Vercel (or Netlify, or any platform that gives you a preview URL), you can scan the actual preview:
```yaml
name: Accessibility Check

on:
  deployment_status:

jobs:
  a11y:
    if: github.event.deployment_status.state == 'success'
    runs-on: ubuntu-latest
    steps:
      - uses: PrimeStark/accessiguard-action@v1
        with:
          url: ${{ github.event.deployment_status.target_url }}
          threshold: 75
```
This scans the preview deployment, not production. You catch the regression before it ships.
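The same idea works on platforms without a `deployment_status` event. As one sketch: Netlify Deploy Previews follow a predictable URL pattern (`deploy-preview-<PR number>--<site name>.netlify.app` by default), so you can construct the preview URL from the pull request number. The site name here is a placeholder — substitute your own, and verify the pattern matches your Netlify configuration:

```yaml
name: Accessibility Check

on:
  pull_request:

jobs:
  a11y:
    runs-on: ubuntu-latest
    steps:
      - uses: PrimeStark/accessiguard-action@v1
        with:
          # Assumes Netlify's default Deploy Preview URL pattern
          url: "https://deploy-preview-${{ github.event.number }}--your-site.netlify.app"
          threshold: 75
```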
Multiple Pages
Use a matrix strategy to scan critical pages:
```yaml
jobs:
  accessibility:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        page:
          - https://your-site.com
          - https://your-site.com/pricing
          - https://your-site.com/signup
          - https://your-site.com/docs
    steps:
      - uses: PrimeStark/accessiguard-action@v1
        with:
          url: ${{ matrix.page }}
          threshold: 70
```
Option 2: CLI in Any CI
The GitHub Action wraps our CLI, which works in any CI environment:
```bash
npx accessiguard scan https://your-site.com --ci --threshold 80
```
Exit codes:
- `0` — score meets threshold
- `1` — score below threshold
- `2` — error (network, invalid URL)
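If your CI environment doesn't map exit codes to build status the way you want, you can script around them. Here's a minimal bash sketch; the `handle_scan_exit` helper is hypothetical, not part of the accessiguard CLI:

```shell
#!/usr/bin/env bash
# Hypothetical helper: translate accessiguard's documented exit codes
# (0 = pass, 1 = below threshold, 2 = scan error) into log messages.
handle_scan_exit() {
  case "$1" in
    0) echo "pass" ;;
    1) echo "fail: score below threshold" ;;
    2) echo "fail: scan error (network or invalid URL)" ;;
    *) echo "fail: unknown exit code $1" ;;
  esac
}

# In CI you would capture the real exit code, e.g.:
#   npx -y accessiguard scan "$DEPLOY_URL" --ci --threshold 80
#   handle_scan_exit "$?"
handle_scan_exit 0   # → pass
```

This keeps the pass/fail policy in one place, so changing how your pipeline reacts to a scan error (retry vs. hard fail) is a one-line edit.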
GitLab CI
```yaml
accessibility:
  image: node:20
  script:
    - npx -y accessiguard scan $CI_ENVIRONMENT_URL --ci --threshold 80
  only:
    - merge_requests
```
Jenkins
```groovy
stage('Accessibility') {
  steps {
    sh 'npx -y accessiguard scan https://staging.example.com --ci --threshold 80'
  }
}
```
CircleCI
```yaml
- run:
    name: Accessibility Check
    command: npx -y accessiguard scan $DEPLOY_URL --ci --threshold 80
```
What Gets Checked
AccessiGuard runs 39 automated WCAG 2.1 checks:
Critical: Color contrast, image alt text, form labels, document language, page title
Structure: Heading hierarchy, landmark regions, list structure, table headers, skip navigation
Interactive: Keyboard focus, ARIA attributes, link text, button labels, focus management
Media: Video captions, SVG accessible names, meta viewport
Advanced: Nested interactive elements, deprecated elements, ARIA hidden focus traps, empty headings, new window warnings
Each issue comes with:
- Impact level (critical/serious/moderate/minor)
- Instance count
- WCAG criterion reference
- AI-generated fix suggestion with code
Setting the Right Threshold
Don't start at 100. Start where you are and ratchet up.
1. Scan your production site → get your current score
2. Set the threshold 5 points below → avoid breaking existing PRs
3. Fix issues incrementally → raise the threshold as you improve
4. Target 90+ → that's where most compliance requirements land
```yaml
# Week 1: Baseline
threshold: 60

# Month 1: After fixing critical issues
threshold: 75

# Month 3: After systematic improvements
threshold: 85

# Ongoing: Maintain high standard
threshold: 90
```
What Automated Testing Can't Catch
Automated tools catch about 30-40% of WCAG issues. The rest need manual testing:
- Screen reader experience (navigation flow, announcements)
- Cognitive load and readability
- Complex interaction patterns
- Alternative text quality (is it actually descriptive?)
- Video audio descriptions
Automated CI checks are the foundation, not the ceiling. They catch regressions reliably, but you still need periodic manual audits.
Get Started
- Try a free scan → accessiguard.app
- Add the GitHub Action → PrimeStark/accessiguard-action
- Run locally → `npx accessiguard scan your-site.com`
No signup required. No API key. Just paste your URL.
Building accessibility into your pipeline isn't just about compliance. It's about not being the team that ships something that excludes a quarter of your users. Automate the boring part so you can focus on the hard part.