When Jonny Met Pa11y: An Accessibility Love Story
Web accessibility not only improves the experience for all users – whether they use assistive technologies or not – it's also becoming critical for organisations that need to comply with the law. With an increasing number of organisations – including the National Museum of Crime and Punishment and Netflix – being threatened with legal action, the importance of accessibility has increased significantly. In this post, Front-end Developer Jon White talks about recent changes we've made to our accessibility practices at Cogapp, and about his first encounter with pa11y.
Accessibility on the web has always been in the domain of the front-end developer.
It might not be as trendy now as it once was – after all, the realm of front-end development has expanded to include many more technologies for developers to get excited about than their “lovingly hand-crafted, semantic HTML”.
Despite that, accessibility is still as important today as it ever was. But what are good ways of making sure the work we produce is accessible, and how do we apply that process to a team workflow?
At Cogapp we’ve always taken accessibility seriously. Before now, our typical way of testing a site for compliance would look something like this:
- Check the HTML through the W3C validator.
- Manually run through a checklist.
- Use a tool like WAVE to pick up things like missing attributes or bad contrast.
We have regularly worked with WCAG2.0 (the Web Content Accessibility Guidelines), and when it came to building the new Rock & Roll Hall of Fame website we also needed to produce code compliant with Section 508, a different standard and the US counterpart of WCAG2.0. This is a legal requirement for US federal organisations, roughly equivalent to the Disability Discrimination Act in the UK.
On top of the legal aspect, the Rock Hall was really passionate about accessibility from the get-go, so it was a major focus for us all from the start.
There is a lot of crossover between Section 508 and WCAG2.0, with WCAG2.0 actually appearing to be more stringent in most cases. This made development more straightforward, as the rest of the team and I know WCAG2.0 well, but because of the importance of Section 508 to the project we tested against both standards.
An updated toolbox
I took the opportunity of the Rock Hall project to review our accessibility processes and see where we could improve. I identified the following:
- We should test accessibility for responsive design more thoroughly.
- Consistently documenting the results of accessibility testing would enhance our practice.
- Our accessibility checklist should be reworked so that non-developers can run the tests too – which would also make testing easier for developers.
I worked with colleagues to define an accessibility test plan that we could use for any part of the project, which covered responsive layouts, had a consistent way of documenting accessibility problems and advised us about the right tools to use to identify and fix them.
Accessibility testing documentation
With that, the first and most important change to our process was creating up-to-date accessibility testing documentation.
Whereas previously we primarily used a checklist with some useful tools, now we have a document that highlights when testing should be carried out, tasks that must be completed, who can do it, specific tools that should be used, and how to record test results.
This means that everyone is on the same page and that we get a consistent output from our tests.
Automated accessibility testing
The next change was adopting automated testing for accessibility, specifically pa11y.
Many accessibility checks can be completed efficiently by running them automatically, and then having a human review the report. We have used tools for this in the past, but this was our first time with pa11y, and it suited our needs well.
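To give a flavour of what that looks like (a sketch rather than our exact setup – the URL below is a placeholder), pa11y ships as an npm package with a simple command-line interface:

```shell
# Install the pa11y command-line tool (assumes Node.js and npm are available)
npm install -g pa11y

# Check a page against the default standard (WCAG2.0 AA)
# and print any issues found to the terminal
pa11y https://www.example.com
```

Each reported issue includes the relevant success criterion, a description, and a CSS selector pointing at the offending element, which makes problems easy to locate in the markup.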
Pa11y can run tests against multiple standards (WCAG2.0, Section 508) at the same time, and at multiple screen sizes for responsive designs. It generates reports that document the testing process, which we then upload to the relevant story in our task tracker.
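As a sketch of how that can be configured, pa11y also accepts a JSON config file (passed with the CLI's `--config` flag); the standard and viewport values below are illustrative assumptions rather than our actual settings:

```json
{
  "standard": "Section508",
  "viewport": {
    "width": 320,
    "height": 480
  }
}
```

Swapping the viewport values and re-running lets you repeat the same checks at each breakpoint of a responsive design, keeping the reports comparable between runs.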
Ultimately, automated tests can never replace manual testing – there are still a lot of things that a real person needs to check (we build websites for people, not machines, right?) – but it’s clear that they have a worthy place in any accessibility testing suite.
With the above changes and additions in mind, our updated accessibility testing process now looks something like this:
- Validate HTML
- Run the automated test
- Run through the manual checklist
- Attach documents to story
- Developer review of reports / checklist
- Re-run automated test to show pass
- Attach passing reports
This is more structured and stringent than before, and crucially provides both us and our client with better documentation of the issues we’ve identified and fixed.
One of the goals here was to provide the client with evidence that we had actually delivered on what we’d promised.
And we made a beautiful website – read the Rock & Roll Hall of Fame case study.
Always room for improvement
As the Rock Hall project moved forward, the process we conceived evolved from an initial set of ideas into something more mature.
It’s not the final stage for our accessibility testing workflow, but I think what we came away with is a solid foundation to improve upon and we already have some ideas about how to enhance and streamline our process further.
If you’re looking to improve your accessibility testing, you should also consider things I didn’t cover in this post, like using the site with a screen reader, and ensuring content entered in the CMS is accessible too.
Useful tools for checking accessibility
I’ve listed some of my favourite tools for checking and debugging accessibility issues below.
If you want a good starting point for a manual checklist, I highly recommend looking at the WebAIM WCAG2.0 checklist (http://webaim.org/standards/wcag/checklist). They also have a Section 508 checklist (http://webaim.org/standards/508/checklist), and while these aren’t official documents they are useful for getting to grips with pass/fail criteria in simple language.
- Pa11y (http://pa11y.org/)
- Wave Chrome Extension (http://wave.webaim.org/extension/)
- W3C HTML Validator (http://validator.w3.org)
- Contrast checker (http://contrastchecker.com/)
- Accessibility Developer Tools Chrome extension (https://chrome.google.com/webstore/detail/accessibility-developer-t/fpkknkljclfencbdbgkenhalefipecmb?hl=en)