
HTML stickers are here

This retro sticker features the <html> element in block letters printed on a holographic background. As the light moves, the reflected gradient on the sticker shifts through the color spectrum. Just 2 inches wide and half an inch tall.

Contact me with your address to get one.

Preparing for an accessibility test

This is the first article in a series about how to run an accessibility test. The companion video and information on performing an accessibility test are available from the Accessibility Testing project page.

If you’re new to accessibility, check out the many resources at the W3C Web Accessibility Initiative (WAI). A good, free foundational course is the W3C Introduction to Web Accessibility.

Introduction

When I started learning about testing for web accessibility, I came across a lot of checklists of what to test for, but I struggled with two things. One, I didn’t understand the Web Content Accessibility Guidelines (WCAG) well enough to know whether I was covering everything. Two, I didn’t know where on the page to start testing and how much to test at a time.

The answer to both of these is to start somewhere. A little bit of accessibility testing of what you understand is better than no testing. I like to think of accessibility as a spectrum from less accessible to more accessible. Our goal is to improve access to our websites—a progressive and ongoing task. Accessibility is a huge discipline with many facets that takes time to do well. I hope this project helps ease you into the process.

Methodology

At the beginning of each test, we must establish what we are testing for, given the different versions and levels of WCAG. The industry standard is to test WCAG 2.1 Level A and AA, 50 discrete success criteria (SC); that is what this methodology follows. What I found overwhelming was deciding the best order in which to test each SC. Do I start at the top with 1.1.1 Non-text Content? What if an SC isn’t applicable to the webpage I’m testing?

I’ve created an accessibility testing spreadsheet to help with this.

Download the accessibility testing spreadsheet (Updated 26 January 2024)

The Overview sheet is a place to list the project name, date, WCAG version being tested, the testing environment (browser/screen reader combinations) and any tools used.

The Scope sheet is a place to track what you’ll be testing. In the Scope section below, I address how to break up a page into components, or chunks of content, for testing. Give each component an ID, a name, a description of how to locate it and the URL of the test page. Update the “Status” column to reflect whether the component is in testing, under review or completed.

Below I outline the columns in the Component sheet. For each component tested, create a clone of the Component sheet and rename it. The workbook includes two components to start.

Type

Each of the 50 SC is assigned a “Type” category. The spreadsheet is initially sorted by “Type”, grouping related SC into the following 12 categories:

  • Active Controls
  • Adaptability
  • Contrast
  • Dynamic Changes
  • Errors
  • Interactivity
  • Keyboard
  • Multimedia
  • Order and Focus
  • Page Level
  • Semantics
  • Sensory

Let’s look at the SC under “Active Controls” to understand how grouping SC makes testing easier:

  • 1.3.5: Identify Input Purpose
  • 2.4.4: Link Purpose (In Context)
  • 2.5.3: Label in Name
  • 3.2.1: On Focus
  • 3.2.2: On Input
  • 3.3.2: Labels or Instructions

We can see from the SC numbers that these are spread across a number of guidelines with gaps in the numerical order. While there’s nothing wrong with starting at 1.1.1, I find it easier to test related SC at the same time.

WCAG success criterion

This spreadsheet lists all 50 SC (rows 2-51). Each success criterion is linked to its W3C “Understanding” page which lists use cases and remediation resources. You can sort the “WCAG Success Criterion” column to reorder the SC from 1.1.1 to 4.1.3 instead of using the “Type” grouping.

Level

Each success criterion displays its WCAG level, A or AA. You can filter the “Level” column to display just A or just AA.

WCAG criterion description

The quick reference description of each success criterion is listed in the “WCAG Criterion Description” column. In part two, I discuss what each success criterion covers from a testing perspective and what to look for. In some cases, I’ve added examples or code snippets for reference. Take some time to read these over and understand what WCAG covers.

Status

The “Status” column has a pick list with three values for tracking your testing progress: N/A, Pass and Fail. I like to go through the 50 SC and mark any that don’t apply to the content I’m testing as “N/A”. For example, if there is no video or audio content on your testing page, you can eliminate six of the “Multimedia” SC from testing right away.

If you do not find any issues for a success criterion, mark it as “Pass”. If you find any issues at all for a success criterion, mark it as “Fail”. Filter the “Status” column to see only “Pass” or “Fail” results.

Issues

This is where you will write up any accessibility issues you find while testing the component. Let’s look at an example. You’re testing 2.4.4: Link Purpose (In Context). The test page lists several articles with “Read More” links. You test with a screen reader and they are all announced the same: read more. This means an assistive technology user cannot distinguish one link from the next.
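
For illustration, the underlying markup might look something like this (the article titles and URLs here are hypothetical); every link ends up with the identical accessible name “Read More”:

<article>
  <h3>Choosing a Pet ID Tag</h3>
  <a href="choosing-a-tag.html">Read More</a>
</article>
<article>
  <h3>Caring for Your Pet Tag</h3>
  <a href="tag-care.html">Read More</a>
</article>
<!-- A screen reader's links list announces both simply as "Read More" -->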

In the “Issues” column, write a concise description explaining:

  • What content has the issue (“Read More” links)
  • What the issue is (links don’t provide context)
  • Why it’s an issue (users of assistive technology can’t distinguish between “Read More” links)

For simplicity, I suggest writing up all issues you find with each success criterion in the same “Issues” cell instead of creating a new row for each issue.

Use the browser inspector (F12) to determine if there are issues with the HTML.

Screenshot: Firefox DevTools Inspector tab displaying HTML and CSS code.

Recommendations

In addition to identifying issues, we should provide advice on how to remediate them. This takes time: it’s only through the experience of testing different kinds of content that you learn the best ways to solve accessibility issues. You might want to write recommendations after completing all testing if you need to look up examples and research how to fix problems. Writing solid remediation advice comes with patience and practice.

You need a good understanding of how HTML and CSS work to display content, and a basic idea of how JavaScript controls page behavior, for example, enabling a button to open a menu. You don’t necessarily need to be able to write JavaScript, but you do need to be able to discuss expected behavior and how to achieve it programmatically, like adding keystroke events for the ENTER and SPACE keys to make a custom button keyboard accessible.
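
As a sketch of that last example, the keystroke handling might look like the following. This assumes a hypothetical menu toggle built from a <div>; a native <button> element would provide the click, ENTER and SPACE behavior for free:

<div id="menu-toggle" role="button" tabindex="0" aria-expanded="false">Menu</div>
<script>
  const toggle = document.getElementById('menu-toggle');

  function toggleMenu() {
    const open = toggle.getAttribute('aria-expanded') === 'true';
    toggle.setAttribute('aria-expanded', String(!open));
    // ...show or hide the menu itself here...
  }

  toggle.addEventListener('click', toggleMenu);
  toggle.addEventListener('keydown', (event) => {
    // Native buttons activate on both ENTER and SPACE; replicate that.
    if (event.key === 'Enter' || event.key === ' ') {
      event.preventDefault(); // keep SPACE from scrolling the page
      toggleMenu();
    }
  });
</script>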

Use the browser inspector (F12) to test proposed solutions by modifying the HTML and CSS. I address this further in part two.
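
For the “Read More” example above, one candidate fix you could paste into the inspector is visually hidden link text. The class name and CSS below are one common pattern, not the only approach:

<a href="choosing-a-tag.html">
  Read More<span class="visually-hidden"> about choosing a pet ID tag</span>
</a>
<style>
  /* Visually hidden but still exposed to screen readers */
  .visually-hidden {
    position: absolute;
    width: 1px;
    height: 1px;
    overflow: hidden;
    clip-path: inset(50%);
    white-space: nowrap;
  }
</style>

Each link is then announced with its context, e.g. “Read More about choosing a pet ID tag”.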

Source code

The browser inspector lets you copy a source code snippet from the HTML or CSS to paste into the “Source Code” column of the spreadsheet. This helps developers locate the issue in their code base for remediation.

<a href="article.html">Read More</a>

Environment

Industry standard is to test with two or more of these combinations:

  • Chrome browser and JAWS screen reader on Windows
  • Firefox browser and NVDA screen reader on Windows (the combination used in this demonstration)
  • Safari browser and VoiceOver screen reader on Mac

Below are the most common browser/screen reader combinations according to the WebAIM Screen Reader Survey #9 (May-June 2021).

Most common screen reader and browser combinations

  Screen Reader & Browser        # of Respondents   % of Respondents
  JAWS with Chrome               500                32.5%
  NVDA with Chrome               246                16.0%
  JAWS with Edge                 194                12.6%
  NVDA with Firefox              149                9.7%
  JAWS with Firefox              74                 4.8%
  VoiceOver with Safari          72                 4.7%
  NVDA with Edge                 55                 3.6%
  ZoomText/Fusion with Chrome    33                 2.1%
  JAWS with Internet Explorer    30                 1.9%
  VoiceOver with Chrome          24                 1.6%
  ZoomText/Fusion with Edge      18                 1.2%
  Other combinations             144                9.4%

Tools

I’m using a group of freely available tools for this demonstration, but I’m not endorsing any one in particular. It’s important to test with different tools to find out what works best for your situation.

A good way to learn about accessibility problems is to use an automated accessibility checker. These tools scan the source code and flag certain issues that can be tested for automatically, like color contrast. It’s important to verify the flagged issues yourself and add them to the spreadsheet only if they are valid.

For this demonstration, I’m using the axe DevTools Firefox extension. Once installed, it adds an “axe DevTools” tab to your browser DevTools (F12). In part two, we’ll run the scan and explore the results.

Screenshot: the axe DevTools tab in the Firefox DevTools panel.

I also use several other freely available tools to help me test for issues that automated checkers can’t detect.

Scope

It’s important to assess the webpage or website you’re going to test so you can break repeated elements out into their own components. This avoids testing the same thing on multiple pages, e.g. navigation. If you’re testing a single webpage, it can often be treated as a single component. But if you’re testing a website with multiple pages, save yourself some work by breaking up the content:

  • Header
  • Navigation (global and local)
  • Search
  • Sidebar
  • Footer
  • Main content

Within the main content, you might want to break up lengthy pages into smaller components like carousel, image gallery, form, video player, etc., depending on what content you’re testing. There’s no sense in testing the same kind of content over and over; you want a good sample of the different types of content found within the website. It’s a judgment call by you as the tester as to how large or small a component should be.
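
As a rough sketch, the page-level components listed above often correspond to HTML landmark regions like these, which also gives you ready-made names for the Scope sheet (the aria-label value is just an example):

<header>…site header: logo, utility links…</header>
<nav aria-label="Global">…site navigation…</nav>
<main>
  <form role="search">…search…</form>
  …main content: hero, carousel, form, video player…
</main>
<aside>…sidebar…</aside>
<footer>…site footer…</footer>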

Conclusion

You should now have a better idea of what to test and how to test it. In part two, I perform an accessibility test and track my results in the accessibility testing spreadsheet.

View the video: Preparing for an accessibility test

Read part two: Performing an accessibility test

Performing an accessibility test

Last updated 8 May 2023.

The companion videos and information on performing an accessibility test are available from the Accessibility Testing project page.

In part one, we looked at setting up our environment for accessibility testing, including configuring the accessibility testing spreadsheet. We scoped the homepage of the test site, BoomerangTags.com, into the following eight components to make testing easier:

  1. [Site] Header
  2. [Home page] Hero
  3. [Home page] Pet tags
  4. [Home page] Ratings
  5. [Home page] Video
  6. [Home page] Business information
  7. [Site] Footer
  8. [Site] Cookie settings

The [Site] components of Header, Footer and Cookie settings appear on every page of the site and need to be tested only once. Additionally, we have a “Common Issues” component for tracking any site-wide problems affecting multiple pages, like 2.4.1: Bypass Blocks.
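
For reference, 2.4.1 is commonly addressed with a “skip to main content” link as the first focusable element on every page. A minimal sketch, with a hypothetical id and class:

<body>
  <!-- Usually moved off-screen with CSS until it receives focus -->
  <a class="skip-link" href="#main-content">Skip to main content</a>
  <header>…site header and navigation…</header>
  <main id="main-content">…page content…</main>
</body>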

Explore the component

We begin our testing with the 01 [Site] Header component.

Screenshot: the BoomerangTags.com homepage site header with horizontal navigation, dropdown menus and a search widget with autocomplete.

Using a mouse, keyboard and screen reader, explore the component. We want to get an idea of what kinds of controls and widgets the header contains. We can also do a first pass for success criteria like 2.4.7: Focus Visible when we’re tabbing through the focusable elements with the keyboard. We can discover hidden menus that display on hover or widgets that expand by activating a button.
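
When checking 2.4.7 on that keyboard pass, you’re looking for a visible indicator along these lines; a common failure is CSS that removes the default outline without replacing it (the colors and widths below are arbitrary):

<style>
  /* Fails 2.4.7 if nothing replaces the removed outline */
  a:focus { outline: none; }

  /* A clearly visible indicator for keyboard focus */
  a:focus-visible,
  button:focus-visible {
    outline: 3px solid #1a5fb4;
    outline-offset: 2px;
  }
</style>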

The [Site] Header contains the following content:

  • Utility navigation with two links
  • Navigation landmark with a list of 10 items with links
    • Two links have menus
    • The last link in the list opens a search widget
  • Search field with autocomplete and a button
    • Autocomplete list contains several links
  • Shopping cart link

Test the success criteria

Do a first pass through the 50 success criteria (SC) to identify which ones are applicable to the component you’re testing. For [Site] Header, we were able to mark 21 SC as not applicable, either because the content doesn’t exist (video/audio) or because it’s out of scope (page level). We then ran automated and manual tests on the remaining 29 SC.

Fails (10)

  1. 1.1.1: Non-text Content
  2. 1.4.3: Contrast (Minimum)
  3. 1.4.11: Non-text Contrast
  4. 1.4.13: Content on Hover or Focus
  5. 2.1.1: Keyboard
  6. 2.4.3: Focus Order
  7. 2.4.4: Link Purpose (In Context)
  8. 2.4.7: Focus Visible
  9. 3.2.4: Consistent Identification
  10. 4.1.2: Name, Role, Value

Passes (19)

  1. 1.3.1: Info and Relationships
  2. 1.3.2: Meaningful Sequence
  3. 1.3.3: Sensory Characteristics
  4. 1.3.4: Orientation
  5. 1.4.1: Use of Color
  6. 1.4.4: Resize text
  7. 1.4.5: Images of Text
  8. 1.4.10: Reflow
  9. 1.4.12: Text Spacing
  10. 2.1.2: No Keyboard Trap
  11. 2.1.4: Character Key Shortcuts
  12. 2.3.1: Three Flashes or Below Threshold
  13. 2.4.6: Headings and Labels
  14. 2.5.2: Pointer Cancellation
  15. 2.5.3: Label in Name
  16. 3.2.1: On Focus
  17. 3.2.2: On Input
  18. 3.3.2: Labels or Instructions
  19. 4.1.1: Parsing

Conclusion

While we found failures for only 10 of the SC, we found more than 10 issues in total. For 1.4.3: Contrast (Minimum), we have five different examples of foreground text that does not have at least 4.5:1 contrast with the background color. For 4.1.2: Name, Role, Value, we have four different issues.
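
To make the 4.5:1 threshold concrete, here is a hypothetical pair; the ratios are approximate, derived from the WCAG relative luminance formula:

<style>
  .fails  { color: #999999; background: #ffffff; } /* ≈ 2.8:1, fails 1.4.3 */
  .passes { color: #595959; background: #ffffff; } /* ≈ 7.0:1, passes */
</style>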

Download the BoomerangTags.com testing spreadsheet to explore the results.