
Comparing Level Access automated tools to manual accessibility testing

Last updated: 27 January 2023

This article is in response to Adrian Roselli’s article Comparing Manual and Free Automated WCAG Reviews. Go read it first for background.

Automated accessibility testing tools cannot test all of the Web Content Accessibility Guidelines (WCAG) success criteria. Adrian tested four free tools and compared them to his manual testing results. My intent is to add to the body of knowledge by providing results from the Level Access AMP accessibility testing tool. Level Access also provides a free browser extension that runs the same tests called Access Assistant.

I wanted to compare Level Access’s tools to Adrian’s manual testing findings because these tools are what we use at my work, and I share his concern that too many stakeholders lean on automated testing even though it uncovers only a portion of potential problems.

Process

I tested the same page, web.dev, and performed a review against WCAG 2.1 Level A and Level AA with the Level Access AMP tool and the Access Assistant browser extension.

I performed the tests on 19 January 2023 against a live version of the site. It does not appear to have changed since Adrian’s testing of the four automated tools (axe DevTools, ARC Toolkit, WAVE Evaluation Tool, and Equal Access Accessibility Checker) on 14 January 2023.

All references to manual testing results are Adrian’s data. I did not find anything additional in my manual testing.

A screenshot of the web.dev page with a navigation bar across the top and a search icon. Below it are some introductory text, a looping video, a decorative image, and an image with the text “case study”.
web.dev webpage – dark mode

Highlights

Number of success criteria failed
Tool Total A AA
Manual 18 11 7
AMP 5 5 0
axe 2 2 0
ARC 3 3 0
WAVE 0 0 0
EAAC 3 3 0

Compare this to the total number of unique failures: some issues have multiple instances but are counted only once, and some success criteria have more than one distinct issue.

Number of unique failures
Tool Total A AA
Manual 37 24 11
AMP 4 4 0
axe 2 2 0
ARC 3 3 0
WAVE 0 0 0
EAAC 5 5 0

Some tools, including AMP, provide alerts for issues to check manually.

Number of alerts
Tool Alerts
Manual 0
AMP 10
axe 0
ARC 7
WAVE 6
EAAC 20

Raw results

This section provides AMP’s output for the various success criteria. It also includes alerts, or what AMP calls “Needs Review” issues.

WCAG failures

The following table compares WCAG Level A failures between manual testing and AMP automated results.

Comparing WCAG Level A manual and automated test results
WCAG 2.1 SCs at Level A Manual AMP
1.1.1 Non-text Content Fail
Fail
  1. This svg element does not have a mechanism that allows an accessible name value to be calculated.
    • Rule: Provide alternative text for images
    • 1 instance
    • <svg viewBox="0 0 238 36" fill="currentColor" height="36" width="238" xmlns="http://www.w3.org/2000/svg">...</svg>
1.2.1 Audio-only and Video-only (Prerecorded) Pass
1.2.2 Captions (Prerecorded) N/A
1.2.3 Audio Description or Media Alternative (Prerecorded) N/A
1.3.1 Info and Relationships Fail
Fail
  1. This A does not have a ul element (without an ARIA-assigned role); ol element (without an ARIA-assigned role); an element with a role set to the value: list as a parent; or a ul element (without an ARIA-assigned role), ol element (without an ARIA-assigned role) or element with a role set to the value ‘list’ with an aria-owns attribute set to the ID of the element in the same DOM
    • Rule: Ensure list items are found in a list container
    • 28 instances
    • <a role="listitem" href="/new-patterns-july-2022/" data-category="web.dev" data-action="click" class="card card-vertical"a>...</a>
1.3.2 Meaningful Sequence Pass
1.3.3 Sensory Characteristics N/A
1.4.1 Use of Color Fail
1.4.2 Audio Control N/A
2.1.1 Keyboard Pass
Fail
  1. This A is focusable and has an aria-hidden attribute set to true
    • Rule: Avoid placing inactive elements in the focus order
    • 2 instances
    • <a aria-hidden="true" href="/interop-2022-wrapup/"></a>
    • <a aria-hidden="true" href="/web-platform-12-2022/"></a>
2.1.2 No Keyboard Trap Pass
2.1.4 Character Key Shortcuts N/A
2.2.1 Timing Adjustable N/A
2.2.2 Pause, Stop, Hide Fail
2.3.1 Three Flashes or Below Threshold Pass
2.4.1 Bypass Blocks Pass
2.4.2 Page Titled Fail
2.4.3 Focus Order Fail
2.4.4 Link Purpose (In Context) Pass
2.5.1 Pointer Gestures N/A
2.5.2 Pointer Cancellation Pass
2.5.3 Label in Name Fail
2.5.4 Motion Actuation N/A
3.1.1 Language of Page Pass
3.2.1 On Focus Pass
3.2.2 On Input Fail
3.3.1 Error Identification Fail
3.3.2 Labels or Instructions Fail
4.1.1 Parsing N/A
4.1.2 Name, Role, Value Fail
Fail
  1. The role attribute value of ‘listitem’ given to this A is not allowed. The element’s role attribute should be set to one of the following text values: button | checkbox | menuitem | menuitemcheckbox | menuitemradio | radio | tab | switch | treeitem; or the role attribute can be removed
    • Rule: Ensure ARIA roles, states, and properties are valid
    • 28 instances
    • <a role="listitem" href="/new-patterns-july-2022/" data-category="web.dev" data-action="click" class="card card-vertical">...</a>

NOTE: The rule “Ensure list items are found in a list container” failed in AMP for two success criteria, both 1.3.1 Info and Relationships and 4.1.1 Parsing.
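One way to address both the list-container failure and the invalid listitem role failure above is to use a native list and drop the ARIA roles entirely. A minimal sketch, assuming a hypothetical wrapper (this is my illustration, not the site’s actual fix):

  <ul class="card-grid">
    <li>
      <a href="/new-patterns-july-2022/" class="card card-vertical">...</a>
    </li>
    <!-- remaining cards follow as additional li elements -->
  </ul>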

AMP did not fail any WCAG Level AA success criteria, so I am not including that table of results. Manual testing found 11 unique Level AA failures.

Alerts

AMP provides additional potential issues as a list of “Needs Review” items. These alerts may or may not be WCAG failures and require manual review to determine if there is an accessibility issue.

  1. Avoid inappropriate use of ARIA roles, states, and properties, cites 4.1.2 Name, Role, Value. The A element has an aria-hidden attribute set to the value: true. [2 instances]
  2. Provide valid, concise, and meaningful alternative text for image buttons, cites 1.1.1 Non-text Content and 4.1.2 Name, Role, Value. This button element has a suspicious accessible name value of: all. [1 instance]
  3. Ensure link text is meaningful within context, cites 2.4.4 Link Purpose (In Context). This A element has a suspicious (i.e. lacks purpose or is >150 characters) calculated accessible name value of: css. [1 instance]
  4. Provide synchronized captions for video (which includes audio) or other multimedia, cites 1.2.2 Captions (Prerecorded) and 1.2.4 Captions (Live). This video element does not have a track with kind=captions. [1 instance] (A minimal markup sketch follows this list.)
  5. Ensure heading level matches the heading’s visual importance/level, cites 1.3.1 Info and Relationships. [4 instances]
    • This article element contains an incorrect or missing heading level which may cause improper nesting within the document heading hierarchy.
    • This H3 creates an inappropriate jump in heading levels within the document heading hierarchy.
    • This H5 creates an inappropriate jump in heading levels within the document heading hierarchy.
    • This H1 creates an inappropriate jump in heading levels within the document heading hierarchy.
  6. Provide an informative context-sensitive page title, cites 2.4.2 Page Titled. This title has a suspicious value. [1 instance]
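For alert 4, this is a minimal sketch of a video element with a captions track; the file names are hypothetical:

  <video controls>
    <source src="promo.mp4" type="video/mp4">
    <!-- A captions track addresses the kind=captions check -->
    <track kind="captions" src="promo-captions.vtt" srclang="en" label="English">
  </video>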

Access Assistant

The Access Assistant browser extension Quick Test returned nearly the same results as the AMP test but as a list of issues with code snippets. Accessing the link for each issue displays an explanation of the issue but does not reference any rules or standards.

Screenshot of the Access Assistant browser extension window that displays quick test results for the URL web.dev in a list, the first issue being “Provide alternative text for images” with a code example.
Access Assistant extension for Chrome

* denotes violations flagged by AMP:

  • Provide alternative text for images.*
  • Ensure text and images of text provide sufficient contrast.
  • Provide valid, concise, and meaningful alternative text for image buttons.
  • Ensure heading level matches the heading’s visual importance/level. [4 instances]
  • Ensure list items are found in a list container. [28 instances]*
  • Ensure all active elements receive keyboard focus or can be activated with the keyboard.
  • Avoid placing inactive elements in the focus order. [2 instances]*
  • Provide an informative, context-sensitive page title.
  • Ensure link text is meaningful within context.
  • Ensure ARIA roles, states, and properties are valid. [28 instances]*
  • Avoid inappropriate use of ARIA roles, states, and properties. [2 instances]
  • Provide synchronized captions for video (which includes audio) or other multimedia.

The Quick Test found one issue to check that was not flagged by AMP as either a violation or alert: Ensure text and images of text provide sufficient contrast.

Conclusion

The automated testing results from the Level Access tools are comparable to those of the other automated tools Adrian tested, with manual testing finding more than 9x the unique success criteria issues. Use automated testing tools in tandem with manual testing to uncover the most potential accessibility issues. Relying on automated testing alone will leave access gaps for your users.

Update: 27 January 2023

I’ve updated the Highlights data tables to reflect Adrian’s findings for the four automated tools he tested. For the sample, Access Assistant found more issues than WAVE Evaluation Tool, axe DevTools, and ARC Toolkit, and fewer than Equal Access Accessibility Checker.

Component library accessibility audit

The first project my manager tasked me with at my new job as a senior accessibility engineer in the Design Engineering organization was an accessibility audit of the component library our team provides to the engineering team that codes the dotcom website. These components are generally page level rather than UI level; think a card or a form. In total, I audited 28 components in the context of the component library, not as they have been implemented on the dotcom site.

Auditing solo components

Auditing components is necessary—all code should be tested for accessibility standards—but presenting a component alone on an empty page in the context of a component library has limitations.

Placeholder content

I don’t like working with placeholder content. Design should follow content creation, so examples should be able to use real text and images. Instructions about how to create content should be presented separately, along with instructions for how to mark up content, e.g. “Optional Header” is an <h2>.

Announcing Optional Header (60 chars) The main body of the banner should be used to display the content of the banner's messages, supported by the header. (150 char max)
An example banner from the component library

This is the issue I created for this example banner which is HTML text over a background image:

In banner variations with text on a lifestyle background image, when zoomed in up to 400% at a viewport width of 1280px, some text may overlap portions of the image that do not provide enough color contrast.

It’s difficult for me to tell if this is just a bad example image or a true color contrast concern.

No context

This complication popped up on the first component I tested: Alerts. The library page presented these message boxes on a blank page. I wasn’t sure if the message boxes were supposed to be there on page load or if something triggered them. This matters because the message box containers have the role="alert" attribute.

two example alerts with an icon, message text and learn more link
Example alerts from the component library

The alert role is a type of status message. It’s supposed to be assigned to a container that is empty on page load. Then, when something happens on the page, the alert is loaded into the empty container and immediately announced to assistive technology because of the alert role. It’s not designed to include a call to action like “learn more” links.
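A minimal sketch of that expected pattern, assuming a hypothetical container ID and message text:

  <!-- Empty on page load; nothing is announced yet -->
  <div id="message-region" role="alert"></div>

  <script>
    // Later, when something actually happens (for example, a submission fails),
    // inject the message. The alert role makes assistive technology announce it
    // immediately, with no call-to-action link inside the container.
    document.getElementById("message-region").textContent =
      "Your order could not be processed.";
  </script>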

I looked at the dotcom production site and saw an alert displayed on page load. Because it has the alert role, assistive technology announces the content of this message box before anything else, before even the name of the webpage or website. It also announces the message box content twice because it is the first thing on the page.

I decided someone got carried away with the ARIA and advised that they remove the role="alert" attribute because these message boxes do not fit the expected design pattern for an alert.

Not our problem

In The Book on Accessibility chapter “Accessibility Coaching Guide”, the section “Not our problem” covers one of the pitfalls of depending solely on component library accessibility: The component library is accessible, so the development team doesn’t think it has to worry about anything.

While individual components can be completely accessible, they can be used in patterns that are completely inaccessible.

A helpful analogy is building a wall with bricks. Individually, bricks are quite strong, but they can be arranged in a way that is fragile and weak.

This is ultimately why I was tasked with this audit. Our team wants to ensure we are not introducing any accessibility issues in the code we provide. Then we can trace any accessibility issues in production to either the dotcom engineering team’s implementation of these components or to the content entry team.

Example component audit results

The format I used in my report was to provide a bulleted list of accessibility issues I found. I did not note the specific WCAG success criteria affected because I didn’t think the team needed to know that information.

Below the list of issues, I have a “Recommendations” heading where I repeat the same list, this time with advice for how to remedy each issue. Below that is an optional “Resources” section where I link to articles or documentation that support the remediation advice.

Carousel

The carousel does not follow the expected design pattern.

A carousel with a banner image and text. Controls include a play/pause button, tabs for each slide, and previous/next buttons.
An example carousel from the component library
  1. The carousel is missing the expected role and accessible name.
  2. Slides and slide picker controls are missing the expected role and accessible name.
  3. Carousel controls are located after the slide content.
  4. The “play/pause” control has a confusing accessible name.
  5. The slide picker controls do not have sufficient contrast with the page background.
    • Foreground: #999898
    • Background: #FFFFFF
    • Contrast ratio: 2.88:1
  6. “Previous” and “next” controls do not have appropriate accessible names.
  7. The carousel does not stop advancing when a keyboard user activates the “previous” or “next” controls.
  8. Visible text beneath the slide heading is hidden from assistive technology.
  9. The “Call to action” control is a <button> element inside a link.
  10. Decorative slide images are announced by assistive technology.
  11. Hidden slide content is accessible to assistive technology.

Recommendations

  1. Add the aria-roledescription="carousel" attribute to the <section> element used to mark up the carousel container. Provide an accessible name with the aria-label attribute. (A minimal sketch covering this and several other recommendations follows this list.)
  2. Mark up slides and slide picker controls with tabpanel and tab roles with accessible names. See example. This includes enabling arrow keys to switch between slide tabs.
  3. Ensure carousel controls get keyboard focus before slide content. Group the “play/pause”, “previous” and “next” controls.
  4. When changing the name of a control depending on its state, do not use a toggle control. Remove the aria-pressed attribute from the “play/pause” control.
  5. Ensure slide picker controls have at least 3:1 contrast with the background.
  6. Remove the title and role attributes from the button <svg> elements for the “previous” and “next” controls; add the aria-hidden="true" attribute to hide them from assistive technology. Use the aria-label attribute on the button to provide the control with an accessible name.
  7. The carousel should stop advancing when any part of it has keyboard or mouse focus.
  8. Remove the aria-hidden="true" attribute from the visible slide text so that it is conveyed by assistive technology.
  9. Use either a link or a button for the “call to action” but not both.
  10. Ensure decorative slide images are hidden from assistive technology by providing an empty alt attribute.
  11. When a slide is visually hidden, it should also be hidden from assistive technology. This can be achieved by using the display:none CSS property on hidden slides or by adding the aria-hidden="true" attribute.
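Here is that sketch, covering recommendations 1, 2, 3, 6 and 11. The element names, labels and IDs are hypothetical, not the library’s actual code, and the scripting for arrow-key handling between tabs is omitted:

  <section aria-roledescription="carousel" aria-label="Featured promotions">
    <!-- Grouped controls placed before the slide content (recommendations 1 and 3) -->
    <div role="group" aria-label="Carousel controls">
      <button type="button" aria-label="Pause slide rotation">
        <svg aria-hidden="true" focusable="false" viewBox="0 0 24 24">...</svg>
      </button>
      <button type="button" aria-label="Previous slide">
        <svg aria-hidden="true" focusable="false" viewBox="0 0 24 24">...</svg>
      </button>
      <button type="button" aria-label="Next slide">
        <svg aria-hidden="true" focusable="false" viewBox="0 0 24 24">...</svg>
      </button>
    </div>
    <!-- Slide picker as tabs (recommendation 2) -->
    <div role="tablist" aria-label="Choose a slide">
      <button type="button" id="tab-1" role="tab" aria-selected="true" aria-controls="slide-1">1</button>
      <button type="button" id="tab-2" role="tab" aria-selected="false" aria-controls="slide-2" tabindex="-1">2</button>
    </div>
    <div id="slide-1" role="tabpanel" aria-labelledby="tab-1">...</div>
    <!-- The hidden attribute also removes the slide from assistive technology (recommendation 11) -->
    <div id="slide-2" role="tabpanel" aria-labelledby="tab-2" hidden>...</div>
  </section>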

Resources

Common issues

Overall, the issues I found were pretty typical. It’s obvious the people working on this component library have some accessibility knowledge and tried to create an accessible experience, but they likely did not do adequate testing with assistive technology, like a screen reader.

  1. Multiple ARIA issues with controls missing the expected roles and accessible names
  2. Decorative images and icons not hidden from assistive technology
  3. Information not available in smaller viewports or when zoomed to 400% at 1280px wide
  4. Color contrast issues with both text (4.5:1) and control borders (3:1)
  5. Controls that are not keyboard accessible

Conclusion

Testing a component library is challenging when it presents placeholder content without surrounding content for context. Testing the structure of a component is good for catching ARIA and resize issues but has limited value in ensuring the resulting website is accessible. Remember to test a representative sample of pages from your website that use each of the library components with real content. What matters is how accessible your final content is.

5 neurodivergent UX fails while buying Alamo Drafthouse tickets online

I went to the movies this week for the first time since the pandemic began. I looked up showings at the Alamo Drafthouse and discovered little has improved with their payment process since I reviewed it on desktop back in 2016. This time, I completed the purchase on an iPhone using the responsive mobile website in the Firefox browser.

While I found several accessibility issues with the site, I’m highlighting concerns with the “Payment” screen. I’m neurodivergent and this post focuses on five things that cause me anxiety and make the experience frustrating:

  1. Required fields are not marked
  2. Submit button is disabled
  3. Error messages don’t offer suggestions
  4. Data formats are placeholder text
  5. Optional checkbox is already checked

Required fields are not marked

I’ve done enough online ordering to assume that all the credit card-related fields are required, but many people will not know that. E-commerce research suggests that marking all fields, required or optional, improves the customer experience. It certainly lessens my anxiety to know exactly which fields to complete.

screenshot of the payment screen with form fields with placeholder text for Card Number, Cardholder Name, EXP, CVV and Zip code. The "Buy Tickets" button is disabled.

Not only are required fields not clearly marked, but merely interacting with a field causes an angry, red “Required” message to appear. (This does not happen for the “EXP” field even though it is required.) These input fields combine the HTML required attribute with an aria-describedby attribute pointing at the error message, which causes assistive technology to announce that fields are required multiple times.
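A rough sketch of the pattern described, with a hypothetical ID and field (not the site’s actual code):

  <input type="text" placeholder="Card Number" required aria-describedby="card-number-error">
  <!-- The required attribute already makes assistive technology say "required";
       pointing aria-describedby at a message that also reads "Required"
       repeats the announcement -->
  <span id="card-number-error">Required</span>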

screenshot of the payment screen with red error messages denoting several fields are required

Submit button is disabled

From the previous screenshots, we can see the next issue that causes me a lot of anxiety when using a website. The “Buy Tickets” button, which is the submit button for the form, is disabled by default. The button becomes enabled only after data has been entered into all the required form fields, which are not clearly marked.

There are two more required form fields below the credit card fields, but they are easy to miss because they sit behind the sticky footer that holds the “Buy Tickets” button. That means that after entering all the credit card details, the button is still disabled.

Error messages don’t offer suggestions

The default error message for empty fields is “Required”. If a user enters data in the wrong format, the error message changes to “Invalid”. Neither message helps the user figure out how to fix the error.

screenshot of the payment screen with bad data entered into several fields which each have an error message of invalid.

Here are some examples of helpful error messages:

  • Card Number: Please enter 16 digits
  • EXP: Please enter 2-digit year
  • CVV: Please enter 3 digits
  • Zip Code: Please enter 5-digit US zip code

“Zip Code” is the only field requesting numerical data that displays the numerical keyboard on mobile devices. Adding the inputmode="numeric" attribute to every field requesting numerical data will display the numerical keyboard too, which improves the accuracy of data entered into these fields.
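For example, assuming the fields look roughly like this (the site’s actual markup may differ), inputmode can sit alongside the autocomplete attributes the site already uses:

  <!-- inputmode="numeric" requests the numeric keyboard on mobile
       without the spinner or validation quirks of type="number" -->
  <input type="text" inputmode="numeric" autocomplete="cc-number" placeholder="Card Number">
  <input type="text" inputmode="numeric" autocomplete="cc-csc" placeholder="CVV">
  <input type="text" inputmode="numeric" autocomplete="postal-code" placeholder="Zip Code">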

Screenshot of the payment screen with focus in the Zip Code field which displays the 10-key numerical keyboard on mobile devices

Data formats are placeholder text

The requested data formats for all of the “Payment” screen fields are implemented as placeholder text. This means that once a person starts to enter data into a field, the required formatting of that data disappears. People are forced to recall from memory how to enter the data correctly. On top of this, the “EXP” and “CVV” fields accept more digits than the data format allows.
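One common alternative, not what the site does today, is to keep the format hint visible outside the field and connect it with aria-describedby, for example (the wording and maxlength here are hypothetical):

  <label for="cvv">CVV</label>
  <input id="cvv" type="text" inputmode="numeric" autocomplete="cc-csc" maxlength="3" aria-describedby="cvv-format">
  <!-- The hint stays visible while typing and is announced along with the field -->
  <span id="cvv-format">3 digits on the back of your card</span>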

The video below demonstrates what a person using assistive technology, like a screen reader, experiences when exploring the form. Notice how placeholder data are not consistently announced by VoiceOver.

Optional checkbox is already checked

Following the credit card-related fields is the “Email Confirmation” section, which includes a checkbox that is already checked:

Join Alamo Victory

By checking “Join Alamo Victory” you start earning visits with this purchase for rewards and you agree to Alamo Drafthouse Cinema’s terms of use.

Email confirmation section with email address and confirm email address fields followed by a checkbox that is already checked for Join Alamo Victory

This is an optional field; I should have to choose to check it. Because of its location behind the sticky footer, it’s very likely people will not see this checkbox and will inadvertently join this program. Having to watch out for sneaky UI patterns like this makes for a bad experience.

Conclusion

This website has made some improvements, like providing a “Back” button after a timeout and inline form field validation. I also give the developers kudos for appropriately implementing the autocomplete attribute on the credit card-related fields. I’d make the following changes to the “Payment” screen to create a better experience for neurodivergent people:

  1. Clearly indicate which form fields are required
  2. Don’t disable the submit button
  3. Offer suggestions for fixing data input errors
  4. Display required data formats at all times
  5. Don’t pre-check checkboxes for optional promotions