I’ve added the six new WCAG 2.2 Level A and AA success criteria (SCs) to the “WCAG Success Criterion” column in each component’s tab. This brings the total number of SCs for an audit to 55. (4.1.1 Parsing is deprecated in WCAG 2.2 but is still included for WCAG 2.1 audits.) Each new SC is highlighted in light green and includes “NEW in 2.2” in its name.
If you’re not ready to test for WCAG 2.2 SCs, just mark them as N/A.
WCAG failure tracking
I’ve added rudimentary tracking of the number of WCAG violations found for each component, displayed in a new “WCAG Violations” column on the “Scope” tab. Each cell counts the number of “Fails” selected in the “Status” column of the corresponding component’s tab.
On the “Overview” tab, you can see a running total of all WCAG violations found for all components in the audit.
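As a sketch of how such a count can be implemented (the tab name and column letter below are assumptions, not necessarily the spreadsheet’s actual layout):

    =COUNTIF('Component Name'!C:C, "Fail")

The Overview total can then be a simple SUM over the Scope tab’s “WCAG Violations” column, e.g. =SUM(Scope!D2:D20).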
Prioritizing issues
I’ve also added a “Priority” column to each component’s tab. Each SC can be assigned a priority on a five-point scale:
Critical – Blockers that prevent someone from completing a task, e.g. a button that cannot be used with a keyboard
High – Issues that present a significant barrier to someone completing a task, e.g. application times out without allowing the user to extend it
Medium – (Most issues) Issues that prevent some users from fully accessing the information or interface, e.g. text with poor color contrast
Low – Issues that present an unnecessary barrier but do not prevent a user from completing a task, e.g. image text that isn’t sufficiently descriptive
Best Practice – Issues that present some accessibility barriers but are not considered failures under the Web Content Accessibility Guidelines (WCAG), e.g. using a button instead of a link to open a webpage in a browser
If you have any questions about the spreadsheet or additions you’d like to see, please contact me.
I created this bookmarklet for testing WCAG success criterion 1.3.5 Identify Input Purpose (AA), which requires that if an input field requests personal information about the user, we must apply the appropriate autocomplete value to the field. For example, an input field requesting the user’s full name would have an attribute of autocomplete="name". This allows the browser to attempt to autofill the input field with values previously entered for the same information. This autofill behavior is very much browser-specific.
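In markup, that can look like the following (a generic illustration, not taken from any particular site):

    <label for="fullname">Full name</label>
    <input id="fullname" name="fullname" type="text" autocomplete="name">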
This bookmarklet checks the webpage for any input fields that contain the autocomplete attribute. If none are found, it returns an alert message.
If autocomplete attributes are found, the script returns the values and displays them in the context of each input field.
This makes it easy to determine if input fields have autocomplete attributes defined and if they are valid. Click the bookmarklet again to remove the overlay text.
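A minimal sketch of the approach (not the bookmarklet’s exact code; the overlay class name and styling here are arbitrary):

    javascript:(function () {
      // Toggle: if overlays from a previous run exist, remove them and stop.
      var old = document.querySelectorAll('.ac-overlay');
      if (old.length) { old.forEach(function (el) { el.remove(); }); return; }
      // Find form fields that declare an autocomplete value.
      var fields = document.querySelectorAll('input[autocomplete], select[autocomplete], textarea[autocomplete]');
      if (!fields.length) { alert('No autocomplete attributes found on this page.'); return; }
      fields.forEach(function (field) {
        // Label each field with its autocomplete value, in context.
        var tag = document.createElement('span');
        tag.className = 'ac-overlay';
        tag.textContent = 'autocomplete="' + field.getAttribute('autocomplete') + '"';
        tag.style.cssText = 'background:#ffc;color:#000;border:1px solid #000;font:12px monospace;padding:2px;';
        field.insertAdjacentElement('afterend', tag);
      });
    })();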
Automated accessibility testing tools cannot test all of the Web Content Accessibility Guidelines (WCAG) success criteria. Adrian Roselli tested four free tools and compared them to his manual testing results. My intent is to add to the body of knowledge by providing results from the Level Access AMP accessibility testing tool. Level Access also provides a free browser extension called Access Assistant that runs the same tests.
I wanted to compare Level Access’s tools to Adrian’s manual testing findings because these are the tools we use at my work, and I share his concern that too many stakeholders lean on automated testing even though it uncovers only a portion of potential problems.
Process
I tested the same page, web.dev, and performed a review against WCAG 2.1 Level A and Level AA using Level Access AMP and the Access Assistant browser extension.
I performed the tests on 19 January 2023 with a live version of the site. It does not seem to have changed since Adrian’s testing on 14 January 2023 with these four automated tools:
axe DevTools v4.47.0 browser extension (using axe-core v4.6.2) for Chrome and Firefox
ARC Toolkit
WAVE Evaluation Tool
Equal Access Accessibility Checker
All references to manual testing results are Adrian’s data. I did not find anything additional in my manual testing.
Highlights
Number of success criteria failed
Tool     Total    A   AA
Manual      18   11    7
AMP          5    5    0
axe          2    2    0
ARC          3    3    0
WAVE         0    0    0
EAAC         3    3    0
Compare this to the total number of unique failures: some issues have multiple instances but are counted only once, and some success criteria have multiple distinct issues.
Number of success criteria failures
Tool     Total    A   AA
Manual      37   24   11
AMP          4    4    0
axe          2    2    0
ARC          3    3    0
WAVE         0    0    0
EAAC         5    5    0
Some tools, including AMP, provide alerts for issues to check manually.
Number of alerts
Tool     Alerts
Manual        0
AMP          10
axe           0
ARC           7
WAVE          6
EAAC         20
Raw results
This section presents the output from AMP for the various success criteria. It also includes alerts, or what AMP calls “Needs Review” issues.
WCAG failures
The following table compares WCAG Level A failures between manual testing and AMP automated results.
Comparing WCAG Level A manual and automated test results
WCAG 2.1 SCs at Level A (Manual result / AMP result):

1.1.1 Non-text Content: Fail / Fail
AMP output: This svg element does not have a mechanism that allows an accessible name value to be calculated.

1.2.3 Audio Description or Media Alternative (Prerecorded): N/A

1.3.1 Info and Relationships: Fail / Fail
AMP output: This A does not have a ul element (without an ARIA-assigned role); ol element (without an ARIA-assigned role); an element with a role set to the value: list as a parent; or a ul element (without an ARIA-assigned role), ol element (without an ARIA-assigned role) or element with a role set to the value ‘list’ with an aria-owns attribute set to the ID of the element in the same DOM
Rule: Ensure list items are found in a list container
AMP output: The role attribute value of ‘listitem’ given to this A is not allowed. The element’s role attribute should be set to one of the following text values: button | checkbox | menuitem | menuitemcheckbox | menuitemradio | radio | tab | switch | treeitem; or the role attribute can be removed
Rule: Ensure ARIA roles, states, and properties are valid
NOTE: The rule “Ensure list items are found in a list container” failed in AMP for two success criteria, both 1.3.1 Info and Relationships and 4.1.1 Parsing.
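To make the pattern behind that rule concrete, here is a reduced, hypothetical example (not web.dev’s actual markup) of a link that would trigger both failures, along with a conventional fix using native list markup:

    <!-- Flagged: role="listitem" on a link, with no list container as its parent -->
    <nav>
      <a href="/css" role="listitem">css</a>
    </nav>

    <!-- One conventional fix: native list markup, no ARIA roles needed -->
    <nav>
      <ul>
        <li><a href="/css">css</a></li>
      </ul>
    </nav>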
AMP did not fail any WCAG Level AA success criteria, so I am not including that table of results. Manual testing found 11 unique failures at Level AA.
Alerts
AMP provides additional potential issues as a list of “Needs Review” items. These alerts may or may not be WCAG failures and require manual review to determine if there is an accessibility issue.
Avoid inappropriate use of ARIA roles, states, and properties, cites 4.1.2 Name, Role, Value. The A element has an aria-hidden attribute set to the value: true. [2 instances]
Provide valid, concise, and meaningful alternative text for image buttons, cites 1.1.1 Non-text Content and 4.1.2 Name, Role, Value. This button element has a suspicious accessible name value of: all. [1 instance]
Ensure link text is meaningful within context, cites 2.4.4 Link Purpose (In Context). This A element has a suspicious (i.e. lacks purpose or is >150 characters) calculated accessible name value of: css. [1 instance]
Provide synchronized captions for video (which includes audio) or other multimedia, cites 1.2.2 Captions (Prerecorded) and 1.2.4 Captions (Live). This video element does not have a track with kind=captions. [1 instance] (See the markup sketch after this list.)
Ensure heading level matches the heading’s visual importance/level, cites 1.3.1 Info and Relationships. [4 instances]
This article element contains an incorrect or missing heading level which may cause improper nesting within the document heading hierarchy.
This H3 creates an inappropriate jump in heading levels within the document heading hierarchy.
This H5 creates an inappropriate jump in heading levels within the document heading hierarchy.
This H1 creates an inappropriate jump in heading levels within the document heading hierarchy.
Provide an informative context-sensitive page title, cites 2.4.2 Page Titled. This title has a suspicious value. [1 instance]
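For reference, the captions alert above looks for a track element of kind="captions" inside the video element. A generic example (the file names are placeholders):

    <video controls>
      <source src="demo.mp4" type="video/mp4">
      <!-- A caption track this check would detect -->
      <track kind="captions" src="demo.en.vtt" srclang="en" label="English">
    </video>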
Access Assistant
The Access Assistant browser extension Quick Test returned nearly the same results as the AMP test, but as a list of issues with code snippets. Clicking the link for each issue displays an explanation of the issue but does not reference any rules or standards.
* denotes violations flagged by AMP:
Provide alternative text for images.*
Ensure text and images of text provide sufficient contrast.
Provide valid, concise, and meaningful alternative text for image buttons.
Ensure heading level matches the heading’s visual importance/level. [4 instances]
Ensure list items are found in a list container. [28 instances]*
Ensure all active elements receive keyboard focus or can be activated with the keyboard.
Avoid placing inactive elements in the focus order. [2 instances]*
Provide an informative, context-sensitive page title.
Ensure link text is meaningful within context.
Ensure ARIA roles, states, and properties are valid. [28 instances]*
Avoid inappropriate use of ARIA roles, states, and properties. [2 instances]
Provide synchronized captions for video (which includes audio) or other multimedia.
The Quick Test found one issue to check that was not flagged by AMP as either a violation or alert: Ensure text and images of text provide sufficient contrast.
Conclusion
The automated testing results from the Level Access tools are comparable to those of the other automated tools Adrian tested; manual testing found more than 9x the unique success criteria issues. Use automated testing tools in tandem with manual testing to find the most potential accessibility issues. Relying on any automated testing alone will leave you with access gaps for your users.
Update: 27 January 2023
I’ve updated the Highlights data tables to reflect Adrian’s findings for the four automated tools he tested. For the sample, Access Assistant found more issues than WAVE Evaluation Tool, axe DevTools, and ARC Toolkit, and fewer than Equal Access Accessibility Checker.