I’ve added the six new WCAG 2.2 Level A and AA success criteria (SC) to the “WCAG Success Criterion” column in each component’s tab. This brings the total number of SCs for an audit to 55. (4.1.1 Parsing is deprecated in 2.2 but still included for 2.1 audits.) Each new SC is highlighted in light green and includes “NEW in 2.2” in its name.
If you’re not ready to test for WCAG 2.2 SCs, just mark them as N/A.
WCAG failure tracking
I’ve added rudimentary tracking of the number of WCAG violations found for each component, displayed on the “Scope” tab in a new “WCAG Violations” column. Each cell counts the number of “Fail” statuses selected in the “Status” column of that component’s tab.
On the “Overview” tab, you can see a running total of all WCAG violations found for all components in the audit.
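In a spreadsheet, those counts can be implemented with COUNTIF and SUM. The formulas below are only a sketch of the approach — the sheet names and cell ranges ('Button'!D:D, E2:E20) are assumptions, not the spreadsheet’s actual references:

```
One "WCAG Violations" cell on the Scope tab (per component):
    =COUNTIF('Button'!D:D, "Fail")

Running total on the Overview tab:
    =SUM('Scope'!E2:E20)
```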
I’ve also added a “Priority” column to each component’s tab. Each SC can be assigned a priority on a five-point scale:
Critical – Blockers that prevent someone from completing a task, e.g. a button that cannot be used with a keyboard
High – Issues that present a significant barrier to someone completing a task, e.g. application times out without allowing the user to extend it
Medium – (Most issues) Issues that prevent some users from fully accessing the information or interface, e.g. text has poor color contrast
Low – Issues that present an unnecessary barrier but do not prevent a user from completing a task, e.g. image alternative text that isn’t sufficiently descriptive
Best Practice – Issues that present some accessibility barriers but are not considered failures under the Web Content Accessibility Guidelines (WCAG), e.g. using a button instead of a link to open a webpage in a browser
If you have any questions about the spreadsheet or additions you’d like to see, please contact me.
I created this bookmarklet for testing WCAG success criterion 1.3.5 Identify Input Purpose (AA), which requires that if an input field collects personal information about the user, the appropriate autocomplete value must be applied to the field. For example, an input field requesting the user’s full name would have the attribute autocomplete="name". This allows the browser to attempt to autofill the field with values previously entered for the same information. Autofill behavior is very much browser-specific.
This bookmarklet checks the webpage for any input fields that contain the autocomplete attribute. If none are found, it returns an alert message.
If autocomplete attributes are found, the script collects their values and displays them in the context of each input field.
This makes it easy to determine if input fields have autocomplete attributes defined and if they are valid. Click the bookmarklet again to remove the overlay text.
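The approach can be sketched in plain JavaScript. This is a minimal illustration of the behavior described above, not the bookmarklet’s actual source; the token list is a partial subset of the HTML spec’s autocomplete field names, and overlay styling is omitted:

```javascript
// Partial subset of valid autocomplete field names from the HTML spec
// (the full list is longer; this is an illustration, not the real bookmarklet).
const VALID_TOKENS = new Set([
  "on", "off", "name", "honorific-prefix", "given-name", "additional-name",
  "family-name", "email", "username", "new-password", "current-password",
  "organization", "street-address", "postal-code", "country", "country-name",
  "tel", "url", "bday",
]);

// The field name is the last token; earlier tokens may be "shipping",
// "billing", or a "section-*" grouping per the HTML spec.
function isValidAutocomplete(value) {
  const tokens = value.trim().toLowerCase().split(/\s+/);
  return VALID_TOKENS.has(tokens[tokens.length - 1]);
}

// In a browser, walk the page and annotate each field (skipped outside the DOM).
if (typeof document !== "undefined") {
  const fields = document.querySelectorAll("[autocomplete]");
  if (fields.length === 0) {
    alert("No autocomplete attributes found on this page.");
  } else {
    fields.forEach((field) => {
      const value = field.getAttribute("autocomplete");
      const note = document.createElement("span");
      note.textContent = ` autocomplete="${value}"` +
        (isValidAutocomplete(value) ? "" : " (not a recognized value)");
      field.insertAdjacentElement("afterend", note);
    });
  }
}
```

Toggling the overlay off on a second click would require tagging the inserted span elements (e.g. with a data attribute) and removing them if already present.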
Automated accessibility testing tools cannot test all of the Web Content Accessibility Guidelines (WCAG) success criteria. Adrian tested four free tools and compared them to his manual testing results. My intent is to add to the body of knowledge by providing results from the Level Access AMP accessibility testing tool. Level Access also provides a free browser extension that runs the same tests called Access Assistant.
I wanted to compare Level Access’s tools to Adrian’s manual testing findings because these are the tools we use at my work, and I share his concern that too many stakeholders lean on automated testing when it uncovers only a portion of potential problems.
I tested the same page, web.dev, reviewing it with AMP against WCAG 2.1 Level A and Level AA. AMP reported failures of the following Level A success criteria:
1.2.3 Audio Description or Media Alternative (Prerecorded)
1.3.1 Info and Relationships
This A does not have a ul element (without an ARIA-assigned role); ol element (without an ARIA-assigned role); an element with a role set to the value: list as a parent; or a ul element (without an ARIA-assigned role), ol element (without an ARIA-assigned role) or element with a role set to the value ‘list’ with an aria-owns attribute set to the ID of the element in the same DOM
Rule: Ensure list items are found in a list container
The role attribute value of ‘listitem’ given to this A is not allowed. The element’s role attribute should be set to one of the following text values: button | checkbox | menuitem | menuitemcheckbox | menuitemradio | radio | tab | switch | treeitem; or the role attribute can be removed
Rule: Ensure ARIA roles, states, and properties are valid
NOTE: The rule “Ensure list items are found in a list container” failed in AMP for two success criteria, both 1.3.1 Info and Relationships and 4.1.1 Parsing.
AMP did not fail any WCAG Level AA success criteria, so I am not including that table of results. Manual testing found 11 unique failures.
AMP provides additional potential issues as a list of “Needs Review” items. These alerts may or may not be WCAG failures and require manual review to determine if there is an accessibility issue.
Avoid inappropriate use of ARIA roles, states, and properties, cites 4.1.2 Name, Role, Value. The A element has an aria-hidden attribute set to the value: true. [2 instances]
Provide valid, concise, and meaningful alternative text for image buttons, cites 1.1.1 Non-text Content and 4.1.2 Name, Role, Value. This button element has a suspicious accessible name value of: all. [1 instance]
Ensure link text is meaningful within context, cites 2.4.4 Link Purpose (In Context). This A element has a suspicious (i.e. lacks purpose or is >150 characters) calculated accessible name value of: css. [1 instance]
Provide synchronized captions for video (which includes audio) or other multimedia, cites 1.2.2 Captions (Prerecorded) and 1.2.4 Captions (Live). This video element does not have a track with kind=captions. [1 instance]
Ensure heading level matches the heading’s visual importance/level, cites 1.3.1 Info and Relationships. [4 instances]
This article element contains an incorrect or missing heading level which may cause improper nesting within the document heading hierarchy.
This H3 creates an inappropriate jump in heading levels within the document heading hierarchy.
This H5 creates an inappropriate jump in heading levels within the document heading hierarchy.
This H1 creates an inappropriate jump in heading levels within the document heading hierarchy.
Provide an informative context-sensitive page title, cites 2.4.2 Page Titled. This title has a suspicious value. [1 instance]
The Access Assistant browser extension Quick Test returned nearly the same results as the AMP test, but as a list of issues with code snippets. Following the link for each issue displays an explanation of the issue but does not reference any rules or standards.
* denotes violations flagged by AMP:
Provide alternative text for images.*
Ensure text and images of text provide sufficient contrast.
Provide valid, concise, and meaningful alternative text for image buttons.
Ensure heading level matches the heading’s visual importance/level. [4 instances]
Ensure list items are found in a list container. [28 instances]*
Ensure all active elements receive keyboard focus or can be activated with the keyboard.
Avoid placing inactive elements in the focus order. [2 instances]*
Provide an informative, context-sensitive page title.
Ensure link text is meaningful within context.
Ensure ARIA roles, states, and properties are valid. [28 instances]*
Avoid inappropriate use of ARIA roles, states, and properties. [2 instances]
Provide synchronized captions for video (which includes audio) or other multimedia.
The Quick Test found one issue to check that was not flagged by AMP as either a violation or alert: Ensure text and images of text provide sufficient contrast.
The automated testing results from the Level Access tools are comparable to those of the other automated tools Adrian tested, with manual testing finding more than 9x the unique success criterion issues. Use automated testing tools in tandem with manual testing to find the most potential accessibility issues. Relying on automated testing alone will leave access gaps for your users.
Update: 27 January 2023
I’ve updated the Highlights data tables to reflect Adrian’s findings for the four automated tools he tested. For the sample, Access Assistant found more issues than WAVE Evaluation Tool, axe DevTools, and ARC Toolkit, and fewer than Equal Access Accessibility Checker.