Video: Preparing for an accessibility test

Welcome to the first video in the Accessibility Testing series. Please reference the blog article Preparing for an accessibility test.

The next video in the series is Performing an accessibility test.

Transcript

Rachele: Hello, my name is Rachele DiTullio, and I am here to talk to you about accessibility testing. You can reach my accessibility testing project page on my website at https://racheleditullio.com/projects/accessibility-testing.

On this project page, you can reach the blog articles and videos for part one and part two, as well as download the accessibility testing spreadsheet. My contact information: my name is Rachele DiTullio, spelled R-A-C-H-E-L-E D-I-T-U-L-L-I-O. I’m on Twitter @racheleditullio, my email is racheleditullio@gmail.com and my website is racheleditullio.com. So let’s get started.

The website we’re going to look at testing today is boomerangtags.com. This is a site that I’ve been using for many, many years. I’ve seen it go through a lot of changes and it’s a site I’m familiar with, but I’ve also [00:01:00] seen it get updated over time. This site used to be a table-based site; it’s now a responsive site.

I think it’s a really good example of the legacy code that you’ll encounter a lot of times when you’re dealing with accessibility testing. And so this gives us a good opportunity to look at things that are often an oversight, things that people may not have known were accessibility issues.

You know, what we’re dealing with in this space a lot of times is education: getting people to understand what we mean by web accessibility and making websites work for people who are disabled. So you have to get your environment set up. For this particular test, my environment is going to be the Firefox web browser and the NVDA screen reader.

And we will look at some automated accessibility testing using the axe DevTools Firefox browser extension. [00:02:00] With that, I’m going to first open up the spreadsheet and talk about how we start filling it out to make our testing easier. So let me bring up the spreadsheet here. To start, I’ve got an overview page on the spreadsheet where I like to put the name of the project and the date of the project.

I like to list what version of the Web Content Accessibility Guidelines, or WCAG, I am testing against. Today we’ll be performing WCAG 2.1 level AA testing, so that includes both level A and level AA items for WCAG 2.1. Like I mentioned, for the environment, I’m on a Windows machine and we’re going to be testing in Firefox using the NVDA screen reader. Then I list the tools that we’re going to use today.

Like I said, we have an automated checker with axe DevTools. It’s just a browser extension that you’ll see in your developer [00:03:00] toolbar if you press F12. We have a tool called the Colour Contrast Analyser, which helps you determine the color contrast between different colors. There’s an accessibility bookmarklets link, which goes to a page that has a lot of different helper tools for testing individual WCAG success criteria.

So we’ll look at those as we need them. We’ll be using the W3C validator to check parsing on the page for validation errors. We’ll use a bookmarklet called WCAG-only parsing, which will help us cut down any errors that we find to just those that would fail WCAG. We will use a trigger keyboard shortcuts bookmarklet, a text spacing bookmarklet, and then one other Firefox extension called Window Resizer, [00:04:00] which helps us test reflow and resize text.

So once we’ve got our project baseline established, the first thing that you have to do for accessibility testing is basically scope what you’re going to test.

So your client may say, we want our website tested. But it’s very inefficient to just go through a website and test a bunch of pages in their entirety. What we want to do is get a representative sample of the content in the website. What we’ll be doing today is just looking at the website’s homepage and identifying some items, like the header and footer, that are site-wide and only need to be tested once.

And then we’ll look at individual smaller bits of content on the homepage, which I call components, breaking the page up into chunks that are more easily accessibility tested. It’s usually quite a lot [00:05:00] to scope your project at the page level, because pages can be lengthy, there can be a lot of different custom things going on in your webpage, and you may find it easier to chunk that up.

That said, that’s really up to you as the tester, but for this example I’ve chunked up the homepage into several different components for ease of use. All right. So let’s look at the scope tab. There are several columns in the spreadsheet. The first one is ID. The second column is name.

The third column is description. The fourth column is status and the fifth column is URL. So the first thing we do when we’re creating components for testing is give each one a unique ID, just something that we can track it by. I use that to cross-reference the different testing tabs for each [00:06:00] component.

They have the ID number. So we have here at the bottom, to start out, 00, 01, 02 and 03. The component with ID 00 has the name Common Issues. We typically have this component for every project. It’s just a way to log issues that happen across the site, where it would be repetitive to log them as an issue for every component.

You can just add them to the common issues. The description for common issues is any recurring issues that affect most of the pages in the sample. All right. For the other components, the description column is a good place to indicate what exactly we’re testing on the page, just so that we can remember as we’re going back through our notes.

We’ll look at descriptions for the other components here in a moment, but let me talk about the fourth column, which is “Status”. [00:07:00] Now let me just drop this down here. This is a pick list for status. You can select that you’re actually in the process of testing this component; that it’s under review, possibly by the client or another coworker; or that you’ve completed testing altogether.

Feel free to go to the data tab in this sheet; you can add other statuses that might be helpful for your work, and this list should automatically update. The last column we’ve got in here is URL. Since this is a multiple-page site, we put the URL here to let us know where we actually tested the component and issue.

So everything that we’re looking at today can be reached from the homepage, so every URL in the components that we have set up here is boomerangtags.com, the homepage. We’re going to look at the site in a second and I’ll show you how I broke up the site into [00:08:00] these components. But let me go ahead and explain what we’re going to look at.

So there’s a site header. There’s a hero on the homepage. There’s something I called pet tags. There’s an area for ratings and comments. There’s a video, a YouTube video player. There’s some information about the business, and then the site has a footer. The site also has a cookie settings banner that shows up until you have interacted with it.

So for that, let’s go ahead and look at the website and we can talk about how I broke up this page into these different areas. I actually have another tab here where I highlighted the different areas of the site, and I’ll go through a little bit of why I picked what I did.

So the first component here is an area with some navigation components. There’s a little search widget here. There’s a shopping cart icon. [00:09:00] There are shopping cart links above the navigation bar. The navigation has some drop-down menus, that kind of thing. So that’s a good little chunk of content that we can test as our first component.

Now, as we move down the page, the next group of content I’ve called the hero area. We’ve got the company logo. We’ve got a button that says order tags. We’ve got some text about their shipping policies and another little image with some text on it that says a heartwarming story for dog lovers.

Our next piece of content, as we scroll down the page, is an area that has a header called “For pets that come back”. And then there are some images with examples of the different kinds of pet tags that the site offers, from stainless steel to brass. So that’s another little group of [00:10:00] content with some links. There’s kind of a hover effect on some of them. And that’s another good little chunk to test by itself.

If we keep scrolling down the page, now I’ve got a highlight around a widget called Trustpilot that looks like it has a star rating, and then three customer testimonials. So we’ll test that as a chunk. Keep going down the page.

Next there’s a video that is from YouTube, so we can test that as its own little thing. I’m not going to play the video right now, but what we’ll be looking for are things like captions and audio description. Do the buttons in the player have the appropriate names? Do they work with the keyboard, that kind of thing. Keep going down the page.

There are some links or buttons to go to other information, like frequently asked questions, and then there are some columns of text that talk about different aspects of the business, like free [00:11:00] shipping, secure payment, no tax, et cetera. There are some images and links in there. And we’ll test that as another component.

And then when we get to the bottom of the page, there’s a footer that is pretty consistent across the site. So we would determine that this and the header are site-wide components, and we’ll test those separately so that we’re not testing them every time we go to a new page, because that’s redundant. We really don’t need to test the same kind of content over and over. Like I said, we want to get a representative sample and test that.

And then the last component here, at the very bottom of the page, is a cookie banner. There’s an OK button, which would dismiss the banner, but then there’s a learn more button that, as we’ll see a little bit later, will open a dialog that we have to interact with to actually change our cookie settings. So that’s another good thing to test as its own component.[00:12:00]

So let me go back to the top of the page here. Those are all our highlighted sections. Let’s go look at the real site. The real site of course does not have those highlights, but we can look at the spreadsheet and see how we’ve determined these components and the kind of information that we’ve written up for them.

So, the first component is the site header. Now in the name I put “Site” in brackets before the word header, and that’s just a convention to remember whether something is site level or on a particular page. You could add a whole other column to the spreadsheet if you want to track that separately; this is just quick and dirty for this example, a way to distinguish your components from one another so you know what you’re talking about.

For “site header”, I added the description that it’s the navigation bar at the top of the screen, including the shop and cart links that are above the navigation bar. We’ll go ahead and set this status to “testing”, and we’ll set the [00:13:00] common issues component to “testing” status as well, since that’s what we’re going to be starting with. And like I said, the URL for this is the homepage, boomerangtags.com.

Our second component is the homepage hero, which is the Boomerang Tags image, the “order tags” button and the shipping text. Our third component is the homepage pet tags area, which has that heading “For pets that come back” with a row of pet tags underneath it.

For component four, we’ve got that ratings area on the homepage. It has the Trustpilot widget and some customer comments. Next, we have the homepage video, which is just the YouTube video player. Our sixth component is the homepage business information. Again, that’s the FAQs, “quantity orders discount” and “become a retailer” buttons, and then all of the text that follows.

Our seventh component is the site footer, which starts with [00:14:00] the text “made in America” and goes to the end of the page. And then our eighth component for this demonstration is the cookie settings banner that is site-wide. The description for that is “cookie banner and cookie and privacy settings dialog”.

So that gives us a good idea of the scope of what we’re testing. And it gives us a good way to break up the homepage into chunks, which makes it a lot easier to test than trying to attack the entire homepage as a single test and trying to log every issue that you find with the page in one testing sheet.

It is really overwhelming if you try to bite off a huge chunk to test. So again, that’s why I usually dissuade people from just saying, you know, the homepage will be a test, the contact page will be a test, the about page will be a test. You really want to chunk up the content into testable components.

That’ll make it a [00:15:00] lot easier on yourself, and it also makes it a little more agile to then go ahead and fix those issues, because a development team can attack something at the component level a lot more easily than trying to look at the entire page scope. All right. So with that said, we know that common issues is in testing.

So if we come across anything that I think is a common issue, we’ll go ahead and log it in there. But I’m going to go ahead now and go to sheet 01. Like I said, in the ID column I’ve got links from the ID; it says 01 site header here. If I access that link, it’ll actually open up the correct tab for me here in the sheet at the bottom, the one that’s titled 01.

So now we’re on the workbook for testing the homepage header component. In this sheet I’ve got a couple of things going on. First, the first row has the [00:16:00] component name, just a place for you to type that name again; it makes it a little bit easier when you’re switching between these sheets.

Let me show you: we’ll switch between common issues here and back to homepage header, and you can see that at least you’ve got some context, when you’re on a sheet, of which component you’re testing, because this can get a little tricky. We’ve got several columns in this sheet for tracking all of the WCAG success criteria: what they are, what the issues are, the remediation, and the status we’re at in fixing them.

So let’s go through these columns real quick. The first column is the “Type” column. This is the column that the spreadsheet is sorted on by default. In the second column, the spreadsheet lists all 50 WCAG success criteria for WCAG 2.1 level A and level AA.

It’s not always easiest or most efficient to test from the very first success criterion, [00:17:00] 1.1.1 Non-text Content, straight through to 4.1.3 Status Messages. So what I’ve done is group these by type. And the first type we can see here is a group called “Active Controls”.

That just groups success criteria that are addressing similar issues; here, active controls. These would be things that you’re interacting with on the page, like a link, a button, some kind of widget, a form, that kind of thing. It groups all of those success criteria together just for ease of testing.

You can certainly, and I’ll show you right now, go to the “WCAG success criterion” column and sort A to Z on that if you would prefer. And we can see now 1.1.1 Non-text Content is the first criterion. I’m going to go ahead and sort it back to what I had it on.

So we’ve got the “Type” column, then we’ve got the [00:18:00] “WCAG Success Criterion” column. Each of these criteria is linked to its understanding page on the W3C site. I’m just going to click on this, I’ll show you here, edit hyperlink. These links are in the spreadsheet here, and here’s an example of an understanding page.

So if you have further questions about how something functions or what its intent is, or if you’re looking for examples or techniques for mitigating an issue, you can come here and see a whole lot of information that goes beyond the quick description that is in our spreadsheet.

The third column is “Level”. This allows you to sort by just A or AA success criteria. Maybe you’re working on the level A ones first and you want to sort by that; you have the ability to do that here. The [00:19:00] WCAG criterion “Description” column has the description that is available in the WCAG quick reference guide.

It just gives you a really simple overview of generally what the success criterion is looking for, for passing. That said, there are a lot of rabbit holes that we could go down with testing. There are things that people don’t always agree on, exactly what the failure is or whether something is a failure.

So just keep that in mind; you can find multiple answers to the same problem. But this is a good overview of what it is that we’re looking for, and I’ll go into them in a little more detail as we encounter them. In some instances I’ve put in a little code snippet or an example of something that would fail the criterion. And you can look at those.

All right, next we have [00:20:00] the “Status” column, and the “Status” column has three values. You can choose that a success criterion failed, you can choose that it passed, or you can choose that it’s not applicable. We’ll talk about what not applicable means here in a minute, when we start actually testing our page. But that’s a way for you to denote whether something passed or failed. And you can also sort on this column if you only want to look at failures or only want to look at successes.

Next we have the “Issues” column, which is where we’re going to write up any problems that we find on the site. We’ve got a “Recommendations” column where we’ll write up our remediation advice. There’s a “Source Code” column, where we can put a little snippet of the exact HTML that has the problem. And then there’s a “Notes” column in case you have additional content that you need to log.

All right.

So let’s go ahead and get started on actually testing [00:21:00] something. Let me go back to our website. So the first thing that I like to do when I start testing is just kind of play with the site and see what it does. Now I have already gone through this page a bit so that I could scope it. But I’ll give you an example of some of the things that I like to do.

So we’ve got a couple of things. We need to see how it works with a mouse. We can see here, there’s some kind of weird thing going on where, when I try to mouse over this drop-down menu, it disappears. So we know there’s something going on there. We can see there’s maybe a little bit of a color contrast issue on hover.

As far as what we’re going to be testing: we can see there’s some kind of little widget there, there are some links here, some links up here. So after I’ve played around with it and I get an idea of what’s going on, I then use the keyboard. So now I’ve got keyboard focus on this window and I’m going to start tabbing, using the tab key.

[00:22:00] And that’s the other thing: if you don’t know how to use a screen reader, that’s a big part of accessibility testing, so you’ll need to practice that. As I’m going through here, I’ll try to explain what keystrokes I’m using and what the expected keystrokes would be.

So the tab key is how you cycle through the focus order of elements that can gain focus, things like links and buttons and widgets. So let’s start tabbing, and I can tell right away that there’s no focus indicator, because every time I hit the tab key, I can see the URL at the bottom of the screen is updating. So I know the focus has moved, but I don’t see it.

So I’m going to show you the first bookmarklet that can be really useful for testing. There’s one called force focus. I’m going to go ahead and click that, and now you can see the focus is actually on “Military”, but we couldn’t see it because there was [00:23:00] no focus indicator.
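
To give a sense of what a helper like that does, it essentially applies a style that outlines whatever currently has keyboard focus. A rough sketch of that idea (illustrative only, not the actual force focus bookmarklet) would be:

    <style>
      /* Illustrative helper rule: outline whichever element has keyboard focus */
      *:focus {
        outline: 3px solid #c00 !important;
        outline-offset: 2px;
      }
    </style>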

So now as I tab through, I can tell: actually, these are all getting focus, but there’s no focus indicator. Something else I notice is that this “About” link has a menu under it, but as I’m tabbing through, I don’t get to see that menu. If I go ahead and hit the enter key to select it, I would expect that menu to drop down.

But nothing happens when I hit the enter or space keys. So I know there’s a hover issue with that. We get an idea that, okay, all these things can get focus, so that’s important; we just can’t see the focus. We know some menus aren’t working correctly. Let’s see if this search button will expand with the enter key.

Yes, it does. So we know that enter is triggering that component correctly, and we’ll do some other testing with that as well. We can see our forced focus indicator is on here. I’m just [00:24:00] testing, playing with that. So now that we have an idea of what’s going on with this component, I’m going to open up the browser dev tools.

That is the F12 key. What opens up is your code inspector and a console for JavaScript. And then, assuming that you’ve installed the axe DevTools extension, you’ll see an option in your dev tools tab bar here to go to axe DevTools. When you open that up, you’re going to see a little control panel that asks whether it should scan the page and show all the issues.

So let’s go ahead and do that.

Okay. So we can see that it’s got 48 issues, and we can start looking at those. But one thing I want to note here before we keep moving forward: one of the downsides of these automated checkers is that it’s testing the entire page, but we’ve already determined that we’re [00:25:00] not going to test the whole page at once.

We’re going to test specific components. So this panel it’s really good when you’re learning how to accessibility test, because it will flag things you, you probably just didn’t even know to look for. It’s, it’s a great tool for learning WCAG failures, but it’s not so great for component testing because you’re going to have to figure out which of these issues affects the component that I’m working on.

So there’s a little bit of finagling in figuring that out. But as you get better at accessibility testing, you’ll know what to look for. You’ll be able to pick these things out without a lot of automated testing. You’ll just get better and better at it.

That said, before we start looking at these individual issues, I want to turn on the screen reader, and I’m going to again use my keyboard to [00:26:00] tab through these links in the header to see what they sound like with the screen reader, because ultimately we’re concerned with how the interface works with assistive technology.

That’s a big part of what we’re testing accessibility for. Are the components, the links, the widgets announcing what they are correctly? Do they have names? Do they function with the keyboard, since that’s what assistive technology users are typically using? So let’s just get a sense for how that feels.

So I’m going to turn on NVDA.

Okay. So NVDA started. I’m going to put focus in the browser window.

NVDA: Boomerang tags, trademark pet ID, color tags for cats and dogs and people Mozilla Firefox boomerang tags, trademark pet ID, color tags for cats and dogs and people clickable cart visited link.

Rachele: So it was [00:27:00] reading the page title of the page when we first went to it, and I just moved focus to the cart link. You can see there. I’m going to go back in the focus order, which is shift tab.

NVDA: Shop vis…

Rachele: Make sure I’m at the beginning of the page. Okay. So now I know I’m at the beginning of the page, so let’s see what the first tab stop is on the page. All right. I went to the shop link, but it didn’t announce anything.

Let’s go to cart.

NVDA: Cart, visited link shop visited link.

Rachele: So shop is announcing, it just got stuck. Okay. So shop visited link.

NVDA: Cart, visited link

Rachele: Cart, visited link

NVDA: Navigation, landmark list with 10 items, clickable order tags visited link.

Rachele: So we know they’re using some landmarks; it said they’re using a navigation landmark, and it said order tags, link.

NVDA: Clickable stainless steel visited link clickable brass visited link, clickable color tags, trademark visited link clickable military visited link [00:28:00] clickable plastic visited link clickable colors visited link clickable about link clickable contact visited link clickable search link search landmark search edit has auto complete blank. List with one item zero visited link.

Rachele: All right, so I’m going to turn off the screen reader now. The hotkey for closing NVDA on Windows is Insert+Q: you hold down the Insert key and press Q, and that will close the screen reader. I’m also going to refresh the page so that we stop seeing the forced focus indicator, because we know what’s going on there; they’re just missing a focus indicator.

At this point we’ve gotten a feel for what kinds of things are part of this navigation header. We’ve got an idea of some things that are possibly issues, and we’ve also got an idea of some things that are not issues, so [00:29:00] now we can start working with that component spreadsheet. I’m going to come back over here, so we’re on our 01 spreadsheet for the homepage header.

The very first thing I like to do is go through these success criteria and mark as not applicable the ones that I know do not have anything to do with the particular content I’m testing. We know that we have active controls because there are some links, a drop-down, a search. So I know that most of these under active controls I am going to have to test.

So I’m going to scroll down past that to our next group, or type rather, which is adaptability. This is whether the site works in a reflow context: does it resize, can you use it on mobile?

We know we’re going to have to test for that, so I don’t necessarily have to do that right now. But [00:30:00] what I will say is this is a good opportunity to decide whether the whole site has an issue with being responsive, which you can log under your common issues, or whether the page generally works, is responsive, can be used in different orientations, but maybe has certain content that doesn’t resize correctly.

So let’s just scrunch up the page right now, and we can see they’ve got the navigation here, here’s the search. The page looks like it reflows; it’s designed to do that. We’ll just have to see if it’s doing everything correctly. The other thing to note is that because of reflow, when we test this navigation area, we also have to test the mobile version of it.

So for example, if I start tabbing through here, and actually let’s turn on the force focus indicator, we can see whether the menu button gets focus. It does. Okay. And using the enter key [00:31:00] opens it, and we can see focus moves to this X for closing this dialog. So we know that that’s generally working. We’re going to have to test all the things: can we move through these, can we get to the dropdown menu in the mobile view? It looks like we can. When I’m pressing enter, this menu is expanding and collapsing. So that’s interesting.

We can get some other ideas of things that we need to test. So I know that we will have to test all this stuff for adaptability. Next is contrast, both for text and for non-text content like the icon used for the shopping cart. So we’ll have to test that. Now here’s an interesting one.

There are some success criteria that just are not frequently encountered, and dynamic changes is one of those. Dynamic changes refers to any content that’s auto-updating, think something like a stock ticker, or [00:32:00] content that’s moving, scrolling or blinking, like an autoplay animation, something like that.

I didn’t see any animation. I don’t see anything moving. So I don’t think that we have to worry about 2.2.2 Pause, Stop, Hide. This is a good example of where we can just select not applicable, because we’re not going to encounter that. Next we have errors. There are a couple of success criteria related to errors, but there is a form element here.

So we can go ahead and click to see if there’s anything about that form that produces an error, but it doesn’t look like there is. It looks like you can submit the form without any text in it. So we don’t have to worry about error identification; we can mark that not applicable. We don’t have to worry about suggesting ways to fix errors, which is what 3.3.3 Error Suggestion entails.

So we can set that to not applicable. And then this search [00:33:00] form is not submitting any legal or financial kind of data. You’re not making any financial commitments when you’re submitting this form. So again, we don’t have to worry about making sure that there’s a way to undo our search action; there’s no legal consequence to it.

All right. Next we’ve got interactivity. So we’ve got content on hover and focus; we did see that we have those menus that drop down, so we definitely will have to test for that. Keyboard: we have to make sure everything works with the keyboard.

No keyboard trap: we have to make sure that keyboard users can always tab out of or escape out of whatever content they are in; they can’t get trapped in some widget. Next we’ve got character key shortcuts, which we have a little bookmarklet for testing. That’s just making sure that none of the single printable characters, like letters, numbers or symbols, are mapped to character key shortcuts in the [00:34:00] website, unless there is a way to remap them or turn them off. But that’s not one that comes up very much at all.

Okay. Next is pointer gestures. We have to make sure that everything that is interactive on the site can work with a single pointer gesture and doesn’t require a path-based gesture, unless that is integral to the functionality. For example, if there were a space to sign your name digitally and you had to do a path-based gesture for that, something like that. But again, that’s not something that comes up very frequently either.

There’s also pointer cancellation. You have to make sure that when you click and drag, you can stop whatever action would have happened when you release the mouse button. Things should always happen on the up event and not the down event, unless the action is reversible. And that also doesn’t come up very much.

Next, this is one we can set to not applicable: motion actuation. There wasn’t anything that looked to me like it would [00:35:00] require using the site with a mobile device and turning the orientation to trigger an action. So we can mark that as not applicable for our homepage header.

Next we have non-text content. This is things like images, and we definitely had some images, so we’ll have to test for that. The next few criteria here, though, under multimedia, we can definitely set to not applicable, because they have to do with video and audio recordings.

There are several criteria about that, and since this part of the page, the header, doesn’t have any video or audio, we can go ahead and set all of these to not applicable. So that is audio-only and video-only (prerecorded), captions (prerecorded), audio description or media alternative, and captions (live): not applicable.

And then audio description (prerecorded) is not applicable because there’s no video. And then the [00:36:00] last one, audio control: there’s no audio, so we don’t have to provide a way to adjust the volume. So those are all not applicable. All right. And we’ve got images of text. If I look back at our widget, this all appears to be plain text to me; these aren’t images.

So even in here, we don’t have any images of text, so we can go ahead and mark that as a pass, because there are no images of text. Meaningful Sequence is pretty much the reading order of the page and whether it makes sense. We’ll have to test for that for sure.

Focus order: when you’re using the tab key and the keyboard, are you able to reach everything in a logical order? Focus Visible, we already encountered that; that’s going to be a failure. We can go ahead and just mark that as a fail, because we already know there was no visible focus.

Next we get into what I call the page level elements, and these are things that typically get [00:37:00] addressed once per page and often once per site. So let’s look at these a little more closely. The first page level element we’ve got is Bypass Blocks. That’s providing a skip link to skip over this main navigation area and go straight to the main content without having to tab through all of these things.
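
For reference, a bypass block is most often a skip link that is the first focusable thing on the page and jumps to the main content region. A minimal example (illustrative only, not this site’s markup) looks like:

    <a href="#main">Skip to main content</a>
    <nav><!-- site navigation links --></nav>
    <main id="main"><!-- page content --></main>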

That’s something that doesn’t appear to be on the page, because the first tab stop was this “shop” link here. I didn’t encounter a skip link, so we’ll definitely need to test for that, and we’ll probably write that up under common. So I’m actually going to mark that as not applicable on this sheet, because it’ll be part of our common issues.

So let me go over to common issues, and we’ll have to do the same thing here. Now, the thing with common issues is that generally you’re going to mark all of these as not applicable, because only things that are going on site-wide get logged here. But let’s go back over here: we were looking for Bypass Blocks.

So we’ll go ahead and say this fails, and we know we have to test for that on our common [00:38:00] issues sheet, because that is something that affects every page; every page will be missing that bypass block in the header. Next we have Page Title. Again, this is something that you can test once per page.

Maybe you always test it as part of the main content of the page, something like that. What you’re basically looking for is that pages in the website have unique page titles so that people who are navigating with assistive technology understand what page they’re on.
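
For reference, the page title is the <title> element in the document head, and each page should get its own descriptive value, something like this hypothetical example:

    <head>
      <!-- Hypothetical page title; unique and descriptive per page -->
      <title>Contact Us - Example Pet Tag Store</title>
    </head>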

Since this component is the navigation for the homepage, and that’s site-wide, I’m going to say Page Title is not applicable, because we’re just testing the navigation header with this component.

Next, Multiple Ways. Again, that’s another sort of site level, common issues type of criterion: is there more than one way to get to any particular webpage on the site? We saw that there’s a search feature in addition to the navigation bar, so we know there [00:39:00] are multiple ways, but it’s not really applicable to our header navigation. You could test it as part of your header navigation, or you could test it as part of your common issues.

It’s kind of up to you. I’m going to test it as part of common issues. So we’re going to say multiple ways doesn’t really affect the header, because we’re not concerned with whether there are multiple ways to get to the header; we’re talking about pages on the site.

Language of Page is a pretty simple one, again page level, and it usually affects every page in the site the same way. We’re going to look at the source code and see if the HTML element has a lang attribute. I have a little code snippet of that in the WCAG criterion description column. So let’s go back to our site. I’m going to press Ctrl+U, which brings up the source code, and let me bump the size of that up.
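
What we’re looking for in the source is simply a lang attribute on the root element, along these lines:

    <html lang="en">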

So we can see the HTML element, and it has a lang attribute setting the language code of the page to English. That’s important: it allows assistive technology like a screen reader to announce the content [00:40:00] on the page with the correct voice and accent. So we’ll come back to the Language of Page success criterion; I’m going to go ahead and set that to not applicable, and we’ll test it as part of common issues.

Next we have Consistent Navigation. This is just determining whether navigation items are consistent from page to page. So again, this is more of a site level, common issues kind of thing. We’ll look at other pages and make sure that navigation bar is consistent across the site.

Next we get into the group type called semantics. The first one in there is 1.3.1 Info and Relationships. This is a pretty big catch-all criterion that determines whether semantic HTML elements have been used correctly. If there is text that looks like a heading, is it marked up as a heading? Are they using the landmarks on the page completely and properly? Things that look like lists, are they marked up semantically as lists? If there’s a data table, is it marked up correctly, with column headers and so on? [00:41:00] If there’s footnote text, is the footnote indicator correctly linked with the footnote itself; is there some kind of semantic relationship between them? A lot of different things can fall into that category, and we’ll talk about them as they come up.
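
To make that concrete, here’s the kind of contrast this criterion is about, using generic markup rather than this site’s code: text that looks like a heading should use a heading element, and list-like content should use list elements.

    <!-- Semantic markup: the structure is conveyed to assistive technology -->
    <h2>Shipping options</h2>
    <ul>
      <li>Standard shipping</li>
      <li>Express shipping</li>
    </ul>

    <!-- Non-semantic markup: may look the same visually but conveys no structure -->
    <div class="big-bold-text">Shipping options</div>
    <div>Standard shipping<br>Express shipping</div>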

Next in semantics, we’ve got Headings and Labels. That’s determining whether any headings or labels that are marked up describe the topic or purpose of the content they’re intended for. That’s a little bit subjective, but you can definitely make a determination on it.

Next we have Language of Parts. Again, I’ve got a little code snippet in here showing a <p> element with the lang attribute setting the language of that paragraph to Spanish, so lang="es", and the word “hola” in the paragraph. What that allows screen readers and other assistive technology to do is switch the screen reader voice appropriately to announce content that’s [00:42:00] not in the default language of the page in the called-out language, like Spanish.
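
That snippet is essentially:

    <p lang="es">hola</p>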

I didn’t see any non-English content in the header navigation, so we can go ahead and say that’s not applicable; there is no language of parts. Then we’ve got Consistent Identification. Consistent identification determines whether common components used across the site are identified in the same way.

So for example, there’s that search box in the header of the site. We would need to look and see whether that search is consistent across multiple pages. So we’ll go ahead and test for that and leave it blank for now. Next is Parsing. We will pull out the HTML for just the header area that we’re trying to test, run it through the W3C validator, and see if we find any parsing errors. Almost [00:43:00] done.

Okay. Next in semantics we have 4.1.2 Name, Role, Value. This is another big catch-all, and this one is primarily concerned with how things are announced by assistive technology. Are the semantics of the components conveyed? If you have a button that expands a menu, does that button have an aria-expanded attribute that will indicate to assistive technologies whether that element is expanded or collapsed? Do form fields, when they get focus, announce their labels? Is that label and form field connection made? Things like that. We can end up with a lot of things in there, but we’ll find out. All right.
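
As a rough illustration of those two checks (generic markup with made-up ids, not this site’s code):

    <!-- Expandable menu button exposes its state via aria-expanded (ids are hypothetical) -->
    <button aria-expanded="false" aria-controls="about-menu">About</button>

    <!-- Form field announces its label because the label is programmatically associated -->
    <label for="site-search">Search</label>
    <input type="search" id="site-search" name="q">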

Next in semantics is Status Messages. This is when something gets updated on the page. Let’s say, for example, you’re on the search page, you search for something, and a message that says “no results found” pops up on the page. Well, sighted users can see that, [00:44:00] but users of assistive technologies need to have that content announced to them automatically. That’s what Status Messages is concerned with. I didn’t see any status messages as part of the navigation header, so I’m going to mark that as not applicable.
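
The typical pattern for that is a live region that screen readers announce when its content changes, for example (a generic sketch):

    <!-- role="status" causes the inserted text to be announced automatically -->
    <div role="status">No results found</div>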

Sensory Characteristics refers to whether things like the shape, color or size of components convey meaning alone: something that is only visually distinguishable and cannot be determined by people using assistive technology. So we’ll definitely take a look at that. Use of Color: we want to make sure that color is not the only visual means of communicating something, because some people don’t see colors the same as other people; certain colors can look similar to one person and different to another. So we have to have an additional visual cue to indicate the meaning of something. We can’t rely on color alone.

This often happens with links. Folks will use a different link color from their text color, but they [00:45:00] won’t underline the link. And if they don’t underline the link, and the color contrast ratio between the link text and the surrounding text is not at least 3:1, then there’s no visual distinction and they’re relying on use of color alone. That’s where you’ll often find that issue, so we’ll definitely test for that.
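
The usual fix is to keep a non-color cue such as an underline on links in body text, along these lines (illustrative only):

    <style>
      /* Keep underlines on links inside paragraphs so color is not the only cue */
      p a {
        text-decoration: underline;
      }
    </style>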

And then the last one in sensory is the Three Flashes or Below Threshold criterion. I didn’t see any animated content or flashing content, so we can go ahead and say this passes because there was nothing flashing. So that gets us through the spreadsheet.

I’m going to go ahead and sort, let’s see, let’s just look at the blanks, because those are the things that we still need to test. We can see that’s reduced our list: we started with 50 success criteria that we had to test for, and now we’re down to 29. So that’s always a good first pass to make with any component: figure out what it is that you actually need to test.

And of course, if you [00:46:00] come across something that you missed in your initial assessment, you can come back and test that. But for now we know we have just 29 success criteria that we need to assess for our header. That already makes our work a little easier. So now let’s actually get into testing a component.