112: Real People, Real Feedback: The Why and How of User Testing, De La Calle Tepache Assorted Flavors

In this episode, we talk through the ins and outs of user testing, why we do it, how it is distinct from auditing, and much more.

Listen

Watch

Mentioned in this Episode

Transcript

Chris: Welcome to the Accessibility Craft Podcast, where we explore the art of creating accessible websites while trying out interesting craft beverages. This podcast is brought to you by the team at Equalize Digital, a WordPress accessibility company, and the proud creators of the Accessibility Checker plugin. And now, on to the show.

Amber: Hey everybody, it’s Amber and I’m here today with Steve.

Steve: Hello everyone.

Amber: And Chris.

Chris: Hey everybody.

Amber: And we are going to be talking about user testing. This is episode 112 of Accessibility Craft. If you would like to access a transcript and show notes, you can get those if you go to AccessibilityCraft.com/112.

Today’s Beverage

Amber: We always start every episode off with a beverage. What are we drinking today, Chris?

Chris: We are having, and I hope I don’t butcher this pronunciation, but it’s called Tepache, and it’s a traditional South American drink made out of fermented pineapple juice. And we’re having a series of Tepache flavors produced by De La Calle, which is a South American beverage producer.

And they have all kinds of fun and festive, kind of out-of-the-box flavors, and I think we’re all going to be trying an assortment of those today, or at least one different flavor each. So I’ll lead with mine. I’m keeping my can cold, of course, but I’m having a watermelon jalapeno.

Steve, what are you having? What did you go with?

Steve: I went with the… I thought it was Tepache, but you might be right. The…

Amber: Well, I also was going to say, I think the name of the company is going to be De La Calle. Because isn’t two L’s a Y?

Chris: Yeah, probably.

Amber: In Spanish?

Chris: Yeah, probably.

Steve: So I went with the Hibiscus Citrus.

Chris: Yum.

Amber: And I have a Mango Chili.

It’s a little spicy, maybe. We’ll see.

Chris: Yeah, we’ll see. We’ll see. But we can each give our impressions. I’m going to go ahead and crack mine open here. I know my kids have been tasting them.

Amber: Oh, you don’t even want to talk about the can. You’re thirsty, huh?

Chris: I am thirsty. But our kids have been trying them and they don’t like them, so I’m guessing it won’t be sweet.

Amber: Yeah, mine says, mango chili, deliciously delicate, yet bold balance of sweet, juicy mangoes and a hint of savory chili spice.

Chris: I definitely smell the jalapeno on the nose for mine.

Amber: Craft fermented, best served chilled. It smells jalapeno-y. More than watermelon?

Chris: Yeah, but I definitely get, okay, this is good.

It’s off sweet, a little bit of bubbles, not super bubbly. The watermelon flavor is definitely pronounced. There’s just the slightest hint of burn, but I really smell the jalapeno on the nose. This is good.

Amber: I just opened mine, and I’ve got it, I don’t know, 12 inches away from me, and I can smell the mango.

This is a very, I don’t know.

Chris: Aromatic.

Amber: You can smell it. Some drinks I have to put right up to my nose to really be able to smell, but I opened the can and I got mango immediately.

Steve: So mine says a refreshing harmony of sweet floral hibiscus balanced by a squeeze of citrus. And my can is, is that mauve? Like a pink, like an off-pink?

Chris: Yeah. And mine’s like watermelon pink.

Amber: Yeah, these all have nice, nice colors. Mine’s red with yellow and greens on it. It’s very bright.

Steve: Oh, yeah.

Amber: And artistic. Also, the other thing I noticed, even without having it really close to my ear, and I don’t know if I’ve ever experienced this with any other beverage that we’ve opened.

I have earbuds in, and I can hear it crackling or fizzing. It didn’t fizz up, I don’t see anything, but I can hear a bubble sound in the can through my earbuds, and it’s not right next to my ear. If I put it next to my ear, I can literally hear it fizzing. Like Rice Krispies?

Steve: Yeah.

Amber: That’s what it sounds like to me.

Steve: It’s very bubbly.

Amber: Can you hear it too?

Steve: Could you guys hear that when I poured it?

Chris: I did hear a bit of a hiss there on the mic. Yeah.

Amber: What color is that? That’s beautiful.

Chris: Yeah, it’s got that hibiscus pink.

Steve: Looks like strawberry lemonade or something. Yeah. It’s interesting because I went into it thinking it would be sweet, but it’s not. The citrus is strong. I can really get the citrusy kind of taste.

Amber: Yeah, it’s really not sweet at all, which makes me wonder.

Chris: Mine has erythritol as the third ingredient, which is like a nicer artificial sweetener. I don’t know if y’all’s has that. But the rest of the ingredients look pretty good.

Amber: So it says 8 grams of total sugars. But also seven grams of erythritol. Is that going to give us cancer?

Chris: I don’t think erythritol is as bad. And it’s got the asterisk on it, and it says organic ingredients. So I guess it’s organic erythritol.

Steve: Yeah.

Amber: Yeah, so this is interesting. Carbonated water is the first ingredient, not like a juice.

And that kind of makes me think, like, the way it tasted to me is almost a little bit more like a sparkling water with flavor; it doesn’t taste like a juice or a really sweet soda. I like it, I think. But mine has carbonated water. I don’t know. Tepache? Whatever that is. Their special fermentation cultures.

Oh, it says it’s made with pineapple, orange, apple, grapefruit, lemon, and lime juice, tamarind, and turbinado sugar. So that’s like the really big sugar crystals that are brown. Akashai, natural flavors, lime juice concentrate, agave, lime extract, cinnamon, chili, ascorbic acid, and rosemary extract.

Okay. Oh wait, I missed it. There is mango juice. For a second I was like, where’s the mango? No, there is mango, it’s up there. I don’t know. It seems like it has pretty good ingredients. I gotta put it back in my koozie now that I’ve read the can. But, I don’t know. So you think the kids don’t like it because it’s not sweet enough?

Chris: Yeah, I’m guessing so.

Steve: Yeah, it’s pretty good. I mean, it’s like neutral. It’s neutral.

Amber: Yeah, it’s interesting. This is one where I feel like it maybe smells better than it tastes.

Not that I think it tastes bad. It’s just, I opened it and it smelled so fruity that I had this flavor expectation and then I took a sip and I was like, oh.

Steve: Yeah. Yeah, I would agree.

Amber: But, have you, has your family tried it yet, Steve? I know sometimes they try.

Steve: Actually, I had to find the box in the bottom of the pantry. I was like, where are these drinks? It was in the bottom of the pantry with dog food on top of it.

Amber: Whoops. Well, maybe you can report back later if your kids try it, whether they like it, or if they’re like ours. Actually, our big ones are the ones who tried it and didn’t finish it, but the little ones drank it.

Steve: Interesting.

Amber: I don’t know. I will say my mango chili does not have any chili kick to it at all.

Like we’ve had drinks on this podcast that were spicy and made my mouth feel spicy. I don’t get any chili. Do you get jalapeno, Chris, or no?

Chris: Just the slightest bit. I feel like maybe, like you said, the nose on these, the smell and the flavor descriptions, are overpromising, and then the beverage itself is underdelivering a little bit for me.

I wish the flavors were a little more concentrated. Honestly, if it were up to me, they may be able to fix that by just having less carbonated water and more of the other juices, so it’s a bit less watered down. Maybe that’s the core issue for me.

It’s just a little bit watered down.

Amber: Yeah. So where do you land on our very scientific thumbs up, thumbs in the middle, thumbs down scale of would you buy it again?

Chris: I’m in the middle. Yeah. You’re in the middle?

Steve: I’m in the middle.

Amber: Steve’s in the middle? I think I’m in the middle, too.

Chris: All the flavors, right? That’s what really grabbed me: mango chili, watermelon jalapeno. I’m like, ooh, I want to try that, this sounds great.

Amber: Yeah, I definitely give them props for having unique flavors and, I think, decent ingredients. And it’s interesting that it’s made with fermented pineapple juice to create the carbonation, and not, I don’t know, carbon dioxide or whatever it is that they use to carbonate our favorite pop that we advertise for, incidentally, on every episode. But yeah, I don’t know if it’s the most thrilling thing I’ve drunk.

Steve: If you’ve just mowed the grass or something and you just want something to sit on the porch and cool down with, it’s watery and…

Amber: It’d be a good refresher. Yeah, maybe it’d be a good mixer.

Steve: Might be.

Amber: Or a cocktail.

Steve: Put some alcohol in it. Makes everything better, right?

Amber: I don’t know. I am curious to try the other flavors. I’m definitely going to try them. So, anyway, while we finish our beverages…

What is user testing?

Amber: We are going to be talking about today’s topic, which is real people, real feedback, the why and how of user testing.

Chris: So, I’m hoping, Amber, maybe to start us off, you can explain fundamentally what user testing is and how this type of testing is different from other accessibility testing.

Amber: Sure. The basics of it is that user testing is testing of a website or an application done by people who use assistive technologies every day to experience the web.

This might be blind people who use screen readers, or a deaf person who relies on captions, or someone who has mobility differences and uses voice activation, or a sip-and-puff device, or an alternative input device like a Darci USB keyboard rather than a traditional mouse and keyboard. So it is testing done by people who use this technology every day, looking at how they experience the website.

And it typically follows some sort of path or journey and is guided by an accessibility professional, giving them prompts, asking them to try and accomplish things, and observing what they do. That’s in contrast to a standard accessibility test, where a trained accessibility professional is going to look at every component of a page from top to bottom and compare each component’s markup to what the Web Content Accessibility Guidelines require. User testing follows a path, and it’s maybe less thorough in some ways, but it also might provide a clearer picture of what a person with a disability might experience.

Why is user testing important?

Steve: Yeah. Awesome. Cool. So that’s what user testing is, and that’s a great description, but why is it important? And specifically, why do we need to do user testing with people with disabilities?

Chris: I think I would compare it to other types of user acceptance testing that aren’t accessibility related.

It’s like why you have someone other than the designer and the developer test your website before it launches. Try to get the client in there, try to get an everyday user in there, because the type of feedback you get is different. It’s not, this thing is three pixels out of place, or you’ve violated WCAG 2-point-whatever, and this is what you need to change.

It’s more like qualitative feedback about their overall experience and where they’ve hit blockers. And it’s not necessarily related to a specification or a standard or that sort of thing. Amber, do you have more to add?

Amber: Yeah. We’ve talked about this before on the podcast. There’s a saying from the disability rights movement: Nothing about us without us. And it is really important to include people with disabilities in accessibility strategy, which includes accessibility testing but goes beyond it, like how we might approach certain things in our organization or in our website process.

Because people who live something every day are going to have very different experiences from someone who just works in the field, and they’re going to bring different perspectives. So it is very important to include those real-world users. The other thing I would say is always very interesting about user testing with people with disabilities is that frequently user testing in general brings up problems that go beyond accessibility problems and are more like higher-level structural problems. When you do user testing, you might find out that the way you organized your navigation menu is not intuitive for anyone other than you, because you might have three or four people not know that if they go into the navigation menu and go to a specific page, they’re going to find the thing you want them to find.

Instead, you’re going to watch them go to all the wrong pages looking for the thing that you’ve said, hey, go here and do this. So really, doing any sort of testing with real people who have been apart from the whole design, development, and content creation planning process is going to give you a very eye-opening look at what a real person experiences on a website, and it’s not always the same as what we might think.

Steve: Yeah. And when you do user testing, like you stated, with people with typical abilities and people with disabilities, people with typical abilities have a wide spectrum of ways they interpret an application or a website. And the same goes for people with disabilities doing user testing. You could have two user testers, and, like we’ve had, one that was more technical and one that was the more typical assistive technology user, and you get a lot of different results even with those two users with a disability. So it’s interesting.

Amber: Yeah, I’ll say we’ve had projects where we’ve brought in blind people who are also certified accessibility specialists, which is fabulous because they can give really good feedback on maybe the HTML and how it should be improved but isn’t. But we also do testing with someone who is blind and just uses the internet normally, to shop and that kind of stuff, not in their career. And that, to some degree, is something I think you don’t want to skip doing, because they’re not going to look at a problem the way an accessibility specialist looks at a problem.

Brought to you by Accessibility Checker

Amber: I want, I feel like we should talk about this more in detail, but we’re going to take a short break and we’ll be right back.

Steve: This episode of Accessibility Craft is sponsored by Equalize Digital Accessibility Checker, the WordPress plugin that helps you find accessibility problems before you hit publish. Thousands of businesses, non profits, universities, and government agencies around the world trust Accessibility Checker to help their teams find, fix, and prevent accessibility problems on an ongoing basis.

New to accessibility? Equalize Digital Accessibility Checker is here to teach you every step of the way. Whether you’re a content creator or a developer, our detailed documentation guides you through fixing accessibility issues. Never lose track of accessibility again with real time scans each time you save, powerful reports inside the WordPress dashboard, and a front end view to help you track down hard to find issues.

Scan unlimited posts and pages with Accessibility Checker Free. Upgrade to Accessibility Checker Pro to scan your website in bulk, whether it has 10 pages or 10,000. Download Accessibility Checker today at EqualizeDigital.com/Accessibility-Checker/. Use coupon code AccessibilityCraft to save 10% on any plan.

How do you run a user testing session?

Chris: We’ve done our fair share of user testing sessions across all sorts of different projects. Amber, you typically lead those. Can you walk our audience through what that looks like, from the very beginning to the very end of one of those sessions?

Amber: Sure. I will also call out, we put a link to this in the show notes, but I didn’t realize how long ago it was: at one of the first meetups that I presented at, I gave a talk on how to run user testing sessions with real-world users. So I would definitely recommend people check that out to see all of this written out. But what I would say, from a high level, is that we start by having a conversation with the website owner.

Whoever that might be, or anyone who works on it, and figuring out the goals or objectives for what you want people to do on the website, whether that is searching for something, filtering in a certain way, and then going to a product page, learning about it, adding to cart, going to checkout, and making a purchase.

Or going and reading blog posts and clicking on affiliate links, right? Whatever the objective of the website is, determining that. And then from there, we map out a plan. Our user testing sessions usually include around eight to twelve prompts, depending on what the goals are for the website and how much time we have in a user session.

We try to limit them to about 90 minutes max, and we do them all over Zoom. So I write out my questions for what I’m going to have someone do, and it might just say: starting on the homepage, this is a shoe website, and you are looking for men’s hiking boots in size 12 that are waterproof. What would you do?

That’s an example of a prompt. And then you watch them go. Along the way, I have to spend time going through all the things that I think they’re going to do. We’ll discuss this with our clients and figure out their expectations about what users might do. For example, oh, this website has a chat bot.

So a user might use the chat bot to say, hey, can you help me find this? Or they might search, or they might do different things. So I have to spend time experiencing this myself first to try and figure out what the paths are and what problems I might want to dig into. Or if I know that it’s important to get feedback on the chat bot and a user doesn’t use the chat bot, then towards the end of a session we might circle back and say, hey, did you know there was a chat bot on this website?

So first of all, we find out, did they even know it was there? And we ask, okay, why did you use it, or why not? So we’re thinking about some open-ended questions that we might want to talk about. And then I might literally say, okay, now I want you to use the chat bot, because I want to get feedback from you on the chat bot.

And then going back to those things that maybe I thought earlier they were going to experience, but they didn’t; they skipped them for whatever reason. So we’ll circle back. So I map all of this out. Then we actually get on Zoom and we run it. We have the user sharing their screen. And for our user testing sessions, we allow our customers to join us if they want to, so that they have the opportunity to ask questions.

Typically, we have them hold their questions until the end. So we’ll do the whole session, and it’s recorded. We have general feedback questions that we ask at the end, like, how would you rate this on a scale of 1 to 10 for accessibility? Do you use websites similar to this one? For example, we did some user testing for the city bus websites in Fort Collins, Colorado, and there’s a meetup presentation that talks a little bit about that as well.

And so we would always ask them, do you normally ride the bus? And all of our testers on that project said yes. And then we’d say, okay, how do you get information? They weren’t in that city, but we would ask, how do you get information about the bus? Since you normally do this, what would you do as an alternative?

Is there an example of a bus website that you think is better than what you just experienced? So we’re asking some reflection-type questions at the end. And then we’ll allow our clients to ask questions. And then, for our purposes, we have to go back through the recording afterwards, and we provide a written report, pull out the important parts, and do some reflection on where they struggled and what our recommendations are.

Is user testing the same as an audit?

Steve: So we talk a lot about manual audits, and we talk a lot about the plugins that can help you do automated audits. Is user testing a replacement for either of those, and when should we be doing user testing?

Chris: I think early on in my, if you want to call it, accessibility career, when I was still learning some of the ropes, I would conflate user testing with auditing, before I understood the difference between the two, thinking that you would get similar feedback. But, and we alluded to this a little bit earlier, the type of feedback you get is very different. A manual accessibility evaluation for an audit by a professional is going to get you very quantitative data: the exact, specific accessibility issues that can be measured and proven, observed against a set of standards, whatever set of standards you’re using. Whereas user testing is much more qualitative feedback.

It’s: how is the experience? Have you done this before? Can you think of other times when it was easier? Where did you get confused? Where did you get hung up? That kind of qualitative feedback is much more like a focus group, in real-world product vernacular: it’s getting one or multiple people together and having them provide you with their personal feelings about how your website functions and how it makes them feel as a user.

For that reason, I think in a lot of senses it makes the most sense to do that type of qualitative analysis after you’ve done the quantitative piece. So after you’ve basically gone through and, to the best of your team’s abilities, made sure that you believe it’s accessible. And then you bring in users to say, well, you got this right, but this could be better, right?

Or, technically I could get through this, but man, it wasted a lot of my time, right? And I think maybe Amber or Steve could expound on this a little, but there are times, I feel like, where you can technically meet the standard, but it’s still not actually a good experience for a disabled user.

Amber: Yeah. The usability is poor, even though it’s maybe technically WCAG compliant.

Steve: Yeah. I would just add, when should it be done? We get different types of clients, right? Like Amber said, we’ve done some for other clients, and sometimes that happens after the application or the website is done.

And I think from a technical standpoint…

Amber: it’s already launched.

Steve: Yeah, I think from a technical standpoint, that can be a little difficult, and I think we’ll touch on that here in a little bit. Just because an application’s already created, refactoring things can be a little hard. I think if you have buy-in with your stakeholders on whatever project you’re doing, the further left you can shift in that process, the better.

Even some bit of user testing that can be done in designs or prototypes or proofs of concept is definitely helpful. But you know, if a client comes to you with an already-made application, then developers have to figure out a way to implement those changes.

Amber: Yeah. We usually advise people that the order goes: automated scanning first, because you don’t need a human being to tell you every time you forgot alt text; there are obvious things. So do automated scanning first, with a tool like Accessibility Checker. Then you do manual testing with the keyboard, then manual testing with a screen reader. The order for those is because if you can’t reach items with the keyboard, then you won’t be able to screen reader test them anyway, so you might as well just test them with the keyboard first.

And then, once everything is fixed, you’d want to do user testing. There are times when we have had people come in for user testing only. They didn’t have us do an audit; they were internally working on their own accessibility, but they didn’t have an accessibility specialist, so they never had an audit.

And when we do that, we typically prefer to have multiple user testing sessions and to space them out, because what we have found occasionally with those is that there might be some major blockers that they have not discovered themselves that will make it incredibly difficult for a user to use the website. And it’s not super helpful to have multiple users that are just like, I cannot accomplish this task, I cannot accomplish this task, because everything is a div and nothing is a button, or whatever that might be.

So while it can be done, when we do that, we usually say, all right, we’re going to have one session. And of course we advise everyone to have the audit first, but if you don’t want to have an audit, or it doesn’t fit in your budget, that doesn’t mean you can’t do user testing.

So we’ll have one session, and we’ll provide a report that might include, not a comprehensive list, but it would say, these things are WCAG failures; it will alert to that. And then we wait, whatever time makes sense, 30 to 45 days, before we do the remaining user testing sessions, to give them time to address those major blockers.

That way, the later sessions become more usable feedback, not just, I literally can’t do this.

I think you can do it in a different order, but I don’t know. It probably depends on how confident you are in your dev skills and your own ability to test.

When should you do user testing?

Amber: Steve, you alluded to what happens if you do user testing after something is already launched and live. I wonder if you could talk a little bit about how website teams might take feedback that comes from user testing sessions, how you might prioritize things, and then what happens if you get that feedback really late in the process, like you were talking about?

Steve: Yeah, totally. So, depending on how that user testing information is passed off, it may not have been analyzed. So you may be required, as a developer or a tech lead of some sort, to organize it in some fashion, but I would definitely try to identify the severity of issues and their frequency, the impact specifically on disabled users, and the effort to fix them.

So, just like with most things, you can put them into a priority matrix, right? Most projects have some kind of priority matrix. We typically follow critical, high, medium, low, and those can have an action applied to them. Typically, critical is: it’s broken, fix it now.

And then it goes down from there. But to your question about what if you already have an application, the user testing comes after the fact, and now it’s time to fix those things: I think you definitely want to assess the risk, right?

Can the user still complete the task despite the issue? If not, then it’s definitely something critical that needs to be addressed right away. And look for quick fixes, things that are high impact but low effort; those go a long way. And we do that even with our automated testing, right?

Our plugin will surface kind of high-level things for you to focus on, so you can make big wins with little effort. And if there are issues, have communication with the client, or if there’s a stakeholder involved, or if it’s your own company, have those conversations about understanding what the impact is for different users and the potential business risks that could arise from these problems.

And those risks could be legal, right? If you’re user testing services and people can’t even access a checkout, or they can’t access their medical records, that is a significant business risk to address. And then, I think, sometimes, especially if you have a website or application that’s already live, right?

What do you do? We know these are accessibility issues; we either don’t have the bandwidth, or it’s going to take a certain amount of time to actually remediate those issues. How do we be accessible in the meantime? Well, I think what you do is be honest and open with your users.

You put those things inside of an accessibility statement, and you put on your website or your application that, hey, we have identified these things through user testing, and we are planning to fix them within this timeframe. So I think a lot of times with accessibility, it’s just about being honest and open and being proactive with your approach.

Amber: Yeah, I really agree. I think that’s a great recommendation.

Our Best Advice Around User Testing

Chris: So Steve just gave a really good golden nugget, but I’m going to turn right around and ask each of us, and if Steve can come up with another golden nugget, I’m sure he can, but maybe we’ll let Steve go last: if each of us could give just one piece of advice to a company launching a website, app, or product, what would that be regarding user testing?

I’ll lead, ’cause I haven’t spoken in a minute, and give you two time to think. I think, from a procurement angle, it’s really important to understand, first of all, that user testing does not have to be incredibly expensive. Usually the levers to pull for scope are the number of testers, the number of testing sessions, and how much we’re testing.

And if you can’t have a full focus group of multiple different types of users, just see how you can scale that back to something that works for your budget. Don’t have all or nothing thinking about it. A good solid user testing session can be from a few hundred to maybe a thousand dollars depending on who you’re working with.

And it could be even less than that if you’re working with just an individual who’s qualified, off of wherever you might hire someone for that, maybe Fiverr or something. The other thing that I will say is, if you are contracting for user testing through a firm, like Equalize Digital or another one, I think it’s a really smart idea to ask some pointed questions, like, hey, do you pay your testers?

How much do you pay your testers? And where are your testers located? Get some basic information and do some basic vetting, because the unfortunate reality is that a lot of individuals with disabilities are underpaid, or are being told to do this through an internship or some other illicit thing where they’re basically working for free or almost no money, and it’s a real problem.

So make sure that whoever you’re having do your testing that they’re compensating their testers fairly.

Amber: Well, or if you’re doing it yourself, if you’ve emailed your email list and said, hey, we want to do some user testing, is there anyone in our network who wants to do it? Don’t expect your audience to volunteer their time to test your website.

Offer to pay them, even if it’s, we’ll pay you 50 or 75 bucks an hour, or 25, whatever makes sense. I would offer to pay people for their time, because their feedback is really valuable.

So my advice: we have a few resources that we can link; I mentioned the meetup. There are a lot of different tools and processes out there that can be used to run your own user tests. Obviously, it is very helpful to bring in a company like Equalize Digital, because you get the benefit of having an accessibility professional running it, interpreting what the people are seeing, planning it, and giving you that very detailed feedback. But I don’t want pricing or cost to limit people, and even if you’re running your own user testing sessions, you’re still going to get very valuable feedback.

So my recommendation is to plan it into your process at least annually, or whenever you have very significant changes to your website or your product. If you’re a WordPress plugin developer, this would be the same thing. You want to know how people are using your plugin? You should get on Zoom, don’t give them leading questions, but ask them to accomplish tasks in your plugin and watch them do it.

It’s really going to provide so much great feedback. So plan it regularly, and if you can’t hire a professional, there are ways to run these sessions yourself. That would be my advice. What about you, Steve?

Steve: Yeah. So I don’t know if I have another golden nugget; I don’t really know what the first golden nugget was. But I’ll go back a little bit to the first user testing that I remember being involved with. We did it for a government portal application we were putting together, and our team lives in different states, so Amber did the user testing and she recorded it.

Amber: Yeah. We had the individuals come into our office and bring their own devices, but sit at desks, and we had cameras on boom arms looking down so you could see what their hands were doing at the same time.

Steve: And you could hear them talking through what they were doing. And I would say, it’s one thing to listen to a podcast about user testing and try to change your mind or have a paradigm shift in your thinking about the importance of this. But man, for me, when I saw it, when it was tangible, when I could see what they were doing on their keyboard, I could hear the feedback they were giving, and I could see on the screen what they were doing.

It really was much more than a mind shift; it was a heart shift a little bit. It made it tangible and real to me that these are real people trying to use something that I made, and it’s important for me to make it so that they can use it. So I would just say, if you can experience a disabled person doing user testing, it really shifts your mind and shifts your heart too.

Amber: Yeah. So even if you only do it once, do one.

Steve: Yeah.

Amber: And share it with everyone in your organization who has any say on the website, because it might help. Well, I think that’s probably a good, well-rounded discussion about user testing. We’ll see. Our listeners are very welcome to provide us feedback.

So, I think we’re probably gonna sign off for the day, and we’ll be back in a couple of weeks with another episode.

Steve: Cheers guys. Bye.

Chris: Thanks for listening to Accessibility Craft.

If you enjoyed this episode, please subscribe in your podcast app to get notified when future episodes release. You can find Accessibility Craft on Apple Podcasts, Spotify, and more. And if building accessibility awareness is important to you, please consider rating Accessibility Craft 5 stars on Apple Podcasts.

Accessibility Craft is produced by Equalize Digital and hosted by Amber Hinds, Chris Hinds, and Steve Jones. Steve Jones composed our theme music. Learn how we help make thousands of WordPress websites more accessible at EqualizeDigital.Com.