Guidelines and specs are helpful in the accessibility world, but they don’t always help us get to the heart of the matter: Is your site, product, or service truly usable? Joanna Briggs shares why moderated, remote usability testing is so effective.
While there are numerous guidelines and technical specs to follow when creating accessible products, they just aren’t enough if you want to truly measure whether you’ve hit the mark. At Simply Accessible, we want to know—really know—the experience someone is having across various digital platforms. User testing isn’t just about finding out whether or not the site works with people’s screen readers. It goes beyond functionality to understanding how people with all kinds of disabilities are actually interacting with the digital world—and your product in it—on a day-to-day basis.
Remote testing is effective testing
When it comes to usability studies, there is no win or lose. Usability studies are an opportunity to learn about how people actually use the thing you made, in their lives. To that end, our usability studies look beyond just how the assistive technology (AT) works to include how people use assistive technology. We specialize in moderated, remote usability studies. This is the spice in our recipe.
When we do moderated, remote sessions, we gain intimate access to other people’s computers and phones. Our participants allow us into their own digital world. Unlike the testing experiences users have likely endured before, where they have to go to an office and work in a room on devices they may not ordinarily use, we keep individuals in their own home or office environments. In doing so, we gain insight that isn’t otherwise available. We get the most accurate glimpse of a user experience from the words of our participants. Living inside the same experience with people using their own devices, we hear all of it. We’ll hear when a participant says, “Did you hear that? Did you hear what it said?” and get a real-time look at when things don’t work.
Remote testing with our usability testers allows us to observe how something’s used and get honest reactions. When something’s terrible, believe me – they let us know. Manual accessibility testing can tell us whether something receives focus or gets announced. But it’s only a person in a usability session who can tell us if it’s useful, or annoying, or even physically painful. They’re the judge of whether or not an error message makes sense. Maybe they didn’t notice the most important alert at all. Being able to learn in the moment what works and what doesn’t is always revealing, and reminds us of the need for good user experiences over technical compliance, every time.
Unmoderated vs. moderated testing
We’ve tried unmoderated remote testing platforms at Simply Accessible, and honestly, they didn’t give us the answers we (and our clients) need. Most importantly, we didn’t find out the why. We couldn’t ask any of the follow-up questions we’re always dying to ask. This is where many insights can be found: in those deeper follow-up discussions. (By the way, auto-generated heat maps don’t show anything when the participant isn’t using a mouse.)
Once, before agreeing to participate in one of our sessions, a participant asked me what Simply Accessible does to accommodate people with different cognitive disabilities. I let her know that we ask first instead of assuming we know what anyone needs. Again, there are no right or wrong answers because we’re here to observe. If anything, the right answer is always to tell us what the experience is like for you. Accommodations are there for everyone. If someone gets mentally or physically tired, we’ll take a break or finish up another day. Some participants prefer a text chat to communicate with us during the session, so we use that too.
As the head of usability testing at Simply Accessible, the best part of my day is listening to people talk about themselves. I love knowing what’s on their desktops and home screens, why they use one assistive technology over another, how they use mobile devices, what they love or hate, and what’s just annoying. It’s equally amazing to show our clients how their customers with low vision read content on their mobile devices, or how someone else may use their voice as a mouse on a desktop computer to navigate a website.
There are as many unique needs as there are unique users, and I love knowing that the feedback we receive from these sessions goes directly into improving the user experience for everyone.
We aren’t just looking at what assistive hardware and software people use or what apps and browser plug-ins they use. We get to peek into how someone organizes their digital life. Just as with other usability testing, this kind of exploration reveals how people think and what they want when interacting with digital content. For example, in one recent Simply Accessible mobile study, we observed how different people arrange their home screens. While one participant had what first appeared to be multiple screens of chaos, she explained how it was actually an order she created; it works for her.
Another participant with limited dexterity arranged and grouped his iPhone apps into categories on a single screen. This meant he didn’t need to swipe between screens, something he prefers to avoid because tapping into a second level is easier for him. His arrangement of app groups created space around them so that the targets were easier to hit, resulting in fewer mis-taps.
When it comes right down to it, if a site, service or product is poorly designed, someone who has a disability is just going to encounter the same problems as everyone else, only much more pronounced.
If a purchasing process is long and tedious, it’s universally long and tedious. Bad user experiences discriminate against no one. We learn quickly through our usability testing where the problematic issues are, and we also find barriers that might prevent someone from completing a long and tedious task that someone without a disability can complete without as much tedium.
Meet your users
We try to cover a spectrum of people in our usability studies. Sure, it’s not absolutely every scenario, but we do have a wide range of participants who represent all sorts of user needs, whether they be auditory, speech, physical, neurological, cognitive, visual, or a combination. There’s a brilliant quote from Stephen M. Shore that goes, “If you’ve met one individual with autism, you’ve met one individual with autism.” This speaks to why Simply Accessible includes as many people as we can in our usability work.
In our studies, we meet our users where they live. Our participants join us from many countries. People have their own unique preferences across many differing devices when it comes to using assistive technology, so instead of trying to recreate the environment they might have in a cold lab, we take our tests directly to them.
When people think of digital accessibility, they commonly just associate it with screen readers. Covering different kinds of screen readers is vitally important – but how is an individual actually using that technology? Screen readers aren’t people – people are people.
Naturally, our participants are using both Mac and Windows on the desktop and iOS, Android and Windows mobile phones and tablets. They’ve got their favourite browsers with plug-ins and add-ons. They have figured out ways to work within the digital landscape and the parameters of their abilities. The information and insight available to us by seeing our participants navigate their digital world is invaluable.
This is how we keep improving
Accessibility has come a long way in the past decade, and the advances in technology that now allow us to facilitate remote usability sessions have helped us improve and evolve our services. We have gained a deeper understanding of just how different apps, devices, programs, services, and websites are used by people. We have a wider interpretation of “disability” and of who accessibility guidelines really benefit. Sites, products, and services can’t just be accessible—they have to be usable. Remote testing with real users in their own digital environments is how we make progress when it comes to making things better—for everyone.
Curious about our approach to usability testing? Read more about it here.