What is assistive technology and why should you test it in your user research?

by Em Sutcliffe

Are you confident that people with accessibility needs will be able to use your product or service?

If you don’t get accessibility user testing right, you risk designing a service that excludes people, when everyone has the right to access it.

Assistive technology is equipment or software that enables people to interact with digital services and live more independently. When did you last test your service with someone who interacts in this way?

As a research community, we need to do more to include everyone in our research. To help you get started with this kind of testing, let’s explore some examples of assistive technologies.

Examples of assistive technologies

Screen readers

Screen readers read aloud what is on screen as users navigate the page using a keyboard. They can also translate what is on the screen to a Braille display.

Blind and sight-impaired people typically use screen readers, but they can also be used by anyone who prefers to listen to information rather than read it on screen.

The most commonly used screen readers include:

  • JAWS (Desktop)
  • NVDA (Desktop)
  • VoiceOver (Mobile)
  • TalkBack (Mobile)

Speech-to-text software

This is software that converts speech into commands for navigating a service, or into dictated text.

People who cannot use a keyboard or mouse for long periods (or at all) sometimes use this software. It can also be helpful for anyone who prefers to interact using speech.

Dragon is a well-known example of speech-to-text software. Many devices also have this feature built in, such as Siri on Apple devices.

Screen magnification software

Screen magnifiers can enlarge part of the screen or all of it. Sometimes people will also use a handheld magnifier at the same time.

ZoomText is an example of screen magnification software and, as with speech-to-text, some devices have magnification built in.

Adaptive keyboards, mice, and alternative input devices

A typical keyboard and mouse have been designed to be used in one way. Adaptive keyboards and mice have been modified to meet particular needs and help people to interact with digital devices.

Examples of alternative keyboards include:

  • Larger keys
  • Coloured overlays
  • Braille displays
  • Ergonomic shape

Examples of alternative mice include:

  • Rollerballs to move the cursor
  • Horizontal bar mice
  • Vertical mice
  • Ergonomic shape

Why should you include assistive technology testing in your user research?

Around 20% of the population interacts with their devices using assistive technology, which in the UK works out to approximately 13 million people (20% of a population of roughly 67 million). Just thinking about people who might use screen readers: around 2 million people in the UK are living with sight loss, and every six minutes someone is told they are losing their sight.

Releasing inaccessible websites risks excluding a large number of people from independently completing tasks that are critical to them going about their day-to-day life. Imagine not being able to get a new passport, order food from the shops or even book an important appointment.

It’s not only about the people you could exclude. If you’ve worked within the public sector, you’ll be familiar with the Government Service Standard, a set of principles to help teams create and build a good service. To meet the standard, your service must work with a list of common assistive technologies before it is released to real users. Even if you work outside the public sector, I’d recommend using this list as a starting point.

Some teams rely on automated and/or manual testing of the code to ensure their service works with these technologies. Although this testing can help identify issues, in my experience it isn’t enough. I’m a firm believer that without including assistive technology users in your research, you can’t be confident you’ve got this right.
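
To make this concrete, here is a minimal sketch of what an automated check often looks like: a Playwright test that runs the axe-core engine over a page. The URL, test name and project setup are assumptions for illustration, not details of any real service. A passing run only means the scan found no machine-detectable violations; it says nothing about how the page actually reads or behaves for a real assistive technology user.

```typescript
// Minimal sketch of an automated accessibility check, assuming a project with
// @playwright/test and @axe-core/playwright installed.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('registration page has no machine-detectable accessibility violations', async ({ page }) => {
  // Hypothetical URL, standing in for whichever page of your service is under test.
  await page.goto('https://example.com/register');

  // Run the axe-core scan against the rendered page.
  const results = await new AxeBuilder({ page }).analyze();

  // The test passes when the scan reports no violations. Passing here does not
  // tell you whether the page makes sense to a screen reader user in practice.
  expect(results.violations).toEqual([]);
});
```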

Here are some examples from my experience to show you why.

Example 1:

Your service includes a section of content explaining to the user that they can save their progress in a registration form and come back later. The content has a heading of ‘Return to a saved registration’.

  1. The tester does a manual test of the page using a screen reader. They navigate the page using the keyboard with no problems.
  2. You test the same page with a blind screen reader user. They navigate to the heading of this section and hear ‘Return to a saved registration’. What do they do? They keep pressing enter, because it sounds like an action. They are left stuck, thinking the ‘link’ is broken, and don’t know how to progress.

Example 2:

Your service includes a series of guidance pages.

  1. The tester manually tests the page to check that a user can navigate using only a keyboard. They press the space bar to navigate and the page scrolls down. It looks like there are no problems.
  2. You test this page with a user who has arthritis and navigates using an ergonomic keyboard to reduce the strain on their wrists. They press the space bar and the page jumps down too far, with no overlap with the content they have just read. This makes them worry that they might have missed something important, so they have to scroll back up with their mouse to check. This happens on every page of guidance, putting strain on their wrists.

In summary, running automated and manual tests is a great start for uncovering accessibility issues. But you must test with actual users of assistive technology to get a full picture of their context of use and what you need to do to create an accessible service.

If you have any questions about assistive technology, please do get in touch!

Em Sutcliffe's avatar

Em Sutcliffe

Lead Design Researcher

Contact Em
