
We are living in a golden age of assistive technology for people with vision loss. With advances in machine learning and artificial intelligence, there are now apps using image recognition technology that can describe people, objects and the physical environment.

Screen readers can read out websites, apps and describe images. Personal assistants can take instructions by voice and control a smart home. Glasses and apps can connect a user to a remote human agent to help them navigate the physical environment and use their computer.

The line is blurring between what is "assistive technology" and what is just technology. Our previous article, "Five everyday items originally designed to help people with disability", discussed everyday objects that have been with us for a while and that most of us take for granted.

Likewise, we don't call personal assistants an assistive technology, yet they are popular in the blind community. Autonomous cars won't be classified as an assistive technology, but in many ways they are one.

One of the topics we cover in our inclusive design workshop is gaining an understanding of the diverse range of ways people interact with digital and physical products and services today.

Some people prefer to listen to content. Others may prefer to read it with their eyes, a screen reader, or a braille notetaker.

Some may prefer to complete forms using their voice. Others may prefer to have a remote human agent help them complete a form. And this can change for an individual depending on the scenario and situation the person is in.

Below are some examples of new assistive technologies people with vision loss are using that have us excited.

Remote human agents

Services such as Aira and Be My Eyes enable people who are blind or have low vision to connect to a remote human agent for visual assistance via a live video call using glasses or a smartphone.

They are useful in all kinds of scenarios and situations, including sorting mail and medications, finding meeting rooms at work, navigating public transport, solving CAPTCHAs, and cooking.

A woman using an Aira press kit, tapping the kit with her finger

Image recognition technology

Apps such as TapTapSee and Seeing AI, and glasses such as the OrCam MyEye, use artificial intelligence to identify objects, text and people in pictures and video and describe them aloud to users.

Scenarios where this technology is used include reading printed text, describing the colour of clothes, scanning barcodes to identify products, and identifying currency.

The OrCam MyEye was even piloted for use at the recent Israeli election.

A man using his OrCam MyEye to read the newspaper

Personal assistants

Going beyond asking what the weather is, personal assistants such as the Amazon Echo, Google Home, and Apple HomePod integrate into a smart home, letting people who are blind turn on the air conditioner, control the lights and play music using only their voice.

They can also be useful for simple tasks such as adding items to a digital shopping list.

An Amazon Echo situated against a white background

Final thoughts

A shift is happening where assistive technology is becoming everyday technology. Organisations need to be mindful of this. If we don't support a diverse range of ways to interact with our products and services, then people will go to a competitor who does.

Digital Access has just announced new 2019 dates for our training workshops, including advanced public training workshops for Microsoft Office and Adobe PDF, Inclusive Design, and Web Accessibility Techniques and Testing.

Don't miss out on the latest news and advice from the Digital Access team. Sign up to our newsletter and have it delivered straight to your inbox.