Apple's new accessibility features: all you need to know

Technology is intended to help physically challenged people to communicate and carry out their daily routines

Apple's door detection, which uses the magnifier app on iPhones, helps to locate and describe doors for users, and is particularly useful for those who are blind or have low vision. Photo: Apple

Apple revealed several new accessibility features on its devices, which are aimed at assisting users who are physically challenged.

The updates, which come ahead of Global Accessibility Awareness Day on Thursday, are part of Apple's efforts to provide a more inclusive ecosystem and will be available on iPhones, iPads, Apple Watches and Macs, the company said in a statement.

“Apple embeds accessibility into every aspect of our work, and we are committed to designing the best products and services for everyone,” said Sarah Herrlinger, Apple’s senior director of accessibility policy and initiatives.

The new features "combine innovation and creativity from teams across Apple to give users more options to use our products in ways that best suit their needs and lives".

More than one billion people in the world live with some form of disability, corresponding to about 15 per cent of the global population, and they are considered to be the world's biggest minority, according to the World Health Organisation.

Technology can help them dramatically in communicating and carrying out their daily lives, and companies like Apple have been conducting research and integrating next-generation technologies into their devices to help these people.

Apple's features will be released later this year, but it is not clear whether they will be part of iOS 16, which is expected to be introduced during the next iPhone's launch in September.

We take a brief look at the new tools.

Door detection for blind or low-vision users

The most notable new accessibility feature from Apple is "door detection", which is particularly useful for users who are blind or have poor vision.

Door detection, which uses the magnifier app on iPhones, helps to locate and describe doors for users, including how far away a door is, whether it is open or closed, and, if closed, how it can be opened (by turning a knob or pulling a handle).

If there is a queue, the app will use "people detection" to inform the user how far he or she is from the next person to avoid being too close — particularly important in an age of physical distancing.

It can also read signs on a door, giving the user further details on the establishment being approached. The feature relies on Apple's Lidar technology, meaning it can be used only on the Pro models of the iPhone 12 or iPhone 13, or the latest iteration of the iPad Pro.

New languages for VoiceOver

Apple has also added support for more than 20 languages on its VoiceOver screen reader, which aids blind and low-vision users.

Additionally, VoiceOver users on Mac now have a new text checker tool, which discovers common formatting issues such as duplicate spaces or misplaced capital letters, making proofreading documents or e-mails easier.

Live captions for the deaf and hard of hearing

Apple added a major feature, "live captions", for users who are deaf or hard of hearing. Live captioning itself is not new and is already used on a number of streaming platforms and news broadcasts, but Apple has specifically configured it to help users with accessibility needs.

Live captions are available on FaceTime calls, video conferencing and social media apps, and while streaming media content. Users can also adjust font sizes to make text easier to read.

On FaceTime, users with hearing disabilities can benefit from live captions' auto-transcribe feature. On Macs, users can type a response and have it spoken to the rest of the participants.

Are live captions secure?

Yes. Since they are generated within Apple devices, user information stays private and secure.

Apple watch mirroring for those with mobility issues

Apple Watch mirroring, a new tool for the digital timepiece, will help users control an Apple Watch remotely from the iPhone it's paired with.

Normally, controlling an Apple Watch requires tapping on it, but with this new feature, commands can be executed using voice, sound actions or even head-tracking. It is also compatible with external Made for iPhone switches.

Assistive features such as voice control and switch control, as well as health apps, including blood oxygen and heart rate, can be used with mirroring, using only hand gestures.

There are also new quick actions on the Apple Watch, including double-pinching to answer or end a phone call, dismiss notifications, take a photo, play or pause media, and start, pause or resume a workout.

Assistive features such as voice control and switch control, and even health apps such as blood oxygen and heart rate, can now be used with Apple Watch mirroring. Photo: Apple

Buddy controller, making Siri 'pause' and more

Apple's updates include "buddy controller", a feature that lets a user ask someone, such as a care provider, for help in playing a game, especially if the user is physically challenged.

It combines two game controllers into one, so several controllers can drive the input for a single player. No specific controller is needed to use this feature; any controller compatible with the Apple device can be used.

Users with speech disabilities, such as those who stutter or pause in between words, can use the new Siri Pause Time. The feature adjusts the time Siri waits before responding to a request, thereby giving enough time to the user.

Voice control spelling mode, meanwhile, is a new option to dictate custom spellings using letter-by-letter input, while sound recognition can be used to recognise sounds specific to a user's environment, such as those coming from doorbells, alarms or even appliances.

Updated: May 18, 2022, 2:59 PM