In celebration of Global Accessibility Awareness Day today, Apple earlier this week announced a raft of new accessibility features and updates – some of which will hopefully prove genuinely transformative for users with disabilities.
Slated for release at an unspecified time later this year, the most universal new feature is undoubtedly what the Cupertino-based tech giant is terming Accessibility Nutrition Labels. These will apply to apps in the Apple App Store and allow developers to append a description of the on-device accessibility features their app supports, including tools such as VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, and captions. Not only does this give users with disabilities the option of making more informed choices, it also, importantly, positions accessibility at the point of sale or download, incentivizing developers to leverage accessibility as a competitive advantage.
Apple plans to provide additional guidance on the criteria developers will need to benchmark against before they can claim that their app supports accessibility features, and is likely to share more information on this at its annual Worldwide Developers Conference, which usually takes place in the summer.
Commenting on the upcoming Accessibility Nutrition Labels, Eric Bridges, the American Foundation for the Blind’s president and CEO, said in a media release, “Accessibility Nutrition Labels are a huge step forward for accessibility.” He added, “Consumers deserve to know if a product or service will be accessible to them from the very start, and Apple has a long-standing history of delivering tools and technologies that allow developers to build experiences for everyone. These labels will give people with disabilities a new way to easily make more informed decisions and make purchases with a new level of confidence.”
Bold vision
Another announcement that may end up flying under the radar but could prove entirely game-changing for individuals living with low vision is Apple’s plan to enable zooming of the camera feed on its mixed reality headset, the Vision Pro. When the device first launched early last year, access to its impressive camera array was blocked for third-party developers, and there was no native ability to zoom the passthrough image of the outside world. Zoomable video magnification will now enable wearers whose poor eyesight cannot be corrected with traditional lenses to significantly enhance their perception of their surroundings. This may prove very useful for capturing more context and detail in real-world scenes, recognizing faces, and enjoying live spectator events.
Apple plans to further expand the use of its camera array to enable VoiceOver users, who may have more advanced sight loss than those who rely more on magnification, to use on-device machine learning to identify objects, describe surroundings, and find documents. Furthermore, those who use visual interpretation services like Be My Eyes will now be able to connect their Vision Pro directly to an operator for on-the-spot visual assistance. To offset privacy concerns, Apple will restrict access to the Vision Pro’s main camera API to pre-approved developers only.
For more ICT-related situations, a new feature called Magnifier for Mac will allow users to connect their iPhone or a third-party camera to their Mac and use the machine’s new magnification feature to zoom in on, and apply filters to, whatever the camera is filming. This is likely to prove most useful in lessons and lectures, where users will be able to comfortably view the whiteboard whilst simultaneously toggling through other elements such as documents and notes. There are also integrations with the new Accessibility Reader, which not only allows fonts, colors, and spacing in apps to be modified according to user preferences but can apply the same kind of personalization to real-world textual images such as books on a shelf or items on a restaurant menu.
Addressing different needs
Though many of the brand-new features help with vision, one for those with hearing loss to look out for is surely the extension of Live Listen to Apple Watch. Live Listen turns Apple devices into a remote microphone to stream content directly into headphones or certain hearing aids. The new integration with Apple Watch will allow captions to be additionally displayed on the watch, which can also control the entire stream for a more lean-back and immersive experience.
These are the main new features, but there is an additional raft of updates to features announced last year, covering a wider span of disabilities such as cognitive impairments, speech difficulties, and motor disorders. Perhaps the most eye-catching of these is the update to Personal Voice, a feature that allows users who are in danger of losing their speech to create a synthetic version of their voice. Previously, creating the digital twin voice required the user to record 150 phrases – already relatively fast for this kind of training. Now, Apple has brought the training set down to just 10 phrases whilst also providing a more authentic match to the original voice. Improvements have also been made to eye and head tracking, which allow users with dexterity and speech difficulties to navigate on-screen items using eye and head movement. Intriguingly, a new protocol has also been created to enable switch control through emerging brain-computer interfaces, which can output commands based on reading the user’s brainwaves.
Commenting on the updates, Apple CEO Tim Cook said, “At Apple, accessibility is part of our DNA.” He added, “Making technology for everyone is a priority for all of us, and we’re proud of the innovations we’re sharing this year. That includes tools to help people access crucial information, explore the world around them, and do what they love.”
“Building on 40 years of accessibility innovation at Apple, we are dedicated to pushing forward with new accessibility features for all of our products,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “Powered by the Apple ecosystem, these features work seamlessly together to bring users new ways to engage with the things they care about most.”
Those things will, of course, be different for everyone, but with new features and updates as sweeping as they have been this year, it’s hard to imagine that there will be too many folks feeling like they’ve been left out in the cold.