Apple is introducing a set of new accessibility features for the iPhone and iPad, according to a recent press release. The new features are aimed at users with cognitive, speech, and vision impairments.
These additions include:
Assistive Access — lightens cognitive load by stripping apps down to their essential features, adding high-contrast buttons and larger text labels, and giving users more control over the home screen layout.
Live Speech — lets users type responses during phone and FaceTime calls, which are then spoken aloud; commonly used phrases can be saved for quicker access.
Personal Voice — after the user spends about 15 minutes reading a randomized set of text prompts, on-device machine learning generates a realistic synthetic voice that sounds like them.
Detection Mode in Magnifier — a new Point and Speak feature within the Magnifier app uses the camera and LiDAR sensor, along with machine learning, to read aloud the text on real-world objects within the viewfinder.
Apple also mentioned some smaller tweaks coming to macOS. These include easier adjustment of text size across Mac apps, improved connectivity for Made for iPhone hearing devices, and the ability to adjust the speed at which Siri speaks.
“Accessibility is part of everything we do at Apple. These groundbreaking features were designed with feedback from members of disability communities every step of the way,” says Sarah Herrlinger, Apple’s senior director of global accessibility policy and initiatives.
You can read more about these new accessibility features on Apple’s website. Unfortunately, we don’t have a release date beyond “later this year.”
It seems likely that the features will be rolled into iOS 17 and iPadOS 17, which are expected to be announced at WWDC in June.