To mark Global Accessibility Awareness Day, Apple unveiled a variety of new features coming across its products to make them more accessible, as well as to help people with tasks in the real world.
For instance, one of the most exciting features is an expansion of the Magnifier tool: it now pairs with your Mac, letting people with vision impairments use their iPhone like a telescope. Apple shows it off best in a new ad, but basically, you point your iPhone at whatever you want to see, such as a presentation or some papers with small fonts, and the video feed appears on your larger Mac screen. To take things further, you can adjust scaling, contrast, colour and more to make everything super legible.

An example of someone using Apple’s Magnifier tool to enlarge small font in a book.
One of the other unique features is support for brain-computer interfaces. You've likely heard of Elon Musk's Neuralink startup; if tech like that starts to see wider use in the future, Apple is building support for it into its products.
According to a Wall Street Journal report, Apple is working with a company called Synchron on this, though only ten people have been implanted with its brain device so far. Once the implant is connected to your phone, you would, in theory, be able to control the device with your mind.
Another feature that will likely get more widespread notice is a new section in the App Store that will list what accessibility features each app has, so users can be better informed before they download. This will show things like whether an app supports VoiceOver, a dark interface, sufficient contrast, reduced motion, and much more. However, it will be up to developers to supply these 'nutrition labels' for their apps.

Apple is also improving a lot of its existing features. Vehicle Motion Cues, which help fight motion sickness in cars, are getting better and also coming to the Mac. Live Listen is gaining support for captions, and you can even see live captions on your Apple Watch if your iPhone is out of reach.
The company also said it’s improving eye-tracking on iPads and iPhones. Specifically, it should be easier to type using eye-tracking now.

There are tons of other updates for things like Braille support, CarPlay, Apple TV, hearing features and much more that you can read about in Apple's full press release. There are even a few features coming to the Vision Pro that might be laying the groundwork for smaller, lighter AR glasses or AirPods with cameras.
None of these features is launching today, but we'll likely learn more at WWDC this summer, and I expect most of them will arrive with Apple's major fall updates to iOS 19 and the next macOS.
Source: Apple
