
Apple's upcoming accessibility features allow users to control iPhones and iPads with their eyes

Thanks to Apple's upcoming Eye Tracking feature, anyone will be able to control their iPad or iPhone with only their eyes. (Image via Apple)
Apple announced several new accessibility features coming to iOS and iPadOS. Among them are Eye Tracking, which will allow users to interact with an iPhone or iPad with only their eyes; Vocal Shortcuts, which lets users customize voice commands; and Vehicle Motion Cues, which may help reduce motion sickness.

Apple announced new features coming to iOS and iPadOS targeted at making devices more accessible to everyone, including individuals with disabilities. 

The highlight of the new feature set is Eye Tracking. True to its name, this enables an iPhone or iPad to track a user's eyes for input. By shifting the focus of their gaze, users will be able to navigate through and interact with apps. The feature allows users to perform actions like tapping elements, scrolling, and other gestures "solely with their eyes," according to Apple.

While a similar feature has been available on other devices for a while, it required extra hardware and specialized software. Eye Tracking is not app-specific and doesn't require extra hardware. Compatible devices accomplish this by using the front-facing camera and "on-device machine learning," per Apple. The company says that all data used in this feature will be kept on-device and won't be shared. The feature could prove incredibly useful to people with limited mobility, granting them an easier method of using their device.

Music Haptics makes an iPhone vibrate in sync with audio played through Apple's Music app, offering what Apple calls "a new way for users who are deaf or hard of hearing to experience music on iPhone." While the feature currently works only with Apple's Music app, the company will release an API that developers can use to enable it in their own apps. 

Vocal Shortcuts allow users to customize voice commands. Using this feature, Siri can understand non-traditional verbal cues or sounds to execute shortcuts and other tasks. In the same vein, Listen for Atypical Speech lets the device learn how a user talks, meaning those with affected speech can train their phone to better understand them.

Vehicle Motion Cues populates the device's screen with moving dots to visually cue the user to surrounding motion, such as when riding in a car. The feature uses the iPhone's or iPad's motion sensors to animate the dots in sync with the vehicle's movement; it can turn on automatically or via a toggle that can be placed in Control Center.

CarPlay is also getting new accessibility features, including Voice Control, color filters for colorblind users, and Sound Recognition, which can alert deaf or hard-of-hearing drivers to environmental sounds like sirens and alarms. 

There are updates to existing accessibility features as well, which you can read about at Apple's newsroom (link below). These updates will drop sometime later this year. 


Sam Medley, 2024-05-15 (Update: 2024-05-15)