Apple Hints At Its AI Plans, Reveals Eye-Tracking, Music Haptics Features Ahead Of WWDC 2024
Apple has introduced another feature called Listen for Atypical Speech, employing on-device machine learning to enhance Siri's comprehension of a broader spectrum of voices.
Apple has unveiled a set of new accessibility features, some of which subtly hint at its upcoming AI endeavours. One such feature, Eye Tracking, uses artificial intelligence to let users control their iPhone and iPad with their eyes alone. Designed primarily for people with physical disabilities, it offers an intriguing glimpse into Apple's AI work. Setup is quick: users calibrate the feature with the front-facing camera in a matter of seconds.
Apple emphasises that all data used for configuration and control remains secure on the device and is not shared with the company.
Eye Tracking works across apps on both iPadOS and iOS and requires no additional hardware or accessories, according to Apple. Using Dwell Control, users can navigate within apps and activate on-screen elements, performing actions such as button presses, swipes, and other gestures with eye movement alone.
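Apple has not published how Eye Tracking or Dwell Control are implemented, but the general idea behind dwell selection is simple: an action fires once the gaze has rested on the same target for a set duration. The Swift sketch below illustrates that concept only; every name in it (DwellDetector, GazeSample, and so on) is hypothetical, not Apple's API.

```swift
import Foundation

// Illustrative sketch only — Apple has not documented Eye Tracking's internals.
// Models the dwell-selection idea: fire an action once the gaze point has
// stayed on the same target for a fixed duration.

struct GazeSample {
    let targetID: String      // UI element the gaze currently falls on
    let timestamp: TimeInterval
}

final class DwellDetector {
    private let dwellThreshold: TimeInterval
    private var currentTarget: String?
    private var dwellStart: TimeInterval = 0
    private var fired = false

    init(dwellThreshold: TimeInterval = 1.0) {
        self.dwellThreshold = dwellThreshold
    }

    /// Feed gaze samples in order; returns the target ID when a dwell completes.
    func process(_ sample: GazeSample) -> String? {
        if sample.targetID != currentTarget {
            // Gaze moved to a new element: restart the dwell timer.
            currentTarget = sample.targetID
            dwellStart = sample.timestamp
            fired = false
            return nil
        }
        // Same element: trigger once, after the threshold elapses.
        if !fired && sample.timestamp - dwellStart >= dwellThreshold {
            fired = true
            return sample.targetID
        }
        return nil
    }
}

// Example: gaze rests on "playButton" long enough to activate it.
let detector = DwellDetector(dwellThreshold: 1.0)
let samples = [
    GazeSample(targetID: "playButton", timestamp: 0.0),
    GazeSample(targetID: "playButton", timestamp: 0.5),
    GazeSample(targetID: "playButton", timestamp: 1.1),
]
for s in samples {
    if let activated = detector.process(s) {
        print("Dwell activated: \(activated)")  // fires on the third sample
    }
}
```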
What Other Features Have Been Introduced?
Apple has also introduced Listen for Atypical Speech, which uses on-device machine learning to improve Siri's recognition of a broader range of voices. Vocal Shortcuts, meanwhile, lets users teach Siri custom words or phrases to trigger tasks such as launching apps or running shortcuts. Both enhancements are aimed at people whose speech is affected by conditions such as cerebral palsy or amyotrophic lateral sclerosis (ALS).
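Apple has not detailed how Listen for Atypical Speech works under the hood, but on-device speech processing is already exposed to developers through the public Speech framework. A minimal sketch of on-device recognition, assuming the app has already obtained speech-recognition authorisation and has an audio file to transcribe:

```swift
import Speech

// Sketch only: this is the existing public route to on-device speech
// recognition, not Apple's Listen for Atypical Speech implementation.
// Assumes SFSpeechRecognizer.requestAuthorization has already been granted.

func transcribeOnDevice(fileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable else {
        print("Speech recognizer unavailable")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    // Keep audio and processing entirely on the device where supported.
    if recognizer.supportsOnDeviceRecognition {
        request.requiresOnDeviceRecognition = true
    }

    recognizer.recognitionTask(with: request) { result, error in
        if let result = result, result.isFinal {
            print("Transcript: \(result.bestTranscription.formattedString)")
        } else if let error = error {
            print("Recognition error: \(error.localizedDescription)")
        }
    }
}
```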
Among the new accessibility features introduced today is Music Haptics, enabling individuals who are deaf or have hearing impairments to sense music beats via vibrations on their iPhones. Additionally, there are Vehicle Motion Cues, designed to ease iPhone or iPad usage in a moving vehicle, particularly beneficial for those susceptible to motion sickness. Apple is also integrating systemwide Live Captions into visionOS, the software that drives the Apple Vision Pro.
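Music Haptics itself is a built-in system feature, but the underlying idea, playing vibration taps in time with a song's beats, can be reproduced with Apple's public Core Haptics framework. A minimal sketch, with made-up beat times standing in for real beat-detection output:

```swift
import CoreHaptics

// Sketch only: illustrates beat-synchronised vibrations with Core Haptics,
// not Apple's Music Haptics implementation. Beat times are example data.

func playBeats(at beatTimes: [TimeInterval]) throws {
    // Bail out on hardware without a haptic engine (e.g. older iPads).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One sharp transient tap per beat.
    let events = beatTimes.map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6),
            ],
            relativeTime: time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}

// Example: taps at 120 BPM for two seconds.
try? playBeats(at: [0.0, 0.5, 1.0, 1.5])
```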
In a press release, Apple said, “These features combine the power of Apple hardware and software, harnessing Apple silicon, artificial intelligence, and machine learning to further Apple’s decades-long commitment to designing products for everyone.” The features will arrive “later this year”, most likely with the iOS 18 and iPadOS 18 updates expected this fall.