Apple Hints At Its AI Plans, Reveals Eye-Tracking, Music Haptics Features Ahead Of WWDC 2024

Apple has introduced another feature called Listen for Atypical Speech, employing on-device machine learning to enhance Siri's comprehension of a broader spectrum of voices.

Apple has recently unveiled a set of new accessibility features, some of which subtly hint at its upcoming AI endeavours. One such feature, known as Eye Tracking, employs artificial intelligence to enable users to control their iPhone and iPad solely with their eyes. This innovation, primarily intended for individuals with physical disabilities, presents an intriguing glimpse into Apple's AI advancements. The setup for Eye Tracking is straightforward and swift, requiring users to calibrate it using the front-facing camera for a few seconds.

Apple emphasises that all data used for configuration and control remains secure on the device and is not shared with the company.

Eye Tracking functionality extends to apps on both iPadOS and iOS, eliminating the need for additional hardware or accessories, as noted by Apple. Through Eye Tracking, users gain the ability to navigate within apps and activate various elements using Dwell Control. They can execute actions such as button presses, swiping, and gestures, all with the movement of their eyes.

What Other Features Have Been Introduced?

Apple has also introduced Listen for Atypical Speech, which employs on-device machine learning to enhance Siri's comprehension of a broader spectrum of voices. Moreover, Vocal Shortcuts empowers users to teach Siri specific words or phrases for expedited tasks like launching apps or shortcuts. These enhancements cater to individuals facing challenges in clear speech due to conditions like cerebral palsy or amyotrophic lateral sclerosis (ALS).

Among the new accessibility features introduced today is Music Haptics, enabling individuals who are deaf or have hearing impairments to sense music beats via vibrations on their iPhones. Additionally, there are Vehicle Motion Cues, designed to ease iPhone or iPad usage in a moving vehicle, particularly beneficial for those susceptible to motion sickness. Apple is also integrating systemwide Live Captions into visionOS, the software that drives the Apple Vision Pro.

Apple said in a press release, “These features combine the power of Apple hardware and software, harnessing Apple silicon, artificial intelligence, and machine learning to further Apple’s decades-long commitment to designing products for everyone.” They will be available “later this year”, most likely with the iOS 18 and iPadOS 18 updates expected this fall.
