Meta Ray-Ban Smart Glasses Update: Live AI, Live Translation And Shazam, Showcased Earlier By Zuckerberg, Finally Arrive

The standout feature of this update to Meta Ray-Ban smart glasses, Live AI, leverages the glasses' cameras to enable continuous real-time video processing.

Meta has steadily enhanced its Ray-Ban smart glasses with new features, and now three additional capabilities—Live AI, live translation, and Shazam functionality—have been introduced. These features are available to users enrolled in the early access program in Canada and the US. According to the company, these new AI-powered functionalities are part of the v11 software update, which is now rolling out to eligible devices.

Meta Ray-Ban Smart Glasses New Features: What Do We Know

The standout feature, Live AI, leverages the glasses' cameras to enable continuous, real-time video processing. This allows the AI chatbot to visually interpret the wearer's surroundings and provide instant, context-aware answers. Users can interact with Meta AI without needing the "Hey Meta" wake phrase, switch topics mid-conversation, and revisit previous discussions. The capability works similarly to Google's Project Astra, which is powered by Gemini 2.0.

Additionally, the Ray-Ban Meta smart glasses now support live translation between English and three other languages: Spanish, French, and Italian. With this feature, the AI can translate conversations in real time, allowing users to hear translated speech in English while talking with someone speaking any of the supported languages.

If the person you're speaking with is also wearing Ray-Ban Meta glasses, the translated audio can be played directly through the glasses' open-ear speakers. Users can also view the translation as a text transcript on their smartphones, making the feature particularly useful for travellers navigating language barriers.

Another notable addition is the ability to stream music from platforms like Spotify and Amazon Music, and to connect to accessibility services such as Be My Eyes, directly through the smart glasses. The glasses also gain Shazam-powered music recognition: simply ask, "Hey Meta, what song is this?" and the system will identify the track within seconds.

These features were showcased by CEO Mark Zuckerberg during the Meta Connect 2024 event in September and are now being gradually rolled out to early-access users. However, Meta has acknowledged that these features might not always function perfectly and has emphasised its commitment to refining the AI through user feedback. No timeline has been provided for a global release of these updates.
