Meta Ray-Ban Smart Glasses Update: Live AI, Live Translation And Shazam, Showcased Earlier By Zuckerberg, Finally Arrive

The standout feature of this update to Meta Ray-Ban smart glasses, Live AI, leverages the glasses' cameras to enable continuous real-time video processing.

Meta has steadily enhanced its Ray-Ban smart glasses with new features, and now three additional capabilities—Live AI, live translation, and Shazam functionality—have been introduced. These features are available to users enrolled in the early access program in Canada and the US. According to the company, these new AI-powered functionalities are part of the v11 software update, which is now rolling out to eligible devices.

Meta Ray-Ban Smart Glasses New Features: What We Know

Live AI leverages the glasses' cameras for continuous, real-time video processing, allowing the AI chatbot to visually interpret the wearer's surroundings and provide instant, context-aware answers. Users can interact with Meta AI without needing the "Hey Meta" wake phrase, switch topics effortlessly, and revisit previous discussions. The feature works along the same lines as Google's Project Astra, which is powered by Gemini 2.0.

The Ray-Ban Meta smart glasses now also support live translation between English and three other languages: Spanish, French, and Italian. With this feature, the AI translates conversations in real time, letting users hear the translated speech in English while communicating with someone in any of these supported languages.

Additionally, if the person you’re speaking with is also wearing Ray-Ban Meta glasses, the translated audio can be played directly through the glasses’ open-ear speakers. Users also have the option to view the translated text as a transcription on their smartphones, making this feature particularly useful for travellers navigating language barriers.

Another notable addition is the ability to stream audio from apps such as Spotify and Amazon Music, along with support for the accessibility service Be My Eyes, directly through the smart glasses. The glasses also gain Shazam-powered music recognition: simply ask, "Hey Meta, what song is this?" and the system will identify the track within seconds.

These features were showcased by CEO Mark Zuckerberg during the Meta Connect 2024 event in September and are now being gradually rolled out to early-access users. However, Meta has acknowledged that these features might not always function perfectly and has emphasised its commitment to refining the AI through user feedback. No timeline has been provided for a global release of these updates.
