Meta Ray-Ban Smart Glasses Update: Live AI, Live Translation And Shazam, Showcased Earlier By Zuckerberg, Finally Arrive
The standout feature of this update to Meta Ray-Ban smart glasses, Live AI, leverages the glasses' cameras to enable continuous real-time video processing.
Meta has steadily enhanced its Ray-Ban smart glasses with new features, and now three additional capabilities—Live AI, live translation, and Shazam functionality—have been introduced. These features are available to users enrolled in the early access program in Canada and the US. According to the company, these new AI-powered functionalities are part of the v11 software update, which is now rolling out to eligible devices.
Meta Ray-Ban Smart Glasses New Features: What Do We Know?

The standout feature, Live AI, leverages the glasses' cameras to enable continuous real-time video processing. This allows the AI assistant to visually interpret the wearer's surroundings and provide instant, context-aware answers. Users can interact with Meta AI without needing the "Hey Meta" wake phrase, switch topics effortlessly, and revisit previous discussions. The capability is comparable to Google's Project Astra, which is powered by Gemini 2.0.
Additionally, the Ray-Ban Meta smart glasses now support live translation between English and three other languages: Spanish, French, and Italian. With this feature, the AI translates conversations in real time, so users hear the translated speech in English while communicating with someone speaking any of the supported languages.
If the person you're speaking with is also wearing Ray-Ban Meta glasses, the translated audio can be played directly through their glasses' open-ear speakers. Users can also view the translation as a text transcript on their smartphones, making the feature particularly useful for travellers navigating language barriers.
Another notable addition is the ability to stream music from platforms such as Spotify and Amazon Music directly through the smart glasses, alongside integration with the Be My Eyes accessibility service. The glasses also gain Shazam-powered music recognition: simply ask, "Hey Meta, what song is this?" and the system will identify the track within seconds.
These features were showcased by CEO Mark Zuckerberg during the Meta Connect 2024 event in September and are now being gradually rolled out to early-access users. However, Meta has acknowledged that these features might not always function perfectly and has emphasised its commitment to refining the AI through user feedback. No timeline has been provided for a global release of these updates.