Meta Platforms has upgraded its Ray-Ban Meta smart glasses with real-time, video-powered AI assistance and live language translation.
These updates form part of the v11 software release, which began rolling out this week.
The glasses now feature AI-powered video assistance, enabling users to receive real-time insights based on what the glasses “see.”
By processing live visuals, the glasses can answer questions and assist users in scenarios ranging from cooking to navigating unfamiliar areas.
In addition, the glasses now offer real-time translation between English and Spanish, French, or Italian: conversations in those languages are translated into English through the glasses’ speakers or displayed as text on a connected smartphone.
The update also integrates Shazam, letting users identify songs by simply asking, “What is this song?” The feature is currently available in the US and Canada.
With these improvements, Meta is working to make smart wearables more interactive and useful. During its September Connect conference, Meta announced several features for the Ray-Ban glasses, including the ability to set reminders, scan QR codes, and search for content using voice commands.
Meta’s early-access programme is central to the rollout of these new tools, giving users in the US and Canada a testing ground to try the features and provide feedback.
Meta has also added personalisation options for its AI assistant, such as celebrity voice packs in certain regions, underscoring its focus on a more user-friendly experience.
With more updates planned for 2025, Meta intends to keep the Ray-Ban Meta smart glasses at the forefront of the wearable tech space.