Meta Unveils Live AI, Translations, and Shazam Integration for Ray-Ban Smart Glasses

Starfolk

December 16, 2024 · 3 min read

Meta has rolled out three new features to its Ray-Ban smart glasses: live AI, live translations, and Shazam integration. The updates arrive as the tech giant continues to push artificial intelligence deeper into wearable devices.

The live AI feature, available to members of Meta's Early Access Program, enables users to converse naturally with Meta's AI assistant while it continuously views their surroundings. This means users can ask for suggestions or information based on their current environment. For instance, if a user is browsing the produce section at a grocery store, they can ask the AI for recipe suggestions based on the ingredients they're looking at. According to Meta, users will be able to use the live AI feature for approximately 30 minutes on a full charge.

Another significant update is the live translation feature, also limited to Early Access Program members. It allows the glasses to translate speech in real time between English and Spanish, French, or Italian. Users can choose either to hear translations through the glasses or to view transcripts on their phone. To use the feature, users need to download language pairs beforehand and specify which language they speak and which their conversation partner speaks.

In addition to these features, Meta has integrated Shazam into its smart glasses. The integration lets users identify songs they hear simply by prompting Meta AI. A demo of the feature can be seen in a recent Instagram reel posted by Meta CEO Mark Zuckerberg.

To access these features, users need to make sure their glasses are running the v11 software and that the Meta View app is updated to v196. Those not already in the Early Access Program can apply through Meta's website.

The release of these features comes at a time when Big Tech is investing heavily in AI assistants for smart glasses. Just last week, Google announced Android XR, a new OS for smart glasses, positioning its Gemini AI assistant as a key selling point. Meanwhile, Meta CTO Andrew Bosworth has said he believes 2024 was the year AI glasses hit their stride and that smart glasses may be the best possible form factor for a "truly AI-native device."

These updates further solidify Meta's position in the smart glasses market. With live AI, translations, and Shazam on board, the Ray-Ban smart glasses are becoming increasingly capable devices that can assist users throughout their day. As the industry continues to push the boundaries of AI, it will be worth watching how these features evolve and what new innovations emerge in the wearable space.
