
Meta's Ray-Ban Smart Glasses Receive 'Live AI' Update for Enhanced User Experience

Snap up a pair of Meta's Ray-Ban smart glasses and you'll be living 'Live AI' in no time! The glasses will give users hands-free inspiration, real-time help with everyday tasks such as cooking or gardening, live language translation, and much more. Who needs magic when you've got tech like this?!

Meta has announced a new software update for its Ray-Ban smart glasses that adds "Live AI," a feature that can use a video feed to gather context for questions, similar to Google's Project Astra.

A new update, v11, is rolling out to Ray-Ban Meta smart glasses, bringing a few new options.

That includes Shazam integration, which will let users ask the glasses "Hey Meta, what is this song?" and have the result read aloud. This feature will be available in the US and Canada.

Beyond that, Meta is also introducing new AI features, and they look enticing. The first is "Live AI," which lets the Ray-Ban Meta glasses capture video that the AI then uses to offer "real-time, hands-free help" with the things you're actively doing.

Meta says that, eventually, this data will be used to offer suggestions before you even have to ask.

Meta's announcement describes the feature this way: "The first is live AI, which adds video to Meta AI on your glasses. During a live AI session, Meta AI can see what you see continuously and converse with you more naturally than ever before. Get real-time, hands-free help and inspiration with everyday activities like meal prep, gardening, or exploring a new neighborhood. You can ask questions without saying 'Hey Meta,' reference things you discussed earlier in the session, and interrupt anytime to ask follow-up questions or change topics. Eventually live AI will, at the right moment, give useful suggestions even before you ask."
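To make that interaction model concrete, here's a minimal, entirely hypothetical sketch of what a Live AI-style session loop might look like: continuous video context, no wake word needed mid-session, and a rolling history so follow-ups can reference earlier turns. Nothing here is Meta's actual API; `capture_frame` and `ask_model` are invented stand-ins so the loop runs on its own.

```python
# Hypothetical sketch of a "Live AI"-style session; all helpers are
# invented stand-ins for the glasses' camera and a multimodal model.

def capture_frame() -> bytes:
    """Stand-in for one frame of the glasses' continuous video feed."""
    return b"<jpeg bytes>"

def ask_model(frame: bytes, utterance: str, history: list[tuple[str, str]]) -> str:
    """Stand-in for a multimodal model that sees the frame plus session context."""
    return f"(answer using the current frame and {len(history)} earlier turns)"

def live_ai_session(utterances: list[str]) -> None:
    history: list[tuple[str, str]] = []  # lets follow-ups reference earlier turns
    for utterance in utterances:         # no "Hey Meta" needed mid-session
        frame = capture_frame()          # video context refreshed every turn
        reply = ask_model(frame, utterance, history)
        history.append((utterance, reply))
        print(f"user: {utterance}\nmeta_ai: {reply}")

if __name__ == "__main__":
    # The second question leans on the first exchange, per Meta's description.
    live_ai_session(["What herb is this?", "How much of it should I add?"])
```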

"Live translation," meanwhile, will be able to translate speech in real-time, with the other person's speech being output in English through the glasses (and also transcribed on your phone). This works for Spanish, French, and Italian.

Meta will roll these features out only through a waitlist, and only in the US and Canada for now.

Google is working on something just like this.

At Google I/O 2024 in May, the company showed off "Project Astra," a new AI project that can use a video feed to gather context and then answer questions based on what it sees. Google teased the functionality on glasses, but has yet to roll anything out. With the announcement of Gemini 2.0 earlier this month, Google detailed updates that will let Astra converse in multiple languages, store up to 10 minutes of memory, respond with lower latency, and more. It's unclear how Meta's "Live AI" will compare, but it's certainly exciting to see this functionality arriving so soon, especially as Google's version won't be fully realized until sometime next year.

