Meta's smart glasses to get even smarter
Meta is rolling out new AI functions for its smart glasses this month.

The technology giant launched the souped-up Ray-Ban frames last year with the ability to make phone calls, listen to music and take pictures. Now they are being given a major update with new functions that will allow them to identify certain animals and fruits just by looking at them, as well as limited language translation for English, Spanish, Italian, German, and French, according to the New York Times.

The features have been available to some users in an early access trial since December, but they will now be rolled out to all users.

The New York Times describes the new AI features as similar to the AI assistant in Joaquin Phoenix's 2013 movie 'Her'.

Another function set to come to the glasses in the future is landmark recognition, which would act as a virtual tour guide, offering up history and interesting information about a structure the user is looking at.

Meta CTO Andrew Bosworth opened up about the new feature in a post on Threads, writing: "We continue to make improvements to @raybanmeta's multimodal AI feature across performance and domains.

"Starting this week, we’re testing the ability to get information on popular landmarks. For those who still don’t have access to the beta, you can add yourself to the waitlist while we work to make this available to more people."

He added in another post: "Beyond the improvements we’re making on multimodal AI, we’re also regularly updating the overall hands-free experience, like adding voice commands to share your latest Meta AI interaction on WhatsApp, Messenger, and text, as well as sharing the last photo you took with a contact.

"And if you’re a podcast listener at 1.5x or greater speed, you’ll soon be able to configure Meta AI readouts to be slower or faster under voice settings."
