Thursday, December 14, 2023

Meta’s Ray-Ban glasses get smarter with multimodal AI features

Image: Meta

Meta’s Ray-Ban smart glasses are getting a major artificial intelligence upgrade. The company announced today that it will begin testing its multimodal AI features, providing information and suggestions based on what the glasses see and hear.

In an Instagram Reel, Mark Zuckerberg showed how the glasses can recommend matching outfits, translate text, and generate image captions. Those are just a few of the assistant's capabilities; eventually, it may also answer questions about the wearer's surroundings and interests.

In a separate video, CTO Andrew Bosworth demonstrated the glasses describing a California-shaped wall sculpture. He said the glasses can also help with captioning photos, translating and summarizing text, and other common AI tasks. The test will be open to a small group of US users who opt in.

Source
