Thursday, Mar 28, 2024

Ray-Ban Meta Smart Glasses get smarter with multimodal AI

Photo: Meta

Get ready for a boost in brainpower for your shades! According to a recent New York Times report, Meta is bringing advanced AI features to its Ray-Ban Meta Smart Glasses starting next month.

This multimodal AI can do some impressive tricks: translating languages on the fly and identifying objects, animals, and even famous landmarks. The feature has been in testing since last December through an early access program.

Activating the AI is simple—just say "Hey Meta" and ask your question. The glasses will respond through built-in speakers, providing information in real time.

The NYT took the Ray-Ban Meta Smart Glasses for a spin in various situations, from grocery shopping and museum visits to driving and exploring the zoo. While the AI excelled at recognizing pets and artwork, it wasn't perfect.

It struggled with distant animals behind cages and even a tricky fruit called a cherimoya (don't worry, most people wouldn't recognize it either!).

The good news? Meta's AI is constantly learning and improving. Currently, the multimodal AI features are limited to early-access users in the US, but a wider rollout is expected soon.

Source
