Ray-Ban Meta Smart Glasses get smarter with multimodal AI
Get ready for a boost in brainpower for your shades! According to a recent New York Times report, Meta is bringing advanced AI features to its Ray-Ban Meta Smart Glasses starting next month.
This multimodal AI can do some impressive tricks: translating languages on the fly and identifying objects, animals, and even famous landmarks. The feature has been in testing since last December through an early access program.
Activating the AI is simple—just say "Hey Meta" and ask your question. The glasses will respond through built-in speakers, providing information in real time.
The NYT took the Ray-Ban Meta Smart Glasses for a spin in various situations, from grocery shopping and museum visits to driving and exploring the zoo. While the AI excelled at recognizing pets and artwork, it wasn't perfect.
It struggled with distant animals behind zoo cages and even with a tricky fruit called a cherimoya (don't worry, most people wouldn't recognize it either!).
The good news? Meta is continually refining the AI. Currently, the multimodal features are limited to early-access users in the US, but a wider rollout is expected soon.