Meta’s Ray-Ban glasses get smarter with multimodal AI features
Meta’s Ray-Ban smart glasses are getting a major artificial intelligence upgrade. The company announced today that it will begin testing multimodal AI features, which provide information and suggestions based on what the glasses see and hear.
In an Instagram Reel, Mark Zuckerberg showed how the glasses can recommend matching outfits, translate text, and generate image captions. Those are just a few of the capabilities on display; eventually, the glasses may be able to answer questions about the wearer’s surroundings and interests.
In a separate video, CTO Andrew Bosworth demonstrated the glasses’ ability to describe a California-shaped wall sculpture. He said the glasses can also help with captioning photos, translating and summarizing text, and other common AI tasks. The test will be open to a small group of US users who opt in.