Meta is pushing out a series of upgrades for its Ray-Ban Meta glasses that should make them a bit smarter. One change makes it easier to engage with Meta AI. As usual, you start a conversation by saying "Hey Meta," but from there you can keep asking follow-up questions without repeating the wake word. You also no longer need to say "look and" to ask Meta AI about the object you're looking at.
You can now ask your Ray-Ban Metas to remember things for you, and you can review those reminders later on your phone. Examples Meta gave include remembering where you parked your car and setting alerts to message or call someone at a specific time.
Speaking of messaging, you can ask Meta AI to send voice messages for you on Messenger and WhatsApp. The glasses can even scan QR codes and call phone numbers you're looking at.
Real-time translation is also coming to the glasses. When you're talking with someone speaking Spanish, Italian, or French, you'll hear what they say in English through the glasses' open-ear speakers.
Meta also plans to roll out real-time video processing, so the AI can help with whatever you're doing as you move through the world. The company gave examples in a blog post: "If you’re exploring a new city, you can ask Meta AI to tag along, and then ask it about landmarks you see as you walk or get ideas for what to see next — creating your own walking tour hands-free. Or, if you’re at the grocery store and trying to plan a meal, you can ask Meta AI to help you figure out what to make based on what you’re seeing as you walk down the aisles, and if that sauce you’re holding will pair well with that recipe it just suggested."