Meta is rolling out some previously announced features to its AI-powered Ray-Ban smart glasses for users in the US and Canada. CTO Andrew Bosworth posted in a thread that today's update should make voice recognition more natural, doing away with the stilted "Hey Meta, look and tell me" commands. Users will be able to engage with the AI assistant without needing the "look and" part of the phrase.
Most of the other AI tools showcased during last month's Connect event are also coming to the frames today, including voice messages, timers, and reminders. You can also use the glasses to have Meta AI call a phone number or scan a QR code. On Instagram Reels, CEO Mark Zuckerberg demonstrated the new reminder feature as a way to find your car in a parking lot. One notable omission from this update is the live translation feature; Bosworth did not share a timeline for when it would be ready.
Meta's smart glasses have already made headlines once today, when two Harvard students used them to identify strangers in real time. By combining facial recognition technology with large language models, the students were able to uncover addresses, phone numbers, family details, and even some social security numbers.
