
Posted: 2024-10-03 14:52:00

Meta sees the future of smart glasses as always-ready AI assistants. Its Meta Ray-Bans have already been pushing in that direction, and a number of features Mark Zuckerberg announced a week ago, which I tried out on Meta's campus, start becoming available today. The added camera-based AI features look set to further blur the line between explicit AI requests and the glasses' proactive use of the camera.

A new app and firmware update rolling out now promises a more natural set of requests the glasses will respond to for taking photos. As I found when I tried them, you can simply ask about something in front of you, and the glasses can use that as a trigger to take a photo.

The glasses will also recognize QR codes and open them on your phone, and they can place a call to a phone number the camera sees (if you ask).

A reminder feature can jog your memory later; another that remembers where you've parked is something I haven't tried yet, though I'm curious how it works. It's the start of Meta's push toward using future AR glasses as assistive memory devices.

The update allows the glasses to record and send voice messages on Messenger and WhatsApp, but the improved music controls I tried at Connect aren't here yet. Neither is the live translation feature Mark Zuckerberg showed off on stage, nor the AI assistant that works while recording live video.

Meta's AI-based camera features are clearly going to grow, and privacy questions will grow along with them. A group of students recently found a way to use the Ray-Bans to identify faces by relaying photos via Instagram to another AI tool, and with Meta opening up camera access on Quest headsets to developers next year, advanced camera-based AI may start heading in surprising directions fast.
