Meta just stuck its AI somewhere you didn't expect it — a pair of Ray-Ban smart glasses

Smart glasses have arguably failed to take off, but the addition of artificial intelligence (AI) could be the key to developing a truly transformational wearable technology.

In the US and Canada, Ray-Ban Meta smart glasses have received a rollout of multimodal AI through a software update to the Meta AI virtual assistant. Multimodal AI is generative AI that can process queries involving more than one medium, such as both audio and imagery, meaning the device can better respond to questions about what a wearer is looking at.

"Say you’re traveling and trying to read a menu in French. Your smart glasses can use their built-in camera and Meta AI to translate the text for you, giving you the info you need without having to pull out your phone or stare at a screen," Meta representatives explained April 23 in a statement.

Related: Smart glasses could boost privacy by swapping cameras for this 100-year-old technology

To answer a spoken query such as "What type of plant am I looking at?", the device first takes a photo of what the wearer is looking at, then sends it to cloud-based AI processing, which delivers an answer by speech.
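Based on Meta's description, that interaction maps to a simple capture, query, respond loop. The Python sketch below is a hypothetical illustration of the flow only; capture_photo, query_multimodal_model and speak are invented stand-ins for the glasses' camera, the cloud-hosted model and the speaker output, not Meta's actual APIs.

    # Hypothetical sketch of the capture -> cloud query -> spoken answer loop.
    # These functions are invented placeholders, not Meta's real APIs.

    def capture_photo() -> bytes:
        """Stand-in for the glasses' built-in camera capture."""
        return b"<jpeg bytes of the wearer's current view>"

    def query_multimodal_model(image: bytes, question: str) -> str:
        """Stand-in for the cloud-hosted multimodal model, which accepts
        more than one medium (an image plus a text query) and returns text."""
        return "That looks like a jade plant (Crassula ovata)."

    def speak(text: str) -> None:
        """Stand-in for text-to-speech playback through the glasses' speakers."""
        print(f"[spoken] {text}")

    def handle_query(question: str) -> None:
        image = capture_photo()                           # 1. snap a photo of the wearer's view
        answer = query_multimodal_model(image, question)  # 2. send image + query to the cloud
        speak(answer)                                     # 3. deliver the answer by speech

    handle_query("What type of plant am I looking at?")

The key design point, per Meta's description, is that the heavy lifting happens in the cloud rather than on the glasses themselves, which keeps the on-device work limited to capture and playback.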

(Image credit: Meta / Ray-Ban)

Meta first rolled out multimodal AI to the Ray-Ban Meta smart glasses in a limited early-access release in December 2023.

Testing the AI in the device, a reporter from The Verge found that it mostly responded correctly when asked to identify the model of a car, and that it could describe a cat and its features from an image snapped with the camera. But the AI had trouble accurately identifying the species of the reporter's plants and struggled to correctly identify a groundhog in a neighbor's backyard.