Meta's Ray-Ban glasses will soon describe what you're looking at, like something out of a sci-fi movie

Last April, Meta unveiled its Ray-Ban Meta smart glasses, designed to capture photos and videos in a subtle, wearable package. Now, the company is preparing to roll out a new feature that gives the glasses AI-powered vision capabilities akin to those seen in sci-fi films like *Minority Report* and *Blade Runner*: the ability to analyze what the wearer is looking at and describe it.

The upcoming feature will let the Ray-Ban Metas analyze what's in front of the wearer and deliver descriptive audio through the glasses' built-in speakers or over Bluetooth to another device. For example, the glasses could report how many people are in a room, what they're wearing, and even subtle details such as whether a sports mascot is wearing a helmet.

Ultimately, it could even note whether a room has a fireplace, or whether a plant is toxic. That level of description could prove invaluable for people with visual impairments or other disabilities.
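Neither Meta nor COGS has detailed how the pipeline works, but the behavior described above follows a familiar pattern: capture a camera frame, caption it with a vision model, and read the caption aloud. Here's a minimal sketch of that loop using off-the-shelf open-source components (BLIP for captioning, pyttsx3 for text-to-speech); it illustrates the general idea only, and is not Meta's or COGS' actual implementation:

```python
# Illustrative scene-description loop. BLIP and pyttsx3 are stand-ins for
# whatever vision and speech stack the glasses actually use.
import cv2                      # webcam as a stand-in for the glasses' camera
import pyttsx3                  # offline text-to-speech
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration

processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
model = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-base")
tts = pyttsx3.init()

def describe_frame(frame_bgr):
    """Caption one camera frame and return a short text description."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    inputs = processor(images=Image.fromarray(rgb), return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=30)
    return processor.decode(out[0], skip_special_tokens=True)

camera = cv2.VideoCapture(0)
ok, frame = camera.read()
if ok:
    caption = describe_frame(frame)   # e.g. "a living room with a fireplace"
    tts.say(caption)                  # spoken aloud, as the glasses would do
    tts.runAndWait()
camera.release()
```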

The new feature is made possible by a new partnership with AI company COGS, which has developed a way to process visual data quickly and efficiently by pairing its AI vision software with Meta's ARR (augmented reality rendering) Engine, the system that overlays digital information on top of the real-world view.

COGS' technology can identify shapes, objects, and people across 61 categories, such as clothing, food, and furniture, and provide information about them in real time. The partnership with Meta is the AI company's first major deal, though it previously worked with Apple on its Live Text feature, which lets users snap photos of things like printed recipes and automatically convert them into digital text.
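COGS hasn't published its taxonomy or API, but grouping raw object detections into a fixed set of high-level categories is a standard technique. Here's a rough sketch using torchvision's pretrained COCO detector and an assumed, purely hypothetical category mapping, just to show the shape of the idea:

```python
# Hypothetical category-grouped detection, loosely mirroring the article's
# "61 categories" description. The detector (COCO-pretrained Faster R-CNN)
# and the CATEGORY_MAP below are stand-ins, not COGS' actual taxonomy.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()
labels = weights.meta["categories"]   # COCO class names

# Assumed grouping into broad buckets like the article's clothing/food/furniture.
CATEGORY_MAP = {
    "person": "people",
    "chair": "furniture", "couch": "furniture", "dining table": "furniture",
    "banana": "food", "pizza": "food", "apple": "food",
    "tie": "clothing", "handbag": "clothing", "backpack": "clothing",
}

def summarize_scene(image_path, threshold=0.7):
    """Count confident detections per high-level category."""
    img = read_image(image_path)
    with torch.no_grad():
        detections = model([preprocess(img)])[0]
    counts = {}
    for idx, score in zip(detections["labels"], detections["scores"]):
        if float(score) < threshold:
            continue
        bucket = CATEGORY_MAP.get(labels[int(idx)], "other")
        counts[bucket] = counts.get(bucket, 0) + 1
    return counts

print(summarize_scene("room.jpg"))   # e.g. {'people': 3, 'furniture': 2}
```

A summary like that is easy to turn into spoken feedback ("there are three people in the room"), which is presumably the kind of output the glasses would deliver.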

Meta has been ramping up its AI efforts across the board, from AI-powered Oculus virtual reality (VR) workouts to its evolving stable of chatbots to its controversial plan for a universal language translator.

The company hasn't said exactly when the feature will roll out, beyond "in the coming months," but the update will arrive first on the Ray-Ban Meta 2 glasses, followed by the Ray-Ban Meta 1.

Though the new feature gives the glasses some AI-driven vision capabilities, they don't have the kind of general visual understanding that people do. For example, they can't track a specific person or object around a room or recognize specific individuals, at least not yet.

Regulations and ethical concerns surrounding AI-driven vision systems could also limit the feature's usefulness or availability. But even in their current form, the Ray-Ban Metas could prove to be valuable gadgets for people with visual impairments or anyone else who could benefit from knowing more about their surroundings.

And if Meta continues to develop the glasses' AI capabilities, who knows what they could understand in the future? The company certainly seems to be heading down a sci-fi path, at least in terms of its technology.
