Meta’s smart glasses will soon provide detailed information regarding visual stimuli

The Ray-Ban Meta glasses are getting an upgrade designed to better assist the blind and low-vision community. The AI assistant will now provide “detailed responses” about what’s in front of users; Meta says the feature will kick in “when people ask about their environment.” To get started, users just have to opt in via the Device Settings section of the Meta AI app.

The company shared a video of the tool in action, in which a blind user asked Meta AI to describe a grassy area in a park. It quickly hopped into action and correctly pointed out a path, trees and…

→ Continue reading at Engadget
