Meta is offering early access to its most impressive AI features for the Ray-Ban Meta smart glasses. These features use multimodal AI to let Meta's AI assistant answer questions about what the glasses can see and hear through their camera and microphones.
In an Instagram reel, Mark Zuckerberg showcased the update by using the Ray-Ban Meta smart glasses to ask for fashion advice. He held up a shirt and asked the glasses to suggest matching pants. The glasses responded by describing the shirt and offering a few suggestions for complementary pants. Zuckerberg also demonstrated the AI assistant's translation capabilities and showed a couple of image captions.
During an interview with The Verge's Alex Heath in September, Zuckerberg unveiled the multimodal AI features of the Ray-Ban Meta glasses. In the same interview, he also discussed other features of the smart glasses, including the ability to ask for help captioning photos, as well as translation and summarization capabilities commonly found in other products from companies like Microsoft and Google.