Meta’s Visionary Innovation: AI-Powered Ray-Ban Smart Glasses

Image: black Ray-Ban Wayfarer sunglasses on beach sand

The current landscape of smart glasses is intriguing, occupying a space between two primary objectives: functioning as a device with audio and, potentially, camera capabilities, or serving as a face-worn augmented reality display. Ray-Ban’s smart glasses align more with the former category, offering satisfactory audio features and a decent camera. However, they fall short in two areas: an intuitive method for framing photos, and a voice assistant that can rival the capabilities of Alexa or Google Assistant.

Meanwhile, Meta, like technology giants such as Google, Microsoft, and OpenAI, is fully immersed in the latest AI advancements. The company recently introduced a new image generator and AI chatbots across its diverse range of services. Now, Meta is actively incorporating that AI into its smart glasses.

The focus here lies on the device’s camera. Meta has introduced a new beta version that integrates multimodal AI-powered functionalities into the Ray-Ban smart glasses. With this beta, users can initiate the AI with the familiar “hey Meta” command, directing it to observe the surroundings before posing a question. In a demonstration, CEO Mark Zuckerberg instructs his glasses to inspect a shirt he’s holding and recommend suitable pants to complement it.

According to Meta, users can also instruct the glasses to generate a caption for a photo being taken or describe an object they’re holding. This latter capability holds significant appeal for individuals with visual impairments. The glasses provide both an audio response and a written version accessible through the Meta View app.
