Meta has announced the rollout of multimodal AI features for its Ray-Ban smart glasses, offering users an immersive experience that combines visual and auditory inputs. Mark Zuckerberg demonstrated the capabilities of the new features in a recent Instagram reel, showcasing the glasses’ ability to provide fashion suggestions, translate text, and offer image captions.
Speaking about the multimodal AI features during a September interview with The Verge, Zuckerberg emphasized the glasses’ potential to answer questions throughout the day, whether by providing information about the wearer’s surroundings or assisting with language translation. In the demonstration, the AI assistant, still in beta, accurately described various objects, including a California-shaped wall sculpture.
Early Access Test and User Engagement
The early access test for the multimodal AI features will be limited to a select group of users in the United States who opt in. This exclusive trial phase aims to gather user feedback and fine-tune the features before a broader release. The Meta AI assistant embedded in the latest Ray-Ban glasses promises many possibilities, using AI to process visual and audio inputs and respond with spoken answers.
“Look and Ask” Feature:
Zuckerberg showcased the “Look and Ask” feature, allowing users to capture a photo and ask questions about it within 15 seconds. The conversational command, “Hey Meta, look and…,” opens up possibilities for users to inquire about various aspects of their surroundings. For example, it facilitates the real-time translation of foreign texts, making it a valuable tool for travellers.
Privacy Concerns
Despite the promising technological advancements, experts have raised privacy concerns. Meta’s privacy policy indicates that every photo taken with the glasses is stored and retained for training AI models. While essential data such as battery status and connectivity are collected automatically, additional data, such as usage analytics and geolocation information, can be voluntarily shared by users.
The privacy debate surrounding Meta’s new features intensifies as users grapple with the decision to share additional data. Experts caution that vague statements about collecting data to respond to potential abuse or policy violations could leave room for misuse. Heather Shoemaker, CEO and founder at Language I/O, questions the clarity and nature of the data collected in the name of user protection.
Challenges Faced by Meta’s Ray-Ban
Meta’s initial foray into the smart glasses market faced challenges, with sales falling short of expectations by 20%. Even the glasses that were purchased went underutilized, with only 10% remaining active after 18 months. The introduction of advanced AI features represents Meta’s bid to reverse this trend and captivate users. However, the success of this endeavour hinges on addressing and alleviating privacy concerns.
In the quest for innovation, Meta’s journey with Ray-Ban smart glasses takes a leap into the future. As early access begins, the tech giant faces the dual challenge of impressing users with cutting-edge features while assuring them of robust privacy safeguards. Only time will tell whether Meta can strike the right balance and reshape the narrative of smart glasses technology.