Smart glasses with built-in artificial intelligence have been marketed as discreet, futuristic technology. But for Meta, discretion may now prove a costly promise to break.
Lawsuit After Revelation of Private Content Review
Meta is now facing legal action after an investigation revealed that subcontractors hired by the company reviewed video recordings captured by customers' AI smart glasses. According to TechCrunch, the reviewed material included nudity, sexual acts, and other sensitive content.
Lawyers behind the lawsuit argue that Meta's own marketing materials explicitly promised users control over sharing recordings, and that privacy would be protected. These promises, according to the lawsuit, were not kept.
Marketing promised privacy and user control – but in reality, private content was reviewed by unknown third parties.

What Exactly Happened?
The core of the case is that users were unaware that video recordings from their glasses could be sent to, and reviewed by, human workers at subcontractors. The practice is known from other AI systems, such as voice assistants, but is rarely communicated clearly to consumers.
The practice, commonly known as human annotation or data labeling, is typically used to train and improve AI models. The problem arises when it is not sufficiently disclosed in the terms of use, especially when the material under review contains highly private content.

An Industry Problem, Not Just a Meta Problem
The case against Meta is not unique – it mirrors a pervasive challenge in the smart glasses and AI industry. A comparison of privacy practices among different smart glasses manufacturers shows significant differences in approach.
Apple, with Vision Pro, has chosen a strategy they call “privacy by design,” where as much data processing as possible occurs directly on the device. Eye-tracking data, hand movement information, and spatial mapping are stored locally and not shared with Apple or third-party apps without explicit permission, according to the company's own documents. Furthermore, an indicator light illuminates when recording is in progress – alerting nearby individuals.
Xreal, on the other hand, relies more on connected apps for data processing, and its privacy policy varies by country – making it more difficult for consumers to navigate.
What Does This Mean for Consumers?
For those who already use smart glasses – or are considering purchasing a pair – the case is a reminder that privacy policies should be read carefully. It is not enough to rely on marketing promises of “control” and “privacy.”
The lawsuit against Meta is still pending, and it is important to emphasize that its allegations have not yet been proven in court. Meta had not publicly commented on the case in detail at the time of this article's publication.
Regardless, the case will put further pressure on regulators and industry players to clarify standards for acceptable data collection and review in connection with wearable AI technology.
