Meta’s Smart Glasses Are Here. The Risks for Women and Kids Are Only Starting to Come Into Focus.
Meta’s Ray‑Ban smart glasses look like any other stylish sunglasses, until you realize they can quietly record what you see and send that video back to Meta to be analyzed by AI. For women and children, that mix of cameras, constant connectivity and powerful algorithms creates risks that go far beyond a typical smartphone.
Privacy regulators in Europe, child‑safety experts, pediatricians and digital‑rights groups are now warning that these glasses can supercharge harassment, stalking and surveillance in everyday spaces. Their message is simple: just because the tech is getting smaller doesn’t mean the dangers are.
What Meta’s Glasses Actually Do

Meta’s latest Ray‑Ban smart glasses include built‑in cameras, microphones and an AI assistant that can capture photos and video, livestream to social platforms, and send what you record back to Meta’s servers for analysis and storage. That means your “point of view” footage can be processed not only by algorithms but also by human reviewers.
A recent inquiry from members of the European Parliament cited a Swedish investigation showing that images and videos from Meta’s glasses were reviewed by workers in Kenya, who saw “private life and extremely sensitive moments,” including footage of people undressing and intimate scenes, often with no indication that those individuals knew they were being filmed. Meta has said recordings may be examined to improve its systems, which means users aren’t the only ones with eyes on what the glasses capture.
To signal recording, Meta relies on a small LED light on the front of the frames. European privacy watchdogs have questioned whether a tiny indicator is enough to give bystanders a real chance to understand they’re on camera and object, especially in crowded, dimly lit or kid‑focused spaces.
Why Women Are Especially Exposed
For women, smart glasses layer new capabilities onto very old problems: harassment in public, image‑based abuse, and stalking.
Digital‑rights advocates argue that camera‑equipped, AI‑powered glasses effectively turn the wearer into a walking surveillance device, making it much easier to film women without consent in bars, on public transit or in workplaces. Campaigners warn that these glasses “scan kids and families in public” and can be paired with facial recognition, raising the risk that women and girls are identified, tracked and cataloged in places where they should feel safe.
There’s also a pipeline from “just a quick video” to deeply harmful misuse. With high‑quality, close‑up video captured from eye level, it becomes easier to feed images into AI tools that generate deepfake pornography or non‑consensual sexual images. Child‑protection groups and privacy experts are already tracking sharp increases in deepfake abuse, and worry that widespread use of wearable cameras will simply flood the system with more raw material.
Then there’s stalking. European data‑protection authorities note that smart glasses can capture faces, voices and even technical fingerprints of nearby devices, such as Wi‑Fi and Bluetooth identifiers, which can be used to track someone’s movements over time. For women who are trying to leave an abusive partner or already dealing with a stalker, being unknowingly recorded and logged in specific locations—outside a workplace, in a new neighborhood, at a shelter—can be dangerous.
What This Means for Children
Children sit at the center of many of the biggest questions about these devices: consent, long‑term data collection and developmental health.
The American Academy of Pediatrics (AAP) has warned that kids now grow up inside “digital ecosystems,” where apps, platforms and AI tools can shape mood, behavior and even development. The AAP’s updated guidance emphasizes that parents should avoid intrusive surveillance and targeted advertising aimed at kids, and instead focus on quality content, family rules and device‑free times and places.
Smart glasses take that concern off the screen and into physical spaces. A child in a classroom, playground or doctor’s office can now be recorded by a parent, teacher, older teen—or a stranger in the next booth—without ever seeing a phone. European regulators have stressed that these devices inevitably record not just users but also “bystanders,” including children who never chose to interact with the technology.
In the United States, children’s data is supposed to be protected by the Children’s Online Privacy Protection Act (COPPA), which requires parental consent before companies collect personal information from kids under 13. That definition explicitly includes photos, videos and audio that can identify a child. The FTC has taken action against makers of internet‑connected toys and devices for allowing third parties to collect children’s recordings without proper consent, and in some cases has required the deletion of that data.
But COPPA was written long before everyday facial recognition and always‑on wearables. If a pair of smart glasses scans a playground, uploads footage to Meta, and AI tools can recognize children’s faces, it’s not clear how parents could ever provide meaningful consent for that kind of mass capture. That gap is one reason the White House’s Kids Online Health and Safety Task Force has urged tech companies to make privacy protections for minors the default setting and to prioritize kids’ wellbeing in product design.
How Regulators Are Responding
Regulators and lawmakers in multiple countries are starting to scrutinize Meta’s smart glasses more closely.
In Europe, members of the European Parliament have questioned Ireland’s Data Protection Commission, the lead privacy regulator for Meta in the EU, about how the company is protecting people who appear in smart‑glasses footage, including children and people filmed in intimate situations. Their letter highlights concerns about bystanders’ lack of consent and what happens to highly sensitive recordings once they reach Meta’s systems.
Separately, the European Data Protection Supervisor has flagged smart glasses as a high‑risk technology because they combine facial image capture, audio recording and behavioral analysis in ways that can make it almost impossible for people to remain anonymous in public spaces. The report warns that these devices enable “intrusive analysis of behavior and profiling” for both users and the unsuspecting people around them.
In the UK and US, privacy advocates and law firms have called for investigations and lawsuits, arguing that Meta’s glasses may violate privacy and consumer‑protection laws and that live AI features could dramatically expand how much real‑world data the company gathers. Some digital‑rights groups are going further, urging family‑focused venues like theme parks, zoos, children’s museums and faith communities to ban facial‑recognition glasses outright.

What Families and Women Can Do Right Now
While regulators debate, families and women are left to make day‑to‑day decisions about how to respond. Experts say a few practical steps can help.
First, set clear expectations in the places you control. Schools, clinics, gyms, shelters and childcare centers can update their policies to restrict camera‑equipped wearables in bathrooms, locker rooms, exam rooms and classrooms, just as many already do with smartphones. Parents can also talk with kids’ coaches, camp directors and youth‑group leaders about how these devices will be handled.
Second, bring smart glasses into your family’s media plan. The AAP recommends that families develop a written “family media plan” that spells out which devices are allowed where, and sets screen‑free times and places such as meals and bedrooms. That same approach can cover smart glasses: no recording at sleepovers, in bathrooms, or in other children’s homes without explicit permission from their parents, for example.
Third, know that digital surveillance can be a form of abuse. Domestic‑violence organizations increasingly treat constant monitoring, through phones, tracking tags or wearables, as a red flag. If you notice a partner insisting on wearing smart glasses in private spaces, recording without consent, or “coincidentally” showing up wherever you go, it may be worth reaching out to a local hotline or shelter to talk through your options.
Finally, many child‑safety and privacy experts say this is a moment to push for stronger rules. Advocates are calling on lawmakers to modernize COPPA to address facial recognition and bystander data, and on regulators to require companies to build in data minimization and default protections for minors. For now, though, individual families and women are being asked to carry the burden of navigating a device that can record them almost anywhere, at any time, whether they agreed to be part of Meta’s ecosystem or not.
Disclosure: This article was researched with the assistance of AI.
