Show HN: Real-time privacy protection for smart glasses
I built a live video privacy filter that helps smart glasses app developers handle privacy automatically.
How it works: You can replace a raw camera feed with the filtered stream in your app. The filter processes a live video stream, applies privacy protections, and outputs a privacy-compliant stream in real time. You can use this processed stream for AI apps, social apps, or anything else.
Features: Currently, the filter blurs every face except those of people who have given consent. Consent can be granted verbally by saying something like "I consent to be captured" to the camera. I'll be adding more features, such as detecting and redacting other private information, speech anonymization, and automatic video shut-off in certain locations or situations.
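For illustration, the consent gate can be sketched as a per-frame redaction pass. This is a minimal numpy-only sketch, not the project's actual code: it assumes face bounding boxes come from a detector (the project uses OpenCV) and that the set of consented face indices has already been resolved by a matching step upstream; all names here are hypothetical.

```python
import numpy as np

def pixelate_region(frame, box, block=16):
    """Mosaic one region of an H x W x 3 uint8 frame in place.
    box is (x, y, w, h); block is the mosaic cell size in pixels."""
    x, y, w, h = box
    roi = frame[y:y + h, x:x + w]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            cell = roi[by:by + block, bx:bx + block]
            cell[:] = cell.mean(axis=(0, 1))  # flatten the cell to its mean color
    return frame

def redact_frame(frame, face_boxes, consented):
    """Redact every detected face whose index is not in the consent set."""
    for i, box in enumerate(face_boxes):
        if i not in consented:
            pixelate_region(frame, box)
    return frame
```

Pixelation is used here only because it is easy to show without OpenCV; a Gaussian blur over the same region would serve the same purpose.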
Why I built it: While developing an always-on AI assistant/memory for glasses, I realized privacy concerns would be a critical problem, for both bystanders and the wearer. Addressing this involves complex issues like GDPR, CCPA, data deletion requests, and consent management, so I built this privacy layer first for myself and other developers.
Reference app: There's a sample app (./examples/rewind/) that uses the filter. The demo video is in the README; please check it out! The app shows the current camera stream and past recordings, both privacy-protected, and will include AI features using the recordings.
Tech: Runs offline on a laptop. Built with FFmpeg (stream decode/encode), OpenCV (face recognition/blurring), Faster Whisper (voice transcription), and Phi-3.1 Mini (LLM for transcription analysis).
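One plausible way to wire the "replace the raw feed with the filtered stream" idea together (a sketch under assumptions, not the project's actual plumbing): have FFmpeg decode the input to raw BGR frames on a pipe, run each frame through a filter callback, and emit the result. `frame_stream`, `run`, and `filter_fn` are hypothetical names.

```python
import subprocess
import sys

import numpy as np

W, H = 1280, 720  # 720p, matching the README's stated target

def frame_stream(pipe, width=W, height=H):
    """Yield height x width x 3 uint8 frames from a raw BGR24 byte pipe."""
    frame_bytes = width * height * 3
    while True:
        buf = pipe.read(frame_bytes)
        if len(buf) < frame_bytes:  # end of stream (or truncated frame)
            return
        yield np.frombuffer(buf, np.uint8).reshape(height, width, 3).copy()

def run(src_url, filter_fn):
    """Decode src_url with ffmpeg, filter each frame, write raw frames to stdout."""
    dec = subprocess.Popen(
        ["ffmpeg", "-i", src_url, "-f", "rawvideo", "-pix_fmt", "bgr24", "-"],
        stdout=subprocess.PIPE)
    for frame in frame_stream(dec.stdout):
        sys.stdout.buffer.write(filter_fn(frame).tobytes())
```

The output side would normally be piped back into a second ffmpeg process for encoding; that half is omitted for brevity.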
I'd love feedback and ideas for tackling the privacy challenges in wearable camera apps!
I appreciate your intent, but...
This does nothing to alleviate my privacy concerns, as a bystander, about someone rudely pointing a recording camera at me. The only thing that alleviates these concerns about "smart" glasses wearers recording video, is not having "smart" glasses wearers. I.e., not having people rudely walking around with cameras strapped to their faces recording everyone and everything around them. I can't know/trust that there is some tech behind the camera that will protect my privacy.
A lot of privacy invasions have become normalized and accepted by the majority of the population. But, I think/hope a camera strapped to someone's face being shoved into other peoples' faces will be a tough sell. Google Glass wearers risked having the camera ripped off their faces / being punched in the face. I expect this will continue.
Perhaps your tech would have use in a more controlled business/military environment? Or, to post-process police body camera footage, to remove images of bystanders before public release?
I feel quite uneasy about footage being routinely recorded and sent to big corps by cameras strapped to random people's faces. I'm much more bothered by the fact that this gets sent to a central location and processed than by the mere fact of being recorded without consent.
However, even with this uneasy feeling, one has to recognise a street is a public space and I don't see how one can have reasonable expectation of complete privacy there. There is nothing rude about recording what you can see.
The privacy expectation I have is not that my picture will never be captured, but that such recordings from many unrelated people will not be aggregated to trace my movements.
So in summary, I think everyone has a right to photograph or record whatever they like in a public space, but the action of pooling all such recordings, and running face tracking on them to track individual people (or build a database of movements, whatever) is breaching these people's privacy and there should be laws against it.
Agreed.
A tool like this is powerless against companies like Google or Meta. Just with their phone apps and OS control, given a video like the one displayed, those companies could know exactly who each person in the video is, what they are doing, who they are with, and record that information forever.
In the video I see three young women, another woman near the zebra crossing, a young woman with a young man, a woman walking with two men at her sides, and another young couple. I know their heights, whether they are fat or slim, their hair type. An AI could know all that too, and with that information plus a little more, such as one person in a group having location services enabled, a computer has enough to automatically deduce the rest.
If enough people wear those stupid glasses, everyone in a city is surveilled in real time with second and centimetre accuracy, including indoors in places like restaurants or malls.
This is too much power for any company or institution to have. If Meta or Google have the ability to do that, they will be required by the US government to hand over that info automatically under some excuse like "national security".
Seriously. There has been so much progress in the area of non-consensual recording and processing of data, and so little in the area of countermeasures. You can do a web search for the former and find tons of hardware and software that'll help you spy on folks. Searching for adversarial design gives scientific papers. It implies that there is little-to-no measurable demand for privacy (at least of this sort) in the marketplace.
I agree with all you said, but I don't believe there is any way you could protect yourself from being recorded.
The only way for this to work is legal regulation. But regulation can be easily dismissed as impossible to implement, so this is a good PoC to show what is possible and a way to discover how it could function. Without such an implementation, I don't believe you could convince anybody to start working on such regulations.
It's an ass-backwards approach to privacy, isn't it?
I don't need something to protect the privacy of others from me, I need something to protect my privacy from others. The majority of people who use smart glasses are not going to be using this - where is the product that will protect me from them?
> Real-time processing – 720p 30fps on laptop
Have you tried running this on a phone or standalone smart glasses? 30fps is horrible performance on a laptop given that it's probably 10-100x more powerful than the target device. And what laptop? Based on your screenshot, I'm guessing you're using a new Apple Silicon mac which is practically a supercomputer compared to smart glasses.
With proper implementation, on-device processing on a smartphone is feasible. On-glasses processing would be challenging, especially with battery constraints.
The 720p 30fps figure is from a PoC implementation, so there is still significant room for improvement. And yes, the demo is on an Apple Silicon M2.
Still much less effective than spray paint liberally applied.
I think this is interesting research. I could see BLE beacons that announce what level of sharing one is comfortable with. Not unlike systems used at conferences to denote if someone can take their picture and what they can do with it.
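As a thought experiment, such a beacon payload could be tiny. The format below is entirely invented for illustration (there is no standard for advertising capture-consent preferences); it just packs a made-up magic byte, a consent level, and an opaque person id into a manufacturer-data-style blob.

```python
import struct

# Hypothetical consent levels a beacon might advertise
NO_CAPTURE, BLUR_ME, CAPTURE_OK, SHARE_OK = range(4)

MAGIC = 0xC0  # made-up identifier byte for this (non-standard) payload

def encode_consent(level, person_id):
    """Pack a consent level and an opaque 4-byte person id into a
    6-byte little-endian payload. The format is invented, not a spec."""
    return struct.pack("<BBI", MAGIC, level, person_id)

def decode_consent(payload):
    magic, level, person_id = struct.unpack("<BBI", payload)
    if magic != MAGIC:
        raise ValueError("not a consent beacon")
    return level, person_id
```

A real deployment would also need to bind the beacon to the person it speaks for (e.g. via the face-matching step), otherwise anyone could broadcast preferences on someone else's behalf.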
I'm curious how you handle the case when the camera sees multiple people and you detect consent.
If I understand correctly how this works, consent can come from the camera operator and be attributed to a recorded person.
When multiple people are in view and the system detects consent, the current implementation assumes the person closest to the camera is the one giving it. This is not ideal, so active speaker detection is planned.
Verbal consent is just one example. Depending on the situation, other interfaces may work better, such as having a predefined list of friends who are always consented.
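The closest-person heuristic described above can be approximated by attributing consent to the largest face bounding box in the frame; a minimal sketch (names hypothetical, not the project's API):

```python
def box_area(box):
    x, y, w, h = box
    return w * h

def likely_speaker(face_boxes):
    """Heuristic: attribute detected verbal consent to the face whose
    bounding box is largest, i.e. presumably closest to the camera.
    Returns the index of that box, or None if no faces are in view."""
    if not face_boxes:
        return None
    return max(range(len(face_boxes)), key=lambda i: box_area(face_boxes[i]))
```

Active speaker detection would replace this with lip-motion or audio-visual cues, since the largest face is not necessarily the one talking.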
I wonder if I can make a device that recognizes Meta, etc., glasses and aims and shoots a laser at the camera lens (and sends my users to jail after the device misses the lens and hits the glasshole's eye instead). If there are many glassholes, the laser would just shoot at them alternately at, e.g., 30 shots/second... and make pew-pew noises each time.
Using something like near infrared would dazzle the sensors in these devices and is invisible and safe for human eyes.
This is like creating a t-shirt that says “don’t shoot me”.
The problem isn’t consent, the problem is that the gun is being needlessly pointed at you in the first place.
I think mainstream adoption of smart glasses could be slowed more by privacy concerns than by hardware limitations. Remember Google Glass? While the hardware keeps improving, I want to make sure we're also addressing the software side.
Your solution does absolutely nothing to address privacy concerns with "glassholes".
Did you look at EgoBlur? It's a lot more effective at face detection than https://github.com/ageitgey/face_recognition. Granted, you'd have to do your own face matching to handle the exceptions.
I'll take another look at EgoBlur to see if it's a good fit, thanks. When I checked it briefly before, I thought it was focused on post-processing (non-realtime) and didn't integrate with matching easily, but it's definitely solid tech.
Reminds me of the Black Mirror episode "White Christmas".
https://en.wikipedia.org/wiki/White_Christmas_(Black_Mirror)
Smart solution for wearable privacy. Consent-by-voice and offline processing make it practical, and I’m curious how you’ll expand beyond face blurring.
Thanks! Beyond face blurring, I'm planning to detect and redact other private information like license plates, name tags, and sensitive documents.
Most of these features are aimed at protecting bystanders, but I'm also interested in exploring privacy protection for the wearer. For example, automatically shutting off recording in bathrooms, or removing identifiable landmarks from the video when they could reveal the wearer's location, depending on the use case.
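The location-based shut-off could reduce to a geofence check against a list of no-record zones. A minimal sketch, assuming the zone list is supplied by the wearer (the coordinates and names below are placeholders, not anything from the project):

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Each zone: (lat, lon, radius_m) -- placeholder entries for illustration
NO_RECORD_ZONES = [
    (35.6595, 139.7005, 30.0),  # e.g. a restroom entrance
]

def recording_allowed(lat, lon, zones=NO_RECORD_ZONES):
    """Return False when the wearer is inside any no-record zone."""
    return all(haversine_m(lat, lon, zlat, zlon) > r for zlat, zlon, r in zones)
```

Indoor cases like bathrooms would likely need scene classification on top of this, since GPS alone is too coarse indoors.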
This reminds me of the Quantum Thief series of novels by Hannu Rajaniemi and its concept of Gevulot. It is a system that allows people to set the desired level of privacy in every social encounter, to share memories, and to access the exomemory. People can obscure themselves from being seen by others.
Meta introduced these and has sold a few million so far. Apple releasing theirs in 2026/2027 will increase adoption tenfold. I thought they might add something like this that blurs out people in public, or changes their faces to a celebrity's or to generic faces.
Note: I love my Meta Ray-Bans, yet they aren't durable; after 2 pairs broke within 20 months, I'm not sure I will buy another (maybe the upcoming Oakleys, but those have the same unreliable, non-durable tech). For me they are great for taking pics and videos of my life (and asking the time when my phone isn't on me), and, when traveling solo, using the Live AI feature to learn more about my surroundings, like the history of the train tracks I'm walking alongside.
https://imgur.com/a/Y5WjYUH
I can taste the irony.
Real-time self privacy protection people might buy into. They can trust what they are doing and they get the benefit. E.g. automatically not recording during certain periods without having to explicitly signal it each time is a convenience. I don't want my big post Taco Bell Shit to waste my storage or accidentally stream it somewhere. At the same time, I also don't want my smart glasses to be tethered to something which can run AI and video processing models all the time, or to be dumping it all to the cloud 24/7 live either, as battery is already one of the big problems with the tech currently. At least that's something the advance of technology in general may be able to solve though, so I like the concept for self-application.
I don't know that this would actually make others any more comfortable or give them any guarantee, which is where the bulk of the privacy concerns around smart glasses lie. I'm also not sure that's a problem more technology can solve. The nice thing about, e.g., recording consent is that you grant it on your own device, or via a 3rd-party cloud service. The moment it runs on the other person's device, it's no different (in trust/easiness/guarantee terms) from just asking that person not to record you (again, beyond the benefits to the operator themselves). If some guy walked up to you with a literal video camera pointed at you and said "it only records your face if you give it consent":
- Do you even trust the person is telling the truth?
- If you do, do you trust the technology to work 100% of the time?
- If it is a lie or doesn't work and they post something, are you any better off than if you weren't asked for consent and said no? (note: separate this from the value to the person operating the camera, who does get value from those that say yes).
- If you say no, are you really not going to feel awkward compared to not having someone recording the rest of the situation?
Maybe the answer for you to all of these is "Absolutely!", I'd bargain to bet it's not the case for the vast majority of those concerned with the privacy implications of the technology though (which I don't consider myself a part of that group necessarily, I'm just putting myself in their shoes for a second).
So I'd split my thoughts into two main sections: I'm not sure technology is going to solve any external privacy concerns here, but I think it's an interesting approach to internal privacy concerns with the tech.
At some point we’ll probably have smart glasses that actively block or scramble cameras, so random people can’t record you just walking down the street.
There is a strange ingredient in the ether right now - confounding privacy with censorship. If I can see something, and hear it, then blurring it is not "privacy", it's censorship.
I fully expected this to be about solutions for empowering the user to ensure that her audio and visual experiences weren't being exfiltrated to a third-party, not that they were being censored before even getting to her.
Can you expound on the difference between privacy and censorship?
Privacy is the right to wear a veil if you choose.
Censorship is telling everyone else they have to avert their eyes as you walk past.
In all places and situations?
Yeah, I think that basic distinction applies in some universal way.
Whether or not privacy is proper in a particular situation, or whether censorship is, that's another matter. But it's frustrating to see them confounded in this way.
Crippling the user's device (which is ultimately an affront to freedom of general purpose computing) in order to censor data which is plainly available through sensory input is not, in any sense, "privacy".