"At a high level, Fawkes takes your personal images, and makes tiny, pixel-level changes to them that are invisible to the human eye, in a process we call image cloaking."
"At a high level, Fawkes takes your personal images, and makes tiny, pixel-level changes to them that are invisible to the human eye, in a process we call image cloaking. You can then use these "cloaked" photos as you normally would, sharing them on social media, sending them to friends, printing them or displaying them on digital devices, the same way you would any other photo. The difference, however, is that if and when someone tries to use these photos to build a facial recognition model, "cloaked" images will teach the model an highly distorted version of what makes you look like you. The cloak effect is not easily detectable, and will not cause errors in model training. However, when someone tries to identify you using an unaltered image of you (e.g. a photo taken in public), and tries to identify you, they will fail."
This is interesting. I currently help run a private community and although the members are encouraged to operate under their real identities, they are extremely privacy conscious.
I can imagine that if we auto-applied this image cloaking to all uploaded photos, it would be a nice differentiator. Thanks for sharing.
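For what it's worth, auto-applying could be as simple as a post-upload hook that runs the Fawkes CLI over each new batch of photos. A minimal sketch, assuming the `fawkes` command-line tool is installed and on PATH (the directory layout and helper names here are illustrative, not from the project):

```python
# Hypothetical upload hook: cloak newly uploaded photos with the Fawkes CLI
# before serving them. Assumes `fawkes` (pip install fawkes) is on PATH;
# function names, paths, and the chosen mode are illustrative.
import subprocess
from pathlib import Path


def build_cloak_command(upload_dir: str, mode: str = "low") -> list:
    """Assemble the Fawkes CLI invocation for a directory of uploads."""
    return ["fawkes", "--directory", str(Path(upload_dir)), "--mode", mode]


def cloak_uploads(upload_dir: str, mode: str = "low") -> None:
    # Runs Fawkes over the directory; cloaked copies are written
    # alongside the originals, which the upload pipeline can then swap in.
    subprocess.run(build_cloak_command(upload_dir, mode), check=True)
```

One caveat for a community site: cloaking is compute-heavy per image, so you'd likely want to run it in a background job queue rather than inline with the upload request.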
Cool, but isn't this "product" hunt, not "research paper" hunt? I keep seeing more and more closed betas, pre-releases, art projects, and joke sites that people can't actually try and use get posted to the site these days...