Fawkes is a privacy protection system developed by researchers at SANDLab, University of Chicago.
Have you noticed? The photo on the homepage looks a bit different. I modified the original photo with Fawkes and then used Pixelmator Photo to adjust the colors with ML and apply some filters (e.g. sepia).
Fawkes
Fawkes is “a system that helps individuals inoculate their images against unauthorized facial recognition models. Fawkes achieves this by helping users add imperceptible pixel-level changes (we call them “cloaks”) to their own photos before releasing them.”
See also the paper “Fawkes: Protecting Privacy against Unauthorized Deep Learning Models”.
In this way, known and unknown third parties are prevented from building accurate face recognition models of you from your publicly available photos. If they try to identify you by feeding such a model an unaltered, “uncloaked” image of you (e.g. a photo taken in public), the model will not recognize you.
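To make the idea of these “cloaks” a bit more concrete, here is a minimal, heavily simplified sketch of the general approach in PyTorch. It is not the actual Fawkes implementation (the real tool works against pretrained face feature extractors under a perceptual similarity budget); the tiny feature_extractor, the random dummy images and the epsilon bound below are placeholders for illustration only:

import torch
import torch.nn as nn

# Stand-in feature extractor; the real system targets pretrained face-embedding models.
feature_extractor = nn.Sequential(
    nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
for p in feature_extractor.parameters():
    p.requires_grad_(False)

original = torch.rand(1, 3, 112, 112)   # your photo (random dummy data here)
target = torch.rand(1, 3, 112, 112)     # a photo of a different identity
target_features = feature_extractor(target)

delta = torch.zeros_like(original, requires_grad=True)   # the "cloak"
optimizer = torch.optim.Adam([delta], lr=0.01)
epsilon = 0.05   # keeps the pixel-level change imperceptible

for step in range(200):
    cloaked = (original + delta).clamp(0, 1)
    # Pull the cloaked photo's features toward the other identity ...
    loss = torch.norm(feature_extractor(cloaked) - target_features)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # ... while keeping the change to the original pixels tiny.
    with torch.no_grad():
        delta.clamp_(-epsilon, epsilon)

cloaked_photo = (original + delta).detach().clamp(0, 1)

A model trained on photos cloaked this way learns features that point at the wrong identity, which is why an uncloaked photo of you is no longer matched.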
Fawkes is a software tool that runs locally on your computer. Here is the result of a modification; of course, I will not show the original picture 🕵️‍♂️:
This image was generated with:
fawkes -d ./imgs --mode low
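If you want to cloak several folders in one go (and see how long each run takes, since this is not fast), a small wrapper around the same command works. This is just a convenience sketch; it assumes the fawkes command-line tool is installed and on your PATH, and the folder names are made up:

import subprocess
import time
from pathlib import Path

folders = [Path("./imgs"), Path("./more_imgs")]   # hypothetical folders

for folder in folders:
    start = time.time()
    # Same invocation as above, using only the flags shown in this post.
    result = subprocess.run(
        ["fawkes", "-d", str(folder), "--mode", "low"],
        capture_output=True, text=True,
    )
    print(f"{folder}: exit code {result.returncode}, {time.time() - start:.0f}s")
    if result.returncode != 0:
        print(result.stderr)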
Summary
Generating a cloaked image takes some time, and you can see patches on my face. This is not a photo you would want to share, especially when you compare it with all the photos modified with Photoshop or generated with Snapchat filters. After I ran some filters over the picture, it doesn't look that bad. Maybe the filters alone would have been enough, and all the effort wouldn't have been necessary?
Any comments? Write to me on Twitter @choas (DMs are open).