The blur doesn’t cut it: AI can identify people in blurred images
A trio of researchers has found that off-the-shelf AI software can be used to identify people in blurred or pixelated images. The team has uploaded a paper describing their experiments, in which they used AI software to identify people and other objects in obscured images, and reporting just how accurate the approach proved to be.
A popular means of retaining privacy in videos or photographs while still maintaining some degree of authenticity is to blur the parts you do not want people to recognize, such as the faces of people at a protest rally. But it now appears that this technique may not be enough, because computers have become smart enough to recognize those faces anyway.
The AI software is not able to reconstitute an image, the team notes—rather, it analyzes the image and compares what it finds with other pictures available on Facebook, Instagram or YouTube, for example—places where there are photographs of identifiable people. Their study consisted of obtaining pictures of people from public places on the Internet and then blurring or pixelating them. Both versions of each picture were then fed to the AI system to teach it to match one to the other.
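The pixelation step the researchers worked against can be sketched in a few lines. This is a minimal illustration, not code from the paper: it assumes grayscale images stored as NumPy arrays and shows why the mosaic effect is learnable—each tile is replaced by its average, so coarse intensity patterns survive for a classifier to pick up.

```python
import numpy as np

def pixelate(img: np.ndarray, block: int) -> np.ndarray:
    """Pixelate a grayscale image by averaging over block x block tiles,
    the same mosaic effect applied by common photo-redaction tools."""
    out = img.astype(float).copy()
    h, w = out.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y + block, x:x + block]  # view into `out`
            tile[...] = tile.mean()              # flatten the tile to its mean
    return out

# A toy 8x8 "image": after pixelation, fine detail is gone but the
# coarse light-to-dark gradient (a learnable feature) remains.
img = np.arange(64, dtype=float).reshape(8, 8)
mosaic = pixelate(img, 4)
print(mosaic[0, 0])  # 13.5 — every pixel in a tile shares the tile's mean
```

Because the averages still vary from tile to tile, the obfuscated image is far from random noise, which is what a model trained on obfuscated/original pairs exploits.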
Once that was complete, the researchers fed the system different pictures of the same people and asked it to identify which photos corresponded with the blurred images. They found that the AI system could do so with an overall average accuracy of 57 percent. That average jumped to 85 percent when the system was given four more chances with each image.
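The two accuracy figures correspond to scoring a classifier's single best guess versus its five best guesses. A hedged sketch of that evaluation, using made-up classifier scores rather than anything from the study:

```python
import numpy as np

def topk_accuracy(scores: np.ndarray, labels: np.ndarray, k: int) -> float:
    """Fraction of images whose true identity is among the k highest-scoring
    candidates. k=1 scores only the best guess; k=5 allows four more chances."""
    topk = np.argsort(scores, axis=1)[:, -k:]  # indices of the k top scores per row
    hits = [labels[i] in topk[i] for i in range(len(labels))]
    return float(np.mean(hits))

# Hypothetical scores: 100 obfuscated images, 40 candidate identities.
rng = np.random.default_rng(0)
scores = rng.random((100, 40))
labels = rng.integers(0, 40, size=100)

# Allowing extra guesses can only help: the top-1 candidate is always
# contained in the top-5 set, mirroring the 57% -> 85% jump reported.
print(topk_accuracy(scores, labels, 1) <= topk_accuracy(scores, labels, 5))
```

With random scores both numbers are low, but the ordering always holds; in the study the gap between top-1 and top-5 accuracy was 57 versus 85 percent.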
The accuracy of individual results varied with the degree of blurring or pixelation, they note. They also point out that the same type of technology could be used to work out a street address that has been blurred out, or to identify some other obscured object.
What this means, the researchers explain, is that if a person posts a picture on the Internet in which their face has been pixelated or blurred out, there is a better than even chance that someone could identify them using similar software, particularly if they have a strong Internet presence.
They suggest that users of products such as YouTube’s blurring service be made aware that such measures are not adequate to protect a person’s privacy.
Richard McPherson, Reza Shokri & Vitaly Shmatikov (2016). Defeating Image Obfuscation with Deep Learning. arXiv:1609.00408v2