Anti-Surveillance Clothing Thwarts Facial Recognition
The commercial use of facial recognition software is becoming more common: Amazon scans faces in its physical shop, and Facebook searches users’ photos to suggest tags. Those concerned about their privacy are fighting back.
Berlin-based artist and technologist Adam Harvey aims to overwhelm and confuse these systems by presenting them with thousands of false hits so they can’t tell which faces are real.
The Hyperface project involves printing patterns on to clothing or textiles, which then appear to have eyes, mouths and other features that a computer can interpret as a face.
This is not the first time Harvey has tried to confuse facial recognition software. In an earlier project, CV Dazzle, he developed styles of makeup and hairstyling designed to stop machines from detecting a face.
Speaking at the Chaos Communication Congress hacking conference in Hamburg, Harvey said: “As I’ve looked at in an earlier project, you can change the way you appear, but, in camouflage you can think of the figure and the ground relationship. There’s also an opportunity to modify the ‘ground’, the things that appear next to you, around you, and that can also modify the computer vision confidence score.”
Harvey’s Hyperface project aims to do just that, he says, “overloading an algorithm with what it wants, oversaturating an area with faces to divert the gaze of the computer vision algorithm.”
The resultant patterns, which Harvey created in conjunction with international interaction studio Hyphen-Labs, can be worn or used to blanket an area. “It can be used to modify the environment around you, whether it’s someone next to you, whether you’re wearing it, maybe around your head or in a new way.”
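To make the mechanism concrete, the sketch below shows the kind of face-detection step such patterns aim to overload. It is a minimal illustration only, using OpenCV’s bundled Haar cascade detector as a stand-in (the specific systems Hyperface targets are not named here), and the image filename is hypothetical.

```python
# Minimal sketch: counting face-like detections with OpenCV's bundled
# Haar cascade. A Hyperface-style pattern aims to flood a detector like
# this with false positives, so the raw count stops tracking real faces.
import cv2

# Load OpenCV's pre-trained frontal-face Haar cascade.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# Hypothetical photo of a scene containing a Hyperface-printed textile.
image = cv2.imread("scene_with_hyperface_scarf.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# detectMultiScale returns one bounding box per face-like region it finds;
# on a pattern dense with eye- and mouth-like features it can return many.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face-like regions")
```

On an ordinary street scene the detection count roughly tracks the number of people in frame; point the same detector at a scene blanketed with a Hyperface-style pattern and the output fills with pattern-induced hits, which is the diverted gaze Harvey describes.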
Explaining his hopes for how technologies like his would affect the world, Harvey showed an image of a street scene from the 1910s, pointing out that every figure in it is wearing a hat. “In 100 years from now, we’re going to have a similar transformation of fashion and the way that we appear. What will that look like? Hopefully it will look like something that appears to optimise our personal privacy.”
To emphasize the extent to which facial recognition technology changes expectations of privacy, Harvey collated 47 different data points that commercial and academic researchers claim to be able to discover from a 100x100 pixel facial image, around 2.5% of the size of a typical Instagram photo (10,000 pixels against the roughly 410,000 in a 640x640 upload). They include traits such as “calm” or “kind”, criminal classifications such as “pedophile” or “white collar offender”, and simple demographics such as “age” and “gender”.
Research from Shanghai Jiao Tong University, for instance, claims to be able to predict criminality from lip curvature, eye inner corner distance and the so-called nose-mouth angle.
“A lot of other researchers are looking at how to take that very small data and turn it into insights that can be used for marketing,” Harvey said. “What all this reminds me of is Francis Galton and eugenics. The real criminals, in these cases, are people who are perpetrating this idea, not the people who are being looked at.”