How to Thwart Facial Recognition

Tech
Tue 20 August 2019
“Why not give the camera what it wants, which is a face?” says Leonardo Selvaggio, an interdisciplinary artist. Just don’t give it your face. To help people defeat facial-recognition software, Selvaggio, who is 34 and white, has made 3-D, photo-realistic prosthetic masks of his own face available to anyone who wants one.
He tested the masks by asking people connected to him on Facebook to upload pictures of themselves in the prosthetic: It didn’t matter if they were skinny women or barrel-chested men; short or tall; black, brown, Asian or white — the social network’s facial-recognition software recognized them as Selvaggio. “There’s nothing more invisible to surveillance and security technology than a white man,” he says.

Selvaggio thought up the project, which he calls URME Surveillance, when he was living in Chicago, where law-enforcement officials have access to more than 30,000 interlinked video cameras across the city. He wanted to start conversations about surveillance and what technology does with our identity. He knew that researchers have found that facial-recognition software exhibits racial biases.

The programs are often best at identifying white and male faces, because they have been trained on data sets that include disproportionate numbers of them, and particularly bad at identifying black faces. In law-enforcement contexts, these errors can potentially implicate people in crimes they didn’t commit.

Selvaggio sees two routes to elude facial-recognition programs. The first is to disappear: go offline and off the grid. Selvaggio prefers the second option, which is to flood the system with weird, incongruous data. Wear someone else’s likeness or lend out your own. (Before donning a prosthetic mask, check to see whether your city or state has anti-mask laws, which may make wearing one illegal.) Even without a mask, though, you can confuse some facial-recognition programs by obscuring parts of your face with makeup, costuming, hairdos and infrared light. Artificial-intelligence programs look for elliptical, symmetrical faces, so obscure an eye, cover the bridge of your nose, wear something that makes your head look unheadlike. “They have all of our information,” Selvaggio says. “So then let’s make more information that isn’t even true, and then let’s make more information on top of that.”
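The intuition behind that advice can be sketched in a few lines of code. The toy symmetry score below is purely illustrative (real detectors use learned features, not this metric), but it shows the principle Selvaggio describes: a face that mirrors neatly left-to-right scores high, and occluding one eye with makeup or hair breaks exactly the regularity such systems reward.

```python
import numpy as np

def symmetry_score(img: np.ndarray) -> float:
    """Crude left-right symmetry measure: 1.0 means perfectly
    mirror-symmetric. A toy stand-in for the 'elliptical,
    symmetrical face' cue, not any real detector's metric."""
    mirrored = img[:, ::-1]           # flip columns left-to-right
    return 1.0 - np.abs(img - mirrored).mean()

# A cartoon "face": an 8x8 grid with two eyes and a mouth,
# laid out symmetrically about the vertical midline.
face = np.zeros((8, 8))
face[2, 2] = face[2, 5] = 1.0        # eyes
face[5, 2:6] = 1.0                   # mouth
print(symmetry_score(face))          # perfectly symmetric: 1.0

# "Makeup" covering one eye breaks the mirror symmetry,
# and the score drops accordingly.
occluded = face.copy()
occluded[2, 2] = 0.0
print(symmetry_score(occluded))      # lower than 1.0
```

The same idea scales up: asymmetric hairstyles, face paint across the bridge of the nose, or headwear that distorts the head's outline all push a face away from the regular, symmetric template that detection pipelines are tuned to find.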

New York – Malia Wollan
