Selvaggio thought up the project, which he calls URME Surveillance, when he was living in Chicago, where law-enforcement officials have access to more than 30,000 interlinked video cameras across the city. He wanted to start conversations about surveillance and what technology does to our identity. He knew that researchers had found that facial-recognition software exhibits racial biases.
Selvaggio sees two ways to elude facial-recognition programs. The first is to disappear: go offline and off the grid. Selvaggio prefers the second option, which is to flood the system with weird, incongruous data. Wear someone else’s likeness or lend out your own. (Before donning a prosthetic mask, check whether your city or state has anti-mask laws that make wearing one illegal.) Even without a mask, though, you can confuse some facial-recognition programs by obscuring parts of your face with makeup, costuming, hairdos and infrared light. Artificial-intelligence programs look for elliptical, symmetrical faces, so obscure an eye, cover the bridge of your nose, wear something that makes your head look unheadlike. “They have all of our information,” Selvaggio says. “So then let’s make more information that isn’t even true, and then let’s make more information on top of that.”