Let's Kill Facial Recognition For Good

Sen. Ed Markey
Photo: Zach Gibson (Getty Images)

There’s a buffet of things to be scared shitless about these days, but for those who have sampled even a taste of big tech’s eagerness to build and sell facial recognition surveillance software to law enforcement, it’s uniquely rancid.


Among the challenges in reining in this dangerous class of artificial intelligence—which marries the lack of oversight endemic to tech with the lack of accountability law enforcement enjoys—is that there aren’t really any laws governing the use of these products to speak of. Even the most basic suggestions, like asking Amazon to submit its Rekognition suite to testing by the National Institute of Standards and Technology, have been ignored. With any luck, though, a bill announced today by four members of Congress could put us on the path toward a solution.

The Facial Recognition and Biometric Technology Moratorium Act, at least from my reading of the bill’s text, delivers on its name: a full stop to use of this entire class of software at the federal level. It broadly defines this class of technology as software that performs “remote biometric recognition in real time or on a recording or photograph”—and not just facial analysis and identification, but also identification based on other characteristics like gait or voice analysis. Fingerprint and palm-print analyses are the only acceptable carve-outs in the bill’s present wording.

Foreseeing loopholes down the line, Senator Ed Markey, the bill’s author, specifies that this ban would not only include employees of federal agencies, but contractors and subcontractors as well; it likewise specifies that, geographically, the ban applies to all of the United States: continental, territorial, ports and airports, as well as border zones.

Markey, along with Sen. Jeff Merkley and Reps. Pramila Jayapal and Ayanna Pressley, has more or less delivered on what privacy advocates have been demanding for years now. Unsurprisingly then, the bill comes with a litany of endorsements from the American Civil Liberties Union, the Electronic Frontier Foundation, Color of Change, and Georgetown University Law’s Center on Privacy and Technology, among others.

Research performed by these and other groups has, time and again, shown significant disparities in accuracy for facial recognition. While the amount of inconsistency differs, this class of software tends to struggle with accurately identifying individuals who aren’t white men. Compounding matters, reporting has shown that clients often use the software in ways it was not designed for, and that the companies selling it exercise little oversight over that use. Just yesterday, news broke of the first known case where facial recognition technology led to a wrongful arrest—of a Black man who looked nothing like the surveillance video he was supposedly matched to, no less.

Markey’s bill couldn’t have been drafted at a better time, strategically speaking. Increased public scrutiny regarding law enforcement’s proclivity to use force in racially unequal ways has led to Google, Microsoft, and IBM claiming they will—at least where police are concerned—exit the facial recognition space entirely. (Amazon merely announced a one-year halt to licensing such technology to police in a brief blog post that raised more questions than it answered.) While there are other players in this space, ones that aren’t necessarily household names, it seems likely tech companies might put up less resistance than usual to regulation—at least if they’re keen to look ideologically consistent to their customers.


That said, one of the bill’s supporters—the ACLU—today released emails that revealed Microsoft courted the Drug Enforcement Administration to buy its face recognition suite for well over a year. Though the emails predate the current promise to halt sales to police, that claim, like Amazon’s, does not specify whether sales to non-police government agencies are similarly off the menu. Those emails mention all three classes of surveillance covered in today’s bill—pattern, facial, and voice recognition—as being available for the DEA to use at its discretion.

“Facial recognition technology doesn’t just pose a grave threat to our privacy, it physically endangers Black Americans and other minority populations in our country,” Markey said in a statement. “I’ve spent years pushing back against the proliferation of facial recognition surveillance systems because the implications for our civil liberties are chilling and the disproportionate burden on communities of color is unacceptable. In this moment, the only responsible thing to do is to prohibit government and law enforcement from using these surveillance mechanisms.”


Thus far, bans have been limited to individual cities like San Francisco and, more recently, Boston, meaning police in the majority of the country are still free to spend taxpayer dollars on this unproven and incredibly dangerous software. If passed, the Facial Recognition and Biometric Technology Moratorium Act would be a landmark win for civil liberties and would hopefully set a precedent for states to follow.

“By leveraging federal grants, this legislation will help move state and local governments to pass their own prohibitions on discriminatory policing technology,” Brandi Collins-Dexter, a senior campaign director with Color of Change, wrote in Markey’s press release about the bill. “Ultimately, facial recognition software will always heighten the tremendous attacks that Black communities already face from law enforcement. We support this bill as a critical step toward a society where our communities can live without surveillance.”


Senior reporter. Tech + labor /// bgmwrites@gmail.com Keybase: keybase.io/bryangm Securedrop: http://gmg7jl25ony5g7ws.onion/


Why Facial Recognition is needed:

Investigators have a hard time solving crimes as it is: only about 1/3 of property crimes are solved, and that is with the help of various software.

Traditional investigations often involve an investigator sending out the image to beat cops saying “does anyone know this guy?” or “if anyone sees this guy, get me his info.” This leads to more encounters between BIPOC and police officers.

What does FR do?

Feed it a picture or video (say, Ring surveillance of a package being stolen) and it provides X number of names/photos of people who look like the person. The investigator will then do research to see if that is the suspect. This might include trying to get cell phone data, looking at pawn records to see if they are pawning stuff, checking social media, looking at tattoos, etc.

A giant arrest drone is not dispatched to scoop up this person just because the program said so. Matches are always reviewed by a person.

You know how in old shows they have the victim looking through a big book of mugshots, and then they dramatically go “that’s the guy!”? The program is doing that instead of the person. That’s it.
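The “book of mugshots, but automated” idea above can be sketched in a few lines. This is a minimal, made-up illustration, not any vendor’s actual system: it assumes faces have already been converted to fixed-length embedding vectors by some upstream model, and all names and numbers here are invented. The point is that the output is a ranked shortlist for a human reviewer, not a determination of identity.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k_candidates(probe, gallery, k=3):
    """Return the k gallery entries most similar to the probe image.

    This only produces a shortlist; a human decides what, if anything,
    to do with it.
    """
    scored = [(name, cosine_similarity(probe, emb))
              for name, emb in gallery.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

# Toy 3-dimensional "embeddings" standing in for real face vectors.
gallery = {
    "person_a": [0.9, 0.1, 0.0],
    "person_b": [0.0, 1.0, 0.1],
    "person_c": [0.8, 0.2, 0.1],
    "person_d": [0.1, 0.0, 1.0],
}
probe = [0.85, 0.15, 0.05]  # the surveillance still, after embedding

shortlist = top_k_candidates(probe, gallery, k=2)
```

Here the investigator would get back the two gallery faces closest to the probe, ranked by similarity, and everything after that is ordinary investigative work.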

So what is the problem with FR?

Bad PR, mostly? People seem to think it is like Person of Interest or Minority Report.

The biggest problem cited is that current programs seem to have trouble with faces of people of color. Which IS a problem, but it means one of two things:
1) Suspect in photo is not located (false negative)
2) Software identifies wrong person as suspect (false positive)
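The tradeoff between those two failure modes comes down to where the match threshold is set. Below is a toy sketch of that idea with invented similarity scores; real systems report accuracy very differently, but the direction of the tradeoff is the same: raise the threshold and you miss real suspects, lower it and you surface more wrong faces to the reviewer.

```python
def classify_matches(scores, threshold):
    """Count match outcomes at a given decision threshold.

    `scores` is a list of (is_same_person, similarity) pairs, where
    is_same_person is the ground truth and similarity is the model's score.
    """
    counts = {"true_pos": 0, "false_pos": 0, "true_neg": 0, "false_neg": 0}
    for is_same, score in scores:
        flagged = score >= threshold
        if flagged and is_same:
            counts["true_pos"] += 1
        elif flagged and not is_same:
            counts["false_pos"] += 1   # wrong person surfaced to the reviewer
        elif not flagged and is_same:
            counts["false_neg"] += 1   # real suspect missed
        else:
            counts["true_neg"] += 1
    return counts

# Made-up examples: (ground truth, model similarity score).
scores = [(True, 0.92), (True, 0.55), (False, 0.81), (False, 0.30)]

strict = classify_matches(scores, threshold=0.9)  # fewer false positives, more misses
loose = classify_matches(scores, threshold=0.5)   # more candidates, more false positives
```

With the strict threshold the wrong-person case is filtered out but a real suspect is missed; with the loose one the suspect is found but a wrong face lands in front of the reviewer.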

In the first case, it doesn’t really matter much beyond wasting a little of the investigator’s time. The bigger problem is #2, but remember all matches are verified by at least one human (more often than not the investigator will probably show people: “hey, do you think this guy is this guy?”). While people/police being bad at telling people apart is an issue, that is outside the scope of FR, since those issues have always existed.

So the system flags some photo as a possible suspect, and the investigator will go “oh shit, that’s him!” or “stupid machine, they look nothing alike!”

In the first case they will do additional investigation to verify it is the person. Boom, crime solved. In the second case they move on and try something else.

Investigators are lazy/racist/stupid/whatever

Yeah, some are. But FR doesn’t change that. They could more easily just open up TV’s “big book of suspects,” find a random dude who looks close enough, and pin the crime on him, all without the big paper trail that using the software creates. So for someone who wanted to be a bad actor, traditional methods would be both easier and less likely to get them caught.

It gets better

FR is like self-driving vehicles. Right now they are OK, not perfect, really bad some days, but in general a bit better than the average person.

But this is still early; these systems get better the more experience they get. Part of that is letting them get the ‘reps’ needed to build good muscles.

It is inevitable

Also like SDVs, this will happen. Understanding, education, and regulation are going to be more productive than standing on the tracks trying to stop it.


The way I see it, I don’t see any. With the proliferation of video recording, FR could be a powerful tool in solving crimes that have zero chance of being solved without it.

1. Keep things the same: low solving rates, lots of unneeded encounters between police and innocent people.
2. Use new technology appropriately. You could have fewer investigators handle more cases, thus leading to smaller departments.

*** Note: I am solely discussing “find me matches to this image” type of FR, not “track this person” or “find the names of everyone walking out of this club and see if they have warrants” type uses. Go ahead ban those. ***