There’s a buffet of things to be scared shitless about these days, but for those who have sampled even a taste of big tech’s eagerness to build and sell facial recognition surveillance software to law enforcement, it’s uniquely rancid.
Among the challenges in reining in this dangerous class of artificial intelligence—which marries the lack of oversight endemic to tech with the lack of accountability law enforcement enjoys—is that there aren't really any laws governing the use of these products to speak of. Even the most basic suggestions, like asking Amazon to submit its Rekognition suite to testing by the National Institute of Standards and Technology, have been ignored. With any luck, though, a bill announced today by four members of Congress could put us on the path toward a solution.
The Facial Recognition and Biometric Technology Moratorium Act, at least from my reading of the bill's text, delivers on its name: a full stop to the use of this entire class of software at the federal level. It broadly defines this class of technology as software that performs "remote biometric recognition in real time or on a recording or photograph"—and not just facial analysis and identification, but also identification based on other characteristics like gait or voice analysis. Fingerprint and palm-print analyses are the only carve-outs in the bill's present wording.
Foreseeing loopholes down the line, Senator Ed Markey, the bill’s author, specifies that this ban would not only include employees of federal agencies, but contractors and subcontractors as well; it likewise specifies that, geographically, the ban applies to all of the United States: continental, territorial, ports and airports, as well as border zones.
Markey, along with Sen. Jeff Merkley and Reps. Pramila Jayapal and Ayanna Pressley, has more or less delivered on what privacy advocates have been demanding for years now. Unsurprisingly then, the bill comes with a litany of endorsements from the American Civil Liberties Union, the Electronic Frontier Foundation, Color of Change, and Georgetown University Law's Center on Privacy and Technology, among others.
Research performed by these and other groups has, time and again, shown significant disparities in accuracy for facial recognition. While the degree of inconsistency differs, this class of software tends to struggle with accurately identifying individuals who aren't white men. Compounding matters, reporting has shown that clients often use the software in ways it was not designed for, and that the companies selling it have little oversight over that use. Just yesterday, news broke of the first known case where facial recognition technology led to a wrongful arrest—of a Black man who looked nothing like the surveillance video he was supposedly matched to, no less.
Markey’s bill couldn’t have been drafted at a better time, strategically speaking. Increased public scrutiny regarding law enforcement’s proclivity to use force in racially unequal ways has led to Google, Microsoft, and IBM claiming they will—at least where police are concerned—exit the facial recognition space entirely. (Amazon merely announced a one-year halt to licensing such technology to police in a brief blog post that left more questions than answers.) While there are other players in this space, ones that aren’t necessarily household names, it seems likely tech companies might put up less resistance than usual to regulation—at least if they’re keen to look ideologically consistent to their customers.
That said, one of the bill's endorsers—the ACLU—today released emails revealing that Microsoft courted the Drug Enforcement Administration to buy its facial recognition suite for well over a year. Though the emails predate the company's current promise to halt sales to police, that promise, like Amazon's, does not specify whether sales to non-police government agencies are similarly off the menu. The emails mention all three classes of surveillance covered in today's bill—pattern, facial, and voice recognition—as being available for the DEA to use at its discretion.
“Facial recognition technology doesn’t just pose a grave threat to our privacy, it physically endangers Black Americans and other minority populations in our country,” Markey said in a statement. “I’ve spent years pushing back against the proliferation of facial recognition surveillance systems because the implications for our civil liberties are chilling and the disproportionate burden on communities of color is unacceptable. In this moment, the only responsible thing to do is to prohibit government and law enforcement from using these surveillance mechanisms.”
Thus far, bans have been limited to individual cities like San Francisco and, more recently, Boston, meaning police in the majority of the country are still free to spend taxpayer dollars on this unproven and incredibly dangerous software. If passed, the Facial Recognition and Biometric Technology Moratorium Act would be a landmark win for civil liberties and would hopefully set a precedent for states to follow.
“By leveraging federal grants, this legislation will help move state and local governments to pass their own prohibitions on discriminatory policing technology,” Brandi Collins-Dexter, a senior campaign director with Color of Change, wrote in Markey’s press release about the bill. “Ultimately, facial recognition software will always heighten the tremendous attacks that Black communities already face from law enforcement. We support this bill as a critical step toward a society where our communities can live without surveillance.”