Tell Congress to Put an End to FRT

Rudimentary facial recognition technology (FRT) was pioneered in the 1960s, and even then the government quickly realized its implications for policing and security. If a computer could quickly identify suspects, it could automate an extremely labor-intensive aspect of police work, saving money in the long run.

Since the 60s, computing power has increased significantly, and with it FRT has proliferated. FRT now touches many aspects of our lives, from the innocuous, like unlocking our phones, to the more sinister, like surveillance and security-state applications. As usual, the law hasn't quite kept pace with this explosion of potentially invasive technology.

Much like any other surveillance technology, FRT is ripe for abuse. Bias is baked into FRT by design: most algorithms are trained primarily on white, male faces, making them less accurate for non-white, non-male people. This can lead to false identifications, wrongful arrests, and lengthy, expensive legal battles to clear the names of the accused. We know of three cases, all involving Black men, in which the men were exonerated after being wrongly charged with crimes based on FRT matches.

This shouldn’t be taken as a call to make FRT algorithms more accurate; quite the opposite. A fully accurate FRT system would be a waking nightmare for privacy rights: a 1984-style dystopia in which the government can track your every movement throughout the day.

Facial recognition technology also has troubling implications for protest rights. Law enforcement often justifies the installation of cameras that may be used for FRT by claiming they are for “public safety purposes,” but then quickly turns those cameras on protesters to identify them. We know this is likely to have a chilling effect on peaceful protest: protesters fear that being identified could result in retribution from political or ideological opponents. This concern precipitated a moratorium by Amazon, IBM, and Microsoft on sales of FRT to law enforcement, as well as a new law in Virginia tightening restrictions on the use of facial recognition technology by local law enforcement agencies.

And it’s not rare for law enforcement, federal or local, to take advantage of this tech. A recent GAO report revealed that 20 of the 42 federal agencies surveyed use facial recognition technology. The GAO summarized the risks of FRT and expressed concern over the insecurity of citizens’ private data. Without pressure to stop its use, FRT will continue to proliferate.

FRT can also quickly entangle people in a Kafkaesque bureaucratic nightmare. Unemployment recipients across the U.S. have been denied benefits due to ID.me’s flawed facial recognition models. At a recent hearing of the Massachusetts legislature on facial recognition, Registrar Ogilvie of the Registry of Motor Vehicles (RMV) testified that 20% of applications to the RMV were flagged as suspicious by its preliminary facial recognition screen, which uses the Rekognition system. Those 260,000 flagged applications were reviewed by the State Police, who determined that just 497 were actually fraudulent.
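
To put those figures in perspective, here is a quick back-of-the-envelope calculation, a minimal sketch using only the numbers cited in the testimony above: of everything the system flagged, only a tiny fraction turned out to be actual fraud.

```python
# Back-of-the-envelope check on the RMV numbers cited above.
# Assumes the figures from the testimony: 260,000 applications flagged
# as suspicious, of which 497 were found to be fraudulent.
flagged = 260_000
actually_fraudulent = 497

share_real_fraud = actually_fraudulent / flagged   # share of flags that were real fraud
share_false_alarms = 1 - share_real_fraud          # share of flags that were false alarms

print(f"Flags that were real fraud:   {share_real_fraud:.2%}")    # ~0.19%
print(f"Flags that were false alarms: {share_false_alarms:.2%}")  # ~99.81%
```

In other words, roughly 998 out of every 1,000 people flagged by the screen had done nothing wrong, yet still had their applications routed to the State Police for review.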

The private companies that supply databases for FRT algorithms are not so upstanding either. Clearview AI has been implicated in multiple scandals, from providing its database and FRT tools to ICE, to scraping unwitting social media users’ photos to populate that database. BuzzFeed recently reported that 2,200 law enforcement agencies in the US have used Clearview AI’s facial recognition technology. Even more concerning, officers often used it without their superiors’ permission. The NYPD alone conducted 5,100 searches with Clearview AI through the company’s trial program.

Nineteen municipalities have adopted ordinances restricting facial recognition technology, but these do not always prevent the use of FRT. The San Francisco Police Department was sued for its alleged use of a network of 400 private surveillance cameras to spy on protesters in 2020, despite an ordinance banning the use of facial surveillance. Boston’s facial recognition ban contains a loophole allowing the use of databases, programs, and technology provided by another government entity.

This is why we need a federal moratorium on the use of facial recognition and other biometric technology. A moratorium will allow legislators to consider proper regulation of our government’s use of a technology that has enabled the wholesale roundup of Uyghurs in China and contributed to the wrongful arrest of at least three innocent Black men here in the US. Our freedom is threatened by any surveillance technology that allows us to be tracked wherever we go, and FRT is especially dangerous in this regard. You can send a letter to your lawmakers telling them to support a federal moratorium on FRT here.

Learn more about FRT here.

A possible federal ban on warrantless StingRay use?

We’ve got promising news on the Fourth Amendment front: last week, a federal bill regulating the use of StingRays (otherwise known as cell-site simulators) was introduced in Congress by Representative Ted Lieu and Senator Ron Wyden, joined by Senator Steve Daines and Representative Tom McClintock. We’ve been waiting patiently for this bill, as it effectively bans law enforcement from using StingRays without the express permission of a judge.

Warrants can be a sticky subject when it comes to StingRays: law enforcement often doesn’t like to obtain express permission from courts to use this highly invasive surveillance tech, especially if obtaining a warrant would reveal too much about how the technology works. Federal law enforcement will often push to dismiss cases if it becomes clear that proceeding would reveal specifics about how StingRays are used and operate. We’ve seen the DOJ coaching local police departments on how to keep cell-site simulator capabilities secret from the public (and judges!), and defendants often aren’t told that a StingRay was used to collect evidence in their cases. Most commonly, law enforcement seeking to use a StingRay obtains a “pen register” or “trap and trace” order, which does not require probable cause, obscuring the very real and very harmful effects of this surveillance tech, including its ability to disrupt calls to emergency services and sweep up bystanders’ information. The result is judges approving StingRay use without a full appreciation of what will be searched or seized.

Obviously, this raises a whole host of constitutional problems, from violations of the Fourth Amendment to violations of the Sixth Amendment, which guarantees certain trial rights. Please tell your congressperson to support the Cell-Site Simulator Act of 2021. You can write them a letter here.

You can also learn more about StingRay surveillance by reading our explainer here.