ShotSpotter

What is ShotSpotter?

ShotSpotter is an Acoustic Gunshot Detection System (AGDS) used by law enforcement to detect and report the location of gunfire. Its microphones and acoustic sensors are generally placed on utility poles and buildings, where they register and record a specific “acoustic signature” whenever a short, loud, explosive sound is detected. Audio data is then sent to ShotSpotter’s “acoustic analysts,” who are responsible for verification and for providing supplemental information when requested. The audio data is triangulated to a location and time, enabling local law enforcement to respond. ShotSpotter installs 20-25 audio detectors per square mile at a cost of $65,000 to $90,000.
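ShotSpotter’s localization algorithms are proprietary, but the underlying idea, multilateration from time differences of arrival (TDOA) across several sensors, can be sketched. The sensor layout, coordinates, and brute-force grid-search solver below are illustrative assumptions, not ShotSpotter’s actual method:

```python
import math

# Hypothetical 4-sensor layout over a 400 m x 400 m area (coordinates in metres).
SENSORS = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0), (400.0, 400.0)]
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def arrival_times(source, t0=0.0):
    """Time at which each sensor hears a bang emitted at `source` at time t0."""
    return [t0 + math.dist(source, s) / SPEED_OF_SOUND for s in SENSORS]

def locate(times, step=1):
    """Grid search for the point whose predicted time differences of
    arrival best match the observed ones. Subtracting the first sensor's
    arrival time cancels the unknown emission time t0."""
    observed = [t - times[0] for t in times]
    best, best_err = None, float("inf")
    for xi in range(0, 401, step):
        for yi in range(0, 401, step):
            p = (float(xi), float(yi))
            ref = math.dist(p, SENSORS[0]) / SPEED_OF_SOUND
            err = sum(
                (math.dist(p, s) / SPEED_OF_SOUND - ref - o) ** 2
                for s, o in zip(SENSORS, observed)
            )
            if err < best_err:
                best, best_err = p, err
    return best

# A simulated shot at (123, 287) is recovered from arrival times alone.
print(locate(arrival_times((123.0, 287.0))))  # -> (123.0, 287.0)
```

Real systems solve the same geometry with far more sophistication (and must first decide whether the impulse was a gunshot at all), but the sketch shows why 20-25 sensors per square mile are needed: each location fix requires several sensors to hear the same sound.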

ShotSpotter poses significant Fourth Amendment, privacy, and equity issues. In the course of monitoring for gunshot-like sounds, ShotSpotter sensors continuously record audio, usually in high-minority neighborhoods. Without a warrant or any individualized probable cause, these sensors inevitably surveil many innocent people going about their daily activities.

ShotSpotter is used in more than 130 U.S. cities and towns. Recently, in Detroit, city officials approved a $7 million expansion of ShotSpotter, contravening a locally adopted surveillance ordinance requiring public input before the adoption or expansion of any surveillance technology. The Detroit Justice Center and the Sugar Law Center for Economic and Social Justice are suing the City.

Many cities continue to expand their contracts using funds available from the 2021 American Rescue Plan Act. Through federal funding, ShotSpotter has secured contracts worth $3 million with Albuquerque, New Mexico; $2 million with Macon-Bibb County, Georgia; $1.2 million with New Haven, Connecticut; and $171,000 with Syracuse, New York. Chicago’s and New York City’s ShotSpotter contracts account for a large portion of the company’s revenue: $33 million, and $22 million for up to 5 years with possible extension, respectively. The Urban Area Security Initiative (UASI), a DHS grant program, also funds ShotSpotter installation. At the same time, several cities have found that ShotSpotter was ineffective at increasing arrests of shooters or reducing murders, and have therefore not renewed their contracts, including Charlotte, NC; Dayton, OH; and Fall River, MA.

ShotSpotter Doesn’t Work

ShotSpotter’s ineffectiveness at reducing murders and producing arrests stems partly from its inaccuracy.

ShotSpotter publicly boasts a 97% accuracy rate and a 0.5% false-positive rate. Cities with ShotSpotter contracts do not own the data they produce, nor has ShotSpotter made that data publicly available for an independent audit, casting doubt on the reliability of those figures. The accuracy rate that ShotSpotter advertises derives from two sources: an in-house study conducted in Pittsburgh, Pennsylvania, and voluntary police reporting. The study conducted by ShotSpotter tested its sensors in a controlled environment that scarcely resembles the communities where the system is typically deployed. The study fails to consider loud noises (fireworks, construction vehicles, etc.) or environmental factors (e.g., harsh weather) that result in false positive reports. Voluntary police reporting on the product is a poor metric for performance: ShotSpotter regularly contacts the police departments it contracts with to coach them on public relations strategy, so customers who report on its efficacy have a strong incentive to provide positive reviews.

According to a ten-year longitudinal study on automated gunshot detection systems in St. Louis, Missouri, ShotSpotter did not deter violent gun crime, did not reduce police response times, and rarely produced evidence helpful to uncovering or prosecuting crimes. Furthermore, citizen reports of gunfire were seven times more likely to result in the recovery of evidence of a gunshot. ShotSpotter also had the deleterious effect of reducing the number of higher-quality citizen-initiated reports by 25%. At the same time, it more than doubled the overall number of reports, meaning it produced more false reports overall. The excess, unnecessary police responses came at a substantial expense to the city and diverted police from their other responsibilities.

ShotSpotter Wastes Time and Money

A sketch of ShotSpotter’s wasteful costs is illustrative. The MacArthur Justice Center calculated that there were 87 unfounded ShotSpotter deployments by Chicago PD per day (31,640 in total) from April 2021 to April 2022. The best available study of response times, conducted in St. Louis, MO, found no statistically significant difference in how long it took officers to arrive at the location of the “gunshot,” whether it was signaled by an AGDS alert or a human phone report. However, each AGDS call took an average of 26.13 minutes, including the investigation.

Generally, dispatch guidelines assign calls involving firearms and violence a high priority, so it is reasonable to expect that at least two officers would be dispatched to each alert, and often more. Finally, Illinois public records indicate that the median base salary for law enforcement officers is $87,006, or $41.63 per hour.

ShotSpotter’s false reports thus cost the City of Chicago an estimated $6,307.50 per day, or $2,245,470 a year, in wages. This figure is conservative: more than two units are likely deployed for many ShotSpotter reports, and many officers earn more than the median base salary. Nonetheless, our estimate discloses previously unconsidered costs of ShotSpotter’s inaccuracy. The roughly $2.25 million in wasted public funds could instead be diverted to measures that provably improve community safety.
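The estimate above can be reproduced with straightforward arithmetic. The figures come from the sources cited in this section; the assumption of four responding officers per alert (two two-officer units) is ours, chosen because it reproduces the reported daily figure:

```python
# Back-of-the-envelope reconstruction of the Chicago wage estimate.
UNFOUNDED_ALERTS_PER_DAY = 87   # MacArthur Justice Center, April 2021 - April 2022
MINUTES_PER_CALL = 26.13        # St. Louis study: average time per AGDS call
OFFICERS_PER_ALERT = 4          # assumed: two two-officer units per dispatch
HOURLY_WAGE = 41.63             # Illinois median base salary for officers

daily = (UNFOUNDED_ALERTS_PER_DAY * (MINUTES_PER_CALL / 60)
         * OFFICERS_PER_ALERT * HOURLY_WAGE)
print(f"Daily wage cost of unfounded alerts: ${daily:,.2f}")  # about $6,309
print(f"Annualized (x365): ${daily * 365:,.0f}")
```

The small gaps between these outputs and the figures quoted in the text reflect intermediate rounding in the original estimate; the order of magnitude, millions of dollars a year in officer time, is unaffected.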

The MacArthur Justice Center reviewed 21 months of ShotSpotter deployments in Chicago and found that 89% of them turned up no gun-related crime and 86% no crime at all. The City of Chicago Office of Inspector General (OIG) conducted a similar study on ShotSpotter and found comparable results: CPD responses to ShotSpotter alerts rarely produce documented evidence of a gun-related crime or recovery of a firearm. The St. Louis study concluded that for every 100 AGDS calls for service, there were 0.9 founded crime incidents, which indicates that “alerting police of potential activity is not enough, human intelligence supporting that information is critical to turn a notice of potential activity into something police can act upon.”

Most telling is a study conducted from 1999 to 2016 that assessed ShotSpotter’s impact on rates of firearm-related crime. Even when controlling for county- and state-level demographics, as well as state firearm laws, ShotSpotter was found to have made no difference in firearm homicides, murder arrests, or weapons arrests across the 68 metropolitan counties in its sample.

Advocates for ShotSpotter tend to focus on short-term metrics of its success: Do police like it? Does it shorten police response time to a callout? We believe that elected officials should instead focus on questions about longer-term consequences, such as these:

  • How many fewer gun-related crimes are recorded as occurring in ShotSpotter’s coverage area, relative to the area not covered by ShotSpotter?
  • How many incremental arrests occur in ShotSpotter’s coverage area, relative to the area not covered by ShotSpotter, for gun-related crimes?
  • Of those arrests, how many result in convictions?
  • What proportion of ShotSpotter alerts result in confirmation of gunfire?
  • What police resources were expended to respond to false alerts, and how could those resources be more effectively employed to reduce gun violence?

ShotSpotter Captures Human Conversation

ShotSpotter claims that its audio sensors are only triggered by loud and abrupt noises that are or resemble gunshots. However, this is misleading. Even when an audio snippet contains a gunshot-like sound, it can also capture human voices. In a California case, People v. Johnson, prosecutors introduced as evidence voice audio recordings captured by ShotSpotter devices. In a criminal trial in Massachusetts, Commonwealth v. Denison, ShotSpotter voice recordings were ruled inadmissible under the Massachusetts Wiretap Act. Since ShotSpotter does not publicly disclose the location of its sensors, it is difficult to know whether speech will remain private, even in your own home.

Data collected by ShotSpotter devices are stored for 30-48 hours. Two concerns arise from this practice. First, ShotSpotter has reserved the right to own and sell these data with little oversight or regulation, including selling them to third parties and using them to develop biometric surveillance devices. Second, these data are sensitive, personal, and collected without the target’s knowledge. Data stored for long periods are vulnerable to breaches, further risking your privacy. The sensitive and private nature of the information ShotSpotter collects suggests that police should need a warrant to access it.

ShotSpotter’s Disproportionate Surveillance of Black and Brown Communities

ShotSpotter is predominantly deployed in Black and Brown communities. In Chicago, the 12 districts with ShotSpotter installations are those with the highest populations of Black and Latinx residents. Analysis from the Surveillance Technology Oversight Project (STOP) shows that of the 31 NYPD precincts that deployed ShotSpotter sensors in 2018, 70% were majority Black or Latinx. The same racist pattern was found in Kansas City, Missouri; Cleveland, Ohio; Atlanta, Georgia; and Boston, Massachusetts.

ShotSpotter relies on historical and geographic crime data to determine the placement of its sensors, prioritizing areas with higher recorded incidences of gunfire. The problem with crime data is that it reflects, and is biased by, the over-policing of communities of color. After a ShotSpotter installation, the police department will receive more alerts of gunshot-like sounds from within the coverage area than outside it, leading law enforcement to engage in proportionally more stops and pat-downs within the coverage area.

The toxic mix of inaccurate technology and racial bias in ShotSpotter’s surveillance apparatus led to the false arrest of 65-year-old Michael Williams. According to Vice’s review of court documents from the Williams case and other trials in Chicago and New York State, evidence suggests that law enforcement frequently requests that ShotSpotter’s analysts modify alerts to support their narratives of events. According to confidential ShotSpotter documents obtained by The Associated Press, acoustic analysts are given broad discretion to reclassify sound data as a gunshot. It is estimated that these “reversals” happened 10% of the time as of 2021. The accuracy of reversals is as questionable as that of the overall reports, since analysts are given only one minute to classify audio.

The Chicago OIG found that ShotSpotter technology directly impacts how law enforcement officers treat individuals present in areas with regular ShotSpotter alerts. A false report primes police to believe they are entering an actively dangerous situation with guns involved. Even if police are really responding to a backfiring car or other loud noise, they expect a gun, and must assume that people in the vicinity are armed. This can create a dangerous situation for people in the area, who may not even realize that ShotSpotter has reported a shooting to the police. When this happens, police have shot unarmed citizens who simply happened to be in the area when responding to calls about gunshots. The death of 13-year-old Adam Toledo may have resulted from this hyper-vigilance.

How is Restore the Fourth Fighting Back Against ShotSpotter?

Restore the Fourth’s work to pass surveillance oversight ordinances, alongside partners like the ACLU and EFF, has deterred installations and expansions of ShotSpotter and other surveillance technologies undertaken without public hearings and elected officials’ consent. We have played an active role in the passage of ordinances in Cambridge, MA; New York City, NY; Oakland, CA; Palo Alto, CA; Santa Clara County, CA; Somerville, MA; and Boston, MA. Our Minnesota chapter is campaigning to pass one in Minneapolis. If you’re looking to start or join a campaign to pass an ordinance in your town, city, or county, please sign this petition and get in touch.