Support Unraveled
We are journalist-owned and 100% supported by readers like you.
Your support funds our investigative and on-the-ground reporting. Thank you for uplifting independent journalism!
The data includes over 12,000 events where police response times were zero or negative.
by People’s Fabric May 16, 2024
Chicago Police Department (CPD) recently sent a report of ShotSpotter statistics to City Council members that is skewed by inaccurate and missing response time data, a People’s Fabric analysis confirms. Other data behind the report sheds new light on how often residents call 911 about shootings—and it’s far more often than claimed by SoundThinking, the vendor behind ShotSpotter.
The report, obtained by South Side Weekly, was intended to help City Council members bolster their efforts to retain ShotSpotter. But experts interviewed by the Weekly cast doubt on the accuracy of the data and the use of misleading statistics.
People’s Fabric obtained the raw data used to generate the report via Freedom of Information Act request and has confirmed many of the suspicions voiced by those experts.
When CPD provided the data, they included a note stating an error had been discovered in the report they had provided to City Council members. A “typographical error” in their query omitted a full year’s worth of data used to calculate response times to ShotSpotter alerts and calls reporting shots fired.
It’s unclear whether CPD has provided a corrected report to City Council members, or if similar errors apply to other portions of the report.
In CPD’s revised analysis, all of the average response times increased, but the relative differences between them were similar. The averages inaccurately suggest officers take two minutes longer to respond to ShotSpotter alerts when they are accompanied by a 911 call.
CPD also noted that response times are self-reported by responding officers and data is skewed by officers who delayed marking themselves as “on-scene.”
Despite acknowledging the data is inaccurate in such a way that it would skew averages, CPD provided only averages in their report.
As Robert Vargas, the deputy dean of social sciences and director of the Justice Project at the University of Chicago, told South Side Weekly, “The average can be misleading. It’s really puzzling to me why anyone would put so much weight on a single report based on a single statistic.”
In the most extreme case, one officer marked “on-scene” 16 hours after the event. More than 4,200 events (2.4%), all involving reported gunfire, show officers taking over an hour to mark their arrival.
The data also includes calls with negative response times and more than 12,000 (7%) calls that were responded to in zero seconds—either indicating bad data or the possession of teleportation and time travel devices.
People’s Fabric was able to replicate CPD’s average statistics exactly when including all of the obviously erroneous data.
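A minimal sketch of this kind of check, using hypothetical records (the field name `response_seconds` and all values below are illustrative, not CPD’s actual schema or data):

```python
# Hypothetical records mimicking response-time data obtained via FOIA;
# field names and values are illustrative, not CPD's actual schema.
records = [
    {"event": "A", "response_seconds": 290},
    {"event": "B", "response_seconds": 0},      # zero seconds: clearly bad data
    {"event": "C", "response_seconds": -45},    # negative: clearly bad data
    {"event": "D", "response_seconds": 57600},  # 16 hours: delayed "on-scene" mark
    {"event": "E", "response_seconds": 310},
]

# Average over everything, as the report's statistics effectively did.
naive_avg = sum(r["response_seconds"] for r in records) / len(records)

# Drop records that are physically impossible (zero or negative) or
# implausibly long (here, over an hour) before averaging.
plausible = [r for r in records if 0 < r["response_seconds"] <= 3600]
cleaned_avg = sum(r["response_seconds"] for r in plausible) / len(plausible)

print(round(naive_avg))    # heavily skewed by the bad rows
print(round(cleaned_avg))  # closer to a typical response
```

With these made-up numbers, the naive average lands in the thousands of seconds while the cleaned average sits near five minutes, which is the same mechanism by which the erroneous records inflate the report’s figures.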
For reasons that are unclear, officers delayed marking themselves on-scene more often when an event had an associated call than when they were responding to an alert alone. While all of the averages were skewed by bad data, this had the effect of skewing the average response time for “ShotSpotter with call” events more than for ShotSpotter alone.
Just as analyses of household income tend to focus on the median—because the mean would be skewed by people like Jeff Bezos—looking at median response times provides a better point of comparison.
The median response time when a ShotSpotter alert is accompanied by a call is 4.53 minutes, effectively the same as for ShotSpotter events with no call, at 4.83 minutes. The report’s suggestion that police respond more slowly when people call is an artifact of skewed data and poor statistical analysis, not reality.
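The mean-versus-median effect can be shown with a short sketch on hypothetical response times (the figures below are illustrative, not CPD’s data):

```python
import statistics

# Hypothetical response times in minutes. Most responses cluster around
# 4-5 minutes, but a few erroneous records (zero-second responses,
# multi-hour delays in marking "on-scene") are mixed in, as in the raw data.
response_times = [4.2, 4.5, 4.8, 5.0, 4.6, 0.0, 0.0, 72.0, 960.0]

mean_minutes = statistics.mean(response_times)
median_minutes = statistics.median(response_times)

print(f"mean:   {mean_minutes:.2f} min")   # dragged far upward by outliers
print(f"median: {median_minutes:.2f} min") # stays near the typical response
```

Here the handful of bad records drags the mean above 100 minutes while the median stays at 4.6 minutes, which is why the median is the more robust statistic for data like this.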
The report also included details about arrests and evidence collection associated with ShotSpotter, though a recent study has shown that, in aggregate, ShotSpotter does not lead to more arrests or reduce gun violence.
CPD did not respond to our request to speak with anyone involved in the production of the report.
To conduct a deeper analysis, People’s Fabric combined this new dataset with another dataset previously obtained from the Office of Emergency Management & Communications (OEMC).
The OEMC dataset spans 2020 through 2023 and shows how officers closed out each call. This includes identifying whether it was a homicide, aggravated battery with a firearm (someone was shot), reckless discharge, or whether they found nothing.
CPD provided information about 105,847 ShotSpotter alerts, 44,727 (42%) of which they categorized as having corresponding calls.
The number of related 911 calls significantly increases when looking at those that involved confirmed gunfire or wounded victims.
When a person was shot and the gunfire was detected by ShotSpotter, 98.2% of those alerts were identified as having corresponding 911 calls. Compare this to SoundThinking CEO Ralph Clark’s claim that “80% to 90% of gunfire goes unreported.”
Conversely, only 29% of ShotSpotter alerts closed as “Miscellaneous” had a corresponding call. “Miscellaneous” is typically how events are closed when officers find no evidence of gunfire at a ShotSpotter location.
A whopping 93% of ShotSpotter alerts that had no corresponding call are classified as “Miscellaneous.” SoundThinking says this is evidence that people do not call about gunfire. An alternative interpretation of the data is that a significant number of ShotSpotter alerts may not be gunfire.
A previous study found 83% of all ShotSpotter alerts, regardless of whether there was a corresponding call, ended in no evidence of a crime. The Chicago Office of the Inspector General similarly found that over 90% of ShotSpotter alerts result in no evidence of a gun crime. This works out to as many as 15,000 to 20,000 high-priority dead-end runs by police every year.
The data strongly suggests that when someone has been shot, it’s very rare that someone does not call 911. In cases where People’s Fabric has analyzed the detailed OEMC call logs, ShotSpotter alerts involving shooting victims were often accompanied by several calls, sometimes as many as a dozen.
Chicago’s current agreement to retain ShotSpotter expires in November.
On Wednesday, May 22nd, City Council is expected to consider a largely symbolic order demanding that the mayor retain ShotSpotter on a ward-by-ward basis. The order was initially drafted for alderpeople by ShotSpotter’s lobbyists, as reported by the Reader.