Don’t Be Scared—They Can Hear You!
Police are violence workers. The license to violence, like the license to commit rape, daylight abduction, and secret surveillance, has been granted by the state in exclusive perpetuity to the police, who are expected to use these powers to bring people into the justice system. Normal people share none of these powers. We aren’t allowed to beat, electrify, and kidnap the police when we catch them speeding or abusing their spouses, and we aren’t allowed to strip them down and penetrate their cavities with our fingers. We don’t get to confiscate large sums of cash from anyone we suspect of misusing public funds, and we’re never allowed to shoot them in the back if they try to run.
We aren’t even allowed to install up to two dozen listening devices per square mile to record the ambient auditory output of an entire city, bill the city for the privilege of doing so, and bill them yet again to actually see the data. That particular power has been literally licensed away to a Bay Area company called ShotSpotter (SST), a surveillance subscription service helmed by a fascist-sympathizing CEO named Ralph Clark.
ShotSpotter markets itself as a gunshot-detection technology capable of triangulating any loud noise through a network of sensors hidden about 30 feet in the air. If the sensors detect a gunshot, they push a street address to the local cops via a phone app called Respond. ShotSpotter touts sub-20-second response times, 80% accuracy, and big-name clients like the Chicago Police Department, who were disappearing and torturing people of color at Homan Square for decades while the earthquake-detection technology that became the ShotSpotter system was under development in California.
Every year, Chicagoans, via the CPD, pay ShotSpotter over $60,000 per square mile for the privilege of being surveilled. And then they pay again, as much as $50,000 per city, for access to the actual data. Double-dipping in the public coffers hasn’t helped ShotSpotter achieve profitability yet, but that hasn’t muted Wall Street’s passion for the idea: the company’s value has increased over 300% since an IPO in 2017, and the reach of their network, though not impervious to pushback, has continued to spread. As of today, they’re actively listening in on hundreds of square miles in over 90 cities around the world, nine university campuses, and one experimental freeway system. Sifting through the ambient auditory property of millions of U.S. citizens, ShotSpotter gives some of their findings to the police, then locks the rest away in the name of maintaining “trade secrets.” These trade secrets, however, can be had for $50,000.
$50,000 also happens to be the exact value of the referral discount that cities receive on ShotSpotter when they hire a company lackey in a police leadership position, as the people of my town, South Bend, Indiana, learned when the city bequeathed them a new chief by the name of Ron Teachman.
“Mayor Pete” & The Revolving Doors of South Bend, Indiana
In late 2016, I moved from small-town Mississippi to South Bend, a recovering Rust Belt city nestled along the beautiful St. Joe river about two hours east of Chicago, to study creative writing. The man who showed me my one-bedroom unit brought up ShotSpotter in our first conversation, evangelizing the benefits of the incredible technology operating silently right outside my front door. South Bend was getting safer every day, he assured me, and the police were on the bleeding edge of progress. When I brought up ShotSpotter to the person who actually sold me the unit, though, they laughed and said, “That scam? Yeah, that’s bullshit. Don’t buy into that.”
Whether selling point or scam, ShotSpotter might never have come to South Bend if it hadn’t been for Pete Buttigieg, the 37-year-old mayor now nationally known as “Mayor Pete.” In 2012, shortly after starting his first term, Buttigieg learned that Chief of Police Darryl Boykins was under federal investigation for wiretapping his own department (Boykins, the first and only black police chief, was allegedly recording racist remarks made against him). Buttigieg demanded Boykins’s resignation, a unilateral move that drew strong criticism from the community and which, in retrospect, he calls his “first serious mistake as mayor.” Buttigieg weathered the controversy and brought in a white Bay Stater named Ron Teachman to replace Boykins. Teachman, the former Chief of Police of New Bedford, pitched South Bend a more “academic and high-tech approach to policing.”
By “academic and high-tech” policing, Teachman—coming in hot on the heels of a serious surveillance scandal—apparently meant a direct escalation of the SBPD’s surveillance capability. In addition to spearheading the purchase of automatic license-plate readers (later reported to be less than 1% accurate), Teachman sold the city on the benefits of ShotSpotter—a product that, like a traveling salesman, he’d just sold and implemented in Massachusetts.
The taxpayers of South Bend put up a down payment of $300,000—after Teachman’s referral discount, presumably—along with a recurring $150,000 yearly subscription for the privilege of entangling about four square miles of the city in ShotSpotter’s acoustic mesh. There was no community engagement, no back-and-forth between the residents of those four square miles and ShotSpotter (let alone the citizenry and the police). The decision was made, and Teachman convinced the media of the system’s benefits retroactively. “I was pleased to find more flexible options now available from SST,” he said, “and the new ShotSpotter Flex made it easier to enable and manage the system with fewer resources.”
If that sounds like direct-to-consumer marketing, that’s because it was. Two years into the job, and a year before I arrived in South Bend, Teachman determined that his work serving the community was done: he resigned as Chief of Police and immediately announced his intention to take a directorial position at ShotSpotter. At the press conference, with Pete Buttigieg glad-handing and smiling beside him, Teachman assured the public that there had been no “personal gain historically” from his relationship with the company, and then reminded us of the generous $50,000 referral discount he’d scored for the town.
Bad In Theory, Worse In Practice
If Director Teachman personally gained—as it’s quite evident that he did—from the privatization of acoustic surveillance, then he may have been the only person in South Bend to do so. No independent studies have conclusively proven that ShotSpotter even works, let alone indicated that it improves communities by any metric. CEO Ralph Clark likes to say that it’s a tool to “build trust,” but the few details that ShotSpotter makes public about their tech have already belied even the most modest of their claims.
A former analyst, for example, testified under oath that the technology’s accuracy statistics were put together by sales and marketing teams, not engineers, in order to “start a dialogue” with customers. “We have to tell them something,” they said. And academic researchers trying to verify the technology have been stonewalled by Clark, who keeps a close lid on his secrets. Still, the data has begun to trickle in from different cities around the country, often with contradictory results.
During one 16-month trial period in San Francisco, the police responded to 3,485 alerts, flagged 1,412 of them as unfounded, and ultimately made a grand total of 4 arrests. The city of San Antonio, Texas canceled their subscription after a similar trial period in 2017, during which time they confiscated 7 weapons and also made 4 arrests—at a total cost of about $546,000, or $136,500 per arrest.
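The per-arrest arithmetic above is worth making explicit, since it’s the kind of figure cities rarely compute before signing. A quick sketch using only the San Antonio numbers reported above:

```python
# Cost-per-outcome arithmetic for the San Antonio trial figures cited above.
total_cost = 546_000  # approximate total trial cost, USD
arrests = 4
weapons_confiscated = 7

cost_per_arrest = total_cost / arrests
cost_per_weapon = total_cost / weapons_confiscated

print(f"${cost_per_arrest:,.0f} per arrest")   # $136,500 per arrest
print(f"${cost_per_weapon:,.0f} per weapon")   # $78,000 per weapon confiscated
```

Even counting every confiscated weapon as a success, the trial cost the city $78,000 apiece.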
Clark, who likes to use vivid terms like super criminal and trigger puller when talking about his product, conveniently insists that any statistical criticisms of ShotSpotter are missing the point. “Sometimes the things that are most impactful are the most difficult to measure,” he says, pointing to cities like San Diego, already a major organ of the military-industrial complex, which recently renewed its ShotSpotter system at a cost of about $250,000 a year, paid for entirely by funds acquired through civil asset forfeiture.
The Chicago Police Department, by far ShotSpotter’s single largest customer, is also still on board with the product: they just signed a three-year, $23 million contract to surveil over 100 square miles, or almost half the area of Chicago. This contract stipulates that ShotSpotter will now augment the city’s existing network of approximately 30,000 CCTV cameras with around 10,000 additional ShotSpotter cameras and sensors, rolling the entire system into ShotSpotter’s flagship product, Flex. The CPD points to a 60% reduction in gun violence in the famous South Side district of Englewood, a phenomenon that ShotSpotter is quick to take credit for.
So, does it work? No one knows, exactly—certainly not the public. If the sudden ubiquitous installation of acoustic surveillance technology like ShotSpotter truly is in the public interest, then the public should indisputably have access to evidence of the results. If such a system were designed with actual democratic consensus in mind, all of the software involved would be open-source as a matter of course. Most crucially, the public would have to consent to the kinds of data harvested and the methods by which they are captured, filtered, and interpreted. And the system’s data would be made available to independent researchers and to public scrutiny. Ideally, a populace would reject the idea of dragnet surveillance of their community altogether, and the government would refrain accordingly. This, of course, is a far cry from the current reality.
Demanding that the public continue to accept constant unilateral surveillance of their airspace (under the aegis of a private company, no less) isn’t just the wrong way to “build trust between communities.” In fact, it utterly precludes the possibility of trust. A relationship predicated on distrust, like a partner who demands to know all of your passwords but refuses to share even their own username, will always be toxic. Just how toxic the real societal relation becomes will depend on how ShotSpotter actually works.
Wait—How Does It Work?
One of Clark’s favorite answers to whether ShotSpotter constitutes an invasion of privacy is that the technology “just isn’t that good” at eavesdropping; there’s no reason to be afraid of a bunch of microphones in the sky when we all carry much more accurate microphones in our pockets.
But nothing about ShotSpotter’s technology fundamentally limits its ability to facilitate conversation-level surveillance. We know that the system’s technological capabilities broadly exceed its stated purpose of gunshot detection. Each ShotSpotter sensor package contains a microphone, a data chip, a SIM card, and sometimes a camera. ShotSpotter has claimed that they’re usually mounted 30-40 feet above street level, but in interviews, Clark has admitted that they can be found lower—as low as 20 feet, in fact. When three or more sensors pick up a loud noise (in a decibel range “louder than a human voice,” according to the company) the data is sent to a central server, where a human analyzes the appearance of the waveforms to determine if they resemble those of a gunshot.
The sensors are always on, storing data for an unspecified length of time. When an analyst “verifies” a gunshot, the recording is sent via app to local police’s cell phones—including audio two seconds before and four seconds after the gunshot.
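The localization step the company describes is standard acoustic multilateration: with three or more sensors at known positions, the differences in a sound’s arrival times pin down its origin. A minimal sketch of the principle, with invented sensor positions, an idealized speed of sound, and no measurement noise (ShotSpotter’s actual layout and algorithms are not public):

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second, roughly, at 20 °C

# Hypothetical sensor positions in meters on a 400 m block.
sensors = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0), (400.0, 400.0)]

def arrival_times(source, t0=0.0):
    """Time each sensor would hear a bang emitted at `source` at time t0."""
    return [t0 + math.dist(source, s) / SPEED_OF_SOUND for s in sensors]

def locate(times, step=2.0):
    """Grid-search the point whose inter-sensor time differences best match.

    Using differences relative to sensor 0 makes the unknown emission
    time t0 cancel out -- the time-difference-of-arrival (TDOA) trick.
    """
    observed = [t - times[0] for t in times]
    best, best_err = None, float("inf")
    x = 0.0
    while x <= 400.0:
        y = 0.0
        while y <= 400.0:
            predicted = arrival_times((x, y))
            deltas = [t - predicted[0] for t in predicted]
            err = sum((a - b) ** 2 for a, b in zip(deltas, observed))
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

times = arrival_times((123.0, 321.0))
print(locate(times))  # recovers a point within a grid step of (123, 321)
```

The math doesn’t care what made the sound. Anything loud enough to reach three sensors—a gunshot, a firework, a shouted argument—can be located the same way, which is why the hardware’s stated purpose and its actual capability are two different questions.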
ShotSpotter claims that the audio data it captures is too ambient to present a privacy risk, and that it doesn’t capture individual conversations—or at the very least, that it isn’t trying to. This doesn’t quite square with the fact that so many of its sensors are equipped with cameras that, according to Fox News in Chicago, allow the system to “potentially capture entire crimes on video with crystal clear quality.”
Are the sensors forty feet in the air, or twenty? Are they capable of only fuzzy audio of gunshots, or can they capture crystal-clear video, too? (Because Illinois is a two-party consent state for recording, the system’s legality is immediately called into question.) And when it comes to the total scope of the system’s listening powers, the broader hypersurveillance network that the sensors ultimately feed, how much do details like these matter?
All of the data captured on ShotSpotter’s sensors is transmitted to their analysts over wireless 3G and LTE via major contracts with both AT&T and Verizon, the number one and two carriers in the country, respectively—connecting the total ambient output of our communities directly to the national telecom industry. The arrangement with Verizon has been particularly propitious for ShotSpotter, which recently signed onto Verizon’s “Smart Streets” initiative to embed ShotSpotter sensors inside Internet-of-things-enabled streetlamps. Boston, an early adopter of the program, hopes to capture more “aggregated data” to help understand the “hazards on our roads.”
It’s worth noting here Verizon’s history with property like “aggregated data,” and its supposed distinction from something like “personalized data”—it was the telecom company’s metadata archives that the Foreign Intelligence Surveillance Court compelled Verizon to turn over to the National Security Agency, in orders eventually leaked by Edward Snowden in 2013. After 9/11, the President’s Surveillance Program, under extreme secrecy, made the mass harvest of private information, otherwise known as aggregated data, part of the national security apparatus. When its legality was questioned by other members of the White House, the NSA shifted to taking its orders from a secret court, FISC, which is known to rubber-stamp over 99% of all government requests.
Telephony metadata—all of the information about a call except its actual contents, often called “anonymized”—has recently come under fire for being much less anonymous than Silicon Valley and the NSA have led us to believe. At the same time, the technology involved long ago exceeded the theoretical limits of metadata collection, as programs like PRISM and Upstream physically redirect the flow of every fiber-optic cable in the US. Similar programs in the UK, characterized by the European Court of Human Rights as employing a pattern of “secret hearings, vague legal safeguards, and broadening reach,” have already been determined to violate human rights standards. Similar domestic legal challenges have been markedly less successful, and whistleblowers of surveillance abuse are still routinely prosecuted.
And corporate power has a way of consolidating further corporate power. ShotSpotter’s tumorous attachment to the mesh of hypersurveillance is only possible because the public has paid for them to get this far. Now that Wall Street’s involved, more money than ever is flowing through Clark’s hands. Not content with the theoretical limitations of microphones on telephone poles, ShotSpotter has moved into a new, even more controversial product: the power to, ostensibly, see the future—and to sell that future to the cops.
Bias In, Bias Out: HunchLab and the Rise of Predictive Policing
When ShotSpotter went public in the summer of 2017, they raised $32 million on a valuation of $118 million (up to about $500 million today). A few months later, flush with capital, they announced their next move: the acquisition of a Pennsylvania-based start-up called HunchLab.
Already in use by ShotSpotter’s best customer, the CPD, HunchLab maps communities into a network of 250-square-meter “cells,” subjecting each cell to an exhaustive list of historical, current, and so-called “future indicators” like seasonality, time of day, day of week, and even “trends in socioeconomic conditions and upcoming events.” The entire past, present, and future history of a 250-square-meter block is processed through an algorithm called an “advanced statistical model,” and the police are arranged on the board accordingly.
These black-box algorithms claim to know not just the exact place and time in which a crime will occur, but even what kinds of crime to expect. Police work is so reactionary both ideologically and literally that profiling-by-software is still being sold as progressive: Ralph Clark calls his new toy the democratization of intelligence, providing every cop with their “own personal virtual crime analyst.”
Virtual analysts, or virtual crimes? Predictive policing, by definition, profiles communities—or in the parlance of both HunchLab and antiterrorism operations, “cells”—according to crimes that have literally never happened. The CPD, which refers to their intelligence hubs as “mini war rooms,” already keeps a so-called “heat list,” an algorithmically compiled record of potential criminals in the city.
This isn’t something that can be ameliorated by Silicon Valley’s standard iterative improvements. Algorithms will always be biased as long as biased humans are in charge of the inputs, as engineers across Silicon Valley, from Microsoft’s antisemitic Tay to Amazon’s sexist recruitment AI, are having to learn and re-learn. They map onto the expectations of their creators and users, not the world itself. This is especially true in a law enforcement context: these systems model the crimes police are looking for, not crime as it really exists. It’s hard to find an unbiased algorithm in practice. It’s just as hard in theory.
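The feedback loop is easy to demonstrate. If patrols are dispatched wherever past recorded crime is highest, and recorded crime depends on where patrols happen to be, the record diverges from reality on its own. A toy simulation, with entirely invented numbers, of two neighborhoods with identical underlying crime rates:

```python
import random

random.seed(0)

TRUE_RATE = 0.10   # identical underlying incident rate in both neighborhoods
DETECTION = 0.9    # chance a patrolled incident gets recorded
PATROLS = 10       # patrols to allocate each day

# Seed the history with a single extra recorded incident in neighborhood A.
recorded = {"A": 1, "B": 0}

for day in range(1000):
    # "Predictive" allocation: patrol in proportion to past records.
    total = recorded["A"] + recorded["B"]
    patrols_a = round(PATROLS * recorded["A"] / total)
    patrols_b = PATROLS - patrols_a
    for hood, patrols in (("A", patrols_a), ("B", patrols_b)):
        for _ in range(patrols):
            if random.random() < TRUE_RATE and random.random() < DETECTION:
                recorded[hood] += 1

print(recorded)  # records skew overwhelmingly toward A despite equal true rates
```

One stray data point at the start, fed back through “patrol where the data says,” and the model manufactures its own confirmation: neighborhood A accumulates hundreds of recorded incidents while B, identical in reality, records none. Scale the toy up to 250-square-meter cells and decades of biased arrest records, and the “advanced statistical model” is predicting policing, not crime.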
HunchLab hasn’t exactly given us a reason to trust them. They named themselves HunchLab, after all, which is either naïve to the Constitution or abrasively flippant towards it: Terry v. Ohio in 1968 defined a “hunch” as an “inchoate and unparticularized suspicion,” specifically failing to meet even the standards of that oft-abused “reasonable suspicion.” Hunches—like “intuitions, gut feelings and sixth sense”—are totally illegal. Then again, maybe that’s why ShotSpotter changed the name soon after the acquisition: HunchLab is now called ShotSpotter Missions™️.
Keeping Secrets in the Age of Surveillance
“I love talking to police agencies that want to think different, right? We’re on a crusade here.”
Juxtaposing a famous piece of Silicon Valley marketing (Apple’s 1997 Think Different campaign) with the moral ineptitude of the genocidal Crusades is surprisingly appropriate for Clark, who earns about $750,000 a year selling the ambient personal data of millions of people to the cops. It’s even more fitting that Clark deflects to our iPhones when talking about privacy. In a world of StingRays, PRISM, Facebook, and secret courts, ShotSpotter may appear fairly low on the hierarchy of domestic surveillance. But the extreme vertical integration and consolidation of the surveillance state means that the old hierarchies don’t really matter.
ShotSpotter’s microphones are directly connected to the camera networks covering half of Chicago, which in turn connect to the compromised telephony networks of AT&T and Verizon, diffusing the locus of control through the entire communications apparatus of the security state. Surveillance capitalists like Clark expect us to buy into privatized spying specifically because it’s already so entwined with the architecture of control.
This is the reality of hypersurveillance at the hands of capitalism: the collective visual and auditory property of any given individual or community, the entire output of a city, isn’t just for sale to the highest bidder, but has in fact already thoroughly propagated through the mesh that it depends upon, reinforces, and feeds into—from local CCTV, to national LTE, to server farms somewhere beneath the NSA headquarters. And unlike our cell phones, which we can just get rid of (however impractical that may be in modernity), ShotSpotter specifically circumvents individualized means of control; all personal output becomes the purview of “acoustic assessment,” the rights to which you have already paid someone else to sell. Police exist to protect capital, and as long as capital flows through Clark’s personal control, the crusade against America’s most vulnerable communities will continue, enabling police oppression while simultaneously grifting the cops via dysfunctional apps and baldly corrupt salesmen.
For every San Antonio, Texas or Charlotte, North Carolina—who ended their trial subscription last summer—there’s a new city block in Chicago, or a new university campus in Georgia, or a new police chief for hire in Indiana. Always, somewhere in the world, a new fascist. As Clark said in the last ShotSpotter earnings call: “I think, certainly, the new President of Brazil is a little bit more law and order like. So we’re pretty constructive on the long-term of very interesting possibilities in international markets.” ShotSpotter’s viability depends on people like Jair Bolsonaro, who believes “a good criminal is a dead criminal” and has promised to criminalize leftism and purge it from Brazil. We must ask tougher questions of these technologies, and of the people who control them, before letting them extend their tendrils into our neighborhoods.
What could that money buy that actually benefits a community without violating its civil rights? Another after-school program, another food bank, another park? Why are we more comfortable infringing on our 4th Amendment rights than on our 2nd Amendment rights? If gun control is too contentious but ShotSpotter a no-brainer, why does the right to violence supersede the right to privacy and due process? Does that really keep us safe going forward?
And once we have the data, are the police really the best people to trust with it? Is it possible, even reasonable, to trust an organization of armed and prejudiced violence workers based on such an uneven and unilateral distribution of power? If the police departments across the US plan to continue operating with a veneer of trust, then three obvious conditions need to be satisfied when it comes to hiring out hypersurveillance and predictive policing.
First, all companies and technologies involved need to be subject to independent and rigorous audits and studies—both on efficacy and on constitutional permissibility.
Second—and this applies particularly to predictive policing—any such technologies need to be used from the beginning on the police themselves.
Third, all algorithms involved—especially the ones that control the civil standing of both individuals and communities—need to be completely open source, exposing every input and equation to the public before we have to accept the legal consequences of each output.
If these conditions aren’t satisfied—if predictive policing remains closed-source and unilateral in its deployment—then we know that the police are operating in bad faith, confirming the suspicion that they have more to do with protecting the controlled flow of capital than with enforcing public safety.
There exists a surveillance capitalist who believes in a quasi-religious moral crusade to listen in on every city in the world; to record, analyze, and resell, on a recurring subscription, the auditory output of vulnerable populations to an organization most famous for violating human rights, from slavecatching and strike-busting to Rodney King and Homan Square. And that surveillance capitalist is gaining literal ground—35 square miles in just the last earnings quarter.
Meanwhile, in South Bend, Indiana, a mutually convenient tradition continues. Mayor Buttigieg, after two terms of regressive policy and ongoing police brutality scandals, has somehow emerged on the national stage as a progressive, leaving behind a city more impoverished than ever but littered with LimeBikes and microphones. And the new Police Chief Scott Ruszkowski, who got the job after Director Teachman stepped down, appears to be following directly in his predecessor’s footsteps: “After almost 30 years in law enforcement, I’ve yet to find a more profound and proven way to increase community/police relations than ShotSpotter.”
Or so say the ads on the internet. ♦
 In 2016, according to the San Diego Union-Tribune, San Diego County seized $2.7 million from 76 cases, the majority of which never went to court. As with “broken-windows” policing (and like the earthquake-detection tech that led to ShotSpotter), civil asset forfeiture is an invention of the early 1980s that has come under increasing scrutiny in the 21st century.
 According to an investigation by The Verge, other factors include: “population density; census data; the locations of bars, churches, schools, and transportation hubs; schedules for home games—even moon phases.” CPD has the largest and best-funded crime-prediction initiative.
Daniel Uncapher is a Sparks Fellow at Notre Dame, where he received his MFA, and an incoming Ph.D. student in creative writing at the University of Utah. A disabled bisexual from North Mississippi, his work has appeared or is forthcoming in Chicago Quarterly Review, Tin House Online, The Carolina Quarterly, Penn Review, and others.