
August 24, 2021

Senators Question DOJ Funding for AI-Powered Policing Tech


Project: Tracked

The Associated Press examines the power and influence of predictive and surveillance technologies.

By Multiple Authors
Man sits in chair in his home
Michael Williams sits for an interview in his South Side Chicago home on July 27, 2021. Prosecutors used ShotSpotter evidence to build their case against Williams, who spent 11 months behind bars before being released. “I kept trying to figure out, how can they get away with using the technology like that against me?” he asked. “That’s not fair.” Williams was released after nearly a year because of insufficient evidence. Image by Charles Rex Arbogast. United States.

A Democratic senator said the U.S. Justice Department needs to look into whether the algorithm-powered police technologies it funds contribute to racial bias in law enforcement and lead to wrongful arrests.

Sen. Ron Wyden, an Oregon Democrat, was responding to an investigation by The Associated Press published Thursday about the possibility of bias in courtroom evidence produced by an algorithm-powered gunshot detection technology called ShotSpotter. The system, which can be funded by Justice Department grants, is used by law enforcement in more than 110 U.S. communities to detect gunfire and respond to crime scenes faster.

“While there continues to be a national debate on policing in America, it’s become increasingly clear that algorithms and technologies used during investigations, like ShotSpotter, can further racial biases and increase the potential for sending innocent people to prison,” Wyden said.




Chicago prosecutors relied on audio evidence picked up by ShotSpotter sensors to charge 65-year-old Michael Williams with murder last year for allegedly shooting a man inside his car. ShotSpotter has said its system has trouble identifying gunshots in enclosed spaces. Williams spent nearly a year in jail, until late last month when a judge dismissed the case against him at the request of prosecutors, who said they had insufficient evidence.

“Fundamentally, these tools are outsourcing critical policing decisions, leaving the fate of people like Michael Williams to a computer,” Wyden said.

In Chicago, where Williams was jailed, community members rallied in front of a police station on Thursday, demanding the city end its contract with ShotSpotter, a system they said “creates a dangerous situation where police treat everyone in the alert area as an armed threat.”

An intersection in Chicago
A pedestrian walks with a dog at the intersection of South Stony Island Avenue and East 63rd Street, where the ShotSpotter technology is in use above the crossroads, on August 10, 2021, in Chicago. Image by Charles Rex Arbogast. United States.

The Chicago Police Department on Friday defended the technology in response to calls to end the city’s ShotSpotter contract. Chicago is ShotSpotter’s largest customer.

“ShotSpotter has detected hundreds of shootings that would have otherwise gone unreported,” it said in a statement emailed to the AP, adding that the technology is just one of many tools the department relies on “to keep the public safe and ultimately save lives.”

It said real-time ShotSpotter alerts about gunshots mean officers respond faster and more consistently than when depending on someone to call 911 to report gunfire.

“The system gives police the opportunity to reassure communities that law enforcement is there to serve and protect them and helps to build bridges with residents who wish to remain anonymous,” the department said.

ShotSpotter uses a secret algorithm to analyze noises detected by sensors mounted on light poles and buildings. Employees at the company’s Incident Review Centers in Washington, D.C., and Newark, California, look at the waveforms and listen to sounds that the computer deems possible gunshots to make a final determination before alerting police.

“The point is anything that ultimately gets produced as a gunshot has to have eyes and ears on it,” said CEO Ralph Clark in an interview. “Human eyes and ears, OK?”
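ShotSpotter’s classifier is proprietary, so its exact method is not public. Purely as an illustration of the pipeline described above (sensors capture loud impulses, software flags possible gunshots, and a human reviewer confirms each candidate before police are alerted), a minimal Python sketch might look like the following. Every name and the threshold heuristic here is a hypothetical assumption, not ShotSpotter’s actual system.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class AcousticEvent:
    """One loud, impulsive sound picked up by a pole- or rooftop-mounted sensor."""
    sensor_id: str
    peak_amplitude: float   # normalized loudness, 0.0 to 1.0
    samples: List[float]    # audio samples captured around the impulse


def flag_possible_gunshot(event: AcousticEvent, threshold: float = 0.8) -> bool:
    """Placeholder classifier: flag any sufficiently loud impulse.

    ShotSpotter's real classifier is proprietary; this naive threshold only
    illustrates that software produces candidates, not final determinations.
    """
    return event.peak_amplitude >= threshold


def human_review(event: AcousticEvent) -> bool:
    """Stand-in for the reviewer who examines the waveform and listens to the
    audio before anything is forwarded to police."""
    print(f"Reviewing candidate from sensor {event.sensor_id} "
          f"({len(event.samples)} samples)")
    return True  # a person makes this call in practice; accept for illustration


def dispatch_alert(event: AcousticEvent) -> None:
    """Notify responding officers of a confirmed detection."""
    print(f"ALERT: confirmed gunshot near sensor {event.sensor_id}")


# One quiet sound and one loud impulse flow through the pipeline.
events = [
    AcousticEvent(sensor_id="pole-12", peak_amplitude=0.35, samples=[0.1, 0.2, 0.1]),
    AcousticEvent(sensor_id="pole-47", peak_amplitude=0.92, samples=[0.1, 0.9, -0.8]),
]
for event in events:
    if flag_possible_gunshot(event) and human_review(event):
        dispatch_alert(event)
```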

Man looks forward
Michael Williams speaks during an interview in his Chicago home on July 27, 2021. Williams remains shaken by his ordeal. He said he doesn’t feel safe in his hometown anymore. When he walks through the neighborhood, he scans for the little ShotSpotter microphones that almost sent him to jail for life. “The only places these devices are installed are in poor Black communities, nowhere else,” he said. “How many of us will end up in this same situation?” Image by Charles Rex Arbogast. United States.

Civil rights advocates say the human reviews can introduce bias.

Wyden said he and seven other Democratic lawmakers are still waiting for a Justice Department response to their April letter raising concerns about federal funds going to local law enforcement agencies to buy a variety of artificial intelligence technologies, including some that integrate gunshot detection data. In addition to Wyden, the letter was signed by Sens. Ed Markey and Elizabeth Warren of Massachusetts, Alex Padilla of California, Raphael Warnock of Georgia, and Jeff Merkley of Oregon, and U.S. Reps. Yvette Clarke of New York and Sheila Jackson Lee of Texas.

“These algorithms, which automate policing decisions, not only suffer from a lack of meaningful oversight regarding whether they actually improve public safety, but it is also likely they amplify biases against historically marginalized groups,” they wrote to Attorney General Merrick Garland.

Video: Hear from Williams and learn more about ShotSpotter.

RELATED TOPICS

Criminal Justice
AI Accountability

RELATED INITIATIVES

Bringing Stories Home
AI Accountability Network
