Temperature-detecting cameras. Drones. Technology that police say can predict human feelings. Amid a year of outbreaks and protests, government officials worldwide have deployed new tools to expand surveillance and guide their policy decisions from a distance.

From policing to child welfare, experts say the algorithms powering these technologies can help solve some of society’s deepest problems. But when applied unethically or improperly, they can encode systemic racism, amplify social divisions, and deepen inequalities. And in many countries, these apps and software packages are overseen by regulations written long before artificial intelligence existed, leaving room for abuses of power.

In a series of deeply reported, narrative investigations spanning the globe, the Associated Press will examine the power and influence of predictive and surveillance technologies. By lifting up stories from Delhi to Pretoria to Detroit, we will shine a light on the people who build, purchase, and operate these systems—and the children, families, and communities they impact. We will compare how these tools are applied across nations, and probe how people’s personal data can be sold and mined to expand the knowledge infrastructure of governments and corporations.

In this project, we’ll explore how governments employ these tools, whom they impact, and how they affect notions of fairness and due process.

Image by Philipp Schmitt & AT&T Laboratories Cambridge / Better Images of AI / Data flock (faces) / CC-BY 4.0.

RELATED INITIATIVES

AI Accountability Network

RELATED TOPICS

Criminal Justice

Governance

Children and Youth

AI Accountability