Pulitzer Center Update June 11, 2024

Mozilla Festival Partners With Pulitzer Center

AI surveillance tools rely on low-paid workers to label and organize data.

The Mozilla Festival is working closely with partners and allies to collectively explore the core issues facing us today.

One of our key partners for MozFest House Amsterdam in June, "the premier gathering for people working to build a better digital world," is the Pulitzer Center, particularly its reporting on AI through an accountability lens. MozFest House Amsterdam will be held June 11-13 at the Tolhuistuin in Amsterdam-Noord, a borough of Amsterdam. Together we will explore the role of journalism and investigate the impact of AI across multiple industries and movements.

We have partnered with the Center’s AI Accountability Network to bring together journalism-focused programming.

With this year’s global election cycle and the rise of AI threatening the integrity of journalism, we are partnering with media professionals to address the challenges AI poses.


Journalism Sessions at MozFest House by the Pulitzer Center’s AI Accountability Fellows:

Data and Diversity in AI

How does data collection impact different communities around the world? Who gets disproportionately harmed or left out? How does it affect elections? Join us to discuss how data shapes the lives of individuals and the outcomes of democratic elections.

Speakers: Joanna Kao and Srishti Jaswal

Related coverage by the Fellows:

"Blind Internet Users Struggle With Error-Prone AI Aids" | Joanna Kao

"The Data Collection App at the Heart of the BJP’s Indian Election Campaign" | Srishti Jaswal

Investigating the Black Box

How do journalists find out how algorithms work? How can these investigative methods be improved?

Speakers: Pablo Jiménez Arandia, Karol Ilagan, and Federico Acosta Rainis (Pulitzer Center data specialist)


Following the AI Supply Chain

AI is a global story; it crosses borders. In this session, AI Accountability Fellows discuss how they investigate AI supply chains for surveillance, the intermediaries that distribute microwork to the gig workers who train AI, and the labor conditions of the humans behind the machines. We also delve into how AI work can be made fair for all.

Speakers: Niamh McIntyre and Tatiana Dias

Related coverage by the Fellows:

"Online Gig Work Is Feeding Russia’s Surveillance Machine" | Niamh McIntyre

"Paid Pennies To Train Tools of Repression: The Humans Behind Moscow’s State Surveillance" | Niamh McIntyre

"Moderadores Subterrâneos" (Underground Moderators) | Tatiana Dias (link in Portuguese)


In addition to these panels, Mozilla Foundation teams are collaborating with the Pulitzer Center to run a workshop meant to equip and empower journalists to interrogate and investigate AI, and hold its makers accountable.

 
