The Artificial Intelligence Accountability Network supports and brings together journalists around the world who report on AI and with AI. Working with journalists and newsrooms that reflect the diversity of the communities affected by AI technologies, the Network seeks to address the journalism industry's knowledge imbalance on artificial intelligence and to create a multidisciplinary, collaborative ecosystem that enables journalists to report on this fast-evolving topic with skill, nuance, and impact.


HOW IT WORKS

The Network supports two programs:  AI Accountability Fellowships and Machine Learning Reporting Grants.

The AI Accountability Fellowships support journalists working on in-depth AI accountability stories that examine governments’ and corporations’ uses of predictive and surveillance technologies to guide decisions in policing, medicine, social welfare, the criminal justice system, hiring, and more.

Through the AI Accountability Fellowships, the Pulitzer Center provides journalists with financial support, a community of peers, mentorship, and training to pursue in-depth reporting projects that interrogate how these AI systems are funded, built, and deployed by corporations, governments, and other powerful actors.

Here you can find our fellowship FAQ and other tips for a successful application. 

Our Machine Learning Reporting Grants support journalists seeking to use machine learning to augment their reporting capacity on big-data projects. Reporters we have supported have combined machine learning with geospatial analysis, satellite imagery, and traditional shoe-leather reporting, among other approaches. Their stories mapped the proliferation of gold mines in the Amazon rainforest, predicted oil-well abandonment in Texas, and revealed the corporate owners amassing single-family house rentals in North Carolina.

Applications for our Machine Learning Reporting Grants are reviewed on a rolling basis.

JOIN THE NETWORK

In its first year, the Network supported 10 Fellows to report in 10 countries. The 2022 cohort of AI Accountability Fellows reported on themes crucial to equity and human rights, including AI in hiring, surveillance, social welfare, policing, migration, and border control. If you're interested in applying for a short-term project grant, visit our AI reporting grant page.

Applications for the 2023-2024 AI Accountability Fellowships have now closed. Thank you for your interest.

If you have questions about our AI fellowships or grants, you can contact AI Network manager Boyoung Lim at [email protected].

AI SPOTLIGHT SERIES

The AI Spotlight Series is designed to equip reporters and editors—whether on the tech beat or any other—with the knowledge and skills to cover AI and to shape coverage of its profound influence on society. Our instructors include some of the world’s leading tech reporters and editors, who have been tracking AI and data-driven technologies for years.

Each course is designed to give you a strong grounding in what AI is and how it works, as well as the tools to identify critical stories—from spot news to deep investigations—that highlight the technology’s impacts, hold companies and governments accountable, and drive policy and community change, while avoiding both hype and unnecessary alarmism.

IMPACT

The stories reported by our AI fellows and grantees have been cited by courts that ruled entire programs unconstitutional; sparked official investigations; and inspired other journalists to replicate their methods in their own reporting.


Massive Police Facial Recognition Database Now Requiring Policy Limits on Use

The South Florida Sun Sentinel series supported by the Pulitzer Center was also cited in testimony to the U.S. Congress and in President Joe Biden’s AI Bill of Rights.


Reports Inspired by 'Security for Sale' Project Investigate Corporate Homebuyers

Reports on corporate landlords contributing to widening wealth gaps in New Jersey and Kentucky have drawn inspiration from the Pulitzer Center-supported project Security for Sale.


Universities and Students Investigate Use of AI Surveillance Tool in Response to Grantee’s Reporting

Pulitzer Center AI Accountability Fellow Ari Sen and UC Berkeley Investigative Reporting Program journalist Derêka Bennett revealed a lesser-known use of an AI tool known as Social Sentinel: surveilling campus protests.

RESOURCES

We believe in radical sharing of methodologies and lessons learned from the projects we support so they may serve as valuable resources and blueprints for other newsrooms, universities, and civil society organizations pursuing similar projects.

PARTNERS

We work across disciplines and areas of expertise to leverage knowledge and approaches, creating powerful synergies in the public interest. We are grateful to our partners for their inspiration and support!

---