The Artificial Intelligence Accountability Network supports and brings together journalists around the world who report on AI and with AI. Working with journalists and newsrooms that represent the diversity of the communities affected by AI technologies, the Network seeks to address the knowledge imbalance on artificial intelligence in the journalism industry and to create a multidisciplinary, collaborative ecosystem that enables journalists to report on this fast-evolving topic with skill, nuance, and impact.

The Network supports two programs: AI Accountability Fellowships and Machine Learning Reporting Grants.

The AI Accountability Fellowships seek to support journalists working on in-depth AI accountability stories that examine governments' and corporations' uses of predictive and surveillance technologies to guide decisions in policing, medicine, social welfare, the criminal justice system, hiring, and more.

Through the AI Accountability Fellowships, the Pulitzer Center provides journalists with financial support, a community of peers, mentorship, and training to pursue in-depth reporting projects that interrogate how these AI systems are funded, built, and deployed by corporations, governments, and other powerful actors.

Here you can find our fellowship FAQ and other tips for a successful application.

Our Machine Learning Reporting Grants support journalists seeking to use machine learning to augment their reporting capacity on big data projects. Reporters we have supported have combined machine learning with geospatial analysis, satellite imagery, and traditional shoe-leather reporting, among other approaches. Their stories mapped the proliferation of gold mines in the Amazon rainforest, predicted oil-well abandonment in Texas, and revealed the corporate owners amassing single-family house rentals in North Carolina.

Applications for our Machine Learning Reporting Grants are reviewed on a rolling basis.


In its first year, the Network supported 10 Fellows to report in 10 countries. The 2022 cohort of AI Accountability Fellows reported on themes crucial to equity and human rights, including AI in hiring, surveillance, social welfare, policing, migration, and border control.

Applications for the 2023-2024 AI Accountability Fellowships have now closed. Thank you for your interest.

If you have questions about our AI fellowships or grants, you can contact AI Network manager Boyoung Lim at [email protected].


The stories reported by our AI Fellows and grantees have been cited by courts that ruled entire programs unconstitutional, sparked official investigations, and inspired other journalists to replicate their methods in their own reporting.


Massive Police Facial Recognition Database Now Requiring Policy Limits on Use

The South Florida Sun Sentinel series supported by the Pulitzer Center was also cited in testimony to the U.S. Congress and in President Joe Biden's AI Bill of Rights.


Reports Inspired by 'Security for Sale' Project Investigate Corporate Homebuyers

Reports on corporate landlords contributing to widening wealth gaps in New Jersey and Kentucky have taken inspiration from the Pulitzer Center-supported project Security for Sale.


Universities and Students Investigate Use of AI Surveillance Tool in Response to Grantee’s Reporting

Pulitzer Center AI Accountability Fellow Ari Sen and UC Berkeley Investigative Reporting Program journalist Derêka Bennett revealed a lesser-known use of an AI tool known as Social Sentinel: surveilling campus protests.


We believe in radically sharing the methodologies and lessons learned from the projects we support so they may serve as valuable resources and blueprints for other newsrooms, universities, and civil society organizations pursuing similar projects.


We work across disciplines and areas of expertise to leverage knowledge and approaches and create powerful synergies in the public interest. We are grateful to our partners for their inspiration and support!