Governments and corporations worldwide are increasingly harnessing the power of artificial intelligence and predictive technologies to help solve some of society’s biggest problems. But if left unchecked, algorithms have the potential to harm some of the most vulnerable members of society, deepening social and economic gaps and amplifying race, gender, and ability biases.

The Artificial Intelligence Accountability Network brings together and supports journalists around the world who report on AI and with AI. By supporting journalists and newsrooms that represent the diversity of the communities affected by AI technologies, the Network seeks to address the knowledge imbalance on artificial intelligence that exists in the journalism industry and to build the capacity of journalists to report on this fast-evolving and underreported topic with skill, nuance, and impact.

The Network consists of AI Accountability Fellowships and Machine Learning Reporting Grants.

Through the AI Accountability Fellowships, the Pulitzer Center provides journalists with financial support, a community of peers, mentorship, and training to pursue in-depth reporting projects that interrogate how AI systems are funded, built, and deployed by governments and other powerful actors.

In its first year, the Network supported 10 Fellows to report in 10 countries. The 2022 cohort of AI Accountability Fellows reported on themes crucial to equity and human rights, including AI in hiring, surveillance, social welfare, policing, migration, and border control.

Applications for the 2023 AI Accountability Fellowships are now open. The deadline is July 1, 2023. Apply here.

Our Machine Learning Reporting Grants support journalists seeking to use machine learning to augment their reporting capacity on big data projects. Reporters we have supported have combined machine learning with geospatial analysis, satellite imagery, and traditional shoe-leather reporting, among other approaches. Applications are reviewed on a rolling basis. Apply here.

Recent grantees have used machine learning to reveal the true scope of oil well abandonment in Texas; hold land banks accountable in Ohio; reveal the scale of the corporate-owned single-family home rental industry in North Carolina; and map the proliferation of gold mines in the Amazon rainforest.

We require grantees to share the methodologies and lessons learned from the projects we support so they can serve as valuable resources and blueprints for other newsrooms pursuing similar work.

Videos by Daniel Vasta. 2022.