Pulitzer Center Update September 29, 2023

At Syracuse, Grantee Karen Hao Discusses Reporting On and With AI

The AI supply chain often concentrates power into the hands of wealthy nations while leaving the...

“If you think about what data actually is, data is people. The data is not coming from nowhere, it’s coming from someone. We are the data,” Pulitzer Center grantee Karen Hao said during a lecture at Syracuse University on September 18, 2023.

As part of the Pulitzer Center's Campus Consortium partnership with Syracuse's Newhouse School of Public Communications, Hao spoke to students about her journalism career and her reporting on artificial intelligence (AI), including her Pulitzer Center-supported work. 

In a presentation to students, Hao shared how her project, AI Colonialism, tells the stories of the workers training the algorithms and the negative impacts the work has on them, while countries with more power disproportionately reap the technology’s economic rewards.

Image by Mikaela Schmitt. United States, 2023.

“[Historical colonialism] ultimately disenfranchised and held back the economic development of a lot of countries because their labor and their resources were being sent away to support the development of the Global North, not themselves,” Hao said. “I was seeing the same story again and again. Seeing it through this metaphor of colonialism really helps people understand the vastness of it all and that we are talking about a global phenomenon, happening at a global scale.” 

The labor impacts of the rapid globalization of AI extend beyond the workers behind the algorithms. Hao explained to students that it is “not a coincidence” that there has been an uptick in strikes lately, from the United Automobile Workers, the Writers Guild of America, and SAG-AFTRA in the U.S., to railway workers in Britain and France. 

“It’s not that AI systems today are sophisticated enough to actually do their jobs or our jobs; I do not think AI systems will ever become that sophisticated because of the fundamental technical limitations of these tools,” Hao said. “But, it doesn’t matter if there are technical limitations. As long as people believe that it can automate away those jobs, the specter of automation really erodes people’s labor rights. You can’t be at the bargaining table … when you believe, and the executives believe, they can just get rid of you and replace you with ChatGPT.”

Image by Mikaela Schmitt. United States, 2023.

Students were eager to hear Hao’s thoughts on AI’s impact on productivity and on possible free speech regulations concerning the spread of disinformation and misinformation by AI.

One student asked if there is an ethical way to engage with generative AI tools, such as ChatGPT. 

“The question is: Is it proportional? Is the thing you used the resources for worth the resources it used?” Hao responded. 

Image by Mikaela Schmitt. United States, 2023.

Hao also explained that her worries about AI-driven disinformation and misinformation are focused on countries outside the U.S.

In the U.S., the “balance of misinformation and high-quality information coming from journalists and fact checkers” is strong, she said. She argued that AI-generated misinformation will have the greatest impact on non-English-language countries and non-U.S. elections, and she encouraged students to think critically about the global impacts beyond the Global North. 

Hao, who previously was a foreign correspondent covering China’s technology industry for The Wall Street Journal, reflected on her experiences reporting in China, a country with strict laws regulating journalism.

Speaking to the Business and Ethics of Journalism in a Changing World class, she explained that at the core of reporting ethics in China is advocating for sources.

“It's really important to assess not just your risk, but the sources’ risk, and put the sources’ risk and safety as your priority,” Hao said.

Hao met individually with students in the morning to answer questions and offer career advice. She then had lunch with students before presenting in classes. Students were eager to hear how she got into the AI beat and how she makes the complexities of tech accessible to her audiences. 

“Talking to people who are just starting to hear [about] AI for the first time and are overwhelmed, the questions they ask me, like this event right now, the questions you’re asking helps me,” Hao told the students. “That gives me new ideas for how I can frame my next story or how I can make it more accessible to an audience. It’s just trying to constantly think about and keep in touch with the reader that I would want to reach.” 

Image by Mikaela Schmitt. United States, 2023.



AI Accountability Network