
Thinking outside the box to crack the 'black box'
When AI Accountability Network Fellow Karol Ilagan decided to investigate the algorithm that calculates fares for Grab, the Philippines' most popular ride-hailing app, we faced the classic "black box" problem. An algorithm is a series of instructions that, given an input, returns an output: We usually know the input and output, but for "black box" algorithms we don't know what happens in between.
Metro Manila has some of the worst traffic congestion in the world, so bad that its residents sharpen their wits to describe it: They say they face "carmageddon" and commute like "sardines." Grab’s car service is virtually a monopoly in the Philippines and, like similar ride-hailing apps, has a dynamic pricing system. In addition to the time and distance fees, set by regulators, a surge fee can be applied, depending on supply and demand. This extra charge can double the cost of a ride but is totally opaque: No one knows how it’s calculated.
To crack Grab's “black box,” Ilagan and the Pulitzer Center's Data and Research team designed a custom methodology, combining creativity and collaborative work. For a week, a group of 20 researchers attempted every hour to book trips on 10 Metro Manila routes through the app and logged the data. In parallel, we launched a bot that captured fare data for the same 10 routes over the same period directly from Grab's website. We collected more than 8,000 data points.
What did we find? The surge pricing, supposedly dependent on supply and demand, was present on all rides, regardless of time and location. Moreover, higher fees didn't necessarily mean shorter wait times, which was one of the objectives of surge pricing in the first place. Grab acknowledged our findings, confirming that the surge period could become "prolonged."
The investigation, published by the Philippine Center for Investigative Journalism, PumaPodcast, and Commoner, proves that it's possible to reach meaningful insights even if we can't see what's happening inside the “black box.” Accountability for algorithms, whose decisions impact our lives every day, is a key field for investigative journalism. Devising creative strategies to analyze their outputs seems to be one of the most promising avenues.
Best,

Impact
Artificial intelligence technologies offer significant opportunities for media innovation while simultaneously posing one of the industry's greatest threats. On July 30, 2024, Pulitzer Center Executive Editor Marina Walker Guevara spoke to Open Society Foundations about efforts at the Pulitzer Center to support newsrooms covering AI.
On the future of artificial intelligence in the media, Walker Guevara told OSF:
“These tools are imperfect and incomplete, and they require constant supervision. By bringing together journalists who cover communities disproportionately affected by AI, we are creating a collaborative ecosystem that empowers media to better report on, and use, this rapidly evolving technology to give audiences a greater understanding of the role it plays in our lives.”
The Pulitzer Center is working to empower journalists to report on artificial intelligence by launching an ambitious international training program, prioritizing journalists in the Global South. The AI Spotlight Series offers a range of free online workshops and webinars on AI reporting at all skill levels.
Read the full Q&A with Walker Guevara here.
Photo of the Week

This message first appeared in the August 9, 2024, edition of the Pulitzer Center's weekly newsletter. Subscribe today.
Click here to read the full newsletter.