October 30, 2024

The Rumor Clinic: At the Center for an Informed Public, Kate Starbird Tracks Falsehoods and Counters Them in Real Time


The challenges researchers grapple with as they try to understand how misinformation spreads.

Hours before U.S. presidential candidates Kamala Harris (D) and Donald Trump (R) stepped out onto the debate stage in a Philadelphia studio on 10 September, disinformation researcher Kate Starbird and about a dozen colleagues gathered in a conference room at the University of Washington (UW) in Seattle to discuss their own strategy for the night.

The researchers, based at the university’s Center for an Informed Public (CIP), were on the lookout for rumors about the election process—or at least, clues to what rumors might come. “Rumors tend to emerge the next day, but the frames and themes that will dictate what those rumors are are going to happen tonight,” Danielle Lee Tomson, the team’s research manager, explained that afternoon.

“If noncitizen voters are mentioned or something about mail-in ballots in Pennsylvania, those will be the narratives we will be working on tomorrow morning,” Starbird added.

Starbird and her colleagues have spent more than 4 years studying the rumors that swirl around elections. It’s not purely an academic interest: As they amass data, the team writes rapid research blogs explaining to journalists, election officials, and the public what rumors are circulating and where they are coming from—and correcting the record. “I jokingly call our group the ER [emergency room],” Tomson says. “What we do is triage information.”


The research—part data science, part sociology, part journalism, part political analysis—has won plaudits from fellow misinformation researchers. “Kate’s work really sits at the intersection of all of these in a way that I think very few others do, and it makes it really hard to define, but it makes it so incredibly impactful,” says Rebekah Tromble, a computational social scientist at George Washington University.

But it has also made Starbird the target of harassment and threats. Recently, she has found herself the subject of political attacks by Republicans in the U.S. House of Representatives, who have cast her as part of a “censorship industrial complex,” accusing her of suppressing conservative political speech in the course of her work studying election integrity.

Starbird has stood her ground. “The real problem some politicians have with the research is that it can blunt ideological campaigns to mislead the public,” she and her colleague Ryan Calo wrote in an editorial in Science. “She’s tough as nails, and she’s not going to stop doing research because of these kinds of things,” says Jevin West, her colleague and co-founder of CIP. “Her role has been not just representing our center a lot of times, but also, I think, representing the field.”

But the attacks and the negativity of the content she is studying do take a toll, Starbird says. “You can just lose hope,” she says. “How are we ever going to get ourselves out of this?”


Starbird started out an optimist. For her Ph.D. at the University of Colorado she worked under Leysia Palen, studying “crisis informatics”: how people gather and share information online during events such as natural disasters or terror attacks. Her thesis concentrated on the digital volunteers who crowdsourced maps of victims in need of help following the Haiti earthquake in 2010, but she also researched volunteer work around other events such as the Deepwater Horizon oil spill that same year or the tornado that tore through the town of Joplin, Missouri, in 2011. A lot of the work involved looking at “some of the worst of times, but the best of human behavior,” she says.

In 2012, after finishing her dissertation, Starbird joined UW and started to look at rumors—her preferred term for what others would often class as misinformation or disinformation. “The reason we use ‘rumors’ is you can start talking about something before you know intent and before you know veracity,” Starbird says. The word avoids labeling as misinformation something that later turns out to be true, for instance, and it doesn’t stigmatize people spreading the information, who may actually believe it.

Rumors have traditionally had a bad reputation. In World War II the U.S. government was so worried that rumors could hurt the public’s morale that it put up posters warning against spreading them. “Of all the virus that attack the vulnerable nerve tissues of a nation at war, rumor is the most malignant,” Life magazine wrote in 1942. The government also came up with the idea of “rumor clinics,” where researchers would track and study rumors while also correcting the record.

After the war, Japanese American researcher Tamotsu Shibutani helped change how rumors were seen, casting them as a rational response to disasters and uncertainty. To Shibutani, who had spent the war imprisoned in an internment camp for Japanese Americans in California, rumor was more of a verb than a noun, a collective process of making sense of the world. His view was heavily influenced by his experience in the camp, Starbird notes, where the community used rumors to try to cope with the “horrible uncertainty” they faced. “In times of crisis and anxiety and these kinds of conditions, to participate in rumoring is a natural thing to do, and then there’s other folks that are going to exploit that,” she says. “Those dynamics have always been there.”

Starbird initially expected her research to tell a positive story: how rumors spread by some users are quickly fact checked and corrected by others. “I’d sort of seen that in the early data, where we would see rumors and misinformation, but they were quickly corrected,” Starbird says. “They weren’t causing a lot of damage, as far as we could tell.” It was the idea of a self-correcting crowd, a kind of Wikipedia in the wild.

But her first few studies quickly disabused her of this notion. “We could really see, especially between 2013 and 2015, that rumors and misinformation were becoming a larger and larger part of the discourse online during these crisis events,” she says. Rumors were rarely being corrected—and when they were, the correction usually came too late and reached far fewer people than the original, false information. A rumor that the 2013 Boston Marathon bombing was a false flag attack by Navy SEALs, for instance, circulated on Twitter (now X) for days. Social media algorithms, designed to keep people’s attention, often favored the sensational. “And it turns out that falsehoods and conspiracy theories tend to be more sensational than the truth,” Starbird says.

Biography of a rumor

Former President Donald Trump has made unsubstantiated claims about Democrats planning to “steal the election” using the Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA), which allows military personnel and citizens living abroad to vote. Researchers at the University of Washington traced the genesis of the claim to an article published by the far-right fake news website the Gateway Pundit in early September and tracked the response to it on X (formerly Twitter).

  1. Original Gateway Pundit article
    The Gateway Pundit publishes an article claiming Democrats are planning to use fake overseas votes to steal this year's presidential election. The story is based in part on a Democratic National Committee (DNC) memo that erroneously states there are about 9 million Americans living or serving overseas. (The actual number is far lower.) The outlet promotes the article on X with a tweet that also tags Elon Musk.
  2. Second Gateway Pundit article
    The Gateway Pundit follows up with another article: “WAKE UP, REPUBLICANS!... Democrats Are Openly Stealing the 2024 Election with Fraudulent Overseas Ballots.” Again it posts a tweet to promote it.
  3. Gateway Pundit tweet
    The Gateway Pundit posts another tweet linking to the first article and tying the overseas votes rumor to claims of fraud in the 2020 election in Fulton County, Georgia.
  4. Trump posts on Truth Social at 4:33 p.m. UTC
    Trump writes that Democrats “are getting ready to CHEAT! They are going to use UOCAVA to get ballots.” The Gateway Pundit quickly publishes and promotes a piece saying “President Trump Acknowledges Gateway Pundit's Exclusive Reporting On a LEGAL Trick Democrats Can Use To Steal The Election.”
Data: Center for an Informed Public at the University of Washington. Image by M. Hersher/Science.

At first, Starbird told her students to ignore these kinds of rumors. “I did not want to stand on stages talking about conspiracy theories,” she says. But event after event spawned this type of rumor. And then, in late 2015, after the terror attacks in Paris and a mass shooting at Umpqua Community College in Oregon, Starbird noticed something else. The various groups spreading rumors about these events—white supremacists, Trump fans, and supporters of WikiLeaks and the hacker group Anonymous—were becoming increasingly connected on social media via a burgeoning number of shared followers, even though the groups normally appeal to completely different audiences. “We began to see that the disinformation was sinking into the structure of the network graph,” Starbird says. “That was the moment I went like holy …”

There is a concept in neuroscience called Hebb’s law: What fires together, wires together. Neurons that are activated together strengthen their connections, making it more likely that they will fire together in the future. Something similar was happening online, aided by social media algorithms that suggested people following one superspreader of misinformation also follow others. If social media is a kind of collective consciousness of humanity, Starbird was seeing it rewire in real time. And the pathways being laid down did not point to a good place.

But even then, Starbird remained optimistic. If she could help people understand what was happening, then platforms, journalists, and others would take actions to address the issue, she thought. “I guess I was naïve and overly optimistic to believe that people would be turned off when they discovered that they were being manipulated—and that political parties would reject, rather than double down on, these manipulations.”


In December 2019, Starbird co-founded CIP, a collaboration among UW’s information school, law school, and engineering school. Its declared aim is to “resist strategic misinformation, promote an informed society, and strengthen democratic discourse.”

At CIP, Starbird has cultivated a diverse group, with expertise spanning computer science, sociology, library science, psychology, and ethnography. Before becoming an academic, Starbird was a professional basketball player in the Women’s National Basketball Association, Tomson notes. “You can’t play basketball with just one kind of person,” she says. “She understands that different team members have different strengths.”

Unusually for academia, a lot of the team’s work happens in real time and is public facing. The group has one arm called “discovery” with shifts of students and postdocs monitoring social media and flagging anything they find interesting or alarming. Another arm called “analysis” sifts through what the team has flagged, deciding which rumors to investigate further and respond to. In effect, she has built up a kind of modern-day rumor clinic.

"When you have the ability to help, how can you just sit there and watch?"

Kate Starbird, University of Washington

Starbird acknowledges that the nature of her work makes it difficult to measure its impact. The goal is not necessarily to stop rumors from spreading, she says, but rather to help decision-makers understand why they are spreading and how they are misleading. For elections, that means reaching election workers and officials, lawyers, and judges, “the people who will find themselves faced with the decision of whether or not to rely on these rumors to make decisions.”

This impulse to intervene came out of her Ph.D. work in crisis informatics, Starbird says, where she ended up not just studying how volunteers were responding to disasters, but also helping make maps to aid them with their work. “When you have the ability to help, how can you just sit there and watch?” she asks. CIP’s research does eventually lead to peer-reviewed work in scientific journals. But papers come out far more slowly than rumors circulate and need to be countered, Tomson says. “The timeline of science, as we all know, is much slower than the timeline from which regular folks want answers.”

Almost immediately after CIP was founded, it went into action. The COVID-19 pandemic arrived, and the researchers found themselves sifting through misinformation about the virus. Then came the 2020 U.S. elections. One hundred days before the elections, CIP became part of the Election Integrity Partnership (EIP) along with other institutions including Stanford University’s Internet Observatory. The project aimed to “detect and mitigate the impact of attempts to prevent or deter people from voting or to delegitimize election results.”

In some ways elections are a natural next step for someone studying communication around crises like natural disasters, says Emma Spiro, a co-founder of CIP who took over from Starbird as director in September. “Elections in the U.S. just happen to involve a similar level of uncertainty and anxiety and sort of unknown outcomes,” she says. And the stakes are high. False information about when and where to vote can confuse people and rob them of their vote as well as generally erode confidence in elections and democracy.

The 2020 elections may have been just a trial run. In March 2021, an EIP report concluded that although the project “was intended to meet an immediate need, the conditions that necessitated its creation have not abated, and in fact may have worsened.” Later that year CIP received a $3 million grant from the National Science Foundation for follow-up work, which is funding the team’s studies of this year’s elections.


It was a rumor that provided the most memorable moment of September’s presidential debate. By the time Harris and Trump stepped off the stage, Trump’s claim that Haitian immigrants in Springfield, Ohio, were “eating the pets” was ricocheting around the internet, inspiring songs, memes of scared pets, and late-night hosts’ jokes. But neither candidate had said anything directly related to the integrity of the upcoming election—the kind of claim Starbird’s team researches.

Another rumor was already starting a slow ascent into public consciousness, however. That very morning, the Gateway Pundit, a right-wing fake news website, had published its second article claiming Democrats were planning to steal the election using fraudulent overseas ballots. The site had posted the first story a week earlier, and it followed up with a third on 22 September.

On 23 September, Trump took note and amplified the claim, posting on Truth Social, the social media site he part owns, that Democrats were “getting ready to CHEAT! They are going to use UOCAVA to get ballots, a program that emails ballots overseas without any citizenship check or verification of identity, whatsoever.” That post led to another spike of traffic about the rumor on X, and the Gateway Pundit followed up with another article headlined “BREAKING: President Trump Acknowledges Gateway Pundit’s Exclusive Reporting On a LEGAL Trick Democrats Can Use to Steal the Election.” By that point, Starbird’s team had noticed.

The researchers saw several telltale signs suggesting the rumor would travel far. For one, it was based on a “real” piece of evidence: a 12 August memo in which the Democratic National Committee wrote it was hoping to reach 9 million overseas voters, though in reality there are far fewer. The article cast this as a plan to flood the election with fake overseas votes. “If you have a piece of truth that you wrap up in a larger narrative, decontextualized from its environments, that makes it more compelling, much harder to fact check,” Tomson says.

The Gateway Pundit articles also all used the acronym UOCAVA, which refers to a little-known law that regulates overseas voting: the Uniformed and Overseas Citizens Absentee Voting Act. That acronym is catchy and very searchable, Tomson says. To successfully spread a rumor—or spot one that is likely to spread—“you almost have to think like a marketer or an advertiser or a storyteller,” she says. Little had been written about this particular law in the past, and that data void made it easier to seed a new narrative.

On 1 October, Starbird’s team posted a long article online outlining the “emerging rumor.” The rumor tied into the common narrative that immigrants were illegally voting in U.S. elections, the researchers wrote. “We know from prior research that the most effective and potentially viral rumors combine a novel element—such as ‘UOCAVA’—with a familiar theme, ‘noncitizen voting.’” The rumor could gain traction in the following weeks, the team cautioned.

Indeed, within days news emerged that Republican groups had filed lawsuits in the swing states of Michigan, North Carolina, and Pennsylvania challenging overseas voting policies. The move was part of a pattern, Starbird and colleagues later wrote, “whereby rumors and disinformation campaigns may serve to motivate legal action, inscribing the concern into public record and offering it a patina of legitimacy.”


It was probably inevitable that Starbird herself would eventually become the subject of rumor. EIP, the rumor went, had colluded with government agencies to censor millions of tweets around the 2020 elections. In fact, EIP researchers had flagged close to 5,000 questionable posts to social media companies, which removed roughly 10% of them. Starbird’s critics also seized on her role as chair of an external advisory committee to the Department of Homeland Security on “misinformation, disinformation, and malinformation.” Charging that Starbird had supported a “censorship regime,” the House of Representatives Judiciary Committee launched an investigation in early 2023, demanded years of her communications, and interviewed her in June 2023.

The interview was “superstressful,” Starbird says, partly because she felt she had to be perfect in defending her and her colleagues’ work. “It’s like you have to talk in a way that’s completely bulletproof all the time, because the worst person in the world is going to try to take something you say and leak it.”

Starbird stresses that falsehoods and misinformation can take hold of anyone, regardless of their political beliefs. After the attempted assassination of Trump in Butler, Pennsylvania, she says she was confronted by conspiracy theories from liberals. “I remember being at a party and being like, ‘Everybody calm down. It’s not a staged event.’” But a parallel rumor circulated widely on the right: that the shooting was an inside job by the Secret Service. “It’s not that we’re not equally vulnerable,” Starbird says. “What’s different right now in the United States, Philippines, India, and other places, is that we’ve got right-wing populists that are effectively using this strategically to gain and maintain their power.”

Starbird says she could see early on that the rumor about EIP would gain momentum. “I was like, ‘Oh, my God, this is going to be really hard for us to counter.’” It combined fiction with factual statements about EIP, and identifying both the factual roots and the distortions would be difficult because EIP was a large project with many different partners, none fully aware of what the others were doing.

The advice from other researchers was to lie low, so it was months before Starbird put out statements correcting misperceptions. In retrospect she should have acted sooner, she says. Nobel Peace Prize winner Maria Ressa, a Filipina journalist who has studied how fake news spreads on social media, later told Starbird as much. “She’s like, ‘There’s a golden hour that you have. You have this window. If you miss the window, you’ve missed it.’ And we totally missed it.” Starbird says the attacks taught her that lesson—and another. “We’re not afraid of the unknown anymore.”

Other misinformation researchers have pointed to many reasons why Starbird has been the subject of so many political attacks: because she is a woman, for instance, or because she is gay. But Starbird suspects part of the reason she has been singled out is that her team isn’t just studying misinformation, but actually pushing back against it. “We’re actually trying to make a change on a hard problem in real time,” she says.

Rumor has it, she won’t stop anytime soon.
