
May 17, 2019

Facebook's Regulation Fail in Ukraine Should Worry Europe

Ukrainian presidential candidate Volodymyr Zelensky. Image by Sergei Chuzavkov / Shutterstock. Ukraine, 2019.

Ukrainian President-elect Volodymyr Zelenskiy shows victory sign during a comedy show at a concert hall in Brovary. Ukraine, 2019. Image by Shutterstock.

KIEV — For years, Ukraine has been a playground where Russia tests its abilities to interfere in elections — whether through hacking or influence campaigns. Last month’s presidential election was no exception: The high-stakes campaign became a test of Facebook’s commitment to protect elections from foreign and domestic disinformation.

The Silicon Valley tech giant failed — badly.

Facebook’s efforts to weed out misinformation on its platform were a Potemkin village of regulations, riddled with cracks and loopholes that were easily exploited by actors across the Ukrainian information ecosystem.

The company’s patchy protection of the Ukrainian vote should be a wake-up call to Europe as its lawmakers grapple with how to guard against misinformation ahead of the European Parliament election later this month. It indicates that, despite Facebook’s self-reported progress fighting election interference, the tech giant’s efforts are still falling short.

Ukraine’s new president-elect, Volodymyr Zelenskiy, beat incumbent Petro Poroshenko in a dirty race in which Facebook's lax monitoring of political advertising allowed disinformation to run rampant.

Among the Facebook pages that spread spurious claims during the election was one with more than 100,000 followers that ran a video claiming Zelenskiy would allow Russia to take over the country in a violent military operation. Others portrayed him as a drug addict, or Poroshenko as an alcoholic. One Facebook page posted a digitally edited picture of a Ukrainian rock star holding an anti-Poroshenko sign, when the musician had in fact been denouncing Zelenskiy.

Beyond their divisive rhetoric, these Facebook pages shared a common characteristic: Voters had little idea who funded or curated them.

Facebook was fatally late in introducing ad transparency rules, which created a searchable advertising database and required political advertisers to “confirm their identity and location” or risk an ad’s removal. These measures took effect only two weeks before the election’s first round on March 31.

When the rules did finally go live, they were slowly enforced, and pages that repeatedly violated them were still allowed to attempt to buy ads.

"Servant of the Freak," the Facebook page that claimed Zelenskiy is a Russian collaborator and addict, placed and ran at least 59 political ads during the election. Facebook eventually rejected them all for non-compliance with transparency rules, but not before they reached millions of Ukrainians.

The 17 ads placed by the page in April, between the first and second rounds of voting, garnered no fewer than 6 million impressions and cost its administrators between $7,000 and $29,993 before Facebook removed them.

The Ukrainian election also showed there are ways to get around Facebook’s geographic regulations to spread disinformation.

Ukraine’s Security Service announced in January that it had uncovered a Russian plot to rent accounts from Ukrainians and use them to place political ads. A casual Google search of the phrase “Facebook account rental” in Russian and Ukrainian turns up countless results.

Standalone websites, posts on freelance boards and even Facebook pages and groups advertise that the owner of any Facebook account more than six months old with at least 200 friends can earn about $100 per month, roughly one-third of the average Ukrainian salary, simply by handing the account over to an advertiser.

It’s impossible to know the extent to which this practice has infiltrated Ukraine’s political discourse, but the brazen attempts to recruit ad mules suggest it is a serious problem.

Attempts to influence the political discussion online are not restricted to paid advertisements. Pages can build huge followings without the purchase of a single ad. These pages are not governed by Facebook’s rules on “ads related to politics or issues of national importance,” and the public has no way to know who is behind them or what interests they represent.

Facebook has made pages and groups more transparent since the 2016 U.S. election — making pages’ creation dates, manager locations and name changes visible to users. But to truly be effective, the platform should require managers of large communities to take public ownership of them, whether they are purchasing ads or spreading far-reaching content without paying to do so.

Doing so would not just protect against inauthentic manipulation of communities — exemplified by Russia’s interference in the 2016 American presidential election — it would introduce more responsibility into discussions on Facebook.

Users aren’t allowed to have more than one personal Facebook account or use fake names on the platform; why should they be allowed to control communities that reach thousands of people from the shadows?

Facebook could also make its ad library searchable by purchaser attributes to allow users to more easily spot manipulation. Facebook users should not need investigative journalism skills to understand who is influencing them during an election campaign.

Europe’s online debate ahead of May’s ballot has the potential to be even uglier than Ukraine’s. The Continent is home to burgeoning far-right, populist and nationalist elements, as well as an ever-more connected electorate. It also suffers from inadequate regulation governing online election communications.

If Facebook’s “protection” of Ukraine’s vote is a harbinger of things to come, the EU should steel itself for an opaque online influence battle aided and abetted by the platform itself.
