
November 3, 2017

Can Germany Fix Facebook?


Image courtesy Andrey_Popov / Shutterstock / Facebook / Zak Bickel / The Atlantic.

In early October, a musical titled Facebook—Terms and Conditions toured in Germany. It’s an intentionally outlandish love story about an aspiring novelist who pens a fictional work about an imaginary social network. The novel just happens to be titled Facebook—Terms and Conditions. Then, as luck would have it, she meets the founder of a new social-media network in search of a “complicated user’s manual” for his creation. In a masterstroke of farcical proportions, her novel becomes the terms and conditions of use for his social network.

The show’s creators, the satirist Peer Gahmert and the choreographer Tim Gerhards, usually collaborate on more-traditional theater projects “that no one visits and which no one wants to see,” as Gahmert put it in a recent interview. This time, their goal was to satirize Facebook’s cryptic regulations, which have made the company a target of vehement public criticism in a society historically suspicious of censorship in all forms. Virtually the entire script—including its eight musical numbers—is made up of phrases drawn from Facebook’s fine print. It also quotes from portions of the company’s community standards, which broadly outline what kinds of content are and are not allowed on the site (including direct threats, bullying, hate speech, violence, and attacks on public figures), as well as the company’s opaque procedures for removing posts that violate those standards.

“Since no one has ever read these terms before clicking ‘Create Account,’ we thought it would be a good idea to put them into something as enjoyable and easy to swallow as a musical,” Gahmert told me. Gahmert and Gerhards even took the liberty of adding a section to the script: a fabricated set of “extra,” unspoken terms and conditions for users living in Germany—which they insist Facebook “has obviously written” but keeps from the public.

To say that Facebook has an image problem in Germany, where it has 28 million users, is a staggering understatement. Germans tend to view it as a phenomenon that drives people apart instead of bringing them “closer together,” as Facebook’s mission statement suggests, by facilitating the spread of hate speech, misinformation, and fake news. Despite Chancellor Angela Merkel’s win in the recent general election, experts’ assessments that fake news did not significantly affect the results, and assurances from Facebook’s chief executive officer, Mark Zuckerberg, that his company worked with federal authorities to safeguard the vote’s integrity, Facebook nevertheless appeared to play a role in delivering to the far-right Alternative for Germany (AfD) the strongest showing by a nationalist party since the Third Reich. Harris Media, the Austin-based political consultancy that the AfD hired to increase its social-media presence, took advice from Facebook employees in Berlin before the election and developed digital ads targeting Germans whose social-media usage made them seem sympathetic to the AfD’s cause. And despite Facebook’s assurances, a ProPublica investigation found that political advertisements of dubious origin targeting the Green Party were still disseminated on the network.

On October 1, a new law went into force in Germany, compelling Facebook and other social-media companies to conform to federal law governing the freedom of speech. The Netzwerkdurchsetzungsgesetz, or the “Network Enforcement Law,” colloquially referred to as the “Facebook Law,” allows the government to fine social-media platforms with more than 2 million registered users in Germany—a club that includes giants such as Twitter, YouTube, Instagram, and Reddit—up to 50 million euros for leaving “manifestly unlawful” posts up for more than 24 hours. Unlawful content is defined as anything that violates Germany’s Criminal Code, which bans incitement to hatred, incitement to crime, the spread of symbols belonging to unconstitutional groups, and more. But what makes content “manifestly” illegal is left up to human—or algorithmic—judgment. A transition period lasting until January 1, 2018, is meant to give companies time to figure out how to comply.

Predictably, internet-freedom advocates and Facebook representatives worry that by incentivizing companies to take down content in order to avoid large fines, the law will have a chilling effect on freedom of expression, and Facebook claims that the new legislation violates European Union law. But they are in the minority. German Justice Minister Heiko Maas, who introduced the law, heralded it as “the end of the internet law of the jungle,” a position that a large majority of Germans share: 70 percent of them support the effort, while only 26 percent reported concern that it would threaten freedom of expression, according to an April poll.

Unlike in the United States, freedom of speech is “not the most important civil right” in Germany, the digital-rights activist Markus Beckedahl told me. Article Five of the German constitution, which governs the right to freedom of expression, explicitly protects freedom of opinion, a narrower category than freedom of speech writ large. Instead, Article One of Germany’s postwar constitution instructs, “Human dignity shall be inviolable.” This notion “means you are not allowed to claim false things about me, because it hurts my dignity,” Beckedahl said. “You are not allowed to tell anyone in public lies about me, or I can take you to court.”

This concept of human dignity originated in West Germany’s 1949 constitution, which was heavily influenced by the occupying Allied nations; under the banner of human dignity, the constitution also explicitly bans Volksverhetzung, or “incitement to hatred,” as well as any public endorsement and invocation of National Socialist ideas and symbols. Appeals to human dignity carry immense moral force, and parallel the language of the 1948 UN Declaration of Human Rights.

Illegal speech in Germany, then, is speech that violates the human dignity of an individual or group. Fittingly, the concept is repeatedly invoked in the regulations of the criminal code that govern the dissemination of illegal content. According to German law, distributing material that documents “cruel or otherwise inhuman acts of violence” is illegal because it violates human dignity. National laws banning the dissemination of propaganda and incitement to hatred mean that all “written material” of a malicious, insulting, and defamatory nature targeting a national, religious, ethnic, or racial group is illegal. Yet while the concept of human dignity is unassailable in Germany’s offline world, Facebook’s community standards operate according to a sweeping, often vague set of guidelines that leave much open to interpretation. The country’s Facebook Law is an attempt to see whether human dignity can survive the internet—and its success or failure could very well determine what it means to be human, and online, for the rest of the world.

In 2015, a teenager named Anas Modamani fled his hometown of Darayya, Syria. By the fall of that year, he was living in a Berlin refugee camp. One day, Angela Merkel came to visit. Modamani, his short hair styled into gelled spikes, snapped a selfie with her; the image became emblematic of the chancellor’s welcoming posture toward Syrian refugees. But after terrorist attacks in Brussels and Berlin in 2016, Modamani’s selfie began popping up on Facebook again, this time doctored to falsely label him as one of the perpetrators of both attacks. Afraid of being recognized, Modamani was scared to leave his house.

Modamani’s lawyer, Chan-jo Jun, petitioned Facebook to remove every post using Modamani’s image. In Germany, citizens are entitled to rights over their own images, but the law was not written with the internet in mind. Under the constitutional protection of human dignity, everyone who copied and shared Modamani’s image online should have first sought his permission, just as they would have had to if they were using it in an offline advertisement or magazine. “Either we decide to enforce this law on the internet, or we get rid of it,” Jun told me. “But nobody wants to touch it, so we have a contradiction between offline and online.”

Facebook argued that it would take a “wonder machine” to find every image of Modamani circulating on its network. (Gizmodo called that claim “bullshit,” pointing out that Facebook has already developed several technologies, including image-detection software, that can do just that.) The dispute led to a high-profile court case this past spring, in which the court ruled in Facebook’s favor, finding that the company was “neither a perpetrator nor a participant in the smears.” Jun said the loss showed that Germany’s laws were insufficient to deal with the brave new world of social media. “We have to decide whether we want to accept that Facebook can basically do whatever it wants,” Jun said after the ruling, “or whether German law, and above all the removal of illegal contents in Germany, will be enforced. If we want that, we need new laws.”

Before the verdict, Facebook lawyers tried to agree on a confidential settlement to resolve the issue, Jun said. The son of Korean immigrants, he decided not to appeal the decision after receiving a death threat targeting him and his family. “Maybe the picture of someone who looks Asian next to a Syrian refugee, with all the attention of the court, blew some fuses for some people,” he said.

Even though Jun lost the case, the controversy over the Modamani selfie proved, to him, that the German legal system was not yet strong enough to hold social-media companies accountable, and the immense press coverage it generated seemed to tilt public opinion toward the same conclusion. When the Bundestag approved the Facebook Law, in June, Modamani’s image was still circulating in Facebook posts labeling him a terrorist. Were Jun and Modamani willing to appeal, they would have a much stronger case now that the Facebook Law has taken effect. Earlier this month, Jun filed a new request asking Facebook to take down a post falsely labeling Modamani a terrorist—the post should be “manifestly unlawful” under the new law—but Facebook refused. Though the Modamani case may have stalled for now, Jun is far from done with Facebook.

In fact, Jun laid the groundwork for the Facebook Law. Over the last two years, he has dedicated a large part of his small firm’s resources to taking Facebook to court for spreading alleged hate speech and fake news online. Jun argues that the social network acts as both perpetrator and participant in its online community. For Facebook to keep its servers running with the full knowledge that hate speech is being disseminated on its platform is a crime in itself, Jun claims.

But pursuing an aggressive, litigious strategy casting Facebook as a participant in the crimes perpetrated on its network has made Jun a lot of enemies. Mention his name to a free-speech purist, like a politician, lobbyist, or well-connected activist in Berlin, and you’ll get a smirk—as well as a respectful nod. Jun has emerged as one of Germany’s foremost advocates of user rights on social media, and one of Facebook’s primary foes in Germany.

In July, I visited Jun’s small offices in Würzburg, a baroque Bavarian city about an hour’s drive from Frankfurt in the heart of Franconian wine country. In a glass-paneled conference room at his firm’s secluded, modern headquarters, tucked among mansions in the city’s residential hills, Jun had queued up a presentation composed of images pulled from the darkest corners of the internet, as well as quotes from Facebook employees and statistics on the company’s attempts to take down offensive content. On one slide, Jun showed how Zuckerberg’s comments and visits to Germany correlated with a heightened removal of illegal posts.

One case that Jun highlighted involved the Green Party politician Renate Künast, who is known for her sympathy for refugees. In December 2016, an asylum seeker named Hussein K. sexually assaulted and murdered a German student in Freiburg, in southwest Germany. (At his trial, in September, he admitted to the murder and attempted rape; he has not yet been sentenced.) Soon after, an image of Künast and a quote attributed to her—“The traumatized young refugee may have killed someone, but we still must help him”—appeared on Facebook, supposedly pulled from a story published by Süddeutsche Zeitung, one of Germany’s largest dailies.

But the quote did not come from Künast. Nor did it appear in the newspaper. It had been fabricated by a Facebook user in Switzerland and spread by a right-wing German group hoping to make her and her party appear soft on refugees. Künast’s staff reported the post to Facebook, but it took three days and dozens of phone calls to its employees in Berlin before the original post was taken down. By then, it had been shared and copied thousands of times. The episode sparked a public backlash against Facebook in Germany. Stefan Plöchinger, the editor of Süddeutsche Zeitung’s website, called the company “democratically disruptive filth”—a view shared by many Germans.

Malicious misinformation “is even more dangerous than hatred or incitement,” Jun said, as he zoomed in on a screenshot of the offending post, magnifying a frowning Künast, her gaze directed away from the camera.

Künast’s staff keeps a record of hateful, fake, and potentially illegal social-media posts about her in a thick black binder. They regularly report attacks against her to law-enforcement authorities and social-media networks, and track what, if anything, is done to remove the offending posts and locate the offenders. The answer is usually nothing—many cases don’t move forward when the offenders are anonymous, or when Facebook ignores communications from prosecutors. In 2016, Künast’s office directed 40 complaints to police officers across Germany against people who published threatening, false, and defamatory posts about her. By the end of the year, they had heard back on only about 12 cases, seven of which had been closed because police searches had yielded no results or faltered due to a lack of evidence. In only four cases could the police identify and locate the offenders, and issue a penalty. This year, Künast has sent 11 cases to law enforcement.

Like Künast, Jun has been assiduously collecting and reporting instances of pro-Nazi language and other forms of hate speech on Facebook since 2015, logging them methodically in a Dropbox folder and Excel spreadsheet that he’s eager to share with inquiring journalists. Images from the internet of beheadings, murders, Nazi salutes, and racist slurs abound. His aim in collecting this wealth of abhorrent material is simple: to underscore the chasm separating Germany’s constitution, which bans the dissemination of material documenting “cruel or otherwise inhuman acts of violence” and incitement to hatred, and the permissive culture of the internet. The result, Jun said, is that Germans—and most Facebook users around the world—are living a kind of double life, subject to two very different legal and moral codes.

“There are laws and values that we were brought up with in the offline community that were not enforced in the online world,” Jun told me. “We have to make a decision: Do we want to be governed by community standards made up by Facebook or by the laws of our constitution?”

For years, German authorities have been trying to use legal means to punish individuals who write, like, or comment on offensive posts, forming a task force in 2015 and partnering with Facebook, Google, and Twitter in 2016 to clamp down on offensive content. A government study found that Twitter deleted just 1 percent of illegal hate speech within 24 hours of its posting, Facebook deleted 39 percent, and YouTube took down 90 percent.

In response to what he saw as Facebook’s negligence on this score, Jun took a different approach, targeting individual managers at the company for perpetuating, or even encouraging, hateful behavior. “If a manager of a company has positive knowledge about a concrete crime, and he doesn’t do anything about it, then he will have personal liability for that crime,” Jun explained. In other words: If he could prove that Facebook employees were aware of hate speech on the network and did not take the posts down, they could be found guilty of a crime in a German court. (The law is usually applied to copyright infringements, and whether it will work in a criminal case against a social-media platform remains an open question.)

To ensure that Facebook employees had indeed been notified of hate speech on their site, Jun began printing out hard copies of examples he found online. Then he sent them, via registered mail, to the private addresses of Facebook’s representatives in Europe—that way, when they still refused to delete the offending posts, he could prove that they had acted in bad faith. Facebook’s employees ignored Jun’s letters, and its lawyers threatened to try to disbar him. They also made fun of his firm’s no-frills website. (The website has since been redesigned.)

In October 2016, Jun filed criminal charges against Facebook’s CEO, Mark Zuckerberg, its chief operating officer, Sheryl Sandberg, and eight other employees with federal prosecutors in Munich. He accused them of willfully refusing to delete illegal content on the platform. The case is currently being investigated by prosecutors, who have yet to press charges. “We will absolutely respect the presumption of innocence,” Anne Leiding, a spokesperson for the Munich district attorney, wrote in an email.

Despite the fact that “the sentiment [in Germany] is that we don’t get it right,” as Eva-Maria Kirschsieper, Facebook’s head of public policy in Germany, put it, the company claims that it abides by German law, and that it has improved efforts to take down illegal posts in a timely manner. “You’re probably aware of our community standards,” Kirschsieper said, explaining that one of the “key drivers” of its internal policy is to keep users safe. But keeping users safe doesn’t mean keeping them happy.

According to Kirschsieper, Facebook does take down content that is deemed illegal in Germany, even if it does not violate its own internal standards. “We think that we are actually doing this … If we felt we weren’t doing a good job, we'd go back to the root cause,” she said. Then, sounding downright American, Kirschsieper added: “Very often, people think something is illegal, but it’s not. Hate speech is hard to put in a box.”

In July, Facebook relaunched a goodwill campaign, dubbed “Make Facebook Your Facebook,” to salvage its image across Germany. Ads posing some of the most basic questions and concerns about the social network popped up on streets and in newspapers: “Can you delete your profile completely?”; “I have no idea who gets to see my posts”; “Private but shared with 500 friends? Not really.” Each was accompanied by friendly explanations in small print: “You can delete your account any time. We’d hate to see you go”; “You can control who sees what. We think your privacy is important”; “Some things just aren’t meant to be seen by everybody.” And while Sandberg toured Germany in September, the company announced new measures to limit who can buy ads on its network.

“We take very seriously our responsibility to earn and maintain the trust of people in businesses,” Sandberg told attendees at a digital-marketing conference in Cologne, echoing the new ad campaign. However, Ulrich Kelber, the state secretary for justice and consumer protection, claimed that Facebook representatives in Brussels told him that the company would respond to the Facebook Law with a year of “radical deletion” and “see how long the Germans can stand it.”

In August, Facebook announced that it would hire 500 more people in Germany to monitor hate speech online. Facebook has already partnered with Arvato, a subsidiary of the Bertelsmann media conglomerate, to hire and train some 700 staff members in Berlin to take down posts that violate community standards. But Arvato’s operations have been shrouded in secrecy. Eva-Maria Kirschsieper, one of the Facebook employees Jun is pursuing criminal charges against, repeatedly declined to say exactly how many people the company had hired to monitor hate speech during a press conference last year.

Süddeutsche Zeitung journalists obtained internal documents detailing the criteria Arvato employees use to monitor and remove posts, and Arvato moderators have spoken to the media about the sometimes oppressive conditions and psychological toll of their work. Reinforcements, though, are apparently on the way. Facebook will hire 3,000 additional moderators by the end of 2018, and may go beyond that, if needed. While Facebook has image-detection software and a new artificial-intelligence system that can understand words in context, Klaus Gorny, the head of Facebook’s corporate communications in Europe, said the technologies are not yet sophisticated enough to be used for German content moderation. “It needs human review,” he said. “As much as we would like this to be solved by AI, it is not at this stage possible.”

“Defining the line [between hate speech and opinion] is a very difficult task,” Kirschsieper said. “We evaluate and reevaluate it on a daily basis—there’s not a universal book where you can look it up. People speak differently today than they did six months ago. The way they express anger, hate, has changed—it’s very much in flux.”

That flexibility has allowed the company to make refugees a “quasi-protected category” in its moderator guidelines, which protects them against “violence and dehumanizing generalizations, but not against calls for exclusion and degrading generalizations that are not dehumanizing,” as ProPublica discovered in a trove of internal Facebook moderation guidelines in June. But the new designation also suggests that groups can be removed or excluded from those protections at any moment, according to Facebook’s whims. If the company’s goal is to create a global set of rules governing online language, as ProPublica concluded, Germany is the first place where that project is being seriously tested, giving the network immense power to moderate speech, but also placing it under strict regulation. And everyone is watching to see how this experiment will go.

Though Renate Künast campaigned assiduously against Facebook after the vehement threats and false posts about her, she and the rest of the Green Party abstained from voting on the final version of the Network Enforcement Act, citing its unclear language and lack of an appeal provision for people whose posts are deleted. The law was rushed through parliament, leaving many politicians, lobbyists, and tech companies worrying that its muddled language could have drastic implications and muzzle free speech online.

“The parliamentary process was a big mess,” Künast said, calling Maas an “irresponsible” and “incapable” leader. (Maas declined to comment for this article.) The text of the legislation also left important questions unanswered—like what qualifies as “manifestly unlawful”—and failed to specify whether content deleted in Germany might still be accessible to users outside the country. All of these issues are likely to emerge as points of contention in Merkel’s ongoing coalition talks.

The execution, revision, and impact of Germany’s Facebook Law will set a meaningful precedent for how nations around the world approach the issue. On its face, it is one of the harshest laws to be adopted by a democracy in order to rein in social-media companies, placing the burden on the companies themselves to ensure faster, better moderation of illegal and fake content. Depending on how giants like Facebook, Google, YouTube, and Twitter respond, the law could result in a safer, more truthful internet, or it could simply lead to a more censored one.

Both Jun and another German lawyer, Joachim Steinhöfel, are taking proactive steps to help ensure that the new law will not lead to a sanitized web. Jun says he has already observed overblocking, and is preparing a test case to prove that users can press charges against Facebook for removing their content without cause. Steinhöfel, an opponent of the Facebook Law, has created an online “Wall of Shame” documenting legal posts that Facebook nevertheless removed, as well as illegal posts that remain online.

“The entire Facebook Law is just a result of Facebook’s stubbornness,” Jun said. “If they had at least pretended to comply with German law, it would never have happened.” For him, the law’s passage was a triumph, an acknowledgement of the fact that Germany needed new laws to deal with social media, and evidence that his protests had been heard. For Steinhöfel, however, the Facebook Law is superfluous because, as he told the BBC, the company is already responsible for content posted on its network.

Companies have until January 1, 2018, to make sure that they have the infrastructure to comply with the law before fines start being levied. For the next few months, Facebook, Twitter, and Instagram will be figuring out what compliance means—how much to delete, how many employees it will take, and whether or not it will also alter what content is viewable outside Germany.

It may be that social-media users are at the outset of “the great deletion,” as Beckedahl warned in a June op-ed in Süddeutsche Zeitung. His concern is that despotic heads of state, such as Turkey’s Recep Tayyip Erdoğan, will adopt their own versions of the Facebook Law to censor the opposition. Over the past five years, 50 countries have passed measures to limit speech on social media, according to The New York Times, but few are as forceful as the German legislation. “It’s collateral damage—that every dictatorship can copy our law,” Beckedahl said. “So we have a problem.”
