
The indisputable harm caused by Facebook

Analysis by Columnist
October 26, 2021 at 12:01 a.m. EDT

You’re reading an excerpt from the Today’s WorldView newsletter.

A decade ago, Facebook could do no wrong. The rising social media company was at the vanguard of America’s embrace of tech positivism. Its leading executives were treated by the media not just as industry trendsetters, but as gurus on everything from the future of work to the new face of feminism.

Mark Zuckerberg, the company’s hoodie-sporting CEO, was made Time magazine’s 2010 Person of the Year. He was dubbed, simply, “the Connector,” a recognition of the vast number of people around the world who had found a voice, and one another, through Facebook.

Earlier this month, however, Time placed Zuckerberg on its cover again, in a way that reflects the profound shift in the zeitgeist since then. “Delete Facebook?” the cover asks.

Facebook and the other apps it owns, including WhatsApp and Instagram, are now increasingly seen through the prism of the harm they appear to cause. They have become major platforms for misinformation, polarization and hate speech. At the same time, Zuckerberg and his colleagues rake in billions of dollars each quarter in profits. The company also keeps growing its user base, which now encompasses nearly half of humanity.

In recent days, The Washington Post began publishing a series of reports based on internal documents from Facebook whistleblower Frances Haugen. The documents were reviewed by a consortium of media outlets and, according to Haugen, disclosed to the U.S. Securities and Exchange Commission.

The so-called “Facebook Papers” include a mix of presentations, research studies, discussion threads and strategy memos. The Post and other media companies obtained partially redacted versions of the papers through Haugen’s counsel.

What the documents reveal about Facebook’s behavior is stark and damning. They show how some of Zuckerberg’s public claims about Facebook’s principles and activities clashed with internal company findings. For example, he once told Congress that Facebook removes 94 percent of the hate speech it finds; internal estimates suggested the company was actually removing less than 5 percent of the hate speech on its platform.

Ahead of the Jan. 6 assault on the Capitol, Facebook’s efforts to stem the misinformation proliferating on its networks fell short. Company employees were unhappy as far-right groups spread the call to join the “Stop the Steal” rally that preceded the attack.

“This is not a new problem,” one unnamed employee fumed on Workplace, an internal message system, on Jan. 6. “We have been watching this behavior from politicians like Trump, and the — at best — wishy washy actions of company leadership, for years now. We have been reading the [farewell] posts from trusted, experienced and loved colleagues who write that they simply cannot conscience working for a company that does not do more to mitigate the negative effects on its platform.”

Outside the United States, Facebook has also failed to rein in misinformation. In one instance, as documented by my colleagues, two employees created a dummy account for a fictional 21-year-old woman living in North India. They wanted to examine what a young woman’s Facebook experience looked like in one of the company’s largest markets.

Soon after the profile was created, the dummy user was encountering posts that included fake news, anti-Muslim hate and jingoistic support for India’s Hindu nationalist Prime Minister Narendra Modi.

“An internal Facebook memo, reviewed by The Washington Post, called the dummy account test an ‘integrity nightmare’ that underscored the vast difference between the experience of Facebook in India and what U.S. users typically encounter,” my colleagues reported, pointing to the real-life episodes of violence provoked by online misinformation in South Asia. “One Facebook worker noted the staggering number of dead bodies.”

Yet Facebook has fallen short in India, partly because of a lack of attention and likely also because of pressure from the Modi government. “Their investment in a country’s democracy is conditional,” Pratik Sinha, co-founder of a leading fact-checking site in India, told my colleagues. “It is beneficial to care about it in the U.S. Banning Trump works for them there. They can’t even ban a small-time guy in India.”

The Facebook Papers also make clear how Zuckerberg prioritized maximum engagement and the company’s bottom line over ethical concerns about safety and best practices. While he espouses a form of free speech maximalism in public in the United States, he has helped enable censorship regimes elsewhere. My colleagues also pointed to a 2019 episode in Vietnam, where Zuckerberg personally decided to comply with demands from the autocratic government in Hanoi to censor dissident voices on his platform.

“Ahead of Vietnam’s party congress in January, Facebook significantly increased censorship of ‘anti-state’ posts, giving the government near-total control over the platform, according to local activists and free speech advocates,” my colleagues reported.

Zuckerberg and his colleagues have cast the spate of negative coverage as orchestrated by detractors and misrepresentative of the company’s work. “My view is that what we are seeing is a coordinated effort to selectively use leaked documents to paint a false picture of our company,” he said on an earnings call Monday.

“We have no commercial or moral incentive to do anything other than give the maximum number of people as much of a positive experience as possible,” Facebook spokeswoman Dani Lever said. “Like every platform, we are constantly making difficult decisions between free expression and harmful speech, security and other issues, and we don’t make these decisions in a vacuum.”

But the bare reality of what Facebook has unleashed is increasingly available for all to see — and recognized internally by many of its employees.

Haugen, the whistleblower, appeared Monday at a parliamentary hearing in Britain. She confirmed that she had seen a lot of internal research showing that Facebook “fans hate” because of the way its algorithm works. “Bad actors have an incentive to play the algorithm,” she said. “The current system is biased toward bad actors and people who push people to the extremes.”

The Facebook Papers “are astonishing for two reasons,” wrote the Atlantic’s Adrienne LaFrance. “First, because their sheer volume is unbelievable. And second, because these documents leave little room for doubt about Facebook’s crucial role in advancing the cause of authoritarianism in America and around the world. Authoritarianism predates the rise of Facebook, of course. But Facebook makes it much easier for authoritarians to win.”
