Facebook Papers: All you need to know about the documents that reveal the worst about Facebook

Published Oct 26, 2021


If you’re a little lost about what the so-called Facebook Papers are, some context: The trove of documents was shared with and reviewed by the Securities and Exchange Commission (SEC) and the Wall Street Journal after Frances Haugen, who was publicly revealed to be the Facebook whistle-blower, leaked the deluge of materials; redacted versions were then given to Congress through Haugen’s legal team.

Those same documents were later passed on to more than three dozen news outlets, including The Washington Post, The New York Times and The Atlantic, which have spent the past few weeks combing through thousands of pages of internal company information, including chats and studies conducted by Facebook itself.

Here’s a detailed analysis of some key points:

- Zuckerberg's public claims often conflict with internal research

Haugen references Zuckerberg's public statements at least 20 times in her SEC complaints, asserting that the CEO's unique degree of control over Facebook forces him to bear ultimate responsibility for a litany of societal harms caused by the company's relentless pursuit of growth.

The documents also show that Zuckerberg's public statements are often at odds with internal company findings.

For example, Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds. But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook.

Facebook spokesperson Dani Lever denied that Zuckerberg "makes decisions that cause harm" and dismissed the findings, saying they are "based on selected documents that are mischaracterised and devoid of any context."

- Facebook dropped its guard before the January 6 insurrection

During the run-up to the 2020 US presidential election, the social media giant dialled up efforts to police content that promoted violence, misinformation and hate speech. But after November 6, Facebook rolled back many of the dozens of measures aimed at safeguarding US users. A ban on the main Stop the Steal group didn't apply to the dozens of look-alike groups that popped up in what the company later concluded was a "coordinated" campaign, documents show.

By the time Facebook tried to reimpose its "break the glass" measures, it was too late: A pro-Trump mob was storming the US Capitol.

Facebook officials said they planned exhaustively for the election and its aftermath, anticipated the potential for post-election violence, and always expected the challenges to last through the inauguration of President Biden on January 20.

- Facebook fails to effectively police content in much of the world

For all of Facebook's troubles in North America, its problems with hate speech and misinformation are dramatically worse in the developing world. Documents show that Facebook has meticulously studied its approach abroad and is well aware that weaker moderation in non-English-speaking countries leaves the platform vulnerable to abuse by bad actors and authoritarian regimes. “People in Africa are getting a raw version of Facebook which is more dangerous,” Frances Haugen said in her address to UK Members of Parliament.

According to one 2020 summary in the documents, the vast majority of Facebook's efforts against misinformation - 84 percent - went toward the United States, with just 16 percent going to the "Rest of World," including India, France and Italy.

Though Facebook considers India a top priority, activating large teams to engage with civil society groups and protect elections, the documents show that Indian users experience Facebook without critical guardrails common in English-speaking countries.

Facebook's Lever said the company has made "progress," with "global teams with native speakers reviewing content in over 70 languages along with experts in humanitarian and human rights issues."

"We've hired more people with language, country and topic expertise," Lever said, adding that Facebook has "also increased the number of team members with work experience in Myanmar and Ethiopia to include former humanitarian aid workers, crisis responders and policy specialists."

- Facebook chooses maximum engagement over user safety

Zuckerberg has said the company does not design its products to persuade people to spend more time on them. But dozens of documents suggest the opposite.

The company exhaustively studies potential policy changes for their effects on user engagement and other factors key to corporate profits. Amid this push for user attention, Facebook abandoned or delayed initiatives to reduce misinformation and radicalisation.

One 2019 report tracking a dummy account set up to represent a conservative mother in North Carolina found that Facebook's recommendation algorithms led her to QAnon, an extremist ideology that the FBI has deemed a domestic terrorism threat, in just five days. Still, Facebook allowed QAnon to operate on its site largely unchecked for another 13 months.

"We have no commercial or moral incentive to do anything other than give the maximum number of people as much of a positive experience as possible," Facebook's Lever said, adding that the company is "constantly making difficult decisions."
