Facebook knew calls for violence plagued ‘Groups,’ now plans overhaul

Facebook Inc. in 2019 redesigned its flagship product to center on what it called Groups, forums for like-minded users. Chief Executive Mark Zuckerberg called them the new “heart of the app.”

Now the social-networking giant is clamping down on Groups. The effort began after Facebook’s own research found that American Facebook Groups became a vector for the rabid partisanship and even calls for violence that inflamed the country after the election.

The changes, which Facebook escalated after the Jan. 6 riot at the U.S. Capitol, involve overhauling the mechanics of a product that was meant to be central to its future.

Facebook executives were aware for years that the tools fueling Groups’ rapid growth presented an obstacle to their effort to build healthy online communities, and the company struggled internally over how to contain them.

The company’s data scientists had warned Facebook executives in August that what they called blatant misinformation and calls to violence pervaded the majority of the platform’s top “civic” Groups, according to documents The Wall Street Journal reviewed. Those Groups are generally dedicated to politics and related issues and collectively reach hundreds of millions of users.

The researchers told executives that “enthusiastic calls for violence every day” filled one 58,000-member Group, according to an internal presentation. Another top Group claimed to have been set up by fans of Donald Trump but was actually run by “financially motivated Albanians” directing a million views daily to fake news stories and other provocative content.

Roughly “70% of the top 100 most active US Civic Groups are considered non-recommendable for issues such as hate, misinfo, bullying and harassment,” the presentation concluded. “We need to do something to stop these conversations from happening and growing as quickly as they do,” the researchers wrote, suggesting measures to slow the growth of Groups at least long enough to give Facebook staffers time to address violations.

“Our existing integrity systems,” they wrote, “aren’t addressing these issues.”

In response, Facebook ahead of the election banned some of the most prominent problem Groups and took steps to reduce the growth of others, according to documents and people familiar with its decisions. Still, Facebook viewed the restrictions as temporary and stopped short of imposing measures some of its own researchers had called for, these people said.

In the weeks after the election, many large Groups — including some named in the August presentation — questioned the results of the vote and helped organize the protests that preceded the Jan. 6 riot. After the Capitol riot, Facebook took down more of the Groups and imposed new rules as part of what it called an emergency response.

Facebook has canceled plans to resume recommending civic or health Groups, said Guy Rosen, Facebook’s Vice President of Integrity, a role that oversees the safety of users and discourse on the platform. Facebook will also disable certain tools that researchers argued had facilitated edgy Groups’ rapid growth and require their administrators to devote more effort to reviewing member-created content, he said.

“That helps us because we can then hold them accountable,” Mr. Rosen said, adding that the changes aren’t an admission that previous rules were too loose, but show Facebook adapting to emerging threats: “If you’d have looked at Groups several years ago, you might not have seen the same set of behaviors.”

Facebook, like some other tech giants, has drawn criticism for banning certain content and people, including Mr. Trump. It is also under the close eye of the Biden administration, which has signaled its displeasure with Facebook’s handling of its platforms in the months leading up to the election.

Mr. Zuckerberg said on an earnings call last Wednesday that Facebook’s users are tiring of the hyper-partisanship on the platform. “People don’t want politics and fighting to take over their experience on our services,” he said, adding that Facebook is also considering steps to reduce political content in its News Feed — the stream of baby photos, birthday reminders and rants from distant relatives that greets users when they log in.