Facebook Inc. introduced new penalties for interest-based forums called Groups that are flagged for violating its community standards, as it aims to rein in a product that played a high-profile role in the protests that led up to the Capitol riot.
The company has faced criticism both for not doing enough to police discourse on its platform and for censoring users. The changes to Groups come after Facebook’s own researchers found that the company’s oversight of the product was weak.
Facebook recently concluded that some of its largest civic-based Groups were toxic and was alarmed by their growth, according to internal documents reviewed by The Wall Street Journal.
“70% of the top 100 most active US Civic Groups are considered non-recommendable for issues such as hate, misinfo, bullying and harassment,” one presentation stated. It found that top Groups functioned less as communities than as megaphones for partisan publishers and purveyors of “hate bait,” racially and politically charged content meant to elicit calls for violence.
Subsequent reporting by the Washington Post revealed internal Facebook research showing that a small fraction of Facebook’s user base was producing more than half of the content casting doubt on the safety and efficacy of the coronavirus vaccines, which medical authorities widely consider safe.
“Groups and members that violate our rules should have reduced privileges and reach,” Facebook Vice President of Engineering Tom Alison said in a blog post Wednesday.
Groups are often private but can attract millions of users. Facebook has heavily promoted the product, and more than half of the platform’s 2.8 billion monthly active users belong to at least five Groups.
The penalties formalize some of the restrictions placed on the product in the wake of the Jan. 6 attack.
Facebook will impose escalating penalties on Groups that accrue strikes for breaking platform rules against misinformation, hate speech and other types of content deemed harmful by the platform.
Users will be shown a warning before they join groups with a recent history of “strikes” for bad behavior, and administrators of such problem groups will be required to approve posts before group members can see them. Facebook will also restrict members of such groups from inviting their friends to join them and block administrators from forming new groups if their current ones have been flagged for bad behavior. Content from such groups will also be featured less prominently in members’ newsfeeds.
The company will also stop recommending political and health-related groups to users outside the U.S., extending a restriction that currently applies only to U.S. users.
“We’re trying to look at instances where administrators are trying to create unhealthy groups,” Mr. Alison said.
The changes come after internal Facebook research found that Facebook Groups had been widely used to promote election-related conspiracy theories such as the “Stop the Steal” effort that preceded the Capitol riot.
Other social networks have been cracking down on misinformation on their platforms. Twitter Inc. has introduced a strike system that penalizes users who post misleading information on topics such as the Covid-19 vaccine and election integrity.
Write to Jeff Horwitz at Jeff.Horwitz@wsj.com
Copyright ©2020 Dow Jones & Company, Inc. All Rights Reserved.