Facebook hits pause on algorithmic recommendations for political and social issue groups

With just days to go before the U.S. election, Facebook quietly suspended one of its most worrisome features.

During Wednesday’s Senate hearing, Senator Ed Markey asked Facebook CEO Mark Zuckerberg about reports that his company has long known its group recommendations push people toward more extreme content. Zuckerberg responded that the company had actually disabled that feature for certain groups — a fact Facebook had not previously announced.

“Senator, we have taken the step of stopping recommendations in groups for all political content or social issue groups as a precaution for this,” Zuckerberg told Markey.

TechCrunch reached out to Facebook at the time with questions about which kinds of groups would be affected and how long the recommendations would be suspended, but did not receive an immediate response. Facebook first confirmed the change to BuzzFeed News on Friday.

“This is a measure we put in place in the lead up to Election Day,” Facebook spokesperson Liz Bourgeois told TechCrunch in an email. “We will assess when to lift them afterwards, but they are temporary.”

The cautionary step will disable recommendations for political and social issue groups, as well as for any new groups created during that window. Facebook declined to provide additional details about the kinds of groups that will and won’t be affected by the change or what went into the decision.

Researchers who focus on extremism have long been concerned that algorithmic recommendations on social networks push people toward more extreme content. Facebook has been aware of this phenomenon since at least 2016, when an internal presentation on extremism in Germany observed that “64% of all extremist group joins are due to our recommendation tools.”

In Facebook’s case, recommendations can usher users with extreme views and violent ideas into social groups where they can organize and amplify dangerous ideologies. Before being banned by the social network, the violent far-right group the Proud Boys relied on Facebook groups for its relatively sophisticated national recruitment operation. Members of the group that plotted to kidnap Michigan Governor Gretchen Whitmer also used Facebook Groups to organize, according to an FBI affidavit.

While it sounds like Facebook’s decision to toggle some group recommendations off is temporary, the company has made an unprecedented flurry of choices to limit dangerous content in recent months, possibly in fear that the 2020 election will again plunge it into political controversy. Over the last three months alone, Facebook has cracked down on QAnon, militias, and language used by the Trump campaign that could result in voter intimidation — all surprising postures considering its longstanding inaction and deep fear of decisions that could be perceived as partisan.

After years of relative inaction, the company now appears to be taking some of the extremism it has long incubated seriously, though the coming days are likely to put its new set of protective policies to the test.
