Facebook Amps Up Its Crackdown on QAnon
Facebook, facing criticism that it has not done enough to curb a fast-growing, fringe conspiracy movement, said on Tuesday that it would remove any group, page or Instagram account that openly identified with QAnon.
The change sharply hardens earlier policies outlined by the social media company. In August, Facebook unveiled its first attempt to limit the spread of QAnon, establishing policies that barred QAnon groups that called for violence.
But hundreds of other QAnon groups and pages continued to spread on the platform, and the effort was considered a disappointment in many circles, including among Facebook employees.
On Tuesday, Facebook acknowledged that its earlier policies had not gone far enough in addressing the popularity of the far-right conspiracy movement.
“We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update,” Facebook said in a public post.
Since Facebook’s initial ban, QAnon followers have found ways to evade the rules. The movement dates back to October 2017 but has experienced its largest increase in followers since the start of the pandemic.
At its core, QAnon is a sprawling movement that believes, falsely, that the world is run by a cabal of Satan-worshiping pedophiles who are plotting against President Trump. It has branched into a number of other conspiracies, including casting doubt on medical advice for dealing with the pandemic, like wearing masks.
On Facebook, QAnon has attracted new followers by adopting tactics such as renaming groups and toning down the messaging to make it seem less jarring. A campaign by QAnon to co-opt health and wellness groups as well as discussions about child safety drew thousands of new people into its conspiracies in recent months.
Researchers who study the group said that QAnon’s shifting tactics had initially helped it skirt Facebook’s new rules, but that the policies announced on Tuesday were likely to tighten the screws on the conspiracists.
“Facebook has been instrumental in the growth of QAnon. I’m surprised it has taken the company this long to take this type of action,” said Travis View, a host of “QAnon Anonymous,” a podcast that seeks to explain the movement.
Since QAnon has become a key source of misinformation on a number of topics, Mr. View said, the action announced by Facebook is likely to have a far-reaching impact in “slowing the spread of misinformation on Facebook and more generally across social media.”
Nearly 100 Facebook groups and pages, some with tens of thousands of followers, have already been affected by the changes, according to a survey conducted by The New York Times using CrowdTangle, a Facebook-owned analytics tool.
Facebook said that it had begun to enforce the changes on Tuesday, and that it would take a more proactive approach to finding and removing QAnon content, rather than relying on people to report content.