QAnon, the conspiracy theory that claims President Trump is secretly battling a Hollywood-Jewish-Democrat-deep state-globalist cabal of Satanist-murderer-pedophile-human traffickers, is huge. In both the span of its reach and the depth of its ideas, the conspiracy has grown into a juggernaut of misinformation. (“We call it a superconspiracy,” says Antonis Papasavva, a data scientist at University College London. “Name any conspiracy theory—JFK, MK Ultra, Pizzagate—it’s in there.”) This week, Facebook vowed to remove any pages, groups, or Instagram accounts that represent QAnon, which has gobbled up loads of engagement on the platform thanks to its something-for-everybody theories. Up until two months ago, Facebook didn’t really have any policies when it came to QAnon, and Tuesday’s ban marked a sharp escalation. Sharp, but also perhaps too late.
In case you are (blissfully) unaware, QAnon was born on the internet. Its prophet, Q, amassed followers by posting cryptic messages on 8kun, a message board popular with extremists, but the conspiracy theory has since seeped into every mainstream social media platform. Unlike a lot of conspiracy-minded internet subcultures, QAnon has had no trouble moving offline. At first, it was just T-shirts and mysterious billboards. Now QAnon has allegedly inspired criminal acts including murder and terrorism, been endorsed by multiple Republican congressional candidates, and had its followers praised as patriots by President Trump.
In August, after years of activists calling for Facebook to take a stronger stance against QAnon—which has promoted violence, anti-Semitism, racism, and Covid-19 misinformation on the platform—Facebook took a step forward. They announced that they would be restricting QAnon content by removing it from recommendation algorithms and taking down pages and accounts that discussed real-world violence. According to Facebook, the August crackdown led to the removal of more than 1,500 Facebook groups and pages, but QAnon has continued to flourish. Experts think it’ll go on flourishing, ban or no ban.
Facebook’s QAnon ban also has a cavernous loophole: It only targets entities that “represent” QAnon. “If I designate myself Queen of QAnon today, does that mean I’ll be removed?” asks Joan Donovan, research director at Harvard’s Shorenstein Center, where she studies online extremism. “I can’t see a world in which anyone is considered a representative of a conspiracy theory other than Q.” According to Facebook, QAnon “representatives” would have the word QAnon in their handle and bio or title and About section, and share QAnon posts at a rate that crosses a threshold the company isn’t divulging. Deciding who and what checks those boxes will be left to Facebook’s Dangerous Organizations Operations team, which handles terrorists and hate groups. “It’s content moderation by press release,” says Donovan. The announcement is strong, but it’s unclear how wide-ranging or enforceable the new policy really is.
If you think that it would now be pretty easy to camouflage an abiding QAnon passion as a passing or even accidental interest by changing some words in your bio, you’d be correct. Plus, extremist groups are experts at going underground to escape public scrutiny. “I’m skeptical that this ban will have any impact in the long run,” says Phyllis Gerstenfeld, who studies online extremism and criminology at Cal State University Stanislaus. “Extremists find new ways to repackage themselves.” QAnon adherents already demonstrated their ability to do this when they hijacked the hashtags #SavetheChildren and #SaveOurChildren and used them to reach new audiences who would never have joined a QAnon group, but do care about kids.
As for hiding the actual QAnon label, that’s happening right now, this very minute. Even before the ban, QAnon groups were discussing alternate ways of identifying themselves to avoid detection and moderation. Tech-censorship doomsday strategizing is common to all online extremist groups, both because they constantly break terms of service and because it suits their paranoid worldview. In this case, people actually had orders to do so from on high: the user identifying themselves as Q told them to “Drop all references re: ‘Q’ ‘Qanon’ etc. to avoid ban/termination.” Some groups have been using “17” as a replacement callsign, but it will be something new by morning.
They don’t even need to rush. Facebook’s announcement says it will take “days and weeks” to remove all QAnon content. According to right-wing watchdog Media Matters, several QAnon groups on Facebook are using this lull to coordinate moves to alternate platforms like MeWe and Parler. “Gab sent out an email explicitly recruiting QAnon people who had been banned,” says Jeremy Blackburn, a computer scientist at Binghamton University who studies online extremism.
Such migrations are common. In Blackburn’s research on banned Reddit communities, he found that some members of banned groups will be discouraged from future participation, but many others tend to follow the group onto a new, usually more permissive and sequestered area of the internet. On these echoey alternative platforms, people tend to radicalize each other and the group may grow only more extreme. “It helps Facebook, but it doesn’t help the web or society at large because it moves the problem along,” says Richard Rogers, who studies digital culture and deplatforming at the University of Amsterdam. “The conspiracy can continue on Facebook and the mainstream [internet] more generally through relying on links to ‘Q Research’ websites. The benefit for Facebook is that QAnon is less obviously visible on its platform.”
Deplatforming is the go-to strategy for dealing with bad actors for a reason: It’s the easiest mechanism to limit how many people see extremist content on a platform that otherwise rewards sensationalism, and it cuts influencers off from their revenue streams. It seems to work on a small scale, at least. Milo Yiannopoulos, onetime alt-right darling, is often held up as the poster boy for the strategy. Since being deplatformed, he has all but disappeared from headlines, but his fall from notoriety is also a demonstration of the strategy’s limitations. Around the same time he got kicked off mainstream social media, he was also canceled for making pedophilic comments, and that’s when his support really disappeared. “It shows that deplatforming is more than a technical question,” Donovan says. “The community that this person thrives in also has to disagree with their statements and their politics for it to work.” Among its followers, QAnon is anything but canceled.
When you start asking questions about how effective any Facebook ban would be in curbing the spread of an extremist ideology, you start getting a lot of coulds, maybes, and probablies. Compared to social media sites like Reddit and Twitter, Facebook is miserly with its data. “The data on any of these influence operations and network factions are firmly held within Facebook. We have to over-rely on their assessment of risk and size to have these conversations at all,” Donovan says. To investigate the impact of Facebook policies on fringe groups, Blackburn has had to comb through other platforms’ data around strategic dates to see if people migrated after being banned. “If Facebook made the entirety of their data available to scientists, it would make our job a lot easier.” Facebook is asking the public to trust it to handle the problem, but it’s not clear what the scope of the problem is or how effective the company’s strategies have been at addressing similar issues in the past.
Also, Facebook is just one platform. Even if its ban works perfectly, it’ll be hard-pressed to stop QAnon completely. At this point, maybe nothing can. “Aside from building a time machine? Marty McFly, is he available?” Donovan says. It’s easy to look back now and say that had social media platforms been a little more proactive in banning the groups, or had they taken stronger stances on QAnon activities like the harassment of public figures, maybe the conspiracy theory wouldn’t have grown to the presidentially endorsed proportions it has now. “[QAnon has] cracked through to the mainstream, like conspiracy theories about the Moon landing,” Donovan says. “It’s probably here to stay, at least in the stories people tell themselves about the deep state and corruption in government.” You can’t undo Q, but better policies could mitigate what QAnon will do in the future.