On Tuesday, Facebook announced that it would be expanding its ban on QAnon, and has since begun a purge of groups and pages that reference the fringe, far-right conspiracy theory.
The announcement came less than a week after the United States House of Representatives voted in favor of a resolution condemning QAnon that urged Americans to seek “information from authoritative sources and to engage in political debate from a common factual foundation.” The text of the bill referenced some of the worst parts of the QAnon theory, including its undermining of legitimate child safety efforts and anti-Semitism, while also censuring “all other groups and ideologies that encourage people to destroy property and attack law enforcement.”
Though 17 Republicans and one independent voted against the measure, the resolution ultimately passed with widespread bipartisan support. The condemnation on Capitol Hill was seemingly bolstered when Facebook announced that it would amp up its efforts to fight the conspiracy theory, and delete all groups and pages associated with it. But some wonder if Facebook has done enough.
The representative who co-sponsored the QAnon resolution, Rep. Tom Malinowski, says that while Facebook’s move is worthwhile, the company has not addressed the underlying problem that allowed the conspiracy theory to grow. After all, QAnon, which the FBI has deemed a domestic terrorism threat, has already entered the mainstream. Malinowski, a Democrat from New Jersey, has faced death threats from the conspiracy theory’s supporters. Those death threats came in response to Malinowski’s support for the resolution and an advertising campaign released by the National Republican Congressional Committee (which fact-checkers have flagged as false).
There’s disagreement about the best way to handle the social media presence of conspiracy theories and related groups. While Facebook has focused on content moderation, others have said that focusing on removing violating posts risks missing how social media algorithms can promote conspiracy-focused content and polarize people into ideological bubbles.
Now, Malinowski tells Recode that the next step is changing the algorithms that helped QAnon gain a social media following in the first place, even if that means companies like Facebook have to make less money. He’s already working on legislation that would give Congress some power in regulating the algorithms themselves.
A transcript of our interview follows, lightly edited for clarity and length.
Can you start by explaining how you came to work on this resolution and why?
Rep. Tom Malinowski
I’ve been concerned about rising extremism in our country for a while — and QAnon, particularly this year, as we saw evidence that recruitment to this conspiracy-mongering cult was growing exponentially. I thought, among other things that we need to be doing, that it would be helpful to demonstrate overwhelming bipartisan condemnation, given President Trump’s apparent approval, or at least non-disapproval, of QAnon. I thought it would be good to show that both Republicans and Democrats are united against this nonsense.
There are several extremist groups that seem to be gaining more attention in this country. What about QAnon particularly stood out to you?
To be clear, I’m worried about many groups. I’m worried about white supremacist groups that are responsible for most violent terrorist attacks in our country over the last few years. I’m worried about some groups on the far left as well. But most of these other groups are not mass-membership organizations. They have caused extreme acts of violence, but they don’t have millions of people obsessively following them.
I think what’s particularly pernicious about QAnon is that it is immunizing millions of Americans against reality by teaching them not to trust objective sources of information — science and government experts — and to replace all of those things with an assortment of crazy conspiracy theories that have, at their heart, the old anti-Semitic blood libel, which tries to explain everything that happens in the world as a conspiracy of powerful people — many of whom happen to be Jewish — who are trying to control the world and are kidnapping your children.
What’s your reaction to news that Facebook is going to expand its enforcement against QAnon and remove related groups and pages — regardless of whether or not they appear to be promoting violence?
So it’s a good move. I’m happy that they did it. I think it’s particularly powerful coming in the wake of the Congressional condemnation. It’s a good one-two punch. Obviously, I’m interested in how effectively they’re going to implement it. And whether it’s even possible to identify and crack down on all of the evasion tactics that people in the QAnon community are likely going to use, like changing their vocabulary and their memes while delivering the same message.
But I’m glad Facebook did this. My concern is that it’s not enough. I’ve never believed that it’s enough to just take down content and groups. … I think the deeper problem is the design of the social network itself. This is like a farmer who suddenly discovers his fields are overgrown with noxious weeds. So he pulls out all the weeds, which is good, but never stops to consider how they spread in the first place.
So it’s like they’re playing whack-a-mole with extremists without necessarily being willing to change the design of the social network that’s built to elevate extremism. If that remains the case, then this is a very powerful blow against what we now call QAnon, but the phenomenon of disinformation and conspiracy theories and extremism spreading online will continue.
Do you think that there’s a role for Congress or for legislators in regulating the kind of algorithms that these companies are producing?
I am going to try to take that on. We are developing legislation right now that aims to take that challenge on.
What about people who saw this Facebook announcement and just said: this is way too late? It’s not necessarily meaningless, but it’s close to meaningless in the sense that, according to some, this was a problem that Facebook helped to create.
That’s part of what I’m saying, but that’s the farmer analogy, right? You let the weeds grow. You may even have sown the seeds for the weeds, and then you’re ripping them out. I’m not going to criticize them for ripping out the weeds. They are doing now what they should be doing: taking these groups down.
You know, we’re not monitoring this in real time, but there are researchers who do. And what we’ve heard in the last day or so is that Facebook is being very aggressive in taking these groups down, and Gab and Parler are now trying to welcome all the Q adherents being pushed away by Facebook.
That’s a good sign, by the way, that the most avid believers move to sites like that. But I think the “Save the Children soccer moms” are not likely to give up Facebook for an extreme platform. … So again, this is good. I don’t want to disparage it.
But I don’t think it’s enough. It’s still the easier thing for Facebook to do. They can take down millions of pages and groups and posts without changing the design of their social network. And naturally, they don’t want to change the design of an incredibly profitable thing that they created. It may not be possible to really solve this problem without accepting that Facebook has to make less money.
You had the goal of this being a bipartisan effort in working on this resolution. It seems, to a large extent, that it was. But not everyone ended up voting for it. I’m curious what you think of your colleagues who did not vote in favor of this?
I was very satisfied with the final vote. … We were hoping for something overwhelmingly bipartisan, and we got that. Given what’s going on in the Republican Party, I’m not surprised that 17 people couldn’t bring themselves to condemn an anti-Semitic conspiracy-mongering cult. But it’s still a disturbing sign of the times that that’s the case.
In response to this resolution, it seems like you were attacked by QAnon. What are you taking away from that experience and what that has been like for you — as a person but also a member of Congress?
I was attacked, not just in response to the resolution but in response to a Republican television ad that the NRCC has been running in my district now for three weeks. It falsely accuses me of doing exactly what QAnon suspects that powerful government officials are doing. I believe that they were deliberately playing to the paranoia that QAnon promotes, and the Q community obviously noticed, because the Q entity — whatever that is — posted a Republican’s press release about their ad to these millions of QAnon adherents. The result was immediate death threats to my office.
That’s not the sort of thing you want to wake up to. It doesn’t change anything I’m going to do. I think the most disturbing part about it is that the NRCC expressed no shame, and has, in fact, doubled down.
What has this shown you about QAnon? What has it shown you about the extent of this problem?
I guess it shows that QAnon is big enough that less responsible political operatives in the Republican Party see it as an opportunity, not a threat.
Do you have any takeaways from this resolution or from the Facebook ban that we should be thinking about given the upcoming election?
That’s a good question. The Facebook ban is very well timed. Given the risk of misinformation before and, even more important, immediately after the election, anything that tamps down the hysteria out there is particularly needed.
What’s next in fighting this? After the ban, you hinted at coming legislation. Is there anything else that you think we need to address?
My focus is going to be on the social media companies, and in particular, on the way in which their algorithms have elevated and amplified content that allows extremist groups to grow.
The algorithms know what you’re afraid of, and they send more of it your way. They know what you’re angry at, and they send more of it your way. The companies understand that the fear and anger are what keep us glued to our screens. And so it’s in their interest to give us more and more extreme versions of what we’re scared of and angry about. And as long as that continues to happen, we’re gonna have this problem.
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.