Please, Please, Please Don't Mock Conspiracy Theories

People have a lot of bizarre notions about Covid-19 and the 2020 elections—but if you have to laugh, just do it in private.

The 2020 election is a hurricane season of alleged conspiracies. Theory after theory has churned from the Democratic debates, the Iowa caucuses, and the Twitter goings-on of candidates’ supporters or alleged supporters. The recent theory that a Nigerian Twitter fan account for Pete Buttigieg was actually a sock puppet run by campaign staffer Lis Smith—it wasn’t—illustrates how quickly and easily conspiracy theories spread; the story was trending on Twitter mere hours after the initial allegation.

This case also illustrates the downstream consequences of laughter. Just as quickly as the theory emerged, a storm surge of snarky denials, Nigerian email scam jokes, and other can-you-believe-these-idiots eye rolls flooded social media.

For nonbelievers, “See conspiracy theory; mock” has become an ingrained response online—and we need to break the habit. The closer we get to the general election, the more conspiracy theories we’re bound to see. Making fun of them, however tempting it might be, is the worst possible reaction.

This is not a civility argument. For one thing, civility arguments are mostly bullshit rhetorical sleights-of-hand designed to shift focus from the substance of a critique to its tone. Nor is it a claim that false information is cool and fun and we should just shrug and say nothing, since who needs truth, Mike Bloomberg makes the best memes, and everything is terrible. Rather, it’s a warning that jokes about conspiracy theories are strategically ineffective. In fact, they’re likely to backfire and dump even more refuse into the already toxic political waters.

One problem is that making fun of something spreads that thing just as quickly as sharing it sincerely would. (See: Covid-19 conspiracies.) The theory in question might be a bizarro-world fever dream. The person who shares that fever dream to mock it could have the best intentions; they might assume that showing the absurdity of the theory will help prevent others from taking it seriously. But as a previous article in this series explained, the information ecology doesn’t give a shit why anyone does what they do online. Sharing is sharing is sharing.

In the case of the Nigerian Pete supporter, every time folks made snarky comments or retweeted jokes about the theory—whether they were Pete supporters or Pete anti-stans—those messages pinged to new audiences, spreading polluted information in unpredictable directions. The more responses there were, the more reason journalists had to cover the story. Jokes about the theory, even journalists’ own jokes, quickly became part of that coverage.

Another problem with mocking conspiracy theories is that it plays right into the conspiratorial mindset. Contrary to the prevalent assumption that only MAGA types believe in conspiracy theories, conspiratorial thinking—in which dots are connected, questions are asked, and official explanations are challenged—isn’t unique to any single demographic. Instead, as historian Kathryn Olmsted emphasizes, conspiracy theories have long existed on the right and the left, within white and black communities, and among those with little power and those with extraordinary power. Some are off-the-wall bonkers, and some are grounded in historical precedent. Some are morally repugnant, and some are understandable. Some even turn out to be true.

Although they’re large and contain multitudes, conspiracy theories across the spectrum share two things. First, they emerge from what Ryan Milner and I describe in our book as “deep memetic frames.” This concept draws from deep stories as described by Arlie Hochschild, which reflect the affective paradigms we feel our way into; frames as described by George Lakoff, which function as the “metaphors we live by”; and memetic logics as described by Milner, which guide the reciprocal process of social sharing. Deep memetic frames are what we believe in our bones to be true about the world. They emerge from our education, our experiences, and how we’re culturally conditioned to interpret information. Some are tethered to empirical reality, some less so, and some not at all; regardless, they shape how we see and what we know (or think we know) so completely that we probably don’t even notice them. Whether we’re aware of them or not, our frames take information and turn it into evidence.

The second shared characteristic of conspiracy theories is a preoccupation with “them.” They (whoever “they” might be) stand for everything that we (whoever “we” might be) despise. They are actively trying to dismantle our way of life. Sometimes, they are believed to have already infiltrated society, or at least the part of society that we care most about—these conspiracy theories are known as subversion myths, or evil internal-enemy myths. For the white Protestant majority in the US, nonwhite, non-Protestant immigrants have historically filled the role of “them”; this is the foundation of the subversion myth known as "Make America Great Again." Needless to say, the precise identity of them, the dangers they pose to us, and how we must respond are all dictated by our deep memetic frames.

The internal coherence of these frames makes them deeply resistant to fact checks. Milner and I explore this difficulty in our chapter about the satanic panics of the 1980s and '90s, as well as their Trump-era reboot. The harder you try to disprove theories about satanists or the Deep State to people already convinced that you’re in on the conspiracy—or are sympathetic to the evil them—the more likely believers are to respond to your debunking with obstinacy. Beyond that, they might reframe your “evidence” as proof that they’ve been right all along.

It’s not easy to reason with a conspiracy-theory believer, let alone hordes of them online. Audiences are constantly shifting, and it’s tricky to sort true conspiracy believers from pot-stirrers. But under the right conditions, it’s possible to figure out—at least generally—what deep memetic frame a given person is standing behind. From there you can aim your debunking at a target, like shooting a water gun through a hole in a fence. There’s no guarantee the person will be convinced by your correction, but at least the message is going to land where they can see it. Lobbing jokes about the theory, in contrast, is like throwing a bucket of water at the same fence. You might make an impression on passersby, but otherwise all you’ll have is splash back.

In the case of the Nigerian Twitter account conspiracy theory, snarky retorts from Buttigieg’s campaign were a bucket of water flung at the fence. In place of targeted messaging, the campaign’s jokey responses mocked believers’ conspiratorial tone while sidestepping the substance of their critique—notably Buttigieg’s less-than-stellar record on race, his campaign’s flirtation with sock puppetry, and the prior use of a stock photo of a Kenyan woman to tout the candidate’s racial justice bona fides.

Things died down after the Nigerian man operating the pro-Pete Twitter account came forward to insist that he was in fact a real Buttigieg supporter. What failed to quell the controversy was the Buttigieg campaign’s—and its supporters’—“nothing to see here, dumbasses” responses. Rather than being conspiracy-theory busters, these responses were evidence generators. At the very least, they raised the possibility (for those with certain deep memetic frames) that something fishy was going on. Nothing generates splash back faster than a joke.

For those who seek to sow chaos and confusion, splash back is a gift from the disinformation gods. With everybody snarling and snarking and throwing buckets of gray water every which way, it’s difficult to keep track of whose fence is whose—and even harder to know what a targeted message might look like. Such conditions could not be more perfect for bad actors to arrive, collect the runoff, and use it to spread even more pollution: to amplify this or that conspiracy theory, stoke this or that tension, or impersonate this or that candidate’s supporters to maximize ill will.

Mocking conspiracy theories and theorists might feel justified. It might be fun. But its benefits simply don’t match its costs. The call to stop amplifying disinformation agents is probably intuitive. The call to stop ridiculing true believers is probably less so. Conspiracy theorists have feelings, as well as reasons for believing what they believe. That’s good to remember regardless of whom you’re talking to. Protecting conspiracy theorists’ feelings isn’t the point, though; the point is to protect the environment.

Jokes about conspiracy theories are hazardous to the environment, a fact that underlines, once again, a simple, vital rule about the internet: Even when you don’t mean to, you can still fling filth all over the street.

