How I Won a Disinformation Battle — But Lost the War
A firsthand lesson in how and why disinformation spreads, why it's so sticky in the minds of deluded believers, and how to fight back.
Thank you for reading The Garden of Forking Paths. This article is free, but I rely exclusively on paid subscriptions to be able to keep writing for you, so please consider upgrading to support my work if you’ve enjoyed this newsletter.
Let me tell you a story that has profoundly shaped the way I think about disinformation. It’s a firsthand experience in which I learned the hard way what not to do when staring down a sophisticated disinformation operation.
It taught me lessons that are more important than ever before, as American politics enters a dangerous new phase during the Trump prosecutions—in which the stakes are ever higher, the risks of violence ever greater, and the lies ever crazier.
It involves a showdown between me and a strange British right-wing provocateur on the airwaves, in which I mopped the floor with him on the facts, but he beat me on what really mattered: the disinformation narrative.
Here’s what happened…
“I want you to do me a favor”
In late 2019, news broke that Donald Trump had tried to extort Ukraine’s president, Volodymyr Zelenskyy, in a disturbing quid pro quo deal. The gist of it was this: Trump effectively said “I’ll give you weapons if you give me dirt on Joe Biden.” The scandal culminated in Trump’s first impeachment.
The Trump transcript was so damning that the Republican strategy was, as it often is these days, to deflect with whataboutism. Sure enough, right-wing disinformation machines fired into gear and belched out a lie that had just enough of a veneer of truth to it that it sounded plausible to the uninformed.
The lie wasn’t just untrue. It was an inversion of the truth.
Here’s what happened: in 2015, Ukraine got a new Prosecutor General named Viktor Shokin. Shokin was tasked with, among other things, investigating alleged corruption at Burisma, a Ukrainian gas company whose board Hunter Biden had joined (which was, let’s be honest, a serious error of judgment on Hunter Biden’s part).
But there was a problem. Shokin was, himself, corrupt. He wasn’t doing his job—and wasn’t cracking down on corruption. In fact, he wasn’t investigating Burisma aggressively enough.
As a result, a chorus of voices called for Shokin to be replaced. These voices came not just from the Obama administration, but also from the European Union, the International Monetary Fund, the World Bank, and a bipartisan group of US Senators—including several senior Republicans, who sent a letter urging Shokin to be replaced. Everyone agreed: Shokin needed to go.
But when Trump got into hot water, the right-wing narrative shifted. They rewrote history: Shokin, they claimed, had been fired to protect Hunter Biden. The opposite was true: he was fired because he wasn’t investigating Burisma aggressively enough.
I knew the lies, I knew how bogus they were, and I prepared to face them down live on the airwaves.
"What’s his name?”
Over the years, I’ve done hundreds and hundreds of TV and radio interviews. But this one haunts me, because I screwed it up.
In late 2019, I got invited to discuss the Trump/Ukraine scandal on BBC radio with Jeremy Vine. For those who don’t know Vine, he’s a big deal in British media — and his radio show has an enormous audience. I assumed it would be, as it often is, what’s known as “an expert interview” — in which I’m asked questions in my capacity as a professor who understands US politics and tries to honestly represent what’s going on across the pond.
Instead, at the last minute, I was informed that it would be an adversarial debate, and I would be squaring off against a colorful right-wing firebrand named André Walker.
If you’ve never come across Walker, well, he’s a memorable character. Perhaps the best way to introduce him is with a screenshot from a newspaper article written about him in 2017, in which he “put a bounty on his own head” by proclaiming that he would give an ISIS terrorist £50,000 if they successfully killed him. “I’ve got a sword, good luck,” he warned. “Come on lads, have a crack.”
(No, I’m afraid I’m not making this up. Here he is…brandishing the sword):
This was the man the BBC had decided I should debate about President Trump’s scandal in Ukraine. I wasn’t exactly happy about it, but I figured that if I pulled out after the bait and switch, listeners would just hear from André, which would be worse.
I had a hunch, though.
“This guy,” I thought to myself, “isn’t going to know anything about the details of this case, so he’s just going to parrot the right-wing lies that he’s probably read about from various MAGA talking heads on Twitter.” So, I came up with a strategy, and sat quietly as Vine previewed the Trump scandal, the quid pro quo transcript with Zelenskyy, and then introduced us live on-air.
We had about five minutes for the segment. I prepared to spring my trap.
Sure enough, Walker immediately did exactly what I expected him to do. He rattled off a series of untrue statements, saying that the real scandal was about Joe Biden. Walker claimed that Joe Biden should be investigated, not Trump, because Biden had — in his telling — tried to get a vigilant, honest Ukrainian prosecutor fired to protect Biden’s son.
I pounced. “You don’t know what you’re talking about, André,” I said. “Tell me this: What’s the name of the Ukrainian prosecutor?”
I win the battle
As I suspected, Walker didn’t have a clue. Trying to buy time with vocal gymnastics, he sputtered, pulled out his phone, unlocked it, and frantically typed something into Google, hoping to be rescued by quick fingers.
“It might be worth pointing out,” I helpfully noted into my microphone, “that André is currently trying to Google the prosecutor’s name. And the reason he has to do that is because everything he told you is a lie. The former Prosecutor General is named Viktor Shokin, and he was fired because he was corrupt — a fact that everyone agreed on, from the World Bank to the IMF to the European Union, heck, even Republican Senators wanted him gone.”[1]
Walker soon regained his composure, engaged in more whataboutism, lobbed further allegations about Biden, and each time, I debunked them.
But then, something startling happened.
“Well, thanks to both of you for coming on,” Vine chimed in. “I’m sure we’ll have much more to say about this story as it develops. And now onto the weather…”
That was it. The five minutes were up. It had passed by in a flash. We walked out of the studio.
Walker, to his great credit, took it in stride. “You bastard!” he said, with a good-natured laugh. “You got me.” We shook hands and said goodbye.
But lost the war
When you get into a heated debate live on the airwaves, with hundreds of thousands of people listening, there’s inevitably a little infusion of adrenaline into the bloodstream. So, as I left the BBC studios, I felt amped up. My plan had worked!
(Okay, I was a little bit of a jerk to put him on the spot like that; sorry André).
But as I walked down the streets of London, I started to realize that mine was a pyrrhic victory.
We talked for five minutes and said “Biden” dozens of times. We said “Trump” never.
I realized that, in setting a gotcha trap for Walker, I’d accidentally ensnared myself in one that was far worse: the disinformation trap.
The entire debate took place on the turf of lies rather than the turf of truth.
This was a story about a Trump scandal so egregious that it would lead to him getting impeached—and Walker had gotten me to spend the entire time debunking his false claims rather than informing the listeners about what was both important and true.
Anyone listening would have thought I knew my stuff and he didn’t. They would have thought that I had clobbered him in the debate. But they also would have had a lie planted in their mind: that this quid pro quo scandal wasn’t really about Donald Trump, but maybe, just maybe, there was something going on with Joe Biden.
Why disinformation works
I learned my lesson. And I’m pleased to note that when I had repeat showdowns with these same lies, I did something different. After the false claim was raised, I simply said something like this:
“Everything you’ve just heard was a lie. If you Google it, you’ll see countless journalists and fact checkers who have debunked those lies. But I’m not going to waste our time talking about lies, so let me tell you the truth about what really happened with Donald Trump and Ukraine…”
So, here are the broader lessons I’ve learned about disinformation, both from my experience, and from studying it.
Lesson 1: Simple narratives can seduce the storytelling animal
Jonathan Gottschall is right when he says that humans are, above all, “the storytelling animal.” We make sense of the world through narratives — and study after study shows that we more easily process and retain information when it’s part of a straightforward “A to B” story that makes intuitive sense.
In some early studies on disinformation in the 1980s, researchers ran experiments in which participants were told that a fire had been started and that investigators had found a bunch of flammable items, like oil and paint cans, in a closet. Then, after giving participants that information, the researchers corrected it: actually, they said, the closet had been empty all along. No paint cans, no oil cans.
When the participants were later asked about what had happened, the overwhelming majority clung to the story that made sense — the flammable cans in the closet — even though they had been updated and told that the initial version of events was untrue. The first neat and tidy explanation made sense, so it remained lodged in the minds of those who had heard it.
Here’s the problem: the real world isn’t a series of A to B stories. It’s more complex. Consider how much information I had to give you to explain what actually happened with Viktor Shokin. Now, consider the lie: “Joe Biden fired a prosecutor to protect his son because his son’s company was under investigation by the prosecutor.”
The lie is a better story. It’s got a clear motive. There’s nothing in the weeds. What was complex was made seductively simple. And that leads us to our second lesson…
Lesson 2: Lies are “sticky”
“Flammable paint cans were the cause of the raging fire” is an easily understood story. When you correct false information and instead say “we don’t know the source of the fire,” or explain the source of the fire in technical terms that don’t make sense to the average person, then the false information is more likely to linger on.
This is what researchers refer to as “stickiness,” in which it’s difficult to get rid of lies once they’re believed in the first place. A great example is given by Elitsa Dermendzhiyska in Aeon:
“Listening to Mozart will not boost your child’s IQ.”
That sentence, as written, says something won’t work, but it unavoidably plants in your head the idea that there is a link between Mozart and childhood intellectual development. The “not” is drowned out by the surrounding information and by the plausible A to B connection between the two concepts (in this instance, Mozart + child IQ).
This leads us to a third lesson:
Lesson 3: The Backfire Effect is a risk — but maybe not as much as we thought
The Mozart sentence is an example of what’s known as the Backfire Effect: the idea that debunking false information can inadvertently entrench it in the minds of the deluded. In the early days of the Trump era, researchers scrambled to better understand disinformation, and some early studies of the Backfire Effect got a lot of attention. Be careful, they warned, because debunking could just make it worse.
But more recent research by Brendan Nyhan of Dartmouth casts some doubt on the initial consensus and suggests that correcting false information is worth doing — particularly when people have already been exposed to a lie.
In my story, I accidentally caused the Backfire Effect because few British listeners knew much about the Trump/Ukraine scandal, and some of them probably got the wrong impression from my showdown with André Walker.
So, the crucial question is whether someone has already been exposed to a false belief. If they have, it’s worth debunking it. No point in letting a widespread lie go unchallenged. But if they haven’t, then amplifying the lie by engaging with it on a high-profile platform is likely a mistake.
And that’s because…
Lesson 4: We take information at face value — and that’s now dangerous
We humans, in my experience, are generally well-meaning creatures. Sure, in my line of work, I’ve met some horrific human monsters. But most people are trying their best. We’re social animals who have mastered the world because of the greatest innovation evolution ever produced: cooperation.
That has led us to a rather nice trait: we instinctively trust information when we hear it from other people, unless we have reason to be suspicious.
That strategy works well in an environment characterized by good-faith interactions with people you know. After all, there are social costs to being a serial liar, so we learn to discount the fantastical “facts” spun out by the fabulists in our lives. By and large, it’s a reasonably reliable mechanism for sorting fact from fiction.
But that trait serves us poorly in the modern information space because of three key changes to how we acquire knowledge about our political world.
Information production has been democratized, which gives liars the ability to disseminate their lies to mass audiences instantaneously.
In the past, information technology revolutions (the printing press, mass newspapers, radio, television) increased the number of people who could consume information, but information production was still tightly controlled by a pretty small group of people. It was always few-to-many communication.
The internet changed that, and for the first time in the history of our species, mass information transmission became many-to-many communication. Suddenly, a crackpot who was good at lying could speak to millions with a click. The perils of that system have been amplified because…
The costs of lying have simultaneously decreased within many elite circles.
This is plainly obvious, especially in the United States. Donald Trump lies constantly but those lies have produced very few formal consequences. The same is true for most people in the right-wing disinformation machine. Partisan tribalism, particularly on the American right, shields liars from accountability.
This matters enormously because research shows that elite mouthpieces and their supporter networks are the major vector for disinformation. So, until we establish accountability for liars — the same way that we impose social costs on people who lie to us in our daily lives — well, we’re going to be stuck with a lot of badly deluded people in our societies.
To fix the problem, we need to grapple with the final major shift:
Those who debunk lies are usually the people whom the misinformed trust least
Trump tells his base that the 2020 election was stolen. But he also tells them that he’s the victim of a vicious “fake news” media and a Democratic Party that’s full of “Communists.”
Who are the people who try to correct Trump’s lies?
Surprise, surprise: it’s mostly journalists and Democratic politicians. But his base has been primed to mistrust what they’re told by those people, so the correction doesn’t cut through to those who need to hear it.
The lie survives.
What to do?
I learned these lessons the hard way. Hopefully you can avoid replicating my mistakes.
Thankfully, the fine people who research disinformation have compiled a nice little how-to guide called The Debunking Handbook, covering the dos and don’ts of dealing with disinformation. Give it a read to learn how to cope most effectively with the political liars in your life. Here’s a graphic from it, in which they lay out the best order for presenting information.
If nothing else, these lessons will help you avoid winning the battle but losing the war. And now, you’ll be prepared if you ever find yourself on a national radio station squaring off against a man who doesn’t have the facts, but understands at least two things: how to spread disinformation effectively and how to wield a sword.
Thank you for reading. If you enjoyed this edition, or learned something new, please consider upgrading to a paid subscription. You can help make this newsletter sustainable. Please consider sharing this article with your family and friends, too, as I rely on word of mouth to grow the newsletter.
[1] This is roughly what I said. I don’t have a transcript of the actual interview, but I’ve done my best to represent it honestly from memory.