Dr. Christopher Paul, senior social scientist at the Pardee RAND Graduate School, discusses his seminal work on the "Firehose of Falsehood" propaganda model, going through its four key components and explaining how each exploits a different facet of human cognitive heuristics, the mental shortcuts we use to ease the cognitive load of decision making. He talks about the significant impact of the first-mover advantage and how ineffective refutation, or what we commonly call "setting the record straight," can be. He also discusses how we judge or misjudge the credibility of information and closes by offering some messaging strategies that help counter the high volume and multiple channels of the firehose.
0:00: Stephanie McVicker:
With us today, we have Dr. Christopher Paul, senior social scientist and professor at the Pardee RAND Graduate School. Dr. Paul studies counter-terrorism, information operations, military intelligence, and psychological warfare, just to name a few. Dr. Paul, thank you so much for joining us today.
0:16: Dr. Christopher Paul:
Oh, thanks Stephanie. Happy to be here. I'm looking forward to our conversation.
0:21: Stephanie McVicker:
Same here. Dr. Paul, you wrote with Mary Matthews such a seminal piece on a technique you called "the firehose of falsehoods." Can you briefly map out for us the key elements that make up this technique?
0:33: Dr. Christopher Paul:
Yeah. So, this was in 2015 and we were looking at characteristics of Russian propaganda, and we called it "the firehose of falsehood." We found four characteristics. The first, and the main reason for the firehose metaphor, is that it's high volume and multichannel. The Russians are employing overwhelming quantities of material on a host of different media modes and channels and sources.
Second, it's rapid, continuous, and repetitive. If you're not worried about fact checking, you can have the first accounts of events or non-events. And if you have enough folks involved, you can run it 24 hours, keep it moving, and repeat the same themes over and over again.
Third, it makes no commitment to objective reality. Sometimes it's true-ish. Sometimes it's false. Sometimes it's false, but backed up with some manufactured evidence. Sometimes it's false, but consistent with things that intended audiences might think. Sometimes it's false, but presented by sources that look or seem credible.
And then finally the Russian firehose of falsehood makes no commitment to consistency. They're not worried about what used to be called information fratricide, having one spokesperson contradict another. They'll quickly change from one theme or set of lies to another theme. And the story changes over time to suit the various purposes.
1:55: Stephanie McVicker:
So let's talk a little bit about the importance of first impressions. Why are first impressions so important? And how does that relate to the firehose of falsehood?
2:04: Dr. Christopher Paul:
Yeah, it's hugely important. So this comes back to two of the characteristics of the firehose of falsehood. One, the rapid continuous and repetitive nature. And two, the lack of commitment to truth. So those two things together, again, if you're making something up, you can have the scoop on it. You can be first-to-market with it because if it didn't actually happen, you're guaranteed to have the first account. But even if you're just inventing aspects of something that really did happen, if you don't wait to fact check, if you don't wait for reporters on the scene to verify, you can be first.
Now, why does being first matter? We've all heard of the power of first impressions or the first-mover advantage, but it's pretty easy to underestimate how powerful that is. Why is it so powerful? I'll tell you why: it has to do with how we as humans store information. I've said elsewhere that we humans are homo narrativus, the story animal or the story people, because we keep information in a holistic worldview... in a giant story. "Factoid" is the word I'll use for something that is presented as fact but may or may not be. So when I receive a factoid, if I accept it, I don't just put it in a mental filing cabinet. I bake it into my worldview.
And so, if someone comes along six seconds, six minutes, six hours, six days later and tells me that factoid ain't so, they're not asking me to go to my mental card catalog, find a single card, and tear it up. They are attacking my worldview. Now, that's not impossible to do, but it requires much more vigorous action. And it requires that you give me something to fix it. You can't just tell me, Oh, that's false. You have to tell me what the truth is, and you have to tell me what the truth is in a more compelling story that helps me fix my story worldview. So that's why the first-mover advantage is so huge: first impressions are epically consequential.
4:07: Stephanie McVicker:
Then how do people counter that? How do government officials or, how do message-makers counter that first impression block that exists?
4:20: Dr. Christopher Paul:
It's really tricky. And one of the challenges is inherent in that notion of countering. If you think about counter-messaging, you're in this reactive mindset, which is trouble. When I get a chance to talk to public relations professionals or public affairs spokespersons, I tell them: I understand, if you work for a government and you're the public relations spokesperson, when somebody says something that isn't true about your government or your military or your organization, it's incumbent upon you to get out there and set the record straight. But the psychology says that the retraction or the refutation has almost no impact at all. So I urge these professionals, when they have to do this, to keep the refutation part of it as short as possible. Say what you've got to say, but then use your time, your airtime, to get in front of the next one, seize the initiative, turn it around.
So I've made my retraction. Now say: okay, that particular Russian source is likely to continue to spread falsehoods and misinformation about these topics, from these sources. And here are the kinds of logical fallacies they're likely to employ. Now, all of a sudden, you're the first-mover. And maybe, hopefully, the next time someone in the audience hears that kind of thing, instead of nodding their head and going, Oh, that might be true, maybe I should put it in my worldview, they go, wait a minute, that seems like a logical fallacy to me. Maybe that's not right.
5:49: Stephanie McVicker:
You mentioned that variety of sources matter when it comes to disseminating propaganda. Can you elaborate a little on that please?
5:57: Dr. Christopher Paul:
Sure. So when we get into persuasiveness, when you talk about what makes things effective psychologically: the first aspect of the firehose is high volume and multichannel. Well, when you get to what's persuasive, the same argument from multiple different sources is more persuasive than an argument from a single source. Similarly, just volume: quantity has a quality all its own. The more information a recipient is getting, the more likely they are to be persuaded by it. And baked into that multiplicity of sources is a really important thing. In all the psychology studies, the single most credible and persuasive source for information is someone like you. And I'll make air quotes around "like," because that could be lots of different dimensions of likeness. You could be someone who you perceive as a fan of the same sports team as you, or someone who has the same ethnic or religious background as you, or someone who likes the same TV shows or music as you, whatever dimension of likeness it happens to be. If you perceive someone as like you, you perceive them as more credible.
So what does that suggest for volume? Well, what are the odds for any given member of the audience out there in the ether, that a single government spokesperson has characteristics that make them like that individual? Or that one of hundreds or thousands of managed Russian personas or the various anchors on RT or the various trolls in the comment sections, all of these different possible sources. What is the probability that one of those individuals is more like any one member of the audience than a single government spokesperson?
7:58: Stephanie McVicker:
You discuss the dark art of Internet trolling. What purpose does trolling serve for those employing it?
8:05: Dr. Christopher Paul:
Sure. So trolling can refer to a couple of different things. One is just being a jerk on the Internet, but the other is having or managing multiple personas or personalities on the Internet. So when we talk about the famed Russian troll farm, the Internet Research Agency, employees there would manage a host of different personas and personalities to pursue their objectives. And so they contribute to that first characteristic, the high volume, because each individual seems like multiple channels.
But coming back to just being a jerk on the Internet, that can actually serve Russian purposes too, because part of what Russia wants in this space is what some authors have characterized as a war on information. They don't just want to be influential. They want to attack and undermine the credibility of all sources. Because Russia has been propagandizing its domestic audience for so many decades, they're all very jaded and very skeptical, and they want to share that with the West. RT's slogan, "Question More," is an example of this kind of nihilistic attack on credibility generally. And so having Russian trolls be jerks on the Internet, polluting common discussion spaces with rude comments, whether those comments are thematic and connect to Russian propaganda or just make it unpleasant to be there in that space and drive legitimate participants out of the conversation, serves Russian interests.
9:38: Stephanie McVicker:
And what exactly do the Russians tend to focus on? Is it primarily political or is it health related or is it something else?
9:46: Dr. Christopher Paul:
In general, Russia has a number of different themes and objectives for their propaganda. And they're opportunists. This is a nation of chess players, so some of it is going to depend on what circumstances deliver, and some of it's going to depend on what's happening in other contexts. Again, their primary audience is their domestic audience. Then their near abroad gets pretty heavily bombarded. The rest of Europe gets less attention, but still pretty vigorous. And then here in the US, we get a pretty good dose too.
And in all those situations, they're trying to widen societal cracks and undermine credibility in general. So they might engage in political propaganda, but in one country they might be promoting a candidate on the left, and in another country they might be promoting a candidate on the right. If in any country there is an extremist or absurd or ridiculous candidate, that's where the Russian attention is going, because they would love to undermine democracy. And democracy looks silly when ridiculous candidates actually get elected.
10:57: Stephanie McVicker:
You've said that the very factors that make the firehose of falsehood effective also make it very difficult to counter. Can you explain what you mean by that?
11:06: Dr. Christopher Paul:
So the things we've already talked about make the firehose of falsehood hard to counter, starting with the first-mover advantage. That's challenging, because official spokespersons are limited in the number of channels and the volume they can generate. But I do have some suggestions. I'm not all bad news. I've got several suggestions for countering the firehose of falsehood. And since it's got this firehose imagery, if you'll indulge me, I will torture the water metaphor a little bit. I think the first one isn't too bad a torturing: don't expect to counter the firehose of falsehood with the squirt gun of truth. If the Russians have high volume first, low volume and late, after the fact, isn't going to cut it.
And following in that vein, put raincoats on those at whom the firehose will be directed. I think this fits well within your organizational mission. This is about preventative things: forewarning, naming and shaming, some kind of prophylaxis; exposure not just of the falsehoods, but of the propagandists and their mode and their intent. So helping protect and increase resilience, awareness, and attention amongst audiences with inoculation or forewarning.
Then third, don't try to swim upstream or don't point your flow of information directly back up at the firehose. If you want to counter propaganda, be thoughtful about the effects and the consequences of propaganda. Some false information is false, but inconsequential. And some false information is false, but incredibly consequential. Pick your spots, find the falsehoods that are concerning or that are against your interest or your collective interest. And rather than focusing on countering the propaganda, focus on countering the effect. So if, for example, you're the United States or some other NATO partner, and you see that Russian propaganda is undermining confidence in NATO, in a specific country, rather than going after the propaganda, which you may want to do, but rather than focusing on that, do something to bolster confidence in NATO in that same audience.
And then fourth, just increase the flow of positive information, increase the flow of true information. We need to compete. Right now the information environment is flooded with Russian and other propaganda. Let's amp up the level of true and virtuous and persuasive information.
And then finally, and I guess this is specific to government audiences, let's put kinks in the hose. Can we pursue terms of service violations or FCC violations, or other kinds of actions against the propagandists to reduce their access to these channels?
14:08: Michael Gordon:
And that would be similar to something like what Twitter is doing with tagging tweets that they feel are misinformation, some minor efforts like that that we've seen in the last year.
14:24: Dr. Christopher Paul:
Right. So that helps, perhaps, with putting kinks in the hose. But I think it also helps with something else that's important about humans: Daniel Kahneman's system one and system two thinking, thinking fast and thinking slow. One of the reasons that we as humans are so vulnerable to misinformation and disinformation is that we're cognitively lazy. If something I'm seeing on the Internet looks like news, in that it has a logo with the word "news" in it, and a footer running across the bottom of the screen streaming factoids and information, and it's on a sound stage with a relatively attractive anchorperson speaking clean, unaccented English, then provided they say something that isn't totally inconsistent with my worldview, my brain says, Oh, that's news, that's reasonably credible. I'm going to believe that.
That's a cognitive heuristic, a mental shortcut that eases the cognitive load of decision making. We do a lot of things like that, because we're bombarded with information constantly, and we can't be completely activated and mentally on our guard all the time. So things like what Facebook is doing might help prompt people to kick in their higher-order system two thinking. And when people are engaged, they are somewhat better at discerning truth from falsehood. I wouldn't go so far as to say good at it, but somewhat better. You know, they can recognize common logical fallacies. They can maybe get mobilized to go, Hmm, I'm not sure if that's true. Maybe I should fact check it. That sort of thing.
16:00: Stephanie McVicker:
Okay. Dr. Paul, it looks like we're out of time for today. We've been talking with Dr. Christopher Paul, senior social scientist and professor at the Pardee RAND Graduate School. Dr. Paul, thank you so much for joining us today.
16:12: Dr. Christopher Paul:
Oh, thanks very much for having me. This has been lovely.