Dr. Stephan Lewandowsky discusses the cognitive challenge of relinquishing false narratives.

Producer: Serena Balani
05/29/2022 • 07:39 AM EST


Dr. Stephan Lewandowsky, cognitive scientist and chair of cognitive psychology at the University of Bristol, discusses our default inclination to believe what we hear and the cognitive challenge of correcting faulty beliefs once they have become part of our worldview. He explains the inverse association between scientific reasoning and conspiratorial thinking, and how conspiracy believers typically respond to evidence that doesn't fit the conspiracy. He also discusses how the mainstream media can become a vector of misinformation (false information that is spread regardless of any intent to mislead, often by people who believe it is true) and disinformation (false information spread with the intent to mislead, by those who know it is untrue) just by reporting on it.

Transcript:

0:00: Serena Balani:

With us today, we have Dr. Stephan Lewandowsky, cognitive scientist and chair of cognitive psychology at the University of Bristol. And we're here to talk with him today about the different ways people can fall prey to dis- and misinformation and why it can be so hard to correct. Dr. Lewandowsky, thank you so much for joining us today.

0:19: Dr. Stephan Lewandowsky:

Well, thanks for having me.

0:21: Serena Balani:

In your research, you explore why people further root themselves in their beliefs, even when presented with conflicting facts. Why is this the case?

0:29: Dr. Stephan Lewandowsky:

Well, I mean, first of all, we have to begin by recognizing that people believe things that they hear. And there's a good reason for that, which is that most of the time, when people tell us something or when we read something, we can assume it's true. You know, most people around us are telling the truth. And so by default, people believe things that they hear. Now, the problem with that is when something later turns out to be false, either inadvertently or because somebody was misleading you, then you have to undo this default belief with which you started out. And it turns out that that is cognitively very difficult. The reason it is difficult is that as we go through the world, we're creating mental models of reality inside our heads, which we try to keep consistent and which we use to explain the world around us. And if all of a sudden something that we have relied on to build this mental model turns out to be false, it's extremely difficult for us to just yank that out, because then we have an incomplete edifice in our head; the representation of the world is no longer complete and coherent. And then what do we do? Well, it's very difficult.

And what most people do is acknowledge that what we just told them is false, but then keep relying on it to hold their mental model upright. And so even though they know it's false, they may still use the information in the future. So that very basic cognitive process is already something that makes it hard for people to correct the information inside their heads.

Now, of course, it gets worse if politics or ideology or wishful thinking, something we call motivated cognition, is involved on top of that. So all of a sudden, not only do we have the basic cognitive problem of corrections, but added on top of that we have people's desire to believe the things they were told, even if they then turn out to be wrong. And then it becomes doubly complicated to correct that.

3:08: Serena Balani:

Can you elaborate on how individuals acquire and process information?

3:13: Dr. Stephan Lewandowsky:

Well, I mean, there's a lot I can say, but the fundamental thing is that, you know, if you hear something, you believe it, and if it is repeated, you will believe it more. That is another thing that has been observed in the literature repeatedly, and it is very clear that this is what happens: repetition increases perceived truth, irrespective of whether the information is actually true or false.

3:48: Serena Balani:

In your research, you've also found there to be an association between people who endorse conspiracy theories and those who accept science. Could you elaborate on this?

4:00: Dr. Stephan Lewandowsky:

Yes. Yes, certainly. Well, just to clarify, first of all, the association is such that the more people believe in conspiracy theories, the less they accept well-established scientific findings. So more conspiracy means less science, in terms of people's belief structures, and empirically that has been observed in the data across a wide range of domains: for example, vaccination, climate change, the relationship between lung cancer and tobacco smoking, the link between HIV and AIDS, and the list goes on. So it's a pervasive relationship between conspiracy theorizing and denying scientific facts.

Now, that's not at all surprising if you analyze the cognition underlying conspiracy theories, because it is the complete opposite of scientific reasoning, the antithesis of it. What scientists do is test hypotheses. What conspiracy theorists do is confirm their hypothesis. So whenever conspiracy theorists are presented with contrary evidence, you know, with evidence that doesn't fit the conspiracy, then instead of updating their beliefs and revising their theories, what they do instead is say, ah, that is part of the conspiracy. Of course it is.

So, for example, there was a time, about ten years ago, when there was an email hack, and climate scientists were accused, on the basis of cherry-picked emails, of having done something terrible to create this hoax called "climate change." Now, those scientists were exonerated around the world by nine different inquiries, in the United States, in the United Kingdom, God knows where else. I mean, there was absolutely nothing to it. Now, what conventional reasoning would suggest is that you give up on this conspiracy theory that climate scientists created a hoax called climate change. Of course, that's not what happened among people who deny climate change. What they were instead saying, or hinting at, is that the people who exonerated the climate scientists were part of the conspiracy. The FBI surely is part of the conspiracy. Has to be. Oh, it's the deep state. So that is a classic case of conspiratorial thinking. And of course that's the complete opposite of how science works.

If scientists have contrary evidence to their hypotheses, then eventually, certainly after some reluctance, they will adjust their thinking. I mean, whenever something I think turns out to be false in science, which can happen, I may be disappointed, because I didn't think that was the case, but I'm not gonna go around and say, oh, the person who ran that experiment was paid by Bill Gates to implant microchips in my head. I'm gonna accept that result. Conspiracy theorists don't. And that is why it's not at all surprising that there's this negative association between conspiracy theorizing and science: the cognition underlying them operates in complete opposition.

7:47: Serena Balani:

So you've mentioned before that mainstream media plays a significant role in the spread of disinformation. Could you maybe elaborate on that?

7:57: Dr. Stephan Lewandowsky:

Yes. Well, the mainstream media, and let's start with the quality media, you know, media that actually deserve to be called media. They're in a dilemma whenever leading politicians are misleading the public or speaking an untruth, because when that happens, what do they do? Do they report on it? Do they just report it, or do they also correct it? Do they just mention that there are weird ideas out there on social media, so the public knows? These are difficult questions to answer, and it's very difficult terrain to navigate. So, you know, there are cases where things have gone from the complete fringe into the mainstream because the media picked up on the fact that there was a discussion on social media, right? So by merely reporting, oh, look over here on Twitter, there's this weird stuff going on, well, they took the weird stuff and made it mainstream.

One classic example is the so-called Pizzagate conspiracy theory, which you may remember from six years ago, which alleged that the Democrats were hiding a child prostitution ring in the basement of a pizza joint in Washington, DC. Now, that pizza joint didn't have a basement to begin with. And secondly, it was completely, utterly absurd because, you know, this was based on hacked emails where, whenever Democratic staff ordered pepperoni pizza, that was deemed to mean, ah, they want children. I mean, complete fruit-loopy stuff, really. But the mainstream media eventually reported on it because it was gaining prominence on social media. At the moment that happens, of course, something becomes mainstream without having any substance whatsoever attached to it. And so that is one easy way in which the mainstream media, despite trying to do their job, may become a vector of misinformation.

And the other way in which they can do it, again while trying to do their job, is if leading politicians are lying, and then the New York Times has no choice but to say, well, the president said this today. And it's kind of like, well, it's a lie, but they still have to report it. And then it becomes difficult for them to avoid being a vector for misinformation.

10:53: Serena Balani:

So looking back at pandemics like the Black Death, cholera outbreaks, even COVID, you've mentioned that pandemics help give rise to conspiracy theories. Could you elaborate on that?

11:06: Dr. Stephan Lewandowsky:

Yes. Yes. I mean, first of all, historically, we have seen that over and over again. You have a pandemic somewhere, and a week later at the latest you'll have a conspiracy theory about it, you know. Either a conspiracy theory that says the pandemic is a hoax, or a theory that explains where it's coming from and blames Jews or Muslims or a lab in China or Bill Gates or 5G; somebody will be blamed. Now, we know that's what has happened throughout history. And I think we also understand why it happens. The answer is that when people are afraid, fearful, when they feel they have lost control over their lives, that is when they become susceptible to conspiracy theories. Not all of us, but some of us.

And of course, I can't think of a better way to induce anxiety and fear and lack of control than a pandemic. I mean, the COVID pandemic turned our lives upside down. All of a sudden, things that we took for granted we could no longer do. We had to stay at home. We had to work from home. We couldn't socialize. Couldn't go to the movies. You know, all of that is not just constraining but also anxiety-provoking. And so in those circumstances, a conspiracy theory can offer chicken soup for your psychology, because, ironically, a conspiracy theory blames a terrible event on somebody, Bill Gates or Fauci or, you know, anybody, whoever you'd think of. If you can blame somebody for this horrific event, that makes life easier for certain people. It's easier for them to do that than it is to accept that it was basically a random event.

And that is why conspiracy theories become particularly attractive after traumatic events, when people are fearful and feeling that they've lost control. So any mass shooting in the US is followed by a conspiracy theory. I can guarantee you that it will be. This Buffalo shooting recently, I don't know, I haven't seen anything, but I bet you there are conspiracy theories already out there. Has to be; it always happens. And the same, of course, is true after accidents. When Princess Diana was killed in a car crash, two days later we had conspiracy theories all over the place, because it makes people feel better to have some explanation other than chance for these dreadful events.

14:15: Serena Balani:

Dr. Lewandowsky, it seems like the inclination to believe in conspiracy theories is all about control?

14:21: Dr. Stephan Lewandowsky:

Yes. And in fact, we can show this in the laboratory. If you put people in a situation where they lose control, they will become more susceptible to a conspiracy theory. And if you make them feel more in control, then they're less likely to come up with a conspiracy theory, or to endorse one if they're presented with it. So it's very much having a sense of control over your life that makes the difference. And Joe Uscinski, one of my colleagues who studies conspiracy theories, coined the, you know, notable phrase that "conspiracy theories are for losers." What he means by that is that when people lose something, an election, a loved one, their freedom, their safety, their security, that is when they become susceptible to conspiracy theories, because of that loss of control. So if you want to deal with conspiracy theories, then one of the important things to do, at least if you can do it, is to give people back a sense of control, because people who have a sense of control over their lives are less likely to believe conspiracy theories.

15:42: Serena Balani:

So you've mentioned that people aren't good at letting go of certain explanations without being given an alternative. Could you elaborate on that?

15:52: Dr. Stephan Lewandowsky:

Yes. Well, let me go back to what I said at the very beginning about mental models and how we build mental models to understand the world around us. If somebody yanks out a piece because it's false, tells us it's false, then the whole edifice starts creaking and it's kind of hard to keep it upright inside our heads. But if, instead of just yanking it out, I also give you a replacement explanation, then you can stick that in and keep the building upright, and everything is fine. So that's why the best way to correct misinformation is by providing an alternative explanation.

So, for example, if we give you a story, a random story about a hypothetical warehouse fire, and we initially tell you, whoa, that was probably negligence because somebody left oil paint in the wiring cabinet. Then if we just tell people, no, no, no, that's false, the wiring cabinet was actually empty, that makes no difference. They continue to think there was negligence and oil paint in this wiring cabinet, right? Because of the difficulty of yanking out part of the mental model. Now, if instead we say, oh, that wiring cabinet was empty, and by the way, we found a can of gas in the hallway with petrol-soaked rags, it might have been arson, then, bang, people give up on the oil paint. They never mention it again, because they now know it could have been arson. So that is what you have to do if you want people to really let go of the initial information: you provide them with an alternative.

17:46: Serena Balani:

Okay. It looks like we're out of time for today. We've been talking to Dr. Stephan Lewandowsky, cognitive scientist, researcher, and author. Dr. Lewandowsky, thank you so much for joining us today.

17:58: Dr. Stephan Lewandowsky:

Thanks for having me. It's been a pleasure.
