Dr. Gordon Pennycook discusses the perils of lazy thinking and the collective toll fake news takes on our ability to decipher truth.

Producer: Samy Amanatullah
12/17/2019 • 04:19 AM EST


Dr. Gordon Pennycook, Assistant Professor of Behavioral Science at the University of Regina's Hill/Levene Schools of Business, discusses how lazy thinking makes us more vulnerable to believing fake news. He also discusses how conspiratorial thinking and the systematic deluge of fake news lead to greater public acceptance of wild and inaccurate claims.

Transcript:
0:00: Samy Amanatullah:

We have with us today Gordon Pennycook, assistant professor of behavioral science at the University of Regina's Hill/Levene Schools of Business. Professor Pennycook's research focuses on reasoning and decision-making, including the differences between intuitive and deliberative processes. While he has published research on many topics, it is his work on fake news and disinformation that we will be discussing today. Professor Pennycook, I'd like to thank you for agreeing to speak with us and taking time out of your schedule today.

0:33: Dr. Gordon Pennycook:

It's my pleasure.

0:35: Samy Amanatullah:

So, I don't know how familiar you are with the sort of "firehose of falsehood" theory and terminology, but it's essentially the idea that fake news or disinformation, at least in part, is meant to just flood public discourse and muddle it. So how might such a chaotic public space affect how people internalize information?

1:07: Dr. Gordon Pennycook:

It's hard to say, because it's a very difficult experiment to run, like an actual study to run as a psychologist, because that's what people are experiencing all the time now and you can't really control that. What we have done in experiments is give people sets of claims, a number of them, like hundreds of claims in a row. And what we can vary is how plausible the claims are. And so, what happens, basically, is if you give people a bunch of ridiculous and dumb claims, then they will rate the same kind of ambiguous claim as being true more frequently than if you give them less junk.

Okay. So, what that basically means is the more surprising and wrong things that you give somebody, the more likely they'll rate something that may or may not be true as true, because, relatively speaking, it seems pretty unstupid. You know what I mean? You can adjust how stupid things seem to people by giving them a lot of really dumb things and then something they would otherwise think is dumb, but that, relative to everything else they've seen, isn't that dumb. So, people do adjust how plausible something seems relative to what they see. Which means, just think about that next time you go on Twitter: you have to curate what you're looking at, because it's going to adjust the way you assess things relative to what you're already looking at.

2:34: Samy Amanatullah:

So, can you speak to the factors that make fake news so effective?

2:41: Dr. Gordon Pennycook:

There's been a lot of talk about the role of political partisanship in the sharing of fake news, where people are kind of deliberately, you know, deciding to spread falsehoods. And most of the research suggests that it doesn't play that big of a role, at least not relative to people just automatically being lazy. Just seeing things on the Internet, falsehoods that they could easily debunk with a five-second Google search, but sharing them anyway because they don't think about it that much. And so, the culprit is lazy thinking. This is what allows us to fall prey to all sorts of disinformation campaigns, whether it's from politicians or teenagers in Macedonia creating fake news or whatever. So, all these things align to explain why we're prone to believing fake news.

3:35: Samy Amanatullah:

Do you see or identify any other enabling factors that make audiences especially prone? In some of the readings I've done, there's talk of these sorts of corridors of doubt, where, for example, with the Pizzagate scandal, you know, the belief in a deep state makes it possible for some people to believe that there would be a secret pedophile ring?

4:05: Dr. Gordon Pennycook:

Yeah. That's a great point. When people make judgements about, say, a fake news headline that you would see on Facebook, in the same format, picture, headline, whatever, the way they judge it is not based on the source of that headline, which would be a good heuristic to use. You know, the source tells you a lot about whether it's true or not. They don't really do that. What they do is judge it based on how plausible it sounds to them, right? But if you kind of live in a world where you are maybe exposing yourself to, or just believe for whatever reason, a lot of silly things that aren't true, conspiracy theories, then the same headline is going to seem plausible to you.

And so, what that means is even for those people, even if they stopped and thought about it, you know, if they just pondered whether it's true, they're still probably going to fall for it, because the way they assess the truth is based on whatever background knowledge they happen to have. And if all their background knowledge is filled with junk, conspiracy theories and things like that, then they won't be very good at assessing it. So, for those people, it's a double hit: they're probably not prone to think about it that much in the first place, and if they do think about it, they don't have the tools to assess whether it's accurate, unless they happen to also have digital literacy, which would be like learning how to fact check things on the Internet. But given that they also believe in a bunch of conspiracy theories, they probably don't have those tools either. And so, there's some proportion of the population for whom it's going to be very difficult to improve their capacity to detect fake news, because they're already too far gone, kind of deep into the world of junk, and it's hard to debunk all of that at once, basically.

5:55: Samy Amanatullah:

Now that we're a couple of years removed from the election controversy and this sort of bubble where everybody seems to be talking about fake news, what value do you think the term "fake news" has today?

6:15: Dr. Gordon Pennycook:

I'm not sure that the term has ever had any value at all, apart from kind of organizing things around a concern over disinformation more broadly. I mean, the specific cases of people making up headlines and then getting them spread on Facebook, that's the prototypical fake news, the Pope endorsing Donald Trump or whatever, a bunch of things that were just made up. It's an egregious example of something we've probably seen for millennia, which is, you know, propaganda, disinformation, falsehoods. And so, I think, if the term ever had any value, it was just to get people to recognize how easy it is now, and probably, you know, in the past always was, to spread falsehoods, and how vigilant we have to be, to have, you know, a real deep regard for what is true and to know that we have to do work to figure out the truth.

Fortunately, while people always talk about the kind of scourge of social media in our present media environment, we're also at the kind of apex of being able to recognize the truth. That is, we have more information available to us than anyone ever has in the past. In your pocket you have a processor that has access to, you know, a massive amount of information and actual legitimate knowledge that we can use at any time to debunk things.

The only thing that's stopping us is our own brains, our lack of willingness to actually do the, in many cases, small amount of work to recognize what's a reasonable opinion and what is not. And so, the fake news thing is just a good jumping-off point for a much deeper and more important problem, which is that we need to do work to recognize what's true. And we have to understand how our minds work in order to do that effectively.
