Dr. Gordon Pennycook discusses the latest strategies to help people spot misinformation and discourage them from sharing it.

Producer: Charlotte Jones and Grace Lovins
05/20/2023 • 05:09 AM EST


Dr. Gordon Pennycook, Associate Professor and Researcher of Behavioral Science at the University of Regina's Hill and Levene Schools of Business, discusses his latest research on people's susceptibility to misinformation and the many different interventions being tested to help people spot misinformation and discourage them from sharing it. He also discusses the limitations of fact-checking and the potential backfire of social media warning labels.

Transcript:
0:00: Charlotte Jones  

With us today we have Dr. Gordon Pennycook, associate professor and researcher of behavioral science at the University of Regina's Hill and Levene Schools of Business. While he's published research on many topics, it's his work on our susceptibility to misinformation that we will be discussing today. Dr. Pennycook, thank you so much for joining us today.

0:18: Dr. Gordon Pennycook

My pleasure. Thanks for having me.

0:20: Charlotte Jones

Okay, to start off: speaking of misinformation, your research has shown that people are drawn to misinformation with emotional hooks. Can you explain how you tested for this in previous research?

0:32: Dr. Gordon Pennycook

We actually looked at the emotionality of the individual. We have a couple of different ways that we do that. One is, we know which people tend to be more emotional in the first place; that's something psychologists have been measuring for decades now. And people who tend to be more emotional, who rely on their intuitions and are more influenced by their intuitive feelings about things, are more susceptible to misinformation. So that's one piece.

Another thing we've done is try to get people into a more emotional mindset. We basically told them either to just rely on your emotions and feelings or, in contrast, to really try to think through what you're seeing. And if you get them to rely on their emotions rather than to think, then they are again more susceptible to misinformation.

What we're doing now, and we haven't published this yet, is looking at the emotionality in the content itself. A lot of studies have looked at, for example, news headlines that are false and true, content of various forms, that vary in how emotional people take them to be, in what kinds of emotions they bring out in people. And it is the case that the headlines that are more emotionally evocative are the ones that get the most attention, and they are more deceptive essentially. People use that as a kind of tactic.

What we're doing in this recent study is we take the exact same basic claim and we manipulate it to be high versus low emotion, using terms like "disgusting" or just adding additional terms to make it more emotionally evocative. And we were testing to see whether fact-checking is as effective when you have this emotional content.

And what we found, and this is all still preliminary, we're running more studies, is that fact-checking does seem to be a little bit less effective when you have more emotional content. The claims are the same; it's just that the additional emotionality around the claim was there. So I think it basically undermines people's ability to take in the information that you're giving them in the fact check, because emotion undermines our reasoning, and that's the key element.

2:59: Grace Lovins

What is the "implied truth effect" and what impact does it have on people's judgment of misinformation?

3:05: Dr. Gordon Pennycook

So the implied truth effect relates to inferences that people make about fact-checking labels or warnings. We ran this experiment back in 2016, when one of the first things Facebook was going to do to combat the misinformation problem was to put these little labels on the bottom of content that had been fact-checked and found to be false, essentially. The label said, "disputed by third-party fact-checkers."

And what we did in an experiment is we put those labels on half of the false headlines. And you know, when there's a label on it, people are less likely to believe it, and they're less likely to share it. That's good.

However, what we also found is that the headlines that are false, but that don't have the label, are believed more. Because in a context where we see a bunch of different labels, people take the lack of a label to mean that the content has been verified in some way, that it has been checked and is not false. But we did an experiment where we just labeled things as verified or unverified, and then you don't get the effect. So it's a matter of the inferences that people make, not just from the presence of the label, but from the absence of it. And that's the implied truth effect.

4:27: Grace Lovins

So what kind of challenges do those warning labels present for people trying to debunk misinformation?

4:32: Dr. Gordon Pennycook

Well, it really depends on prevalence. If you see one warning label here and there, you're probably not going to infer that everything you see that doesn't have a label is true. This is basically testing a hypothetical scenario in which Facebook, or whatever social media company, became super successful at actually detecting misinformation. If they got to the point of putting labels on maybe 50% of it, you might find that it actually would backfire. So really, it's something that, in practice and in theory, has to be done at the margins essentially.

And so we need to find other approaches in addition to that. Fact-checking and labeling is very important and good, but it can only really be done on a subset of the content, essentially.

5:24: Charlotte Jones

You've co-authored a number of articles on cognitive reflection. Could you explain what this is and how it may relate to political affiliation and ideology?

5:33: Dr. Gordon Pennycook

Yeah, so cognitive reflection is essentially thinking analytically. It's stopping and questioning whether your intuition is correct. We've argued in lots of different papers and studies, and shown lots of data indicating, that it's a really critical skill to have. I mean, everybody has it. Everybody can stop and think about things. The difference would be, if you're going to buy a house or something, you know that you have to think about that problem. There will still be silly intuitions that may influence you, things that don't matter that much. Did the yard look nice? You can easily fix that, but it might influence your choice. Still, you do know that you have to think, and people typically would.

And that's not always the case online. People don't recognize that they're not in that sort of mindset. And the interesting thing when you look at misinformation is that it's being constructed by people explicitly to get your attention, to appeal to your intuitions. Right? So there's something there that's exploiting our general tendency to just go with our gut, especially online. And that's the primary issue. It's true for other domains, but it just seems to be a particular problem in the context of online misinformation.

6:53: Grace Lovins

You also had a research paper published this April that looked at the factors people use to decide what news, whether true or false, to share on social media. Can you share some of those factors with us?

7:04: Dr. Gordon Pennycook

Yeah, for sure. It seems to me there are primarily two sorts of things. There are our biases, which draw our attention to things; sensational news will do that, for example. And then there's whether it sounds true, and what the source is; all that kind of information has an impact too, but it's not the same for everybody, and it matters to different extents depending on what the content is.

So political ideology, for example, is something that drives our biases. And it's not all ideology; it could be exposure too. For example, somebody in the States who is politically conservative is more likely to watch things like Fox News. And if you watch Fox News all day, or even if you grew up in a house where Fox News is on the TV in the background all the time and you don't care about politics, you're going to have a very different set of political beliefs, just through osmosis, just through exposure to information, than somebody who's in a house with MSNBC on all day.

Those background beliefs are what your biases are. So if you were in a Fox News house and you saw a fake news headline about Hillary Clinton during the 2016 election, it would seem intuitively more plausible to you, and it might grab your attention more, et cetera. That's one thing that drives engagement. But maybe you're also someone who looks at facts and cares about the truth and all that kind of stuff, and so knowing whether it's true is going to have an impact as well. But those don't always go in the same direction; often they're in different directions. That's the key underlying element of it.

And just the final thing, on Charlotte's previous question: how reflective you are doesn't really impact your biases. Our biases tend to be intuitive, and so being reflective can help you correct them. Some people think that being reflective actually exacerbates your biases, because you can convince yourself that things you want to be true are true. We find that that's not very common. It's not always the case that people get it right if they stop and think about it, but if they are going to spend the time reasoning seriously to get it right, it generally does help, even across party lines.

9:26: Charlotte Jones

You've proposed numerous methods for countering misinformation, especially on social media. Could you explain a few of these a bit further?

9:33: Dr. Gordon Pennycook

Sure, yeah. We discovered in these studies that people often share false content without really even thinking about whether it's true or false. They reflexively share because it grabs their attention, but in many cases, if they thought about it, they would be able to recognize that it's probably not true. So you really just need to trigger one additional step. And what we found is that you can simply remind people to think about accuracy in subtle ways; you can just ask them a question like, "Do you think it's important to only share accurate content?"

It doesn't matter how they answer. Or you just ask them a question like, "Do you think this is accurate?" If you do that, subsequently they will be more attentive to whether things are accurate or not, and they'll share less false content, essentially. It's not a huge effect, because it's subtle, just a simple shift of mindset. And it probably is ephemeral; you have to keep reminding people, to some extent. You didn't teach them anything. But it does seem to be effective. So that's one additional tactic that we can use to help solve the problem.

10:42: Grace Lovins

This question goes along with what you just talked about and those attention-based interventions. You mentioned accuracy nudging, so I was curious if you could elaborate on any other attention-based interventions that we could use to counter misinformation?

10:59: Dr. Gordon Pennycook

Well, some people have proposed that. There's work from Tali Sharot, who I think is at University College London. It's a similar sort of idea about getting people to think about accuracy, but there the focus was more on incentives. This is all hypothetical, but if there were some sort of cost to saying something that's false, that would not only put more attention on whether things are true or false, it would do so in an even stronger way by adding some incentives or some consequences, or even something like badges.

I don't think that social media companies would ever do that, but anything that would make it so that the thing incentivized on social media is not engagement, which is often orthogonal to whether things are true, but instead things that are related to truth, might change the way that people interact with the platform. I think that would actually work, and that would be great. Would you ever convince a social media company to do it? I have my doubts. But it does seem that you can change people's mindset, essentially, and get them to focus on different types of things, because people do care about accuracy.

You know, even the deepest, down-the-rabbit-hole conspiracy theorist is deeply concerned with accuracy. They're just off the rails in terms of what's true and what's false. People who care about the truth don't want to share false things. Those people are not trolls, and trolls are very uncommon. So it's just a matter of changing the way that people interact with the medium, but you need the medium to do that if you want it to happen. And we don't own the social media companies.

12:52: Grace Lovins

You've also researched the context of social media and how that influences susceptibility to false information. Why are people more apt to believe false claims specifically on social media?

13:03: Dr. Gordon Pennycook

So the first thing I have to say is that we don't know for sure that that's true. It's very hard to test; if I owned Facebook, maybe I could test it. Although I feel like if I owned Facebook I would have a different set of concerns; I'd probably be on a beach somewhere right now. But it does seem to be the case, and it's sort of what I said before: on social media, it's about the incentives, right? There's nothing really stopping people from sharing false things, apart from, if you were to share something on Facebook, somebody saying to you, "Oh, I think that's false." And people don't like that.

But most people don't do that. People don't really feel comfortable fact-checking each other; they usually just ignore it. And, by the way, I started thinking about this sort of thing because, you know, with COVID everyone kind of went crazy. I have a lot of people from various backgrounds in my life; I'm from a rural background, so some relatives of mine, for example, would share a lot of misinformation about COVID. And I decided I wasn't going to ignore it anymore; I was going to say something every single time. And it was interesting, I'll tell you that much.

But it did actually sort of work. Although Facebook, by the way, thought I was really interested in misinformation because I kept commenting on this stuff, so I kept seeing more and more of it. But eventually people got annoyed or whatever, and my social circle became more cautious about what they share. Or they found some way so that I wouldn't see it, I don't know. But it does seem to have an impact. Generally, though, people are not crazy like me and don't do that sort of thing.

And so there's nothing really stopping people. In fact, if anything, if you share something and a few people like it, you don't see the people who don't like it, and that feels like endorsement to you. And then people carry on. So that, in addition to the fact that people go on social media for entertainment, means that's where we are.

15:14: Grace Lovins

Now, in terms of the decision to share misinformation, can you share your findings from this research regarding how accuracy discernment changes if people are also making the sharing decision on top of that?

15:27: Dr. Gordon Pennycook

Right. So what I was talking about before was that if you get people to think about accuracy, they'll share less false content. But we have a new study that shows the opposite is also true: if you put people in the mindset of thinking in terms of what to share, they become a little bit less discerning, because they're distracted from the truth. They're thinking about, you know, are people going to like this? How does it make me look? Is it important? All those types of things. And in the context of these studies, those things are not related to the truth.

So it does seem to be the case that social media may have a kind of causal impact on our ability to distinguish between true and false. It's not making us dumb, per se; it's just that in that context, we're dumber. Then you get out of it, and you're fine. So it's fixable, but it does seem to be an issue. Again, it would be great to run that sort of thing actually online, but this is as good as we can do, essentially.

16:35: Charlotte Jones  

Finally, in your own words, how would you describe the overall effect misinformation and social media have had on society?

16:42: Dr. Gordon Pennycook

Oh boy, that's a good question. You know, I try to take a moderate position on this. There are some days when I think they've ruined everything, and I don't think it's unreasonable to think that; there's some element of truth to it. But at the same time, there's a lot of true content. Think about COVID, for example, the pandemic. We got a lot of information that was important and true on social media too, right? Like information about where to get the vaccines, or whatever it was. So it wasn't all bad. And by the way, it's not just about true or false. The idea of these things, especially Facebook and things like that, is to bring people together and to help you feel connected and all that kind of stuff.

And I think there's some benefit to that, but I also don't need to know the political opinions of people I went to high school with, right? I could stop following them, but I still do it. That's the kind of thing. My wife, for example, follows people on Instagram that she just does not like at all, but she gets some enjoyment from it; she's like, look how dumb this is, you know. People do that; that's how Howard Stern has his show. So people like it, and in a certain sense it's hard for us to keep blaming social media, because it's us too. We made it what it is, to some extent. So we can't just blame Zuckerberg; we have to blame ourselves too.

18:24: Charlotte Jones

Okay, so it looks like we're out of time for today. We've been talking to Dr. Gordon Pennycook, associate professor and researcher of behavioral science at the University of Regina's Hill and Levene Schools of Business. Dr. Pennycook, thank you so much for joining us today.

18:39: Dr. Gordon Pennycook

My pleasure. Thanks for having me.
