Dr. John Cook discusses using inoculation to combat science denial and vaccine misinformation.

Producer: Serena Balani
05/25/2023 • 07:58 AM EST


Dr. John Cook, senior research fellow at the Melbourne Centre for Behaviour Change, discusses the origins of his popular Cranky Uncle game and how it uses logic-based inoculation to help people detect misinformation. He also explains how he conceived the FLICC acronym and discusses his other pioneering research to combat science denial and vaccine misinformation.

Transcript:
0:00: Serena Balani

With us today we have Dr. John Cook, a senior research fellow at the Melbourne Centre for Behaviour Change at the University of Melbourne, and the creator of the Cranky Uncle game, which focuses on understanding and countering misinformation. Dr. Cook, thank you so much for joining us today.

0:15: Dr. John Cook

Great to talk to you.

0:17: Serena Balani

So it's been a few years since we last talked to you, and at the time you were just releasing the Cranky Uncle game. Could you tell us about the inspiration behind the game and its successes since its release?

0:29: Dr. John Cook

Yeah, so the Cranky Uncle game really flowed out of my psychological research. I began my research career investigating the question: how do we counter misinformation? And my research ended up pointing me towards... I'm going to go into jargon now, so forgive me for being a scientist and throwing around scientific jargon. But it pointed me towards what we call logic-based inoculation, which is just a technical term for explaining the techniques used to mislead people. And there are a lot of techniques. There's a lot to teach people. And that's actually quite a big communication and education challenge.

So I've been grappling for a while with what are engaging, practical, and scalable ways to teach people the techniques of misinformation. And I arrived at games as a potentially effective strategy. So we developed a game, Cranky Uncle, which teaches the techniques of misinformation, and does it in a way that is engaging, using humor, patterns, and gameplay elements, like collecting points and leveling up, to motivate people to get further and further into the game and learn more techniques of misinformation.

1:50: Serena Balani

So who is the character Cranky Uncle in the game? Who was he supposed to be? And was this when you first began exploring your cartoonist skills?

2:00: Dr. John Cook

Yeah, so Cranky Uncle is a cartoon character in the game representing that family member we all have who thinks they know better than the world's scientific experts. It actually comes out of my original research into climate denial. There's a study exploring the demographics of climate science deniers, and it found that they're most likely to be older, male, white, and politically conservative: essentially, that cranky uncle character.

Now, before I did my PhD in cognitive psychology, I was actually a cartoonist for about 10 years. And I started exploring the approach of using cartoons as a way to develop engaging public communication, using humor and visuals, which are powerful communication techniques. And so the cranky uncle idea and the visual approach all came together in the Cranky Uncle character. And it began with a book: we published the book Cranky Uncle vs. Climate Change. Then, after the book was finished, that's when we started working on the game.

3:26: Serena Balani

So we've talked to a number of researchers about science denial, and they've mentioned the FLICC acronym. Could you walk us through how you coined FLICC and the history behind it?

3:39: Dr. John Cook

So FLICC stands for the five techniques of science denial: (1) fake experts, (2) logical fallacies, (3) impossible expectations, (4) cherry picking, and (5) conspiracy theories. They were first proposed by Mark Hoofnagle, who runs a blog about science denialism, and then they were picked up in a peer-reviewed paper by Pascal Diethelm. I think it was from that paper that I first learned about these five techniques, but it just listed them.
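For readers who like to see a framework written down, here is a minimal sketch of the FLICC taxonomy as a simple lookup table in Python. The one-line descriptions are paraphrased summaries added for illustration, not Dr. Cook's own definitions.

```python
# Illustrative only: the FLICC taxonomy of science-denial techniques
# as a simple lookup table. The short descriptions are paraphrases.
FLICC = [
    ("F", "Fake experts", "citing spokespeople who lack relevant expertise"),
    ("L", "Logical fallacies", "arguments whose conclusions don't follow from the premises"),
    ("I", "Impossible expectations", "demanding unrealistic standards of certainty from science"),
    ("C", "Cherry picking", "selecting only the evidence that supports a conclusion"),
    ("C", "Conspiracy theories", "explaining away inconvenient evidence as a secret plot"),
]

for letter, technique, description in FLICC:
    print(f"{letter}: {technique} -- {description}")
```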

It was actually when I was giving a talk at a youth summit about climate change. So it was a room full of young climate activists, and I was trying to explain climate misinformation to them. And I thought, I need to come up with a catchy way for these young people to remember the techniques of science denial. I actually vividly remember just walking down the road to the hotel, thinking, what's an acronym? We've got fake experts. We've got unrealistic expectations. We've got cherry picking. Oh, this is not going in a direction that would be suitable for young people. And then I just kept reworking and tweaking some of the terms; unrealistic expectations, for example, I changed to impossible expectations. I ended up coming up with the FLICC acronym, which I used in that talk and found really useful. Not just in helping other people remember the techniques, but also in helping me remember them. Basically, FLICC is how I'm able to rattle them off to you right now, because it's a nice, sticky acronym.

5:20: Serena Balani

Can you discuss the two different types of inoculation?

5:24: Dr. John Cook

Yeah, so there are two main approaches to inoculation. There are actually three, but the main two are fact-based and logic-based. Fact-based inoculation involves showing how misinformation is wrong through factual explanations. So, for example, you might want to inoculate people against a myth to do with the greenhouse effect. The way you would do that is explain how the greenhouse effect actually works: explain the facts of climate science.

Logic-based inoculation involves explaining the techniques used to mislead. So if you were debunking a myth about the greenhouse effect, you might explain how the myth uses the technique of oversimplification or misrepresentation. And there are lots of different techniques underneath that framework.

The third type of inoculation, just for the sake of being comprehensive, is source-based inoculation. That involves explaining how misinformation sources are not credible. They may have a bad track record of repeatedly sharing misinformation in the past, or they may have a conflict of interest. Maybe, you know, they have vested financial interests. It's about showing why this source of misinformation shouldn't be trusted.

6:56: Serena Balani

Can you discuss the experiment you did regarding the Global Warming Petition Project and your findings on it?

7:03: Dr. John Cook

So during my PhD, one of the first experiments I ran was looking at inoculation, although at that time I didn't even know that I was doing inoculation. It was only after I presented the results at a psychology conference in Sydney that a professor said, that sounds a lot like inoculation, and I was like, what's that now? It was one of those moments for a PhD student where they thought they'd come up with something new, and it turns out, nah, smart people have been doing this for 70 years. Doing a PhD is a very humbling experience.

What the experiment did, and this was a paper that ended up being published in 2017, was try to counter what turned out to be one of the most effective and damaging myths about climate change, called the Global Warming Petition Project.

So this is a website, I think it's petitionproject, and the website lists 31,000 American science graduates who have signed a statement saying that humans aren't disrupting climate. The point of the petition is to argue: look, 31,000 scientists, that proves there's no scientific consensus that humans are causing global warming. It's about casting doubt on the scientific consensus on climate change.

In our experiment, we first tested what happens when people are shown just this misinformation by itself. And we found that it works: it reduces people's acceptance of climate change. It has a negative effect, but not the same negative effect across the population. People who are more politically conservative are more influenced by the misinformation, while for people who are more politically liberal, the misinformation really didn't have any effect.

So this tells us that climate misinformation can be polarizing. It pulls the public apart, so people end up with more divergent beliefs after they've encountered the misinformation. For another group in our experiment, we tried inoculating them before showing them the misinformation. And by inoculation, I mean we just explained a technique used in the Global Warming Petition Project: a technique called fake experts, which means using people who convey the impression of expertise but don't have the actual relevant expertise.

In the Global Warming Petition Project, it was anyone with a science degree, but a degree in any field of science. So if you look at the actual disciplines of the people who signed the petition, you have computer scientists, veterinary scientists, medical scientists, engineers, but very few, less than 0.1% of the signatories, actually have relevant expertise in climate science.

The important feature of our inoculation message, though, was that we didn't mention the Global Warming Petition Project at all. Instead, we spoke about the technique of fake experts in general terms, and then we used tobacco misinformation from the mid-20th century as an example of that technique. We found that when people were shown that inoculation message and then shown the Global Warming Petition Project, the misinformation was canceled out, and across the political spectrum: the misinformation had no effect on people, whether they were liberal or conservative.

So what this tells us is two things. Firstly, nobody likes being misled, whether you're conservative or liberal. So if you explain the techniques used to deceive, those techniques become less persuasive to people across the political spectrum. Secondly, it tells us that you can inoculate people against misinformation without even mentioning it, by explaining, in general terms, the techniques used to mislead.

11:18: Serena Balani

When we last talked, it was in the early days of the pandemic. Were there new things that you learned, or things that were reinforced, during the pandemic?

11:27: Dr. John Cook

Early in the pandemic, I had this notion that COVID misinformation might not be as polarizing as climate misinformation, and the reason why was the concept of psychological distance. A thing that climate communicators like myself struggle with is the psychological distance of climate change. It seems like this distant issue that is going to happen decades in the future, to our grandchildren or to people in other parts of the world. And so, just psychologically, we struggle to care about the issue as much as we should, because it doesn't seem like an urgent threat.

The pandemic was completely different. There was no psychological distance; it felt like it was everywhere. You know, going to the shops, walking down the street, being in the office, just being around any other people, it felt ever present. And so I had this, in hindsight, naïve notion that this small psychological distance would mean people would take the issue more seriously and respect the scientific experts, because they needed to, to protect their own lives. It turns out that's not what happened.

I mean, it did for a lot of people; most of the public did take it seriously. But the issue ended up becoming polarized anyway, in a very similar way to climate change. And it happened quickly, over a matter of months. In climate misinformation, we've documented this decades-long process of misinformation polluting the information landscape and polarizing the public. I mentioned how my research showed that misinformation polarizes. If you just keep doing that for decades, the public ends up being polarized to the degree that it is now on climate change.

We saw that happen with COVID in fast-forward. Like, two decades of climate polarization were compressed into two months during the pandemic. And there were a lot of scientists working in the health sciences who were like, what's going on? And those of us working in climate change were like, yeah, we've seen all this happen before. Not as quickly, but it's an unfortunate side effect of science denial and tribalism.

14:09: Serena Balani

I've also seen you talk about the corrosive effect conspiracy theories can have on people who don't believe them. Could you elaborate on this?

14:18: Dr. John Cook

Yeah, often conspiracy theories are entertaining; they are so absurd or extreme that we just laugh at them, right? And we tend to dismiss them as harmless. The problem is, research has shown that when people are exposed to conspiracy theories, even if they don't believe them, the theories can still influence their attitudes and their support for policy. These are subconscious influences: we're not actually believing the conspiracy theory, but it still shapes our attitudes. So we do need to be careful even about conspiracy theories that we think are ridiculous, extreme, and harmless.

15:04: Serena Balani

Regarding climate change, can you talk to us about what you call the five climate disbeliefs?

15:12: Dr. John Cook

Yeah, as well as coming up with lots of different acronyms, I also like doing taxonomies. We were doing a research project where we were training a machine-learning model to detect climate misinformation. To do that, we needed to build a taxonomy of all the different misinformation claims. And we found that as we built this taxonomy, there were five main categories at the top. The five climate misinformation categories were: (1) it's not real, as in global warming is not real; (2) it's not us, as in humans aren't causing it; (3) it's not bad, as in the impacts of climate change won't be bad; (4) the solutions won't work, in other words, the proposed climate policies or renewables are either ineffective or harmful; and (5) you can't trust climate science or climate scientists, which is about attacking the actual science or the scientists.
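To make the machine-learning side concrete, here is a minimal sketch of how a taxonomy like this can serve as the label set for a claim classifier. The scikit-learn pipeline and the example claims below are illustrative assumptions, not the project's actual model or training data, which involved a far larger labeled corpus.

```python
# Minimal sketch of a taxonomy-driven misinformation classifier.
# Illustrative only: the pipeline and example claims are assumptions,
# not the project's actual model or data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical claims labeled with the five top-level categories.
# A real training set would contain thousands of labeled examples.
claims = [
    ("Global temperatures haven't risen since 1998.", "its_not_real"),
    ("The climate has always changed naturally, long before humans.", "its_not_us"),
    ("Extra CO2 is plant food, so warming will help crops.", "its_not_bad"),
    ("Wind and solar are too unreliable to power the grid.", "solutions_wont_work"),
    ("Climate scientists fake their data to keep the funding flowing.", "cant_trust_the_science"),
]

texts, labels = zip(*claims)

# TF-IDF features feeding a multinomial logistic regression.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

# Classify a new claim into one of the five categories.
print(model.predict(["Antarctic ice is growing, so global warming is a myth."]))
```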

Once we had this taxonomy with its five main categories, I realized that they lined up perfectly with research by Ed Maibach, who happened to be my boss at the time, when I was working at George Mason University. He had done psychological research into all the different attitudes and beliefs that people have about climate change, and he boiled them down to five key climate beliefs: it's real, it's us, it's bad, the solutions work (or, there's hope), and the experts agree on the science.

And I realized that what I had was the opposite-world version of those. Instead of it's real, it's us, it's bad, it was it's not real, it's not us, it's not bad. Instead of there's hope, it was the solutions won't work. And instead of the experts agree, it was you can't trust the experts. So I called these five categories "the five key climate disbeliefs," as a kind of homage to Ed Maibach's original framework.

17:26: Serena Balani

After understanding the arguments used by science deniers, how would you advise others to rebut these arguments?

17:34: Dr. John Cook

Yeah, that's a good question, because it's one thing to inoculate yourself against misinformation and recognize the techniques, but how do you actually respond to them? We talk about this a little bit in The Conspiracy Theory Handbook. It's important, if you're trying to persuade someone, that you don't go in with the goal of making them feel stupid. That's never the pathway to changing a person's mind. So you do need to try to discuss these kinds of issues with empathy, curiosity, and openness, genuinely trying to understand where they're coming from. But that approach can also be constructive. Trying to understand where they're coming from can help you explore the reasons why they believe what they believe. What are their arguments? Are there any unstated assumptions in their arguments? And that can be a way to deconstruct the person's argument, or the claims that they're making.

Also, I think that appealing to commonly held values is a constructive approach, particularly the commonly held value of critical thinking. If someone is conspiratorial, or distrustful or suspicious of science, they tend to think of themselves as a critical thinker, and you can appeal to that commonly held value, and even encourage them to turn that critical thinking onto their own claims. It can be a difficult sell, but it is at least a constructive approach.

The other thing that I've gravitated towards myself, and tested a number of times in different scientific studies, is the use of logic analogies. Philosophers call it parallel argumentation: take the bad logic in a misinformation claim and transplant it into an analogous situation. And for me, it was a light-bulb moment when I realized that parallel argumentation was used every night by late-night comedians. They would be debunking something, some misinformation of the day, and they would say, well, that's just like being in this situation. And then they would apply that same logic, usually to an absurd situation, and everyone would laugh. It's quite entertaining. It's funny, but it's also quite educational, because it's a concrete, engaging way to explain what are essentially quite abstract logical fallacies.

0:00: Serena Balani

Okay, it looks like we're out of time for today. We've been talking to Dr. John Cook, a senior research fellow at the Melbourne Centre for Behaviour Change at the University of Melbourne and the creator of the Cranky Uncle game. Dr. Cook, thank you so much for joining us today.

0:00: Dr. John Cook

Great to talk to you. Thanks a lot.
