Dr. Tom Stafford explores the relationship between the illusory truth and mere exposure effects and how each influences what we perceive as true.

Producer: Lauren Shields
10/02/2020 • 09:00 PM EST



Dr. Tom Stafford, Cognitive Scientist at the University of Sheffield, UK, discusses the illusory truth and mere exposure effects, the common strategies we use to try and judge truthfulness, and how repetition (e.g., ad nauseam: repeating something over and over until it forms a mental association and/or becomes perceived as truth) and familiarity can influence what we perceive as truth. He also touches on the roles humility and introspection can play in combating these phenomena.

Transcript:
0:00: Lauren Shields:

With us today, we have Dr. Tom Stafford, who's going to talk with us about the illusory truth or illusion of truth effect, its influence on belief, and its limitations. Dr. Stafford is a cognitive scientist at the University of Sheffield, UK with a focus on learning, bias, and decision-making, and what he calls evidence-informed persuasion. Dr. Tom Stafford, thanks so much for being with us today.

0:22: Dr. Tom Stafford:

Hi, great to be here.

0:25: Lauren Shields:

Now in your BBC article, How Liars Create the Illusion of Truth, you say that "the illusion of truth can be a dangerous weapon in the hands of a propagandist." Can you please explain how the effect works and how it can be dangerous?

0:40: Dr. Tom Stafford:

Yeah, absolutely. And there's a lot of history around this. The illusory truth effect is a phenomenon whereby you repeat something and it then appears more true to people. And we all probably have some intuitive idea about this, and this is kind of a principle of propaganda: that if you want people to believe things, you repeat the lie often enough and it becomes the truth. But as experimental psychologists, we're interested in capturing those things in the lab so we can understand if they're true and, when they are true, why they're true.

And so the original experiment was done by a psychologist, I think it was Lynn Hasher, in the seventies. She took true statements and false statements and had people rate their truth, either once or multiple times, and found that the statements people rated multiple times were perceived to be more true by the end, even if they were false. So I've got some example statements here from that experiment. "French horn players get cash bonuses to stay in the US army." True, at least in 1977. "Zachary Taylor was the first president to die in office." False. So these are statements which were designed for people not to have an idea about, or not to be certain about. And the basic fact, which has been found many times, is that asking people to rate them multiple times, so the repetition is created by the multiple ratings, increased the perception of their truthiness.
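
[Editor's note: for readers who want a concrete feel for this design, the sketch below is a minimal Python simulation of a Hasher-style repetition experiment. It is not the original study's code or data; the group size, the 1-7 rating scale, and the size of the "fluency" boost are all invented assumptions, chosen only to show the structure of the effect.]

```python
import random

random.seed(1)

# Illustrative simulation of a Hasher-style design: half the statements are
# rated in every session ("repeated"), half only in the final one ("new").
# Assumption: each prior exposure adds a small "fluency" boost to perceived
# truth; the boost size and the 1-7 scale are invented for illustration.
N_PER_GROUP = 20
PRIOR_EXPOSURES = 2      # repeated statements were seen in two earlier sessions
FLUENCY_BOOST = 0.35     # assumed boost to perceived truth per prior exposure

def final_rating(prior_exposures):
    """Perceived truth on a 1-7 scale: noise around the midpoint plus fluency."""
    base = random.gauss(4.0, 1.0)   # rater has no real knowledge of the statement
    return min(7.0, max(1.0, base + FLUENCY_BOOST * prior_exposures))

repeated = [final_rating(PRIOR_EXPOSURES) for _ in range(N_PER_GROUP)]
new = [final_rating(0) for _ in range(N_PER_GROUP)]

print(f"mean rating, repeated statements: {sum(repeated) / len(repeated):.2f}")
print(f"mean rating, new statements:      {sum(new) / len(new):.2f}")
```

The structural point is that a statement's actual truth never enters the rating at all; only its exposure count does, and yet the repeated statements come out rated as more true.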

2:35: Michael Gordon:

How was the repetition provided to these people that rated it higher?

2:42: Dr. Tom Stafford:

So you can do variations of these experiments where you get people to, you know, watch something or read a text, so that rather than doing a rating task, they're exposed incidentally to the statements, and you get the same phenomenon. So there's a related phenomenon in psychology called "the mere exposure effect," which is that people like things that they've seen more. So you can do these kinds of experiments where you show people lots of images or you play certain sounds again and again, and the ones that you repeat, people rate as more preferable by the end of the experiment. They like them more.

3:27: Lauren Shields:

So you just mentioned the mere exposure effect. Can you talk to us at all about the relationship between familiarity and what we perceive as truth?

3:35: Dr. Tom Stafford:

So what we're doing, when we try and judge the truth of something, is we assemble all the evidence. And the lesson we've learned from thousands of psychology experiments is that we're good reasoners, but imperfect reasoners. No one has access to an encyclopedia in their mind. We can't constantly be checking the Internet. Even the evidence of our own senses is sometimes misleading. So we have to construct an answer based on our strategies, which psychologists sometimes call heuristics (mental shortcuts that ease the cognitive load of decision making, often relying on intuition or gut feeling; not guaranteed to be optimal or rational, but sufficient for reaching an immediate decision), which are rules for judgment that give us a good enough answer most of the time.

And it seems that, psychologically, one of the heuristics we use to inform our judgments is a sense of familiarity, or an ease of processing, which is called "fluency." And that informs our judgment of things. So if you're asked to judge the truth of something like "Is Helsinki the capital of Finland?", maybe you've been to Finland and you can call up an actual memory, maybe you paid particular attention in school and you remember your geography classes. But if you don't have access to something concrete, then you probably base it on, you know, does it sound right? How do you feel? And that's where this feeling of familiarity or fluency can creep into your judgment.

And one of the things that affects familiarity is repetition. And so that's where the repetition of an untrue thing can influence people's truth judgments. But a key to understanding that, really, is that this isn't an insane strategy for someone to use in a world where true things are repeated more often than false things. So dogs are pets, planes fly, you know, these things are repeated more often than their opposites. Familiarity will more often than not attach to things that are true. So using that strategy is not foolproof, but it's not madness either.
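
[Editor's note: Dr. Stafford's "not foolproof, but not madness" point can be made concrete with a second small sketch. The exposure distributions, the familiarity cutoff, and the 50% truth base rate below are invented for illustration, not empirical estimates; the point is only that whenever true statements are repeated more than false ones, a pure familiarity heuristic scores above chance.]

```python
import random

random.seed(2)

# Assumed world: true statements get repeated more often than false ones.
# Both exposure distributions below are invented for illustration.
def exposures(is_true):
    return random.gauss(6, 2) if is_true else random.gauss(2, 2)

FAMILIARITY_CUTOFF = 4   # assumed "feels familiar" threshold, in exposure counts

trials = [random.random() < 0.5 for _ in range(100_000)]   # 50% of statements true
correct = sum(
    (exposures(is_true) > FAMILIARITY_CUTOFF) == is_true
    for is_true in trials
)
print(f"familiarity-heuristic accuracy: {correct / len(trials):.1%}")
# Well above chance under these assumptions -- but any false statement
# repeated past the threshold gets misjudged as true, which is exactly
# the opening repetition-based propaganda exploits.
```

The same simulation also shows the vulnerability: shift the false statements' exposure distribution upward, as a propagandist does through repetition, and the heuristic's accuracy falls back toward chance.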

6:12: Lauren Shields:

That's a great distinction to draw, that just because something is repeated a lot doesn't necessarily mean it's untrue. It's not a crazy strategy. It's actually a survival mechanism, because we can't be constantly thinking about every single decision we make all day long.

6:26: Dr. Tom Stafford:

Yeah. So the insight from psychology is that we're limited reasoners and we have to kind of deploy our resources. Our memories are fallible. Our senses are fallible. We don't have time to check everything. So we rely on strategies. Does it seem familiar? Does it seem plausible? Was it told to me by someone I trust? Just believing what I hear, you know. If it's in the newspaper and that newspaper has a good reputation, maybe I'll believe it. We all know that newspapers make errors, but, you know, I've got to choose who to believe and who not to believe. These are all examples of strategies that are used to weigh up truth or falsehood.

7:10: Lauren Shields:

Okay. Thank you. At Propwatch, our mission is to increase media literacy and raise public awareness of propaganda and disinformation. And part of what you do in the latter part of your article is educate readers on how to combat the effects of the illusion of truth. What's the relationship between educating ourselves and being vulnerable to this particular propaganda technique, do you think?

7:33: Dr. Tom Stafford:

I think the key here is to recognize that we're all imperfect reasoners. So when someone asks me what I believe or what I think is true, there are many things that I believe to be true, and I probably thought them through, or had good reasons to come to those conclusions, once. But it may be that now I've forgotten, you know, why I believe them or who told me. And in a world where, you know, there's never enough time and information is uncertain, I can't spend all my time checking the facts.

But when issues are important, and I think when people disagree with you, that's a signal that the issue is important, you need to recognize that we're all vulnerable to errors of memory and reasoning, and we should check our references, you know. That's why we try and provide arguments, why we, you know, back up our claims with sources. And recognizing your own kind of humility, your own ability to believe things, asking yourself, "Why do I think that? Is it possible I made a mistake?", would probably be the big lesson here for me.

8:57: Lauren Shields:

Well, it looks like we're out of time for today. If you'd like to know more about Dr. Tom Stafford's work, you can find him on Twitter at @tomstafford, all lowercase, and at tomstafford.staff.shef.ac.uk. Tom, thank you so much for joining us today.

9:13: Dr. Tom Stafford:

Thank you. It's been fun.
