Dr. Ken Broda Bahm discusses the danger false equivalence poses to rational argument and methods to counter it.

Producer: Serena Balani
08/02/2022 • 06:18 AM EST

Dr. Kenneth Broda Bahm, senior litigation consultant and co-editor of The Online Courtroom, discusses the challenge that false equivalence (implying that two things are essentially the same, when they only have anecdotal similarities) poses in distorting how, "in the interest of balance," the press covers controversial issues, as well as how it can be used in the courtroom to effectively cancel out expert testimony. He also addresses the symbiotic relationship that false equivalence can have in helping insulate conspiracy theories from expert opinion.

0:00: Serena Balani:  

With us today we have Dr. Ken Broda Bahm, a senior litigation consultant, author, editor, prolific legal blogger, and an expert in the areas of legal persuasion and rhetoric. And we're here to talk to him today about false equivalency. Dr. Broda Bahm, thank you so much for joining us today.

0:16: Dr. Ken Broda Bahm:  

Thank you for having me.

0:19: Serena Balani:  

So here at Propwatch, we define a false equivalency as implying that two things are essentially the same when they only have anecdotal similarities. You've said that false equivalencies benefit from the democratic notion that there are always two sides to a story. Can you elaborate on this point?

0:36: Dr. Ken Broda Bahm:  

Yes, and I think the media is a good example. I saw a cartoon the other day of a newscaster saying, okay, we just spoke with a very experienced and qualified and trained expert, and now, in the interest of balance, we'd like to talk to an idiot. The idea, you know, that everything is covered by balancing two sides. And I think the role of an expert, and the role of expertise in managing public knowledge, is uniquely vulnerable to this logical problem. And I think that's true inside and outside of a courtroom. It's because we, as consumers of the information, don't have that expertise. And so when we're hearing an expert presentation, that can stop our processing. It just kind of acts as an escape valve: the thought that, well, one expert is as good as another, and that experts are essentially fungible.

And the problem this creates in law is that, you know, each side gets an expert on any given point, and the jurors can be tricked into thinking that these experts are kind of equally valuable, equally trained. And then they think, well, one says yes, one says no, so it cancels itself out. So I guess I can't rely on expertise, so I'm going to rely on something else. And when they rely on something else, that often comes down to the stories that they want to be true, as in a lot of public perception. I think we can see that in the narratives over the recent election, where you have experts who look very deeply and carefully at the evidence and say, okay, there's no level of fraud that would change the outcome of the election. But then the other side will say, well, we have experts too, and then that evens itself out. And rather than looking at the quality of the analysis and the evidence that each offers, you just bypass the expert level based on that false equivalency.

So this idea of two sides, when not treated carefully, can just cancel out the space for rational argument about how we arrive at conclusions. I think that's a danger of false equivalence.

2:53: Serena Balani:  

So as an expert in litigation tactics, can you share with us a time when you've seen a false equivalency in use in a court of law?

3:01: Dr. Ken Broda Bahm:  

Yeah, so I think every case that comes down to expert opinion has an element of that risk of false equivalence. And I've even heard attorneys say to me, look, the experts are just gonna cancel each other out. We have one, they have one. That's going to add up to zero, and the jurors are going to move on. And we see that sometimes when we're running mock trials: they'll hear experts on both sides about how something very arcane, like the insurance industry, works. And then, you know, as soon as the jury starts deliberating, one will say, well, you know, my uncle Joe sells insurance, and they'll more readily base it on that than on the expert opinion that they heard. And I think part of that is unavoidable.

But I also think there are ways in which advocates can inadvertently promote that. And that is when an expert follows what I call a product orientation: here's the product, the product is my opinion. It is clear, it is high quality, it has been undamaged by the other side. So accept this product, my opinion. And the law kind of treats authoritative opinions in that way. Is the person qualified? What's the opinion? They get to enter it.

I think the better alternative is a process orientation, where the expert is framed not as an oracle who has arrived at this conclusion and is delivering the conclusion, but as a teacher, who can help the fact-finders and the decision-makers understand how a qualified and educated person would get to that kind of conclusion. How would they get there?

So it's an example of, like your math teacher said in school, show your work: allow the jurors to walk through that process with you. And I think that act of being a teacher and saying, let me kind of walk with you and show you how I got to that conclusion, has the potential of cutting past that false equivalence, because then your audience can participate in that reasoning process. And so we often say, when a case comes down to dueling experts, the better teacher wins. And that is how it should be. Of course, you're hoping that the side that's truthful also has the better teacher, but that's the part that you work on: being the better teacher.

5:28: Serena Balani:  

Could you touch upon the problematic relationship between false equivalencies and conspiracy theories?

5:36: Dr. Ken Broda Bahm:  

Yes, certainly. Conspiracy thinking, as you know, is a personality factor with a lot of elements to it. You have people who are characterized by a kind of anti-authoritarianism or high levels of distrust, a desire to be different, eccentricity, gullibility. There's even something in the academic literature called the Bullshit Receptivity Scale. That is the actual official name of the scale, and it was developed by Gordon Pennycook at the University of Regina in Canada. And it essentially measures susceptibility to treating nonsense as if it were valuable: perceiving meaning and patterns where they aren't there, or even, you know, taking a randomly generated sentence, or a sentence generated by a computer, and attributing profound qualities to it.

And so they can measure this tendency in people, and they've found that it varies and that it also predicts, to a degree, conspiratorial thinking. But this habit of seeing conspiracies can also be propped up by false equivalence, or at least false equivalence prevents expertise from being used as a check against conspiratorial thinking. So this habit of seeing experts as fungible and essentially the same can serve as kind of an escape pod, you know, from a rational, evidence-based argument. So when one person says, well, you know, the experts say that's not true, the response is, well, we have people who call themselves experts as well. If you say your experts are biased, they will respond that all experts are biased, and then just move on to some basis other than expertise or rationality for supporting the argument.

So I think the bottom line is, if you can find a way to cut through false equivalence, to encourage a focus on the reasons and support, you have the potential at least to return to rational argument. And that's not going to convert conspiracy thinkers on the spot. But it will make it harder to sustain, and most importantly to spread, conspiracy-style thinking if you can kind of reground it in rational argument and get over this idea that all experts and all expertise are essentially the same.

8:02: Serena Balani:  

Related to argumentation in general, you've said the challenge is to keep it out of the realm of tactics and solidly focus on the practical realm of argument substance. So could you elaborate on this?

8:14: Dr. Ken Broda Bahm:  

Yes, I think there's an unfortunate history of argument being used as a tool of power, and not as a form of philosophy. And by that I just mean it's a way of getting what you want; you argue to get a result. And I think the courtroom is a perfect example of that. The argument is instrumental. You have a civil case, and the goal is money. You have a criminal case, and the goal is kind of the regulation of personal freedom, whether it's taken away. And I don't think that means the courtroom is bad, because those goals are very important. But the argument being an instrumentality is important to account for, while knowing that there can still be a priority on good substance.

Like for example, just to talk about the legal realm, one dominant characteristic of jurors is that they do not want to be snowed, they don't want to be fooled. If it's a civil case, they know that it's about money. They know that the advocates, the lawyers have been paid by one side or the other. They know the experts have been paid by one side or the other. They know that the result is gonna affect somebody's fortunes or somebody's freedom. And I think this leads to a high level of what we call persuasive resistance. You know, persuasive resistance is when you pick up the phone and you realize the person on the other end is a salesman. Like immediately, right, you go into a different framework, and you don't want to be easily persuaded. I think jurors are like that. They're skeptical.

And I think the system expects them to be, and they should be. But that doesn't doom the possibility of persuasion. Just having a resistant audience doesn't mean persuasion is doomed; it does call for a different form of persuasion. I think it calls for a form of persuasion where decision-makers participate in their own persuasion. The ancients called that the enthymeme: the idea of reasoning using one of the premises that your audience already believes, understands, accepts; kind of wrapping your reasoning around, at least partly, what the audience already believes.

So I think when the lawyer says in court things like, we know the contract was broken, the only question is what caused it, you know, that's a small rhetorical step that draws the audience in. We all heard the witness testify to that; you're kind of invoking this common experience, invoking this common belief. Even if it's an abstract belief, you know, even if you're talking across the aisles politically and you're saying, we all want a better country. That's vague, but there's a magic in that vagueness because it's common ground, right?

And so speaking to that common ground, even if one has to abstract a bit from the concrete in order to find that identification, is important. I train lawyers and advocates to talk like that because it helps in the courtroom. And I think that's because it helps in society, in real life. If you argue in a way that accounts for the fact that the person you're talking to is a person, with the ability to process, and they have beliefs and they have experiences, and if you can acknowledge those and validate those in some ways, then you're gonna have a much better chance of persuading them. Kind of be, you know, a teacher and not a preacher, and also be open to persuasion yourself, right? Maybe not in the courtroom, but in real life: the idea that you could be wrong and they could be right.

So, you know, ultimately, I think it comes down to credibility. When you're engaging with an audience or a target for persuasion, having some respect for, or even a high opinion of, that audience helps. Keeping your word, being kind of reliable, being honest. Wrapping your argument around something that they already believe, even if you think most of what they believe is wrong, because there's this part where they're right. And kind of having a foothold on that part where they're right is effective.

Ultimately, I think persuasion, including the parts that encourage rational expertise and, you know, discourage conspiracy thinking that's unmoored from the facts, has to be, at some level, a dialogue, not a monologue. So, you know, it starts with analyzing your audience.

12:45: Serena Balani:  

So when you've written about false equivalencies in the past, you've mentioned some tips on how we can cut through this false equivalency. So can you share some of the tips with us?

12:55: Dr. Ken Broda Bahm:  

Sure. One is to correct what's called the misimpression of parity. And I think the comedian John Oliver had a good example of this on his show when he was talking about climate skeptics. He invited three people who were skeptical of human-caused climate change, gave them some time, and they talked. And then when they were done, he brought in 97 scientists who represent the mainstream, and those numbers are proportionally correct, based on his information at least. That really did combat the misimpression of parity: the image of his entire studio filled with people who believed one way, and then the three people who believed the other way. And I think that's not always an option in a courtroom. The judge usually says each side gets one, so you're not going to get the 97; you're gonna get one.

So in court and other rational settings, I think the focus needs to be on something I said earlier: focus on the process, not the product. Unpack the reasoning. Don't treat the opinion as an end result; treat the opinion as a way of getting to a conclusion, and focus on being the better teacher to your audience. Ultimately, I think there's no shortcut for just digging in on the process. And one perspective I teach to lawyers who are trying to get jurors to believe an expert is this: if you're trying to discredit the other side, it means finding not just a difference, or even an advantage on your side, but something that's wrong with the other side's opinion. And I call it the Large Internal Error, which has the advantage that its acronym is LIE. But it has to be large, in the sense that this is a substantial or significant problem, not a minor problem, not something that's picky. It goes to the core of what they're asserting.

Second, it's internal. That means it's based on their own methods and assumptions; not just, I'm going to do it differently, and I have a different conclusion. But it's, based on what you set out to do, based on your method, you made a mistake. And that's the third part. It's a mistake. It's an error. It's not a difference. It's not a comparative disadvantage. It's an actual mistake. And I think you won't always find those, but when you do, that becomes a very important lever for people to use in distrusting that kind of testimony. For example, to return to the election example, if someone was disputing the 2020 vote and saying, hey, the total number of votes in this county was greater than the population. But in offering that conclusion, and I think this has actually happened, you're using the wrong county, and so the population figures are wrong.

And so that's a large internal error. That's something concrete and understandable, and it undercuts the testimony from the inside. And so it becomes a useful way of getting people to distrust it.

16:07: Serena Balani:  

And how would you translate your tips for fighting a false equivalency in the courtroom to the real world?

16:13: Dr. Ken Broda Bahm:  

Well, I think, you know, when you're in a rational argument, the standards are the same. I like to say the law is one of the few areas still where at least the goal is to keep it rational. There are rules of evidence; there are rules about what you can do in a courtroom. You can't do any ad hominem (attacking the character or motive of the person making an argument, rather than attacking the argument itself); you can't just attack the other person, but you can talk about credibility, right. And so it has a system where it tries to focus on channeled rationality, and I think in the real world we can still try to do that. We can still focus on reasons and proof and explaining our methods and explaining our evidence.

I think one important tool is just logical fallacies, and kind of the question of what you do when you spot one. I think, in general, we need to be better at talking about deficiencies in reasoning. And I like that your organization is dedicated to that idea, but I think it's essential for people to become more critical consumers of information, and we have to be better at talking about what makes a claim strong and what makes a claim weak. And I think this focus on logical fallacies is useful, but it's also kind of an imperfect tool, because there's a tendency to say, oh, that's a fallacy, and to rely on the label and not the logic. The argument doesn't stop when you identify a fallacy, because fallacy doesn't mean false. It means weak. It means the logic is weak, not necessarily the conclusion; the conclusion may be true, but fallaciously supported. The logic has a generic weakness.

And so it matters to understand that generic weakness. So for example, say you find somebody who is reasoning in what we call academically a post hoc manner (proclaiming that because something occurred after X, it was caused by X, when no causal relationship at all may exist). You know, they're arguing: after this, therefore because of this. Rather than just saying, hey, that's post hoc, you know, don't rely on the Latin label, but instead spend time talking about why that's flawed. To say, well, you know, the rooster crowed and the sun came up. The relationship could be reversed: the rooster didn't do it, the sun did. Or, you know, if you've ever lived near a farm, you know the roosters crow all the time. So of course they're gonna crow when the sun is coming up as well, and the two don't have that much to do with each other. But to really be able to talk about that using examples that people can relate to, and to be able to unpack what makes the claim problematic, rather than just labeling it and moving on.

18:50: Serena Balani:  

Okay, it looks like we're out of time for today. We've been talking to Dr. Ken Broda Bahm, senior litigation consultant, author, and co-editor of The Online Courtroom (ABA Press). Dr. Broda Bahm, thank you so much for joining us today.

19:02: Dr. Ken Broda Bahm:  

Thank you.