Dr. Kurt Braddock discusses the role counter-narratives can play in fighting extremist messaging.

Producer: Grace Lovins and Michael Gordon
02/21/2023 • 05:22 PM EST


Dr. Kurt Braddock, assistant professor of public communication in the School of Communication at American University and author of Weaponized Words, discusses the role narratives, factual or even fictional, play in drawing people into radicalization and extremist groups, and how counter-narratives can help people find their way out.

Transcript:
0:00: Grace Lovins:

With us today, we have Dr. Kurt Braddock, assistant professor of public communication in the School of Communication at American University and author of Weaponized Words: The Strategic Role of Persuasion in Violent Radicalization and Counter-Radicalization. Dr. Braddock, thank you so much for joining us today.

0:16: Dr. Kurt Braddock:

Grace. Thank you. Pleasure to be here.

0:19: Grace Lovins:

So in your book, Weaponized Words, you write that terrorist messaging seeks to arouse fear, anger, guilt, and pride. So why are these the four key emotions?

0:29: Dr. Kurt Braddock:

Well, there are other emotions that terrorist propaganda can elicit, but these four are very, very motivational, and they are typically the four we see most within terrorist propaganda. Fear especially. A lot of terrorist propaganda is based on the idea that there's some outgroup that is a threat to the terrorist group, or to the people the terrorist group claims to represent, a threat to their way of life. And by eliciting fear in people, that their way of life is in danger, they can motivate people to stand up for themselves, to support the group, to protect themselves, or things of that nature. The other emotions people experience when they engage with terrorist propaganda, things like anger, can be very motivational as well. When people feel, as I just said, that their way of life is under attack, that may make them feel angry too. And research shows that when people feel angry toward something, they lash out against it. It sounds obvious, but that is what the research shows, so these groups can use these emotions to motivate people to lash out at those the terrorist group portrays as enemies.

1:37: Grace Lovins:

You've previously discussed conspiratorial terrorism. So could you talk a little bit about this and elaborate on that?

1:44: Dr. Kurt Braddock:

Yeah, so I'm kind of changing my mind on conspiratorial terror, just based on what I'm seeing in terms of motivation for terrorism, in the United States especially. I used to think of conspiracy-motivated terrorism, or conspiracy-motivated violence more generally, as a special form of violence, kind of distinct from things like far-right violence, far-left violence, those sorts of things. But the more I see what's happening in the United States and Western Europe, and the more I look at the data related to the attacks that are occurring, it seems to me that the majority are motivated by some form of disinformation or conspiracy theory. The individuals who perpetrate these attacks seem to be very good at folding conspiracy theories and disinformation into their extremist beliefs, and then using those as motivation for engaging in violence.

So when I talk about conspiratorial terrorism or conspiratorial violence now, I don't talk about it as a unique kind of violence. Given the proliferation of internet technologies and social media, and the way that disinformation runs rampant on certain social media platforms, I think conspiracy and disinformation are probably a common ingredient in motivating violence. It's just that the violence is justified through the more traditional extremist ideologies.

3:14: Grace Lovins:

So do terrorist groups or extremist groups, do they spin their own conspiracies or do they use the ones that are kind of already out there to further their goals in their work?

3:24: Dr. Kurt Braddock:

A little bit of both. There are certain conspiracy theories that have been around for literally decades, in some cases hundreds of years; new technologies don't change the conspiracy theory itself, just the way it can be spread. To give you an example, the so-called Great Replacement Theory, the idea that white Americans are being replaced by immigrants to make a more favorable voting bloc for those the far right calls socialists and Marxists and all these other things. That theory has its basis in The Protocols of the Elders of Zion, an antisemitic book that made its way into the United States in the 1920s. So they take these old conspiracy theories and then fold new ideas into them. Some of the newer conspiracy theories, things like QAnon, kind of follow trajectories that are similar to, and I'm not calling them this, but they follow belief trajectories similar to cults, whereby they have their ideas about the nature of the world around them, and when the world doesn't match their expectations, they change the conspiracy theory so that it fits what they believe. So they can never be wrong. The conspiracy theory can never be wrong, because they fold whatever happens into their new narrative.

And I use the analogy of cults because we saw a very popular example of this in a book called When Prophecy Fails, a sociological study, not an experiment, but more of a field observation, done by Leon Festinger in the 1950s. It showed that when cults make prophecies and those prophecies fail, there's often a contingent who take that failure as proof of the prophecy, and they double down on it. And that's what we see now with a lot of these groups. I'm thinking of QAnon: they double down even on the failures they experience. So these groups are very adept at taking old ideas, old conspiracies, and making them new, using new scapegoats, using new technologies to identify those scapegoats and to coordinate with one another.

5:24: Grace Lovins:

What strategies are used to counter these types of conspiracies that result in violence?

5:30: Dr. Kurt Braddock:

Well, we've tried a lot of different things. Several years ago, the buzzword was counter-narratives: if we tell better stories than the ones the extremists tell, maybe we can pull people away from extremist ideologies. And there's some evidence to show that counter-narratives can work under some conditions and not under others. There are so-called alternative narratives, where rather than try to undermine the stories told by terrorists, we simply try to draw people away from the extremist ideology by giving them a better option. I study a technique called attitudinal inoculation, which is based on the idea that if you inform somebody that they're going to be targeted with a persuasive message, they're more likely to resist that message, especially in Western countries like the US and in Western Europe.

And there are other kinds of counter-speech that I'm growing familiar with as well. But more often than not, what we're seeing, at least in the communication and psychology space, are forms of counter-speech: counter-narratives are still popular, and attitudinal inoculation has become popular. But I would advocate, even beyond those, for more widespread media literacy, so that people who are engaging with this content online are prepared to parse what's true and what's false. If we get back to what I was saying about disinformation and conspiracy being one of the common ingredients that motivates violence regardless of extremist ideology, if people can parse what's true and what's false online, then we'll see a lower chance of them engaging in violence. So I think media literacy needs to become a major, major element of this fight, and I'd actually like to see it implemented in things like grade schools, so people become aware of what they're engaging with early on.

7:13: Michael Gordon:

So based on what you just said there, it seems like you have more belief in preemptive strategies as opposed to trying to straighten it out after people have already been sucked into something.

7:28: Dr. Kurt Braddock:

Yeah. Having done work in both spaces, it seems that it's easier to prevent somebody from going down the rabbit hole than to pull somebody back out of it. Interview data, and data on people who joined groups and then left, show that they don't leave because they're engaged by security forces, and they don't leave because they're engaged with counter-narratives from people outside the group. They often leave for reasons that are unique to them. Growing family responsibilities, a new job, things that simple can be the reason, but it can also be that they become disillusioned. It's much easier, if you're trying to prevent this, to engage with individuals first, before they end up going down that rabbit hole.

8:13: Grace Lovins:

So what is entertainment education programming, and how is that utilized?

8:18: Dr. Kurt Braddock:

It's very similar to narratives. When we talk about narratives, typically we're talking about a one-off story: a movie, a book, a TV show, something like that. Entertainment education is a similar idea: you embed educational messages in entertainment, typically over the course of a series of different narratives. A good example would be if you're watching ER or some other medical drama. Oftentimes the producers of those dramas will coordinate with certain health agencies to embed certain messages, like anti-smoking, anti-drinking, or anti-drunk-driving messages, in the storylines of the actual show.

So over the course of maybe four or five episodes, there might be a story arc where an individual dies in a drunk driving accident and you see the implications for the family. That would be an entertainment education initiative. And typically they're done in coordination with a federal or state agency, sometimes nonprofits. When we say entertainment education, we typically mean the same thing as narratives, but it's a more drawn-out process, more of a campaign than a one-off.

9:36: Grace Lovins:

And if someone has already been exposed to terrorist or extremist messaging, is entertainment education programming effective in kind of bringing them back, or is it more of just a preemptive strategy?

9:47: Dr. Kurt Braddock:

I would say entertainment education could serve either purpose. I'm a big believer in entertainment education, even if somebody has already kind of gone down that radicalization trajectory a little bit, simply because research shows that people engage with characters and stories. They are persuaded by them, even if they know that they're fake. They identify with characters, they feel they're in the same room with them. They have feelings for them in some cases. And when that happens, the experience that the characters have can be hugely influential. So if an individual is starting to go down that rabbit hole, let's say they're starting to engage with white power music, and make friends with people who are in the white power movement, if they watch certain movies or are exposed to certain counter-narrative propaganda, if they engage with characters in those stories, that can be hugely influential, much more so than trying to talk them out of it.

One thing we do find pretty consistently is that when you try to argue somebody out of their position, it tends not to work. So at the very least, we can say that these kinds of approaches work better than arguing with people, but maybe not as well as preempting the messages they're going to encounter.

10:59: Michael Gordon:

So you mentioned a few different communication theories in the conversation. Which one are you most excited about as far as seeing the potential of its effectiveness? Do you see one of these as a standout?

11:15: Dr. Kurt Braddock:

Well, if we're talking about data that already exists, it would be inoculation theory, because a lot of data is coming out. I don't think I'm speaking too out of turn to say that I wrote one of the first inoculation papers on extremism, in 2018. It only got into print last year, but it was out there, and there's been a lot of work built on inoculation theory that cites that paper. And all of the data seem to show that inoculation does work against disinformation, extremist communication, and things like that. So if we're talking about theories where there is a baseline set of data to say that it works, it would be inoculation. That said, the theory I'm working with now, reasoned action theory, is an old theory in psychology. I'm very excited about working with reasoned action theory, because it's a theory of decision-making, and it incorporates all kinds of different elements of one's environment in addition to the communication one is exposed to.

So rather than treat communication as the only thing that affects whether somebody engages in violence or not, it takes a holistic approach. It looks at the communication people engage with, their background, their preexisting beliefs, their preexisting attitudes, the norms they developed, how they were socialized, all these different things, to make an accurate prediction about whether somebody will engage in a behavior or not. Now, as a communication researcher, I'm interested in one small sliver: how does the communication people are exposed to affect the likelihood that they'll engage in violence? But because reasoned action theory is so comprehensive in the kinds of variables it looks at, I think it can be hugely useful for understanding why people engage in violence and how we can pull people back away from it.

13:02: Michael Gordon:

It sounds like one of the things it does is identify, out of, let's say, a group of a thousand people, which of those people would be more likely to be responsive to certain types of messaging. Is that what it does, based on all these different factors and criteria?

13:20: Dr. Kurt Braddock:

Yeah. Well, we wouldn't be able to say from the jump that one person is more likely than another. But we can collect data on people's beliefs after being exposed to a message, on the norms they perceive, and on what we call perceived behavioral control, whether they feel they have the power to engage in the activity. If we can measure these things, and these are measurable outcomes, they've been measured for decades, they have been shown to be very, very predictive of somebody's intentions, which are themselves very predictive of whether or not somebody engages in a behavior.

So I'm very excited about reasoned action theory: number one, because it has a base of proof behind it in other contexts, and number two, because we can gather so much data and see what's most motivational, whether it's beliefs about the behavior, norms, or perceived behavioral control. And we can use those outcomes to see why one might be more influential than another for different types of people. So we can use reasoned action theory to get a really, really good look at not just why people engage in violence, but why certain people engage in violence under certain conditions.
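[Editor's note: the structure Dr. Braddock describes, several measured antecedents feeding an intention outcome that predicts behavior, can be sketched in code. This is a toy illustration only; the variable names, weights, and 0–1 scales below are invented for the sketch and are not drawn from the interview or from any fitted model.]

```python
from dataclasses import dataclass

# Toy sketch of the reasoned-action structure described above:
# intention is predicted from attitudes toward the behavior,
# perceived norms, and perceived behavioral control; intention in
# turn predicts behavior. All weights and scales here are invented
# for illustration -- they are not estimates from any real study.

@dataclass
class ReasonedActionInputs:
    attitude: float            # evaluation of the behavior, scaled 0-1
    perceived_norms: float     # belief that peers endorse it, scaled 0-1
    behavioral_control: float  # belief one is able to act, scaled 0-1

def intention_score(x: ReasonedActionInputs,
                    w_att: float = 0.4,
                    w_norm: float = 0.35,
                    w_pbc: float = 0.25) -> float:
    """Weighted-sum stand-in for a fitted regression of intention."""
    return (w_att * x.attitude
            + w_norm * x.perceived_norms
            + w_pbc * x.behavioral_control)
```

In practice these constructs are measured with validated survey scales and the weights are estimated from data; the point of the sketch is only the structure: multiple measured antecedents feeding a single intention outcome.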

14:39: Michael Gordon:

What concerns you most right now, if there is something that you see that really concerns you?

14:47: Dr. Kurt Braddock:

Well, my current line of research, I've talked about inoculation a lot, and I still do work in that domain, but the reason I'm excited about reasoned action theory is that I'm using it to understand the way people respond to what I call extremist subtext, or implicit language. Because that's what concerns me most. Some people call them dog whistles; in some cases I call them implicit orders: individuals use their freedom of speech to say perfectly legal things, but implicitly advocate for the use of violence. And we see this on television channels. People talk about Great Replacement Theory like it's true. We have individuals go on Twitter; I just saw a tweet from a sitting congressman about how we can deport 6 million people. And he said that on Holocaust Remembrance Day.

So there are all these different things people can say where they don't overtly say, you should engage in violence, but people will interpret it as a call to violence. To give you a rundown of this phenomenon, and I'm still working through it myself; it's actually what I'm writing my book on now: it's called stochastic terrorism. It's based on the idea that even if an individual doesn't overtly tell somebody to engage in violence, if they have a large enough audience, the larger that audience, the greater the likelihood that at least one person will interpret what was said as a call to violence and act on it.

So if, for example, during the election, Donald Trump told the Proud Boys to "stand back and stand by," and there might be, let's say, 500,000 people in the United States who are sympathetic to the Proud Boys' cause, let's say it's half a million people. And let's say each one of those individuals has a 0.0000001% chance of engaging in violence because he said "stand back and stand by." Well, that's on a one-person basis. When you aggregate that across 500,000 people, we get closer and closer to a 100% chance that at least one person will engage in violence on the back of what was said, even though it didn't overtly advocate for violence.
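[Editor's note: the aggregation Dr. Braddock describes is the standard "at least one" calculation under an independence assumption: if each listener acts with probability p, the chance that at least one of N listeners acts is 1 − (1 − p)^N. The values of p and N below are illustrative assumptions, not figures endorsed by the interview.]

```python
# Probability that at least one person in an audience of n acts,
# assuming each listener acts independently with probability p:
#   P(at least one) = 1 - (1 - p)**n
# p and n below are illustrative assumptions for the sketch.

def prob_at_least_one(p: float, n: int) -> float:
    """Chance that at least one of n independent listeners acts."""
    return 1.0 - (1.0 - p) ** n

if __name__ == "__main__":
    n = 500_000  # hypothetical audience size from the example
    for p in (1e-8, 1e-6, 1e-5):
        print(f"p={p:g}: P(at least one acts) = {prob_at_least_one(p, n):.4f}")
```

The point of the sketch is the compounding: even a per-person probability that is vanishingly small on a one-person basis can yield a substantial aggregate probability once the audience is large enough.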

We see similar cases. I mean, there have been attacks that cited things people have said online. Great Replacement Theory was cited in the attack in Buffalo, and I think the attack on the Tree of Life synagogue cited it as well. In the Tree of Life attack, the attacker talked about this hidden cabal of Jewish people, which had been mentioned by elected politicians. And every time it happens, the speaker has plausible deniability, because they can say, well, I didn't tell them to engage in violence. I just said A, B, or C.

So I think that is what I'm most concerned with, not because I think that kind of speech should be tamped down on, but because it is so sneaky and so difficult to understand how to undermine. So that's what I'm working on now: understanding how people interpret this kind of subtext, what happens when they engage with it, and how we can undermine it when people encounter it. Because it is a persuasive tactic, and it has been shown to be useful. And I think we need to push back against it in any way that we can.

18:16: Grace Lovins:

Okay. It looks like we're out of time for today. We've been talking with Dr. Kurt Braddock, author, researcher, and assistant professor in the School of Communication at American University. Dr. Braddock, thank you so much for joining us today.

18:28: Dr. Kurt Braddock:

Grace. Thank you very much. It's been a pleasure.
