Deb Lavoy discusses how foreign actors use standard marketing techniques to spread disinformation on social media.

By Stephanie McVicker
12/04/2021 • 02:55 AM EST

Debra Lavoy, CEO and founder of, discusses how standard marketing techniques are being used to fuel the spread of disinformation on social media. She also discusses how minorities in particular are relentlessly targeted by these campaigns, and what the average person can do to identify and stop amplifying these false narratives.

0:00: Stephanie McVicker:

With us today, we have Debra Lavoy, the CEO of the disinformation-debunking website. Ms. Lavoy has served as a marketing executive, where she developed a framework for building core narratives and measuring narrative strength, applying it to over a dozen companies. Ms. Lavoy, thank you so much for joining us.

0:19: Debra Louison Lavoy:

Thank you for having me and I am delighted to be here.

0:22: Stephanie McVicker:

So, to play off of your marketing background, in an interview you did with TEDx Atlanta, you discussed recognizing disinformation as a distorted and malevolent form of marketing. I have made a career in marketing myself, and I found that a really intriguing statement. It resonated immediately with me. Can you break that down a little bit and explain what you mean?

0:45: Debra Louison Lavoy:

So, in early 2016, a lot of people were starting to recognize the rise of disinformation and its likely sources, foreign actors. And as a marketer, looking at this, I realized that what was happening here is that they were identifying a target audience, getting to know them by being part of their online communities, looking for issues that mattered to them, running ads to test various messages to see which ones actually resonated, and then taking the messages they learned were effective and pushing them through all kinds of channels: having their bots and trolls and sock puppets repeat those things, using advertising and different channels, taking those messages into different places and endlessly repeating them. Repetition is truth, regardless of what side of truth you're on. Repetition is truth.

That is I mean, you know, that is basically what your standard marketing process is, right? Identify an audience, do some audience research, figure out what they care about, do some message testing and then amplify those messages. It is exactly the same process. It is exactly the same methodology and it uses exactly the same tools.

And because I had spent a career in marketing and I think of myself as someone who likes to do good in the world, I took it really personally. It felt like a personal affront to me that people were using narrative and using the tools and techniques of marketing to mess with people, to do harm. It made me furious. And so, after stewing about it for several years and looking around to see what people were doing about this, I saw a tremendous amount of really high-quality research into disinformation. But what I wanted was to get in the fight, like fight back. These people are pushing these messages everywhere, and there's no counterweight to that. They were ceding the field to disinformation. And I simply felt that someone should push back. I kept expecting somebody else to do this, and I was going to go join their team, but nobody else did it. And so fortunately a couple of people convinced me that I could in fact do this. And so we just sort of started trying stuff until we got traction.

3:36: Stephanie McVicker:

So you've also said that you've heard from a lot of people that they know people are lying to them, but they're not sure who, which makes them really reluctant to trust anybody. How do you begin to combat that?

3:49: Debra Louison Lavoy:

Well, there are a lot of people who are vulnerable to disinformation. We're all vulnerable to disinformation. We have all been fooled by something, guaranteed. But one population of people who are really vulnerable in this way are those people who aren't news nerds like us. They have very short attention spans. They spend a lot of time on social media, but, you know, they don't read the Washington Post or the Wall Street Journal, or really any mainstream newspaper. They don't engage in mainstream news, but it's not like they're extremists either; they're not following super extremist content. They're just people, right, who are just trying to live their lives.

But disinformation tends to be this super-simplistic stuff that's really easy to share, and reality tends to be more complicated than that. So reality doesn't compete well on social media with disinformation. There's a lot of research now to show that these folks get a lot more disinformation than credible information. They're not going to read your 1,500-word article about anything. So I may be going on too long here, but basically we're trying to change the ratio of good to bad on people's feeds by making credible information for people with short attention spans, and also giving them bite-sized lessons on when to be skeptical.

5:29: Stephanie McVicker:

So with these small bites of information, do you find that they work better on a certain group of people or demographic than another? Have you found any commonalities as far as that's concerned?

5:48: Debra Louison Lavoy:

So depending on the topic and, you know, the specific campaign we're working on at that time, our sort of default bread-and-butter audience is what we call passive information consumers. We target them by saying basically anyone who doesn't engage with any of these media outlets. We're not really targeting them in any other way. But if you do not engage with any of these media outlets, you're our people. We also work with different community leaders. There are some populations in the United States in particular that are heavily targeted by disinformation, very intentionally and very specifically: Black Americans and Spanish-speaking Americans are relentlessly targeted with disinformation, and it works. Even if you don't believe it, it puts that little seed of doubt in there. So we talk to community leaders and we say, you know, what is going on in your community? What are these little underlying anxieties and fears that are being exploited by this disinformation? How would you counter that? And then we try to translate it into our format of, you know, super-simple, and then push it in those communities.

The passive information consumer audience that's sort of our bread and butter, very general. If you do not read news, you are in our audience, right. It's extremely receptive. We do so well with that audience. And really, we didn't think we would because it's so broad and so general, and as a marketer, you know, the broader your audience, the less effective you are, but we did really, really well with them. They seem really receptive to what we're doing. So we like working with them, but we also know that we have to pay a lot of attention to these other communities that are just getting brutalized. And we're constantly looking to make relationships so that we can be of service to those communities.

8:05: Stephanie McVicker:

So in your spring 2021 TEDx talk, you said that the most common trick used by people propagating disinformation is to amuse, anger, shock, or horrify us, because reason responds slower than emotion. Can you explain a little bit about how that works?

8:25: Debra Louison Lavoy:

Yeah. So, my 10-minute TEDx talk in 30 seconds. Daniel Kahneman, who was actually a psychologist who won a Nobel Prize in economics, wrote a book where he talked about two brains. We all effectively have two brains, a fast brain and a slow brain. Fast brain is basically all the automatic stuff we do during the day. Fast brain drives, fast brain reads, fast brain is what you use to scan your social media feed. Slow brain is your real analytical thinking: slow brain is how you do a math problem, or how you try to figure something out, or, you know, when you get lost and try to map a route, that's slow brain. Slow brain is hard and it takes a lot of effort. Fast brain is easy. We scan social media feeds with fast brain, and fast brain likes emotion. And when something shocks you, not only does it engage fast brain, it actually depresses slow brain. When you feel emotion, you are more likely to react before you actually think about it, before you have any slow-brain thinking at all. So even people who would rather not share disinformation, who have no intention of sharing disinformation, they see something that shocks them and they share it anyway. The bad actors use this trick because it works.

So what we have to do, as citizens of the world who wish not to be ruled by disinformers, is to recognize that when we have an emotional response to something, when it makes us mad, or when we say, aha, I told you so!, that's a flag that means we might be being manipulated. That is a sign. When you have that aha, or that oh my God!, we need to teach ourselves to recognize that that is the point of manipulation. It doesn't mean that what you're looking at isn't true, but it means it might not be true. It means that people who are really good at manipulating you might be behind it, and you need to double-check. Take a second to check.

11:15: Stephanie McVicker:

So have you noticed any particular themes that are the most pervasive when it comes to disinformation peddling?

11:26: Debra Louison Lavoy:

Yeah. So the disinformation ecosystem in the US is mostly interested in making people afraid, making people not trust the sources that they had for generations learned to trust. Oh, don't trust the CDC. Look, it's not like the CDC has never made a mistake, but the CDC is a lot more likely to get it right than your brother-in-law. Nobody's going to be right all the time, right? That's just not a thing that happens. But the disinformation ecosystem in the United States is about making people afraid, making people doubt, making people distrust the institutions that they have come to trust, because that takes power away from them. It also allows them to sell hundreds of millions of dollars in fake health supplements, which is a thing. So these themes will evolve over time. If you're worried about critical race theory, critical race theory is not a thing. What the disinformation ecosystem in the US is trying to do is tell a constant stream of scary stories to make you feel overwhelmed, to make you unsure about what is real and what isn't, to keep you on edge, and to keep us from having meaningful dialogue as a citizenry.

13:03: Stephanie McVicker:

So you have also mentioned that it's not just uneducated people who are taken in by disinformation. What would cause an otherwise educated or intelligent person to be taken in by a false statement? Is it just that bombardment, those echo chambers? What is the commonality that drives somebody to believe it?

13:31: Debra Louison Lavoy:

So some of the things we're talking about are complicated. Some of this vaccine information, climate information, it's really sciency. And even though I have a degree in neurobiology and computer science, I am not qualified to understand some of this stuff. You can come up with something that sounds right. Climate change is a great example of this. People will do an analysis of, you know, when the climate has changed in the past, with numbers and dates and little jargony words like Mesozoic Era or whatever, and unless you have knowledge in that area, you don't necessarily know the context or the issues or the subtleties, or the outright lies that are embedded in these. And it's not because you're dumb, it's because this stuff is hard. That's why people get PhDs. Right?

And so one of the key ways to protect yourself from disinformation is to recognize whether you are the right judge. I am not the right person to judge the underlying mechanisms of vaccine efficacy. So what our team did, when we wanted to be really sure on this stuff, is we pulled together a lot of people. We pulled in somebody who worked on vaccine development. We had a professor who studied zoonotic viruses in bats, believe it or not; that's how viruses spread between species. She became very popular very quickly. She'd been doing that for decades. Anyway, we spoke to physicians and said, hey, here's this paper, and I have no idea what it says. I do not know what this jargon is. P values, I don't know. What does this say, and how do we condense it in such a way that we can say something useful about it? And this whole thing about "do your research"? Forget it. You can't do your research. You cannot be an expert in all of these things. You just can't. And that's why it's useful to have experts.

So if this is a well-known, stable topic that's been around, if you want to know how combustion engines work, sure, go ahead, do your research. But if you are concerned about a rapidly evolving area of science, or something that is just super complicated, and by the way, I've never seen anything published on climate science that wasn't fantastically nerdy and complicated, you need to recognize that you're going to have to trust somebody else's judgment.

16:50: Stephanie McVicker:

Okay. Well, it looks like we are out of time for today. We've been talking to Ms. Deb Lavoy, founder of, which is dedicated to supporting knowledge and ideas at the intersection of journalism and technology. Ms. Lavoy, thank you so much for spending time with us today. It was a really great conversation.

17:10: Debra Louison Lavoy:

Thank you for having me. Thank you for your work at Propwatch. I've learned a lot from your organization and thank you for such thoughtful questions.