Dr. Peter Warren Singer, strategist and senior fellow at New America and professor of practice at Arizona State University, discusses how groups around the world are using social media as a conflict space to win over people's beliefs. He details the new rules that govern social media weaponization and the dominant role virality (not veracity) plays in building narratives on social media. He closes by discussing how building resilience against these forces will require a multifaceted approach from institutions and corporations down to each individual taking the personal responsibility not to become unwitting spreaders of disinformation.
0:00: Michael Childers:
With us today, we have Dr. Peter Warren Singer, strategist and senior fellow at New America and professor of practice at Arizona State University. He's the author of many award-winning books, including LikeWar: The Weaponization of Social Media. Dr. Singer, thank you so much for joining us today.
0:17: Dr. Singer:
Thanks for having me.
0:19: Michael Childers:
We should get started on the right foot and begin with: what is LikeWar?
0:25: Dr. Singer:
So, LikeWar came from a project that I and Emerson Brooking started back, oh gosh, around 2012. And essentially, we were trying to figure out how groups around the world, particularly in conflicts, were using social media not just as a communication space, but as a conflict space. And we ended up studying everything from the rise of ISIS in Iraq and Syria, to street gangs in Chicago, to Russian information warriors targeting Ukraine and then targeting elections everywhere from Poland to the UK to the United States, to how celebrities were using social media, how teenagers were using social media. And essentially what we found was this phenomenon where, if you had thought about cyberwar as the hacking of networks, people basically trying to get into your email, steal credit card information, or the like, what we were seeing on social media was its evil twin. So, if cyberwar was the hacking of networks, what we called LikeWar was the hacking of people on the networks.
By driving ideas viral through likes, shares, and often lies on social media. And just like in regular old cyberwar and cybersecurity, the goal was to affect not just what was happening in the network, but what happens in the real world. So, if you think about a hacker, they're stealing your credit card information because they want to go out and buy a Ferrari with it. Same thing when someone's driving something viral online: they're not doing it just for the pure virality of it, they're doing it to shape what people do in the real world. Whether it's what party they go to, if it's a teenager, to what political party they join, to whether they take a vaccine or not, or buy a movie ticket or not, or whatever. The goal of LikeWar, just like in cyberwar, is to drive something viral to effect real-world action. And so that's the concept of LikeWar, and really it comes back to this idea of understanding social media as this combined space of communication. It's a marketplace. It's for fun. It's for socializing. But we also see these battles going on for, again, people's beliefs and people's actions.
3:07: Michael Childers:
Would you say that the way people have used social media over the last four years especially has contributed to events like January 6th, for example, since we're on that? And would you say that's mostly due to foreign interference or to domestic propaganda?
3:27: Dr. Singer:
So, what's so important to understand about the relationship between social media and propaganda, which has a history of literally millennia, is that social media, much as it has done for every other form of communication (compare social media to writing a letter or the telegraph), has basically put these phenomena on steroids. It's multiplied their effectiveness. So, you look back at past Russian information operations. Back during the Cold War, there was one called Operation Infektion, and it was basically about spreading the false story that the US military was somehow behind the creation of AIDS. The KGB comes up with this plan to spread it, actually as a way to help sabotage not just America in general, but the Olympics. And it takes multiple years to do everything from creating false front media sites, to people posing as scientists, to spreading the story first within East German, then French, then African media. To then getting, after a couple of years, what we might think of by the terminology "useful idiots" in Western politics, including American politics on the extremes, to take those stories and elevate them for their own purposes.
So, the Russians conduct this operation and it takes them multiple years to reach hundreds of thousands of people in the West. By contrast, with social media, a Russian actor sitting in St. Petersburg can pop up a story on Instagram or Twitter, or maybe take their network and elevate something by some extremist in America, and they can reach potentially millions or tens of millions in a matter of minutes or hours. So, it's the idea of taking the age-old phenomenon of propaganda and making it far more effective by reaching greater numbers of people and also enlisting them in your own efforts; the useful idiot numbers grow.
5:56: Michael Childers:
You've talked a lot about the new rules of information warfare and how important it is that we as a country understand them. Can you elaborate on these particular rules?
6:08: Dr. Singer:
So, what was interesting is that when Emerson Brooking and I did this study of everything from how ISIS was using social media, to how street gangs were using social media, to how Taylor Swift was using social media, how Donald Trump was using it, how Russian information warriors were using it, we found these repeating patterns. Whatever the location in the world (Iraq, Chicago), whatever the space (terrorism, celebrity, politics), there was this pattern that kept repeating, these rules that emerged. And essentially, they broke down into four rules of the weaponization of social media.
One was a bit of a play on the old TV series The X-Files: the truth is out there. We have this technology, think about your smartphone, that both allows you to gather data about everything around you, camera and the like, and allows you to instantly share that with literally everyone, potentially, in the world. It means that so much is now being recorded and shared.
And it also creates a long history of it, not just what's happening right now, but what happened two years back, five years back. And so there's this massive wealth of data, and it's basically harder to keep things secret, so to speak. And you can see that phenomenon playing out in both good and bad ways. You think about the loss of privacy, but you also think about, for example, police brutality. Police brutality has long happened, but the difference now is we've got smartphones that are documenting it, and then it gets shared and goes viral. And that's led to, for example, the mass protest movements and the like. So, rule one is this idea that the truth is out there somewhere, because of the sheer collection and sharing of data like never before.
But then there's the second rule: the truth may be out there, but it can be buried underneath a sea of lies. And that is the essence of everything from Russian information warfare, to how corporations and teenagers deal with bad news now, to how it's hit our domestic politics. The truth buried underneath a sea of lies. And again, you can see that wave of mis- and deliberate disinformation affecting everything from our democracy to our public health. Think about the phenomenon of battling COVID and just the sheer volume of falsehood that surrounded it, on everything from masking to vaccines, you name it. And again, all kinds of actors play within this. It's everything from China trying to bury the truth beneath a sea of lies about where COVID actually came from, to anti-vaxxers burying the truth underneath a sea of lies about the effectiveness of vaccines.
So, then we have a third rule. It's created a world where virality trumps veracity. That is, when it comes to the power of information, its ability to shape belief and action, not just online but in the real world, virality trumps veracity. It's more important that something go viral, its speed, its reach, than that it be true or not. I'm not saying the truth can't go viral. It definitely can. But again, it's the speed, it's the reach that gives information its power.
And here again, we can see this in everything from pandemic information to, well, you and I are talking on the anniversary of January 6th, the phenomenon of the big lie. The big lie is a series of, by one count, 16 different conspiracy theories about how the 2020 election was stolen. And what's interesting is that many of the conspiracy theories are contradictory. It's everything from there were immigrants showing up, to no, China hacked the election, to no, it was an Italian space satellite. I mean, some of them are absurd, and yet they're being voiced by everything from media figures to Donald Trump himself. There have been 16 different ones put out there.
Now what's interesting is that every time those conspiracy theories have been investigated, audited, not just by researchers (who get dismissed as biased), but literally in courts of law, including adjudicated by judges that Donald Trump himself appointed, every single time they've been debunked. They've said, no, there's no evidence behind it. So, on one hand, we have more data than ever before on the veracity of the election. And yet what is interesting, and scary for our democracy, is that belief in the big lie has actually grown over the last year. So, the more it gets debunked, including by judges that Donald Trump himself appointed, the more it gets believed, the more it spreads, and the more potent a political force it becomes. So that's the idea of virality trumping veracity. And again, it doesn't just play out in politics. We have a similar discussion around almost any other phenomenon.
And then we get the final rule, which is that we are now in a world of new possibilities and new dilemmas that result from the power of social media, not just in terms of the actors battling in it, but those that control the network itself. A different way of thinking about it: Mark Zuckerberg, 15 years back, was writing software in his college dorm room to help people vote on who was hot or not. Now he and those like him are making decisions on everything from whether Russian information warfare should be allowed to attack an election, to whether anti-vaxxer conspiracy theories should be allowed to sabotage our public health. And there's a huge debate on whether that is proper or not. Who should have that power? How does that power play out? And so basically this last rule is that we have a world now that is shaped by these new powers, and we're going to be battling over it for years to come.
13:17: Michael Childers:
How can U.S. institutions build resilience against these foreign and domestic propaganda and disinformation campaigns? And what does a comprehensive strategy look like?
13:28: Dr. Singer:
That's a great question, and the way you frame it in terms of resilience, I think, is the most important aspect of it. There's a parallel to public health or traditional cybersecurity, in that you have to recognize this problem is not something that will have a silver bullet solution. One, there's no silver bullet: no one single thing you do. And two, there's no actual solution: as long as you have the Internet, as long as you have people, as long as you have politics, you're going to have this phenomenon play out. Instead, it's how do I, as you put it, build resilience, or, another way of thinking about it, manage risk? How do I drive down the number of attacks, drive down the risk, but also drive down the consequences of those attacks? Build resilience against it. Again, think of public health. No one would say, how am I going to solve disease? There's always going to be disease. Instead, how do I reduce the amount of disease and make the disease less lethal? And that gets to the notion of the broader public health approach.
So similarly, against mis- and disinformation, whether it's coming from foreign authoritarian states or domestic actors (and obviously they often come together), you need an overall strategy, and in the United States, we're still behind on that. And that overall strategy needs to be one, again, parallel to public health, that brings together government, private sector, and individuals. So, in public health, no one would say, oh, goodness, we've got the Centers for Disease Control, I guess we don't need drug companies, I guess we don't need individuals to wash their hands. You need all of that together. Same thing in cybersecurity: no one would say, oh, I've got a good password on my Gmail, the military doesn't need Cyber Command. You need all of that together.
On the government side, our strategy needs to be one that brings together all the different arms of government. This is not a topic that's just about Russian disinformation or just about election security. It needs to bring together everything from how we change our intelligence collection, to recognize we're in a world where the threat is not just Russian tanks but Russian information warfare, to, at the other end of the spectrum, the Department of Education. What can it do to bolster resilience through our schools, through digital literacy?
Private sector, what is the role of that? Well, a huge part of it is the platform companies themselves recognizing that they are no longer merely creators of tech products, just inventing new software; they're running media marketplaces and conflict spaces, and running those carries a very different set of responsibilities. We can argue back and forth about what they ought to do, but the thing that's been most disappointing to me, and I think harms them as well, is that they are consistently reacting to predictable problems. Every one of these phenomena, whether it was anti-vaxxer conspiracy theories, Russian attacks on elections, or teenagers using Facebook to broadcast their suicides, was eminently predictable. In fact, they were often directly warned about it, but only after the bad thing happened did they react, belatedly. And so a huge part of this is getting ahead of the problem.
But then there's the final aspect: our individual responsibility. And again, that is a combination of awareness, understanding these kinds of threats that are out there and how they target you as an individual, and an ethic of responsibility. So think about why you wear a mask or why you wash your hands. In public health, it's both to protect yourself and to protect everyone around you, those you interact with. We need the same phenomenon in social media. Understanding how threat actors are going after you, understanding how algorithms work, to protect yourself, but also an ethic of behavior that says, hey, I'm going to take responsibility for not being a purveyor of mis- and disinformation. I'm not going to be a useful idiot for the propaganda pushers. And unfortunately, that is a major problem in the United States right now: that combination of a lack of awareness and a willingness to be a useful idiot for mis- and disinformation.
18:45: Michael Childers:
Alright, it looks like we're out of time for today. We've been talking with Dr. Peter Warren Singer, strategist and senior fellow at New America and professor of practice at Arizona State University. Dr. Singer, thank you so much for spending the time with us today. I hope that you've enjoyed this as much as I have. Thank you.
19:03: Dr. Singer:
Thanks for having me.