Dr. Josephine Lukito discusses her latest research on how autocratic state governments have perfected the craft of using digital platforms to spread state-produced propaganda and disinformation, not only abroad but domestically, to delegitimize journalists, silence activists, and often lay the groundwork for state violence against their domestic populations.
0:00: Blaire Hobbs:
With us today, we have Dr. Josephine Lukito or Dr. Jo, assistant professor at the University of Texas at Austin School of Journalism and Media. Today, we're talking about Dr. Lukito's recent research with the Institute for Data, Democracy, and Politics, titled Understanding Social Movements as Targets of State Sponsored Digital Propaganda. Dr. Lukito, thanks so much for joining us.
0:20: Dr. Josephine Lukito:
Thanks, I'm glad to be here again.
0:22: Blaire Hobbs:
So with this research, you make a distinction between disinformation and propaganda. Can you explain what that distinction is and why you feel it should be made?
0:32: Dr. Josephine Lukito:
Sure. So, I think of disinformation as a specific type of propaganda. So, propaganda can include a wide variety of strategies that try to influence audiences in different ways. And sometimes that includes the use of persuasive messaging or opinion-sharing. Sometimes that includes the use of accurate information. And sometimes that includes the use of inaccurate information. And that last portion, when I study governments and states in particular, but really when any actor is spreading specifically inaccurate information, is what I think of as disinformation, a specific type of propaganda.
1:08: Blaire Hobbs:
Awesome. And with that, you also identified three types of propaganda, which I thought was along a very clever color scale. Can you elaborate on those three types?
1:20: Dr. Josephine Lukito:
Totally. And this is a really nice flow into this question, and I'll be the first to say, I didn't come up with this typology of propaganda. This is something that has been in use probably since about the 1940s or '50s, both by state governments and industry, as well as academics. And typically, propaganda is split into three-ish categories, running along a spectrum between two poles. And the first side is black propaganda. This is where most disinformation work falls. And so black propaganda is any situation where a state government or actor is hiding or obfuscating their identity. So, you don't actually know who is sending the information. And oftentimes that information can either be incorrect or can use a false identity. That's what's considered black propaganda.
On the other side, you have, what's considered white propaganda. And this is when the state actor, government or individual is honest about who they are presenting as. And so, if for example, Russia produces a broadcast news organization, something like Russia Today, and they are upfront in saying that it is run by the Russian government, that would be a form of white propaganda. And of course, black and white propaganda operate on a spectrum.
And so, a lot of what folks I think are really studying now is that middle ground, which is what's considered gray propaganda. There's some way to signal or understand that the person behind it is perhaps associated or affiliated with a state government, but it's not necessarily made explicit, or maybe the identity is made explicit, but the information provided is factually incorrect. Those kinds of middle cases are generally considered gray propaganda.
3:04: Blaire Hobbs:
Makes sense. Also definitely a great analogy for the types of propaganda using the colors. So, on the other side of your research, one term you discussed that really stood out to me was the idea of a monopoly on violence, which was coined by the sociologist Max Weber. This term largely speaks to the relationship between state power and violence. Could you discuss this relationship?
3:31: Dr. Josephine Lukito:
Absolutely. And I think this is really important when we are thinking about disinformation, misinformation, and propaganda in the digital age, because with the availability of the internet, technically anyone can produce disinformation or misinformation or propaganda. But I think one thing we always forget, even with the availabilities of digital technology, is the inherent power that state governments have in being able to exert violence, particularly among their domestic populations. Right. There's really not a lot of individuals that I know that own nuclear weapons, for example, or that have armies. And so, I still think that there is this really strong power imbalance or at least state governments have a lot more power that we still have to think about in relation to the digital technologies that are available to everyone.
And so, especially as I started my time here at the University of Texas at Austin, I've been thinking a lot about how this sort of power, this kind of tangible military power or this tangible ability to use violence, comes hand in hand with the propaganda that is often produced by state governments and actors. And so, this is something that I've been trying to study over the past, maybe, year and a half. I was very inspired by my earlier work on Russian trolls. And a lot of the work in the Russian trolls/disinformation literature really inspired me to start thinking beyond just how trolls are engaging online, toward how that coincides with other Russian foreign strategies, including diplomatic strategies as well as military and national security strategies.
5:03: Blaire Hobbs:
Do you notice that the countries that are good at foreign-targeted propaganda, like, for instance, Russia, are also ones that are more likely to do it domestically, or that have a successful domestic propaganda campaign as well?
5:21: Dr. Josephine Lukito:
Yeah, it's great you mentioned this, because this dovetails into the differences across the countries that I've been studying. And typically, countries that are not democracies, that have more autocratic or authoritarian media ecosystems, have a lot more ability or resources to run domestic campaigns, which means that their foreign campaigns are more developed. Russia is a good example of this. China has also done this for a long time with their 50 Cent Party. And so, a lot of times they will develop those strategies domestically and then deploy them internationally for foreign-targeted goals.
Some of my more recent work has looked at how Russia has moved not just from domestic to international, but more broadly outside of Russia to their sphere of influence. And so, I had done a study with another colleague, Larissa Doroshenko, who, if you want someone else to talk to, is an excellent researcher out of Northwestern. And she and I were looking at Russian disinformation in Ukraine during Euromaidan, around 2014. And actually, Euromaidan was probably one of the first times that they employed Twitter as a space to produce disinformation. And so, we have some good research to show that between 2012 and maybe about 2014 or 2015, they were already starting to make their way outside of Russia and looking more broadly into Europe and other countries.
And then of course, in the United States, their very first campaign was actually targeting the 2016 election. And that was called the Translator Project. Now, they had done a fair amount of work prior to the 2016 election. And as I mentioned, because they've done these other campaigns domestically and within their sphere of influence, they knew the kind of homework they had to do before producing this sort of disinformation. And so, their 2016 election campaign started as early as 2014 and 2015. And there are some really great insights from the Mueller report, as well as from early research, suggesting that Russian disinformation actors actually traveled to the United States, studied and potentially attended some of the protest movements among conservatives as well as liberals, and were actually active in the political ecosystem in the United States, within U.S. borders, before going back to Russia and starting to produce their disinformation campaign.
And so, this is something that I would say Russia, isn't the only one that does this, but they are particularly good at this because they have developed their strategies over so many campaigns, both domestically and internationally. So, they know when they're engaging in surveillance and when they're doing their homework and trying to understand the political nuances of, you know, a country, how to tap into the cultural cleavages. They're especially good at that.
8:16: Blaire Hobbs:
Yeah. Along those same lines. Did you notice any similar strategies or tactics between foreign campaigns and domestic campaigns, or would you say that both of those would take an entirely different approach?
8:33: Dr. Josephine Lukito:
I think they take slightly different approaches especially because domestic campaigns have slightly different motivations compared to foreign ones, but a lot of the strategies that are employed are very, very similar. So, hashjacking, attempts to delegitimize journalists, we see both in foreign and domestic contexts. And in some cases, we do find that they're linked with one another. I'm gonna use Russia as a really good example of this because there's a lot of really good research on domestic Russian disinformation and propaganda, as well as foreign Russian disinformation and propaganda.
And what folks have found is that Russia in particular developed a lot of their propaganda strategies domestically before they essentially exported them to other countries. And so, a lot of early cases of Russian disinformation were actually done to influence and manipulate the Russian media ecosystem and to diminish the ability of Russian citizens to engage in discourse online. Then they exported those strategies to try to influence propaganda campaigns in their sphere of influence. So, they would target Ukraine, for example. There was a ton of disinformation and propaganda during Euromaidan and anything related to Crimea, really. And we still see that happening now. Once they really developed their strategies within their sphere of influence and within their country, that's when they started targeting things such as Brexit or the U.S. elections. Once they had really honed those skills in their own native language, they were able to export them and translate their work into other languages as well.
10:13: Blaire Hobbs:
Yeah. So that's an excellent lead-in as well. So, I would say that your research largely examines the temporal relationship between domestic propaganda and state violence. Can you discuss this relationship more, and also what your goals were with your research?
10:30: Dr. Josephine Lukito:
Yeah. And I'm really glad you asked this question, because I think a lot of the research being conducted right now, not just in disinformation, but when we look at the digital media ecology, tends to be these kinds of snapshots in time. We'll take a portion of data and try to understand what was going on on Twitter or Facebook or TikTok at a specific week or a specific couple of months. But for me, I was very interested in how that changes over time and the dynamic nature in which disinformation and propaganda are produced. So, there's evidence to show that, for example, during elections, foreign actors are much more interested in producing disinformation because it's a really salient political moment, right? A lot of people are paying attention to politics. And in doing that work, I realized that time is a really, really critical factor for state actors in deciding when to produce a lot of disinformation and when to hide behind the scenes.
And so, as I started to look at this connection between violence and propaganda, I realized that time played a really, really essential role in trying to assess the relationship between the two. And our ongoing work has actually found that propaganda almost always precedes violence. So, the propaganda comes first, and then the violence comes second. And so, in thinking about the next steps for this work, one thing I'm really hoping to do is to use this information to work with activists to detect when state actors are about to engage in violence, and to use their propaganda as a sort of predictive tool to anticipate impending violence.
12:02: Blaire Hobbs:
I guess, just looking again at the countries, did you find any other trends across countries with that relationship or was it generally the same?
12:12: Dr. Josephine Lukito:
That was the most consistent result that we found. And in the case of Myanmar in particular, we found some interesting dynamics between social media, news media, and propaganda and violence. And so, one of the things that we noticed in the Myanmar case in particular was that violence actually also came after an increased amount of social media activity that was against the coup that happened in Myanmar. And so, for that particular case, we were looking at the Myanmar coup and the public reaction to that, and then of course the state reaction to that. And one of the things that we found was when there was increased social media activity, that was critical of the coup, there was increased violence about a week or two later.
And so, this suggests, unsurprisingly, that state governments are very aware of what is going on on social media. And both qualitatively and quantitatively, we have found a lot of evidence to show that state governments, when they know there are protests about to happen, will engage in strategies that try to diminish what citizens can say online. And so that might be in the form of just shutting off the internet completely and preventing certain citizens from being able to engage in international discourse on Twitter or Facebook. We've seen that not only in the case of Myanmar, but also in the case of Indonesia. There's an independence movement going on in Indonesia, and when protests were really high, Indonesia also shut off the internet for that particular island. And we also see examples in which state governments are trying to delegitimize or devalue journalists and citizens. And we see a lot of that sort of activity happening in Brazil, where the internet isn't necessarily getting shut down, but a lot of journalists are being threatened with violence, or a lot of citizens or activists are being delegitimized online or harassed through propaganda.
14:08: Blaire Hobbs:
So, you talked about, too, how these countries are mostly understudied, because a lot of the time other countries have cropped up more in the research. Because they're understudied, did you find that you had trouble finding data for your research, or did it take you longer?
14:27: Dr. Josephine Lukito:
Yeah, definitely. I think one of the most important things is that I, and the researchers I worked with, had to be a lot more conscious of the context of those countries in particular. You can't take a Western context like the United States and its political ecosystem and assume it applies exactly to Brazil or Myanmar and that sort of thing. In addition to that, I think the translational and language issues that you had mentioned, Blaire, are spot on; those were challenges and things that we had to also think about. And so, I've been really, really grateful to have several researchers who speak languages that are very popular in those countries. So, we have one researcher who is from Brazil, who speaks Portuguese and was able to do a lot of that translation work. And we also got similar assistance from researchers from countries such as China, Indonesia, as well as Myanmar. And those individuals, who were able to both conduct the research and translate the work, were also able to provide us with a lot of context about what was going on in those countries, what news organizations to focus on, and what social media platforms are particularly popular in those countries. And no study is perfect. There are certainly a lot of different layers of data and things I would love to have and see, but that's why we still continue to do this work, right? And so, we're still hoping to expand a lot of our analysis to include social media platforms we might not have been able to capture in our first iteration of the paper.
One source of data that I would love to point out a little bit more is Twitter. I think Twitter is a really interesting platform for researchers, both academic, industry, and nonprofit, because they offer a lot more data access relative to other social media platforms. So, it's much easier to collect Twitter data compared to collecting Facebook data or TikTok data. And that's great in some respects. And I really, really appreciate the efforts that Twitter has made to make this sort of data accessible. But I think at the same time, a lot of folks end up studying Twitter and assuming that Twitter is reflective of all of social media, which, to me, it's not. But with that kind of hedge, I do wanna point to Twitter's information operations archive. They've been doing a really good job of diligently posting these data sets anytime they find what they consider an information operation. So anytime they find a group of users who are pretending to be everyday citizens but are actually being paid by a state government, they will identify those actors, suspend them, and then put all the tweets that they've posted into a data set that they then share publicly. And so, if folks are interested in studying information operations, particularly those done by state governments, the Twitter information operations data set is really quite expansive.
17:17: Blaire Hobbs:
Yeah. Along with this strategy, did you notice any trends across platforms, or did you notice any other strategies on the social media platforms you were studying across countries?
17:33: Dr. Josephine Lukito:
Yeah. One of the most common things that I had seen was attempts to delegitimize journalists, which we've talked a little bit about. Particularly in the case of Brazil, a lot of the disinformation and propaganda content coming from state governments will attempt to delegitimize activists as well as journalists. But the other type of content that I noticed, which was really interesting and which I would love to explore further, is the delegitimization or dismissal of international organizations and individuals outside of that respective country. And so, in the case of Myanmar, Indonesia, and China, disinformation from those countries also included disinformation about how the United States or France or Western countries were interfering with domestic affairs. And this is something I noticed had been very common in domestic campaigns in particular. So, when a country is trying to influence their own local communities and their citizens within their country, they're very cognizant of the fact that these citizens can now communicate with the outside world. And so, a lot of propaganda campaigns will attempt to preemptively stop citizens from engaging with the outside world, because the United States is trying to destroy your country or, you know, France is trying to destroy your country. And so that's something that we've noticed, especially across all of our domestic propaganda campaigns.
18:58: Blaire Hobbs:
Okay. Looks like we're out of time for today. We've been talking to Dr. Josephine Lukito from the University of Texas at Austin School of Journalism and Media. Dr. Lukito, thanks so much for spending time with us today.
19:09: Dr. Josephine Lukito:
Thank you so much.