How social media posts from foreign actors fuel polarization and undermine democratic institutions.

By Blaire Hobbs
11/12/2021 • 01:43 AM EST


The Twilight Zone, the classic science fiction series that aired on CBS from 1959 to 1964, often showcased apocalyptic scenarios of 'ordinary people in extraordinary situations,' as its tagline put it. Though fictional, nearly every episode holds a kernel of truth that strikes at our deepest fears or most primal tendencies.

Take season one's "The Monsters are Due on Maple Street" - an episode that watches a peaceful suburban neighborhood descend into chaos. In the final scene, we see neighbors accusing neighbors, a frenzy of weapons grabbed, a collection of flashing lights, and a bevy of people running around in aimless panic. The camera then pans out to reveal two indifferent figures on a hillside in front of what looks like a control panel. In a brief dialogue, we learn that these observers are conducting a typical "procedure." As one says to the other, "just stop a few of their machines and radios and telephones, and lawnmowers…throw them into darkness for a few hours… and then, sit back and watch the pattern". 'The pattern' being that, when unexplainable activity occurs, people feel threatened and will almost certainly turn on themselves – "the most dangerous enemy they can find." 

While this may seem extraordinary, the parallel to reality becomes plausible when you scroll through an 'ordinary' American's social media feed and see an emphatic post with hundreds of angry comments, spawning a heated, multi-threaded argument that devolves into personal attacks on other users. What may be less obvious, though, is that all too often this chaos is engineered by professional trolls - our nonfictional indifferent figures on the hillside - whose messages are carefully crafted to drive a wedge into already polarized ideological lines and undermine trust in democratic institutions.[1]

These 'sock puppet' accounts are run by foreign actors who pose as Americans and spread disinformation through fake profiles. A sock puppet can be a bot, a troll, or a cyborg: a bot is a computer program that posts at a set rate; a troll is a real person (or persons) doing the same; and a cyborg is a bot that is occasionally taken over by a real person to humanize the account.[2]

It's been well documented that a single organization, Russia's Internet Research Agency (IRA), is behind much of this deception.[3] The U.S. House of Representatives Permanent Select Committee on Intelligence has been monitoring this activity since the 2016 election, when advertisements specifically undermining the Hillary Clinton campaign (see below) appeared frequently in the months leading up to the vote. Since then, federal prosecutors have indicted 13 Russian nationals tied to the effort (in 2018), and the Committee has identified hundreds of other fake accounts.[4] But even more exist today.

Facebook post created by a troll account operated by Russia's IRA, designed to stir up anti-Muslim sentiment.[5] Later that year, a poster embossed with a quote falsely attributed to Clinton - "I think Sharia Law will be a powerful new direction of freedom" - was widely circulated during the presidential campaign.[6]


So how can ordinary Americans differentiate between the post of a fellow citizen and that of a foreign actor? If you feel yourself reacting to a post, whether with approval or scorn, get in the habit of asking yourself the following ten questions:

  1. Can you identify the person behind the post from personal markers or conversations?
    Trolls don't offer information about jobs, family, hobbies, and the like, or post threads containing conversations with a friend, relative, or coworker - because those things don't exist. When personal details do appear, as an attempt to seem more genuine, they tend to be sparse and nonsensical; many accounts forgo a personalized description altogether. Furthermore, because these accounts are not local to any particular community, they typically lack content indicating a locality, such as events in town, restaurants running specials, or GoFundMe pages for people in the community.

  2. Does the account include a particular state's name in its title?
    Troll accounts often target voters in swing states via the name of the account or the profile description. Similar to the prior point, these accounts will often lack content about the state itself. For example, an account may support Nevada Boys in Blue initiatives while ignoring any news or issues from Nevada.

  3. Do the majority of their posts espouse the same extreme view?
    Because the purpose of these accounts is to push an agenda, the political direction and intensity of the posts rarely stray. The feed from such an account will frame every political, social, or cultural issue in aggressive, disparaging 'us vs. them' terms, with no room for compromise. The most common posts of this type go on the offensive, attacking opposing organizations.[7]
     
  4. Is the account following significantly more people than are following it?
    Most troll accounts, in an effort to gain followers quickly, will follow as many people as possible, producing a skewed ratio of followers to accounts followed. This generally applies, however, only to new accounts looking to gain traction, or to less successful accounts that never went viral.

  5. Do they show a sub-par or overly formal grasp of English?
    Although foreign trolls have recently become much savvier English writers, awkward English can still indicate a potential troll. In terms of tone, bots especially tend to be more formal, particularly on Twitter.[8] It should be said, however, that there is a difference between foreign trolls and foreign or foreign-born account holders: many posts are written by legitimate non-native English speakers who may also be unfamiliar with American customs, and who can therefore appear troll-like.

  6. Does the account have several posts removed from the platform?
    Many troll accounts are already under scrutiny from the platform for sharing misleading information. While removed posts don't prove an account is a troll, they are reason to be wary when engaging with its content.

  7. How many times a day does the account post?
    Bots, specifically, can post upwards of once per minute - hundreds of times a day. However, these are almost always retweets or reposts of other bots' fake posts, or of the same prominent figure's posts, in what is known as amplification.[9] In the same vein, almost half of troll content from 2015 to 2017 consisted of retweets or nods to other accounts.[10] (A rough sketch of how these rate and ratio signals might be checked follows this list.)

  8. Are the people in the picture(s) extremely attractive?
    Trolls often use photos of unusually attractive people (typically women in their 20s) to attract attention and to engender a sense of trust[11] - studies have shown that a person's attractiveness correlates with others' willingness to trust them.[12]

  9. Does the account claim to be managed by a member of a minority community?
    Trolls often pretend to belong to a minority race or class in order to make their emotional appeals about larger institutions more credible. While many of these accounts post legitimate stories and genuine concerns shared by others who identify similarly, many are not only fake but deliberately inflammatory; they seek to inspire hatred and discord, not compromise.[13]

  10. Does the account begin with uplifting content before pivoting to divisive, uncompromising messages?
    Trolls often perform a bait and switch, first posting uplifting - though fake - stories that allow the account to gain followers. Other accounts, real or otherwise, repost these stories, further expanding the original's reach. Once they've accumulated followers, the accounts strike with more extreme, targeted stories. Using this honeypot tactic, these accounts 'catch more flies': they reach more people and appear more trustworthy when they finally post extreme stories.[14]
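For readers who want to see how the quantifiable signals above could be checked mechanically, here is a minimal sketch in Python, assuming a hypothetical account record with follower, following, posting-rate, and repost-share fields. The thresholds (a roughly 10-to-1 following-to-follower skew, more than 100 posts a day, more than half reposts) are illustrative assumptions loosely echoing questions 4 and 7, not values drawn from any platform or from Linvill and Warren's research.

```python
# Illustrative sketch only: field names, thresholds, and flags are
# assumptions chosen for demonstration, not validated research values.

from dataclasses import dataclass
from typing import List


@dataclass
class AccountStats:
    followers: int        # accounts that follow this one
    following: int        # accounts this one follows
    posts_per_day: float  # average number of posts per day
    repost_share: float   # fraction of posts that are retweets/reposts (0-1)


def suspicion_flags(stats: AccountStats) -> List[str]:
    """Return human-readable reasons an account's numbers look troll-like."""
    flags = []

    # Question 4: following far more accounts than follow back.
    if stats.following > 0 and stats.followers / stats.following < 0.1:
        flags.append("follows roughly 10x more accounts than follow it back")

    # Question 7: machine-like posting rate, mostly amplifying others.
    if stats.posts_per_day > 100:
        flags.append("posts more than 100 times a day")
    if stats.repost_share > 0.5:
        flags.append("more than half of its posts are reposts")

    return flags


if __name__ == "__main__":
    example = AccountStats(followers=80, following=4500,
                           posts_per_day=240, repost_share=0.7)
    for reason in suspicion_flags(example):
        print("-", reason)
```

None of these numeric checks is conclusive on its own; as the questions above stress, they are prompts for a closer, human reading of an account, not proof.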

While the majority of social media accounts are genuine - even many that show some of these signs - skepticism toward accounts that meet several of the above criteria is warranted. If you come away suspecting that a social media account is fake, the best course of action is simply to unfollow it or notify the platform. Without complete certainty about who is behind the post, trying to expose the hoaxer could mean confronting a genuine person and causing unnecessary strife. The point of professional troll accounts is to sow discord and malevolence; by confronting an account holder, you only expand the scope of their work.

The endgame for professional trolls echoes the final words of the episode's otherworldly visitors: "Their world is full of Maple Streets, and we'll go from one to the other and let them destroy themselves, one to the other, one to the other, one to the other." Like those on Maple Street, this kind of modern-day self-destruction is manipulated by foreign actors beyond American jurisdiction, drawing citizens into a hotbed of confusion and paranoia. Out of a sense of self- and community-preservation, 'ordinary' people can become online "monsters". More than that, every day these posts appear more and more genuine. And if recent history is any indication, it may be only a matter of time before the malevolence and disorder online spills into the streets - unless 'ordinary' people become better at recognizing those behind the controls.

Note: The list of questions was derived from Clemson University professors Darren Linvill and Patrick Warren's clever "Spot the Troll" game, in which they review all of this information in more detail with specific examples. Through their Media Forensics Hub, Linvill and Warren have synthesized a wealth of information to help the public identify fake social media accounts.

References
1. "Russian trolls can be surprisingly subtle, and often fun to read". The Washington Post. Published: March 08, 2019.

2. "Cyborgs, trolls and bots: A guide to online misinformation". AP News. Published: February 07, 2020.

3. "Why are Russian trolls spreading online hoaxes in the U.S.?". PBS News Hour. Published: June 08, 2015.

4. "Exposing Russia's Effort to Sow Discord Online: The Internet Research Agency and Advertisements". U.S. House of Representatives Permanent Select Committee on Intelligence. Published: January 06, 2017.

7. ""THE RUSSIANS ARE HACKING MY BRAIN!"". Science Direct. Published: October 01, 2019.

8. "How Many Bots in Russian Troll Tweets?". Science Direct. Published: November 01, 2020.

9. "Engaging with others: How the IRA coordinated information operation made friends". Harvard Kennedy School. Published: April 06, 2020.

12. "Rice study suggests people are more trusting of attractive strangers". Rice University News and Media Relations. Published: September 21, 2006.