Conspiracy Theories, Misinformation, Romance Scamming, White Nationalism, Twitter, and YouTube

Trigger warning: this post discusses misinformation related to the suicide of a person implicated in sex trafficking of minors. While the post does not go into detail on any of these topics, they are part of the subtext.

1. Summary

The dataset used in this analysis was pulled from two hashtags that trended on Twitter the morning Jeffrey Epstein's body was found in his cell. By virtue of Twitter surfacing these hashtags as trending, countless people were exposed to lies, misinformation, and conspiracy theories related to the suicide of Epstein.

  • Accounts looking to spread conspiracy theories about Epstein's death had a ready supply of websites and videos to support those theories.
  • Multiple videos spreading conspiracy theories were created within hours of the story breaking; these videos were shared hundreds of times that day, and collectively viewed millions of times in the thirty days after Epstein's suicide.
  • YouTube provides a convenient platform for spreading misinformation in real time. 10 of the top 17 videos shared were created on August 10th.
  • Twitter remains an effective medium for amplifying misinformation and hate sites over factual reporting. Links to the Gateway Pundit were shared 720 times, compared to 417 shares of CNN links.
  • Out of the 7,455 most active accounts, over a thousand showed two or more signs of potential inorganic activity. Manual spot checks of these accounts show accounts using stolen or stock images in their profiles and in their feeds.

2. Introduction

The day Jeffrey Epstein was found dead in his jail cell after his suicide, conspiracy theories and their related hashtags trended on Twitter. Two conspiracy theories spread broadly: the first claimed Epstein was murdered because he "knew too much;" the second, related theory claimed that the Clintons -- especially Hillary -- were directly involved. On August 10 -- the day Epstein was found dead -- these conspiracy theories were spread on Twitter so heavily that the hashtags "epsteinmurder" and "clintonbodycount" trended.

The posts used in this analysis were collected from Twitter in response to the search terms "epsteinmurder" or "clintonbodycount" on the day Epstein was found dead. Because the content was collected from terms directly related to conspiracy theories, we can expect to see more content from misinformation sites, and more content from accounts actively engaged in spreading misinformation. This analysis examines the domains shared to support the misinformation, the YouTube videos shared, and selected accounts that stood out in the data.

3. Activity Overview

In the hours after Epstein's death, at least 312,623 posts were shared by 137,962 accounts. 947 accounts, or 0.69 percent of all accounts active in this spike, created 10% of the content. This imbalance of activity -- where under 1% of participating accounts create more than 10% of all content -- is in line with other recently observed spikes in activity.
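
For readers curious about the mechanics, concentration like this is straightforward to compute. Below is a minimal Python sketch; the per-post author list is a hypothetical stand-in for the collected Twitter data, not the actual pipeline used here.

```python
from collections import Counter

def accounts_needed_for_share(post_authors, share=0.10):
    """Count how many of the most active accounts it takes
    to produce `share` of all posts."""
    counts = Counter(post_authors)  # posts per account
    total_posts = sum(counts.values())
    running = 0
    for rank, (_account, n_posts) in enumerate(counts.most_common(), start=1):
        running += n_posts
        if running >= share * total_posts:
            return rank, rank / len(counts)
    return len(counts), 1.0

# Toy example; the real dataset had 312,623 posts from 137,962
# accounts, where 947 accounts (0.69 percent) created 10% of posts.
n_top, fraction = accounts_needed_for_share(["a"] * 50 + ["b"] * 30 + list("cdefghij"))
print(f"{n_top} accounts ({fraction:.2%} of all accounts)")
```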

Posts on "epsteinmurder" and "clintonbodycount"

4. Domain Shares

During this spike in conversation, the accounts active on these two conspiracy-related hashtags shared links to approximately 400 distinct domains a total of 6,280 times.
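
The tally itself is a simple parse-and-count. A minimal sketch, assuming each collected post carries a list of expanded URLs (the `urls` field name is a hypothetical stand-in for however the collection pipeline stores links):

```python
from collections import Counter
from urllib.parse import urlparse

def count_domains(posts):
    """Tally shared domains across a collection of posts."""
    domains = Counter()
    for post in posts:
        for url in post.get("urls", []):  # hypothetical field of expanded URLs
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):   # normalize the www. prefix
                host = host[4:]
            if host:
                domains[host] += 1
    return domains

# Toy example; the real dataset had ~400 domains shared 6,280 times.
sample = [
    {"urls": ["https://www.cnn.com/2019/08/10/us/story.html"]},
    {"urls": ["https://www.youtube.com/watch?v=VIDEO_ID"]},
]
for domain, n in count_domains(sample).most_common(20):
    print(domain, n)
```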

Among the top 20 most popular domains, right-leaning to far right sites dominated the shares, with 9 right wing domains shared 2,515 times. 7 different mainstream to left leaning to left wing sites were shared 928 times. For information on how sites are categorized, see this explanation.

All of the mainstream to left leaning to left wing sites were news organizations. CNN was the most-shared mainstream site, with a total of 417 shares. In comparison, Gateway Pundit was shared 720 times, and links to the site of Ben Garrison -- a cartoonist -- were shared 1,129 times.

Infowars and VDARE were each shared 98 times, tying them for the 8th most popular domain shared. Infowars has a long track record as a source of hate and misinformation, and VDARE is a white nationalist website. In contrast, links to the New York Times were shared 41 times, making the Times the 24th most popular domain shared.

Expanding out to the top 50 most popular domains shared, right wing domains continue to dominate the conversation. Given the hashtags used to generate the dataset, this isn't surprising. However, the amount of content ready to be shared -- or created in response to the news of Epstein's death -- is worthy of note. Additionally, the ease with which this content spreads on Twitter -- and with which Twitter highlights parallel bodies of content on other platforms -- shows a depth to the far right misinformation universe that should not be overlooked.

In the top 50 most shared domains, 23 right leaning or far right domains were shared 2,901 times. 12 mainstream to left leaning to far left domains were shared 1,080 times. The full list of domains shared can be read below, and it includes multiple far right misinformation sites and hate sites.

5. Videos and YouTube

Several video sharing sites -- Bitchute, Dailymotion, RedIce, and DLive -- all appeared in the top 50 domains shared. These platforms were all used to distribute a small number of channels of far right content and/or conspiracy theories. While YouTube remains far and away the most popular platform for sharing and disseminating hate speech and conspiracy theories, some people appear to be hedging their bets, using smaller platforms as a backup against the (highly unlikely) possibility that YouTube actually makes good on its years' worth of largely empty promises to address hate speech.

Links to YouTube videos were shared 808 times; 17 videos account for just under half (402) of those shares.
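
Counting shares per video means normalizing YouTube's different URL forms (youtube.com/watch?v=... and youtu.be/...) down to a single video ID. A sketch of that normalization, assuming the URLs have already been expanded from Twitter's t.co shortener:

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

def youtube_video_id(url):
    """Extract a video ID from common YouTube URL forms, or return None."""
    parsed = urlparse(url)
    host = parsed.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host == "youtu.be":
        return parsed.path.lstrip("/") or None
    if host in ("youtube.com", "m.youtube.com"):
        return parse_qs(parsed.query).get("v", [None])[0]
    return None

def count_video_shares(urls):
    """Tally shares per video ID across a list of expanded URLs."""
    return Counter(v for v in map(youtube_video_id, urls) if v)

# Both URL forms below resolve to the same video ID.
shares = count_video_shares([
    "https://www.youtube.com/watch?v=VIDEO_ID",
    "https://youtu.be/VIDEO_ID",
])
print(shares.most_common(17))
```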

Only two of the top 17 videos were from mainstream to left leaning news sources (CBS and MSNBC). However, the content of both videos has been widely used in conspiracy theories. The CBS video is an outtake from a 2011 interview with Hillary Clinton when she was secretary of state; Clinton is on camera discussing the death of Libyan dictator Moammar Qaddafi. The MSNBC video is a more recent story showing photos of Trump and Epstein together at Mar-a-Lago in the early 90s.

10 of the top 17 videos were created on August 10th; as of late August, these new videos had 2,273,639 views. None of these videos were from mainstream news organizations. While YouTube continues to promise improvements to its platform, these efforts still allow questionable content to flourish. Hate speech and misinformation thrive on YouTube and, given YouTube's track record, will likely continue to do so for the foreseeable future.

6. Questionable Accounts Engaging in the Conversation

This section looks at questionable accounts that were active within the dataset using conspiracy-related hashtags. While this section examines account behavior with traits that often indicate inorganic or artificial attempts to influence a conversation, the analysis does not examine whether the inorganic behavior on display can be attributed to a network, or whether there is any level of coordination between accounts.

Additionally, while this section will highlight multiple examples that appear to be clear cases of inorganic activity, it is always healthy to remember that reality is complicated. While I am relatively comfortable that the highlighted examples show accounts engaging in inorganic behavior, precise attribution with 100% confidence is notoriously difficult. People are wonderful and strange; we do weird things all the time, so while an observation can appear likely when viewed within a dataset, reality can make a liar out of the best data.

Additionally, when looking at behavior within politically motivated and conspiracy-obsessed communities, the activities of a true believer can look a lot like the activity of a troll, bot, or sockpuppet. Because of this inherent uncertainty, I use screenshots of accounts, but I blank out their username and their account name.

All of this is a long way of saying that additional analysis beyond the scope of this writeup would be required to do attribution with a higher level of accuracy. The platforms themselves (in this case, Twitter) have the most complete raw data that would make a more thorough and more accurate analysis possible.

When looking at accounts within the dataset that engage in behavior that might be inorganic, I limited the analysis to the accounts that were in the 95th percentile or higher (as measured by post count) during the 12 hours of the spike.
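
For reference, a minimal sketch of that percentile cut, assuming a per-account post count has already been computed for the 12-hour window (the table and column names are hypothetical):

```python
import pandas as pd

# Hypothetical per-account activity for the 12-hour spike:
# one row per account, with its post count in the window.
activity = pd.DataFrame({
    "account_id": ["a1", "a2", "a3", "a4", "a5"],
    "post_count": [120, 3, 45, 2, 9],
})

# Keep accounts at or above the 95th percentile of post volume.
cutoff = activity["post_count"].quantile(0.95)
most_active = activity[activity["post_count"] >= cutoff]
print(f"cutoff={cutoff:.0f}, kept {len(most_active)} of {len(activity)} accounts")
```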

Among the 7,455 most active accounts, 258 were created during June, July, or August 2019.

Among these 258 new accounts, many post at an incredibly high rate, with some averaging hundreds of posts a day, every day. To put this in perspective, posting 50 times a day translates to roughly six posts an hour over an 8-hour day, or three posts an hour over 16 hours on social media, every day. Of the new accounts, 89 have averaged 100 or more posts per day; 147 have averaged 50 or more posts per day; and 57 have amassed 1,000 or more followers in the short time (between 1 and 100 days) they have been active. While post volume alone is not an adequate measure of abnormal activity, accounts that meet the following criteria merit additional review (a filtering sketch follows the list):

  • new,
  • highly active in general,
  • highly active on a hashtag associated with a fringe conspiracy theory, and
  • collecting large numbers of followers quickly.
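
As a rough illustration, these criteria translate into a simple filter. The thresholds below are illustrative guesses, not the exact cutoffs used in this analysis:

```python
import pandas as pd

# Hypothetical per-account summary; a real pipeline would derive
# these fields from account metadata and the collected posts.
accounts = pd.DataFrame({
    "account_id":       ["a1", "a2", "a3"],
    "account_age_days": [35, 900, 20],      # days since account creation
    "posts_per_day":    [140.0, 4.2, 80.0], # lifetime average
    "spike_posts":      [60, 2, 45],        # posts on the two hashtags
    "followers":        [2500, 300, 400],
})

flagged = accounts[
    (accounts["account_age_days"] <= 90)   # new
    & (accounts["posts_per_day"] >= 50)    # highly active in general
    & (accounts["spike_posts"] >= 20)      # highly active on the hashtags
    & (accounts["followers"] >= 1000)      # gathering followers quickly
]
print(flagged["account_id"].tolist())  # ['a1'] with these toy numbers
```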

A spot check of accounts that meet these 4 criteria shows additional signs of illegitimacy, reinforcing the likelihood that a subset of these accounts is not authentic.

As an example, the account pictured below was created in mid-July.

Account page

Since its creation, the account has posted over 37,000 times (measured through September 10), averaging just over 600 posts a day. Throughout this time, the account has also maintained a close balance between the number of accounts it follows and the number of accounts that follow it, and the account has just over 2,800 followers.

The account's profile picture appears to be pulled from a profile for a user on the site "Rentmen" from 2015 and 2016.

Rentmen archive

Out of the 7,455 most active accounts, 2,140 averaged 50 or more posts a day, every day, and 937 averaged 100 or more posts a day. Of the 2,140 accounts that averaged 50 or more posts a day, 1,102 had roughly the same number of followers as accounts they followed, which is suggestive of follow-back schemes or of automated processes that balance follower count against following count. While a balanced follower count alone can result from ordinary human behavior, when multiple indicators of inorganic behavior are observed across a large group of accounts, additional review is warranted.
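
One way to flag a "roughly balanced" follower/following relationship is a simple relative-difference check; the 10 percent tolerance below is an illustrative assumption, not the exact threshold used here:

```python
def roughly_balanced(followers, following, tolerance=0.10):
    """Return True when follower and following counts are within
    `tolerance` of each other, relative to the larger count."""
    larger = max(followers, following)
    if larger == 0:
        return False
    return abs(followers - following) / larger <= tolerance

print(roughly_balanced(2800, 2750))  # True: counts nearly match
print(roughly_balanced(2800, 120))   # False: followers far exceed following
```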

Among the most active accounts, a quick spot check surfaced multiple accounts that, while not displaying the high post counts or close follower/following ratio described above, had other traits that could indicate the accounts are not authentic.

Account page

For example, this account actively discussed and amplified conspiracy theories around Epstein's suicide, in addition to pushing content that supports other popular far right narratives.

Post attacking Ilhan Omar

However, the profile picture of the account is a lightly photoshopped version of a photo of Maria Selena Nurcahya, a person from Indonesia who competed in the 2012 Miss Universe contest (source: http://beautypunkfashion.blogspot.com/2012/10/maria-selena-beauty-fashion-dress-miss.html).

MS site and image

Another example account that came up in a quick spot check is shown below.

Account page

This account promotes conservative books, and has amplified multiple hard right stories and conspiracy theories, including about Epstein's suicide. The account also makes statements about being in specific locations, and uses photos presented as if they are of the account holder.

Back in "Texas"

However, a quick search shows that the photo is commonly used by romance scammers -- the picture is of a person who performs under the name "Ann Angel."

Romance scammer photos

A second post states that the account holder just donated to the NRA.

Tweet claiming support for the NRA

This post reuses the Ann Angel picture highlighted above, and also incorporates a stock photo of a blond woman on a motorcycle.

Stock photo - woman on motorcycle

While there are possibly legitimate reasons why an account might post conspiracy theories, average nearly 70 posts a day for over 1,000 days, amplify multiple far right accounts and sites, and use the same pictures as romance scammers, these behaviors are also strong signs that the account could be engaged in inorganic activity. The use of tactics favored by romance scammers is especially telling.

The three manual spot checks described here show accounts that each display multiple signs of questionable behavior. Precise attribution, however, is very difficult -- more importantly, the behavior of a fervent believer can, to a non-believer, look highly abnormal. With all that said, the collective behavior within this dataset trends away from accuracy, and toward speculation founded in racism, paranoia, and misogyny.

7. Conclusion

Between the popularity of hate and misinformation sites relative to more accurate news sites, the complete dominance of far right content on YouTube and selected smaller video platforms, and the questionable accounts amplifying these questionable sources, Twitter remains a central element in the spread of misinformation to a broad audience. As noted in the earlier writeup about conversational spikes about Ilhan Omar:

Over time, the right leaning to far right content creates an ever-growing foundation of sources that it can use to buttress arguments in future conversations. This ever-growing body of content provides a repository that reinforces a world view and a perspective. Conversations about specific issues become less about the individual issue, and more about proselytizing a world view and bringing people into the fold.

Social media platforms are all collecting the data that would help them address these problems more effectively. Twitter's follower recommendations are surprisingly accurate: when a person visits the account of a likely troll or romance scammer, the "other accounts" block is often filled with other trolls or scammers. YouTube's "recommended videos" list and Facebook's "related pages" block are equally (and surprisingly) accurate. Moreover, each of these services specializes in targeted ads based on accurate prediction of, and insight into, our interests. If we are to believe that their adtech isn't complete junk, then we should also expect them to bring a comparable level of precision to addressing hate speech and misinformation on their platforms.

This analysis doesn't cover the sharing of misinformation on Facebook, Instagram, Reddit, or Snapchat. But even leaving those platforms out of the picture, given the relative ease with which Twitter can be used to get conspiracy theories trending -- and the equal ease with which conspiracy theories can be distributed at no cost on YouTube -- we are in for a long and ugly run-up to the 2020 election.

8. Domain sharing information

The top 50 domains shared on Twitter: