The War Against Fake News and Pseudoscience on YouTube: Computational Propaganda Creates the Illusion of Popularity and Support

It’s a simple fact that a social media user is more likely to interact with and share content if it looks like other users are doing the same. What we now have to consider is whether those initial interactions are even real, or whether we’re being duped into thinking we’ve found something that’s more popular than it actually is.

For the most part, people don’t want to belong to a group that is seen as “unpopular.” However, there is a way of developing support for an “unpopular” person, group or cause by creating the illusion of popularity through manipulating how social media works; this artificial popularity can garner actual support in the real world.

In this case, some content creators benefit from an illusion of popularity created by inflating follower/subscriber counts with “follower bots,” an illusion that can translate into real popularity, actual support and financial gain over time.

What is computational propaganda?

According to Professor Phil Howard, of Oxford University, computational propaganda is “the use of information and communication technologies to manipulate perceptions, affect cognition, and influence behavior” (Howard 2016).

While “bots” are not used exclusively for political purposes, the political arena happens to be where they’re most pervasive and damaging. To a lesser extent, however, bots on social media can play a role in manipulating the perceptions of niche audiences by lending the appearance of popularity: a social media rendition of “might is right,” with higher follower/subscriber numbers standing in for legitimacy.

In The Guardian article, “Facebook and Twitter are being used to manipulate public opinion – report,” the author, Alex Hern, points out that:

“At their simpler end, techniques used include automated accounts to like, share and post on the social networks. Such accounts can serve to game algorithms to push content on to curated social feeds. They can drown out real, reasoned debate between humans in favour of a social network populated by argument and soundbites and they can simply make online measures of support, such as the number of likes, look larger – crucial in creating the illusion of popularity” (Hern 2017).

Regardless of where computational propaganda occurs, the idea behind it remains the same: bots inflate the number of followers, likes, shares or general activity, which can give the appearance of support that creates an illusion of popularity.

This is what’s known as “manufacturing consensus,” according to Samuel Woolley, a project director at the Oxford Internet Institute’s Computational Propaganda Research Project. In this case, manufacturing consensus applies to the creation of the illusion of popularity and support for someone who might not have had it in the first place.

The Argumentum ad Populum (Bandwagon) Fallacy

The reason bots can succeed with an uninformed, unaware audience is a logical fallacy known as:

“Argumentum ad Populum, (popular appeal or appeal to the majority): The fallacy of attempting to win popular assent to a conclusion by arousing the feeling and enthusiasms of the multitude” (Lander.edu 2018).

The introductory philosophy website from Lander.edu goes on to list three variations of the fallacy, which are “snob appeal, bandwagon, and appeal to emotion.” This blog post will be focusing on just one of these three in relation to computational propaganda: the “bandwagon fallacy.”

The bandwagon fallacy occurs when someone attempts to “prove a conclusion on the grounds that all or most people think or believe it is true,” which is undoubtedly what the expert behind the Space Weather YouTube channel is alluding to when he says in one of his videos that he believes “Ben [Davidson] is creating a false belief system.”

The YouTuber Ben Davidson (Suspicious0bservers) uses this logical fallacy every single time he mentions how many YouTube subscribers (“0bservers”) he has before, or after, he tries to make a point. Anyone can easily check a subscriber/follower count for themselves, so it’s irrelevant to just about any legitimate argument one would want to make in an age where people know that Twitter followers, Facebook likes, and YouTube subscribers/views can all be purchased. [It’s important to note that there’s a difference between flat-out purchasing these things and paying for advertising on social media to promote content; the latter is a legitimate form of promotion.]

RationalWiki on the argumentum ad populum fallacy:

“The argument is problematic because unfortunately, the premise ‘the majority is always right’ may not be true. When phrased like this, few people would say that they’d fall for such a stupid thing – but it’s still a remarkably easy trap to fall into, precisely because people don’t realise that it’s a bandwagon that they’re jumping on…something that gains attention (legitimately or otherwise) will attract more interest…While this is merely just how information tends to propagate, the bandwagon argument truly becomes fallacious when people use it as an excuse to say that an issue is important or that the circulating opinion must be correct” (RationalWiki 2018).

One of the most obvious examples of someone who uses the bandwagon fallacy, and even has real-world experience cruisin’ one across the USA, is none other than YouTuber Ben Davidson (Suspicious0bservers).

In a tweet from February 7th, 2017, Ben used the argumentum ad populum (if many believe so, it is so) fallacy as computational propaganda when he requested to speak with Fox News reporter Will Carr:

[Screenshot: Ben Davidson’s tweet to Fox News reporter Will Carr, February 7th, 2017]

In this odd put-down that’s really a humblebrag, “I only have 13,000 followers on twitter,” Ben Davidson (owner of SpaceWeatherNews LLC) reveals how little he knows about how news (even Fox News) is made. Ben seems to think that he can tell the reporter what the news story is, as opposed to the reporter deeming something newsworthy and then seeking Ben out for an interview in order to ask questions that accurately inform the public of the five W’s (who, what, when, where, why) while answering the all-important question of ‘so what?’

If the story isn’t about how many subscribers/followers he has on the internet, there’s no reason for Ben to have led with that, as if his follower count were somehow relevant to why he’d “love to speak with [Will]” about a “news story” in the first place. Luckily, reporters are becoming increasingly aware of just how embarrassing, and damaging, it can be when the sources for their stories aren’t vetted properly.

All of this is business as usual for Ben Davidson: instead of offering his ideas up for peer review and conversing with qualified academics, he’d rather present his “findings” to the general public, or some clueless reporter, so that he can bask in the momentary praise.

How to Spot Fake Twitter Followers

Back to the logical fallacy of “subscriber count equals might/right.” The real question here becomes: how many of Ben Davidson’s Twitter followers are even real?

This is especially relevant because, shortly after Ben Davidson tweeted his follower count at Fox News reporter Will Carr, he gained roughly 1,700 followers in less than 48 hours, almost all of them Twitter accounts with no bio or profile picture. A missing bio is one of the telltale signs that an account is a fake follower, or “follower bot.”

In the article ‘6 Signs a Twitter Account Has Fake Followers,’ Prasanna Bidkar lists six signs that an account’s followers are fake:

  1. Low Following-Follower Ratio
  2. Blank Twitter Bios
  3. Instant Spike in Followers
  4. Twitter Account Analysis
  5. Low Audience Follower Distribution
  6. Twitter Follower Quality Score

Of these six signs, numbers two and three are the most relevant for this blog post. The “instant spike” in Twitter followers occurred over the course of two weeks in February 2017 and consisted almost entirely of accounts with “blank Twitter bios,” the same kind of bot accounts shown in Bidkar’s article.
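
To make signs two and three concrete, here is a minimal Python sketch of how one might screen a follower list for blank bios and flag sudden overnight jumps in a follower count. The data structures, the example handles, and the 10% spike threshold are illustrative assumptions only; this is not Twitter’s API or TwitterAudit’s actual method.

```python
# Minimal sketch: checking for Bidkar's signs #2 (blank bios) and #3
# (instant spike in followers). All data here is hypothetical.

from dataclasses import dataclass

@dataclass
class Follower:
    handle: str
    bio: str
    has_default_avatar: bool

def looks_like_bot(f: Follower) -> bool:
    """Flag accounts with an empty bio or a default profile picture."""
    return not f.bio.strip() or f.has_default_avatar

def spike_days(daily_counts: list[int], threshold: float = 0.10) -> list[int]:
    """Return the indices of days where the follower count jumped by
    more than `threshold` (default 10%) since the previous day."""
    jumps = []
    for i in range(1, len(daily_counts)):
        prev, curr = daily_counts[i - 1], daily_counts[i]
        if prev > 0 and (curr - prev) / prev > threshold:
            jumps.append(i)
    return jumps

# Made-up example: one blank-bio account and one ~12% overnight jump.
followers = [
    Follower("@real_person", "Astronomer and educator.", False),
    Follower("@qwerty123984", "", True),
]
print([f.handle for f in followers if looks_like_bot(f)])  # ['@qwerty123984']
print(spike_days([13000, 13100, 14700, 14800]))            # [2]
```

A real audit tool combines many more signals (following/follower ratio, account age, tweet activity), but even these two simple checks would have flagged the February 2017 episode described next.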

The author, Bidkar, goes on to point out that:

“A sharp rise of thousands of followers in a day or week should raise a red flag. I have been active on Twitter for a few years. Getting a completely organic following and followers that do not Unfollow you once you follow them is hard work” (Bidkar 2015).

So, Ben Davidson’s tweet about “only having 13,000 followers on twitter” was made on February 7th, 2017, and by February 9th, 2017 his follower count had risen from 13,000 to 14,700. By February 13th, 2017 it had climbed to 17,100, and by February 18th, 2017 it settled around 18,000, where it stayed until it crept up (“organically”) to 18,500 by March 9th, 2017. So in two weeks Ben’s Twitter account gained around 5,000 followers, nearly all of which were follower bots with fake bios (some without a profile picture) and none of which interacted with his daily Twitter posts in any way, contributing to an even lower “Audience Follower Distribution.”
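
To put that timeline in perspective, here is a quick back-of-the-envelope calculation, using only the follower counts cited above, of how many followers per day the account gained between each observation; the rate collapses from roughly 850 a day during the spike to a couple dozen a day once it ends.

```python
# Followers gained per day between the observations cited above.
from datetime import date

observations = [
    (date(2017, 2, 7), 13_000),
    (date(2017, 2, 9), 14_700),
    (date(2017, 2, 13), 17_100),
    (date(2017, 2, 18), 18_000),
    (date(2017, 3, 9), 18_500),
]

for (d1, c1), (d2, c2) in zip(observations, observations[1:]):
    days = (d2 - d1).days
    print(f"{d1} -> {d2}: +{c2 - c1:,} followers (~{(c2 - c1) / days:.0f}/day)")

# 2017-02-07 -> 2017-02-09: +1,700 followers (~850/day)
# 2017-02-09 -> 2017-02-13: +2,400 followers (~600/day)
# 2017-02-13 -> 2017-02-18: +900 followers (~180/day)
# 2017-02-18 -> 2017-03-09: +500 followers (~26/day)
```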

A “Twitter Audit” of Ben Davidson’s account at that time, via the website TwitterAudit.com, showed that 9,476 of his followers were real while 5,238 were fake, out of a total of 14,714. This means that 64% of his followers at the time of the audit were real, while 36% were most likely gained through “inorganic, fraudulent or dishonest means,” according to TwitterAudit.com. So, by the time Ben Davidson’s (@TheRealS0s) Twitter follower count reached 18,000, roughly half of those followers were not real, i.e. they were “follower bot” accounts.
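
For transparency, here is the arithmetic behind the audit split and the “roughly half” estimate; the projection assumes, as argued above, that the followers gained after the audit were also largely follower bots.

```python
# TwitterAudit figures cited above: 9,476 real vs. 5,238 fake followers.
real, fake = 9_476, 5_238
total = real + fake                                # 14,714 at the time of the audit
print(f"real: {real / total:.0%}, fake: {fake / total:.0%}")    # real: 64%, fake: 36%

# Rough projection at ~18,000 followers, assuming the post-audit gains
# were also follower bots (the post's argument, not an audited figure).
later_total = 18_000
fake_later = fake + (later_total - total)          # 5,238 + 3,286 = 8,524
print(f"estimated fake share: {fake_later / later_total:.0%}")  # ~47%
```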

Now, I’m not saying Ben Davidson bought Twitter followers; what I am saying is that he has fake Twitter followers (follower bots) and therefore benefits from computational propaganda. Whether he’s aware of it or not is irrelevant to the fact that it is happening.

Therefore, proper annotations like these ought to be made whenever Ben Davidson uses his argumentum ad populum (bandwagon) fallacy of, “on behalf of X number of subscribers/followers (0bservers), XX% of which are fake, [insert rest of claim here].”

Who cares? Why does this matter?

In case it isn’t obvious why someone’s ideas and content should stand on their own merit, without the aid of computational propaganda techniques, I’ll spell it out: benefiting from the creation of an illusion of popularity/support is unethical as it misleads people into interacting with something that they would have otherwise ignored.

Also, I think that Prasanna Bidkar summed it up quite well in his article:

“Such subversive tactics undermine the efforts of the majority of honest people who are working hard to build an online presence. I know dozens of bloggers and small business owners who have started with zero followers and are proud to reach 200-250 followers (for example) through months of hard work” (Bidkar 2015).

Profiting off the impact that computational propaganda has on public opinion, particularly where it creates the illusion of support/popularity for scientific manufactroversies, is how certain individuals have been able to make a living off the gullibility and paranoid worldviews of others on social media; this still deserves serious inquiry.

In the meantime, question your sources and be on the lookout for any “red flags” that hint at the use of computational propaganda as a way of steering public opinion or debate. These are things every social media user should be aware of, because it’s more important than ever to fact-check yourself before you wreck yourself.


References:

Bidkar, Prasanna. “6 Signs a Twitter Account Has Fake Followers.” Business 2 Community: December 17, 2015. Website: https://www.business2community.com/twitter/6-signs-twitter-account-fake-followers-01400640#zovKkVHy5p229uVU.97

Frank, Adam. “Computational Propaganda: Bots, Targeting And The Future.” NPR: February 09, 2018. Website: https://www.npr.org/sections/13.7/2018/02/09/584514805/computational-propaganda-yeah-that-s-a-thing-now

Hern, Alex. “Facebook and Twitter are being used to manipulate public opinion – report.” The Guardian: June 19, 2017. Website: https://www.theguardian.com/technology/2017/jun/19/social-media-proganda-manipulating-public-opinion-bots-accounts-facebook-twitter

Howard, Phil. “Computational Propaganda: The Impact of Algorithms and Automation on Public Life.” Prezi: September 20, 2016. Website: https://prezi.com/b_vewutjwzut/computational-propaganda/

Lander.edu. “Philosophy 103: Introduction to Logic Argumentum Ad Populum.” Philosophy.Lander.edu: March 03, 2018. Website: http://philosophy.lander.edu/logic/popular.html

RationalWiki. “Argumentum Ad Populum.” RationalWiki.org: March 03, 2018. Website: https://rationalwiki.org/wiki/Argumentum_ad_populum
