0:01 Hey everyone, so some of you may have seen the video I put out a few days ago debunking the electric universe. Needless to say, it pissed off a few people, most notably, this guy. Ben Davidson. He made sure to let me know how pissed off he was by email. He doesn’t seem to understand that I was debunking the preposterous fantasy led by Wal Thornhill and pals, who themselves refer to it as the “electric universe,” a phrase that is everywhere on various websites, including the URL itself.
Could the earth’s weakening magnetic field “kill tens to hundreds of millions of people in the next few decades,” as the YouTuber Ben Davidson (Suspicious0bservers) claims? No, says the solar physicist behind the YouTube channel Space Weather.
In the video “DEATH by COSMIC RAYS?,” the expert behind Space Weather debunks Ben Davidson’s (Suspicious0bservers’) outlandish claim that the weakening of the earth’s magnetic field “could kill tens to hundreds of millions of people in the next few decades.”
A partial transcription of some of the more important takeaways from this video can be found at the end of this blog, and I encourage readers to watch the video in its entirety for more information. But first…
Ever been in the comments section under a YouTube video and thought: WTF are these lunatics talking about? Chances are you’ve seen, or even taken part in, a comment thread that’s been created and curated by an Internet troll.
This blog post is an introduction to the darker side of the Internet: how “sockpuppet accounts” are used for trolling, and the impact they can have on online discourse between members of the public and niche audiences led by individuals with vested interests.
In the New Scientist article “Sock puppet accounts unmasked by the way they write and post,” Edd Gent reports that researcher Srijan Kumar, of the University of Maryland, said: “In the era of fake news, detecting sock puppets is important…Whenever multiple accounts are used by the same party it is harmful and it skews the discussion and fake news can be propagated very confidently” (Gent 2017).
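To give a flavor of how “the way they write” can betray a sockpuppet: one simple stylometric signal (an illustrative sketch, not the actual method from Kumar’s research) is to compare the character-trigram frequency profiles of two accounts’ comments. Accounts run by the same person tend to share quirks of spelling, punctuation and phrasing, so their profiles score a higher cosine similarity than unrelated writers. The comment strings below are hypothetical.

```python
# Illustrative sketch only: stylometric comparison of two accounts'
# comments via character trigrams. Researchers combine signals like
# this with posting patterns to flag likely sockpuppets.
from collections import Counter
import math

def trigram_profile(text):
    """Count overlapping character trigrams in lowercased text."""
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine_similarity(a, b):
    """Cosine similarity between two trigram Counters (0.0 to 1.0)."""
    dot = sum(a[g] * b[g] for g in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical comments: two "accounts" writing in the same voice
# score higher against each other than against an unrelated writer.
acct_a = "Wake up people!! The mainstream wont tell you the truth!!"
acct_b = "The mainstream wont tell you whats coming!! Wake up!!"
acct_c = "Interesting video; the magnetometer data seems consistent."

sim_ab = cosine_similarity(trigram_profile(acct_a), trigram_profile(acct_b))
sim_ac = cosine_similarity(trigram_profile(acct_a), trigram_profile(acct_c))
```

Real detection systems are far more sophisticated, but the principle is the same: the more two “different” accounts write alike and act alike, the less likely they are to be different people.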
It’s going to take some time to break down and go through the entirety of the scientific Gish gallop that Ben Davidson uses throughout the video “Discussion with Suspicious Observers.” For now, I feel it’s important to point out Ben Davidson’s use of motivated reasoning to reduce the discomfort of the cognitive dissonance that arises whenever an expert disagrees with him.
In a previous blog article, I mentioned that Ben Davidson has a disclaimer (published in all capital letters) on the “About” section of his website that says, “I OFTEN INTERJECT MY OPINIONS ABOUT THE TOPICS PRESENTED ON THE CHANNEL AND ON THIS SITE, AND I ATTEMPT TO CLEARLY COMMUNICATE WHEN THAT IS THE CASE” (Davidson 2018). I went on to point out that this is:
In addition to being the home of cute puppy photos and millions of cat videos, the Internet can also be a house of horrors when someone uses it for the purpose of trolling, stalking or harassing another individual. While a crackdown on cyber-bullying using the laws that are already on the books has been gaining popularity among the public, cyber-harassment is still commonplace on YouTube, which remains the vulgar Wild West of the Web.
The fake news and pseudoscience being propagated through popular social media platforms present unique challenges to the existence of free speech on the Internet, with the old axiom attributed to Daniel Patrick Moynihan being as relevant as ever: everyone is entitled to their own opinion, but not to their own facts.
Taking that a step further: everyone is entitled to their own opinion, but not to preventing another from sharing their own opinion. Cyber-harassment is intimidation that uses threats and coercion in an attempt to control or manipulate the person being targeted. On social media it’s often employed as a tactic to silence an opponent and quell damaging dissent or questions.
It’s a simple fact that a user of social media is more likely to interact with and share content if it looks like other users are doing the same. But we now have to consider whether those initial interactions are even real, or whether we’re being duped into thinking we’ve found something more popular than it actually is.
For the most part, people don’t want to belong to a group that is seen as “unpopular.” However, there is a way of developing support for an “unpopular” person, group or cause by creating the illusion of popularity through manipulating how social media works; this artificial popularity can garner actual support in the real world.
In this case, some content creators benefit from an illusion of popularity created by inflating follower/subscriber counts with “follower bots”; over time, that artificial boost can translate into real popularity, actual support and financial gain.