It’s Time for YouTube to Die

Red Summit Productions
11 min read · Apr 5, 2019

By: Ruthie LaMay

Around 1 billion hours of YouTube content are watched per day

Fourteen years ago, on February 14, 2005, YouTube was born. It is now one of the most popular social media platforms and, according to Alexa Internet, the second most popular website in the world, behind only Google. Less than two years after its founding, it was acquired by Google for $1.65 billion. YouTube offers its users a place to upload their videos freely, and as of February 2017, more than 400 hours of content are uploaded to YouTube each minute, with one billion hours of content being watched daily. I, myself, probably spend at least 2 hours a day watching YouTube, and to some extent it's replaced the time I spend watching television or streaming shows online. So, what I'm about to suggest next might shock you:

I think it’s time for YouTube to die.

I know that sounds crazy, but hear me out. There are a lot of issues that have come up over the past few years that YouTube has yet to address in a meaningful way. And now that young people increasingly want to be YouTubers (in one survey of 1,000 children, 75% said they want to pursue an online video career), YouTube has no choice but to make a change, and soon. My belief is that the clock is ticking: before long, a new company will come along with these issues already solved and take over.

But what exactly are these issues?

YouTube’s Review System

YouTube has, since its release, struggled greatly to keep up with the massive number of videos being uploaded every minute. Of course, it's impossible to screen every single video before it goes up, so YouTube relies on automated systems to check whether videos adhere to its community guidelines, which, in summary, are:

  1. No nudity or sexual content
  2. No harmful or dangerous content
  3. No hateful content
  4. No violent or graphic content
  5. No bullying or harassment
  6. No scams or misleading content
  7. No copyrighted material

Seems pretty simple, but it's hard for an automated system to tell where the line is between videos that are genuinely harmful and those that are jokes. The biggest scandal to strike YouTube and expose its lackluster review system was Logan Paul's vlog scandal at the start of 2018.

Logan Paul and the Suicide Forest

Logan Paul in his since-deleted vlog from Japan

At the very end of 2017, YouTube star Logan Paul posted what seemed like a regular vlog to his YouTube channel. Despite being a controversial figure even before the scandal, Paul was immensely popular, estimated by Forbes to be worth around $12.5 million. But it didn't take long into the vlog before it was clear something was seriously wrong.

Paul and his friends took a trip to the Aokigahara forest in Japan, also known as the "Suicide Forest." Japan has one of the highest suicide rates in the developed world, and the Aokigahara forest has become a place people tragically go to carry that out. While filming there, Paul and his friends discovered the body of a man hanging from a tree. Despite appearing shocked at first, they continued to film the body and crack jokes about suicide. The video was met with widespread criticism calling it "disrespectful" and "disgusting." What's worse is that the video stayed live on YouTube for more than a day, garnering over 6 million views, before Logan Paul himself took it down.

Popular YouTuber Shane Dawson recently investigated the theory that Logan's brother, Jake Paul (also a YouTuber), is a sociopath. In that docuseries, Logan himself admitted to "sociopathic tendencies."

Besides the obvious issues with Paul, YouTube took a lot of heat for somehow allowing the video through review without anything being flagged. Several days after the fiasco, YouTube issued an apology on Twitter:

“... We expect more of the creators who build their community on YouTube... The channel violated our community guidelines, we acted accordingly, and we are looking at further consequences. It’s taken us a long time to respond, but we’ve been listening to everything you’ve been saying. We know that the actions of one creator can affect the entire community, so we’ll have more to share soon on steps we’re taking to ensure a video like this is never circulated again.”

In the aftermath, YouTube did follow up on its promise to punish Paul: it removed him from its Google Preferred ad program, ended its original series with him, and briefly pulled advertising from his channel (although it was restored less than a month later). But although YouTube promised changes to the review process, it never really took accountability for its own part in the ordeal.

Momo and YouTube Kids

Example of a thumbnail using Peppa Pig to mask more disturbing content found on YouTube Kids (via James Bridle)

YouTube Kids, a spinoff of YouTube, is advertised as a safe place for kids to enjoy YouTube content, with extra filters to keep videos appropriate for a sensitive age range. Many parents download the app and let their kids watch freely, but time and time again it has proven to be not nearly as safe as it seems.

The image associated with “Momo” is actually a Japanese sculpture titled “Mother Bird” and is completely unrelated to the challenge itself

The Momo challenge first spread in 2018 and has since resurfaced. The premise: children would discover Momo in videos, which would instruct them to send her a message. After doing so, these kids would reportedly receive instructions on how to harm themselves or, in more serious instances, kill themselves. Since its resurfacing, the Momo challenge itself has been debunked for lack of concrete evidence, but the threat of disturbing content reaching children is still very much real, just a little less on the nose.

In 2017, writer and artist James Bridle set off major alarms by discovering a slew of videos of Peppa Pig and Mickey Mouse being tortured, sexualized videos of the Disney princesses, and channels "pranking" their children by filming them wetting themselves and screaming in fear (one family of YouTubers lost custody of their children over their disturbing pranks), all of it easily accessible on the YouTube Kids platform.

The community at the blog PediMom has discovered several cartoons on YouTube Kids with disturbing content spliced into them. One mother describes what she saw while trying to stop her son's bloody nose:

“Four minutes and forty-five seconds into the video. The man quickly walked in, held his arm out, and tracing his forearm, said, ‘Kids, remember, cut this way for attention, and this way for results,’ and then quickly walked off.”

YouTube's reaction to the increasing amount of inappropriate content on the platform has been to push responsibility onto users, calling on anyone who sees these disturbing videos to report them for further review. YouTube Kids' welcome message points out that no automated system is perfect, but there's still a long way to go on YouTube's end to improve its review process, especially in the Kids app. Beyond issues of self-harm, plenty of videos that children shouldn't be exposed to can surface naturally from "family friendly" channels: conspiracy theories, racial slurs, antisemitic and alt-right ideas, and more.

The bloggers who've discovered these kinds of videos on YouTube Kids all suggest the same thing: watch what your kids are watching with them, or delete the app. Simple as that.

Child Exploitation

Mike and Heather Martin, known on YouTube as "DaddyOFive," who lost custody of their children over their YouTube pranks

On the topic of child exploitation, a ring of pedophiles was recently discovered in YouTube's comment sections. While that sounds a little dramatic, it was true. Although the videos in question weren't sexual in any way, just regular family vlogs, the comment sections were filled with users sexualizing the children. And in some cases, these predators would interact with each other, offering to exchange real child pornography right under the videos in the comments.

Tweet from family vlogger BubzBeauty regarding the recent policy change

After deleting hundreds of thousands of comments and disabling comments on tens of millions of videos, YouTube has landed on a compromise for now: turning off comments on all channels that feature children. This came as a happy medium for most creators on the platform, especially since restricting comments won't affect a channel's monetization. However, many creators are still disappointed that they'll no longer be able to interact with fans under their videos, and are waiting for a more permanent solution to the problem.

Abusing True Creators

For all the horrible stuff on YouTube that we've discussed, there's equally as much good. YouTube is full of passionate, talented creators who just want a platform to share their stories. However, YouTube hasn't been great at handling them either. Like I said, I spend a lot of my time watching YouTube, and there's been a disappointing trend over the past few months: a sharp increase in copyright disputes. This will presumably get worse over time, especially with new laws such as the EU's Article 13.

Remember number 7 of YouTube's community guidelines? No copyrighted material, unless that material falls under fair use. Fair use is a slippery concept, defined as "a legal doctrine that promotes freedom of expression by permitting the unlicensed use of copyright-protected works in certain circumstances." But what, exactly, those "certain circumstances" are is where the issues lie.

Thumbnail from YouTuber Danny Gonzalez’s video on his battles over copyright

Fair use generally applies when copyrighted content is transformed, used for educational purposes, or used only in small amounts, which makes it a very subjective standard. YouTube is full of commentary channels: movie reviewers, comedians reacting to content, and drama channels, all of which rely on fair use. However, YouTube's algorithm has no way of knowing whether copyrighted material falls under fair use and will sometimes incorrectly flag content as infringing. Beyond that, companies have the right to manually search videos for stolen copyrighted content, and with that control in companies' hands, there's not much stopping them from claiming and taking down a video unfairly.

Take YouTuber Danny Gonzalez, for example. Gonzalez's channel is built on comedic commentary about material found around the internet. For one video, he used footage from a children's animation channel called "Billion Surprise Toys." Gonzalez posted a video poking fun at the company, and in turn Billion Surprise Toys placed a copyright claim on the video, blocking it in the US and Canada. Even though Gonzalez's video falls under fair use, Billion Surprise Toys can use its power to flag any video it feels paints the company in a bad light, wrongfully declaring that it isn't fair use.

So what happens once a copyright claim is placed on a video? That video can then be demonetized, blocked in certain countries, or taken down completely. The creator can appeal the claim, but here's where the system is broken: the appeal goes to the very company that issued the claim, and that company then decides whether or not the use was fair. If it decides it wasn't, the creator gets a copyright strike on their channel. A strike prevents a creator from posting for one week and stays on the channel for 90 days. A second strike within those 90 days brings a two-week ban. If a channel accumulates three strikes within a 90-day period, it's banned forever.
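If that escalation is hard to follow, here's a minimal sketch of it as a tiny Python state machine. It models only the rules as summarized in this article (the 90-day window and the escalating penalties); the class and method names are my own illustration, not anything from YouTube's actual systems:

```python
from datetime import datetime, timedelta

STRIKE_LIFETIME = timedelta(days=90)  # per the article, strikes expire after 90 days

class Channel:
    """Toy model of the copyright-strike escalation described above."""

    def __init__(self):
        self.strikes = []  # timestamps of strikes received

    def active_strikes(self, now):
        # Only strikes from the last 90 days count toward penalties.
        return [s for s in self.strikes if now - s < STRIKE_LIFETIME]

    def receive_strike(self, now):
        self.strikes.append(now)
        count = len(self.active_strikes(now))
        if count >= 3:
            return "channel permanently banned"
        elif count == 2:
            return "posting suspended for 2 weeks"
        else:
            return "posting suspended for 1 week"

# Example: three strikes inside one 90-day window end the channel.
channel = Channel()
t0 = datetime(2019, 1, 1)
print(channel.receive_strike(t0))                       # 1-week suspension
print(channel.receive_strike(t0 + timedelta(days=30)))  # 2-week suspension
print(channel.receive_strike(t0 + timedelta(days=60)))  # permanent ban
```

The key point the sketch makes visible: because strikes linger for 90 days, a creator hit with several claims in quick succession can lose their channel entirely before a single dispute is resolved.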

Gonzalez has also recently been hit with copyright claims on the song that plays at the end of all of his videos, despite having used it for years, forcing him to manually change the ending of every video on his channel.

So content creators are now struggling against the copyright appeal process, and Gonzalez has found a hack for fighting these claims. Gonzalez had made a video series describing his issues with a company and didn't want to file a dispute: if the company had already claimed the video as its own, why would it change its mind and call it fair use after that? So he took to Twitter and wrote:

“I’m working with my team and reviewing my legal options against @BillionSurprise to get my video back up. We are well within the rights of fair use and BST is totally abusing the system.”

Less than 10 minutes after that tweet, the company released its claim. The same thing happened this February with a different company: Gonzalez made a similar tweet about his "legal options," and the claim was released the next day.

Cody Ko’s video on his copyright issues

"Reviewing my legal options" became a kind of meme in the YouTube community, and although it worked for Gonzalez, not all creators have been so lucky. Popular YouTuber and peer of Gonzalez's, Cody Ko, was hit in December 2018 with strikes on two of his videos over copyright issues, despite both falling under fair use.

Ko is known for reacting to "cringey" content, and one of the videos he reacted to showed a YouTuber and some of his friends trying a "vape hotbox." YouTube dealt Ko a strike for showing a minor participating in a dangerous activity; however, the original video is still up and active on that YouTuber's channel. Another YouTuber and friend of Ko's, Noel Miller, had the sixth episode of a game playthrough on his channel permanently deleted and received a copyright strike as well. However, the other seven videos in the series, all equally within fair use, remain untouched.

It's disappointing that reviews the YouTube team could easily perform themselves simply aren't happening. Frustrations with the copyright appeal system are mounting, and for something that would be so easy for YouTube to fix, there don't seem to be any plans to change the process.

So What’s Next?

Now that I've vented all of my problems with YouTube, we can circle back to my thoughts on why YouTube needs to die. As a platform, YouTube has undoubtedly paved the way for produsage: now anyone can share their ideas and creations with the world in an instant. But of course, tons of issues come with that.

It's not that YouTube isn't capable of reviewing videos more thoroughly before letting them go up; it's that it isn't willing to. Not only is this frustrating its user base, it's also scaring off advertisers, which is where YouTube makes its money. Big names like AT&T, Disney, Verizon, Johnson & Johnson, and Epic Games (the creators of Fortnite) have all decided to stop advertising on YouTube for fear of being associated with alt-right ideals and pedophilia.

And besides all of that, I think announcing the platform's impending doom will spark an idea from someone, anyone, with the drive to fix the problems plaguing YouTube today. If someone can come up with a platform that fixes all of these issues before YouTube can fix itself, isn't that a statement to make? Even if no one comes up with a perfect model right away, any competition will put pressure on YouTube to fix itself. That's the issue with monopolies: there's no one to challenge YouTube to improve. So what if it doesn't fix its issues? Where else are people going to go?

It’s not a change that can happen overnight. It’ll take a shift from both the consumers and the creators and, of course, a platform to shift to. But it’s time to take a step in the right direction.

With or without YouTube.
