YouTube is both a massive industry and a browsing staple that people use to fill their educational and entertainment needs. According to Business Insider, the website accumulates around 1.8 billion logged-in users per month. With its high traffic rates, it makes sense that the company wants to pay its creators in order to encourage them to upload more videos. Lately, though, you may have seen YouTubers complaining about the company’s advertisement revenue algorithm, as YouTube tries to figure out the best way to pay its creators.
Simply put, the more views a video gets, the more money it makes. While some YouTubers operate with integrity and continue to create the content they wish to, others have decided to alter their content in hopes of appealing to a large, easy-to-please audience: children.
In 2015, YouTube launched YouTube Kids, which was specifically created for child viewers. This branch of YouTube consists of family-friendly videos that are readily available for young kids. Personally, I believe it was created with good intentions, and to assure parents that their children are watching age-appropriate content.
Somewhere along the way, however, something went horribly, horribly wrong.
On its current website, YouTube Kids describes itself as “a safer online experience for kids,” acknowledging that some inappropriate videos may find their way into the service.
“We use a mix of filters, user feedback and human reviewers to keep the videos in YouTube Kids family friendly,” the website explains. “But no system is perfect and inappropriate videos can slip through, so we’re constantly working to improve our safeguards and offer more features to help parents create the right experience for their families.”
The question is, what terrible thing happened on YouTube Kids that prompted the company to put this disclaimer on its homepage?
I was born in 1997 and have had access to a computer for as long as I can remember. As a kid, I played Freddie Fish and Putt-Putt computer games, eventually graduating to Neopets and Webkinz when we upgraded from dial-up internet. To my knowledge, social media didn’t even exist until Myspace picked up around 2008.
Technology evolved with me. I didn’t have a lot of restrictions placed on my internet consumption, but I really don’t think I needed them. Everything was so new back then that people hadn’t figured out how to exploit it in a truly damaging way. The worst that could happen was accidentally stumbling across shock sites like Meatspin or Lemonparty.
Back then, YouTube was a harmless place for people to waste time. In 2008, FilmCow had posted “Charlie the Unicorn” on its YouTube channel, with “Llamas with Hats” to follow a year later. Both videos became cult classics among middle schoolers of the time, and, admittedly, the videos are a little crude. Charlie gets his kidney stolen by the other unicorns, and Carl the llama has a problem with stabbing people.
But these videos are no worse than what people saw on Adult Swim, and they don’t come even remotely close to the damage done by the Logan Paul scandal earlier this year.
On YouTube today, children are being exploited for money. YouTubers with channels specifically marketed toward children are cranking out videos to provide kids with loads of content to consume, with each video running around 16 minutes long, which is the sweet spot for maximum ad revenue. Frankly, YouTubers are practically begging their viewers to “smash” that like button and comment on their videos.
I spent a weekend babysitting my brother’s children, and they spent most of that time watching channels like Chad Wild Clay. He would ask a question like, “Who is going to win this game?” tell kids to post their predictions in the comments and then proceed to play the game, giving the kids the answer in the same video. He’d do that same thing several times throughout the video.
What’s the point of the interactive bits if they can just skip ahead and get their answers without commenting at all? It’s simple: the more engagement the video gets, the more likely it is to be picked up by YouTube’s recommendation algorithm, thus bringing in more traffic and more money.
I’m not trying to be a curmudgeon about all this. It’s not like children’s television was any less brainless or exploitative when I was growing up. “Ed, Edd n Eddy” drove my mom nuts with how stupid it was, and shows like “Blue’s Clues” routinely asked questions that were immediately answered. And aside from gems like “Mister Rogers’ Neighborhood,” I’d bet that cartoon companies cared more about making money than they ever did about me. It’s all the same principles, just in a more modern format.
What’s most concerning is that now, on YouTube, these videos don’t pass through a regulatory body like the FCC before they hit the air. With this severe lack of regulation, even despite YouTube’s best efforts, it’s little kids that we really need to worry about protecting.
There have been reports of disturbing content coming from the YouTube Kids app, from gore-filled animated videos of beloved cartoon characters like Elsa or Mickey Mouse, to live-action videos of children in borderline-abusive situations.
Even if children start off watching harmless content on the app, these disturbing videos can still end up on their screens. The app uses an autoplay algorithm based on keywords, so a wholesome video can easily pave the way for a more sinister video to be viewed later on.
Why do these videos exist? Is it all to make money, or is there something more sinister going on?
Ethan and Hila Klein of h3h3Productions have been making videos about these YouTubers for a long time, trying to make sense of it all. Last year, Post Malone, Ethan and Hila sat down on the H3 Podcast and talked about this travesty.
Ethan explained to Post Malone “Elsagate,” the phenomenon of YouTubers making content that appears to be suitable for children but is actually filled with graphic and inappropriate material. On the podcast, they scroll through the comments of these videos, where (presumably adult) users have left pedophilic comments about the children depicted in them.
The two then go even deeper down the rabbit hole, exploring a conspiracy theory that seemingly gibberish comments on these videos are actually codes. The deciphered comments are equally worrisome, if not more so, than the pedophilic ones. The Elsagate subreddit decoded some of them, finding messages like “C u soon parkinglot bring her.” There’s no solid proof of what these comments really mean, but, nonetheless, they sound way too close to child trafficking for my comfort.
I’m not making any definitive claims on these conspiracies. What I can claim, however, is that children are being exploited, either for monetary gain or something more sinister. Next time you see a child watching YouTube on their iPad, maybe take a look over their shoulder and make sure the videos they’re watching are truly appropriate for them.