“I want Hello Kitty,” came my daughter’s repetitive chorus from the living room.
In my household, “Hello Kitty” is what we call the YouTube Kids app. She named it after the first video that got her hooked.
It was 2016 and “Hello Kitty” – the app experience, not the smiling cat – had been banned for more than a year, yet she still asked for it.
Like many parents, I was enamored with YouTube Kids when I first discovered it. The app gave me so much: 30 minutes to cook dinner unhindered by demands; an hour to finish off an essay I’d been dreading; 20 minutes to have my coffee alone on the balcony in the morning.
My love affair ended after going through my child’s viewing history. I did so because, on a rare trip to a toy store, I realized my daughter knew about every single toy in there. She knew more than I did. What I found in the history was a seemingly perpetual stream of videos showing a floating pair of hands unwrapping surprise egg after surprise egg.
I initially banned the app because I didn’t want my child to be exposed to so much mindless consumerism by the tender age of 5. Plus, I was concerned by the ramblings of some of these voices while they unpacked toys. After all, who were these anonymous people speaking to my child?
All I had to go on was a set of hands.
As parents of younger children will know, banning YouTube Kids is taken by small people to be an act of supreme treason, a cruelty of the highest order. Many of us have been through the arduous process of rehabilitating our kids after finding out about questionable content. YouTube Kids is like crack cocaine for toddlers.
But if you thought those unpacking videos were weird, think again. Things can take a dark turn swiftly in the app, and within the space of 14 videos, click-throughs can take you from a CBeebies video to a rip-off version of Paw Patrol showing the heroes attempting to commit suicide.
This isn’t an unusual find either. While researching for this article, I spent some time using YouTube Kids, and it didn’t take me long to find another strange example.
I typed in ‘Mickey Mouse’ and made sure that the app was on auto-play. The third video that came up was ‘Mickey Mouse & Minnie Mouse Learn Colors with Kids Song Cartoon for Kids by Mickey Mouse’, uploaded by the channel ‘Hear Unlimited’.
Note that the title hits a number of keywords little people are likely to search for when they use the voice-search function in the app. This is a relatively short title for these types of videos. Normally they stuff in every single character name they can think of.
The video starts and from the outset it’s clear that this animation is a cheap, knock-off version of the original. But it’s the content that’s troubling here, not the quality of the cartoon.
Mickey and Minnie are walking with another human-like female mouse. This one is overtly sexualized and features half-lidded eyes, a skirt that is little more than a belt, and proportions that would put Barbie to shame.
As the three hike up a hill it becomes clear, by the pulsating hearts replacing his eyes, that Mickey’s attention is focused on the Barbie Minnie. This, in and of itself, isn’t so bad, right? After all, in classic Looney Tunes cartoons there was often a bit of mild romantic interest expressed in the same way.
It’s what happens next where it begins to get weird. The rain begins to fall and Minnie V2 falls off a cliff. She manages to hang on to a branch with her feet, exposing her underwear. Mickey comes over to help at this point but is incapacitated by the sight of her nether regions and instead kneels on the ground, drooling.
Unfortunately, this is just the tip of the iceberg when it comes to weird, disturbing content on the kids’ YouTube app. Sexualized cartoon characters abound and suicide, snuff, masturbation, sex, death, and murder feature, too. You name it and it’ll be in there, masquerading as content for kids.
Of course, not all content on YouTube is harmful to children, and there are plenty of videos that are great. My daughter, for example, loves the mini nature documentaries where an “explorer” searches in tide pools and his own backyard to find interesting animals and insects, then shares information about them.
It’s just unfortunate that 5 or 6 videos after learning about an octopus’ camouflage tactics, she can be led to a video that features an over-sized octopus killing a cartoon child. Thankfully, this particular video has since been reported and removed.
Since the 2017 uproar over nightmarish remakes of popular cartoons, YouTube announced that it would put tighter controls on its algorithms and employ actual people to screen and filter content that shouldn’t make it from YouTube to YouTube Kids.
But the app is still riddled with content that in no way should be watched by small children. The task is too big, and the allure of making an easy buck through advertising revenue (if that is the end goal) means that the number of people producing and then uploading these bizarre videos is never-ending.
Of interest here is the question of why: why make such disturbing stuff and then tag it with keywords that will draw in small children? If the end goal is simply revenue for these content producers, why not produce simple, harmless, fluff content? That, too, can be stuffed with keywords and achieve the same potential revenue.
There is no clear answer to this question, or at least not one that I know of. Perhaps my critical thinking skills have been hindered by the knowledge that anonymous people like this have access to my child’s developing mind.
In a TED talk based on his viral essay, ‘Something is wrong on the internet’, writer James Bridle notes that whether these videos are intentionally malicious or not isn’t the point. The point is that by allowing small children to be mindlessly led from one video recommendation to another, we are training them to be directed by algorithms rather than by their own developing, critical thinking processes.
It’s hard enough for adults to navigate the media-saturated digital domain with finesse, knowing which links and information sources are valid and which aren’t. For children, it’s harder still, especially when known, trusted, and branded cartoons are interspersed with unverified, twisted versions that at first glance are indistinguishable from the originals.
Even if your child hasn’t been exposed to any particularly frightening or confusing content on kids’ YouTube, I bet that they’ve spent time watching something so vapid that it makes one wonder if we are allowing a kind of digital lobotomy to take place.
Videos of other people playing games can last for hours on end. Before I deleted the app, I found my daughter watching these several times. It makes me wonder how watching someone else play a game could possibly be more interesting than playing the game herself.
Is there something relaxing and genuinely enjoyable about watching these types of videos? Or is it the pure dopamine hit delivered each time a level is won, or a box unpacked, that keeps her eyes glued to the screen?
Either way, as the person largely responsible for her developing mind, I’ve decided that there are better ways she can spend her time, and better videos to watch when it is screen time.
Author Bio:
Summer is a freelance writer, a mother, and an absolute scuba addict who would love to see our oceans protected for future generations. Read more tips on online safety and parental control on the Safe Vision blog.