Parents have been concerned about what their kids watch on TV ever since the first broadcast captured a child’s attention. Remember the V-chip, which blocked objectionable content back in the ’90s? Parents who wanted to shield their kids from violent TV programs loved it. Everybody else hated it.
The V-chip never really amounted to much, but we did get a TV ratings system to help guide parents trying to police their TVs. Even that has issues, and in the digital age, things have only gotten messier.
YouTube has long been a go-to for parents who just want to throw on an episode or two of Peppa Pig or Paw Patrol to entertain their little ones.
It’s convenient, easy, and you don’t even need to have cable as long as you have an internet connection.
It’s hard to ask for better than that — except perhaps some confirmation about whether the content shows what it’s supposed to show, and maybe some oversight of what kinds of things people are actually uploading to YouTube.
To help parents, YouTube launched YouTube Kids, which is supposed to show only child-appropriate content.

However, as one mother discovered, some pretty sick people have been uploading disturbing content onto YouTube Kids, and unless you’re looking over your kid’s shoulder for a good chunk of the show, you’d never know it.
Free Hess is a pediatrician and a mother with a blog on child safety.

On her blog, Pedi Mom, she published an anonymous letter from a mom who described herself as an emergency room doctor who doesn’t shock easily.
But while watching a cartoon with her son after he got a nosebleed, she got the shock of a lifetime. “Four minutes and forty-five seconds into the video, a man quickly walked onto the screen, held his arm out, and taught the children watching this video how to properly kill themselves,” she wrote.
“This video was intentionally planted on YouTube Kids to harm our children,” the mother wrote.
“He waited until parents’ guard was down, thinking their kids were just watching a harmless cartoon, when he made his entrance four minutes and forty-five seconds into this video.”
Needless to say, Hess reported the video to YouTube, and it was soon taken down. But a few months later, another parent alerted her to a different cartoon containing the same clip, again at the 4:45 mark.
This time it was on YouTube instead of YouTube Kids, but otherwise it was the same situation.

Hess wrote: “In looking back at the comments it appears that people began reporting this video approximately 8 months ago, yet it is still able to be viewed.”
So she recorded the scene in question, in which a strange man is clearly spliced in, giving instructions on slitting one’s wrists. It’s deeply disturbing. Hess embedded the video and encouraged her readers to report it, and eventually they got it taken down again.
Now her curiosity was piqued, and so she started seeing how easy it would be to find other objectionable, inappropriate content on YouTube Kids.

Sadly, Hess had little trouble in her search. “My research has led me into a horrifying world where people create cartoons glorifying dangerous topics and scenarios such as self-harm, suicide, sexual exploitation, trafficking, domestic violence, sexual abuse, and gun violence which includes a simulated school shooting,” she wrote. “All of these videos were found on YouTube Kids, a platform that advertises itself to be a safe place for children 8 years old and under.”
She tried to record them all, but there were far more than a few troublesome videos out there.
“There were just so many that I had to stop recording,” she wrote.
“It makes me angry and sad and frustrated,” she told CNN. “I’m a pediatrician, and I’m seeing more and more kids coming in with self-harm and suicide attempts. I don’t doubt that social media and things such as this is contributing.”
YouTube has responded to Hess’s concerns.
“We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video,” the company’s statement said. “Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed.
“We’ve also been investing in new controls for parents, including the ability to hand-pick videos and channels in the app. We are making constant improvements to our systems and recognize there’s more work to do.”