Yorkshire school issues warning to parents over unsafe Momo challenge

The YouTube app on an iPad

A mom in Ocala, Florida, found videos on YouTube Kids that gave children instructions for suicide. According to Yahoo News, the mom, pediatrician Free Hess, was playing what she thought was "a simple, innocent cartoon" for her son, but it turned out to be much more than that.

Hess told CBS News that she made it her mission to raise awareness of the disturbing and violent content children consume on social media after seeing rising rates of suicide among children in her own emergency room over the last few years.

"Challenges appear midway through Kids YouTube, Fortnight, Peppa pig to avoid detection by adults".

The man featured is YouTuber Filthy Frank, who has over 6.2 million subscribers and calls himself "the embodiment of everything a person should not be", although there is no evidence that Frank, whose real name is George Miller, was involved in creating the doctored video.

To distract her son, she had put on a cartoon on YouTube Kids, only to see, roughly five minutes in, a man walk into the frame, hold out his arm, and give advice on how to slit your wrists. Hess said she has since been writing about the distressing clips on her blog, PediMom, to raise awareness and to get them removed from the platform.

Like most Google products, YouTube uses "smart detection technology" to flag inappropriate content.

Most parents feel pretty safe letting their children watch YouTube Kids, the child-friendly version of the video platform.

But that's not all Hess said she found.

"As parents, if we want our kids to be spending time in those places, we just have to make sure that they're equipped to know what to do when they run into some of that dark, or unhealthy, or in this case, self-harm content", McKenna said.

Hess said she reported the video, but it took YouTube a week to pull it down. She said she found videos glorifying not only suicide but also sexual exploitation and abuse, human trafficking, gun violence and domestic violence.

"I don't doubt that social media and things such as this are contributing", she later told CNN.

YouTube spokesperson Andrea Faville told the Washington Post the company works to ensure it is "not used to encourage risky behaviour", saying it has "strict policies" prohibiting videos that promote self-harm.

"We are always working to improve our systems and to remove violat [ing] content more quickly". She said such videos can cause children to have nightmares, trigger bad memories about people close to them who have killed themselves or even encourage them to try it, though some of them may be too young to understand the consequences.

It added that such content "could be extremely risky if children copy what they see".

For support in the United States, call the National Suicide Prevention Lifeline at 1-800-273-8255.