Children watching videos on YouTube and on the YouTube Kids app may have been exposed to chilling instructions on how to commit suicide.
This horrifying trend first came to light when a doctor's son had a nosebleed. To distract him, she put on a cartoon on YouTube Kids, only to see, roughly five minutes into the video, a man walk into the picture, hold out his arm, and demonstrate how to slit one's wrists.
The mom wrote, “What did I just see? Did I really just see that? I immediately turned off the video. My son’s nose stopped bleeding, and I further investigated the video in private while he went to play. I watched it again, certain that I had dreamt it up. I know YouTube had some sick videos, but I thought YouTube Kids was safe. They sure make it seem like it is. But – no. There it was again. Four minutes and forty-five seconds into the video. The man quickly walked in, held his arm out, and tracing his forearm, said, ‘Kids, remember, cut this way for attention, and this way for results,’ and then quickly walked off.”
That particular video was later removed, but as The Washington Post reports, there have been others. Free Hess, a pediatrician and mother, told the Post the scene has been spliced into videos from the Nintendo game Splatoon on YouTube and YouTube Kids.
Andrea Faville, a spokeswoman for YouTube, issued a statement saying, “We rely on both user flagging and smart detection technology to flag this content for our reviewers. Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views. We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report [transparencyreport.google.com] and give users a dashboard showing the status of videos they’ve flagged to us.”
In an email to Ars Technica, YouTube said, “We work to make the videos in YouTube Kids family-friendly and take feedback very seriously. We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed. We’ve also been investing in new controls for parents including the ability to hand pick videos and channels in the app. We are making constant improvements to our systems and recognize there’s more work to do.”
The Post reported, “In a blog post last week, Hess alerted other parents to numerous concerning videos she said she found on the app — a Minecraft video depicting a school shooting, a cartoon centered on human trafficking, one about a child who committed suicide by stabbing and another who attempted to commit suicide by hanging.”
Ars Technica added, “Videos have been found with adult content ranging from foul language to depictions of mass shootings, alcohol use, fetishes, human trafficking stories, and sexual situations. Many contain—and attract clicks with—popular cartoon characters, such as Elsa from the 2013 animated Disney film Frozen. This chilling phenomenon has been referred to as Elsagate. Though YouTube has deleted channels and removed videos, Hess points out that it’s still easy to find a plethora of ‘horrifying’ content aimed at children on YouTube Kids.”