I sit here struggling to find the way to express my profound concern for the content our kids are consuming in flood-like fashion every day and night of their lives. I, too, am a mom, and my kids are consumers of digital content (including, at one point, YouTube Kids). I have seen hundreds of kids across campuses in the US respond to the question, "Have you ever contemplated suicide?" I see the pain on their faces and feel the hurt in their hearts. Though so much of this generation's pain has many other contributors, I am certain that negative and harmful digital content (x-rated material, cyberbullying, etc.) is leading our kids into a darker headspace that some have not been able to get out of.

“We are in the midst of an uncontrolled experiment on the next generation of children.” –Dr. Dimitri Christakis, Seattle Children’s Hospital

I don’t want my kids to be an experiment; I want them to thrive. We can’t passively sit back and wait for the results of an experiment to come back. Studies have already shown that suicide rates among teen girls are up 200% since the advent of smartphones. I want to get on offense and change the direction of this derailed train.

[Infographic: 2018 Teen Mental Health]

While YouTube is on blast right now (and rightly so), awareness of suicidal content cannot be limited to that one platform. “The most powerful tech companies in the world are making deliberate decisions that do great harm. They’ve created the attention economy and are now engaged in a full-blown arms race to capture and retain human attention, including the attention of kids.” –Tristan Harris, former design ethicist at Google

We see it across all social media platforms, and it’s time to make the changes necessary to save our kids and their friends. The United Kingdom’s Secretary of State for Health, Matt Hancock, calls the impact of social media on our kids’ mental health “a public health crisis,” and I couldn’t agree more. After yet another death by suicide, this time of a 14-year-old girl whose Instagram account contained distressing material about depression and suicide, Hancock called upon the big social media players to make the changes necessary to control the mentally abusive content that floods our kids’ feeds. When Instagram learned of her death, Facebook, which owns Instagram, said it was “deeply sorry” over the case.
Instagram has said it “does not allow content that promotes or glorifies self-harm or suicide and will remove content of this kind.” –theguardian.com My question is: if Instagram were so good at removing content of that kind, why are there currently 688,000 posts under #selfharm alone and 8.2 million under #suicide, mostly from teens? All of it accessible to our teens with one press of the #.

So, where is the accountability?

Do we believe that companies like WhatsApp, Facebook, Snapchat, and Google will take responsibility for the excessive negative propaganda their communities are posting? Should they be held responsible?

Let’s talk about how content is monitored:

YouTube explicitly uses its community to maintain healthy content. On their policy page they say, “When you use YouTube, you join a community of people from all over the world. Every cool, new community feature on YouTube involves a certain level of trust. Millions of users respect that trust and we trust you to be responsible too. Following the guidelines below helps to keep YouTube fun and enjoyable for everyone.” –YouTube

Do you trust 1.8 billion people with your child’s emotional, mental and physical wellbeing?

The irony of this community accountability policy is that it asks us to trust strangers. We don’t teach our kids to trust strangers in the “real world,” yet we freely allow them to trust strangers from around the world, feeding us all of our content through our tiny devices. That’s a big ask, YouTube. Also, when was the last time your 6-year-old reported a community violation? We are lucky if they even tell us they have seen a video with an embedded clip showing how to perform an act of suicide.

Say your child is really on top of things: she sees a harmful video, comes to you, and you report it like we are supposed to. It may still take up to a week before that video is taken down, all while more and more kids are exposed to explicit material you would never allow them to watch in Saturday-morning cartoons. It has been reported that several videos were found containing a clip of a normal-looking man, almost dad-like, who motions cutting his wrist and says, “this way for attention and this way for results, end it.” It took over a week for that video to be taken down. Had it been a copyright infringement, the video would have been flagged as soon as it was uploaded, thanks to bots that search for repeated content; but a man telling our kids to kill themselves took more than a week.

So where are these videos coming from?

The Momo challenge (which keeps resurfacing), in which a horrific birdlike figure pops up and tells kids they will die, has brought a lot of attention to frightening content within YouTube. But this is not new, and it’s not going away. The core issue is explicit, hateful, and harmful content woven in among positive content. Take, for example, content coming from YouTuber Filthy Frank, who “has over 6.2 million subscribers and calls himself the embodiment of everything a person should not be.” –cbsnews.com If we don’t continue the conversation, people like Filthy Frank will continue to create and release content that is changing the mental health of an entire generation. YouTube, is this the kind of community member you want me to trust? Though we don’t know whether he is creating the videos, we do know he is in them.

What videos should we look for? The videos our kids watch daily, like PAW Patrol, Minecraft, and Peppa Pig, can contain harmful material about halfway through. People have edited other content into the kids’ programming — not an ad pop-up, but actual material embedded inside the video of someone telling them to kill themselves. The suicide propaganda is not limited to this; there are also animated videos made to look like all the favorites that likewise teach death and suicide. And these videos are being found not only on YouTube but on YouTube Kids.

So, where are they coming from?

In a TED Talk, James Bridle explains it well. It is an odd idea that we have consumed information and entertainment produced by — who, really? He says that YouTube Kids is often like “fake news for kids,” with millions of videos at our children’s searchable fingertips. “It’s impossible to know where these things are coming from. Professional animators make some, some are randomly assembled by software, some are people who clearly shouldn’t be around children at all. Is this a bot? Is it a person? Is it a troll? What does it mean that we can’t tell the difference between these things anymore?” –James Bridle. So, YouTube, are the bots the community members we should trust?



For now, what we have is the ability to report. Use your voice: report any explicit content to the source right away, on all platforms. I regularly report pornographic content on Instagram. How do I find it? An account requests to follow me posing as a seemingly relatable person, and when I click through, there’s explicit nudity. I have seen more XXX-rated material scrolling through the “discover” pages of Instagram and Snapchat than I have in my entire life. Report, report, report! It’s like dialing 911 when you see a car crash. We have to do it, and it may save someone’s life.

If you don’t believe me, a Google search turns up heart-wrenching stories of young children who became depressed and took their lives after being exposed to this kind of content.

One mom bravely shared her story on Facebook:

“This is an exceptionally hard thing for me to post. I’ve thought long and hard about this. I’ve decided it’s way too important not to bring awareness to other parents. This is not up for criticism. I only want to let all parents know what to watch for. Kids youtube, roblox, fortnight… no matter how much you think you are monitoring your child’s notifications to what your child is watching. It doesn’t matter. My 7-year-old child was taught how to attempt suicide by kids youtube and these games. She has expressed that she doesn’t feel neglected or unloved. Instead, she was constantly told to ‘go kill yourself’ by other gamers, by kids youtube. Shown HOW to.

Sunday night, she had a full blown anxiety attack. Which I held her and sang to her while she got through it. Monday, she drew this in school.

This is a VERY real danger! I NEVER thought I would find myself helping my SEVEN-YEAR-OLD CHILD through an anxiety attack. PLEASE, keep your children away from these things. I’m just so glad my child was able to express her feelings before she actually tried to harm herself. I never thought something as ‘innocent’ as kids youtube would have these subliminal messages. Again, I’m only sharing our experience in the hopes to prevent another child going through this.” –Meridy Leeper via FB

So I share the sentiment of Matt Hancock when he says, “I want to work with internet and social media providers to ensure the action is as effective as possible.” He continues, “However, let me be clear that we will introduce new legislation where needed.” Here in the US, we need legislation, and I realize what that looks like is tricky. The laws that have passed concern data and kids under 13: the Children’s Online Privacy Protection Act (COPPA). The other concerns what kids see on school and library computers: “The Children’s Internet Protection Act (CIPA) was enacted by Congress in 2000 to address concerns about children’s access to obscene or harmful content over the Internet.” –fcc.gov The state of California recently passed legislation having to do with older kids and their data. It seems that’s all we are ready for. Meanwhile, our kids’ mental health, and often their physical health, is being taken.

What do you think legislation should look like?

One thing is for sure: I will not passively watch my kids, their friends, and our future country and world leaders lose themselves to social media; it’s not worth it. Some of us have to hit the restart button, and that’s OK! We are in this, and we can help. But we have to make some changes for the sake of the iGeneration.

Would you like to chat about how to reset your family’s digital lives? YES, PLEASE.

Now What:

Share what you find and your personal stories with us to join the movement to stand up for our kids’ mental and emotional health. Post with the hashtags #MakeSocialMediaSafe and #MakeYouTubeSafe, and email us at Hello@Lets_Talk_Teens.org

Remember, it’s not just YouTube; it’s WhatsApp, Kik, Snapchat, Instagram, and others.

The YouTube filters aren’t great, but they help. You will find them under the login at the bottom of the screen (in tiny print).

On YouTube Kids, set up the ability to watch only videos from approved channels.

Parent suggestion: Only watch sources that are accountable, like PBS Kids, Amazon, etc.

Only watch YouTube on shared family screens, like through the TV.

Would you like more regular info & resources? YES


What’s your action plan?


Follow us @lets.talk.teens