Disclaimer: The goal of Conquer with Na’Kole is to provide education and support for moms (and concerned loved ones) as they conquer the giants that seek to conquer their children. I, Na’Kole Watson, am not a licensed mental health professional nor am I offering professional mental health services or advice. Therefore, I accept no liability or responsibility to anyone as a result of any reliance upon the information produced on this site or in any communication issued by Na’Kole Watson and Conquer with Na’Kole. The views expressed on this and any other affiliated website are my own, and all sources are linked. If you are in crisis, please contact The National Suicide Prevention Lifeline at 800-273-8255. If this is an emergency, please dial 911 immediately.
There is a lot happening in these social media streets, and I wanted to make sure you knew all about it! I’m going to break things down by app, so feel free to skip to the app(s) you’re most interested in!
The Blackout Challenge and Robert Craig Jr
Content Warning: Child death
The Blackout Challenge makes its way around TikTok every few months. This time, it is believed to have claimed the life of a 10-year-old boy named Robert Craig Jr.
During the Blackout Challenge, children are challenged to make themselves lose consciousness. When Robert’s 12-year-old sister Madison found him, he was hanging from a tree with his tablet nearby. This is what led his family and police to believe he had been participating in the challenge.
This challenge has claimed the lives of far too many young people, and I would encourage you to talk to your child(ren) about it today. Even if you think they would NEVER do something like this, talk to them anyway. It may not be them, but it may be their friend, someone on the bus, someone in their class or someone in their family. Please have that conversation.
Here are some articles about it:
- Family of 10-Year-Old Boy Speaks Out After Police Say TikTok Challenge May Have Led to His Death – People
- Child deaths blamed on TikTok “blackout challenge” spark outcry – CBS News
The Wall Street Journal Talks About How TikTok Shows Sex and Drugs to Minor Users (Who Are Registered With Their Correct Age)
I have a paid subscription to The Wall Street Journal, so my Sunday Scoop will almost always include their articles. However, because I know they are behind a paywall, I will always link free articles that accurately summarize what the WSJ piece has stated.
The WSJ created a bot account registered as a 13-year-old TikTok user with their correct age (the Journal does this often to understand how TikTok serves content to minors). That “teen” searched TikTok for “OnlyFans” and watched several of the resulting videos, including videos of someone selling porn.
After this, the user went back to the For You page where TikTok shows you what it thinks you’re most interested in. The For You page showed normal videos that are popular among teens and other users.
And then… TikTok started showing the user videos of a more sexual nature, including roleplay videos in which people pretend to be young people in relationships with their caregivers. As the user watched more and more of those videos (TikTok pays attention to what you linger on and what you immediately scroll past), they found themselves on “Kinktok,” where LOTS of sexually explicit stuff happens.
I’m going to give you some quotes from the article:
“TikTok served one account registered as a 13-year-old at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia. Hundreds of similar videos appeared in the feeds of the Journal’s other minor accounts.”
“TikTok also showed the Journal’s teenage users more than 100 videos from accounts recommending paid pornography sites and sex shops. Thousands of others were from creators who labeled their content as for adults only.”
TikTok removed about a third of these videos when The WSJ showed them to its representatives. Many of the rest were not removed because they didn’t violate any rules; they were simply shown to the wrong audience.
There’s no easy fix to this, but being proactive and setting up the right controls on your child’s TikTok account is a great way to start. If you would like to know more about how to do that, please check out The Cyber Safe Experience, a course I offer that teaches you about all the social media platforms and how to keep your child safe on them.
Here is the WSJ article (if you have a subscription) as well as an article that paraphrases it:
- How TikTok Serves Up Sex and Drug Videos to Minors – WSJ
- WSJ Reveals TikTok Algorithm Served ‘Endless Stream’ of Sex, Drugs to Minors, Plus It’s Stealing Our Private Info – CBN News
TikTok Reminded Us That They Have Mental Health Resources and Eating Disorder Resources in Their Safety Center
TikTok’s Safety Center is a great place to find out about what the app is doing to keep your child(ren) and yourself safe while using it. There are so many resources about:
- mental health
- eating disorders
- bullying prevention
- suicide and self-harm
There is even a Guardian’s Guide for parents and caregivers! As much as I highlight the evils of TikTok, I also want to highlight that they really do make an effort to provide users with solid information about how to be safe in this digital age. You can learn about parental controls and safeguards in the Safety Center as well!
The Safety Center Articles:
Instagram is STILL in the news because of the WSJ study… so here’s my take
As I said last week, yikes.
I think it’s terrible that Facebook has known for at least two years that Instagram is toxic for young girls. I think it’s even more terrible that we all knew this, but that they had the actual data to prove it and acted like they knew nothing of it.
Facebook knows about a lot of things. Privacy issues, safety issues, favoritism, the list goes on. The Wall Street Journal has a whole segment called The Facebook Files that delves deep into what they do in fact know.
I do not think Instagram is necessarily a bad place. I think the problem is how algorithms influence human feelings and behavior. And I think a lot of teen girls are not safeguarding themselves from the dangers of algorithms because they don’t know how. For that matter, a lot of adults don’t know how either.
Algorithms are about keeping people on the app as long as possible. More time on the app = more revenue for the app. We can’t change the algorithm, but we can make sure that we are pouring into our kids enough to combat what they see online. Because even if you take their phone away, they are still going to see things. Whether it’s on a computer, their friend’s phone, your phone or wherever, they are still going to see it.
This is why I always teach education vs confiscation. Confiscation doesn’t protect them from anything. I’m not saying that you should never confiscate their phones – I do not pay those bills over there at your house! However, I’m saying that education should always be the weightier matter when it comes to teens and smartphones.
Here is a link to The Facebook Files. You won’t be able to read all of it without a subscription, but you’ll get a good summary of what they investigated and what they found.
Here are some articles that provide great insight into the WSJ article (these are the same ones from last week):
- Instagram is even worse than we thought for kids. What do we do about it? – The Washington Post
- Wall Street Journal’s Facebook Files series prompts comparisons to Big Tobacco – CNN
To be fair, here is what Facebook has said in response:
Twitter is Seeking Input on Filtering and Limiting Controls on Tweets
Twitter’s Paula Barcante announced that they are testing features that would warn trolls in advance not to troll.
These features would allow you to turn on filters so that harmful comments would not be shown to you at all. When a potentially harmful user goes to interact with your tweet (and filters are on), they would be warned that you have the filters on and that they may not be able to reply.
If a reply to a tweet is considered harmful, the user who wrote the reply would see it, but you would not (if the filters are on).
This is just an idea at this point. I see the potential for it, and I will keep you updated on how it goes and when it rolls out!
You can hear directly from Paula Barcante here: Paula Barcante’s Twitter Thread
Alphr Mag’s Dave Johnson Gave Us a Few Workarounds for Recovering Deleted Messages on Snapchat
I love Dave for this!
If you know anything about Snapchat, you more than likely know that it is famous for its “disappearing messages”. While I totally get the concept of disappearing messages and I’m actually a fan of them, I understand that they can pose a huge risk for tweens and teens.
In my Cyber-Safe Experience course, I talk about how teens who go missing are often groomed by their abductors on Snapchat because predators know that those messages, in theory, “disappear”.
Oh, but Dave Johnson to the rescue!
Dave outlines several ways in which you may be able to recover deleted messages should you need to. While I make no guarantees, I definitely suggest that you bookmark this article in case you or a parent you know needs it in the future!
You can read Dave’s amazing advice here:
How To Recover Deleted Messages From A Snapchat Account [iPhone & Android]
Roblox Will Start Verifying the Age of Teens Who Wish to Use Certain Services Within the App
In the near future, teens will have to verify their age to use services like Spatial Voice, the audio chat feature in Roblox that is geared towards older users.
It appears that verification will be optional for the general use of the app.
To use certain services within the app, teens will have to scan their government-issued identification card as well as take a selfie in the moment to ensure that their face indeed matches that government-issued ID card. This is a major move on Roblox’s part to ensure that they are doing their due diligence to combat predatory practices on their app.
Article: Roblox will start verifying the age of teenage players – The Verge
Wrapping It Up
This has been a WEEK! I definitely want to send love and light to Robert Craig Jr.’s family as well as the other families who have lost their precious babies to the pressure of social media challenges.
It is my goal to always bring you a roundup of the things that I believe are most important to you as a parent. If I missed anything, or if you have any questions, please let me know!
If you would like to receive a PDF version of The Sunday Scoop in your email inbox every Sunday morning, you can sign up below!
If you would like to take my Cyber-Safe Experience course, it’s here: Cyber-Safe Experience Course (formerly Cyber-Safe Summer)
And finally, if you would like to contribute to my work, feel free to Buy Me A Coffee here: http://BuyMeACoffee.com/NaKole
Let’s Conquer Together!