My Students Watched a Suicide On TikTok, and That’s Not Okay

Three studies report child well-being taking a terrible hit. Is it any wonder?


A new survey finds six out of ten teachers are worried about students’ online safety. Additionally, a trio of studies just published in Pediatrics reported that parent and child well-being is taking a terrible hit during COVID-19.

Is it any wonder?

I asked myself this question last week, when a graphic video of a Mississippi man dying by suicide, which first aired on Facebook Live in late August, recirculated across TikTok and Instagram, two networks used largely by kids who are online more than ever. Many of my students saw it.

During our weekly digital literacy class (via Zoom), students wanted to talk about the disturbing video. They were upset and looking for answers: Why don’t social media companies block this stuff? How can I keep other kids from sharing it in group texts? Why did TikTok put this video in its “For You” recommendations?

As I did my best to moderate this uncomfortable conversation, another discussion was unfolding among students in Zoom’s chat box:

Student 1: That man that was on the video… it was just so sad.

Student 2: [three sad face emojis]

Student 3: omg, omg, omg

Student 4: I’m scarred for life

Student 5: It’s really graphic.

Student 2: People on TikTok are disguising the video as puppies and kittens to trick kids into watching.

Student 6: I had to watch Peppa the Pig for an hour afterward to feel better.

This is not okay. Like my students, I have some questions.

How is it we live in a world where artificially intelligent assistants answer our every question, ads for products we search for once target us for weeks, and apps know our exact location, yet the same tech companies capable of creating these innovations can't manage to keep kids from watching some poor guy blow his brains out?

Social media platforms say it’s not their fault. As reported in The Conversation,

The sad reality is users will continue to post disturbing content and it is impossible for platforms to moderate before posting. And once a video is live, it doesn’t take long for the content to migrate across to other platforms.

In other words, it is the user’s responsibility not to post or spread disturbing content. But I disagree. I think social media companies should bear the responsibility of not giving shocking events, like suicides, a platform in the first place. Of course, this means we’d have to give up a perk we’ve become accustomed to: the immediate gratification of posting and sharing content.

But is that such a bad thing?

Let's Give Moderation Some Time

Why don’t we give moderators, human or automated, the time they need to evaluate content effectively? According to one news report, the man in the disturbing Facebook Live video died by suicide at 10:30 p.m., yet Facebook did not remove the video for several hours. Unbelievably, Facebook at first did not think the video had breached its community standards, and even that determination took time. And here we are, weeks later, with the video still circulating on TikTok, Instagram, and, evidently, kids’ private group chats.

Don't Expect Digital Citizenship to Solve This Problem

You might have heard, or read, that “Digital Citizenship” is the solution to this problem. If I hear that assertion again, I’m going to scream for an hour (I may even record and post it). This may seem hypocritical, since I teach digital citizenship, but let me tell you: convincing schools to shoehorn this subject into an already jam-packed day is a feat. It will be a while, unfortunately, before every student receives such lessons. Plus, there will always be bad actors; no amount of digital citizenship education is going to change that.

A quicker and more efficient solution would be for social media companies to put their ingenuity to work figuring out how to stop horrible, disturbing content from ever being seen by children. That would give teachers like me a chance to actually teach digital citizenship, instead of spending the one precious hour allocated to the subject answering students’ questions when I have so many of my own.