Earlier this month, TikTok warned users of videos circulating on the platform that showed a man shooting himself with a gun. According to The Verge, the video quickly spread through reuploads. In response, TikTok raced to remove these videos and ban users who re-uploaded the clip.
“Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide,” a TikTok spokesperson told The Verge.
While the video was most prominently reported as an issue on TikTok, users also reported its presence on other social media platforms, including Instagram, Twitter, and Facebook. Now it’s been revealed that the viral spread of the video across social media wasn’t kids messing around, but actually a “coordinated effort” by groups operating on the dark web.
About the Video
The video features 33-year-old Army veteran Ronnie McNutt (pictured above). According to Buzzfeed News, the video comes from a Facebook Live recording the Mississippi man created last week as he shot himself in the head. There were unconfirmed reports that McNutt had lost his job and broken up with his girlfriend.
The Verge reports that users are now inserting clips of the suicide video into what seem to be harmless TikToks to trick people into watching the full video.
A Facebook spokesperson told CNN Business that the original recording was removed when it was first streamed. The company is currently using “automation technology to remove copies and uploads since that time.”
As the video circulated on TikTok, users uploaded videos alerting others to what to look out for, sharing a screenshot of the footage — McNutt sitting at his desk — so people would recognize it before watching. An example of these warning videos is posted below.
A Coordinated Attack
From the “George Floyd Challenge” to the “Autism Challenge,” social media companies have been able to halt the distribution of offensive videos in the past. So why was the TikTok suicide video so difficult to stop?
“Through our investigations, we learned that groups operating on the dark web made plans to raid social media platforms including TikTok, in order to spread the video across the internet,” said Theo Bertram, director of government relations and public policy at TikTok Europe, during a UK sub-committee meeting on disinformation and online harms.
TikTok’s main feed — the “For You Page” — isn’t curated by which accounts a user follows; it’s controlled by an algorithm. This algorithmically driven feed made it harder for users to avoid the footage.
“What we saw was a group of users who were repeatedly attempting to upload the video to our platform, and splicing it, editing it, cutting it in different ways,” Bertram said. “I don’t want to say too much publicly in this forum about how we detect and manage that, but our emergency machine-learning services kicked in, and they detected the videos.”
CNN Business reported that TikTok has implemented steps to make it harder for users to search for self-harm content. When users look up terms like “suicide video” in the platform’s search, results are blocked and the app provides mental health resources, including the National Suicide Prevention Lifeline.
At the UK sub-committee meeting, Bertram also stated that TikTok is attempting to work with other social media companies to protect users from harmful content.
“Last night, we wrote to the CEOs of Facebook, Instagram, Google YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit,” Bertram said. “And what we are proposing is that, in the same way these companies already work together around [child sexual abuse imagery] and terrorist-related content, we should now establish a partnership around dealing with this type of content.”
The National Suicide Prevention Lifeline is 1-800-273-8255. You can also text TALK to 741741 for free, anonymous 24/7 crisis support in the US from the Crisis Text Line.