A 19-year-old live-streamed his suicide on TikTok on February 21, 2019, with roughly 280 viewers watching the tragedy unfold live.
As a new report by The Intercept alleges, the platform's moderators failed to follow protocol, notifying the police only about three hours after the harrowing event took place. According to the news organization, TikTok published a press statement before reporting the suicide to the authorities.
TikTok failed to alert the authorities about the suicide.
The Intercept is the first news organization to obtain a detailed account of the events of February 21, including the content the teenager posted in the lead-up to the suicide and the action plan TikTok devised to stave off a public backlash.
Citing an interview with a TikTok employee and an internal action plan, the article outlines how the social media platform prioritized commercial interests over the user's welfare. As it states, TikTok failed to alert the police and to remove the sensitive content in time.
As The Intercept reveals, the unnamed Curitiba native posted an oblique clip announcing a "special performance" on February 20, 2019. He took his own life the next day at 3:23 p.m. His TikTok account was deleted at 5:13 p.m. The police were not alerted until 7:56 p.m. — and the teenager was pronounced dead a few minutes later.
TikTok employees reportedly received alerts about the gruesome recording via WhatsApp, but were advised to follow the company's PR strategy instead. They were also allegedly instructed to delete the teenager's account and prepare a press statement.
"We are extremely saddened by this tragedy. At TikTok, it is our top priority to create a safe and positive in-app environment, and we have developed guidelines to foster a positive environment for everyone in this community," reads the announcement cited by The Intercept.
The suicide led TikTok to update its content moderation guidelines.
According to The Daily Dot, TikTok has since taken disciplinary action. As a representative of the company informed the outlet, the incident raised concerns about the moderators' failure to flag the video and remove the content in time.
"The moderation team has been vastly improved in the year since [the] stream went up," a spokesperson for the platform said.
"Almost a year ago, we removed this content and alerted local authorities because we do not allow content that promotes self-harm or suicide, as stated in our Community Guidelines. We have since updated our live-streaming policies and improved our safety and moderation measures, as well as added new training, tools, and reporting protocols."
If you or someone you know is contemplating suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255.