The social media platform is hugely popular with teenagers and, since its launch in 2017, has gained ground on Instagram, though not without concerns over some of its content. Hence the introduction of a host of features intended to help users struggling with mental health issues and thoughts of suicide.
Tara Wadha, Director of Policy at TikTok US, says, “While we don’t allow content that promotes, glorifies or normalizes suicide, self-harm or eating disorders, we do support people who choose to share their experiences to raise awareness, help others who might be struggling and find support among our community.
“To help our community do this safely, we’ve rolled out new well-being guides to support people who choose to share their personal experiences on our platform, developed with the guidance of the International Association for Suicide Prevention, Crisis Text Line, Live For Tomorrow, Samaritans of Singapore and Samaritans (UK). The guides, which are available on our Safety Center for informational purposes only, also offer tips to help our community members responsibly engage with someone who may be struggling or in distress.”
TikTok’s announcement comes after a Wall Street Journal report said that Facebook has repeatedly found its Instagram app can be harmful to teenagers’ mental health in certain situations. The article reported that “its own in-depth research shows a significant teen mental-health issue that Facebook plays down in public.” The two apps vie constantly for the attention of the teenage demographic.
Wadha also said that TikTok would feature curated content from its partner organizations to help users learn about and explore important well-being issues.