Toxic TikTok

Devil’s Advocate
2 min readDec 17, 2022

We have seen, any number of times, why both the Indian and the American governments are wary of TikTok as a security threat. At the same time, the social media platform is now also being flagged for causing psycho-social issues, especially among children and young adults.

Over 1,200 families in the USA are now pursuing lawsuits against social media companies, including TikTok and Instagram. These suits allege that content on social media platforms profoundly harmed the mental health of their children, in some cases leading to their deaths.

The Center for Countering Digital Hate (CCDH) has now published a report that says TikTok is recommending self-harm and eating disorder content to some young users within a few minutes of joining the platform.

Researchers set up TikTok accounts posing as 13-year-old users interested in body image and mental health content. TikTok’s algorithm recommended suicide-related content within as few as 2.6 minutes of joining the app, and eating disorder content within as few as 8 minutes.

To validate their findings, CCDH researchers registered as users from the USA, UK, Canada, and Australia, creating both “standard” and “vulnerable” accounts on TikTok. Eight accounts were created in total, both male and female, and data was gathered from each account for the first 30 minutes of use.

CCDH says the short recording window was chosen to show how quickly the video platform can profile each user and push out potentially harmful content. Researchers also found 56 hashtags on TikTok leading to eating disorder videos with over 13.2 billion combined views.

TikTok had a billion monthly active users as of 2021. The CCDH study offers shocking revelations about how today’s social media platforms, especially TikTok, prioritize sensational content over sensible content when feeding it to an impressionable young audience.

#TikTok #SocialMedia #Instagram #MentalHealth #Children #Youth #Sensationalism #Law
