The dark side of TikTok: How the app’s AI shows users potentially harmful content

TikTok has been recommending content about self-harm and eating disorders to some users within minutes of their joining the platform, according to a new report published on Wednesday by the US Center for Countering Digital Hate. In one experiment, researchers created TikTok accounts impersonating 13-year-old users interested in body image and mental health content.

They paused on videos about these topics, feigning interest. Accounts were opened in several countries, including the United States, the United Kingdom, Canada, and Australia. The researchers found that TikTok's algorithm recommended suicide-related content in as little as 2 minutes and 6 seconds after an account joined the app; eating disorder content appeared within 8 minutes.

The study also found that 56 TikTok hashtags associated with eating disorder videos had amassed over 13.2 billion views.

A TikTok spokesperson responded that the content shown to each person is unique to them, and that the platform remains focused on creating a safe and comfortable space for all users.

TikTok is capable of detecting user vulnerabilities and attempting to exploit them

Social media can harm users' mental health. More worrisome still, some platforms, TikTok among them, appear designed to exploit users' vulnerabilities. TikTok's algorithms aim to keep users engaged for as long as possible, and one way they do this is by continually serving content the user finds compelling, even when it is harmful.

For example, TikTok has been known to promote content about suicide, eating disorders, and body image issues. This is especially harmful to teenagers, who are already more vulnerable to mental health problems. More alarming still, TikTok appears to surface this content to teenagers in particular, which risks further exacerbating a mental health crisis in that age group.

TikTok, launched globally in 2017 by the Chinese company ByteDance, uses recommendation algorithms based on personal data, including likes, subscriptions, watch time, and user interests. It has become the world's fastest-growing social media application, reaching a billion monthly active users by 2021.

