Study finds TikTok exposes teens to harmful content
In 2022, TikTok faced a string of problems, chief among them security concerns.
A recent study suggests the app may also be harming its young users.
Soon after children create an account on the video-sharing app, it may begin promoting inappropriate content about eating disorders and suicide.
The findings are likely to add fuel to the fire as scrutiny of TikTok intensifies, especially over its impact on young users.
The study
The charity Center for Countering Digital Hate (CCDH) released a report on Wednesday.
The researchers found that it can take less than three minutes after signing up to encounter TikTok content about body image and suicide.
About five minutes later, users can find a community on the app that promotes eating-disorder content.
The researchers said they created eight new accounts in the US, the UK, Canada, and Australia, all at TikTok’s minimum user age of 13.
These accounts briefly paused on and liked content about mental health and body image.
Over a 30-minute period, TikTok recommended videos about mental health and body image roughly every 39 seconds, according to the CCDH.
TikTok woes
The study arrives as local, state, and federal officials explore possible action against TikTok, particularly over privacy and security concerns.
They are also evaluating the app’s safety for teenagers.
It was published more than a year after senators questioned social media executives at congressional hearings.
The senators worried that harmful content on those platforms could damage the mental health and self-esteem of younger users, particularly adolescent girls.
Following those hearings and disclosures by Facebook whistleblower Frances Haugen, the companies pledged to strengthen protections for teenage users.
The CCDH study, however, indicates that more work has to be done.
“The results are every parent’s nightmare,” said Imran Ahmed, the CEO of the CCDH.
“Young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them and their physical and mental health.”
Response
In response to the study, a TikTok spokesperson said it was flawed for several reasons, including:
- The small sample size
- The limited 30-minute testing window
- The way the accounts scrolled past unrelated topics in search of other content
“This activity and resulting experience does not reflect genuine behavior or viewing experiences of real people,” said the spokesperson.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.”
“We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”
The spokesperson added that the CCDH does not distinguish between positive and negative videos on a given topic, noting that people often share empowering stories about overcoming eating disorders.
Safeguards
TikTok asserts that it is constantly enhancing user protections.
For instance, the app now offers filters that can screen out explicit or “potentially harmful” videos.
In July, TikTok introduced a “maturity score” to identify videos that may contain mature or complex themes.
Users can also set limits on how long they spend watching TikTok videos, schedule regular screen breaks, and view a dashboard showing details such as how often they open the app.
TikTok also offers a number of parental controls.
Algorithm
Last year, US Senator Richard Blumenthal’s office set up a fake Instagram account posing as a 13-year-old girl.
The account followed dieting and pro-eating-disorder accounts, which are supposed to be banned.
According to Blumenthal, Instagram’s algorithm soon began recommending accounts promoting increasingly extreme diets.
Instagram later removed those accounts for violating its policies against encouraging eating disorders.
Policy violations
According to TikTok, it forbids content that depicts, promotes, normalizes, or glorifies suicide or other self-destructive behavior.
The figures below cover videos removed for violating its policies on suicide and self-harm between April and June 2022:
- 93.4% were removed at zero views
- 91.5% were removed within 24 hours of being posted
- 97.1% were removed before anyone reported them
The spokesperson said that searches for banned terms such as “#selfharm” return no results.
Instead, users are directed to local support resources.
Despite these assurances, the CCDH argues that further steps are needed to restrict certain content and strengthen protections for users under 18.
“This report underscores the urgent need for reform of online space,” said Ahmed.
“Without oversight, TikTok’s opaque platform will continue to profit by serving its users – children as young as 13, remember – increasingly intense and distressing content without checks, resources or support.”
Reference:
TikTok may push potentially harmful content to teens within minutes, study finds