
TikTok is reportedly aware of its harmful effects on adolescent users

TikTok executives and employees were well aware that the app’s features promoted compulsive use, and that such use had negative effects on mental health, according to NPR. The broadcaster reviewed unredacted documents from the lawsuit filed by the Kentucky Attorney General’s Office, as published by Kentucky Public Radio. More than a dozen states sued TikTok a few days ago, accusing it of “falsely claiming [that it is] safe for young people.” Kentucky Attorney General Russell Coleman said the app was “specifically designed to be an addiction machine, targeting children who are still developing appropriate self-control.”

Most of the documents submitted in these lawsuits were redacted, but the redactions in the Kentucky filing were faulty. Apparently, TikTok’s own research found that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” TikTok executives also knew that compulsive use could interfere with sleep, work and school responsibilities, and even “connecting with loved ones.”

They also reportedly knew that the app’s time management tool does little to keep younger users off the app. While the tool sets the default usage limit at 60 minutes per day, teens still spent an average of 107 minutes on the app with the tool enabled, just 1.5 minutes less than the 108.5-minute daily average before its launch. According to internal documents, TikTok measured the tool’s success by how it “improv[ed] public trust in the TikTok platform via media coverage.” The company knew the tool wouldn’t be effective, with one document stating that “[m]inors do not have executive function to control their screen time” the way young adults do. Another document reportedly stated that “across most engagement metrics, the younger the user, the better the performance.”

Additionally, TikTok was reportedly aware of the existence of “filter bubbles” and understood how dangerous they could be. According to the documents, employees conducted internal studies in which they found themselves pulled into negative filter bubbles shortly after following certain accounts, such as those focused on painful (“painhub”) and sad (“sadnotes”) content. They were also aware of content and accounts promoting “thinspiration,” which is associated with eating disorders. Because of the way TikTok’s algorithm works, its researchers found that users can be placed in a filter bubble after just 30 minutes of use in a single session.

TikTok also struggles with moderation, according to the documents. An internal investigation found that underage girls on the app were receiving “gifts” and “coins” in exchange for taking their clothes off on livestreams. And the company’s top officials reportedly instructed moderators not to remove users reported as being under 13 unless their accounts explicitly stated that they were under 13. According to NPR, TikTok has also acknowledged that a significant amount of content violating its rules slips past its moderation techniques, including videos that normalize pedophilia and glorify minor sexual assault and physical abuse.

TikTok spokesperson Alex Haurek defended the company, telling the organization that the Kentucky AG’s complaint “cherry-picks misleading quotes and takes outdated materials out of context to misrepresent our commitment to community safety.” He also said that TikTok has “robust safeguards, which include proactively removing suspected underage users” and that it has “voluntarily launched safety features such as default screen time limits, Family Pairing, and privacy by default for minors under 16.”