TikTok knew its algorithm was harming children, internal documents accidentally reveal

  • Attorneys general from 14 states are suing TikTok for harming children’s mental health.
  • Internal research, accidentally made public this week, indicates that TikTok knew its algorithm was dangerous.

Details of a multistate investigation into TikTok accidentally became public this week, revealing that the company behind the wildly popular video app knew it was harming American teenagers, according to a new report.

The incriminating internal documents came to light when Kentucky Public Radio reporters realized that redacted sections of the court filings became readable when copied and pasted into a new text file, NPR reported Friday.

The court documents came from Kentucky’s portion of a coordinated effort by 14 states, announced Tuesday, to sue TikTok over what officials say is an addictive algorithm that endangers the mental and physical health of the children who use the app.

According to NPR, the documents show officials at TikTok’s parent company, ByteDance, discussing internal studies indicating that the app can harm children.

Young users may become addicted to the app after watching 260 videos on the platform, TikTok employees determined, according to the documents. Since most TikTok videos are about 8 seconds long, Kentucky officials calculated that a child could become addicted after just 35 minutes of using the app.
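
For reference, the arithmetic behind Kentucky’s estimate, taking the filing’s figures of 260 videos at roughly 8 seconds apiece: 260 × 8 = 2,080 seconds, or about 34.7 minutes, in line with the roughly 35 minutes officials cite.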

NPR reports that in one of the newly revealed internal documents, TikTok employees note that heavy use of the app “correlates with a range of negative mental health effects, such as loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.”

Adam Wandt, an associate professor and vice chair for technology at John Jay College of Criminal Justice, told Business Insider that redaction errors like this are very common in the US justice system and that he has encountered dozens of similar cases.

“Redacting documents is quite difficult,” Wandt said. “Very often, people just put black bars in the PDF or black out the text, but the text is still there.”
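
How such a failure can happen, as a minimal sketch in Python: assuming the redactions were opaque rectangles drawn over the text rather than text actually removed from the file (the filename here is hypothetical), any text extractor, like a simple copy and paste, still reads the words underneath.

```python
# pip install pypdf
from pypdf import PdfReader

# A hypothetical court filing "redacted" by drawing black boxes over the text.
reader = PdfReader("filing.pdf")

for page in reader.pages:
    # extract_text() reads the text objects stored in the page's content
    # stream; the opaque rectangles are separate drawing commands layered
    # on top, so the "redacted" words come back intact.
    print(page.extract_text())
```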

In a statement to BI on Friday, a TikTok spokesperson said it was “highly irresponsible for NPR to publish information that is under court seal.”

“Unfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety,” the spokesperson said. “We have robust protections in place, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screen-time limits, Family Pairing, and privacy by default for minors under 16. We stand by these efforts.”

TikTok disputed the claims made in the states’ lawsuits, saying Tuesday through a spokesperson that the app protects its youngest users with “robust safeguards.”

According to NPR, the unredacted internal documents show that TikTok touted its tools for limiting teens’ screen time, even though it knew from its own research that those features had “little impact.”

Jayne Conroy, an attorney at Simmons Hanly Conroy, which represents about 50 plaintiffs in class-action litigation accusing social media platforms of harming children, said internal documents uncovered during state investigations show that tech companies deliberately design their products to “relentlessly engage” and harness adolescent brains.

“They have worked repeatedly to maximize both user engagement and profits, all at the expense of the mental health of our young people,” Conroy told BI. “The fact that these companies knew about and ignored this harm is exactly consistent with the allegations in our complaint.”

Wandt told BI that the contents of the documents were “not surprising at all.”

TikTok’s policies prohibit users under 13 from creating an account, but the unredacted internal documents show that TikTok asks moderators to exercise caution when deleting accounts, according to NPR. One internal document instructs moderators not to take action on reports of underage users unless the account itself identifies the user as under 13, NPR reported.

Wandt said it was “well within TikTok’s ability to prevent minors under a certain age from using their app.”

“However, it’s not necessarily in their business interest,” he said. He called TikTok’s algorithm “one of the most dangerous influences on the planet right now” for children.

“So they’ve put in place poorly designed measures and policies that they know don’t work, and it doesn’t surprise me at all that their own internal research shows they don’t work, because they really aren’t incentivized to fix the problems,” he said.

Matthew Bergman, founding attorney at the Social Media Victims Law Center, which represents more than 3,000 plaintiffs in cases involving teens harmed by social media, told BI that the unsealed information is “certainly consistent with what we’re seeing” on TikTok and other social media platforms.

“They design these products to be addictive, especially through their endless scrolls,” Bergman said. “They make their money by showing children not what they want to see but what they can’t look away from.”