TikTok targets and exploits teens, Illinois and 13 other states say in new lawsuits

Illinois Attorney General Kwame Raoul joined 13 other states and the District of Columbia on Tuesday in filing lawsuits accusing TikTok of harmful business practices for allegedly targeting children and misleading the public about the dangers of its social media platform.

The lawsuit comes nearly a year after Raoul joined 32 other states in a federal lawsuit against Meta Platforms Inc., the parent company of Facebook and Instagram, claiming the platforms are addictive and exploit children’s psychological vulnerabilities.

“Even though TikTok knows internally that excessive and addictive use of its platform can be harmful to children and adolescents, it has failed to protect young users in the United States by not offering them a safer version of the platform. What is particularly concerning is that the version of TikTok used in the United States does not contain the same protections for children as those found in China,” Raoul said during a press conference Tuesday.

“In Illinois, we always put our children and youth first,” Raoul continued. “I am committed to holding TikTok and all other social media companies accountable for putting profits ahead of the safety and well-being of our children.”

Tuesday’s lawsuit, filed in Cook County Circuit Court, stems from an investigation Raoul announced in March 2022. The state is seeking an injunction to remedy TikTok’s “misconduct,” as well as monetary penalties.

“Defendants’ business model relies on keeping users on the platform as long as possible, in order to both show them more ads and catalog their data,” the suit states. “In service of (their) business model, Defendants perniciously employ an arsenal of addictive features specifically designed to exploit, manipulate and capitalize on the developing brains of young, nascent users, harvesting their data to exploit the vulnerabilities unique to each young user.”

The suit also claims the defendants are aware that TikTok, which is owned by Chinese company ByteDance, has the potential to “cause serious harm to young users,” including sleep deprivation, depression, anxiety, self-harm, suicide and death.

Raoul said young girls use the platform’s beauty filters, which blur the boundaries of reality and can contribute to harm.

“Online features like this promote unrealistic beauty standards and are particularly dangerous for younger users because they can cause negative self-obsession or self-hatred toward their appearance,” Raoul said. “…TikTok’s decision to impose these appearance-altering effects on children, despite its knowledge of the harm, is immoral and a violation of our law prohibiting unfair trade practices.”

Raoul seeks to enforce state consumer protection laws to prevent the platform from capitalizing on young users and wants to hold the defendants accountable for “the unfair and deceptive design, operation and marketing of the TikTok platform to entrap and addict young Illinois users,” the suit says.

The lawsuit notes that 6.2 million residents and 280,000 businesses in Illinois use TikTok.

In a statement, TikTok hit back at the characterizations contained in the suit and said it had been trying to resolve its differences with the attorneys general.

“We strongly disagree with these claims, many of which we believe are inaccurate and misleading. We are proud of, and remain deeply committed to, the work we have done to protect teens, and we will continue to update and improve our product. We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features such as default screen time limits, Family Pairing and default privacy settings for minors under 16,” said TikTok spokesperson Michael Hughes.

“We have been working with the attorneys general for more than two years, and it is extremely disappointing that they have taken this action rather than working with us on constructive solutions to the challenges affecting the entire sector.”