In a significant legal challenge to one of the world's most popular social media platforms, more than a dozen states have filed lawsuits against TikTok, alleging that the company deliberately designed its app to keep young users addicted while misleading the public about its safety measures.
The bipartisan group of attorneys general, led by California and New York, filed separate lawsuits in 13 states and the District of Columbia. The legal actions accuse TikTok of violating consumer protection laws and contributing to a growing mental health crisis among teenagers.
At the heart of the lawsuits is the claim that TikTok's design features, including its personalized algorithm and endless scrolling capability, are intentionally manipulative and harmful to young users. The states argue that these features encourage excessive use, leading to emotional and behavioral changes in adolescents.
California Attorney General Rob Bonta said in a statement, "TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content." He added that the company "must be held accountable for the harms it created in taking away the time — and childhoods — of American children."
The lawsuits seek to force TikTok to modify its product features and impose financial penalties on the company. This legal action comes at a challenging time for TikTok, which is already facing a potential ban in the United States unless it severs ties with its China-based parent company, ByteDance, by Jan. 19.
TikTok, used by approximately half of the American population, now finds itself defending against allegations that tap into growing national concerns about the design of social media platforms and their potential contribution to mental health issues such as depression and body image problems among young users.
While the exact role of social media in exacerbating mental health problems remains complex and contested, the state authorities assert that TikTok has prioritized its growth and profits over the safety of children.
The lawsuits also target specific features of the app, including its use of beauty filters. New York Attorney General Letitia James stated, "Beauty filters have been especially harmful to young girls. Beauty filters can cause body image issues and encourage eating disorders, body dysmorphia, and other health-related problems."
New York investigators argue that TikTok failed to warn teens about these potential harms and instead promoted beauty filters to its youngest users to increase engagement time on the app.
The District of Columbia's lawsuit alleges that TikTok traps teens in online bubbles that "bombard them with precisely the kinds of content that TikTok claims not to allow, including videos about weight-loss, body-image, and self-harm content."
Another feature under scrutiny is TikTok's live-streaming capability. The state attorneys allege that thousands of underage users have hosted live-streamed videos where viewers can send digital "gifts" that can be converted to money. They claim this feature has been used to incentivize the sexual exploitation of children.
In response to these allegations, TikTok spokesman Alex Haurek said the accusations in the lawsuits are misleading. "We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16," Haurek stated.
Haurek noted that the lawsuits follow more than two years of negotiations with the attorneys general. "It is incredibly disappointing they have taken this step rather than work with us on constructive solutions to industrywide challenges," he said.
The legal actions will proceed in 14 separate state courts, as each complaint relies on specific state consumer protection laws. Individual trial dates will be set in the coming months or years unless the cases are dismissed or settlements are reached.
In recent months, many social media platforms, including TikTok, have enhanced their child safety tools in response to growing concerns. TikTok has implemented measures such as preventing young users from sending direct messages and setting their accounts to private by default. The app also uses screentime reminders to alert users about their scrolling duration.
The legal action against TikTok follows a similar lawsuit filed last year by a coalition of states against Meta, the parent company of Instagram and Facebook. That case, which is still pending, accused Meta of failing to keep children safe on its popular apps.
Last month, Meta introduced several new features to improve parental monitoring on Instagram and made all teenage accounts private by default to protect young users from potential predators.
However, the states dismiss these safety measures as ineffective public relations efforts. They argue that TikTok has not done enough to verify users' ages when opening accounts, allowing adolescents to circumvent child safety measures by lying about their age.
Bonta criticized TikTok's child safety features, stating they "do not work as advertised." He further claimed, "The harmful effects of the platform are far greater than acknowledged, and TikTok does not prioritize safety over profit."
The states involved in the lawsuit include California, New York, Illinois, Kentucky, Louisiana, Massachusetts, Mississippi, North Carolina, New Jersey, Oregon, South Carolina, Vermont, Washington, and the District of Columbia.