Lawsuit filed by 33 US states alleges Meta knew Facebook and Instagram's 'addictive features harmed young people's physical and mental health'

The Facebook 'Like' emoji logo is seen in this photo illustration on 22 August, 2023 in Warsaw, Poland. (Photo by Jaap Arriens/NurPhoto via Getty Images)

Meta is being sued by the attorneys general of 33 US states over allegations that it intentionally created and launched features on its Facebook and Instagram social media platforms that "purposefully addict children and teens."

"Kids and teenagers are suffering from record levels of poor mental health and social media companies like Meta are to blame," New York state attorney general Letitia James said in a statement. "Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem. Social media companies, including Meta, have contributed to a national youth mental health crisis and they must be held accountable."

The lawsuit was heavily redacted when it was originally filed in October, but a new "less-redacted" version shared by the state of California reveals some numbers that don't look good for Meta. It alleges that in 2021, for instance, Meta received more than 402,000 reports of under-13 users on Instagram through the platform's reporting process, but acted on fewer than 164,000 of them. 

It also allegedly made active efforts to avoid acting on complaints about underage users: One internal 2018 email chain referenced in the lawsuit discusses "coaching" parents to convince them to allow their children to remain on the platform, while another included a discussion about Meta's failure to delete a 12-year-old girl's four accounts despite complaints from the girl's mother, which the lawsuit says were ignored because employees "couldn't tell for sure the user was underage."

The lawsuit says Meta's business model is "based on maximizing the time that young users spend on its social media platforms," and to that end it designed and deployed "psychologically manipulative" features meant to exploit them. At the same time, it promoted those features as specifically not being manipulative, and "routinely published profoundly misleading reports purporting to show impressively low rates of negative and harmful experiences" amongst its users.

It also allegedly "continued to conceal and downplay" research indicating a range of negative outcomes associated with the use of social media, including its own internal studies, which "reveal that Meta has known for years about the serious harms associated with young users’ time spent on its social media platforms."

Naturally, allegations of widespread violations of the Children's Online Privacy Protection Act (COPPA)—the one that cost Epic a whopping half-billion dollars in 2022—are also in the mix: "Meta has marketed and directed its Social Media Platforms to children under the age of 13 and has actual knowledge that those children use its Platforms. But Meta has refused to obtain (or even to attempt to obtain) the consent of those children’s parents prior to collecting and monetizing their personal data."

In June, Meta posted an announcement about new parental supervision tools available in Messenger, which it expanded to Facebook, Instagram, and Horizon Worlds in November. "These tools allow parents to see how their teen uses Messenger, from how much time they’re spending on messaging to providing information about their teen’s message settings," Meta said. These tools do not allow parents to read their teen’s messages. 

"Over the next year, we’ll add more features to Parental Supervision on Messenger so parents can help their teens better manage their time and interactions, while still balancing their privacy as these tools function in both unencrypted and end-to-end encrypted chats."

One of the features touted in that announcement is the "Take a Break" tool, rolled out to Instagram in 2021, which enables teen users to set themselves a reminder to stop scrolling and go do something else. But the lawsuit dismisses it out of hand, because "instead of being able to set it and forget it, young users who make what can be a difficult choice to limit their daily use or take a break must make this difficult decision over and over again. Meta’s design choices make the proverbial wagon that much easier for young users to fall off."

"Meta knows that what it is doing is bad for kids — period," California Attorney General Rob Bonta said. "Thanks to our unredacted federal complaint, it is now there in black and white, and it is damning. We will continue to vigorously prosecute this matter."

The matter has yet to be tried before a court, but the lawsuit certainly does seem comprehensively assembled. The states involved are seeking a permanent injunction barring Meta from "engaging in the acts and practices" at issue on all of its social media platforms, as well as per-state fines, civil penalties, and legal costs, which vary based on each state's individual laws. All told, it could add up to a lot.

I've reached out to Meta for comment on the lawsuit and will update if I receive a reply.

Andy Chalk

Andy has been gaming on PCs from the very beginning, starting as a youngster with text adventures and primitive action games on a cassette-based TRS-80. From there he graduated to the glory days of Sierra Online adventures and MicroProse sims, ran a local BBS, learned how to build PCs, and developed a longstanding love of RPGs, immersive sims, and shooters. He began writing videogame news in 2007 for The Escapist and somehow managed to avoid getting fired until 2014, when he joined the storied ranks of PC Gamer. He covers all aspects of the industry, from new game announcements and patch notes to legal disputes, Twitch beefs, esports, and Henry Cavill. Lots of Henry Cavill.