On Oct. 24, New York Attorney General Letitia James and a bipartisan coalition of 40 states filed a federal lawsuit against Meta Platforms Inc., the parent company of Facebook, WhatsApp and Instagram.
The states alleged that Meta’s products contain addictive features that put young users in jeopardy — and that Meta downplayed the potential dangers.
The bipartisan investigation was led by Attorneys General Jonathan Skrmetti of Tennessee and Phil Weiser of Colorado.
The federal suit alleged that Meta maximized the time teen users spend on its social media platforms at the expense of their mental health, and that the company was aware of the harm.
“Despite overwhelming internal research, Meta has redoubled its efforts to misrepresent, conceal and downplay the impact of the harmful features on young users’ mental and physical health,” the suit stated.
As evidence, the attorneys general cited internal Meta documents made public by former employee and whistleblower Frances Haugen, who took more than 20,000 screenshots of records on the company’s research into the potential harms of its products. These documents revealed how Meta studied teenagers’ behavioral patterns to make its platforms more alluring to that audience.
In an interview, Weiser added that Meta restricted employees’ access to data and disbanded internal research teams. He interpreted those moves as part of an effort to protect a business model built on maximizing user engagement.
In their complaint, the states also alleged that Meta violated consumer protection laws by designing “psychologically manipulative product features to induce young users’ compulsive and extended use” of its social media platforms. Features such as infinite scroll and persistent notifications keep users hooked.
The lawsuits, combined with mounting public pressure, are likely to reshape the future of social media platforms, particularly concerning the welfare of younger users.
To regain trust and comply with stricter regulations, the states suggest that social media companies implement enhanced privacy protections, greater transparency in their operations and more responsible marketing practices. The two sides discussed a possible settlement that would have required Meta to acknowledge the alleged risks of its products and limit specific design features that aggravate existing mental health issues.
Meta declined, choosing instead to defend itself in court.
“32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” according to Meta’s own documents. But the company pushed back, saying the report was misleading and adding that teenage girls said Instagram was helpful in addressing 11 out of 12 “potential well-being issues.”
Lawmakers and regulators worldwide have pushed for greater accountability in the industry. With young users constantly exposed to potentially harmful content, there is a pressing need for platforms to prioritize safety; some young users have taken their own lives after exposure to such content. These lawsuits could serve as a significant catalyst for change, pushing the industry toward a safer online environment.
Larissa May, founder of a digital well-being nonprofit, said most teens are on their phones for about eight hours every day, which adds up to roughly 30 years of their lives influenced by a screen.
Meta has also suspended its Instagram Kids project, a version of the app designed for users under the age of 13. The company had temporarily paused those plans amid strong backlash from 40 states, whose attorneys general collectively wrote a letter to Mark Zuckerberg, Meta’s CEO, underscoring the company’s historical lapses in safeguarding the welfare of children on its platforms.
The attorneys general plan to pursue similar inquiries into Meta’s social media competitors, such as TikTok, and to seek internal records related to teen mental health.