Massachusetts High Court Weighs Claims That Meta Designed Addictive Features for Teens
Massachusetts’ highest court convened on Friday to hear allegations that Meta knowingly designed addictive features into its platforms, Facebook and Instagram, to the particular detriment of the state’s youth. The lawsuit, filed by Attorney General Andrea Campbell in 2024, claims that Meta’s design choices are financially motivated and have harmed countless teenagers across Massachusetts.
Why It Matters
The outcome of this case could have significant consequences for Meta and its business model, shedding light on the legal boundaries of social media companies in relation to user addiction, especially among vulnerable populations like children. As scrutiny over the effects of tech on mental health intensifies, this case may pave the way for future regulatory actions against similar practices across the industry.
Key Developments
- The Massachusetts Supreme Judicial Court heard oral arguments regarding the lawsuit against Meta.
- State officials argue that Meta designed features that cultivate addiction to boost profits.
- Meta’s representatives counter that the lawsuit threatens First Amendment protections afforded to traditional publishers.
- Judges expressed concerns about the nature of Meta’s notifications designed to exploit teenage users’ fears of missing out.
- The company is also facing multiple federal and state lawsuits over alleged violations regarding minors and data collection.
Full Report
Arguments Presented
During the oral arguments, State Solicitor David Kravitz emphasized that the lawsuit targets specific tools developed by Meta that, according to the company’s own research, foster addictive behaviors among users. He clarified that the state’s claims do not pertain to Meta’s algorithms or content moderation practices.
In rebuttal, Meta’s attorney, Mark Mosier, firmly rejected the allegations, asserting that the company’s conduct falls within First Amendment protections. He argued that the lawsuit could wrongly impose liability on a company engaged in standard publishing practices, and that to overcome the First Amendment, the state’s claims would need to show that Meta disseminated false information.
Judicial Concerns
Some justices, including Justice Dalila Wendlandt, indicated they were more focused on how Meta’s notifications were crafted to trigger responses in young users. “I didn’t understand the claims to be that Meta is relaying false information… but that it has created an algorithm of incessant notifications,” she remarked. Meanwhile, Justice Scott Kafker observed that Meta’s strategy appeared centered on attracting eyeballs rather than on the content itself.
Wider Legal Context
Meta is embroiled in a series of ongoing legal challenges at both the federal and state levels. In 2023, a coalition of 33 states filed a collective lawsuit asserting that the company unlawfully collected data on children under 13 without parental consent, in violation of federal children’s privacy law.
Additionally, Massachusetts has pursued its own legal action over the potentially harmful design of Meta’s platforms for young users. Whistleblower reports and studies have indicated that Meta’s own research recognized its platforms’ detrimental impact on mental health, particularly among teen girls, with some reporting worsened suicidal thoughts and heightened eating disorder concerns.
Context & Previous Events
Earlier media reports, notably by The Wall Street Journal in 2021, revealed that Meta was aware of the harms Instagram posed to teenagers, sparking growing concern among parents, lawmakers, and mental health advocates. Critics argue that Meta’s response to safety and mental health challenges has been inadequate, contending that the company has opted for superficial fixes rather than robust measures to safeguard young users.
As the case continues to unfold, its implications for both technology regulation and teen safety remain profound and potentially transformative.