Massachusetts’ Supreme Judicial Court heard arguments Friday in a state lawsuit claiming Meta intentionally engineered Facebook and Instagram features to be addictive for young users.
Attorney General Andrea Campbell’s 2024 lawsuit asserts that Meta designed these features to boost profits, affecting hundreds of thousands of Massachusetts teens who use the platforms.
State Solicitor David Kravitz argued that the case focuses solely on Meta’s own design tools, saying the company’s internal research shows these features encourage addictive behavior. He emphasized that the claims do not involve Meta’s algorithms or moderation practices.
Meta rejected the accusations on Friday, insisting it has long worked to support youth safety. Company attorney Mark Mosier argued the lawsuit would improperly penalize Meta for standard publishing activities, which he said are protected under the First Amendment.
Mosier added: “If the state claimed the speech was false or fraudulent, its argument would be stronger. But acknowledging the content is truthful places this squarely under First Amendment protection.”
Several justices, however, seemed more focused on Meta’s engagement-driving functions—such as persistent notifications—rather than the content itself.
Justice Dalila Argaez Wendlandt said the state’s complaint centers on “incessant notifications designed to exploit teenagers’ fear of missing out,” not on Meta spreading false information.
Justice Scott Kafker questioned Meta’s argument that this is simply about choosing what to publish: “This isn’t about publishing—it’s about capturing attention. The content doesn’t matter; the goal is to keep users looking.”
Meta faces multiple state and federal lawsuits accusing the company of creating features—like nonstop notifications and infinite scrolling—to hook young users.
In 2023, 33 states sued Meta, alleging it illegally collected data on children under 13 without parental consent. Several states, including Massachusetts, filed separate lawsuits targeting addictive design and youth harms.
Investigative reports, beginning with a 2021 Wall Street Journal series, revealed that Meta knew Instagram could be harmful to teens, especially girls. The reporting cited internal research in which 13.5% of teen girls said the app made suicidal thoughts worse and 17% said it worsened eating disorders.
Critics argue Meta has failed to meaningfully address youth mental-health risks. A 2024 report by whistleblower Arturo Bejar and several nonprofits claimed Meta prioritized publicity-friendly features over substantive safety improvements.
Meta said the report distorted the company’s efforts to protect teens.