Massachusetts Court Weighs Whether Meta Designed Apps To Hook Teens
Massachusetts’ highest court heard two days of arguments in a high-stakes lawsuit accusing Meta of intentionally engineering Instagram and Facebook to foster compulsive use among adolescents, a debate that could reshape liability and regulation of social media. The case hinges on internal company research and a central constitutional question: a ruling for the state could expose platform design choices to new legal oversight, with broad implications for children and the tech industry.

The Massachusetts Supreme Judicial Court heard oral arguments on December 5 and 6 in a landmark lawsuit brought by the state attorney general alleging that Meta Platforms intentionally designed features of Instagram and Facebook to be addictive for adolescents. The suit, filed in 2024, relies in part on internal Meta research that the state says shows specific product features encourage compulsive use by young people, including endless scrolling, persistent notifications and other engagement tools.
State lawyers argue that those design choices amount to commercial conduct that can be regulated and that Meta can be held responsible for harms to minors. The state solicitor argued the case before the court, which was asked to decide whether the platforms’ engineering choices can give rise to liability when they foreseeably harm children.
Meta’s lawyers contested the state’s characterization of product design. They argued that the company’s product choices are protected expression under the First Amendment and warned that treating design as commercial conduct would carry sweeping constitutional implications for how online services operate. During the two days of argument, justices pressed both sides on the legal boundary between product design and publishing, probing whether user interfaces and algorithms should be treated like speech or like the sale of a consumer product.
The dispute centers on competing legal frameworks. If the court accepts the state’s view, platform features aimed at capturing attention could be subject to regulation or civil liability when they cause foreseeable harm to minors. If the court accepts Meta’s position, many core software and interface decisions could be shielded as expressive conduct, narrowing the ability of states and consumers to pursue claims tied to platform design.

Observers say a ruling against Meta could have far-reaching consequences beyond this case. Plaintiffs in other state and private suits targeting addictive design features could find new avenues for relief, and regulators seeking to impose design standards intended to protect children could be emboldened. Conversely, a decision for Meta would reinforce broad First Amendment protections for technology companies and raise the bar for lawsuits that seek to police user engagement techniques.
Legal scholars and child safety advocates have framed the litigation as part of a larger reckoning over how modern digital products are built and marketed to young people. The Massachusetts suit puts into sharp relief the tension between business models premised on maximizing attention and growing public concern about the mental health and developmental impacts of intensive social media use among adolescents.
The court’s decision is likely to be closely watched by state attorneys general, technology companies, parents and lawmakers across the country. Whatever the outcome, the case will help define whether courts treat platform architecture as a form of speech immune from regulation, or as a commercial practice that can be checked when it harms vulnerable users.