A Massachusetts court has ruled that Meta Platforms must face a lawsuit over allegations its Instagram service was designed to promote addiction among children.
The decision came on Friday from the Massachusetts Supreme Judicial Court, where the justices agreed the case can go ahead, rejecting Meta’s argument that it is protected by federal law.
At the centre of the case is Section 230 of the Communications Decency Act, which usually shields tech companies from liability over content posted by users.
The court said that protection does not apply here because the allegations focus on how the platform itself was built.
Writing for a unanimous bench, Justice Dalila Argaez Wendlandt said the lawsuit does not seek to hold Meta responsible for posts made by users. Instead, she said it targets the company’s own actions in designing the platform.
“Instead, the claims allege harm stemming from Meta’s own conduct either by designing a social media platform that capitalizes on the developmental vulnerabilities of children or by affirmatively misleading consumers about the safety of the Instagram platform,” Wendlandt wrote.
The case was filed by Massachusetts Attorney General Andrea Joy Campbell, who argues that features such as endless scrolling, notifications and post “likes” were built to keep young users engaged for longer. Her office also claims the company understood the risks but failed to make changes.
Campbell welcomed the ruling, calling it a “major step in holding these companies accountable for practices that have fueled the youth mental health crisis and put profits over kids.”
Meta has denied the allegations, saying it has taken steps to improve safety for teenagers and younger users, although it has not yet responded publicly to the latest ruling.
The decision is the first time a state high court has clearly said Section 230 does not protect a company from cases tied to its own design choices. That point could affect other cases across the United States.
Across the country, states, school districts and families have filed similar lawsuits. Many accuse Meta and other platforms of creating products that are difficult for young users to step away from.
Some of those cases have already reached juries. In Los Angeles, a jury in a recent trial found Meta and Google negligent for designing platforms that harmed a young user, and awarded damages.
In another case, a jury ordered Meta to pay civil penalties over claims it misled users about safety and allowed harmful activity on its platforms.
Back in Massachusetts, the focus will now shift to trial proceedings, where Meta must defend how Instagram was designed, not just what appears on it.