Meta Platforms, TikTok and YouTube have been taken to court over allegations that their platforms were built in ways that trap children’s attention and worsen mental health.
At the centre of the case is a 19-year-old woman from California, identified in court papers as K.G.M. She argues that she became hooked on the apps while still a child and that prolonged use damaged her mental health.
She is asking the court to hold the companies liable for the effects of their product design, not just the content she consumed.
Beyond a single dispute, the trial tests whether a digital product can be treated like any other consumer good when it causes harm. That question will now be argued in open court, under oath, and in front of a jury.
Lawyers for the plaintiff say this is the first time technology firms must defend themselves at trial on claims that their platforms injured a young user.
Matthew Bergman, the lead attorney, said: “They will be under a level of scrutiny that does not exist when you testify in front of Congress.”
The jury must decide whether the companies were negligent, and whether use of the platforms played a role in K.G.M.’s mental health challenges, as distinct from other factors in her life or the third-party material she viewed. Legal experts say the result could influence hundreds of similar cases awaiting resolution.
“This is really a test case,” said Clay Calvert, a media lawyer at the American Enterprise Institute. “We’re going to see what happens with these theories.”
Senior executives are expected to be called. Meta chief executive Mark Zuckerberg is due to testify, an uncommon sight for a technology founder in a civil courtroom.
Snap’s chief executive Evan Spiegel had also been expected, but Snap agreed to settle the case against it earlier this month. The company has not disclosed the terms.
The remaining firms are preparing distinct defences. Meta has said its products did not cause the plaintiff’s mental health difficulties. YouTube plans to argue that its service is different in nature from platforms such as Instagram and TikTok and should not be treated the same way. TikTok has declined to outline its courtroom strategy.
Since 2022, thousands of lawsuits across the United States have accused social media companies of deliberately designing addictive features that harm children.
In September 2025, a California court allowed expert witnesses to explain how tools such as endless scrolling, autoplay and algorithm-driven feeds affect young users’ mental health. That ruling cleared the path for this bellwether case.
At the same time, the companies are fighting a parallel issue for public trust. They have rolled out new parental controls, funded school workshops and partnered with youth groups to show they take safety seriously. Meta has sponsored “Screen Smart” sessions in schools.
TikTok has backed parent programmes under the banner “Create with Kindness”. Google, YouTube’s parent company, has worked with the Girl Scouts on online safety badges.
Critics say these initiatives muddy the waters. Julie Scelfo, founder of Mothers Against Media Addiction, said: “These companies are using every lever of influence that you can imagine. It can be very confusing for parents who to trust.”