Social Media Giants Face Landmark Youth Addiction Trial In California

SACRAMENTO, California — A trial kicking off this week in Los Angeles will weigh one of the most fraught questions of the digital age: whether social media giants are responsible for harming children.

For the first time in the U.S., the tech behemoths behind social media platforms that reshaped modern life and communication will be forced before jurors to confront allegations they negligently created products that cause addiction, depression and other trauma.

The case names Meta, YouTube and TikTok as defendants. High-profile industry figures including Meta CEO Mark Zuckerberg and Adam Mosseri, who heads Meta-owned Instagram, could be called to testify.

“This is the first time families have ever had their right to a day in court,” Matthew Bergman, the Social Media Victims Law Center attorney representing plaintiffs suing the companies, told reporters last week. “This is a historic point.”

The trial, which begins with jury selection in Los Angeles Superior Court on Tuesday, is a bellwether for similar suits pending in California state court.

A verdict in favor of the plaintiff, a 19-year-old Northern California woman identified in court documents by the initials K.G.M., could require the companies to pay damages or alter their platform designs, and could encourage them to settle thousands of similar lawsuits.

It also challenges a long-standing precedent that’s protected Big Tech firms from liability for alleged harms linked to social media use. For years, social media companies have gotten lawsuits dismissed by relying on a decades-old law, Section 230 of the Communications Decency Act, which exempts them from liability for most user-posted content on their platforms.

While some judges have accepted the companies’ argument that content posted on their sites or suggested by an algorithm is protected speech, Superior Court Judge Carolyn Kuhl ruled earlier this year that K.G.M.’s case and a handful of similar lawsuits could proceed to trial. Plaintiffs contend more than a decade of social media use left K.G.M. addicted and depressed, and are requesting unspecified monetary damages.

The suit tests a novel legal theory that social media sites or their specific features are defective products that encourage addictive behavior — similar to cigarettes and opioids — and subject to personal injury law. It capitalizes on a trend where courts have increasingly allowed cases to proceed if they fault companies for deliberately making risky and addictive products or ignoring known harms associated with their platforms, not just hosting harmful content.

“Jurors will have to decide whether the harm was caused by alleged defective design features of the social media platforms at issue, or whether it was caused by the speech of third parties,” said Clay Calvert, a legal scholar specializing in First Amendment issues.

If jurors find the third-party content that K.G.M. viewed caused the harm, that’s a win for social media companies, said Kathleen Farley, vice president of litigation for left-leaning tech industry trade group Chamber of Progress. “That’s saying that nothing the social media companies were responsible for, like the design of their platform, caused the harm.”

Yet even if that happens, Farley said, the fact that K.G.M.’s case is proceeding to trial raises “troubling First Amendment implications” for social media companies because Section 230 protections “should have prevented” it from moving forward.

“The threat of being called into court in the first place is recognized as causing a chilling effect,” she added.

The trial could thrust some of the biggest names in tech back into the hot seat. In early 2024, social media CEOs were hauled before Congress to answer for the dangers their platforms are accused of having posed, particularly to children — a spectacle that resulted in an apology from Zuckerberg. They haven’t faced further public grilling since President Donald Trump reentered the White House last year.

Both Zuckerberg and Mosseri may be called to testify as soon as next week, since the state court judge overseeing the trial rejected Meta’s attempt last fall to prevent plaintiffs from summoning its top leaders to court.

If either takes the stand, they’ll field tough questions about what their platforms knew about risks to kids’ safety from experienced plaintiffs’ attorneys armed with thousands of pages of internal company documents, not lawmakers looking for a sound bite.

For Zuckerberg, it would be a sharp shift from his experience testifying in front of Congress, Farley said.

“This inquiry is centered more on whether or not the jury believes his testimony. He has to answer specific questions, and [he] will be cross-examined,” Farley added. “It’s simply a different experience.”

A spokesperson for Meta, when asked about the company’s legal strategy, pointed to a recent company blog post that accuses the plaintiffs in the cases of constructing a “misleading narrative” to downplay the company’s safety initiatives and erroneously attribute wider teen mental health issues to its platforms.

Lawyers for the teen have highlighted unsealed court documents that appear to cut against the companies’ arguments. In November, they published an analysis of expert reports and witness depositions, including Zuckerberg’s, that accused social media executives of obscuring their platforms’ potential dangers from parents and children.

Snap CEO Evan Spiegel was also expected to testify until his company settled its lawsuit on undisclosed terms last week. TikTok declined to comment on its legal strategy for the trial. Google, which owns YouTube, is aiming to tout YouTube Kids and other safety features while arguing that the video-sharing site more closely resembles a streaming service than a traditional, likes-based social media platform such as Facebook, Instagram or TikTok.

The LA trial is one of a series of bellwether cases for a joint proceeding of more than 1,600 plaintiffs, ranging from California school districts to families who accuse social media platforms of harming their kids.

At the federal level, more than 235 plaintiffs are suing Meta, Snap, TikTok and Google parent company Alphabet on similar grounds. That case includes a legal complaint against Meta from California Attorney General Rob Bonta and a bipartisan coalition of at least 32 other state attorneys general. Trials for the federal multidistrict litigation are slated to begin on June 15, the federal judge overseeing the proceedings said during a court hearing Monday.

Meta is separately facing lawsuits from the Social Media Victims Law Center on behalf of two families who claim Instagram is a defective product that contributed to their children’s deaths by suicide. Yet another case, brought against Meta by New Mexico’s attorney general and accusing the company of creating “a breeding ground for predators,” is scheduled for trial next month after a judge denied Meta’s Section 230 motion to dismiss the complaint.

State legislators across the country have also been enacting laws that cover many of the issues addressed in the cases; some of those laws face court challenges of their own from tech industry groups. Those efforts won’t let up even if the companies get their best-case scenario in the Los Angeles trial.

“One scenario is that the defendants have a clean sweep, the litigation becomes irrelevant, and yet the legislation will be untouched by that outcome,” said Eric Goldman, an internet law professor at Santa Clara University School of Law in Silicon Valley. “Defendants are unlikely to knock out all of the different ways in which they're being regulated.”

Ruth Reader contributed to this report.