Meta, Google face jury setbacks in US child safety cases

Verdicts in California and New Mexico hold tech firms liable; legal battle may redefine scope of liability protections for online platforms

Jurors in two early trials in the United States have found Meta Platforms and Alphabet Inc.’s Google liable in cases alleging harm to children, setting the stage for an appeals battle that could reshape how US law shields technology companies from lawsuits.

In California, a Los Angeles jury on 26 March held Meta and Google responsible for a young woman’s depression and suicidal thoughts, which she attributed to addiction to Instagram and YouTube from a young age. The jury awarded a combined $6 million in damages.

In a separate case in New Mexico, jurors on 25 March ordered Meta to pay $375 million after finding the company misled users about the safety of its platforms for younger audiences and enabled sexual exploitation of children, according to court proceedings.

Both companies have denied the allegations and said they plan to appeal.

Section 230 shield under scrutiny

The verdicts challenge protections under Section 230 of the Communications Decency Act, a 1996 US law that generally shields online platforms from liability for user-generated content.

In both cases, plaintiffs argued that harm arose not from third-party content but from the companies’ platform design choices. Trial judges allowed the cases to proceed, rejecting arguments by Meta and Google that Section 230 barred the claims.

A Meta spokesperson said the company “respectfully disagrees with the verdicts” and will appeal, adding that it remains committed to building safe environments for young users. Google has also indicated it will appeal the California ruling.

Legal experts said the appeals are likely to focus heavily on whether Section 230 applies to platform design decisions.

Gregory Dickinson, a law professor at the University of Nebraska, said courts are increasingly distinguishing between liability for third-party content and liability tied to platform functionality.

Thousands of cases pending

The rulings come amid a broader wave of litigation against technology companies over alleged harms to children and teenagers.

Meta, Google, Snap Inc., and ByteDance face thousands of lawsuits in US courts over claims that their platform designs contribute to mental health issues among young users.

More than 2,400 cases have been consolidated before a federal judge in California, with additional cases proceeding in state courts.

Legal analysts said lower courts have increasingly adopted a narrower interpretation of Section 230, particularly in cases involving product design and platform features. However, no US appellate court has yet issued a binding ruling on the issue.

Wider implications for internet platforms

Experts said appellate decisions in these cases could have consequences beyond social media, potentially affecting a wide range of online platforms.

More than 130 lawsuits are pending against Roblox Corporation in federal court, alleging failure to protect users from sexual exploitation. The company has denied the claims.

Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University, said the litigation could extend to the broader internet ecosystem.

“The internet is on trial, not just social media,” he said, adding that similar legal theories could be applied across different types of platforms.

Supreme Court may intervene

Appeals in both cases will first be heard in state-level appellate courts, but legal experts said the issue could ultimately reach the US Supreme Court.

The court has previously shown interest in the scope of Section 230. In 2023, it heard a case involving YouTube but avoided ruling on the law’s protections. In 2024, it declined to hear a case involving Snap, though two justices dissented, warning against delays in addressing the issue.

Meetali Jain, director of the Tech Justice Law Project, said the court may now be prepared to take up a case that directly tests the limits of Section 230.

The outcomes of the appeals could determine whether technology companies remain broadly protected from liability or face expanded legal exposure over how their platforms are designed and operated.
