A first-of-its-kind decision marked a crucial moment of accountability for social media companies. It’s just the beginning.
Meta, YouTube, TikTok and Snap face thousands of lawsuits from individuals and families, as well as school districts and state attorneys general. Each case, some of which are set to go to trial next year, is different. But a landmark judgment Wednesday could offer hints about what’s to come.
History is littered with companies that have lost significant court cases with massive penalties and survived just fine, of course. But often those cases brought changes within companies – to product ingredients or manufacturing, for example. In many cases, waves of legal pressure have also sparked cultural change, shifting how consumers engage with these companies and their products.
A Los Angeles jury on Wednesday found that Meta and YouTube knew their platforms posed risks to young people and bore responsibility for a young woman’s mental health challenges. It follows years of concerns from parents, advocates and whistleblowers. TikTok and Snap settled the Los Angeles case ahead of trial.
The financial repercussions of the Los Angeles case — a combined $6 million in compensatory and punitive damages — are a small price to pay for companies as large as Meta and Google. What’s more, the companies plan to appeal the decision, and there’s no guarantee that subsequent cases will go the same way.
“Teen mental health is profoundly complex and cannot be linked to a single app,” a Meta spokesperson said in a statement. “We will continue to defend ourselves vigorously as every case is different, and we remain confident in our record of protecting teens online.”
Google spokesperson José Castañeda said in a statement that the case “misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.”
Still, the verdict proves that social media companies are not immune from responsibility for their impact on users. And it came one day after a New Mexico jury ordered Meta to pay $375 million in damages for failing to prevent child sexual exploitation on its platforms. Together, the decisions could herald major changes for Big Tech, whether through the courts, Congress or beyond.
“This verdict sends a clear message to an entire industry that the era of operating without consequence is over,” Mark Lanier, founder of the Lanier Law Firm and lead trial counsel for the plaintiff, said in a statement.
A new legal path
Tech giants have for years avoided legal liability for user safety-related issues thanks to Section 230, a law shielding them from responsibility for the content that third parties post on their platforms.
But the Los Angeles case, brought by a young woman named Kaley, tested a novel legal theory: holding social media companies accountable for harms caused by their design decisions rather than the content they host.
Kaley’s lawyers pointed to endlessly scrolling feeds, autoplay videos and beauty filters, features advocates hope the companies could eventually be forced to change or do away with for teens.
The jury agreed: Ten of the 12 jurors found the companies negligently designed their platforms, failed to warn users of known risks and played a substantial role in causing Kaley’s mental health challenges.
The damages are less than plaintiffs had asked for but represent a lot of money to Kaley, said attorney Jayne Conroy, if not necessarily to the companies. But perhaps even more important for the subsequent cases: “We were looking for yeses and to prove our theory,” Conroy, a partner at Simmons Hanly Conroy and a member of the trial team, said in an interview.
A wave of litigation
The decision helps legal teams determine how to use evidence unearthed in the litigation — including testimony from company executives and whistleblowers and internal documents and research — in subsequent trials. The next “bellwether” case, this one brought by a teen boy, is set to go to trial later this year.
“It really hones our strategy,” said Conroy, adding that bringing such a case requires combing through millions of internal documents. “What we’re able to do is analyze what documents we were using and really crystallize which ones make the most impact and why.”

The Tuesday decision by a New Mexico jury that Meta is liable for failing to prevent child sexual exploitation on its platforms could also set a precedent for state cases. Meta plans to appeal the New Mexico case, too.
“You add it all up and it could be hundreds of billions of dollars,” Jonathan Haidt, social psychologist and author of “The Anxious Generation,” told CNN. “That, I think, would get Meta’s attention, and I think that would possibly cause them to change their behavior.”
While tech giants argue they’ve already invested heavily in youth safety features, some experts are comparing the wave of legal pressure to Big Tech’s Big Tobacco moment.
“I’m old enough to remember when we had smoking sections on airplanes and now, because of litigation, anyone who buys a pack of cigarettes sees cancer warnings all over the packaging,” former federal prosecutor Neama Rahmani said in emailed commentary, adding that Wednesday’s verdict could be the start of similar dramatic change.
Legislation ahead?
Although litigation could take time to play out, advocates are already looking at how the decision could accelerate other change.
“It’s been a complete validation of what we’ve been screaming on the tops of roofs about for years,” Julianna Arnold, who founded the nonprofit Parents RISE! following the death of her 17-year-old daughter, Coco, said outside the Los Angeles courthouse Wednesday. “We know this is a long game. We’re headed to DC with the evidence we have in hand and this verdict, and we’re demanding safety protections and legislation to keep kids safe online from our legislators.”
US lawmakers who have pushed for years for more comprehensive online safety legislation are calling on their colleagues to see Wednesday’s decision as a reason to pass it, although such efforts have repeatedly stalled.
“I would urge any member of Congress that continues to do (Meta CEO) Mark Zuckerberg’s bidding to look at this verdict and their conscience,” Sen. Richard Blumenthal said in a statement Wednesday advocating for his Kids Online Safety Act bill.
In the meantime, revelations from these cases could encourage families and even teens themselves to shift how they approach social media.
“We had a perception of reality before that this was just inevitable: ‘What are you going to do? The kids are on it. The technology is here to stay,’” Haidt said. “Now suddenly we’re all saying, ‘Wait, everybody agrees this is harmful for kids. So why are we giving it to kids?’”
CNN’s Lisa Eadicicco contributed to this report