

Meta and Google have been hit with a $6 million verdict after a jury found the companies designed addictive platform features that harm young users. The ruling intensifies scrutiny of child safety and platform accountability.
At trial, the plaintiff’s lawyers sought to show Meta and Google intentionally targeted kids and made decisions that put profit over safety.
Alphabet’s Google and Meta were found liable for designing platforms that are dangerous for kids and teens. This landmark verdict could force tech firms to rethink how they defend themselves against safety claims. The jury found Meta liable for $4.2 million in damages and Google for $1.8 million, modest sums for two of the world’s most valuable companies.
The case involves a 20-year-old woman, who was a minor when the case began and is known in court by her first name, Kaley. She said she became addicted to Google’s YouTube and Meta’s Instagram at a young age because of their attention-grabbing design. The jury found Google and Meta were negligent in the design of both apps and had failed to warn users about their dangers.
“Today’s verdict is a referendum, from a jury, to an entire industry, that accountability has arrived,” the plaintiff’s lead counsel said in a statement.
Meta disagrees with the verdict, and its lawyers are “evaluating our legal options,” a company spokesperson said. Google plans to appeal, said company spokesperson José Castañeda.
The plaintiffs alleged that some features social media companies built into their platforms, such as an infinitely scrollable feed and video autoplay, are designed to keep people on the apps and have made the products addictive.
Meta’s attorneys emphasized the plaintiff’s difficult home life as a child as the cause of her mental health struggles, while YouTube argued her usage of the streaming platform was minimal.
When asked about Meta’s decision to lift a temporary ban on beauty filters that some inside Meta warned could be harmful to teen girls, Zuckerberg said he decided to let users express themselves.
“I felt like the evidence wasn’t clear enough to support limiting people’s expression,” he said.
The verdict is a “setback” for Meta and Google, said Gil Luria, a technology sector analyst at investment firm D.A. Davidson.
“This process will likely get dragged out through future cases and appeals, but eventually may cause these companies to put in consumer safeguards that may dampen growth,” he said. Snap and TikTok were also defendants in the trial. Both settled with the plaintiff before it began. Terms of the agreements were not disclosed.
Tech giants have faced mounting criticism over child and teen safety, and the debate has now shifted to courts and state governments. The US Congress has declined to pass comprehensive legislation regulating social media, but at least 20 states enacted laws last year governing children’s social media use, according to the nonpartisan National Conference of State Legislatures.
In a separate case, a New Mexico jury on Tuesday found Meta violated state law in a lawsuit brought by the state’s attorney general, who accused the company of misleading users about the safety of Facebook, Instagram, and WhatsApp in ways that enabled child exploitation.
Another state trial is slated to begin in Los Angeles in July, said Matthew Bergman, one of the attorneys leading the cases for the plaintiffs. It will involve Instagram, YouTube, TikTok, and Snapchat.
The verdict could mark a turning point in the global backlash against social media platforms’ perceived mental health harms to youth, more than two decades after the emergence of social media.