LmCast :: Stay tuned in

Meta’s reckoning over kids safety is in the hands of two juries

Recorded: March 24, 2026, 4 p.m.

Original Summarized

Meta’s reckoning over kids safety is in the hands of two juries
Unfavorable verdicts could cost it billions.

By Lauren Feiner and Adi Robertson | Mar 24, 2026, 2:56 PM UTC
Image: The Verge | Photo: Bloomberg via Getty Images
Part of: Social media on trial: tech giants face lawsuits over addiction, safety, and mental health

Two juries are currently deliberating a series of cases that could either usher in a legal reckoning for Meta or maintain the status quo in an uphill battle to impose changes or penalties on tech platforms in court.

Yesterday, a New Mexico jury heard closing arguments in a trial where Meta is accused of facilitating child predators on its platforms, allegations the company vehemently denies. And as soon as today, a Los Angeles jury is tentatively expected to reach a verdict in a separate case, which concerns whether Meta and Google should be held liable for making defective products that addicted a young woman. Verdicts against the company could result in damages and civil penalties exceeding $2 billion. Perhaps more significantly, such an outcome could invite more legal action after years of failed or stalled attempts to sue tech companies over alleged harm.

These cases are just the tip of the iceberg for Meta and many other tech platforms, which are set to face several more trials this year. Meta’s products, Facebook and Instagram, have often been at the forefront of criticism over the tech industry’s alleged failure to protect kids online, fueled by leaks from former employees like Frances Haugen.
Meta, meanwhile, argues that harming users is not good for business. “While New Mexico makes sensationalist, irrelevant and distracting arguments, we’re focused on demonstrating our longstanding commitment to supporting young people,” Meta spokesperson Andy Stone told The Verge in a prior statement. He also said the company “strongly disagree[s]” with allegations in the separate set of lawsuits playing out in California and is “confident the evidence will show our longstanding commitment to supporting young people.” The jury in Los Angeles has been deliberating for just over a week, following a five-week trial.

During closing arguments in New Mexico on Monday, Linda Singer, an attorney representing the state, told the jury that Meta has failed to install adequate protections for young people on its services and misled the public about the safety of its products. Throughout the six-week trial, the state presented evidence from Meta’s own internal discussions and state investigators’ undercover operations.

“Meta chooses how to design its algorithm,” Singer said. “When you’re optimizing for a metric, the algorithm takes all of that data to get better. Right now, it’s getting better given that goal of showing engaging content. But Meta could choose to program its algorithm to get better at safety, to get better at integrity, to get better at things that keep kids safe.” While Meta has promoted numerous additional child safety features over the years, Singer compared them to “adding a filter to a cigarette. It doesn’t change the fundamental nature of the product or make it safe.”

Both juries, in New Mexico and California, heard similar evidence, including testimony from a set of former Meta employees, about internal concerns over the platforms’ guardrails, discussions about getting users onto Meta platforms young, and harms the company was allegedly aware of but didn’t take sufficient action to address.
Singer said Meta ignored clear signals that kids under 13 were on its platform, even though the company said they weren’t allowed. One elementary school principal wrote to Instagram head Adam Mosseri that almost all of her students were on the app, she said.

New Mexico attorneys also presented evidence from their own law enforcement investigations, which led to the arrest of three suspected child predators. Investigators used decoy accounts that claimed to be minors to lure suspects, and found the accounts were flooded with new friend requests and sexual chats from adults, even when the decoys repeatedly said in messages that they were minors. The state said the three suspects’ accounts weren’t shut down until after New Mexico announced their arrests, even though Meta’s own systems had allegedly flagged policy violations repeatedly.

In the company’s own closing arguments, Meta attorney Kevin Huff argued that Meta had clearly disclosed the limits of its safety systems and taken action whenever possible, while the state had focused on a “small amount of bad content” and “cherry-picked” statements. “We believe the evidence has shown that Meta works incredibly hard to protect users including teens,” Huff said. He also argued that the state’s investigators used “hacked and stolen accounts” and real people’s images without consent to lure predators, saying they were “not trying to replicate a true teen experience.”

Singer disputed those claims. “I want to be as plain as I can possibly be on this point. This is not a hacked account, this is not an image of an actual adult. It’s an age-regressed image of Mr. Kitch,” Singer said, referring to a New Mexico investigator. Another image used in a decoy account, she said, was AI-generated.
“After all of the evidence you’ve heard about the way that Meta put kids in harm’s way, after the fact that they failed to detect that his 13-year-old account is being chatted with by sex offenders, Meta had the audacity to question whether he placed someone in danger, when the scale of what Meta has done here is astonishing and absolutely contrary to what it has said.”

One key hurdle for the plaintiffs in each of these cases is overcoming Section 230, which protects Meta from liability over third-party content. Singer clarified early in her presentation that “when I say harmful content, I’m not talking about the nature of the content. I’m talking about Meta’s misrepresentations about what it knew about the harmful content that was present and recommended on its platforms.” Huff, conversely, drew the jury’s attention to Section 230 multiple times and said the state’s claim of misrepresentation “doesn’t even get out of the starting gate.”

Singer urged the jury to award the maximum amount in civil penalties if it decides that Meta willfully misled the public about safety and engaged in “unconscionable trade practices” under New Mexico law. If the jury agrees that all teen users in New Mexico were not properly informed of Meta’s risks and awards the maximum of $5,000 apiece, the sum could total more than $2 billion.

Meta’s attorney, Huff, argued the state had presented “zero evidence” that teens were using Instagram because they weren’t informed of the risks and said the calculation of users under 18 was “based on a fake number that doesn’t represent the number of teens in the state.” (The state’s attorney said the count was drawn from Meta’s own numbers.)
“There is no evidence that anyone ever saw any of the 42 misstatements” attributed to Meta among New Mexico’s teen user base, Huff argued, and therefore no reason to grant a penalty for it at all.

Meta’s legal battles over children’s safety are now in the hands of two separate juries, a significant test of the company’s defense against accusations that it inadequately protects young users on its platforms. The first trial, in New Mexico, centers on allegations that Meta facilitated child predators on Instagram and Facebook, a claim the company vehemently denies. Simultaneously, a Los Angeles jury is weighing a separate lawsuit alleging that Meta and Google defectively designed products that fostered addictive behavior in a young woman. The stakes are high: damages and civil penalties could exceed $2 billion, and unfavorable verdicts could open the door to further legal challenges against the tech industry.
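The $2 billion figure is not arbitrary: per the article, it follows from New Mexico's statutory maximum of $5,000 per improperly informed teen user. As a rough sanity check (a sketch using only those two reported figures; the actual teen-user count was disputed at trial), the implied minimum head count works out as:

```python
# Sanity check of the civil-penalty arithmetic reported in the article.
# Assumptions, taken from the article's figures: a $5,000 maximum civil
# penalty per teen user under New Mexico law, and a reported potential
# total of "more than $2 billion".
PENALTY_PER_USER = 5_000          # statutory maximum per user, in dollars
REPORTED_TOTAL = 2_000_000_000    # "more than $2 billion", in dollars

# Minimum number of teen users implied by those two numbers: the total
# only clears $2 billion if at least this many users are counted.
implied_minimum_users = REPORTED_TOTAL // PENALTY_PER_USER
print(implied_minimum_users)  # 400000
```

In other words, the state's theory requires counting at least roughly 400,000 teen users in New Mexico, which is why Huff attacked the user count itself as "a fake number."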

Both cases are fueled by disclosures from former Meta employees, most notably Frances Haugen, whose leaked internal documents highlighted concerns about the platforms' guardrails, efforts to attract young users, and harms the company allegedly knew about but did not sufficiently address. The plaintiffs' arguments center on Meta's alleged misrepresentation of its safety measures and its prioritization of engagement metrics over user well-being. Linda Singer, the lead attorney for the state of New Mexico, argued that Meta's algorithms, designed to maximize engagement, amplified harmful content and failed to adequately protect young users. Singer emphasized that the state's focus wasn't merely on instances of inappropriate content but on Meta's choice not to program its algorithms for safety, integrity, and the protection of minors: the algorithm's objective is to increase engagement, which means surfacing more engaging content regardless of its safety.

Meta's defense, led by attorney Kevin Huff, contested these claims, asserting that the company clearly disclosed the limitations of its safety systems and took action whenever possible. Huff characterized the state's evidence as "cherry-picked," said its investigators used "hacked and stolen accounts" to bait predators, and argued that the state's allegations distorted the facts. He repeatedly invoked Section 230 of the Communications Decency Act, which shields online platforms from liability for user-generated content, and argued that the state's misrepresentation claim lacked evidence. His core message was that Meta works hard to protect users, including teens, and that the state has mischaracterized its record.

Evidence presented at trial included undercover operations by New Mexico law enforcement, which used decoy accounts to lure suspected predators. The state showed that these accounts were flooded with friend requests and sexually explicit messages from adults, even when the decoys repeatedly said they were minors, illustrating what it called Meta's failure to enforce its own age restrictions. The state also alleged systemic failures to detect and act on violations: the suspects' accounts reportedly weren't shut down until after New Mexico announced their arrests. The testimony underscored a clash between Meta's stated commitment to user safety and its internal discussions and operational practices.

Both juries face a crucial question about Section 230's reach. The plaintiffs' path to holding Meta accountable runs through the company's alleged misrepresentations about safety, not liability for the third-party content itself. Huff's strategy, accordingly, was to undercut the misrepresentation claim while emphasizing the protections Meta did provide. The outcomes of these trials will ripple beyond Meta, potentially shaping how other tech companies face similar claims and the broader legal landscape governing online safety and user protection.