LmCast :: Stay tuned in

Meta’s legal defeat could be a victory for children, or a loss for everyone

Recorded: March 28, 2026, 2 p.m.

Original

A jury said Instagram and YouTube are defective — now what?

By Adi Robertson, Senior Editor, Tech & Policy | Mar 28, 2026, 2:00 PM UTC
Image: Cathryn Hutton / The Verge

Adi Robertson is a senior tech and policy editor focused on online platforms and free expression. Adi has covered virtual and augmented reality, the history of computing, and more for The Verge since 2011.

Is social media not just bad, but illegally bad? Should tech companies pay for making it that way? According to two US juries — and no shortage of outside commentary — the answer to both questions is “yes.”

Earlier this week, two juries — one in New Mexico, one in Los Angeles — held Meta liable for a total of hundreds of millions of dollars for harming minors. YouTube was also found liable in Los Angeles, and both companies are appealing their losses. In one sense, the decisions were surprising. Meta and Google operate platforms for transmitting speech and are typically protected in a variety of ways by Section 230 and the First Amendment; it’s unusual for suits to clear these hurdles. In another, it feels inevitable. The web of 2026 has become almost synonymous with a few widely disliked for-profit platforms, and the harm they’ve caused is often tangible — but it’s still far from certain what this defeat will change, and what the collateral damage could be.

If these decisions survive appeal — which isn’t certain — the direct outcome would be multimillion-dollar penalties. Depending on the outcome of several more “bellwether” cases in Los Angeles, a much larger group settlement could be reached down the road.
Even at this early stage, it’s a victory for a legal theory that social media platforms should be treated like defective products — a strategy designed to get around the shield of Section 230, but one that’s often failed in court. “The California case specifically is the first time social media has ever had to face the staredown and judgment of a jury for specific personal injuries,” attorney Carrie Goldberg, who pushed forward major early social media liability suits, including an unsuccessful case against Grindr, told The Verge. “It’s the dawn of a new era.”

For many activists, the overall goal is to make clear that lawsuits will keep piling up if companies don’t change their business practices. What practices? In New Mexico, a jury was swayed by arguments that Meta had made statements misleading users about the safety of its platforms. In LA, the plaintiffs successfully claimed Instagram and YouTube were designed in a way that facilitated social media addiction that harmed a teenage user. Meta and Google (and other nervous companies) could plausibly change specific features or be more cautious in their public statements and disclosures. But each case depends on a set of highly specific circumstances, and there’s no one-size-fits-all answer about what needs to change.

Eric Goldman, a legal blogger and expert on Section 230, sees clear legal danger ahead for social media services. “These rulings indicate that juries are willing to impose major liability on social media providers based on claims of social media addiction,” Goldman wrote after the ruling. In an email to The Verge, he noted the issue was bigger than just juries. “Judges are certainly aware of the controversies around social media,” Goldman said.
In the Los Angeles case and other upcoming bellwether trials, “the judges have not given social media defendants much benefit of the doubt, which is how the plaintiffs’ novel cases were able to reach trials in the first place.” It’s a situation, he says, that “does feel differently compared to a decade ago.”

Goldman pointed out that New York and California have also passed laws banning “addictive” social media feeds for teens — so even if an appeals court reverses the recent decisions, that won’t necessarily turn back the clock.

The best-case outcome of all this has been laid out by people like Julie Angwin, who wrote in The New York Times that companies should be pushed to change “toxic” features like infinite scrolling, beauty filters that encourage body dysmorphia, and algorithms that prioritize “shocking and crude” content. The worst-case scenario falls along the lines of a piece from Mike Masnick at Techdirt, who argued the rulings spell disaster for smaller social networks that could be sued for letting users post and see First Amendment-protected speech under a vague standard of harm. He noted that the New Mexico case hinged partly on arguing that Meta had harmed kids by providing end-to-end encryption in private messaging, creating an incentive to discontinue a feature that protects users’ privacy — and indeed, Meta discontinued end-to-end encryption on Instagram earlier this month.

Blake Reid, a professor at Colorado Law, is more circumspect. “It’s hard right now to forecast what’s going to happen,” Reid told The Verge in an interview. On Bluesky, he noted that companies will likely look for “cold, calculated” ways to avoid legal liability with the minimum possible disruption, not fundamentally rethink their business models. “There are obviously harms here and it’s pretty important that the tort system clocked those harms” in the recent cases, he told The Verge.
“It’s just that what comes in the wake of them is less clear to me.”

While Reid sees legal risks for smaller platforms with fewer resources in these decisions, he’s not convinced they’re more serious than the challenges new entrants already face in a hyper-consolidated online landscape built on massive amounts of data collection. “There are things that make it hard to do something really new in this space that are driven by the sort of marketplace and the surrounding policy,” he said.

Reid, Goldman, and Masnick all warn there’s a clear chance that the fallout could harm marginalized people who use social media to connect. “There will be even stronger pushes to restrict or ban children from social media,” Goldman told The Verge. “This hurts many subpopulations of minors, ranging from LGBTQ teens who will be isolated from communities that can help them navigate their identities to minors on the autism spectrum who can express themselves better online than they can in face-to-face conversations.”

If platforms like Instagram are inherently damaging and directly comparable to gambling or cigarettes, comparisons frequently made by critics, being kicked off would be no great loss. But even research that suggests social media can be harmful for adolescents has associated moderate use with better well-being. Conversely, harmful online content like harassment and eating disorder communities still flourished before recommendation-driven, hyper-optimized modern social media; tinkering with specific algorithmic formulas could have a positive impact, but it’s possible it won’t provide a deep or lasting fix.
The appeal of punishing Meta is obvious — what it will mean for everyone else is much less clear.

Summarized

Meta’s legal defeat represents a significant, though potentially complex, development in the ongoing debate over social media’s impact on young people. Two separate juries, one in New Mexico and one in Los Angeles, held Meta, Instagram’s parent company, liable for harm caused to minors; the Los Angeles jury also found Google’s YouTube liable. These verdicts challenge the protections typically afforded to online platforms by Section 230 of the Communications Decency Act and the First Amendment, and they signal a potential shift in the legal landscape: moving beyond traditional claims of defamation or negligence toward a framework in which social media companies can be held responsible for design choices and public statements that demonstrably contribute to harm, particularly for vulnerable users like teenagers.

The plaintiffs prevailed on different theories in each case. In New Mexico, the jury was swayed by arguments that Meta had made misleading statements about the safety of its platforms. In Los Angeles, the plaintiffs successfully claimed that Instagram and YouTube were architected in ways that encouraged addictive behavior, harming a teenage user, essentially framing the platforms as defective products. The Los Angeles case, in particular, was groundbreaking: it marked the first time social media had been subjected to a jury’s judgment regarding specific personal injuries.

While the immediate outcome involves substantial financial penalties, totaling hundreds of millions of dollars, the broader implications are yet to be fully realized. Meta and Google are both appealing, and the success of those appeals could dramatically alter the trajectory of this litigation. Several more “bellwether” cases are scheduled in Los Angeles, and depending on their outcomes, a much larger group settlement could eventually be reached.

Several legal experts have highlighted the significance of this judicial intervention. Carrie Goldberg, who pushed forward major early social media liability suits, including an unsuccessful case against Grindr, observed that the California case is the first time social media has ever faced a jury’s judgment over specific personal injuries. Eric Goldman, a legal blogger and expert on Section 230, noted that juries are now willing to impose major liability on social media providers based on claims of addiction, a shift that erodes the conventional legal protections afforded to platforms. Goldman also emphasized that judges, well aware of the controversies surrounding social media, have not given defendants much benefit of the doubt, which is how the plaintiffs’ novel cases were able to reach trial in the first place.

The rulings have prompted calls for fundamental changes to social media design and content moderation. Advocates like Julie Angwin have urged companies to eliminate harmful features such as infinite scrolling, beauty filters that promote unrealistic body standards, and algorithms that prioritize provocative content. At the same time, warnings about unintended consequences are being voiced. Mike Masnick of Techdirt argued that the rulings could spell disaster for smaller social networks sued under a vague standard of harm, and noted that the New Mexico case partly faulted Meta for offering end-to-end encryption in private messaging, a privacy-protecting feature Meta has since discontinued on Instagram. Blake Reid, a professor at Colorado Law, cautioned that companies may pursue “cold, calculated” strategies to avoid liability rather than undertaking a genuine overhaul of their business models. Concerns have also been raised that restricting young people’s access to social media could disproportionately harm marginalized communities, such as LGBTQ teens who rely on these platforms for connection and identity exploration.

The legal battles surrounding social media liability also have broader implications for the tech industry, as the risk of similar lawsuits could push other platforms toward greater scrutiny of their content policies and algorithmic design. Notably, New York and California have passed laws banning “addictive” social media feeds for teenagers, so even if an appeals court reverses the recent verdicts, the regulatory pressure will not simply disappear. The situation remains fluid, and the long-term impact on how social media platforms operate, and how they interact with their users, remains to be seen.