LmCast :: Stay tuned in

The Danger Behind Meta Killing End-to-End Encryption for Instagram DMs

Recorded: March 20, 2026, 10 p.m.

Original Summarized

The Danger Behind Meta Killing End-to-End Encryption for Instagram DMs

By Lily Hay Newman | Security | Mar 20, 2026 6:00 AM

Meta blamed users for not opting into the privacy-protecting feature. Experts fear the move could be the first major domino to fall for end-to-end encryption tech worldwide.

As law enforcement agencies scramble to address threats of terrorism, child sexual abuse, and human trafficking—and repressive governments around the world look to broadly expand their surveillance capabilities—researchers fear that Meta's retreat from its commitments to protect user privacy with end-to-end encryption on Instagram chat could create a problematic precedent in big tech.

Meta spent the better part of a decade working to deploy end-to-end encryption by default across all of its chat apps. It was a saga—fraught with both technical and political hurdles. But in December 2023, the company declared victory, announcing default end-to-end encryption for Messenger and promising that it was in testing to roll out for Instagram Direct Messaging as well. In the end, though, end-to-end encryption only came to Instagram chat as a backwater opt-in feature. And as threats to end-to-end encryption from governments around the world loom larger than ever, Meta quietly announced last week that it intends to eliminate the feature from Instagram chat entirely on May 8.

Crucially, few companies have the scale and stability needed to stake out an influential pro-end-to-end encryption position.
And an even smaller group—namely, Meta and Apple—have made it a priority. Experts say that Meta's decision about Instagram chat could give other companies, or even simply other divisions within Meta, permission to do less, too.

“Meta's deployment of encryption was a public commitment, and they were weathering a lot of pressure from various governments to do it,” says Johns Hopkins cryptographer Matt Green, who has consulted for Meta over the years on its end-to-end encryption rollout as both an unpaid advisor and paid reviewer. “Public commitments to support privacy features are literally the only thing that we the public have. If they’re worthless, then why should we assume we’ll continue to have end-to-end encryption in Messenger and WhatsApp?”

Meta's decision to revoke end-to-end encryption for Instagram chat seems to have been particularly alarming for researchers and privacy advocates because of the company's stated reason for the change: low user adoption.

“Very few people were opting in to end-to-end encrypted messaging in DMs, so we're removing this option from Instagram in the coming months,” a Meta spokesperson told WIRED and other outlets. “Anyone who wants to keep messaging with end-to-end encryption can easily do that on WhatsApp.”

The statement struck many as disingenuous given that Meta emphasized for years that it was committed specifically to default end-to-end encryption, not the opt-in version that ultimately emerged for Instagram chat buried behind layers of menus.

“Designed the feature so nobody could find it, killed it for not being easy enough to find and, therefore, unpopular.
It's deeply cynical,” says Davi Ottenheimer, a longtime security executive and creator of the post-quantum cryptography assessment tool pqprobe.

Johns Hopkins' Green adds, too, that Meta originally rolled out opt-in encryption for Messenger and seemingly learned the lesson about the need for default implementation from low adoption in that trial.

“This is a Meta post where they publicly commit to default encryption in Instagram chat. Then, seemingly without even looking back over it, they add an update to the top that implies that it was optional encryption, and blames lack of opt-in as the reason they need to remove this feature,” Green says. “Nothing about this is honest. They know what they promised.”

WIRED gave Meta multiple opportunities to comment for this story, but the company ultimately declined.

In a key 2019 treatise laying out his vision for privacy and security across Meta's properties, CEO Mark Zuckerberg wrote, “I understand that many people don't think Facebook can or would even want to build this kind of privacy-focused platform—because frankly, we don't currently have a strong reputation for building privacy-protective services, and we've historically focused on tools for more open sharing."
But, he added, “we've repeatedly shown that we can evolve to build the services that people really want, including in private messaging and stories.”

Years later, in a December 2023 background call with WIRED ahead of the announcement that full, default end-to-end encryption was ready for Messenger and Instagram chat, a Meta employee who had worked for years on the project specifically described a phase from 2016 to early 2019 in which engineers were working on an optional encryption feature for Messenger, and then a phase beginning with Zuckerberg's 2019 announcement where the team shifted to implementing default encryption.

Documents from inside Meta released as part of a lawsuit alleging that Meta did not adequately protect underage users from abuse show, though, that implementing end-to-end encryption by default has long been politically fraught within the company as well as outside of it. Reuters reported at the end of February that Meta head of content policy Monika Bickert wrote internally in March 2019 ahead of Zuckerberg's announcement, “We are about to do a bad thing as a company. This is so irresponsible.”

Privacy advocates have long pointed out that while child safety and public safety should always be paramount, child sexual abuse and other crimes still play out daily on chat apps and other digital services that do not offer the universal protection of end-to-end encryption. In other words, adding default end-to-end encryption gives numerous protections to everyone, while eliminating it takes those protections away without solving threats to the most vulnerable.

Meanwhile, Meta's statement about its decision to revoke end-to-end encryption for Instagram chat notably does not mention that the protection is available on Messenger, pointing users instead to WhatsApp. This is, perhaps, a revealing omission.
As Casey Newton pointed out on Monday in a Platformer story, Meta will eliminate the stand-alone Messenger website in April and is in the process of recoupling Messenger with Facebook after a major push in 2014 and 2015 to separate Messenger as a stand-alone product. In 2015 WIRED reported that “Facebook wants to position Messenger as your default chat app.” Now, it seems the product is being sidelined.

Meta is investing in at least one new project that would bring encryption protections to more of its users: a partnership with Signal creator Moxie Marlinspike to deploy his new private AI technology, known as Confer, for Meta AI. The move, announced by Marlinspike this week, could protect millions of conversations people have with Meta's AI chatbot. The collaboration is in an early phase, though, and there are no details yet on exactly how Confer might be integrated.

Some sources tell WIRED that, absent more information from Meta about its Instagram chat decision, the only conclusion they can come to is that the company's commitment to implementing end-to-end encryption by default across its chat platforms was always a ploy to improve its public image and mend user trust in the wake of numerous scandals in which the company mismanaged user data or suffered breaches.

“Encryption was used politically as a shield and a sword,” pqprobe's Ottenheimer says. “The shield was against the post-Cambridge Analytica trust collapse and the sword was against governments who'd been pressuring Meta on safety and content moderation concerns.
Now I guess the privacy brand isn't as valuable, so they just reverse it and blame the users.”

Meta’s decision to eliminate end-to-end encryption for Instagram Direct Messaging represents a significant and concerning development within the broader landscape of privacy-preserving technologies. Meta had spent the better part of a decade pursuing the deployment of end-to-end encryption by default across its messaging platforms, a process fraught with both technical and political challenges. Despite this commitment, the protection ultimately shipped on Instagram Direct only as an opt-in feature buried behind layers of menus, a design that resulted in remarkably low user adoption. Last week Meta announced its intention to remove the feature entirely on May 8, citing this low uptake as the primary justification.

The ramifications of this move extend beyond Meta’s own products; it highlights a critical vulnerability in the tech industry’s approach to privacy. Experts like Davi Ottenheimer, a longtime security executive, call the approach “deeply cynical,” arguing that Meta “designed the feature so nobody could find it, killed it for not being easy enough to find and, therefore, unpopular.” This tactic underscores a perceived lack of genuine commitment to user privacy among large tech corporations, particularly when facing pressure from governments seeking increased surveillance capabilities. Johns Hopkins cryptographer Matt Green, who has advised Meta on its end-to-end encryption rollout, emphasizes that public commitments to privacy features are the only safeguard the public has, and that such commitments become meaningless if they are not consistently upheld.

The primary argument presented by Meta – that the feature was adopted by only a small percentage of users – is viewed with skepticism. The rationale strikes critics as disingenuous, given Meta’s prior emphasis on default end-to-end encryption rather than the opt-in model it ultimately shipped. The shift reinforces a perception that the company is prioritizing business interests over genuine privacy protections. Observers also note a revealing omission: Meta’s statement points users to WhatsApp for end-to-end encrypted messaging without mentioning that the protection remains available on Messenger.

This situation carries significant implications for the broader tech industry. Removing end-to-end encryption from Instagram Direct sets a potentially destabilizing precedent, suggesting that companies may abandon privacy commitments in response to low user adoption or political pressure. Green specifically notes that few companies beyond Meta and Apple have both the scale and the will to stake out an influential pro-encryption position, so Meta’s retreat could give other companies – or even other divisions within Meta – permission to do less. Such public commitments remain strategically important for sustaining user trust and resisting governmental pressure.

Furthermore, the underlying surveillance concerns go beyond user experience. Crimes such as child sexual abuse and human trafficking already play out daily on services that lack universal encryption, so eliminating end-to-end encryption withdraws a critical layer of protection from everyone without solving the threats to the most vulnerable. As privacy advocates have consistently argued, the absence of universal encryption exacerbates existing vulnerabilities in digital communication.

Meta’s decision is further complicated by internal documents released as part of a lawsuit alleging the company failed to adequately protect underage users. While Zuckerberg publicly committed to building a privacy-focused platform in his 2019 treatise, content policy head Monika Bickert wrote internally that same year, “We are about to do a bad thing as a company. This is so irresponsible.” The contrast reveals a long-standing internal debate over the feasibility and desirability of default encryption, and suggests that the public commitment was, in part, a strategic response to scrutiny and governmental pressure.

Ultimately, Meta's retreat from end-to-end encryption for Instagram Direct raises critical questions about the future of privacy in the digital age and the role of large tech companies in safeguarding user data. The move potentially undermines the progress already achieved in promoting secure communication and, should other companies follow the precedent, fuels concerns about the continuing erosion of individual privacy in an era of escalating surveillance.