Cyber's Role in the Rapid Rise of Digital Authoritarianism
Recorded: Oct. 31, 2025, 1 p.m.
Dark Reading Confidential Episode 11: Enterprise cyber teams are in prime position to push back against our current "Golden Age of Surveillance," according to our guests Ronald Deibert from Citizen Lab and David Greene from the EFF.

Dark Reading Staff, Dark Reading
October 31, 2025

Becky Bracken
Hello and welcome to Dark Reading Confidential, a podcast from the editors of Dark Reading bringing you real-world stories straight from the cyber trenches. Today, we are thrilled to welcome two experts right on the heels of the 10th anniversary of the discovery of the Pegasus zero-click commercial spyware, and amid the current ratcheting up of digital authoritarianism across the globe. We are joined by Ronald Deibert, professor of political science and director of the Citizen Lab at the University of Toronto; David Greene, senior staff attorney and civil liberties director at the Electronic Frontier Foundation (EFF); and Alex Culafi, a reporter extraordinaire for Dark Reading who has been covering this topic very deeply for quite some time. Welcome to all of you. Thank you for joining us.

Great, I'm going to get out of the way, hand this over to Alex, and enjoy this conversation.

Alex Culafi
Well, thanks, Becky, and thanks, Ron and David, for joining me today.
I want to start by using Pegasus as a jumping-off point, because we're at nearly 10 years since Citizen Lab brought the NSO Group's Pegasus spyware to light, capturing it and showing how commercial organizations sell surveillance software to countries to spy on dissident individuals as well as organizations. Since then, Citizen Lab and the EFF have brought to light many, many instances of countries using commercially sourced spyware.

Since the days of Pegasus, has the problem of commercial spyware gotten better, worse, or somewhere in between? Ron, I'll start with you.

Ron Deibert
Sure, well, thank you for having me on. It's a pleasure to be here. Yeah, we're getting up to the 10-year anniversary of that case where we first discovered the exploits that were used to implant Pegasus spyware on a victim's phone and actually recovered a copy of Pegasus at the same time. I think it's important to point out, though, that we, along with the EFF, had actually been tracking mercenary surveillance companies several years before that, including two Western European mercenary spyware firms, Gamma Group and Hacking Team, going back to like 2011, 2012. This is a market that began roughly around the time of the Arab Spring and the introduction of smartphones. And it makes total sense. I mean, at that time, people started entrusting a lot of their personal information to devices, and there was a need for law enforcement and intelligence to get inside those devices. This market began to emerge around that time, roughly coinciding with the observation a lot of autocrats made that people were becoming empowered by digital technologies, using social media and smartphones to organize, and they wanted a way to neutralize it.

All of this led to this burgeoning marketplace. How things are going, I'd say it's a bit of a mixed picture, actually. On the one hand, since 10 years ago, let's say, when groups like Citizen Lab, EFF, and maybe one or two others were orbiting around this topic, the community of people focused on it now is much larger, much more robust, much more professionalized. We have not only technologists looking into the issue, but people interested in advocacy, policy promotion, and litigation. David will probably explain that there are many, many cases of litigation going on right now: victims suing governments who use spyware to hack into their phones, or suing the vendors of the technology that's used by governments to hack and harass them. And I think that's all, generally speaking, very positive.

There are also some regulatory developments, especially under the Biden administration, that brought some punishments to the firms, putting them on sanctions lists, including some of the principal owners of those companies. And of course, President Biden's executive order prohibiting 18 federal US intelligence agencies from procuring spyware that's used and abused in human rights violations around the world.

That's on the positive side of the ledger. On the negative side, I think we really do have to begin with developments in the United States. The descent into a kind of techno-fascism in the United States is sending a very large signal, not only domestically in that country, but around the world.
The United States has entered into a contract with a firm called Paragon, which is a competitor to NSO Group. Paragon at one point pitched itself as offering an ethical version of spyware. But as we discovered earlier this year, its technology was being used in Italy to hack the phones of migrant support workers and journalists. And to see that ICE now has a contract with Paragon is certainly something everybody should be worried about. But more broadly, just the descent into authoritarianism in the United States sends a very bad signal for the rest of the world. Consider the attacks on nonprofits and philanthropies: a lot of organizations in this space depend on that support, and to the extent that it is under attack, it will affect groups around the world, as did the pulling of funding from organizations like the National Endowment for Democracy or Freedom House. And I think now there's just so much more data available to be feasted upon, not by spyware companies per se, but by the whole surveillance matrix that private sector companies operate in, especially things like location tracking, advertising intelligence, social media analysis, and surveillance. There's like a golden age of surveillance right now, and I fear that whatever progress was made over the last four years in trying to rein in that market is now quickly being reversed. So a bit of a mixed picture is how I would see it.

David Greene
I agree with that. I don't know how much I have to add. I would say, from the activists' point of view, or the point of view of the targets, I do think that we, the human rights community more broadly, have been pretty good at educating people about spyware. And so we do have more savvy human rights defenders, as well as others who are frequently targeted, with better hygiene on their devices.

At the same time, spyware technology continues to evolve. That makes it really hard to detect and prevent. With the advent of zero-click exploits, where someone doesn't even have to open an attachment or even open a message, it just makes it harder even for someone who's being very careful to avoid becoming a victim of spyware. I agree with Ron. I see both steps forward as well as steps back.

And it also just seems that the demand side from both democratic and non-democratic governments for this technology is so great that we seem to be seeing so many new players in the market. It's hard to talk about this just in terms of being one or two companies anymore.

Alex Culafi
I want to widen the focus a little bit, because when we think about digital authoritarianism, techno-fascism, the first thing folks think of, with good reason, is spyware companies, and countries using spyware to spy on citizens, dissidents, what have you. But it's a much larger, more complex web, right? So you have things like the Chinese National Security Law, which enables the country to force organizations to share user data back with the state for national security reasons. What are some of the other ways, not just China, that enterprises and organizations can be swept up in digital authoritarianism, be it as victims or even as aggressors? And David, I'll start with you on this one.

David Greene
My habit whenever someone mentions the Chinese law is to point out that it's actually not that unusual. The US has a law that's not dissimilar.
We have a system of national security letters that essentially requires online services to provide user information the same way, often accompanied by gag orders where they're not permitted to notify the users that their information has been requested and produced, and that really limit how they can even talk about the fact that they've received these things except in sort of the largest categories. So I don't think China is exceptional in any way about that. We see it in the US, and we see it actually in most legal systems, even the ones that consider themselves the most democratic. The idea that governments want people's data is not something that is confined to any one place. It's widespread common practice, which should cause us all great concern. And again, this feeds into the idea that there's a huge market for these spyware tools at the national security level as well as with routine law enforcement. One of the things that seemed to be going in the right direction with some of the Biden administration actions in the US was at least trying to reestablish a norm that these [spyware tools] are bad, that they result in significant harm to the human rights of those who are victimized by them, and that if they're going to be used, there should either be legal process around them or they should really be reserved for exceptional circumstances. But we don't see that as a widespread norm globally. And spyware is just an incredibly powerful tool, to the point where I think we're a few short steps away from it actually moving from the higher levels of national government down to routine policing.

Ron Deibert
Yeah, I would just add to what David said, all of which I agree with, that it's a good time to remind folks that we live in a digital ecosystem that is invasive by design, principally because of the business model, which revolves around gathering personal data and undertaking surveillance of users' data in order to direct advertisements at them. And that means that it's more than just a spyware problem. We have a whole environment of tech platforms and applications that routinely gather a lot of rich data about people's habits, social relationships, movements, personal preferences, and so on. And that's entrusted principally to the private sector. And the question we're asking is: under what conditions can the state have access to it? And as David rightfully pointed out, China is not the only country in the world. In fact, most countries have some kind of desire to get access to that data. The question is under what conditions. So here is where we need to talk about what guardrails exist, what type of oversight exists. Typically you could distinguish China from liberal democratic countries on the basis of robust oversight mechanisms and checks and balances. And a lot of us used to point to the United States and say, you have oversight, you have, you know, congressional hearings, you have various forms of third-party watchdogs and so forth. Well, we can't really say that any longer when it comes to the United States. And this is why I think the descent into authoritarianism that is happening before our eyes is so very serious, because when the United States goes down that path, it effectively legitimizes what other countries are doing or want to be doing in this area as well. It kind of lowers the bar for everyone.

And I think it's now more serious than ever in the United States when combined with a couple of things.
First of all, you had DOGE go through and access all of this once-segregated data that governments have access to and essentially synthesize it, at the same time that Palantir, a data fusion company, acquired contracts with the US government, including local law enforcement, to be able to fuse all of that data together. And this is a company that's used to help law enforcement and intelligence. And of course, we're witnessing what the other end of this looks like in the United States, I believe: when you see vans with masked ICE agents screeching to a halt on a street corner and nabbing somebody, it's likely because they've been tipped off by the technologies that the government now has available to it, which are only going to get more refined. This question that we used to think about in relation to authoritarian countries like China has now really come home to roost in a big way. I think that's a reckoning that Americans will have to make, and hopefully they will be able to restore some semblance of what makes that country unique, which as I understand it is the division of powers, which is quickly evaporating.

Alex Culafi
To your point, these levers [to surveil citizens] exist in many countries around the world, and these levers are being abused both in the US and outside of the US, et cetera. So my question is about the enterprise, public or private. What responsibility do they have, in a world where these levers exist, to keep user data safe, to keep trade secrets safe, to keep that data from getting outside of their own clutches? And I'm not saying this as a way to limit their responsibility or to say they don't have any responsibility. Quite the opposite. I'm saying, OK, let's definitely hold companies accountable when they either roll over very quickly or don't show the protective hygiene that they should. But how do we do that, and how do we hold them accountable? Ron, I'll start with you.

Ron Deibert
Sure, yeah. That's a very good question. It kind of gets at what we were discussing before in a more targeted manner, focusing on the responsibilities of businesses. First and foremost, I think we have to recognize that businesses are businesses, and however much they may want to speak the language of corporate social responsibility, they're likely only going to go so far; ultimately, they're taking their cue from revenues, the bottom line, and what their shareholders say to them. When it is a sound business decision to make all sorts of corporate responsibility promises, they will. But if there's nothing holding their feet to the fire and it's not good for business, they probably won't. And that's why we have witnessed most of the big US-based tech platforms, from Apple to Microsoft to Google and others, essentially genuflecting before Trump. I mean, what more can you say about the fact that Tim Cook of Apple actually bestowed a gold-plated gift of some sort on the president? I mean, this is obscene, really, but more seriously, it's a sign of where we're at right now. In terms of which responsibilities they have, I think there's lots of guidance here. There's the guidance that comes from the UN on responsible human rights practices that businesses should follow. These have been well established for many years. I think that's good, but those are voluntary. What we need ultimately is some sort of government regulation in this space. And that's really what's been notably absent, generally, in the tech sector for many decades.
There's been a mythology around the Internet, generally speaking, that we should leave the government out to allow innovation to proceed unencumbered and this will benefit everyone. Well, that mythology has been punctured many times over.

I think we all need to recognize now that there needs to be proper, good governance. That means government regulations that require companies to do certain things and not other things, and to be held accountable for it. And those laws need to be basically harmonized across the industrialized world and the rest of the world, as much as you possibly can, so that corporations operate within a responsible legal and regulatory environment.

David Greene
Yeah, in a perfect world, companies wouldn't collect and retain user information except to the extent that it was absolutely necessary for their operation, so that when there were government requests, or even malicious spyware, there wasn't a honeypot of information that could even be obtained. For many companies, though, their business models depend not only on retaining the information they do collect; there are actually tremendous incentives to collect as much information as they possibly can. And we benefit from that in the form of free services, but there are tremendous costs.

I agree that in the US we don't have comprehensive data privacy regulation. It's handled insufficiently on a state-by-state level, and very few states have actually put in strong privacy protections that limit the ability of companies to collect data. I think data privacy regulation could go a long way. It's not going to be a complete answer. I think that having some responsibility to do human rights assessments, and to have some type of human rights compliance as some national laws require, would go some way toward actually having some enforced accountability against companies. But what we're left with as consumers is just our own ability, and a really weak ability at that, to pressure companies: to tell them we're unhappy, to urge them to protect our data, and to try to call them out when they don't. At EFF, for many years we had a project called Who Has Your Back, which tried to make public what companies' actual policies were with respect to responding to government and law enforcement requests for user information. And a lot of that was really just trying to make it so users could line companies up next to each other and see if one maybe had better practices than another. But we are limited by the fact that these companies have tremendous financial incentives to collect and retain and share and monetize user information, and that is a real, real rich source of information for governments.

Alex Culafi
You say, for many reasons, that a business is going to act with its own interests in mind first. And it makes sense to me that both of you went to regulations as maybe the first line of defense in a world where this generally tends to be the case. But let's say I'm a CISO, I'm some kind of decision maker in a business, and I am open to, or interested or invested in, the idea of resisting techno-fascism or digital authoritarianism or what have you, reasonably and practically, with my customers and maybe stakeholders in mind. How do I resist these levers and the forces that be in the governments I operate under?

David Greene
Yeah, actually, you know, as Ron mentions, there are lots of models for human rights compliance among businesses. And I think actually each industry might have its own model.
So there certainly are things out there, and there are really several ways a company can approach this. One, from the tech side, is really just trying to button down their systems as much as possible and value security, really, really value it, make it a priority, and not roll out new features until you're actually sure that they're going to be safe and secure. We see that as a huge problem, where new features get rolled out that have vulnerabilities within them and then they try to mop up afterwards. Companies could also have less proprietary vulnerability identification programs, where they more strongly encourage people to find vulnerabilities in their systems and report them, and treat those not as attacks but rather as assistance. That would help greatly, if companies really valued that. Some companies are better than others in terms of welcoming outsiders.

There's that aspect, and then there's the aspect of what your policies are for when you receive a law enforcement request, or even when you determine that you're infected by malware. If we're looking at law enforcement requests, your policy should be, first of all, that you will say no, and that you will, at a minimum, give your users, the person who's targeted, the opportunity to go to court and try to challenge that request. That means not complying initially, that means notifying users, and it means actually supporting them to the best extent that you can. This means that if there's a regulation or a law or a part of the request that says you're not allowed to tell them, you legally challenge that, because it's tremendously important to you to be able to notify your users.

Ron Deibert
Yeah, I would agree 100% with everything David said there. Let me add just an anecdote about the type of thing that companies can do more of, and we should applaud them and recognize them when they do. Starting a few years ago, Apple began notifying customers it became aware had their devices hacked with Pegasus or other spyware.

And this was something that they didn't necessarily need to do. It's arguably not even a very good business decision for them. In fact, it could present an entirely opposite perspective for people who were thinking about it in business terms. In essence, they were doing something I believe is in the public interest.

That was a very commendable step, because for groups like ours, which investigate attacks on civil society and look at this from the victim perspective, we were seeing, first of all, a lot of people who otherwise would have no clue that this was going on being notified by their technology provider. And then they were able to do something about it. And for a group like mine, it was essentially doing enormous global triage. They were shaking a tree, and victims were falling to the ground like fruit, and we could come along and, one way or another, find out who was getting these notifications. The Italian case that I mentioned from earlier this year is a prime example of that. In fact, two companies sent out notifications there, WhatsApp and Apple, and different Italians received those notifications. Eventually they made their way to us, and we did forensic analysis showing that their devices were hacked. Apple also introduced a feature called Lockdown Mode. This is a feature that they came up with that was meant to protect high-risk users. Looked at as a business decision, this is definitely not the greatest, because what it does is essentially degrade the performance of your iPhone.
It does that to reduce the attack surface. However, I've had Lockdown Mode on since the feature was introduced, and I really have gotten along fine with it on. It's not a big encumbrance.

We have yet to see a single instance of a victim's phone being infected with spyware while Lockdown Mode was enabled. So these are not self-interested measures. They're measures designed to help their customer base, broadly speaking, which is one step removed from a sort of naked commercial move. It's more about doing something that is going to reduce harm. And I think that should be commended and applauded. And we'd like to see more of that sort of thing happening. And as David said, there are all sorts of guidelines. Basically, you start with: what does it look like to operate a business in this space and try, as much as possible, to follow international human rights law standards? And then you go from there, and it leads to everything on down to what David so aptly described in terms of how you approach things like national security letters or requests for user information. One more thing I'll add: it's very useful, though less common now than it was a few years ago, when companies produce transparency reports. When they show, hey, we're getting government requests, even if they're not allowed to show fine-grained details, if they can give a general rubric of what's happening, like a barometer of the type of pressures they're facing in different countries, this is useful for the public, and it's useful for public interest researchers as well. I'd like to see more of that.

Alex Culafi
The EFF is 35 years old; Citizen Lab is 24 years old. And I know both of you, in various forms, have been around talking about digital rights, advocating for digital rights, for what I believe is like 30 years at this point in both of your cases. Speaking of the last 25, 30 years, what's the state of things right now overall? What's gotten better? What's gotten worse over the years? And Ron, we'll start with you.

Ron Deibert
Well, thank you for that. It's not that I necessarily wanted to be reminded about how long I've been in this space, but that's okay. [laughs] You know what? It's, again, a mixed picture. I would say that, you know, I've been really heartened over the course of my career by the growth of the community of which I'm a part.

There was a time when it felt like we were mostly alone doing this type of investigative sleuthing that we do at the Citizen Lab. And now there are many, many organizations involved in this. And we have a really robust set of mature exchanges, very professional. Everyone's trying to coordinate as best they can. And we've had some great successes. So I'm very pleased to see the growth and maturity of, broadly speaking, the community. But on the flip side, I think we have to soberly acknowledge that things have gotten much, much worse in a lot of ways. I mean, I could pick out a few technical things that have happened, like the spread of end-to-end encryption. That's, you know, genuinely a good thing. But broadly speaking, I kind of feel like someone who's been warning about the climate crisis for the last several decades while it just progressively gets worse. And I think social media accelerated a lot of the bad problems, unfortunately. The spread of authoritarianism is now well-documented: 19 years of democratic decline, as defined by Freedom House. You know, this is bad. This is very bad.
And I think for those of us who care about rights and liberties and checks and balances against despotic rule, we're in a very dark place right now. And we need to recognize, much like with the climate crisis, that we're in an emergency. The luxuries that we have to be able to speak freely, to organize freely, to assemble, to vote freely: all of these things are not set in stone. They're social constructs. They depend on us protecting and stewarding them. And right now, they're at great risk, and that affects the digital space, the cyber space, as well.

I think all of us who care about that environment now need to wake people up to the fact that this is a dire time and we need to raise the alarm.

David Greene
Yeah, I mean, digital rights are really just human rights as they apply to the use of digital communications technologies. And so where human rights are at risk, digital rights are going to be at risk. We can't separate the two. And we're at a period of time where human rights are greatly at risk. We're not unique in human rights being at risk; we actually often identify them as rights because they get challenged and people try to deny them. But it does seem like some of the, or not some, many, many of the institutions we relied on as bulwarks in support of human rights have either collapsed or are faltering. We are seeing challenges to press freedom and an erosion of norms regarding press freedoms more broadly. It's a scary time for all human rights right now, and because of that, it's a scary time for digital rights. I think one of the ways you see it being exacerbated in the digital rights space is in how we, collectively as a society, concentrated a lot of power and decision-making into the very few corporations that control how we use digital communications technologies. Sometimes they exercise that control benignly, or at least there were benign consequences, but sometimes not, or many times not. If I look for bright spots, I think there are bright spots as well. One of the things that I've seen happen in the time I've been doing this is having this conversation really move from being very US- and Europe-focused to being a more global conversation. So for example, the digital rights dialogue that's coming out of Africa right now is incredible. It's advanced, it's intelligent, it's so much further-thinking than what we saw when these discussions were happening based in the US and Europe. Africa has the most internet users in the world and has the youngest population of internet users. It's exactly the place where we should be seeing this innovation happening, and we see a lot of it, and that's a big bright spot. Now, we still face the same challenges, and again, because we are seeing a degradation of human rights generally, we're going to see a degradation of digital rights. But we can see some bright spots, where we're not dependent on the particular experience of Western democracies, as it were, in order to try to address issues as they get identified.

Alex Culafi
For this last question: yes, human rights are at risk. There are also bright spots, but we should highlight the emergency that's happening with the way digital human rights are being treated. If I'm someone listening to this podcast, statistically, you know, it's not going to be a CISO or a CEO of an enterprise.
It's going to be a practitioner or someone interested in cybersecurity, or someone interested in what you guys do, who finds it through whatever podcatcher, social search engine, whatever. What should I do next? What should I do to learn more? What's the next action I should take?

David Greene
My first response is always individual threat modeling, and so I'll start there by saying that how we respond really depends on what our own threat models are. Some of us have the luxury of being able to respond by saying things like, "I'm going to read up on as much of this stuff as I possibly can and become a knowledgeable consumer. I'm going to support organizations that are doing work to protect digital rights and human rights broadly. I'm going to help my friends, advise them." So that's great, and I really urge people to be able to do that. There's another level of people, though, who I think are actively at risk of being exploited, and how they might respond to this as individuals is far different, right? They're going to want to actually evaluate their own individual risks, you know, to make sure their own devices are secured, and to identify and look for ways that they can continue to do their individual work without being vulnerable to being oppressed by governments or even powerful private actors. That should be a positive message. There are things people can do, right? But if you are someone who has a low individual threat model, then it's really worth understanding how the groups that you think are doing really important work might have different threat models.

Ron Deibert
That's a great answer. I wouldn't say anything much different. I'll just add a couple of resources that I often point people to. One is EFF's own Surveillance Self-Defense guide. It gives a lot of good recommendations on how to protect yourself against the type of surveillance that we've been speaking about broadly. And there's another resource called Security Planner that the Citizen Lab developed and passed on to Consumer Reports.

Most people are familiar with Consumer Reports for their ratings of electronics and appliances. Well, they have this great portal called Security Planner, which we helped develop and which gives people personalized recommendations on how to improve their digital hygiene. And that's not necessarily meant for high-risk users. As David mentioned, everyone has to approach their own threat model in their own unique way. But it is useful for the general public, let's call it, for the average person, just to take a few steps to make sure that you're capturing the low-hanging fruit. And then for high-risk users, especially in the United States right now, where I think the challenges are pretty unprecedented in a lot of ways: my colleague Micah Lee just wrote a very interesting blog post called Practical Defenses Against Technofascism. If you Google that, you'll see he's got a great set of recommendations on steps people can take to protect themselves, especially if they're mobilizing against what's going on in the United States. I'd highly recommend that resource.

Alex Culafi
Thanks, guys, and with that I'll pass it back to you, Becky.

Becky Bracken
What an incredible discussion, and I want to thank you all. Our goal here at Dark Reading is to provide access to the best experts and the greatest thinkers on these topics to our audience of cyber professionals who might not otherwise have access to you.
And so we're so grateful that you were able to share such detailed and practical advice for our audience, as well as your decades of combined expertise. So thank you. Thank you so much.

Alex Culafi, another wonderful report straight from the trenches. So thank you, David Greene, Ron Deibert. Dark Reading is very grateful. This has been Dark Reading Confidential, a podcast from the editors of Dark Reading bringing you real-world stories straight from the cyber trenches. Thank you for joining. I'm Becky Bracken, and we'll see you next time. Bye bye.

About the Author
Dark Reading Staff, Dark Reading
Dark Reading is a leading cybersecurity media site.