In a speech at the UN headquarters in New York, where world leaders are currently gathered for the organisation’s 80th anniversary, Ukrainian president Volodymyr Zelensky warned: “We are now living through the most destructive arms race in human history.”

The proliferation of drone technology combined with the rapid development of AI, Zelensky remarked, could create “dead zones” in the near future. He defined these as areas “stretching for dozens of kilometres where nothing moves, no vehicles, no life. People used to imagine that [scenario] only after a nuclear strike – now it’s [a] drone reality.”

AI could soon enable “swarms” of drones that operate autonomously together in a coordinated manner. Until now this has largely been confined to sci-fi movies, but we are starting to see the beginnings of the technology in real life, including in the Ukrainian military.

For security scholars such as Audrey Kurth Cronin of Carnegie Mellon University in the US, we are now in a time of “open tech innovation”. This is a period in which actors – whether terrorists or criminal groups – no longer need the expertise and resources of a state to orchestrate nefarious acts of disruption and destruction.

Zelensky and Kurth Cronin believe this new age of military technology requires new rules and enhanced global collaboration if the worst-case scenarios are to be avoided. “We need to restore international cooperation – real, working cooperation – for peace and for security,” said Zelensky in his UN speech. “A few years from now might already be too late.”

Days before these remarks, drone activity caused multiple airports in Denmark to close. The country’s defence minister, Troels Lund Poulsen, told a news conference that the “attack” was part of a “systematic operation”. Some reports have suggested that Russia may have been behind these acts.

One of the major concerns among security experts worldwide in recent years has been acts of sabotage that play out below the threshold that would trigger open war. In what is known as “hybrid warfare”, states and criminal groups can deploy a variety of tactics to generate fear and cause disruption.

These acts may be intended for political ends – for instance, by creating discontent with political leaders. They may also be intended to test the systems of security that are important for defending against military action. The incursion of Russian drones into Polish airspace in early September, for example, generated serious debate about how Nato should respond.

These recent events may signal that the world is now in a new age of military-technological insecurity that, as Zelensky warned the UN, is only going to get worse in the years ahead.

Read more: Russian drones over Poland is a serious escalation – here's why the west's response won't worry Putin

Deterring futuristic war

Central to defence policy and strategic thinking is deterrence. Our world is built on strategies that are intended to deter countries or regimes from pursuing certain courses of action. The possession of nuclear weapons, for example, is widely credited with having prevented war between the world’s leading powers for decades.

Deterrence will continue to inform decisions and strategy, even as global events become increasingly chaotic. So much of the debate around what Nato countries should do about the war in Ukraine, for instance, has been informed by questions of deterrence and escalation. Ultimately, direct Nato action has been restricted by the fear that nuclear weapons could be used in a moment of strategic chaos.

Russian president Vladimir Putin has, in a similar way, been careful not to push above the threshold with actions that might lead to a direct confrontation with Nato. Acts that are hard to attribute – such as drone use over airports or cyber-espionage – are ideal for a regime that wants to create disruption but doesn’t want to escalate.

There are three elements that can be developed to prevent escalation and war. The first is deterrence by punishment. This is where an action will result in a response severe enough that the costs outweigh any potential gains.

The second is deterrence by denial, when you make an action too difficult to carry out successfully. The third is deterrence by entanglement. This is when the interconnected nature of society means that an action may be counterproductive or even self-destructive.

All of these elements of deterrence will probably come into play in this new age of drones and AI. There might be technical solutions that limit the extent to which AI-enabled drone swarms become a decisive weapon in future wars. For example, a group of drones was successfully knocked out by a new radio wave weapon in an April 2025 trial by the British Army.

There may also be limits on how far the destructive possibilities of drone swarms are explored, given the need to keep events below the threshold that would lead to war between global powers. While Putin may authorise the use of drones in Ukraine, he may be deterred from unleashing swarms over London by the possibility of escalation, and perhaps even by the threat to Russian-owned property and Russian citizens there.

So, as terrifying as the new age of drone swarms and AI may be, there are good reasons for thinking the dystopian possibilities of future war will be controlled and contained. We should probably expect more frequent disruptive events in the years ahead. Yet hopefully the disruption will be limited to the nuisance of delayed flights.

What is more concerning is the possibility of an accident occurring that tips disruption over the threshold into an open war. The history of war and international politics is rife with accidents and miscalculations. The question now is what accidents will be generated in this new age of AI and drone swarms.

This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Mark Lacy, Lancaster University

Mark Lacy does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.