Ukraine’s War Brings Autonomous Weapons to the Front Lines

Drones that can find their own targets already exist, making machine-versus-machine conflict just a software update away.
Ukrainian firefighters work on a destroyed building after a drone attack in Kyiv on October 17, 2022, amid the Russian invasion of Ukraine. Photograph: YASUYOSHI CHIBA/Getty Images

When the war came to Sergiy Sotnychenko’s neighborhood in March 2022, he found himself carrying out daily performances for the drones that hummed constantly overhead. Desperate to prove that he wasn’t a combatant, he put on an orange hoodie, which, of all the clothing he owned, seemed least likely to be mistaken for military fatigues. He tried to show the drones he was carrying out innocent activities, like planting onions. Sometimes he would wave.

That March was a nightmarishly violent month for Kyiv’s outskirts, including Irpin, where Sotnychenko lives, but there were moments when he allowed himself to feel comforted by the drones flying above. He imagined the Ukrainian army watching his small acts of resistance. “I felt reassured because I felt that I wanted to show them that we are holding out,” he says, speaking through a translator provided by the Museum of Civilian Voices, a project documenting ordinary people’s experience of the conflict in Ukraine.

But when Sotnychenko watched a Russian armored personnel vehicle drive through Irpin, shooting indiscriminately at the houses around him, he realized there was no way the drones were on his side. “I started hiding from all drones,” he says. “Sometimes I hid under trees or behind the branches. Sometimes, I managed to escape into my basement.” When a drone appeared above Sotnychenko and his 77-year-old mother as they tried to escape Irpin, they ran from it, certain it would kill them. 

The way Sotnychenko’s perception of drones was transformed over that month, from ally to enemy, echoes a shift that has taken place for civilians across Ukraine. At the start of the war, Turkish-made Bayraktar drones became a symbol of Ukraine’s resistance. But as the war edged toward its second year, Ukraine’s successes were eclipsed by Russian bombardments of Iranian-made kamikaze drones, used to target energy infrastructure and plunge parts of the country into darkness.

The war in Ukraine is the first large-scale conflict to see widespread use of drones on both sides. That has made it a crucible of innovation, as invader and defender alike experiment with and refine their technologies and tactics. But experts now caution that the proliferation of unmanned aerial vehicles is driving militaries—in Ukraine and beyond—to hand over more and more control to artificial intelligence, and ultimately to move toward systems that can operate on the battlefield without human involvement.

“The massive use of drones in the war in Ukraine is pushing for more AI-guided weapon systems,” says Wim Zwijnenburg, project leader in humanitarian disarmament at PAX, a Dutch organization that campaigns to end armed violence. This, he warns, would create a slippery slope. “Justification for defensive purposes can easily change into offensive use when the genie is out of the bottle.” 

In the early days of Russia’s invasion, drones were mostly used as surveillance tools, like the ones Sotnychenko saw above Irpin. Russian forces used Orlan-10 fixed-wing drones to monitor troop movements and assess artillery damage. But it was Ukraine’s use of the Bayraktar TB2, made by the Turkish company Baykar, that transformed public perceptions of drone warfare. 

The Bayraktars were able to exploit gaps in Russia’s air defense to attack tank and truck convoys, but they were also a potent propaganda tool. The video game-style footage they produced—close enough to show the damage they were doing but far enough away to spare observers the sight of deaths and injuries—seemed made for social media and helped show that Ukraine was capable of driving the invaders back.

The Bayraktar became a symbol of resistance. A song was written about it. A lemur in Kyiv Zoo was named in its honor. But as Russia adapted its air defenses, videos demonstrating the TB2’s effectiveness dried up. Instead, the dominant force in drone warfare became the Shahed-136, an Iranian-made suicide drone that Russia started using in September to destroy Ukraine’s energy infrastructure. In December, drone attacks left Odessa, a city of 1 million people, almost entirely without power.

On the front lines, soldiers on both sides spying on one another’s trenches have often eschewed large, expensive military drones, opting instead for cheaper commercial models. Companies like DJI have said they don’t want their products used in the war. But Ukrainian soldiers kept using them anyway, receiving them in bulk through an informal supply chain of donations and European volunteers. Russia got its DJI shipments sent to the front via China or the Gulf.

These drones have mostly been used as spotters, but some were also modified to carry weapons. This wasn’t new—the Islamic State had also used drones to drop grenades in Iraq—but Ukraine started doing this on a professional, unprecedented scale, says Zwijnenburg, moving from jury-rigging devices to drop hand grenades to 3D-printing parts that turn ordinary commercial drones into weapons. 

The sheer volume of drones in action along the front line is pushing militaries toward automation, Zwijnenburg says. The more UAVs in the air at once, the less likely it is that humans can defend against them without AI. “We are really concerned this can be used to justify rapid deployment of artificial intelligence in weapon systems,” he says. 

There are no publicly agreed-upon norms around the use of weapons that can find and attack their own targets. And although a UN-convened group of experts agreed on a set of principles around lethal autonomous weapons in 2019, these are not legally binding. Arguments in favor of these systems, meanwhile, center on the ways they might spare the lives of soldiers and reduce collateral damage. “One of the stupid pitches I hear from the defense industry is that robots wouldn’t torture a human being, or robots can’t rape,” says Zwijnenburg. “But that totally depends on who’s doing the programming.”

For Ukraine, which is fighting an existential war, concerns about the long-term consequences of automated weaponry feel abstract. Mykhailo Fedorov, the country’s minister of digital transformation, has described the development of autonomous drones as “logical and inevitable.”

“We will do everything to make unmanned technologies develop even faster,” he said on Twitter. In January, he estimated that armed autonomous drones could be within Ukraine’s grasp in the next six months.

The defense industry is ready to supply them. “We are entering the new era of the machine-versus-machine battlefield,” says Johannes Pinl, CEO and founder of Monaco-based defense company MARSS, which is building an autonomous drone defense system designed to target the Shahed kamikaze drones.

He thinks Russia is already using the Iranian drones autonomously (although weapons experts who spoke to WIRED say they don’t think there’s enough evidence to support this claim), arguing that it’s why Ukraine needs to fight back with autonomous systems like his. Machines make decisions in milliseconds, he says. Humans take minutes. 

MARSS’ new anti-drone system, which is currently being tested in the UK and Middle East, targets incoming drones in several ways. Step one is trying to jam the drone’s GPS—although Shaheds may have their targets preprogrammed, meaning there’s no signal to jam. If that fails, the system can release an autonomous interceptor drone that is designed to crash into the incoming UAV. Pinl says MARSS has already supplied several systems to Ukraine.

Automating machine-versus-machine conflicts is not quite the same as allowing artificial intelligence to make decisions that result in the death of a human being. But the technology to do that is already in the field. 

Ukraine is already using US-designed Switchblade drones—small, flying explosives that loiter over a vehicle before dropping on it—that are capable of identifying targets using algorithms. 

“From a technical standpoint, it is possible to build in additional autonomous capabilities but that would be dependent on customer requirements,” says Cindy Jacobson, spokesperson for AeroVironment, the company that produces the drones.

Russia has also been experimenting with autonomous weapons systems, according to Samuel Bendett, a Russia analyst at the Center for Naval Analyses, a think tank. Promotional materials for the Lancet and KUB kamikaze drones, released by their manufacturer, Kalashnikov, suggest they are capable of operating autonomously.

The decision to keep human operators involved in targeting decisions is based more on principle than technological necessity, according to Ingvild Bode, associate professor at the Center for War Studies at the University of Southern Denmark. “There has been a creeping, slow integration of more and more of these autonomous or AI-based technologies,” she says.

“It’s essentially just a software change that could allow them to be used without human control,” says Catherine Connolly, the automated decision research manager at campaign group Stop Killer Robots. “It’s leading people to recognize that these systems are here and now; it’s not theoretical.”

This evolution probably means more chaos in the skies for Ukrainians. For Sotnychenko, who is back in Irpin, the noise of drones is now burned into his memory. He says he recently mistook the sound of a generator for a drone flying overhead. “My head was up in the sky looking for drones,” he says. “When I realized it was just a generator, I calmed down. But it really frightened me.” He uses an app on his phone to alert him to incoming Shaheds. “For me,” he says, “drones are now the birds bringing death.”