“This technology is our future threat,” warns Serhiy Beskrestnov as he inspects a recently captured Russian drone. Unlike traditional weapons, it uses artificial intelligence to locate and strike targets without human intervention.
Beskrestnov, a consultant for Ukraine’s defence forces, has analysed countless drones since the invasion began. This model stands out. It neither sends nor receives signals, making it impossible to jam and invisible to radio-frequency detection.
Both Russian and Ukrainian forces are experimenting with AI on the frontlines. They use it to track enemies, process intelligence, and clear mines faster than ever before.
AI becomes a decisive force on the front
Artificial intelligence has become indispensable for Ukraine’s military. “Our forces receive more than 50,000 video streams from the front each month,” says Deputy Defence Minister Yuriy Myronenko. “AI analyses the footage, identifies threats, and plots them for commanders.”
The technology enables faster decisions, optimises resources, and reduces casualties. Its most striking impact is in unmanned systems. Ukrainian troops now operate drones that lock onto targets and complete their final approach autonomously.
These drones are nearly impossible to jam and extremely difficult to shoot down. Experts predict they will evolve into fully autonomous weapons capable of identifying and destroying targets independently.
Drones that fight independently
“All a soldier needs to do is press a button on a smartphone,” explains Yaroslav Azhnyuk, CEO of Ukrainian tech firm The Fourth Law. “The drone will find its target, drop explosives, assess the damage, and return to base. Piloting skills are not required.”
Azhnyuk believes these drones could greatly strengthen Ukraine’s air defences against Russian long-range drones like the Shaheds. “A computer-guided system can outperform humans,” he says. “It reacts faster, sees more clearly, and moves more precisely.”
Myronenko admits fully autonomous systems are still in development but says Ukraine is close. “We have already integrated parts of this technology into several devices,” he adds. Azhnyuk predicts thousands of these drones could be deployed by the end of 2026.
Progress carries serious risks
Full automation brings dangers. “AI might not distinguish a Ukrainian soldier from a Russian one,” warns Vadym, a defence engineer who requested anonymity. “They often wear identical uniforms.”
Vadym’s company, DevDroid, produces remotely controlled machine guns that use AI to detect and track targets. Automatic firing is disabled to prevent friendly fire. “We could enable it,” he says, “but we need more field experience and feedback before trusting it fully.”
Legal and ethical questions remain. Can AI obey the laws of war? Will it recognise civilians or surrendering soldiers? Myronenko stresses humans must make the final decision, even if AI assists. Yet he warns not all militaries will act responsibly.
The global arms race intensifies
AI is driving a new type of arms race. Traditional defences such as jamming, missiles, and tanks struggle against swarms of intelligent drones.
Ukraine’s “Spider’s Web” operation last June, when 100 drones struck Russian air bases, reportedly relied on AI coordination. Many fear Moscow could replicate the tactic, both on the front and deep inside Ukraine.
President Volodymyr Zelensky told the United Nations that AI is fuelling “the most destructive arms race in human history.” He called for urgent global rules for AI weapons, warning the issue is “as urgent as preventing the spread of nuclear arms.”
