Machines that can make their own decisions, known as autonomous systems, raise ethical concerns, especially now that anyone with computer experience and the right tools can build a drone that can be weaponized.
Weaponized drones have dropped explosives on U.S. troops, shut down airports, and been employed in assassination attempts. Azerbaijan recently used its Turkish-made Bayraktar TB2 unmanned combat aerial vehicles (UCAVs) to great effect in its conflict with Armenia. And a United Nations report says that a deadly drone "hunted down" a human target without being instructed to do so.
The autonomous systems now being developed will make staging such attacks easier and more devastating. For those who would kill political enemies, the attraction is the drone's agility and its ability to deliver a precisely targeted attack. Would-be assassins have already shown a willingness to use exotic weapons: Kim Jong Un is thought to have had his half-brother assassinated with VX nerve agent in 2017. A year later, evidence emerged that Russia may have used a Novichok chemical agent in a failed attempt to assassinate a former Russian spy and his daughter in England. And in 2020, the U.S. intelligence community linked the Russian government to the attempted assassination of Russian dissident Aleksei Navalny, also with a Novichok agent.
Natasha Bajema, the director of the Converging Risks Lab at the Council on Strategic Risks, in Washington, D.C., has written, "Technologists and engineers who work on drones need to be aware when they develop applications that might be weaponized and exploited for deadly effect. And policymakers and military strategists need to be equally vigilant in defending against a highly agile new threat whose use has, thankfully, been limited to date but whose potential for danger will continue to increase as commercial, off-the-shelf drone technologies mature and proliferate."