Machines at War: The Morality of Armed Robot Weapon Systems

While we would be right to ban the development of robot weapon systems, the fears over their use may be at least somewhat exaggerated.

Amid all the debate over the morality of drones in warfare, the potential use of armed robots is beginning to pose a much greater concern. Last week a United Nations expert called for a global moratorium on the development and use of armed robots that can kill without human command. No country currently deploys such weapons, but the technology has never been closer to becoming a reality. Some states have committed not to deploy them for the foreseeable future, but depending on future circumstances it is very possible that the robots could be put into battle. Supporters of armed robots cite the advantages they offer: they are not subject to fear and panic, they think faster than humans, and they are not motivated by human desires such as revenge.

Are these weapons going to become part of the new norms of warfare?

Despite these so-called advantages, a moratorium on the use of armed robots is necessary for several reasons. The development of such machines takes warfare to a futuristic level, something already visible in the development of laser weapon systems at sea. Such technological advances are pushing war and the use of force into a whole new realm of possibilities, detached from human control. The most significant question posed by armed robot systems is whether they will make it easier for states to go to war. If they enable states to go to war with relative ease and a reduced fear of human casualties, then armed robots become a very serious threat to the protection and preservation of life during war and peace.

While a ban on robot weapon systems is logical and necessary, the fears over their potential use may be a little overstated. It is not necessarily true that they would make it easier for states to go to war, given all the challenges their deployment would pose. Military technology may be advancing, but military budgets are shrinking under harsh economic realities. The cost of deploying such weapon systems would likely be astronomical, especially at first. Aside from the United States, it is unlikely that any state could afford, or would be willing to fund, the development of armed robots as technology and budgets continue to push in opposite directions. For the same reason, it is unlikely that aggressive states could get their hands on such technology, at least not for some time. The primary threats from aggressive states will likely remain the same for the foreseeable future: terrorism and the possible use of nuclear weapons by states such as Iran and North Korea.

Since World War II, states have resisted using nuclear weapons in warfare, a technology that arguably poses an even greater threat than armed robots. One challenge that cannot be overlooked is whether robot weapon systems would meet the standards of war established by international law. UN expert Christof Heyns recently urged the Human Rights Council in Geneva to set up a high-level panel to assess whether existing international laws are adequate for controlling the use of armed robots. It is doubtful whether robots could distinguish between civilians and combatants or assess proportionality, such as whether the likely harm to civilians during a military action exceeds the military advantage gained by it. It is also unclear who would be held responsible if a robot breached international law.

While the potential threats of armed robots may be somewhat exaggerated under current circumstances, they pose too many dangerous questions with too few answers. It is a relief that some states have agreed not to deploy the robots, at least for now, but they should go a step further and call for an international ban not just on their use but on their development. It is currently unlikely that a rogue state could harness the technology to develop such weapon systems, but the possibility exists in the future. At least among liberal, international-law-abiding states, a ban should be enforced on such high-tech weapon development. Armed robots represent an uncertainty that the modern and future world does not need, and banning their development would do much to dispel that uncertainty before it becomes a more pressing international concern.

About Aaron Willschick

Aaron Willschick is a graduate from the MA program in European, Russian and Eurasian Studies at the University of Toronto’s Munk School of Global Affairs. He also holds an MA degree in political science from York University and a BaH from York University’s Glendon College. His research interests include the European Union, European security and defense policy, NATO enlargement to Eastern Europe and democratization. He has extensive experience in policy and research, having worked as a trade assistant at the U.S. Consulate in Toronto and a research assistant to well-known Canadian author Anna Porter and York University political science professor Heather MacRae. Contact Information: Email: awillschick@rogers.com