But using robots to perform combat missions, including the use of deadly force, raises a number of technical and ethical questions. A campaign to ban autonomous killer robots has already been mounted, with the goal of a United Nations treaty forbidding their use.
The military has been killing the enemy by remote control for over ten years. Predator drones (2) equipped with Hellfire missiles have proven useful for surgically taking out terrorists in remote locations. Drone operators remain safely thousands of miles from the combat zone, fly the drones to the target and, when the rules of engagement are met, fire a missile, killing the terrorists.
Even though human operators are in the loop, the use of remote-controlled drones has not been without its problems. On occasion, drone strikes have caused what the military calls “collateral damage” (i.e. civilian casualties).
Also, some drone operators have developed a form of PTSD, brought on by the jarring contrast of carrying out lethal strikes from a control room and then, after a shift, going home to a familiar house and family with no time to decompress from the psychological stress.
Beyond Human Operated Drones
Enabling aerial drones or even ground robots to act autonomously is a technological step beyond human-operated drones. The idea is that these weapons platforms would be inserted into a combat zone for a predefined mission but would be programmed to determine on their own whom to kill and whom to allow to live.
The problem with remote-controlled drones is that they depend on a satellite communications network; if that link is disrupted, they become useless. Autonomous killer robots do not have that problem.
Robots do not get killed or maimed or suffer from PTSD. If they are destroyed, they are not mourned by families and friends. They do not get angry, nor do they tire. They continue until they complete the mission or until they are destroyed or crippled by an enemy.
Leaving aside the scenario presented in the film “The Terminator” (3), in which autonomous killer robots go berserk and set out to exterminate humanity, a number of groups have raised objections to the very concept. The Campaign to Stop Killer Robots (4) believes that the use of these kinds of weapons raises several ethical issues.
First, no amount of programming will impart to a robot the ability to make ethical judgments on the battlefield. Robots, in the view of the group, will not be able to accurately distinguish between enemy combatants and civilians and will not be able to judge proportionality. Experts in artificial intelligence (5) may disagree with this assessment.
Lowering War’s Threshold
The second objection is that robot warriors lower the threshold for going to war. One consideration that makes governments pause before ordering their soldiers into combat is the prospect of young men and women coming home in body bags.
The stakes have to be judged worth it before a government accepts that casualties will result. With autonomous robots making the decisions, war becomes too “neat and painless,” in the words of Captain James T. Kirk.
However, an article in IEEE Spectrum (6) points out that attempts to ban killer robots will likely be ineffective. Countries such as Iran and North Korea and terrorist groups such as ISIS will cheerfully ignore such a ban. The cost of building robot warriors will be low enough that almost anyone could make such a weapon. The same delivery drone that brings a package from Amazon can just as easily deliver a bomb.
Besides, software that can allow a killer robot to perform a mission more easily, more ethically, and with fewer casualties than fallible, emotional humans will sooner or later be developed. The question, therefore, is not whether using robot soldiers is inherently a bad thing, but whether it is less wrong than using human soldiers.
In any case, the argument is not likely to be settled anytime soon.
References & Image Credits:
(1) Ars Technica
(2) How Stuff Works
(4) Stop Killer Robots
(5) TSW: Stanford Conducts Century Long Study of Artificial Intelligence
(7) AC-208 Fires Hellfire
(8) Image of Predator and Hellfire