Military technology

The ethics of a robot army

Within the field of military technology, two apparently opposite trends are evident: on the one hand, robots are taking over more and more tasks from soldiers; on the other, the moral weight of the decisions taken by armed-forces personnel is increasing.
It is anticipated that the United States will have an autonomously operating robot army by the year 2034. That may sound like science fiction, but developments are moving fast. Some 20,000 robots are already deployed in Iraq, most of them flying robots. These are still remote-controlled, but the first generation of autonomous equipment should be operational by 2015.
“One of the questions this raises is whether these kinds of weapons, because that is what they are, satisfy the stipulations of international humanitarian law”, says the engineer and lawyer Dr. Lamber Royakkers. One might also question whether independently operating robots increase the temptation to inflict unnecessary damage on unsuspecting victims, and whether they can be relied upon to differentiate between civilians and soldiers. “In the case of unarmed robots deployed to, for instance, inspect areas before soldiers go in, I see no problem. But armed robots, which independently determine when to attack and kill and when not to, should be banned. The risks attached to armed robots are so great as to be unacceptable. No one can predict how they will operate in dynamic and complex situations, or what will happen if a defect occurs.”
In a certain sense one might assert that robots are ideally suited to becoming “Befehl ist Befehl” soldiers, a type which, after the Second World War, it was decided should never be allowed to exist again. People must never be able to hide behind the orders of others to evade moral responsibility for their own actions. An order merely provides the moral context within which one’s own considerations must operate. Robots have no such moral considerations.
Ironically, the moral responsibility of armed-forces personnel is currently being heightened by the rise of “network-centric warfare”. Under this approach, information about the situations confronting those engaged in combat is continuously shared, giving them more scope to make autonomous decisions. The emphasis thus shifts from carrying out orders to comprehending the intentions behind them, so that action can be taken on those grounds and on the basis of any new information that comes to light.
“This demands different kinds of moral capacities of military personnel. An often-cited textbook dilemma is that of a child approaching and threatening to give away your position. Do you shoot it? Such action is much easier to take if your instructions are to defend the secrecy of your position at all costs than if you have to assess independently whether the position is really under threat. It is always harder if you must justify your actions afterwards.”
Another point worth considering is that much information is far from certain. Miscommunication and misjudgment tend to be the order of the day in the heat of battle. The chance of error, with “friendly fire” as the ultimate nightmare, is great. Furthermore, as Royakkers explains, it is harder to find support for one’s decisions in network-centric warfare: no longer does an army of hundreds of men descend on a single target. In modern warfare soldiers operate in small units in which each member has his or her own tasks.
In other words, information technology changes the moral context within which individual soldiers must make their decisions. The users of such technology have to be prepared for that, so that they can justify all their decisions both beforehand and afterwards, difficult as that may be in the circumstances of war.