Ethics of robots that kill

h+ Magazine has a fascinating interview with Dr. Ronald Arkin, the director of Georgia Tech's Mobile Robot Lab who literally wrote the book on the ethics of robots that kill. The book, titled Governing Lethal Behavior in Autonomous Robots, lays out Arkin's research across law, philosophy, military ethics, and engineering to address dilemmas we'll face in the future as we build even more complex killing machines. From h+:
h+: How does the process of introducing moral robots onto the battlefield get bootstrapped and field tested to avoid serious and potentially lethal "glitches" in the initial versions of the ethical governor? What safeguards should be in place to prevent accidental war?

RA: Verification and validation of software and systems is an integral part of any new battlefield system, and it certainly must be adhered to for moral robots as well. Exactly what the metrics are and how they can be measured for ethical interactions during the course of battle is no doubt a challenge, but one I feel can be met if properly studied. It would likely involve the military's battle labs, field experiments, and force-on-force exercises to evaluate the effectiveness of the ethical constraints on these systems prior to their deployment, which is fairly standard practice. The goal is to reduce collateral damage without eroding mission effectiveness.

A harder problem is managing the changes in tactics that an intelligent, adaptive enemy would use in response to the development of these systems... to avoid spoofing and ruses that could take advantage of these ethical restraints in a range of situations. This can be minimized, I believe, by the use of bounded morality: limiting their deployment to narrow, tightly prescribed situations, and not to the full spectrum of combat.
"Teaching Robots the Rules of War" (h+, thanks RU Sirius!)

Governing Lethal Behavior in Autonomous Robots (Amazon)


  1. Good Afternoon. The wars of the future will not be fought on the battlefield or at sea, but in space, or on top of a really tall mountain. In either case, most of the actual fighting will be done by small robots, and as you leave here today, your duty is clear: To build and maintain those robots. Thank You.

    Get ready for Second Variety / Screamers.

  2. Ethical lethal robots? Please.

    The whole point of war-fighting robots is to eliminate the pesky conscience of the individual soldier, who might be undisciplined enough to question the basis for the war, or to listen attentively to a civilian pleading for her life.

    Adding a morality module to such a machine would be like applying perfume to my hammer.

  3. I could rant for hours on what a bad and inevitable mistake robot and drone war machines are… but the major problem I foresee is that they will eliminate much of the objection that US citizens have to our fighting stupid and greedy wars overseas. Without human lives on the line, why should people object?

    Poor nations will just hate us all the more and terrorism will become their only recourse – unable to kill soldiers, why not kill civilians? We are setting the stage for a lot of useless heartbreak in the future.

  4. It’s way, way too late to be talking about this. We already have autonomous robots running loose, and they already kill indiscriminately. Sometimes they are obliged to pay out money when they do, but they have plenty.

    The robots are built out of laws and people, but are no less dangerous for it. On the contrary, they have access to the services of armies and police. Any people who balk at their role in them are easily replaced. Inconvenient laws can be changed at minimal expense, and are, routinely.

    We call these robots “corporations”, from the Latin corpus: body. Someday the people will not be necessary, but the only problem with them now is their slowness.

  5. Article is superficially crap from a computer scientist’s perspective. The reality of image classification algorithms, as they will probably continue to stand until general AI appears or is deemed impossible, is that a certain percentage of the time your robot will see a cat and think “soldier.” And blow the cat away. Ethics don’t even apply when the entity is incapable of distinguishing parties well.

  6. Although Wikipedia has obviously been photoshoped already, we must continue to guard ourselves against malevolent corporate photoshoping robots that are bent on being first and desire nothing more than to rule our pool. Being first and being photoshoped are not currently, nor will they ever be, an acceptable form of artistic expression.

  7. If it’s ethical to drop a bomb in hope of killing an enemy even if innocents might be harmed, then I fail to see how it’s unethical to dispatch a robot to seek out and kill the specific target. Even if it might err and kill innocents, it would presumably do so less often than more primitive alternatives like the bomb. The robot is potentially a more precise weapon, and therefore inherently more ethical to employ, even if there’s room to refine it further.

    And the removal of conscience is in no way the “whole point” of military robots. It’s primarily about operational efficiency — not needing rest or food or complicated medical care, having superior senses and reaction time, and not being handicapped by fear of death or injury.

    While I understand the argument about reducing the moral/political consequences of the use of force, one could alternatively argue that employing smarter weapons would place moral responsibility for violent action more squarely on the shoulders of the political decision makers: the fewer intervening human agents, the more clearly it’s like the leader himself pulling the trigger, or bringing down the axe.

  8. I agree with #2. The perfect soldier is the one who runs towards the enemy armed with a dull knife just because his superior officer told him to, not the one who hesitates in front of another human being. The whole idea of killing robots is getting rid of those pesky ethics. No army in the world would allow their shiny killing toys to have a “good-vs-bad chip” installed, thus rendering the whole point moot.

    Thinking about it, I’m not really sure they would even allow a “never shoot our own troops” system to be installed.

  9. This is not a debate about how “ethical” it is to drop a photoshoped bomb into our pool. If so-called “innocent” corporate robots are going to use photoshop to rule our pool then we have no choice but to call them n00bs. Regardless of the number of “primitive alternatives” there are to photoshoping (i.e. airbrushing), there will never be a replacement for being first, or for ruling our pool.
    In order to seek-out and target things that need photoshoping, corporate robots will have to rule our pool. Even photoshoping needs military robots. If we spend money on healthcare, how will military robots photoshop? Let alone rule our pool?

  10. @Takuan: You’ve just given me a very chilling vision of future “minefields” in third world countries littered with semi- and fully-functional autonomous killbots.

  11. Ethical warbots use biodegradable ammunition, and sweep up their spent shell casings when they’re done.

  12. The man who wrote THE book on psychological warfare also wrote SF. That is where “manshonjagger” comes from.

  13. But we need something more ethical than biodegradable ammunition and autosweep functions. The most accurate picture of a future warbot will be a small, quiet, solar powered machine. Rockets will contain not only shrapnel and cluster-type sub-munitions, but also seed-bombs in the same rocket to replant the forests destroyed by the wars of the future.

    Recycle bots should follow the front lines to melt down robot casualties into reinforcements. Recycle bots could then create the soylent green to fuel the engineers, forced to design ever more effective warbots. Of course, this thread has been photoshoped. The book was written by a corporate anti-healthcare robot who wants to trick us into letting him rule our pool!

  14. Would I ruin our “OMG, they only want robots because robots don’t feel human pity!” party if I were to point out that human soldiers frequently (I’d go so far as to say reliably) flip out and start killing civilians just because they can, under certain circumstances?

    Obviously, war robots would be weapons, designed to kill people, and will never give us the warm and fuzzies; but I find the notion that they are inevitably worse than humans quite dubious. Robots may not be empathetic; but they are not sadistic. Nor does fear dull their judgment, nor do they have any notion of vengeance. And, I’m just guessing, that robots are pretty unlikely to use mass rape as a tactic.

  15. Yeah, the “robots can’t feel pity” comment has to be balanced by “robots can’t feel malice” and “robots can’t feel fear.” It seems to me most of the problems (at least in recent times) come when frontline troops are doing what they aren’t trained to do, out of malice or fear. Robots are always by the book. And while they don’t have any inherent way to refuse atrocious orders, a) orders from higher-ups aren’t usually the problem, and b) history has shown there is rarely any difficulty finding humans willing to do atrocious things, anyway. As for whether they can “see a cat and think soldier,” well, so can a human…

  16. The worst thing about war robots is the potential for software glitches, but once those are ironed out, it’s less likely that the software will glitch than that a person will err.

    People are pretty much guaranteed to make mistakes. Computers, when they are running programs that are finished and properly run, rarely do.

    Also, yes. Robots that are able to calculate density and select non-penetrating ammunition on the fly, as well as visually assess targets and calculate threat within nanoseconds are less likely to cause collateral damage than panicky kids with machine guns.

    Robots will likely not surpass the best of humans, but they are always better than the worst of humans.

  17. What an absolute waste of time. First, wait until it’s a problem, or at least until we’re in the same century as the problem, THEN start dealing with it. This entire field of study is similar to the ethics of Wave Motion Guns or Red Matter.

  18. Well, you could just design their positronic brains to obey the Three Laws; that would solve everything.

  19. “and how can a machine be made in the image of the human mind?”

    One Butlerian Jihad, coming right up. Better than getting blasted away by the Cylons.

    Oh, and… maybe these people should ask the RIAA/MPAA what happens when you try to automate repression against cash-poor, time-rich people. How long before The Enemy learns to spoof the friend/foe identification mechanism on these ’bots?

  20. While I wish the gentleman well in his quest for “moral robots on the battlefield,” to me this sounds a bit like “clean coal” — theoretically possible, demonstrable in tests, and economically uncompelling.

  21. I think the focus is a bit misplaced. Machines cannot be expected to be “ethical”. It is their operators to whom we should be paying attention.

    “We call these robots “corporations”, from the Latin corpus: body. Someday the people will not be necessary, but the only problem with them now is their slowness.”

    Sorry, will never happen. While it is true that it is every general’s (and capitalist’s) dream to wield an army of mindless slaves that do your every bidding, in practice this can never happen. The reasons why should be immediately apparent.

    If the process of producing consumer goods is completely automated and robotic who will buy your products? If war is completely automated and robotic how do we know when we’ve won?

    This idea that it is desirable or even possible to replace consumers or soldiers with robots is delusional and on all fours with the supply side delusion that if lowering taxes increases revenue then eliminating them altogether will increase revenues to infinity.

    U hoomans sure are st00pid.

  22. I’d bet a buck or two that one of the reasons the US Army is pushing for these is that, when the robots commit war crimes, they really can’t be punished for it. At the very worst, they’ll blame the programmers.

    (Or the virii introduced into the ROMs when they sourced their manufacture out to the lowest bidder, in Viet-Nam.)
    (Or they think their soldiers are too brain-dead to fight properly…)(Hmmm…)

  23. The whole point of war-fighting robots is to eliminate the pesky conscience of the individual soldier, who might be undisciplined enough to question the basis for the war, or to listen attentively to a civilian pleading for her life.

    That is straight up absurd on multiple levels.

    First, it has been proven since, oh, the dawn of human existence, that getting one human to kill another innocent human during a time of war is trivial. If that is your goal, you will accomplish it. It isn’t like adding robots suddenly makes this easier. If indiscriminate killing of pleading women and children were the goal, nukes would do the job handily.

    Secondly, robot warriors have the capacity to be a few orders of magnitude more moral combatants than a human soldier. Just consider this absurdly common scenario.

    You are in a squad of marines, walking down the central street of a city. You notice that today there are no civilians visible, and so you are wary. Suddenly, a concealed IED explodes, killing half the men in your squad. At the same time, you start taking fire from apartment buildings on either side. What do you do?

    Most people, even normally deeply moral people, respond to the sudden death of friends followed by a life-threatening situation like this by using all of their power to defend themselves and destroy the enemy. In this case, you might fire at the apartment buildings indifferent to the civilian lives you might be destroying. In fact, if you have the capacity (and a US marine squad does), you might all but level the buildings. Sure, this might be a complete violation of the rules of engagement, but court-martialed is better than dead, especially when everyone else in the squad will back up whatever story you tell.

    Now consider a drone soldier.

    You are piloting a drone squad from a nice armchair in California while slurping an iced coffee. Suddenly a bomb goes off, wiping out half of the drones in the patrolling squad. The drones then start taking fire from two nearby apartment buildings. Lickety-split, a couple of high-ranking commanders show up at your console with a military lawyer in tow. You are reminded that the rules of engagement forbid risking civilian casualties, and so you are instructed to fire back with only a minimum of force, and only on targets you have 100% identified as enemies. So, instead of leveling the building in a desperate attempt to save your own life, you take potshots back at only the targets you can see, with a bullet or laser. After the engagement is over, all of the recordings are reviewed (and everything is recorded). If it turns out that, despite not being in any physical danger, you were too trigger-happy, you get tossed to the curb.

    The danger of detachment resulting in the slaughter of civilians is very real, but the detachment comes from being scared, threatened, and watching your friends die, not from having a scowling commander and lawyer sitting over your shoulder, recording your every action, and making sure you follow the rules of engagement to the letter while you yourself are in zero physical danger. Computer screens might cause some level of detachment, but that detachment is trivial compared to the kind you develop when your life is threatened every day, friends die, and the civilian population seems actively out to get you. The detachment caused by staring at a screen is easily remedied by recording constantly, having multiple people check up on you, and reviewing all actions taken. The kind of detachment that comes from being in a war and being shot at, on the other hand, is impossible to remedy.

  24. War is not nice. It kills people and breaks things. There are no rules of war. The only objective in war is to win, and any country that adopts rules is limiting itself arbitrarily. Something like “I’ll whip you with one hand behind my back.” The only problem is public relations, and the only fix is to tell the public to go relation itself.

  25. War is not nice. It kills people and breaks things. There are no rules of war. The only objective in war is to win, and any country that adopts rules is limiting itself arbitrarily.

    Just ask the Carthaginians what they had to say about that. Oh wait, you can’t, because Rome destroyed them and their entire history.

    Besides the crudeness/immorality of violence, it seems to me that the concern here is that in the process of destroying “the enemy” you’ll also destroy yourself. “When one fights with monsters, one must take care not to become a monster.”

    Whether it’s Skynet, which was programmed to “defend the USA from its enemies” and interpreted “enemies” as “humans” and not just “Soviets” (echoes of the AM computer from “I Have No Mouth, and I Must Scream”), or simply the paradox of “we had to destroy the village to save the village” (e.g. “The War on Terror”), beware the costs of war.

    Consider this quote by Robert A. Heinlein,

    I also think there are prices too high to pay to save the United States. Conscription is one of them. Conscription is slavery, and I don’t think that any people or nation has a right to save itself at the price of slavery for anyone, no matter what name it is called. We have had the draft for twenty years now; I think this is shameful. If a country can’t save itself through the volunteer service of its own free people, then I say: Let the damned thing go down the drain!

  26. Strawmen Tak, straw men. Heinlein wasn’t a sexist because he put women on a pedestal. See?? How can that be sexist? Heinlein wasn’t a wingnut because he was an authoritarian wingnut… blah blah blah…. I can haz semantic games?

    Yeah, he was fun reading when I was 16; so was Ayn Rand. But his misogyny was hideous, and exceeded only by his militarism and his elitism.

    I prefer real stories peopled with real characters. That’s why I don’t read SF any more. Not since I was in my 20s, anyway. Sure, a certain amount of escapism is fun, but I’ve gotten to the point where technological spectacle, SF or whatever, just bores me to tears. These days I want human drama. The less SF is in a book or movie, the better it is, in my opinion.

    “Up” is far superior to that horrible drivel they call the latest Star Trek movie, and “The Hurt Locker” is ten times better than anything else released so far this summer. Hollywood and the “SyFy” channel are killing SF, thank the gods. I did enjoy the latest “Torchwood: Children of Earth” series, but that was because there was very little actual SF in it.

  27. Holy Crap! Talk about on topic! (Currently on Drudge):
    Robot attacked Swedish factory worker
    A Swedish company has been fined 25,000 kronor ($3,000) after a malfunctioning robot attacked and almost killed one of its workers at a factory north of Stockholm.

  28. I’ve often wondered what will happen in the future when wars are fought ONLY by robots, with no human involvement. Would it become a question of which party is willing to throw more money at the conflict? And if that is the case, why not have a neutral party take sealed bids from both sides and declare the higher bidder the winner?

  29. Building robots without the Three Laws of Robotics will be the beginning of the end for humankind.

  30. And, I’m just guessing, that robots are pretty unlikely to use mass rape as a tactic.

    “Oh, look! There’s a rape machine – I’d go outside if he’d look the other way. You wouldn’t believe the things they do.”

  31. Is this a book about programming ethics into a robot that can kill, or the ethics that people need to consider before making robots that can kill? “Governing Lethal Behavior in Autonomous Robots Engineers.”

    Forget Terminator and think War Games: “The only winning move is not to play.”

  32. But…if super-advanced robots could make better soldiers than humans then the plot of “Star Wars: Episode II” would make no sense whatsoever!

Comments are closed.