The science and ethics of digital war

From the Department of Terrible Ideas: The Washington Post has a must-read story up reporting on research that promises to someday make military drones fully automated. Yes, that's right, drones that kill based on software such as facial recognition, rather than any direct human command.

I know the obvious thing to do here is make Skynet jokes. But, frankly, there are plenty of problems with this without welcoming our robotic overlords. Say, for instance, this issue, which the Post broaches with a note of wry eyebrow-raising:

The prospect of machines able to perceive, reason and act in unscripted environments presents a challenge to the current understanding of international humanitarian law.

To say the least.

But here's the really interesting thing about this story: Arms control ethicists are trying to deal with it before it exists, rather than after the fact.

In Berlin last year, a group of robotic engineers, philosophers and human rights activists formed the International Committee for Robot Arms Control (ICRAC) and said such technologies might tempt policymakers to think war can be less bloody.

Some experts also worry that hostile states or terrorist organizations could hack robotic systems and redirect them. Malfunctions also are a problem: In South Africa in 2007, a semiautonomous cannon fatally shot nine friendly soldiers.

The ICRAC would like to see an international treaty, such as the one banning antipersonnel mines, that would outlaw some autonomous lethal machines. Such an agreement could still allow automated antimissile systems.

“The question is whether systems are capable of discrimination,” said Peter Asaro, a founder of the ICRAC and a professor at the New School in New York who teaches a course on digital war. “The good technology is far off, but technology that doesn’t work well is already out there. The worry is that these systems are going to be pushed out too soon, and they make a lot of mistakes, and those mistakes are going to be atrocities.”


Also: If you're confused about the choice of photo, look to the right of the guy's head, about the middle of the image.

Image: Droning, a Creative Commons Attribution (2.0) image from oddwick's photostream


  1. Former president of Afghanistan and head of the Afghan peace council, Burhanuddin Rabbani, was killed today by someone with a bomb in their turban.

    What do we need this for, again?

  2. There’s an interesting counterpoint here that has come up recently in my field of robotics (specifically, AUVs).  Consider the following:

    1. Autonomous systems can make mistakes that result in casualties
    2. Humans can and do make mistakes that result in casualties

    So, if the autonomous system makes fewer mistakes than the human, is it unethical NOT to use the autonomous system?  I have no answers to this.

  3. I predict that terrorist attacks will rise due to robotic war machines.  If our impoverished adversaries of the future can’t fight us in person on the battlefield, why wouldn’t they attack us at home?

  4. And yet, there is nothing *less* capable of telling friend from foe (from noncombatant) than an old-fashioned bomb.

  5. Can I make a WOPR joke?

    Srsly, taking human decision making out of the process is a fatally flawed idea no matter what.

  6. I wish that people would stop calling them “armed drones”.  They’re Killer Robots, dammit!  We never got Blasters, Jet Packs or Inertialess Drives, but we at least got this part of the Gernsback Continuum and should admit it.

  7. “In South Africa in 2007, a semiautonomous cannon fatally shot nine friendly soldiers.”
    This would make a great scene in a movie. Oh, wait a minute…
    Actually I have a really good idea. All those countries with disputes should find a nice, big, flat part of the world (preferably not Belgium this time, but I don’t s’pose anyone would really mind) and send all their autonomous robots there to fight among themselves to decide who’s the winner. Kind of a grown-up Robot Wars with really funky weapons. Just think of the TV ratings!

  8. >Yes, that’s right, drones that kill based on software such as facial recognition, rather than any direct human command.

    Turns out the drones will be using Facebook’s facial recognition software.  They’ll also “friend” their intended targets ahead of time (for reconnaissance purposes), and unfriend them right before the missile hits.

  9. The real lesson to take from Skynet is that they couldn’t be stopped, even using time travel! This horse has already left the barn and humanity will never be able to lure it back in.

  10. We don’t have ethical use of remote controlled drones right now, let alone ethical programming of autonomous drones. When the US does it with drones in countries we’re not at war with, we focus on the shiny technologies they used. When other countries or groups accomplish the same things with average tech, it’s rightfully called “assassination” and condemned.

  11. Since the US is rapidly falling behind the rest of the world in technology and internet speed, maybe they should not encourage unmanned drones before we become the equivalent of loin-clothed natives huddling before armored, steel-bearing, mounted conquerors.

  12. Pushing buttons rather than manually cleaving someone in half makes violence easier and more acceptable.  

    Pushing a button to allow a robot to push a button whenever it feels like it “should” will be a bad thing, if for nothing else because it removes humanity from killing almost completely.

    Taking this sort of thinking to its logical (or illogical) end, it is like turning on a Roomba and going on vacation.  You come back to a clean house, but don’t notice the damn thing terrorizing your cats for two weeks when it is not banging into every piece of furniture you have hundreds of times.

    It does whatever it thinks it should and you get to enjoy the benefits without even thinking about the process much less getting your hands the least bit dirty.

    Oh yeah, and it does a crappy job of sweeping too.

    1. Exactly – do you think Americans will be more, or less, willing to engage in wars if all human members of the armed forces are safely behind desks in California somewhere?

      Not only that, but killer-robot making creates jobs…

      1. I think it will make endless war so mundane that it will be ignored even more than the two wars going on right now are.

        I think the simple fact that there is still some risk involved (if only obliquely, for the politicians) is what keeps the wars limited in scope… though the wars don’t appear to have any real goals or conditions for triggering an end.

  13. Gee, killer robots, what could possibly go wrong.  BTW: That sound you hear is Isaac Asimov spinning in his grave.

  14. Americans already pay less attention to the two wars they are in right now than to any of their prior wars.

    If killing and risk to soldiers (and politicians) becomes any more minimized, is there any reason not to be at war constantly (aside from cost)?  Ethics and some sort of goal seem to have been taken out of the equation at some point in the 1990s.

    The USA has not fought a war for “survival” since 1945. The wars that have been fought since then have been “punishment” for something, often not related to the USA in a very direct way. The “war on terror” as a response to 9/11 is dubious at best. The people have to be behind the war, but most won’t continue to support a war when lots of people they know are dying without an imminent threat to their own personal safety. It appears that outlawing box cutters and adding in porno-scanners have taken care of this for the most part (for the last 10 years + 10 days or so).

    The wars as they are being fought are not winnable (are they intended to be?). The last big war that America fought with the intention of winning was WW2. Luckily for the Japanese people, the emperor was able to come to his senses and surrender while there was still time left. The Germans fared better due to a closer cultural link and were lucky enough to not have “perpetrated a dastardly sneak attack”. Those wars were winnable, as the US military (and people) had this idea that if you manage to kill off the enemy and/or make the idea of fighting so uncomfortable that the enemy quits eventually. It does not appear that the fundamentalism that fuels al-Qaeda will ever come to that conclusion. They do have the advantage that the American mindset has changed a bit since 1945: Americans are now obsessed with minimizing casualties on both sides, but especially among “non-combatants”. Several million people in Germany and Japan did not have this benefit.

    I wonder if these robots will be programmed to have this sort of ethos?

    The only reason that more American soldiers have not been killed in the last 10 years is lower individual risk and huge advances in medicine. I doubt the average American would be so apathetic about the wars if every family had one or more gold stars in the window (as in WW2).
    The wars have lasted longer than Vietnam did and have probably cost more in adjusted dollars. Hiding cost and risk with contractors has much to do with the enormous amount of spin involved in keeping these wars going for no reason.

  15. I dunno – on one hand – “AAHHH!! Killer robots!” On the other hand, perhaps they will perform better than fallible, biased humans.

  16. As horribly bad as the idea of autonomous killing machines is, the kind of technical sophistication required to do it has many other potential applications that are quite attractive. The problem is that once we develop systems that can act autonomously in novel environments for virtually any purpose, hooking a gun to them is trivial. I seriously doubt we can stop the technology that will allow us to make such machines possible from emerging. Which only leaves agreements like the landmine and cluster munitions bans as ways to address the problem, something the US has been notoriously reluctant to be a part of.

  17. Kind of makes the Orange Catholic Bible maxim, “Thou shalt not make a machine in the likeness of a human mind,” a little more prophetic.

  18. If a corporation produces a piece of software that is installed in a drone that kills a village full of children, who gets charged with war crimes?  I realize the U.S. government isn’t going to answer that question, since the official position for some time has been “we aren’t prosecutable for war crimes.”  But the rest of the world should probably stake out a position in international law before the first test case gets here.

  19. Why can’t we just build a machine that absorbs the blame that would normally go to the perpetrator of a particular act?  I know, I know, we already have one, and it’s called government. 

      1. Next? An automated trading program already nearly destroyed the world economy the other year.

        AI is a great idea, for entertainment.

  20. This is a terrible idea that is completely inevitable.  It’s such an obviously winning strategy to keep your humans at home and employed making robots, and off the battlefield where they’ll just die and lower morale, that I don’t think any kind of rules of chivalry can possibly compete.  I’m glad there are people at least trying to slow the inevitable, though.

  21. It’s not surprising that a technological advance in warfare will be rushed to battle before it’s perfected, since once the same technology is widely available, the balance of power returns and a new advantage or deterrent must be developed. I agree this one seems particularly awful – but so must have the first siege machine to the bowman on the parapet.

  22. The line has already been crossed, when a soldier sitting comfortably in middle America can push a button and kill someone in Asia someplace. So much of a warrior’s ethos is built up from the idea that a soldier is putting his own life on the line to protect hearth and home. This is antithetical to that.

    By the time you’ve reached this point, recruiting new pilots has little to do with personal sacrifice for the good of the group; it’s just financial incentives… and the fascist nature of it all just never enters the equation.

     Who really thinks the enemy is going to up the ante with robots of their own, or try to ‘fight fair’ against this sort of thing? It’s an open invitation to terrorism.

     There’s a reason that bioweapons, poison gas, and nuclear weapons are illegal. This kind of thing should be made illegal for the same reason.

  23. I know it’s romantic (in the grimmest of ways) to wax endlessly about whiz-bang technology, but at the gritty end of the day the overwhelming majority of extrajudicial killings in this world are perpetrated with tech at the low end of the spectrum. The Kalashnikov, the machete, the car bomb. Pulling our hair out about killer robots and facial recognition tech is willful ignorance of the most banal SWPL sort.

  24. I have to wonder how many of the commenters have actually seen combat? Anyone who speaks out against the technology that would take a few more of our lads out of harm’s way, and says it’s better to go hand to hand and bleed out than to sit in a remote opcenter, has never had to gather the pieces of his best friend into a body bag (and come up short some) or tell a guy’s wife how he spent his last minutes talking of her as you held him in your arms before he died.  As one poster said, the tech is out of the bottle. If we don’t develop it to protect our guys, we will just fill body bags fighting those who did develop it. The soldier is not the one to make the political decisions to “make peace” or go to war. We’re the ones at the pointy end of a very s**ty stick. If a damn robot keeps more of my guys alive to go home to their families, where do I sign up?

  25. The most ethical way to conduct a war would be to ship all of the people who want the war off to a deserted island, and let them fight it out amongst themselves… and never, ever let them leave.

  26. One terrible lesson that the powers that be have learned from the recent disgraceful attack on Libya (20,000-30,000 bombing sorties flown, massive infrastructure destroyed, zero popular dissension or protests in the streets), as well as the various undeclared drone wars in Yemen, Somalia, Pakistan, etc. is…

    American people don’t give a FLYING FUCK how many brown people we murder, as long as johnny doughboy is not coming home in a body bag.

    I don’t look forward to it, but I feel the karmic retribution for America and its apathetic, racist, decadent people will be terrible.

    1. Re: “…zero popular dissension or protests in the streets…”

      What are you talking about? Are you seriously saying there weren’t protests, the ones Gaddafi tried to squelch, in the months before the civil war???

      1. No, I’m saying that there have been no protests in the west, especially compared to the mass protests against the Iraq war.

  27. 9/11 was the kind of “karmic retribution” the US got for 60 years of foreign policy consisting of funding other men’s terrorists, toppling democratic governments, etc.

    The sad thing is that neither the US government nor the American population learned from this; instead they keep on repeating their errors.
