The science and ethics of digital war


43 Responses to “The science and ethics of digital war”

  1. Cocomaan says:

    Former president of Afghanistan and head of the Afghan peace council, Burhanuddin Rabbani, was killed today by someone with a bomb in their turban.

    What do we need this for, again?

  2. parenthetical says:

    William Hurt makes for a pretty passable Afghan.

  3. ifreecarve says:

    There’s an interesting counterpoint here that has come up recently in my field of robotics (specifically, AUVs).  Consider the following:

    1. Autonomous systems can make mistakes that result in casualties
    2. Humans can, and do, continue to make mistakes that result in casualties

    So, if the autonomous system makes fewer mistakes than the human, is it unethical NOT to use the autonomous system?  I have no answers to this.
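
    To make that question concrete, here’s a minimal sketch of the comparison it implies: expected casualties from mistakes under each option. Every number below is made up purely for illustration; the point is the shape of the argument, not the figures.

        # Expected-harm comparison between a human operator and an
        # autonomous system, using hypothetical illustrative numbers.

        def expected_casualties(mistake_rate, casualties_per_mistake, engagements):
            """Expected casualties caused by mistakes over many engagements."""
            return mistake_rate * casualties_per_mistake * engagements

        ENGAGEMENTS = 1000  # hypothetical number of engagements

        human = expected_casualties(0.020, 1.5, ENGAGEMENTS)  # assumed human mistake rate
        robot = expected_casualties(0.005, 1.5, ENGAGEMENTS)  # assumed autonomous mistake rate

        print(f"human operator:    {human:.0f} expected casualties")
        print(f"autonomous system: {robot:.0f} expected casualties")

        # A pure harm-minimization argument favors whichever rate is lower;
        # the objections elsewhere in this thread are about what this simple
        # model leaves out: accountability, lowered thresholds for going to
        # war, and who bears the risk.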

  4. Rotwang says:

    I predict that terrorist attacks will rise due to robotic war machines.  If our impoverished adversaries of the future can’t fight us in person on the battlefield, why wouldn’t they attack us at home?

  5. EvilSpirit says:

    And yet, there is nothing *less* capable of telling friend from foe (from noncombatant) than an old-fashioned bomb.

  6. Halloween_Jack says:

    Can I make a WOPR joke?

    Srsly, taking human decision making out of the process is a fatally flawed idea no matter what.

  7. Nadreck says:

    I wish that people would stop calling them “armed drones”.  They’re Killer Robots dammit!  We never got Blasters, Jet Packs or Inertialess Drive but we at least got this part of the Gernsback Continuum and should admit it.

  8. CountZero says:

    “In South Africa in 2007, a semiautonomous cannon fatally shot nine friendly soldiers.”
    This would make a great scene in a movie. Oh, wait a minute…
    Actually I have a really good idea. All those countries with disputes should find a nice, big, flat part of the world, (preferably not Belgium this time, but I don’t s’pose anyone would really mind), and send all their autonomous robots there to fight among themselves to decide who’s the winner. Kind of a grown-up Robot Wars with really funky weapons. Just think of the tv ratings!

  9. Chuck Holt says:

    >Yes, that’s right, drones that kill based on software such as facial recognition, rather than any direct human command.

    Turns out the drones will be using Facebook’s facial recognition software.  They’ll also “friend” their intended targets ahead of time (for reconnaissance purposes), and unfriend them right before the missile hits.

  10. mrclamo says:

    The real lesson to take from Skynet is that they couldn’t be stopped, even using time travel! This horse has already left the barn and humanity will never be able to lure it back in.

  11. Deidzoeb says:

    We don’t have ethical use of remote controlled drones right now, let alone ethical programming of autonomous drones. When the US does it with drones in countries we’re not at war with, we focus on the shiny technologies they used. When other countries or groups accomplish the same things with average tech, it’s rightfully called “assassination” and condemned.

  12. Rebecca DeLaTorre says:

    Since the US is rapidly falling behind the rest of the world in technology and internet speed, maybe they should not encourage unmanned drones before we become the equivalent of loin-clothed natives huddling before armored, steel-bearing, mounted conquerors.

  13. CSBD says:

    Pushing buttons rather than manually cleaving someone in half makes violence easier and more acceptable.  

    Pushing a button to allow a robot to push a button whenever it feels like it “should” will be a bad thing, if for no other reason than that it removes humanity from killing almost completely.

    Taking this sort of thinking to its logical (or illogical) end, it is like turning on a Roomba and going on vacation. You come back to a clean house, but don’t notice the damn thing terrorizing your cats for two weeks when it is not banging into every piece of furniture you have hundreds of times.

    It does whatever it thinks it should and you get to enjoy the benefits without even thinking about the process much less getting your hands the least bit dirty.

    Oh yeah, and it does a crappy job of sweeping, too.

    • Rotwang says:

      Exactly – do you think Americans will be more, or less willing to engage in wars if all human members of the armed forces are safely behind desks in California somewhere?

      Not only that, but killer-robot making creates jobs…

      • CSBD says:

        I think it will make endless war become so mundane that it will be ignored even more than the two wars going on right now are.

        I think it is the simple fact that there is still some risk involved (if only obliquely, for the politicians) that keeps the wars limited in scope… though the wars don’t appear to have any real goals or targets for triggering an end.

  14. Eark_the_Bunny says:

    Gee, killer robots, what could possibly go wrong?  BTW: that sound you hear is Isaac Asimov spinning in his grave.

  15. CSBD says:

    Americans already pay less attention to the two wars they are in right now than to any of their prior wars.

    If killing and risk to soldiers (and politicians) becomes any more minimized, is there any reason not to be at war constantly (aside from cost)? Ethics and some sort of goal seem to have been taken out of the equation at some point in the 1990s.

    The USA has not fought a war for “survival” since 1945. The wars that have been fought since then have been “punishment” for something, often not related to the USA in a very direct way. The “war on terror”, as a response to 9/11, is dubious at best. The people have to be behind a war, but most won’t continue to support one when lots of people they know are dying without an imminent threat to their own personal safety. It appears that outlawing box cutters and adding porno-scanners have taken care of this for the most part (for the last 10 years + 10 days or so).

    The wars as they are being fought are not winnable (are they intended to be?). The last big war that America fought with the intention of winning was WW2. Luckily for the Japanese people, the emperor was able to come to his senses and surrender while there was still time left. The Germans fared better due to a closer cultural link, and were lucky enough not to have “perpetrated a dastardly sneak attack”. Those wars were winnable because the US military (and people) had this idea that if you manage to kill off the enemy and/or make the idea of fighting uncomfortable enough, you can get them to quit eventually. It does not appear that the fundamentalism that fuels al-Qaeda et al. will ever come to that conclusion. They do have the advantage that the American mindset has changed a bit since 1945: Americans are obsessed with minimizing casualties on both sides, but especially among “non-combatants”. Several million people in Germany and Japan did not have this benefit.

    I wonder if these robots will be programmed to have this sort of ethos?

    The only reason more American soldiers have not been killed in the last 10 years is lower individual risk and huge advances in medicine. I doubt the average American would be so apathetic about the wars if every family had one or more gold stars in the window (as in WW2).

    The wars have lasted longer than Vietnam did and have probably cost more in adjusted dollars. Hiding cost and risk with contractors has much to do with the enormous amount of spin involved in keeping these wars going for no reason.

  16. Mister44 says:

    I dunno – on one hand – “AAHHH!! Killer robots!” On the other hand, perhaps they will perform better than fallible, biased humans.

  17. Daniel Smith says:

    As horribly bad as the idea of autonomous killing machines is, the kind of technical sophistication required to do it has many other potential applications that are quite attractive. The problem is that once we develop systems that can act autonomously in novel environments for virtually any purpose, hooking a gun to them is trivial. I seriously doubt we can stop the technology that will allow us to make such machines possible from emerging. Which only leaves agreements like the landmine and cluster munitions bans as ways to address the problem, something the US has been notoriously reluctant to be a part of.

  18. Cocomaan says:

    Kind of makes the Orange Catholic Bible maxim, “Thou shalt not make a machine in the likeness of a human mind”, a little more prophetic.

  19. gwailo_joe says:

    I believe that’s a good idea, because nothing created by a human could ever fail…

  20. Don says:

    If a corporation produces a piece of software that is installed in a drone that kills a village full of children, who gets charged with war crimes?  I realize the U.S. government isn’t going to answer that question, since the official position for some time has been “we aren’t prosecutable for war crimes.”  But the rest of the world should probably stake out a position in international law before the first test case gets here.

  21. EricT says:

    Why can’t we just build a machine that absorbs the blame that would normally go to the perpetrator of a particular act?  I know, I know, we already have one, and it’s called government. 

  22. Daniel says:

    This is a terrible idea that is completely inevitable.  It’s such an obviously winning strategy to keep your humans at home and employed making robots and off the battlefield where they’ll just die and lower morale that I don’t think any kind of rules of chivalry can possibly compete.  I’m glad there are people at least trying to slow the inevitable, though.

  23. Jim Saul says:

    What was the Tom Selleck/Gene Simmons movie with the little acid-injecting drone spiders?  Ah yeah, “Runaway”.

  24. Teller says:

    It’s not surprising that a technological advance in warfare will be rushed to battle before it is perfected, since once the same technology is widely available, the balance of power returns and a new advantage or deterrent must be developed. I agree this one seems particularly awful – but so must have the first siege machine to the bowman on the parapet.

  25. anansi133 says:

    The line has already been crossed, when a soldier sitting comfortably in middle America can push a button and kill someone in Asia someplace. So much of a warrior’s ethos is built up from the idea that a soldier is putting his own life on the line to protect hearth and home. This is antithetical to that.

    By the time you’ve reached this point, recruiting new pilots has little to do with personal sacrifice for the good of the group; it’s just financial incentives… and the fascist nature of it all just never enters the equation.

     Who really thinks the enemy is going to up the ante with robots of their own, or try to ‘fight fair’ against this sort of thing? It’s an open invitation to terrorism.

     There’s a reason that bioweapons, poison gas, and nuclear weapons are illegal. This kind of thing should be made illegal for the same reason.

  26. TheHowl says:

    I know it’s romantic (in the grimmest of ways) to wax endlessly about whiz-bang technology, but at the gritty end of the day the overwhelming majority of extrajudicial killings in this world are perpetrated with tech at the low end of the spectrum: the Kalashnikov, the machete, the car bomb. Pulling our hair out about killer robots and facial recognition tech is willful ignorance of the most banal SWPL sort.

  27. Jim Burrill says:

    I have to wonder how many of the commenters have actually seen combat? Anyone who speaks out against the technology that would take a few more of our lads out of harm’s way, and says it’s better to go hand to hand and bleed out than to sit in a remote op center, has never had to gather the pieces of his best friend into a body bag (and come up short some), or tell a guy’s wife how he spent his last minutes talking of her as you held him in your arms before he died. As one poster said, the tech is out of the bottle. If we don’t develop it to protect our guys, we will just fill body bags fighting those who did develop it. The soldier is not the one to make the political decisions to “make peace” or go to war. We’re the ones at the pointy end of a very s**ty stick. If a damn robot keeps more of my guys alive to go home to their families, where do I sign up for them?

  28. Daemonworks says:

    The most ethical way to conduct a war would be to ship all of the people who want the war off to a deserted island, and let them fight it out amongst themselves… and never, ever let them leave.

  29. librtee_dot_com says:

    One terrible lesson that the powers that be have learned from the recent disgraceful attack on Libya (20,000-30,000 bombing sorties flown, massive infrastructure destroyed, zero popular dissension or protests in the streets), as well as the various undeclared drone wars in Yemen, Somalia, Pakistan, etc. is…

    American people don’t give a FLYING FUCK how many brown people we murder, as long as johnny doughboy is not coming home in a body bag.

    I don’t look forward to it, but I feel the karmic retribution for America and its apathetic, racist, decadent people will be terrible.

    • Mister44 says:

      Re: “…zero popular dissension or protests in the streets…”

      What are you talking about? Are you seriously saying there weren’t protests (which Gaddafi tried to squelch) in the months before the civil war???

      • librtee_dot_com says:

        No, I’m saying that there have been no protests in the west, especially compared to the mass protests against the Iraq war.

  30. Fabi Fala says:

    9/11 was the kind of “karmic retribution” the US got for 60 years of foreign policy consisting of funding another man’s terrorists, toppling democratic governments, etc.

    Sad thing is that neither the US government nor the American population learned from this; instead they keep on repeating their errors.

  31. Al Billings says:

    Am I the only one who is thinking of the short story, “Malak,” by Peter Watts, in which an autonomous drone is, effectively, made too clever by half?
