Human Rights Watch proposes new laws of robotics

Human Rights Watch (HRW) has issued a document titled Losing Humanity: The Case against Killer Robots, which argues that the development of autonomous weapons must be stopped because it represents a threat to human rights. The document defines three types of autonomous weapons, namely: Human-in-the-Loop Weapons: Robots that can select …

COMMENTS

This topic is closed for new posts.
  1. Roger Stenning
    Terminator

    Asimov's gonna be spinning in his grave :(

    If they're that scared of killer bees - I mean robots - why not just install Asimov's famous Three Laws of Robotics, then? No one will be killed, maimed, or get hurt feelings from any form of robot if this happens!

    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

    2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

    Simple, innit?
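
    In code terms, the laws are just a priority-ordered veto chain. A toy sketch in Python - every name below is made up for illustration, not any real robot API:

      from dataclasses import dataclass

      @dataclass
      class Action:
          harms_human: bool     # First Law: would this injure a human?
          disobeys_order: bool  # Second Law: does it ignore a human order?
          endangers_self: bool  # Third Law: does it put the robot at risk?

      def permitted(action: Action) -> bool:
          """Check the Laws in strict priority order; an earlier veto wins."""
          if action.harms_human:     # First Law: absolute veto
              return False
          if action.disobeys_order:  # Second Law, flattened - the "except where
              return False           # it conflicts" clause is where the plots live
          if action.endangers_self:  # Third Law: weakest of the three
              return False
          return True

      # The article's whole problem in one line: a killer robot's designed
      # purpose fails the very first check.
      print(permitted(Action(harms_human=True, disobeys_order=False,
                             endangers_self=False)))  # False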

    1. Neoc
      Coat

      Re: Asimov's gonna be spinning in his grave :(

      If you're going to quote Asimov, please quote all *four* Laws:

      0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

      First posited in "Robots and Empire", a late addition to the set, by R. Giskard and told to R. Daneel Oliver.

      Mine's the metallic-silver one.

      1. Neoc

        Re: Asimov's gonna be spinning in his grave :(

        Forgot to add that the First Law was modified to be subservient to the Zeroth Law. I normally wouldn't fix this oversight, but El Reg commenters tend to pick on things like this. ^_^

      2. Neoc

        Re: Asimov's gonna be spinning in his grave :(

        Damnabbit: *Olivaw*, not Oliver.

        Always check fingers before applying to keyboard.

        1. tkioz
          Unhappy

          Re: Asimov's gonna be spinning in his grave :(

          Asimov's laws are good... but they are very dangerous in the real world.

          Example: junk food is bad for humans, so robots should enforce nutritional standards on them... humans order the robots to let them eat junk food, and the robots rightly consider that order invalid because of how the Second Law is worded in regard to the First...

          That's just one example of robots taking control over humans using those laws. There are many, many more. Think nanny states are bad now? They wouldn't be a patch on a robot-run world.

          1. TRT Silver badge

            Re: Asimov's gonna be spinning in his grave :(

            Add an indisputable fact to the fact list. "It is harmful to humanity to deny its members free will."

            1. Great Bu

              Re: Asimov's gonna be spinning in his grave :(

              "It is harmful to humanity to deny its members free will."

              So the free will being exercised by a murderer when he shoots his victim cannot be interfered with, because the Zeroth Law trumps the First? Not that effective a plan...

            2. Elmer Phud

              Re: Asimov's gonna be spinning in his grave :(

              "Add an indisputable fact to the fact list. "It is harmful to humanity to deny its members free will.""

              First you need to define 'free will' as it seems that different cultures have vastly different interpretations.

              1. TRT Silver badge

                Re: "free will"

                You have a very good point!

          2. Brewster's Angle Grinder Silver badge
            Terminator

            @tkioz

            "Asimov's laws are good... but they are very dangerous in the real world."

            Yeah, I think he wrote a few stories about how they might be misinterpreted.

            1. James 100

              Re: @tkioz

              One that sticks in my memory is one character, Bigman, breaking out of jail by putting a weapon to his head, telling the robot that unless he's released he'll kill himself. "May not allow a human to come to harm through inaction" - so the robot released him to prevent the suicide. So much for automated security!

              I'd say the rules are fine as they are, though. More than half a century ago it was illegal to shoot civilians with rifle bullets; it's just as illegal to shoot them with guided missiles now, and half a century from now it'll be illegal to shoot them with interplanetary plasma warheads - do we really need new laws, or just compliance with the existing ones? (Of course mistakes and crimes both happen, too - but new laws rarely help prevent either.)

              Legally, is unleashing a psycho-killbot on somebody any different from planting a landmine on their doorstep or indeed just shooting them yourself? Or, if it's accidentally unleashed, is causing somebody's death through accidentally releasing a psycho-killbot any different from accidentally releasing toxic gas or a runaway train that kills them?

              1. TRT Silver badge

                Re: @tkioz

                OK. How about "uphold the law" as a rule then? Or is that too Robocop?

    2. James Micallef Silver badge
      Unhappy

      Re: Asimov's gonna be spinning in his grave :(

      "just install Asimov's famous Three Laws of Robotics, then? No one will be killed, maimed, or get hurt feelings from any form of robot if this happens!"

      Unfortunately (and just as with the development of lots of other technology) these robots are being developed with the express purpose of killing, maiming and hurting. So HRW might just as well have said "stop developing killer robots". And I suspect that the answer from the world's military forces in either case would be "garn git f***ed"

    3. Nuke
      Facepalm

      @Roger Stenning - Re: Asimov's gonna be spinning in his grave :(

      Wrote: "why not just install Asimov's famous Three Laws of Robotics, then? ........ Simple, innit?"

      Yes, thanks for solving that one. We must put you onto the World Hunger problem next.

      Or how about a law to stop humans killing each other? Oh.. wait.........

    4. Thorne

      Re: Asimov's gonna be spinning in his grave :(

      Asimov's laws don't apply here. The whole point of these robots is to kill people. That's what they were made for.

      The Predator will have to be renamed the Pussy if it's not allowed to blow people up.

    5. Stoneshop
      Holmes

      Re: Asimov's gonna be spinning in his grave :(

      Scenario: incoming missile, possibly loaded with an ABC warhead (the human operator who launched it thought nothing of this). Surely it will harm humans when it comes down. Does the robot defense system take action to destroy it in flight?

      E_DILEMMA. decision by zero, conscience dumped.

  2. tkioz
    Unhappy

    One of my major concerns with remote or robotic warfare is that it makes war "too easy", at least politically. Now don't get me wrong, I don't want to see dead soldiers coming home on the news any more than the next person, but the fear of those images keeps politicians, at least the ones in the first world, hesitant to go to war, and that hesitation is a very good thing (though not without its drawbacks) for the world.

    If the politicians can order a war with very little risk to their own political standing, that is a very bad thing, a destabilizing thing for the world at large. Really the only control we have over our political "masters" is their fear that we will not vote for them next time... we really don't want to let that go.

    1. Anonymous Coward
      Anonymous Coward

      It isn't a huge problem if the population of a nation is remotely civilized; sadly, there aren't any nations like that.

      What do I mean? Well, a civilized population would throw out its government if it saw that government butchering tens of thousands of innocent civilians. But as we've seen time and time again, we just sit back and wave the wars on. So what's the difference if you can do it completely automatically? Then you don't even have anyone to blame if a missile blows up a wedding congregation; it's just a technical error. As it is, it's just "bad intelligence" - the only difference is there's one less western drone operator having nightmares coz some cunt in a cushy office said "press the button"

      1. Thorne

        "Well a civilized population would throw out its government if they saw their government butchering tens of thousands of innocent civilians"

        Ah yes, but the problem is fear. These civilians are harbouring terrorists who hate our western way of life and want to kill us, so they need to be sacrificed to protect our right to buy McDonald's and drive Hummers.

        We don't know any of these people getting killed, so really it's all right 'cause it isn't us.

        To quote Sam & Max:

        Sam (holding a bomb): Max, where should I put this so it doesn't hurt anyone we know or care about?

        Max: Out the window, Sam. There's nothing but strangers out there.

        Sam: (looks at the bomb in his hand and throws it out the window behind him)

        (Bomb explodes outside the window)

        Sam: I sure hope there was no one on that bus.

        Max: No one we know, at least...

  3. Mondo the Magnificent
    Devil

    Hmmm...

    Perhaps HRW have been reading El Reg's "ROTM" articles and are scared shitless that Skynet may become a reality...

  4. jake Silver badge

    Uh ... Someone's confuzled.

    "Human-out-of-the-Loop Weapons: Robots that are capable of selecting targets and delivering force without any human input or interaction"

    Like Israel's "Iron Dome"? That'll fly on the world stage ... not.

    1. Richard 12 Silver badge

      Every single anti-missile system is human-out-of-the-loop

      Unless you count "human turns the system on and off" as human-on-the-loop.

      They have to be, because once a missile pops over the horizon there are single-digit seconds before the anti-missile missile must launch or the interception will fail.

      While they might have a "do-not-fire" button for a human to hit, if you've got less than five seconds to hit it then it won't be pressed - nobody is that alert for more than ten minutes or so - and under two seconds means it can't be pressed.
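
      A back-of-the-envelope budget makes the point. The numbers below are illustrative guesses, not any real system's specs:

        # Illustrative guesses only - not any real system's figures.
        detection_range_m = 5_000  # missile pops over the radar horizon
        closing_speed_ms = 680     # roughly Mach 2 at sea level
        human_veto_s = 1.5         # generous for a sustained-vigilance task

        budget_s = detection_range_m / closing_speed_ms
        print(f"time to impact:        {budget_s:.1f} s")  # ~7.4 s
        print(f"left after human veto: {budget_s - human_veto_s:.1f} s")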

    2. James Micallef Silver badge

      Re: Uh ... Someone's confuzled.

      Iron Dome IS human-out-of-the-loop BUT it does not (so far, and as far as I know) target humans or vehicles likely to contain humans.

      1. 404
        Terminator

        Re: Uh ... Someone's confuzled.

        That's just software, buddy, software. Change a few IF statements and iptso fatso (whala? Open Sesame? Yeah that.)... Whoosh, Bang, You're Dead.

        ;)
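
        Literally a few lines, too. A purely hypothetical sketch, not any real system's logic:

          from collections import namedtuple

          Track = namedtuple("Track", "kind")

          def engage(track: Track) -> bool:
              """Defensive rule: only inbound missiles get intercepted."""
              return track.kind == "missile"

          def engage_v2(track: Track) -> bool:
              """Same system after someone changed a few IF statements."""
              return track.kind in ("missile", "vehicle", "person")

          print(engage(Track("person")))     # False - Iron Dome as-is
          print(engage_v2(Track("person")))  # True - whoosh, bang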

        1. Anonymous Coward
          Anonymous Coward

          Re: Uh ... Someone's confuzled.

          "...Change a few IF statements and iptso fatso (whala?.."

          That's:

          ipso facto

          and

          voilà

          Bit confuzled yourself?

  5. rurwin
    Big Brother

    Ask Hollywood

    Laws against targeting civilians? I think you'll find that is just a minor legal point that is getting in the way of big business making money. Just like the copyright fair-use restrictions.

  6. tapanit
    Pirate

    Sounds to me like regular landmines fit into the third category already...

    1. Richard 12 Silver badge

      Good point!

      Same with tripwire munitions, which are still permitted.

      Who said there was a minimum amount of intelligence needed before a "smart" bomb can kill a child?

    2. madick

      And naval mines as well.

  7. Blofeld's Cat
    Unhappy

    It's OK it's one of ours...

    ED-209: "Please put down your weapon. You have 20 seconds to comply."

    Dick Jones: "I think you'd better do what he says, Mr. Kinney."

    ED-209: "You now have 15 seconds to comply."

  8. Anonymous Coward
    Mushroom

    I can see the smarter politicians getting behind this because it will be a perfect reason for even more control.

    If one law is to "not target civilians" then we can all wear RFID/biometric/scannable sub-dermal ID badges confirming we are "good citizens"; no more need for jail, as "citizenship" can be based on your criminal record, and police can be replaced by drones circling overhead... Mr Blofeld's Cat is spot on!

  9. crayon

    'The third concern surrounds accountability, as it's hard to apply humanitarian law to a robot or its programmer. Existing laws and remedies would therefore struggle to deliver “meaningful retributive justice”.'

    Simple: the leader(s) of whichever organisation/government deployed said robot would be held accountable.

    Unfortunately, in practice, only leaders of African countries will ever have "retributive justice" meted on them.

  10. greatfog
    Terminator

    It's too late

    and has been for a long time.

    [http://en.wikipedia.org/wiki/Spring_gun]

  11. JimC

    Where's my cute puppy icon?

    > robots would not be restrained by human emotions and the capacity for compassion,

    > which can provide an important check on the killing of civilians

    I'm not sure, when you look at history - Kosovo, Rwanda, the Mongol hordes, etc. - that taking human emotions out of the loop is a recipe for more civilians getting killed. More likely, I fear, the other way round.

  12. Mark 85
    Mushroom

    Laws...?

    All well and good, except that there are too many people/countries who do not follow "laws". As I recall, suicide bombing and terrorist attacks against civilians are prohibited by the Geneva Convention. Who follows the Geneva Convention anymore? More importantly, unless all parties follow it, the ones who do are at a disadvantage against the ones who don't.

    I forget who said it, but "in a world of barbarians, the only way to have peace is to be the biggest and baddest barbarian". Jihadists/non-Jihadists/any government scream bloody murder when their people are attacked, yet think nothing of attacking other civilians. Another law will mean diddly.

    The mushroom cloud... because.

  13. heyrick Silver badge

    Yay! Only TWENTY-NINE years after WarGames pointed out the dangers...

    However, I wish to also note that their mission statement is that "it represents a threat to human rights". So do many things. Bankers/banking acted recklessly and now the whole situation is damn near ridiculous. Interest rates are a highly volatile thing for certain countries. Yet others are being managed by unelected technocrats. Forget robotic weapons and ask how many people have been screwed over as fallout from the fragile state of the world's finances (the part where somebody finally noticed "oh shit, we're outta cash") and wonder who is caring about their human rights.

    Then take a look at certain conflicts in parts of the world and wonder if human operators are likely to be any better or worse with weapons of mass destruction. At least a computer won't arbitrarily kill somebody because of a different religion or just not liking the guy. It would need to be specifically programmed to do that, and then it's just following instructions without independent thought, emotions, or perceptions of superiority.

This topic is closed for new posts.