
(Globe and Mail)   Human rights group wants a ban on all robots that are relentlessly pursuing Sarah Connor   (theglobeandmail.com)
    More: Interesting, human rights group, mobile robot, Iron Dome, grenade launcher, Hellfire missile, international humanitarian law, chemical weapons, robots  

4702 clicks; posted to Main » on 20 Nov 2012 at 4:33 AM (1 year ago)





Archived thread
2012-11-20 04:53:57 AM
4 votes:
"We want you to unilaterally stop using technology that is keeping you safe from your enemies that don't have it"
2012-11-20 08:18:40 AM
2 votes:
Now that I think of it...

If the Terminator was captured in 1984, would there be legal grounds to prosecute it?

That would have made a thrilling courtroom sequel.
2012-11-20 05:45:32 AM
2 votes:

Sgygus: calling for an international treaty outlawing military weapons systems that decide - without a human "in the loop" - when to pull the trigger

This is quite prudent. Machines are not capable of being responsible. Humans are.


Kind of arguable. Machines also aren't capable of going off the reservation and making bad decisions. Humans are.

In some ways, robotic defense systems are a human rights improvement over corruptible human soldiers.

//Also, for the obvious historical reference, consider that international law following WW1 banned the use of aircraft as weapon platforms of any kind. Look up how long that lasted for an idea of how long we can keep autonomous systems that a high schooler with sufficient money could build off the battlefield.
2012-11-20 05:02:29 AM
2 votes:
Go ahead and ban them. It will not stop them from being deployed. Seriously, do you think that if some Palestinian terrorist group could design one of these killer robots that they would refrain from setting it loose in Tel Aviv because they had been banned?
2012-11-21 06:49:21 AM
1 vote:

dragonchild: As a drama, that was a good episode, but I'm in reality here.


You seem to think that Killer Robots are an excellent idea that will in no way go horribly wrong or be misused by humanity in any way.

I think Reality is a missed left turn back about a hundred miles or so.
2012-11-20 08:52:37 PM
1 vote:

dragonchild: way south: I think there is a difference because we have a certain level of expectation from humans and they can usually explain their reasoning.

That reasoning is usually, "I was just following orders." Robots controlled by programs don't scare me half as much as humans programmed by political agendas.


Yes, but who controls the robots?
If you remove the thinking, questioning, accountable human soldiers from the military, then what you have left are machines that obey only the politicians.

They are the source of so very many bad agendas.
...and soon there'll be no one left to ask any questions.
2012-11-20 12:19:02 PM
1 vote:

Lunaville: dragonchild: Humans do that already. If you're concerned about civilian casualties then don't get into a farking war to begin with.

Here we have the best idea that will ever be posted in this thread. Thank you, dragonchild.


This.

Especially when you consider tactics like Hamas uses... building missile launchers on top of apartment buildings.
2012-11-20 11:13:30 AM
1 vote:

dragonchild: Humans do that already. If you're concerned about civilian casualties then don't get into a farking war to begin with.


Exactly. War is a messy and bloody affair that ultimately results in loss on both sides. Sending robots that nobody gives a fark about to do our dirty work removes a very important message: war is not a fun thing; it is the absolute last resort once all other possible avenues have been fully explored.

War should always be something where, when you engage in it, you 'enjoy' the coffins coming home as well. As a species we have a really crappy long-term memory and a desperate desire to remove past atrocities from it. This should never happen.

And that's why a human should be in the loop no matter how 'smart' these things are. A human should be forced to authorise and then watch as these things chew apart other robots, humans, cattle, whatever they come across.
2012-11-20 11:12:14 AM
1 vote:
2012-11-20 09:44:37 AM
1 vote:
John McGinnis, a Northwestern University Law professor, suggests that "artificial-intelligence robots on the battlefield may actually lead to less destruction, becoming a civilizing force in wars."

This poor bastard has never watched a scifi movie in his life.
2012-11-20 09:07:40 AM
1 vote:

dragonchild: I'll take it. Robots don't get bored; they don't get traumatized; they don't get fatigued; they don't require rescue. If a soldier (or a small group thereof) is cut off, command is faced with a very difficult choice over whether to order a rescue or just tell them "good luck". Makes for good drama in fiction, but in reality it's a brutally pragmatic call where the soldiers are left to their own fate if the cost in resources is too high. If a robot is cut off and surrounded, it can just go apeshiat before self-destructing. And as others say, they're not going to go off-mission to commit some recreational atrocities, flee in panic or desert entirely, take or give a bribe. It's not a big deal if a robot's arm gets blown off. Robots don't care to be home for the holidays. They don't have kids back home. You don't even need to bring them back at all.


[image: www.wearysloth.com]
"Well, consider that in the history of many worlds, there have always been disposable creatures. They do the dirty work. They do the work that no one else wants to do because it's too difficult or too hazardous...and an army of Datas, all disposable. You don't have to think about their welfare, you don't think about how they feel. Whole generations of disposable people."
2012-11-20 08:16:45 AM
1 vote:

dragonchild: If a robot is cut off and surrounded, it can just go apeshiat before self-destructing.


Problem is, robots don't really think. They follow a checklist of instructions according to what their sensors may or may not perceive. Their definition of "cut off" might be "lost radio contact due to interference and... ooh, look, a wedding party full of terrorists!"

Robots show all the judgement of a mousetrap.
When the right parameters are met, SNAP!
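(A minimal sketch of that mousetrap logic, in Python, with hypothetical names and thresholds, not any real system's code: the rule fires whenever its parameters are met, and context never enters into it.)

```python
# A checklist, not judgment: hypothetical names and thresholds throughout.
from dataclasses import dataclass

@dataclass
class Contact:
    heat_signature: float    # normalized 0..1 reading from a thermal sensor
    moving: bool             # motion detected?
    friendly_rfid: bool      # broadcasting the expected friendly tag?

def should_engage(c: Contact) -> bool:
    # The machine's entire "decision": fire when the parameters are met.
    return c.heat_signature > 0.6 and c.moving and not c.friendly_rfid

# A troop column and a wedding party can produce identical sensor readings,
# so the trap snaps shut on both.
troop_column  = Contact(heat_signature=0.8, moving=True, friendly_rfid=False)
wedding_party = Contact(heat_signature=0.8, moving=True, friendly_rfid=False)
print(should_engage(troop_column))   # True
print(should_engage(wedding_party))  # True
```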
2012-11-20 07:18:50 AM
1 vote:

Vaneshi: Dracolich: tl;dr: We already had unquestioning soldiers.

When a soldier pulls the trigger he or she is intrinsically aware that if they shoot the wrong thing then 'bad things' will happen to them. This provides an impetus to make damn sure what you are shooting at is in fact an enemy.

When a machine pulls the trigger... who's responsible? It screws up and who's to blame?


This is an interesting point. We have a lot of accidents in war from using soldiers. Warning shots have killed civilians many times, but some people blame that on soldiers "not following protocol."

You also bring up self-interest. If you're not sure about the person approaching you, do you shoot them? This is a real issue that we've seen in Iraq. When people feel threatened, they act. When they're not sure if they're threatened but the risk is real, they act. Is this the case if a robot is used? Does the person controlling it feel like the amount of risk on the line is the same? They're no longer at personal risk. It's no longer "your life vs a reprimand." The safer choice shifts towards getting more information first. It becomes "your robot vs a reprimand." This may actually make civilians safer, but we'll probably go through a fair number of additional robots when we're incorrect. On the bright side, we'll have recordings of what happened in that particular case.
2012-11-20 07:04:10 AM
1 vote:

way south: Assuming you could distinguish aircraft with absolute accuracy, and you program your drone to ignore a foreign 737 entering your airspace, it might ignore a 737 flown by suicide bombers or even an armed 737 acting as a missile boat.


Or it gets confused and turns your AWACS into shrapnel. And considering that military equipment is sold to other people (we in the UK do it, you Americans do it, the Russians do it, etc.), how will it respond to a friendly F-15, MiG, Tornado, etc. with a failed IFF unit when it's also having to deal with enemy F-15s, MiGs and Tornados?

Honestly, I don't see a way for it to distinguish the two, as its software will just see a MiG, F-15 or Tornado (as examples) with no IFF broadcast. Perhaps Jim will provide us with a convincing argument as to why this could never happen...
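(A minimal sketch of that ambiguity, with hypothetical names rather than a real IFF implementation: a friendly jet with a dead transponder produces exactly the same observation as a hostile jet with none.)

```python
# Hypothetical names; not a real IFF implementation.
from typing import Optional

def classify(airframe: str, iff_reply: Optional[str]) -> str:
    # All the software has to go on: an airframe type and an IFF reply (or silence).
    if iff_reply == "VALID_FRIENDLY_CODE":
        return "friendly"
    return "hostile"  # silence: a dead transponder and an enemy look identical

print(classify("F-15", "VALID_FRIENDLY_CODE"))  # friendly
print(classify("F-15", None))    # hostile (but it could be ours with a failed IFF unit)
print(classify("MiG-29", None))  # hostile
```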
2012-11-20 06:56:54 AM
1 vote:

Jim_Callahan:
In some ways, robotic defense systems are a human rights improvement over corruptible human soldiers.


No. We're not talking about a Terminator here, or indeed anything that has the ability to comprehend. As per the botjunkie photo that was posted, we're talking about a tracked machine with a .50-cal machine gun and a webcam strapped to it.

These things have zero way of knowing if they're looking at an enemy troop formation or a civilian protest/market; they'll just know that they aren't friendly because they lack the correct RFID or some other friend-or-foe identifier... then slaughter them.

Hell, these things are driven by computers, and we have daily threads here about various bits of software screwing up! So one of these things encounters a race condition nobody had found in its code before and just randomly starts throwing grenades... at the barracks.

With a human (and we do mean a member of the armed forces here) in the loop there is at least someone we can point to and say "Why did you let it do that?"
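(For the curious, a minimal sketch of the kind of check-then-act race being described, using Python threads with hypothetical names, deliberately unsynchronized so the bug reproduces every run: the fire-control logic checks shared state, gets preempted, then acts on a decision that is already stale.)

```python
# Deliberately unsynchronized, with hypothetical names; the sleeps force the
# unlucky interleaving so the race shows up on every run.
import threading
import time

target_is_friendly = False   # shared state: written by one thread, read by another
shots_fired = []

def sensor_update():
    global target_is_friendly
    time.sleep(0.01)             # sensors reclassify a moment later...
    target_is_friendly = True    # ...but the decision has already been made

def fire_control():
    if not target_is_friendly:      # check...
        time.sleep(0.02)            # ...preempted between check and act...
        shots_fired.append("BANG")  # ...then act on stale state

shooter = threading.Thread(target=fire_control)
sensors = threading.Thread(target=sensor_update)
shooter.start()
sensors.start()
shooter.join()
sensors.join()
print(shots_fired, "target_is_friendly =", target_is_friendly)
# ['BANG'] target_is_friendly = True
```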
2012-11-20 06:50:27 AM
1 vote:
[image: eggshell-robotics.node3000.com]
2012-11-20 06:49:54 AM
1 vote:

mamoru: You want to deploy a machine that cannot take responsibility for who it shoots? Then you take responsibility for who it shoots.


That's pretty much how it works, yes. You step on a land-mine that's not supposed to be there, the government that deployed the mine is the one at fault. Same with this stuff.

mamoru: You mean the complicated hardware and software to control an autonomous weapons platform will never malfunction? What a relief!

Probably less frequently than the reprobates we sometimes hire slip off base to have a shooting spree among the local civilians, or mutilate corpses, or are bribed to let contractors abscond with millions of dollars in untraceable cash.

100% absolute infallibility is miles and miles above the bar that terminators have to leap to be an improvement, is what I'm getting at here.
2012-11-20 06:44:09 AM
1 vote:

Jim_Callahan: Kind of arguable. Machines also aren't capable of going off the reservation and making bad decisions. Humans are.


You mean the complicated hardware and software to control an autonomous weapons platform will never malfunction? What a relief!

Anyway, I doubt deployment can be banned. How about a law which holds the human who orders the deployment of such machines directly responsible for their actions, with punishments equal to those for having carried out such actions him/herself? You want to deploy a machine that cannot take responsibility for who it shoots? Then you take responsibility for who it shoots.

/general "you", not you specifically, Jim_Callahan :)
2012-11-20 06:21:10 AM
1 vote:

Jim_Callahan: Sgygus: calling for an international treaty outlawing military weapons systems that decide - without a human "in the loop" - when to pull the trigger

This is quite prudent. Machines are not capable of being responsible. Humans are.

Kind of arguable. Machines also aren't capable of going off the reservation and making bad decisions. Humans are.

In some ways, robotic defense systems are a human rights improvement over corruptible human soldiers.

//Also, for the obvious historical reference, consider that international law following WW1 banned the use of aircraft as weapon platforms of any kind. Look up how long that lasted for an idea of how long we can keep autonomous systems that a high schooler with sufficient money could build off the battlefield.


We've already got automated war machines called land mines.
The problem is machines aren't accountable for following orders. They do whatever they were rigged or programmed to do, and if the code is vague enough then they'll take a shot at an airliner as quickly as they'd shoot down an enemy fighter. Putting wheels on a mousetrap doesn't make it sympathetic to the differences between mice and hamsters. At least with a traditional fighter you've got the pilot to blame.
Maybe you could bypass a ban by adding an authorization button, but that means the signatory soldier would be risking his name on a pile of code he didn't write.
...but he could just say he was ordered to sign and it's all good. Who knows.

I doubt they'll pass a ban to begin with, because the idea of using automated guns for perimeter defense is just too tempting for a first world army up against guerrillas.

[image: dl.dropbox.com]
2012-11-20 05:54:06 AM
1 vote:

Sgygus: This is quite prudent. Machines are not capable of being responsible. Humans are.


Yes, because humans have a rich and luxurious history of being peaceful and kind to one another even when in possession of tools that have no explicit purpose other than killing large quantities of other humans.
2012-11-20 05:50:21 AM
1 vote:
1 "Serve the public trust"
2 "Protect the innocent"
3 "Uphold the law"
4 (Classified)
2012-11-20 05:01:56 AM
1 vote:
I propose that all kill bots be programmed with a preset kill limit and once that kill limit is reached, the kill bot must deactivate. Just never announce what that limit is. No commander will ever be cunning enough to deal with that.
2012-11-20 05:00:45 AM
1 vote:
Silly Luddites, you can't stop progress.
2012-11-20 04:45:02 AM
1 vote:
Oh, a HUMAN rights group is against robots. Big farkin' surprise there. Bigots.
 
Displayed 24 of 24 comments



This thread is archived and closed to new comments.






