
(The New Yorker)   Thanks to Google we now need to implement the Three Laws Of Robotics   (newyorker.com)
    More: Scary, Laws of Robotics, robot soldiers, Pol Pot, driverless cars  

6179 clicks; posted to Geek on 28 Nov 2012 at 10:43 AM



65 Comments

Archived thread
 
2012-11-28 06:02:35 PM  

SavvyLemur: i got this gaiz

bool isKaleyCuaco;
foreach (Human item in CurrentView)
{
    isKaleyCuaco = DetermineIfKayleyCuaco(item);
    if (IsKayeyCuaco)
        SendGPSLocationToMobile();
    DetainKayleyCuaco();
}


1) You used the wrong case on line 5
2) Why declare a variable outside the scope where you're using it?
3) You're executing DetainKayleyCuaco once for every person regardless of detection (need a block under the if)
4) Shouldn't you wait for detain to return before you send the GPS coordinates? Otherwise, the coords might be stale by the time it catches her.
5) I think it's "Cuoco"

/Or is that all subtle commentary on their inaccurate portrayal of STEM?
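
(For reference, a version with all five fixes applied; the helper methods are the same made-up names from the joke, so this is purely illustrative C#:)

foreach (Human item in CurrentView)
{
    // declared inside the loop, the only scope that uses it
    bool isKaleyCuoco = DetermineIfKaleyCuoco(item);
    if (isKaleyCuoco)
    {
        // a block under the if, so only actual matches are processed;
        // detain first, then send coordinates, so they aren't stale
        DetainKaleyCuoco(item);
        SendGPSLocationToMobile(item);
    }
}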
 
2012-11-28 06:17:13 PM  

hinten:
Your car is speeding along a bridge at fifty miles per hour when an errant school bus carrying forty innocent children crosses its path. Should your car swerve, possibly risking the life of its owner (you), in order to save the children, or keep going, putting all forty kids at risk? If the decision must be made in milliseconds, the computer will have to make the call.


"At those speeds, metal becomes pliable.
"Me, I'm a cop, so I know the score. I know we're all dead.
But the robot, its brain is a difference engine.
It calculated that I had a better chance of survival ..."

Or something like that.

/not obscure
//not here
 
2012-11-28 06:19:41 PM  
The three laws of robotics are not laws, they are commandments. I mean they are almost written as "thou shalt not..."

Laws have definitions and consequences. A law reads like this:

"A person who owns or controls and a robot which causes harm to human being shall be deemed guilty of assault and sentenced to between..."
or
"A person who owns or controls and a robot which through inaction allows harm to come to a human being shall be deemed guilty of (something else) and sentenced to between..."
or
"A person who owns or controls and a robot which fails to follow instructions from a human being shall be deemed guilty of (well, owning a broken robot will probably be too common to be against the law)

But quite frankly we already have product liability legislation.

Asimov's laws aren't laws, they're commandments. And when he got to the Zeroth one I gave up reading.
 
2012-11-28 06:35:21 PM  

brainlordmesomorph: The three laws of robotics are not laws, they are commandments. I mean they are almost written as "thou shalt not..."

Laws have definitions and consequences. A law reads like this:

"A person who owns or controls and a robot which causes harm to human being shall be deemed guilty of assault and sentenced to between..."
or
"A person who owns or controls and a robot which through inaction allows harm to come to a human being shall be deemed guilty of (something else) and sentenced to between..."
or
"A person who owns or controls and a robot which fails to follow instructions from a human being shall be deemed guilty of (well, owning a broken robot will probably be too common to be against the law)

But quite frankly we already have product liability legislation.

Asimov's laws aren't laws, they're commandments. And when he got to the Zeroth one I gave up reading.


What's the consequence for breaking the laws of thermodynamics? Fourier's law? Newton's law of cooling? Ohm's law? Murphy's law? Poe's law?
 
2012-11-28 06:46:55 PM  

SuperChuck: dittybopper: SuperChuck: way south: I'm willing to wager money that there's no way to translate any of those laws into machine speak.
The first thing a computer won't understand is the difference between a human and any other random object in the road.

Image recognition software is far better than you think it is. Humans are rarely shaped like rocks or trash cans.

They are when they want to be.

In fact, it's a trivial exercise in camouflage to look like a rock or a garbage can.

Plus, you don't have to actually do that: You don't have to look like some other object. Just don't look like a stereotypical human. No image recognition software is going to know what everything looks like.

All the software has to do is recognize that there's something in the road that it probably shouldn't just drive over. No matter what goofy thing you're doing, a human is probably going to fall into that category. If you want to lie in the road and disguise yourself as a speed bump, maybe you should get run over.


The critical point you mention is that the computer doesn't care, or need, to differentiate a human from any other human-sized thing.
If you could make it understand concepts like "harm" then it would (at this stage) apply the do-not-harm rule to a mailbox or a human equally.

It's lacking far too many concepts to even begin to understand the moral quandaries that Asimov came up with.
 
2012-11-28 06:56:43 PM  

way south: SuperChuck: dittybopper: SuperChuck: way south: I'm willing to wager money that there's no way to translate any of those laws into machine speak.
The first thing a computer won't understand is the difference between a human and any other random object in the road.

Image recognition software is far better than you think it is. Humans are rarely shaped like rocks or trash cans.

They are when they want to be.

In fact, it's a trivial exercise in camouflage to look like a rock or a garbage can.

Plus, you don't have to actually do that: You don't have to look like some other object. Just don't look like a stereotypical human. No image recognition software is going to know what everything looks like.

All the software has to do is recognize that there's something in the road that it probably shouldn't just drive over. No matter what goofy thing you're doing, a human is probably going to fall into that category. If you want to lie in the road and disguise yourself as a speed bump, maybe you should get run over.

The critical point you mention is that the computer doesn't care, or need, to differentiate a human from any other human-sized thing.
If you could make it understand concepts like "harm" then it would (at this stage) apply the do-not-harm rule to a mailbox or a human equally.

It's lacking far too many concepts to even begin to understand the moral quandaries that Asimov came up with.


You guys are all missing the whole "assume my passengers are human; collisions might harm my passengers, avoid slamming into anything in the road, be it a man, a rock, or a man dressed in a convincing mailbox costume," aspect of that law. It's not going to go, "that obstacle is non-human; collision not a problem!"
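
(ProfessorOhki's point, sketched in the thread's C#; the sensor API, types, and threshold here are all invented for illustration:)

// Avoidance doesn't require classifying the obstacle as human,
// only noticing that something big enough is in the path.
foreach (DetectedObject obj in Sensors.ObjectsInPath())
{
    // A man, a rock, or a man in a convincing mailbox costume all clear this bar.
    if (obj.EstimatedHeightMeters > 0.25)
    {
        BrakeAndSteerAround(obj);
    }
}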
 
2012-11-28 07:00:35 PM  

tricycleracer: But the whole discussion is pretty moot because, well, have you ever seen a car hit a school bus?


Have you?
 
2012-11-28 07:02:49 PM  

ghare: hinten: 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.


Example FTFA:
Your car is speeding along a bridge at fifty miles per hour when an errant school bus carrying forty innocent children crosses its path. Should your car swerve, possibly risking the life of its owner (you), in order to save the children, or keep going, putting all forty kids at risk? If the decision must be made in milliseconds, the computer will have to make the call.

I say: Let the robot go Galt and see how that philosophy works out.

You could have two settings: Democrat or Republican. If you're Republican, the bus ~~loses~~ wins. If you're a Democrat, the bus ~~wins~~ loses.


FTFY
 
2012-11-28 07:18:47 PM  

miscreant: Oysterman: The whole "it should be illegal for humans to drive in the future" has my inner libertarian foaming at the mouth (granted, the phrase 'inner libertarian' already implies a minor case of rabies)

Don't worry, there will probably be amusement parks where you'll be able to go and drive cars "like they did in the olden days"


On Sundays you'll elude the Eyes and hop the Turbine Freight to far outside the Wire where your white-haired uncle waits.
 
2012-11-28 07:22:42 PM  

ProfessorOhki: SavvyLemur: i got this gaiz

bool isKaleyCuaco;
foreach (Human item in CurrentView)
{
    isKaleyCuaco = DetermineIfKayleyCuaco(item);
    if (IsKayeyCuaco)
        SendGPSLocationToMobile();
    DetainKayleyCuaco();
}

1) You used the wrong case on line 5
2) Why declare a variable outside the scope where you're using it?
3) You're executing DetainKayleyCuaco once for every person regardless of detection (need a block under the if)
4) Shouldn't you wait for detain to return before you send the GPS coordinates? Otherwise, the coords might be stale by the time it catches her.
5) I think it's "Cuoco"

/Or is that all subtle commentary on their inaccurate portrayal of STEM?


No subtle commentary. I'm just a newb.
 
2012-11-28 07:41:44 PM  

ProfessorOhki: way south: SuperChuck: dittybopper: SuperChuck: way south: I'm willing to wager money that there's no way to translate any of those laws into machine speak.
The first thing a computer won't understand is the difference between a human and any other random object in the road.

Image recognition software is far better than you think it is. Humans are rarely shaped like rocks or trash cans.

They are when they want to be.

In fact, it's a trivial exercise in camouflage to look like a rock or a garbage can.

Plus, you don't have to actually do that: You don't have to look like some other object. Just don't look like a stereotypical human. No image recognition software is going to know what everything looks like.

All the software has to do is recognize that there's something in the road that it probably shouldn't just drive over. No matter what goofy thing you're doing, a human is probably going to fall into that category. If you want to lie in the road and disguise yourself as a speed bump, maybe you should get run over.

The critical point you mention is that the computer doesn't care, or need, to differentiate a human from any other human-sized thing.
If you could make it understand concepts like "harm" then it would (at this stage) apply the do-not-harm rule to a mailbox or a human equally.

It's lacking far too many concepts to even begin to understand the moral quandaries that Asimov came up with.

You guys are all missing the whole "assume my passengers are human; collisions might harm my passengers, avoid slamming into anything in the road, be it a man, a rock, or a man dressed in a convincing mailbox costume," aspect of that law. It's not going to go, "that obstacle is non-human; collision not a problem!"


The programmer presumes you have human cargo and that you don't want the car to hit anything.
The car itself? It would care about as much as a guided bomb.
It might have sensors to detect if "weight" is in its chairs, but it's not treating so many pounds of humans any differently than it would a couple of sandbags.
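
(way south's point, as a sketch; the sensor reading, method name, and thresholds are invented:)

// A seat occupancy sensor reports mass, not humanity.
bool SeatOccupied(double seatLoadKg)
{
    // 40 kg of passenger and 40 kg of sandbags read exactly the same here.
    return seatLoadKg > 20.0;
}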
 
2012-11-28 08:17:15 PM  

way south: The programmer presumes you have human cargo and that you don't want the car to hit anything.
The car itself? It would care about as much as a guided bomb.
It might have sensors to detect if "weight" is in its chairs, but it's not treating so many pounds of humans any differently than it would a couple of sandbags.


There's no useful distinction between the two. The programmer writing the intelligence to assume a weight in the chair is a human is functionally identical to the car (the car and the control software being referred to jointly) assuming the weight in the chair is a human.

I get where you're going with the whole "determining what is truly human vs. human-shaped is a nearly impossible problem" angle. But you get a very Chinese Room question in that, if you want to get right down to it, neither you nor I can truly identify one another as human right now. To even suggest that you could hardcode "don't harm humans" implies you can hardcode definitions of "harm" and "human." Either that's done at a very superficial level ("don't exert X lbs of force on things shaped like a person"), or it's some sort of dynamic heuristic, in which case it's not a law to begin with; it's just an instinct that applies differently based on the experiences of the robot.

At the superficial level, could such a robot perform a vaccination? Injecting a human with an attenuated virus presents a chance for harm, but at the same time, massively reduces the chance for future harm. Like it's been said a dozen times in this thread, the three laws were just an example of how mechanical absolutes don't translate well to human society. The practicality of identifying "human" or "harm" isn't relevant to the theme; the incompatibility of absolutes and human-like understanding on the other hand....
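
(The "superficial level" version really is just a shape-plus-force check; every name and number below is invented to make that concrete:)

// Hardcoding "don't harm humans" means hardcoding stand-ins for both words.
bool ViolatesFirstLaw(DetectedObject obj, double appliedForceLbs)
{
    bool looksHuman = obj.SilhouetteMatchScore > 0.9; // "human" := person-shaped
    const double MaxSafeForceLbs = 30.0;              // "harm" := more than X lbs
    return looksHuman && appliedForceLbs > MaxSafeForceLbs;
}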
 
2012-11-29 07:31:58 AM  

SuperChuck: dittybopper: SuperChuck: way south: I'm willing to wager money that there's no way to translate any of those laws into machine speak.
The first thing a computer won't understand is the difference between a human and any other random object in the road.

Image recognition software is far better than you think it is. Humans are rarely shaped like rocks or trash cans.

They are when they want to be.

In fact, it's a trivial exercise in camouflage to look like a rock or a garbage can.

Plus, you don't have to actually do that: You don't have to look like some other object. Just don't look like a stereotypical human. No image recognition software is going to know what everything looks like.

All the software has to do is recognize that there's something in the road that it probably shouldn't just drive over. No matter what goofy thing you're doing, a human is probably going to fall into that category. If you want to lie in the road and disguise yourself as a speed bump, maybe you should get run over.


I was thinking more along the lines of a military context.
 
2012-11-29 07:58:13 AM  

ProfessorOhki: You guys are all missing the whole "assume my passengers are human; collisions might harm my passengers, avoid slamming into anything in the road, be it a man, a rock, or a man dressed in a convincing mailbox costume," aspect of that law. It's not going to go, "that obstacle is non-human; collision not a problem!"


That's why the car's top speed might be 2 MPH, so the car does not risk harm to its occupants.
 
Displayed 15 of 65 comments



This thread is archived, and closed to new comments.
