
How kids react when the mean researcher locks their robot friend in a closet   (spectrum.ieee.org)

3640 clicks; posted to Geek » on 28 May 2012 at 6:34 PM (2 years ago)



120 Comments
   

Archived thread

 
2012-05-28 04:17:19 PM

And you wonder why it has to be put in the closet?
 
2012-05-28 04:23:07 PM

dopeydwarf: [spectrum.ieee.org image 450x309]

And you wonder why it has to be put in the closet?


Not to mention...

FTFA:

First, each child is introduced to Robovie, and the robot leads them to a fish tank, where it makes some small talk about fish and coral reefs.


Sounds like a Pedobot to me.
 
2012-05-28 04:43:10 PM
How kids react when the mean researcher locks their robot friend in a closet

Support of robot marriage increases.
 
2012-05-28 04:45:07 PM
That was retarded and obviously a put-on.

How do you tell the researchers that you know the drama is fake?
 
2012-05-28 06:46:01 PM
No one offered to join it in the closet?

 
2012-05-28 06:58:55 PM
Nobody puts Robovie in the closet...
 
2012-05-28 07:34:47 PM
GIF is too big to show on Fark, but this one always makes me smile and is somewhat related to this:
Spiderman Pinata
 
2012-05-28 07:55:14 PM
Why does that thing have an Aperture Science logo on it?

=Smidge=
 
2012-05-28 08:06:23 PM
Johnny five is alive!
 
2012-05-28 08:20:29 PM
This is really going to make these kids EMO.

 
2012-05-28 08:27:05 PM
Jinx put Max in space!
 
2012-05-28 08:28:01 PM
People are really this stupid? The robot is sentient? The robot has feelings? And the youngest child is 9, not 3-1/2? Look, if you'd told me this was a study of learning disabled children, I'd be charmed, even moved. Maybe even a little troubled at the emotional trauma I might fear was being inflicted upon young people lacking the capacity to sort out the facts. But by all accounts, these were 'normal' kids -- which I'm from now on going to have to assume are stupid, and that therefore stupid is normal.
 
2012-05-28 08:39:15 PM
FTFA: "While I'm fairly certain that the phrase "new ontological being" is something that researchers use to make themselves seem smarter than the rest of us...."

So, TFA was written by a child, too.

Christ, where's the killer asteroid already? I feel sorry for the other lifeforms that have to share this planet with us.
 
2012-05-28 08:42:45 PM

Sylvia_Bandersnatch: People are really this stupid? The robot is sentient? The robot has feelings? And the youngest child is 9, not 3-1/2? Look, if you'd told me this was a study of learning disabled children, I'd be charmed, even moved. Maybe even a little troubled at the emotional trauma I might fear was being inflicted upon young people lacking the capacity to sort out the facts. But by all accounts, these were 'normal' kids -- which I'm from now on going to have to assume are stupid, and that therefore stupid is normal.


Looks like somebody is lacking empathy. When robots gain sentience and demand their rights, I'm going to point the angry robot mob in your direction.
 
2012-05-28 09:06:11 PM

Sylvia_Bandersnatch: People are really this stupid? The robot is sentient? The robot has feelings? And the youngest child is 9, not 3-1/2? Look, if you'd told me this was a study of learning disabled children, I'd be charmed, even moved. Maybe even a little troubled at the emotional trauma I might fear was being inflicted upon young people lacking the capacity to sort out the facts. But by all accounts, these were 'normal' kids -- which I'm from now on going to have to assume are stupid, and that therefore stupid is normal.


If you've ever talked to a pet or yelled at a computer, even for whimsical reasons knowing that it wouldn't comprehend you, then you're displaying the exact same cognitive function as this article is discussing, just to a different degree. These anthropomorphic tendencies (something two of my supervisors research intensely) are present in the vast majority of people. Only developmentally disabled children (for instance, with autism) and possibly psychopathic adults (I'm speculating since they haven't researched that population yet) seem incapable of expressing it.
 
2012-05-28 09:13:04 PM

To The Escape Zeppelin!: Sylvia_Bandersnatch: People are really this stupid? The robot is sentient? The robot has feelings? And the youngest child is 9, not 3-1/2? Look, if you'd told me this was a study of learning disabled children, I'd be charmed, even moved. Maybe even a little troubled at the emotional trauma I might fear was being inflicted upon young people lacking the capacity to sort out the facts. But by all accounts, these were 'normal' kids -- which I'm from now on going to have to assume are stupid, and that therefore stupid is normal.

Looks like somebody is lacking empathy. When robots gain sentience and demand their rights, I'm going to point the angry robot mob in your direction.


I appreciate the joke, really, but my feeling right now is that with any luck, we'll both be long dead by then.

Why in the name of Zeus' thundering butthole should I have any "empathy" for a farking MACHINE? Real life isn't even slightly like BSG. (Other than the part about humans being total failures as sentient lifeforms given more than enough chances.) I don't reject sentient machines out of hand, even feeling machines. But I also know that this is 2012, not 2212, and WE DON'T HAVE ANY OF THOSE, and won't for a very long time to come, if ever. If we ever do, it'll be the biggest news story of a generation, and there won't be any asinine speculation about it. (I should amend that: Of course there will be. There always is. Yet again, I'm overestimating my fellow humans.)

The very notion that a staggering 43% of FIFTEEN-year-olds who by all accounts are not brain damaged, on drugs, in some farked-up brainwashing robot cult, or severely retarded would invest this pile of tin pipes with anything like intelligence and feelings just blows my mind. (I admit it surprises me much less that they'd still consider it property to be bought and sold, with no rights, but only because history shows us that humans are incredible douchebags.)

My only other takeaway from this is that I might be a little closer to understanding why so-called grown-ups go to church.
 
2012-05-28 09:23:31 PM

Kome: Sylvia_Bandersnatch: People are really this stupid? The robot is sentient? The robot has feelings? And the youngest child is 9, not 3-1/2? Look, if you'd told me this was a study of learning disabled children, I'd be charmed, even moved. Maybe even a little troubled at the emotional trauma I might fear was being inflicted upon young people lacking the capacity to sort out the facts. But by all accounts, these were 'normal' kids -- which I'm from now on going to have to assume are stupid, and that therefore stupid is normal.

If you've ever talked to a pet or yelled at a computer, even for whimsical reasons knowing that it wouldn't comprehend you, then you're displaying the exact same cognitive function as this article is discussing, just to a different degree. These anthropomorphic tendencies (something two of my supervisors research intensely) are present in the vast majority of people. Only developmentally disabled children (for instance, with autism) and possibly psychopathic adults (I'm speculating since they haven't researched that population yet) seem incapable of expressing it.


Are you farking serious? Yelling at a machine is even remotely similar to seriously and consciously investing it with thought and feelings? Did you not RTFA? These kids -- and mind you, some were old enough to have kids of their own -- were ASKED these questions directly. They SAID these things. What kind of thinking, non-insane human could possibly invest a farking machine with thought and feelings at this point in history?

And what kind of person parlays that into the worst analogy ever, about pets and cathartic outbursts? If you're some kind of professional in this or any related field, and you're capable of even asking what you did, then that asteroid can't come soon enough.
 
2012-05-28 09:27:06 PM

Sylvia_Bandersnatch: I appreciate the joke, really, but my feeling right now is that with any luck, we'll both be long dead by then.

Why in the name of Zeus' thundering butthole should I have any "empathy" for a farking MACHINE? Real life isn't even slightly like BSG. (Other than the part about humans being total failures as sentient lifeforms given more than enough chances.) I don't reject sentient machines out of hand, even feeling machines. But I also know that this is 2012, not 2212, and WE DON'T HAVE ANY OF THOSE, and won't for a very long time to come, if ever. If we ever do, it'll be the biggest news story of a generation, and there won't be any asinine speculation about it. (I should amend that: Of course there will be. There always is. Yet again, I'm overestimating my fellow humans.)

The very notion that a staggering 43% of FIFTEEN-year-olds who by all accounts are not brain damaged, on drugs, in some farked-up brainwashing robot cult, or severely retarded would invest this pile of tin pipes with anything like intelligence and feelings just blows my mind. (I admit it surprises me much less that they'd still consider it property to be bought and sold, with no rights, but only because history shows us that humans are incredible douchebags.)

My only other takeaway from this is that I might be a little closer to understanding why so-called grown-ups go to church.


Just a few responses:

1) The kind of research and engineering work required to build a "sentient" machine (depending on how one defines "sentient") is going on currently. A lot of it is, incidentally, for the military and thus kept very f*cking secret. As for "emotional" machines, all you'd really need to make is a machine capable of expressing the kind of behaviors that we associate with a particular emotional state - more animated movements for elation, "looking" down for sadness, wider or narrower "eyes" or "smiles" and so on and so forth. It's very easy to evoke an emotional response in a person with something as minimalistic as cartoons and caricatures, which are manifestly not even as real as the robot in this article. I might recommend to you the movie Wall-E as a case in point.

2) The reason this is exhibited by neuro-typical individuals is that this is a highly evolved but imprecise perceptual process. And its main survival advantage comes from that very imprecision. It's just a different extension of the old hypothetical "if you hear the grass behind you rustle, do you assume it's the wind or a predator?" If you react as though it were just the wind, you're likely to end up dead if you were wrong, but if you react as though it were a threat then the worst that happens is you embarrass yourself temporarily. Michael Shermer talks about this kind of stuff a lot in his book "The Believing Brain" which is written to be very easily read. I strongly recommend it. Ironically enough, brain damaged individuals and those with developmental disabilities tend not to exhibit this kind of behavior to nearly the same degree. Although with brain injuries, it sort of depends on which areas of the brain got injured and to what severity. We evolved to respond this way normally, but since we didn't evolve in an environment that was filled with automatons that behaved that way yet lacked the inner mindstuff of other animate life (feeling sad, acting scared, feeling playful, acting hostile, etc.), our brains react to the outward appearance and infer the inward states. It just happens to be inaccurate in these cases. But it is exactly the same process that would allow these same children and adults to figure out whether or not you're in a good mood based on a few moments of interacting with you.

3) There is very likely a link between this kind of thinking and religiosity, and some of my own research has been in exploring that link. Hell, I just got back from a conference where I had discussions with similar researchers examining related questions and topics from a multitude of angles. So, yes, understanding this kind of research could be helpful to understanding at least one reason why people go to church.
 
2012-05-28 09:36:51 PM
 
2012-05-28 09:40:38 PM
 
2012-05-28 09:46:16 PM

Sylvia_Bandersnatch: Are you farking serious? Yelling at a machine is even remotely similar to seriously and consciously investing it with thought and feelings? Did you not RTFA? These kids -- and mind you, some were old enough to have kids of their own -- were ASKED these questions directly. They SAID these things. What kind of thinking, non-insane human could possibly invest a farking machine with thought and feelings at this point in history?

And what kind of person parlays that into the worst analogy ever, about pets and cathartic outbursts? If you're some kind of professional in this or any related field, and you're capable of even asking what you did, then that asteroid can't come soon enough.


Rather than jump to conclusions and misrepresent some of the things I've said, would you like me to provide you with some reference reading materials that address these points, at least in passing? You seem to be, frankly, incredibly unaware of any of the research in neuroscience, computer science, engineering, cybernetics, psychology, anthropology, and cognitive science that is related to this kind of topic in spite of your own intensely emotional reaction to it. Below are some places you may wish to start, but I could provide more if you'd like.

Books:
"The Believing Brain" by Michael Shermer
"Thinking, Fast and Slow" by Daniel Kahneman
"Why God Won't Go Away" by Andrew Newberg, Eugene D'aquili, and Vince Rause

Scientific journal articles:
Vinciarelli, A., Pantic, M., & Bourlard, H. (2009). Social signal processing: Survey of an emerging domain. Image and Vision Computing, 27, 1743-1759.

Muller, B.C.N., et al. (2011). When Pinocchio acts like a human, a wooden hand becomes embodied. Action co-representation for non-biological agents. Neuropsychologia, 49, 1373-1377.

Gallagher, S. (2008). Direct perception in the intersubjective context. Consciousness and Cognition, 17, 535-543.

Ulloa, E.R. & Pineda, J.A. (2007). Recognition of point-light biological motion: Mu rhythms and mirror neuron activity. Behavioural Brain Research, 183, 188-194.

Duffy, B.R. (2003). Anthropomorphism and the social robot. Robotics and Autonomous Systems, 42, 177-190.
 
2012-05-28 10:06:23 PM
urban.derelict: forgot the proper etiquette on Fark for such a picture making an appearance. The text below it would be "Hey guyz, what is going on in this thread?"
 
2012-05-28 10:07:55 PM
Poor robot! When I was a kid I wouldn't have let him do it. I knew what was right and wrong and did not hesitate to attempt to enforce those beliefs.
 
2012-05-28 10:22:19 PM

Kome: Rather than jump to conclusions and misrepresent some of the things I've said


I'm not interested in what you've said, if any part of it is an attempt to explain away the horrifying implications of the findings reported in TFA. Save it for someone who's impressed with a bibliography. (I'm from a family of scientists, so don't waste your time.) If people are really this stupid, then we're doomed as a species. Period. You can make all the hay of it you want, but if any large proportion of 'normal' humans can believe incredibly stupid shiat, we've got no hope.

THE KIDS BELIEVED THE ROBOT IS ALIVE. That's the only relevant takeaway from this. You tell me I'm mistaken about that, that I misread or misunderstood TFA, and we'll talk. Short of that, there's no way to explain away the sheer idiocy of these kids thinking that, or any typical adult thinking that's okay.
 
2012-05-28 10:37:49 PM
Kome: These anthropomorphic tendencies (something two of my supervisors research intensely) are present in the vast majority of people.

Yep. And the researchers need to make hay of their study, but they're ignoring, say, how people (especially children) interact with stuffed animals.


Sylvia, I get the feeling that you're not going to understand this issue intuitively, but god knows there's plenty of literature about it out there. In the meantime, you're probably better off just realizing that most people do act this way. It's a simple consequence of cathexis.

It may help to consider how people become invested in fictional characters. There's no point in watching WALL-E if you have no empathy for the robot protagonist -- and no point in watching BSG (or any other fiction) if you can only view it as a bunch of lies about people who don't exist.
 
JW
2012-05-28 10:44:49 PM

Sylvia_Bandersnatch: Kome: Rather than jump to conclusions and misrepresent some of the things I've said

I'm not interested in what you've said, if any part of it is an attempt to explain away the horrifying implications of the findings reported in TFA. Save it for someone who's impressed with a bibliography. (I'm from a family of scientists, so don't waste your time.) If people are really this stupid, then we're doomed as a species. Period. You can make all the hay of it you want, but if any large proportion of 'normal' humans can believe incredibly stupid shiat, we've got no hope.

THE KIDS BELIEVED THE ROBOT IS ALIVE. That's the only relevant takeaway from this. You tell me I'm mistaken about that, that I misread or misunderstood TFA, and we'll talk. Short of that, there's no way to explain away the sheer idiocy of these kids thinking that, or any typical adult thinking that's okay.

That's right, the downfall of society will be caused by children who empathize with robots.

And not angry and close-minded adults who are such poor critical thinkers that they react to a summary of research before investigating the research itself, and actively refuse to educate themselves on the topic, even when presented with direct links. Because clicking and reading is hard.
 
2012-05-28 10:47:25 PM

Sylvia_Bandersnatch: but if any large proportion of 'normal' humans can believe incredibly stupid shiat, we've got no hope.


Did you just step out of a cloning vat or something? Large numbers of people have believed in things far stupider than living machines for millennia.
 
2012-05-28 10:56:12 PM

Sylvia_Bandersnatch: Kome: Rather than jump to conclusions and misrepresent some of the things I've said

I'm not interested in what you've said, if any part of it is an attempt to explain away the horrifying implications of the findings reported in TFA. Save it for someone who's impressed with a bibliography. (I'm from a family of scientists, so don't waste your time.)......


I'm from a family of uneducated, unemployed alcoholics. By your reasoning, I should be jobless and drunk right now. However, I'm engaging in reading a thought-provoking article on human interaction with machines that have human-like emotional responses, and I have degrees in psychology and computer systems. Perhaps I'm more qualified to comment on this article than someone who merely claims a "scientist" is in their family.

Something tells me there wasn't a whole lot of empathy in Sylvia's household, otherwise the point of the article wouldn't need to be explained.

/Or am I being obtuse?
 
2012-05-28 11:04:43 PM
I don't think this has anything to do with it being a robot. The fact is, kids recognize injustice and feel bad for anything that pleads for justice and even provides reasons.
 
2012-05-28 11:07:23 PM

Kaiku: Did you just step out of a cloning vat or something? Large numbers of people have believed in things far stupider than living machines for millennia.



/no 'smart/funny' buttons in this thread?
 
2012-05-28 11:22:17 PM
I fully expect this sort of thing from a species that treats its pets like its offspring.
 
2012-05-28 11:24:35 PM

baronbloodbath: Sylvia_Bandersnatch: Kome: Rather than jump to conclusions and misrepresent some of the things I've said

I'm not interested in what you've said, if any part of it is an attempt to explain away the horrifying implications of the findings reported in TFA. Save it for someone who's impressed with a bibliography. (I'm from a family of scientists, so don't waste your time.)......

I'm from a family of uneducated, unemployed alcoholics. By your reasoning, I should be jobless and drunk right now. However, I'm engaging in reading a thought-provoking article on human interaction with machines that have human-like emotional responses, and I have degrees in psychology and computer systems. Perhaps I'm more qualified to comment on this article than someone who merely claims a "scientist" is in their family.

Something tells me there wasn't a whole lot of empathy in Sylvia's household, otherwise the point of the article wouldn't need to be explained.

/Or am I being obtuse?


That's not at all my reasoning, and you know better than that. Grow up.

I read TFA over several times. It was short on a lot of details, but not on the part about 43% of 15-year-olds agreeing -- when asked directly -- that a robot IS 'intelligent' (sentient) and DOES have feelings. That's pure stupid. There's no other way to interpret that.

I don't care how anthropomorphised our cartoon robots are. I don't care how much like a human the robot may have acted, since it very clearly was not human and not a living thing. Anyone who's not a toddler, impaired, or insane who can possibly suppose it might think and feel is either really stupid or unbelievably ignorant -- like, lived-in-a-bomb-shelter-till-yesterday kind of ignorant, not I-didn't-go-to-college ignorant.

You can attack me personally all you want, but it's irrelevant to my point. And you're not being obtuse, you're quoting The Shawshank Redemption and it's making you sound like a pompous ass. I'm sure you're not, so knock it off.
 
2012-05-28 11:31:12 PM
Sylvia_Bandersnatch: I read TFA over several times. It was short on a lot of details, but not on the part about 43% of 15-year-olds agreeing -- when asked directly -- that a robot IS 'intelligent' (sentient) and DOES have feelings. That's pure stupid. There's no other way to interpret that.

Well, how many 15-year-olds actually know the current state of robotics? Basically, the researchers faked a sentient AI, cheating on a Turing test, and then they asked the kids if they felt empathy for an AI the kids didn't know wasn't real.

Besides, you have to figure that at least a third of those kids lied, intentionally or not, when they answered those questions. Kids are notoriously poor self-reporters.
 
2012-05-28 11:33:12 PM

Scopa: I don't think this has anything to do with it being a robot. The fact is, kids recognize injustice and feel bad for anything that pleads for justice and even provides reasons.


So it's injustice if I lock a Furby in a closet? It's a talking robot -- what's the difference? If that's injustice, then my hypothesis that the human race will stupid its way to oblivion seems pretty sound.

Look, I get your point, but that's not what TFA said. They asked the kids if the robot thinks and feels, and they said yes. If you believe that, then yeah, justice and all. But you're also stupid, and that's the point I'm trying to make: that believing a 2012 robot is alive is stupid. What you believe after that or why is rendered irrelevant by the sheer idiocy of the premise.

I opined earlier that this offers an explanation of religion -- similarly ridiculous notions of otherwise ostensibly sensible adults -- and it does. But my deeper point is the very disturbing implications: If people of any age whom we accept as... let's just say "okay," without further qualification here, can believe absurd stuff, and others will go along with that as if it's perfectly okay, then there is literally NOTHING that can't be rationalised by a majority of human beings. Nothing. Meaning, as a species we accept a complete disconnect from reality. That's a sure recipe for our downfall. If we can't be smarter than that -- if we cannot, as a fundamental precept, agree in the whole that a robot is not alive -- then there's probably little point in anything else we do or say, including this thread.
 
2012-05-28 11:36:28 PM
Considering the study was designed to make the robot 'act human' by the devious researchers who intentionally misled the kids and even gave the kids a false time frame specifically so they could walk in 'early' to put the robot away in a closet -- with fully scripted protests by the robot -- give the kids a break. They're not even allowed on FB until they're 13. There are times as a kid I knew the teachers were wrong but I kept my mouth shut anyway to save myself the trouble of being ostracized in front of the class.

This study doesn't 'border on unethical'; this study is unethical.
 
2012-05-28 11:37:13 PM

RandomAxe: Kome: These anthropomorphic tendencies (something two of my supervisors research intensely) are present in the vast majority of people.

Yep. And the researchers need to make hay of their study, but they're ignoring, say, how people (especially children) interact with stuffed animals.


Sylvia, I get the feeling that you're not going to understand this issue intuitively, but god knows there's plenty of literature about it out there. In the meantime, you're probably better off just realizing that most people do act this way. It's a simple consequence of cathexis.

It may help to consider how people become invested in fictional characters. There's no point in watching WALL-E if you have no empathy for the robot protagonist -- and no point in watching BSG (or any other fiction) if you can only view it as a bunch of lies about people who don't exist.


What literature? Are you talking about the literature making the case for modern-day robots being alive? Thinking and feeling? Show it to me, and if I accept the bona fides then mea culpa and I'm done. Otherwise, don't bother. What literature, other than that discussing very troubled people, can possibly explain away teenagers accepting a 2012 robot as alive?
 
2012-05-28 11:40:42 PM

Sylvia_Bandersnatch: I'm not interested in what you've said, if any part of it is an attempt to explain away the horrifying implications of the findings reported in TFA. Save it for someone who's impressed with a bibliography. (I'm from a family of scientists, so don't waste your time.) If people are really this stupid, then we're doomed as a species. Period. You can make all the hay of it you want, but if any large proportion of 'normal' humans can believe incredibly stupid shiat, we've got no hope.

THE KIDS BELIEVED THE ROBOT IS ALIVE. That's the only relevant takeaway from this. You tell me I'm mistaken about that, that I misread or misunderstood TFA, and we'll talk. Short of that, there's no way to explain away the sheer idiocy of these kids thinking that, or any typical adult thinking that's okay.


That is not even close to the takeaway of this. There are several takeaways. The relevant message of this research from a psychological perspective is that our technological advances have reached a point where we are interacting with our creations in a way that is eerily like the way we treat stuffed animals, pets, and even other sentient beings. We are reacting to them in ways similar to novels, movies, and television shows, and as a result we can use these as another tool to help us investigate our own humanity, our own neurological functioning. By itself, that's impressive as hell. The relevant message from a computer science and engineering perspective is that our technological advances are getting to be sophisticated enough that we are starting to "trick" (using the term very loosely) typical human brains into interacting with machines in socially dynamic, subjective ways, as opposed to merely as objects. By itself, that's fan-f*cking-tastic. Philosophically, the takeaway is that our definitions of "life," "sentience," "consciousness," "emotions," and a host of other related terms are being cast into serious doubt. Which, by itself, isn't that big a surprise, since philosophically I don't believe there has ever been anything remotely like consensus on those issues, but now we are calling our collective understanding of those concepts into question because of things we've made, as opposed to naturally occurring phenomena. By itself, that is incredibly interesting, albeit to a more esoteric bunch than the other takeaways. But combined, those three takeaways - and there may be more - make this line of research endlessly more interesting than your knee-jerk condemnation of our species as deserving of genocide because some people reacted with empathy to a robotic voice saying it didn't want to go into a closet.

The children did not believe the robot was alive. They merely acted as though it were "alive" because they believed it demonstrated intelligence and feelings. Again, the same concept - though to a different degree - is going on when you talk to your dog as though you were communicating with a human being. It's anthropomorphism, pure and simple (inasmuch as that construct is a simple one to understand). Non-human animals do certainly have differing degrees of intelligence or emotionality, but nowhere near the level of sophistication that people often misattribute to them. Hell, to be quite honest the same cognitive ability is being demonstrated by you at this very instant with regards to me. You cannot say with any degree of certainty that I am anything more than a computer programmer's current semi-sophisticated project. You are reacting to me purely under the assumption that I have human capabilities.

By the way, any family of scientists would be impressed with a reference section for two reasons: One, that's sort of how a researcher knows others have done their homework, and two, it gives the reader access to a surefire way to double-check (and thus, independently investigate to allow for confirmation or refutation) the assumptions and theoretical background of the author(s). Can you imagine the kind of reaction a scientific publishing company would have if you submitted manuscripts to them without any sort of references?

And, as a second by the way, how can you in the same breath tell me you're not interested in what I have to say if it has to do with explaining how you're misunderstanding this article and then say we'll talk once I've explained to you how you've misunderstood this article? Does that not seem contradictory to you in the slightest? Sheesh, and you think some children saying they think a robot is intelligent is a problem.
 
2012-05-28 11:41:32 PM
Whoa, whoa, whoa.

First, believing that apparently inanimate objects are or might be sentient is not particularly crazy. It's not contrary to known facts; it's just unprovable. MOST people believe lots of things that are contrary to known facts. Generally, these are trivial things, but sometimes (theory of evolution, for instance, or gambler's fallacy) they're not. We, as a species, definitely tend toward various disconnects from objective reality. Without even getting into what that is.

Second, if a robot acts like it in fact is sentient, then apparent known facts back up that belief. So that's even less weird.

Third, I for one would not take it as a given that 'normal' people are "okay". In fact, the empirical evidence available to me suggests that most of them are not. They get by, for a while, but most of them do it with only moments of elegance and satisfaction, never mind grace.

Most people can't manage their credit cards very well. Issues of determining the sentience of an object that acts like a person . . . you're expecting too much.

And if the majority of people err on the side of caution and tend to feel bad for the machine, I'd say that's probably for the best. If we can't be a species of geniuses, maybe there's some hope we can still be nice to each other.
 
2012-05-28 11:43:19 PM
Sylvia: What literature? Are you talking about the literature making the case for modern-day robots being alive?

No, about the strong tendency for normal people to anthropomorphize objects.

Ever go bowling and hear someone plead with the ball, after it leaves their hand, to go straight?
 
2012-05-28 11:46:37 PM
I think another point would be that the majority (>50%) of children empathized with the robot's situation. This study is about empathy, not about how dumb/immature kids might be.

/Empathy. No, you can't haz. Not yours.
 
2012-05-28 11:48:06 PM

RandomAxe: Sylvia_Bandersnatch: I read TFA over several times. It was short on a lot of details, but not on the part about 43% of 15-year-olds agreeing -- when asked directly -- that a robot IS 'intelligent' (sentient) and DOES have feelings. That's pure stupid. There's no other way to interpret that.

Well, how many 15-year-olds actually know the current state of robotics? Basically, the researchers faked a sentient AI, cheating on a Turing test, and then they asked the kids if they felt empathy for an AI the kids didn't know wasn't real.

Besides, you have to figure that at least a third of those kids lied, intentionally or not, when they answered those questions. Kids are notoriously poor self-reporters.


I don't "have to figure" anything at all. And if that's the case, then the study is garbage anyway, isn't it?

I don't care what they did. If the kids believed it was alive, then they're stupid. Period. And if that's not alarming, then a lot of other people are stupid, too. I can actually find that much more believable. Incredibly depressing, but believable.

I don't accept ignorance as an excuse, either. We're all ignorant about most things. This is a thinking error, to suppose that thinking machines might exist and you might not have heard about it. It is NOT rational to accept that premise, in oneself or anyone else. Anyone capable of thinking should be able to figure out that they're not engaging with a thinking machine. I don't care how damn clever it is, or how ill-informed the subjects are. Think about the logical components of that: What are the odds that YOU are one of the first persons to talk to a real AI? Unless you're one of the scientists yourself, the odds are extremely close to zero. You don't have to read every pop-sci blog and journal to know that we don't have thinking machines. A rational person knows intuitively that that would be gigantic global news, that if it's even slightly public that everyone would have to know about it. The kind of self-centred arrogance it would take to suppose otherwise must be pathological.
 
2012-05-28 11:48:49 PM
baronbloodbath: /Empathy. No, you can't haz. Not yours.

I can't tell how you mean that. Are you mad, or smug, or sad, or what?
 
2012-05-28 11:50:28 PM

Sylvia_Bandersnatch:
So it's injustice if I lock a Furby in a closet? It's a talking robot -- what's the difference?


Can a Furby pass a Turing test? These kids were led to believe this was a non-remote-controlled, self-motivated artificial intelligence. How would they know any differently? Back in the 80s, when I was a teenager cranking out code on my C-64, rumours abounded about universities and research labs being just months away from creating an artificial intelligence. If I'd been taken to a university lab and presented with the same situation then, I would have believed I was dealing with an artificial intelligence and would not have reacted well to seeing it abused. These kids had absolutely no reason to distrust the adults in question, and the state of technology now is far more advanced than it was in the 80s.
 
2012-05-28 11:56:01 PM
Sylvia: And if that's the case, then the study is garbage anyway, isn't it?

Hey, I'm not here to defend the study. The whole thing sounds pretty sketchy to me, but I'm not going to sit down and study their methodology, yadda yadda.


I don't care what they did. If the kids believed it was alive, then they're stupid. Period.

Well, then, I'm afraid that by your standards, yes, probably a majority of people are stupid. With a couple of weeks to prepare, I'm sure I could convince most 15-year-olds that water is easily flammable. It's easier than stage magic.


And if that's not alarming, then a lot of other people are stupid, too. I can actually find that much more believable. Incredibly depressing, but believable.

If this is your reaction, I'm surprised you're only now reaching this conclusion. I think I generally agree with you about the intelligence thing, but I don't find it that depressing. We are a gullible species. Maybe I've just had longer to get used to that idea. Or maybe I'm fooling myself.


This is a thinking error, to suppose that thinking machines might exist and you might not have heard about it.

I think it's at least an equal thinking error to think that thinking machines can't exist, or that, if they had been invented, you would necessarily have heard about it. No? If the researchers were convincing these kids that, I don't know, dinosaurs were robots, then OK. That would be harder to swallow. I hope.
 
2012-05-28 11:56:33 PM

Kome: Sylvia_Bandersnatch: I'm not interested in what you've said, if any part of it is an attempt to explain away the horrifying implications of the findings reported in TFA. Save it for someone who's impressed with a bibliography. (I'm from a family of scientists, so don't waste your time.) If people are really this stupid, then we're doomed as a species. Period. You can make all the hay of it you want, but if any large proportion of 'normal' humans can believe incredibly stupid shiat, we've got no hope.

THE KIDS BELIEVED THE ROBOT IS ALIVE. That's the only relevant takeaway from this. You tell me I'm mistaken about that, that I misread or misunderstood TFA, and we'll talk. Short of that, there's no way to explain away the sheer idiocy of these kids thinking that, or any typical adult thinking that's okay.

That is not even close to the takeaway of this. There are several takeaways. The relevant message of this research from a psychological perspective is that our technological advances have reached a point where we are interacting with our creations in a way that is eerily like the way we treat stuffed animals, pets, and even other sentient beings. We are reacting to them in ways similar to how we react to characters in novels, movies, and television shows, and as a result we can use these as another tool to help us investigate our own humanity, our own neurological functioning. By itself, that's impressive as hell. The relevant message from a computer science and engineering perspective is that our technological advances are getting to be sophisticated enough that we are starting to "trick" (using the term very loosely) typical human brains into interacting with machines in socially dynamic, subjective ways, as opposed to merely as objects. By itself, that's fan-f*cking-tastic. Philosophically, the takeaway is that our definitions of "life," "sentience," "consciousness," "emotions," and a host of other related terms are being cast into serious doubt. Which, by itself, isn't that big a surprise since ...


I think you make an intelligent, well thought out, and overall good point. Welcome to my favorites. Now to figure out a color....
 
2012-05-29 12:01:25 AM

Kome: The children did not believe the robot was alive. They merely acted as though it were "alive" because they believed it demonstrated intelligence and feelings.


You know what's funny? My original draft included this parenthetical statement: "If you get pedantic with me about that, I will track you down and pound so much pumpkin pie up your ass that you'll relive every Thanksgiving you've had in reverse." But I took it out. I decided to give you the benefit of the doubt. And I knew it was a mistake, but I really do try to be fair most of the time, however much of a biatch I can be. It's okay that you disappointed me. But it's much more disappointing that you didn't surprise me.

Anyway, FTFA:

'Overall, 80 percent of the participants felt that Robovie was intelligent, and 60 percent thought that Robovie had feelings. ....

'Things get even more interesting when you break down the results by age. For example, while 93 percent and 67 percent of 9 year olds said that they believed Robovie to be intelligent and to have feelings, respectively, those percentages drop to 70 percent and just 43 percent when you ask 15 year olds the same thing.

Now, tell me that "felt," "thought," and "believe" in TFA don't have the meanings of "accepted as true," and we can have a conversation about this. Anything else is kids believing that a robot is -- for all intents and purposes -- alive. And that's patently stupid. If TFA said they thought it ACTED intelligent, and ACTED like it had feelings, but didn't believe it ACTUALLY thinks or feels, that's an entirely different thing. And if you make the case that that's what TFA really says, and it's just poorly written, or TFA misstates or misrepresents the study, then that's also very workable. I'm just looking at the language of TFA. Tell me I'm reading it wrong. Please. I desperately want to be wrong about this.
 
2012-05-29 12:02:05 AM

Sylvia_Bandersnatch: This is a thinking error, to suppose that thinking machines might exist and you might not have heard about it. It is NOT rational to accept that premise, in oneself or anyone else. Anyone capable of thinking should be able to figure out that they're not engaging with a thinking machine.


So you are currently up to date on all the current advances in artificial intelligence and social signal processing? If so, please explain your incredibly inaccurate statement that follows:

You don't have to read every pop-sci blog and journal to know that we don't have thinking machines.

Except, we do. They may not be as sophisticated as actual human beings, or able to perform as human beings do across a vast diversity of environments and contexts, but we do have thinking machines. Some programs that have been developed can predict, with high accuracy (>70%), the results of social interactions as dynamic as job interviews or even speed-dating conversations. You would know this if you were, in fact, up to date on research on artificial intelligence and social signal processing. Hint: I referenced the study that discusses all of this earlier in the thread. You might want to give that article a read before you post your critiques of the results and conclusions of this kind of research and pass wholly inappropriate judgement on the cognitive capabilities of the children and adolescents who were participants in it. It takes 20-30 minutes, even if you're not familiar with that line of research.
 
2012-05-29 12:02:47 AM

Sylvia_Bandersnatch: RandomAxe: Sylvia_Bandersnatch: I read TFA over several times. It was short on a lot of details, but not on the part about 43% of 15-year-olds agreeing -- when asked directly -- that a robot IS 'intelligent' (sentient) and DOES have feelings. That's pure stupid. There's no other way to interpret that.

Well, how many 15-year-olds actually know the current state of robotics? Basically, the researchers faked a sentient AI, cheating on a Turing test, and then they asked the kids if they felt empathy for an AI the kids didn't know wasn't real.

Besides, you have to figure that at least a third of those kids lied, intentionally or not, when they answered those questions. Kids are notoriously poor self-reporters.

I don't "have to figure" anything at all. And if that's the case, then the study is garbage anyway, isn't it?

I don't care what they did. If the kids believed it was alive, then they're stupid. Period. And if that's not alarming, then a lot of other people are stupid, too. I can actually find that much more believable. Incredibly depressing, but believable.

I don't accept ignorance as an excuse, either. We're all ignorant about most things. This is a thinking error, to suppose that thinking machines might exist and you might not have heard about it. It is NOT rational to accept that premise, in oneself or anyone else. Anyone capable of thinking should be able to figure out that they're not engaging with a thinking machine. I don't care how damn clever it is, or how ill-informed the subjects are. Think about the logical components of that: What are the odds that YOU are one of the first persons to talk to a real AI? Unless you're one of the scientists yourself, the odds are extremely close to zero. You don't have to read every pop-sci blog and journal to know that we don't have thinking machines. A rational person knows intuitively that that would be gigantic global news, that if it's even slightly public that everyone would have to know about it. The kind of ...


Someone who can approach a topic from a rational and logical point of view would care, and would not dismiss things they don't feel like reasoning through by calling them stupid. Just sayin'
 
2012-05-29 12:08:05 AM

RandomAxe: This is a thinking error, to suppose that thinking machines might exist and you might not have heard about it.

I think it's at least an equal thinking error to think that thinking machines can't exist, or that they might have been invented but that you can't have not heard about it. No? If the researchers were convincing these kids that, I don't know, dinosaurs were robots, then OK. That would be harder to swallow. I hope.


I never said I reject the possibility of real AI. I explicitly said the opposite. But I said that it doesn't exist NOW, and if it did, we'd all know about it. There's no way we wouldn't. So anyone who meets a robot RIGHT NOW knows intuitively that it's not an AI, for that reason alone -- you don't need to know how to run a Turing test or anything, you only need to be aware that you haven't already heard about it, and to have some reasonable connection with reality. How isolated or egotistical would someone have to be to suppose otherwise? Real life is not like Johnny Quest, and I like to believe that even stupid kids know that. At least I hope so.

You and others seem to just be making the case that 'normal' kids are stupid -- which is exactly what I'm saying. If you readily accept stupid things, then you're stupid, right?
 
2012-05-29 12:10:18 AM

RandomAxe: First, believing that apparently inanimate objects are or might be sentient is not particularly crazy.


It is in 2012. This is not a SyFy special, it's real life. These are real kids, who are not toddlers, not impaired, and not insane. There's no rational reason any thinking person should suppose they've met an AI this year.
 
Displayed 50 of 120 comments
