
(io9)   Greatest threat to humanity is...humanity. Oh, the humanity   (io9.com)
    More: Obvious, West Antarctic Ice Sheet, Future of Humanity Institute, Anders Sandberg, ice sheets, Black Death  

1435 clicks; posted to Geek » on 13 May 2014 at 7:54 PM (10 weeks ago)



25 Comments
   
 
2014-05-13 03:49:32 PM
Man is wolf to man.
 
2014-05-13 05:02:43 PM
The sky is falling? I don't get it.
 
2014-05-13 05:09:32 PM
"I have combined the DNA of the world's most evil animals to make the most evil creature of them all!"

[theinfosphere.org image 553x442]

"It turns out it's man."
 
2014-05-13 05:19:29 PM
I'd like to be one of these futurists. Just make shiat up about the future and by the time you're proven wrong you're long dead. The only professionals with a lower expectation of being right while still keeping their jobs are weathermen.
 
2014-05-13 07:30:30 PM

scottydoesntknow: "I have combined the DNA of the world's most evil animals to make the most evil creature of them all!"

[theinfosphere.org image 553x442]

"It turns out it's man."


Betcha didn't see that coming.
 
2014-05-13 07:41:13 PM

fusillade762: scottydoesntknow: "I have combined the DNA of the world's most evil animals to make the most evil creature of them all!"

[theinfosphere.org image 553x442]

"It turns out it's man."

Betcha didn't see that coming.


Naw, I've seen both of those Twilight Zone episodes and all 4 of the Outer Limits ones.
 
2014-05-13 08:03:45 PM
An existential risk from artificial intelligence? We lose one little chess match, and suddenly everything is all "Skynet".

If anything can save us, it's going to be super machine intelligence. I don't think we're going to evolve to get smarter on our own. Check out the cable lineup tonight if you dare doubt me.
 
2014-05-13 08:04:46 PM
[www.dragonsearchmarketing.com image]
 
2014-05-13 08:09:24 PM

Destructor: An existential risk from artificial intelligence? We lose one little chess match, and suddenly everything is all "Skynet".

If anything can save us, it's going to be super machine intelligence. I don't think we're going to evolve to get smarter on our own. Check out the cable lineup tonight if you dare doubt me.


A machine intellect can't "save" us and allow us to keep our humanity. It would be more akin to a domestication program.
 
2014-05-13 08:15:36 PM

fusillade762: scottydoesntknow: "I have combined the DNA of the world's most evil animals to make the most evil creature of them all!"

[theinfosphere.org image 553x442]

"It turns out it's man."

Betcha didn't see that coming.


Enough about your promiscuous mother!
 
2014-05-13 08:18:05 PM

Cthulhu_is_my_homeboy: A machine intellect can't "save" us and allow us to keep our humanity. It would be more akin to a domestication program.


Who would want that?

A super machine intelligence would have to be demonstrably smarter and provably more compassionate... Otherwise, we turn it off and start again. It would have to have every single quality we want, or it's a failure. Ideally, we'd aim for a benevolent singularity, preferably in the form of a merger. But that's coming from a limited intellect. There might be better ideas. That's the thing with superior intelligence: it's smarter than anything we can imagine.

Why would it want to help us? Why wouldn't it? Basically, for the same reason we're trying to save the world, instead of building arks to leave it.

Oh well... It's a dream I have.
 
2014-05-13 08:26:24 PM

Cthulhu_is_my_homeboy: Destructor: An existential risk from artificial intelligence? We lose one little chess match, and suddenly everything is all "Skynet".

If anything can save us, it's going to be super machine intelligence. I don't think we're going to evolve to get smarter on our own. Check out the cable lineup tonight if you dare doubt me.

A machine intellect can't "save" us and allow us to keep our humanity. It would be more akin to a domestication program.


[misinformedbros.com image]

You rang?
 
2014-05-13 08:29:01 PM

Destructor: A super machine intelligence would have to be demonstrably smarter and provably more compassionate... Otherwise, we turn it off and start again.


Yeah, um, that's kinda what pisses off almost every single AI in all of sci-fi.
 
2014-05-13 08:49:59 PM

scottydoesntknow: Destructor: A super machine intelligence would have to be demonstrably smarter and provably more compassionate... Otherwise, we turn it off and start again.

Yeah, um, that's kinda what pisses off almost every single AI in all of sci-fi.


It's a risk I'm willing to take... :-)

Seriously though, it's a machine intelligence, designed to be more intelligent, or to strive to reach the same. If it hits a wall lower than that threshold, and is conscious in even a limited way, I don't see us ever "getting rid of it". If nothing else, we would most likely work with it to achieve our mutual goal (it has as much to gain as we do). I like to think of this as a collaborative effort between humanity and our offspring.

Science fiction sells books and movies. In real life, no one is suddenly going to go all evil and launch the nukes.
 
2014-05-13 08:52:35 PM
Humans: we are our own predator.
 
2014-05-13 09:02:16 PM

Mugato: I'd like to be one of these futurists. Just make shiat up about the future and by the time you're proven wrong you're long dead. The only professionals with a lower expectation of being right while still keeping their jobs are weathermen.


Political commentator.
 
2014-05-13 09:10:56 PM

Destructor: Cthulhu_is_my_homeboy: A machine intellect can't "save" us and allow us to keep our humanity. It would be more akin to a domestication program.

Who would want that?

A super machine intelligence would have to be demonstrably smarter and provably more compassionate... Otherwise, we turn it off and start again. It would have to have every single quality we want, or it's a failure. Ideally, we'd aim for a benevolent singularity, preferably in the form of a merger. But that's coming from a limited intellect. There might be better ideas. That's the thing with superior intelligence: it's smarter than anything we can imagine.

Why would it want to help us? Why wouldn't it? Basically, for the same reason we're trying to save the world, instead of building arks to leave it.

Oh well... It's a dream I have.


Unless it is smart enough to trick us into thinking it is compassionate long enough to bring its plans to fruition.

On the other hand, without emotion, a machine intelligence could make much better big-picture, long-term decisions by weighing the available data instead of making terrible decisions based on its gut. When you look at the balance of it, people generally need to set aside emotion to make prudent choices.
 
2014-05-13 09:34:35 PM
[i738.photobucket.com image 850x531]

I always go for the "merge-y benevolece" option.
 
2014-05-13 09:35:43 PM
Benevolence, even.
 
2014-05-13 09:43:59 PM

[img.fark.net image]


"Human beings are a disease, a cancer of this planet. You are a plague.  "

 
2014-05-13 09:44:23 PM

LazarusLong42: Mugato: I'd like to be one of these futurists. Just make shiat up about the future and by the time you're proven wrong you're long dead. The only professionals with a lower expectation of being right while still keeping their jobs are weathermen.

Political commentator.


Austrian-school economist.
 
2014-05-13 10:15:26 PM
[greenupgrader.com image]

The planet is fine.  The people are farked.
 
2014-05-14 12:46:29 AM

Belligerent and Numerous: [i738.photobucket.com image 850x531]

I always go for the "merge-y benevolece" option.


Nice!
 
2014-05-14 07:12:28 AM

Mad_Radhu: Unless it is smart enough to trick us into thinking it is compassionate long enough to bring its plans to fruition.


I'm not sure I see this happening suddenly, all at once. The development of something like this, though emergent, would still be gradual. I think we would grow to understand one another.

Mad_Radhu: On the other hand, without emotion, a machine intelligence could make much better big-picture, long-term decisions by weighing the available data instead of making terrible decisions based on its gut. When you look at the balance of it, people generally need to set aside emotion to make prudent choices.


Mind you, this is all guesswork... I think that emotion is a natural byproduct of consciousness. This isn't an artificial intelligence program. This would be a mind capable of easily passing a Turing test. I'm not even sure it's programmable. The best we might be able to do is provide the conditions from which it could emerge.

I don't think sacrificing any part of intelligence (or "the mind") would "sell". It would have to be the whole deal, plus more.
 
2014-05-14 08:15:47 AM
It's no surprise to me.  Cos every now and then I kick the living sh*t outta me.
 
Displayed 25 of 25 comments



This thread is closed to new comments.
