
(Science Magazine)   Many of the big advances in artificial intelligence are completely artificial   (sciencemag.org)
    More: Interesting, Artificial intelligence, Better, Computer science, Artificial neural network, Algorithm, Neural network, Machine learning, Carnegie Mellon University  

516 clicks; posted to Geek on 02 Jun 2020 at 12:17 PM (17 weeks ago)



20 Comments
 
 
2020-06-02 11:26:20 AM  
[image: Fark user image]
 
2020-06-02 12:27:23 PM  
[image: images-cdn.9gag.com]
 
2020-06-02 12:28:46 PM  
But what if that's what Skynet wants you to think?
 
2020-06-02 12:30:49 PM  
[image: i.redd.it]
 
2020-06-02 12:35:47 PM  
Yo, dog
 
2020-06-02 12:36:44 PM  
That's OK; who could have seen this coming? Especially in all of those autonomous driving threads?
 
2020-06-02 12:48:23 PM  
Yeah, it drives me nuts how much the term "artificial intelligence" is thrown around.

"AI" does NOT exist! What we have now is really good statistical analysis of enormous datasets. That's not the same thing, and not nearly as useful.
 
2020-06-02 1:35:47 PM  
Marketing. (aka fake news)
 
2020-06-02 2:20:45 PM  
Simple machine learning algorithms are fascinating and look like AI, but under the hood it's all just an automated version of the brute-force trial and error method of 'tweak, try again, tweak again, repeat.'

You just do all that a bunch with fake data and cross your fingers that it keeps working on the real data.
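For the curious, a minimal Python sketch of that tweak-and-retry loop: random hill climbing that fits a line to made-up "fake" data and then checks whether the fit still holds on held-out data. Every name and number below is invented for illustration, and real systems use gradients rather than blind tweaking, but the shape of the loop is the same.

import random

random.seed(0)

# "Fake" data to train on, and held-out data to check against afterwards.
train = [(x, 2.0 * x + 1.0 + random.gauss(0, 0.1)) for x in range(20)]
test = [(x, 2.0 * x + 1.0 + random.gauss(0, 0.1)) for x in range(20, 30)]

def error(w, b, data):
    # mean squared error of the line y = w*x + b over a dataset
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

w, b = 0.0, 0.0
best = error(w, b, train)
for _ in range(5000):
    w2 = w + random.gauss(0, 0.05)      # tweak...
    b2 = b + random.gauss(0, 0.05)
    e = error(w2, b2, train)            # ...try again...
    if e < best:                        # keep the tweak only if it helped
        w, b, best = w2, b2, e          # ...tweak again, repeat.

print("train error:", round(best, 4))
print("test error: ", round(error(w, b, test), 4))   # fingers crossed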
 
2020-06-02 2:55:48 PM  
With the way AI was being reported on, I thought AI would solve the COVID-19 problem quickly.
 
2020-06-02 3:08:54 PM  

realmolo: Yeah, it drives me nuts how much the term "artificial intelligence" is thrown around.

"AI" does NOT exist! What we have now is really good statistical analysis of enormous datasets. That's not the same thing, and not nearly as useful.


PirateKing: Simple machine learning algorithms are fascinating and look like AI, but under the hood it's all just an automated version of the brute-force trial and error method of 'tweak, try again, tweak again, repeat.'

You just do all that a bunch with fake data and cross your fingers that it keeps working on the real data.


It's not all a huge web of "IF"s.  AFAIK the best AI today is training and digging out "good" matches / outcomes from an amorphous set of probability logic.  Much more fuzzy (ugh) than monstrous binary conditionals.

Another thought: one difficulty for these researchers in analyzing the state of the art from publications is that some of the most successful "AI" isn't published.  It falls under "trade secrets".
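To put the "probability logic vs. monstrous binary conditionals" point in code: a hard IF lives or dies on one threshold, while a trained model combines weighted evidence into a graded probability and picks the best-scoring outcome. This is a rough sketch on an invented spam-scoring task; the feature names and weights are made up, not taken from any real model.

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Brittle binary conditional: one threshold decides everything.
def spam_rule(num_links, all_caps_words):
    return num_links > 3

# "Fuzzy" version: weights (invented here, normally learned from data)
# produce a graded score instead of a yes/no cliff.
weights = {"num_links": 0.8, "all_caps_words": 0.5, "bias": -2.0}

def spam_probability(num_links, all_caps_words):
    z = (weights["num_links"] * num_links
         + weights["all_caps_words"] * all_caps_words
         + weights["bias"])
    return sigmoid(z)

print(spam_rule(3, 10))            # False: the IF ignores the shouting
print(spam_probability(3, 10))     # ~0.99: graded evidence still flags it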
 
2020-06-02 3:10:17 PM  

BeotchPudding: with the way AI was being reported on, I thought AI would solve the COVID-19 problem quickly.


Solve it, or cause it? COVID doesn't hurt AI.

[image: Fark user image]
 
2020-06-02 3:40:13 PM  

SansNeural: realmolo: Yeah, it drives me nuts how much the term "artificial intelligence" is thrown around.

"AI" does NOT exist! What we have now is really good statistical analysis of enormous datasets. That's not the same thing, and not nearly as useful.

PirateKing: Simple machine learning algorithms are fascinating and look like AI, but under the hood it's all just an automated version of the brute-force trial and error method of 'tweak, try again, tweak again, repeat.'

You just do all that a bunch with fake data and cross your fingers that it keeps working on the real data.

It's not all a huge web of "IF"s.  AFAIK the best AI today is training and digging out "good" matches / outcomes from an amorphous set of probability logic.  Much more fuzzy (ugh) than monstrous binary conditionals.

Another thought: one difficulty for these researchers in analyzing the state of the art from publications is that some of the most successful "AI" isn't published.  It falls under "trade secrets".


I agree. But the simple ones are the most interesting to me. Convolutional neural nets and backpropagation are basically just iterated matrix math. There aren't enough 'ifs' available to do what they can do with just a few layers of 'neurons' aka matrices and feedback training.
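To make the "iterated matrix math" point concrete, here is a minimal sketch of a tiny fully connected network (dense rather than convolutional, but the same idea) learning XOR with nothing but matrix multiplies, a squashing function, and feedback. The layer sizes, learning rate, and iteration count are arbitrary choices for illustration, and it should converge on most random initializations.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1 = rng.normal(0, 1, (2, 4))   # the "neurons" really are just matrices
b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    # forward pass: multiply, squash, multiply, squash
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # backward pass (the "feedback training"): also just matrix math
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out
    b2 -= 0.5 * d_out.sum(axis=0, keepdims=True)
    W1 -= 0.5 * X.T @ d_h
    b1 -= 0.5 * d_h.sum(axis=0, keepdims=True)

print(out.round(2))   # typically close to [[0], [1], [1], [0]] after training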
 
2020-06-02 5:05:04 PM  

PirateKing: Simple machine learning algorithms are fascinating and look like AI, but under the hood it's all just an automated version of the brute-force trial and error method of 'tweak, try again, tweak again, repeat.'

You just do all that a bunch with fake data and cross your fingers that it keeps working on the real data.


And computers are just all 1s and 0s.

It's just really fast and large scale arithmetic.
 
2020-06-02 5:15:26 PM  
 
2020-06-02 7:10:47 PM  

mr0x: PirateKing: Simple machine learning algorithms are fascinating and look like AI, but under the hood it's all just an automated version of the brute-force trial and error method of 'tweak, try again, tweak again, repeat.'

You just do all that a bunch with fake data and cross your fingers that it keeps working on the real data.

And computers are just all 1s and 0s.

It's just really fast and large scale arithmetic.


It's a bit more than that, but not a lot. It really depends on what level of abstraction you're thinking about. At the very basic level it is semiconductors building and releasing electric fields and pulling electrons across depletion zones to make logic gates that allow current to flow based on increasingly complex criteria, some of which are math.

At the hardware level it's math encoded in physical signals, and other signals to control the math.

And then you start getting into the information theory parts of it...

Things like a little physics simulation that learns to fly a boid around corners to a target, or, in more impressive cases, learns to recognize images or sounds, really are just doing a crapload of math and plugging the results back into the inputs over and over.

I like to think it's impure crystals throbbing with one of the deep forces of the universe in arcane patterns, dancing to the wills of wizards and sorcerers.
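On the "1s and 0s doing arithmetic" view, the basic logic gates themselves can be written as arithmetic on 0/1 values, which is roughly what those voltage levels end up encoding. A conceptual sketch only, not how any particular chip actually builds its gates:

# Logic gates expressed as arithmetic on 0/1 values.
def AND(a, b):
    return a * b            # 1 only when both inputs are 1

def OR(a, b):
    return a + b - a * b    # inclusive or, as arithmetic

def NOT(a):
    return 1 - a

def XOR(a, b):
    return (a + b) % 2      # addition modulo 2

# A half adder: the first rung of "really fast, large-scale arithmetic".
def half_adder(a, b):
    return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))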
 
2020-06-03 6:35:37 AM  

SansNeural: Much more fuzzy (ugh) than monstrous binary conditionals.


Monstrous Binary Conditionals is my Industrial Devo/Righteous Brothers fusion band.
 
2020-06-03 6:37:16 AM  

PirateKing: mr0x: PirateKing: Simple machine learning algorithms are fascinating and look like AI, but under the hood it's all just an automated version of the brute-force trial and error method of 'tweak, try again, tweak again, repeat.'

You just do all that a bunch with fake data and cross your fingers that it keeps working on the real data.

And computers are just all 1s and 0s.

It's just really fast and large scale arithmetic.

It's a bit more than that, but not a lot. It really depends on what level of abstraction you're thinking about. At the very basic level it is semiconductors building and releasing electric fields and pulling electrons across depletion zones to make logic gates that allow current to flow based on increasingly complex criteria, some of which are math.

At the hardware level it's math encoded in physical signals, and other signals to control the math.

And then you start getting into the information theory parts of it...

Things like a little physics simulation that learn to fly a boid around corners to a target, or in more impressive cases learn to recognize images or sounds really are just doing a crapload of math, and plugging the results back into the inputs over and over.

I like to think it's impure crystals throbbing with one of the deep forces of the universe in arcane patterns, dancing to the wills of wizards and sorcerers.


No, it's literally all math.

In fact, at some very fundamental level, everything is math.
 
2020-06-03 3:23:52 PM  

dittybopper: PirateKing: mr0x: PirateKing: Simple machine learning algorithms are fascinating and look like AI, but under the hood it's all just an automated version of the brute-force trial and error method of 'tweak, try again, tweak again, repeat.'

You just do all that a bunch with fake data and cross your fingers that it keeps working on the real data.

And computers are just all 1s and 0s.

It's just really fast and large scale arithmetic.

It's a bit more than that, but not a lot. It really depends on what level of abstraction you're thinking about. At the very basic level it is semiconductors building and releasing electric fields and pulling electrons across depletion zones to make logic gates that allow current to flow based on increasingly complex criteria, some of which are math.

At the hardware level it's math encoded in physical signals, and other signals to control the math.

And then you start getting into the information theory parts of it...

Things like a little physics simulation that learn to fly a boid around corners to a target, or in more impressive cases learn to recognize images or sounds really are just doing a crapload of math, and plugging the results back into the inputs over and over.

I like to think it's impure crystals throbbing with one of the deep forces of the universe in arcane patterns, dancing to the wills of wizards and sorcerers.

No, it's literally all math.

In fact, at some very fundamental level, everything is math.


Math is a language used to describe reality. But math itself is not 'reality'. It's a map, not the territory. You can use math to describe things that aren't real, just like you can draw things on a map that aren't there.

The relationships may be real, but the math is just the description.
 
2020-06-03 3:25:40 PM  

dittybopper: PirateKing: mr0x: PirateKing: Simple machine learning algorithms are fascinating and look like AI, but under the hood it's all just an automated version of the brute-force trial and error method of 'tweak, try again, tweak again, repeat.'

You just do all that a bunch with fake data and cross your fingers that it keeps working on the real data.

And computers are just all 1s and 0s.

It's just really fast and large scale arithmetic.

It's a bit more than that, but not a lot. It really depends on what level of abstraction you're thinking about. At the very basic level it is semiconductors building and releasing electric fields and pulling electrons across depletion zones to make logic gates that allow current to flow based on increasingly complex criteria, some of which are math.

At the hardware level it's math encoded in physical signals, and other signals to control the math.

And then you start getting into the information theory parts of it...

Things like a little physics simulation that learn to fly a boid around corners to a target, or in more impressive cases learn to recognize images or sounds really are just doing a crapload of math, and plugging the results back into the inputs over and over.

I like to think it's impure crystals throbbing with one of the deep forces of the universe in arcane patterns, dancing to the wills of wizards and sorcerers.

No, it's literally all math.

In fact, at some very fundamental level, everything is math.


Also, I was referring in many cases to propagated signals in the actual hardware like 'write enable' and 'read enable', the clock signals themselves, etc. There's some 'math' behind those signals, but the control signals themselves aren't part of the arithmetic, just the timing.
 
Displayed 20 of 20 comments


This thread is closed to new comments.
