
(Fox News)   Moore's law may soon be repealed   (foxnews.com)
    More: Obvious, Moore's Law, Intel

5811 clicks; posted to Geek » on 06 Dec 2013 at 4:26 PM



37 Comments
 
2013-12-06 04:15:04 PM
Is this a repeat from 18 months ago? Or the 18 months before that, etc?
 
2013-12-06 04:30:28 PM

GRCooper: Is this a repeat from 18 months ago? Or the 18 months before that, etc?


From 18 months in the future.
 
2013-12-06 04:32:22 PM

i.imgur.com

Good!

 
2013-12-06 04:34:01 PM
Oh, thank goodness! I can marry my computer now! Then get divorced later when I replace her with a newer model.

That's not what the law says? Oh well...
 
2013-12-06 04:38:43 PM
Full blogpost from the WSJ here

http://blogs.wsj.com/digits/2013/11/21/intel-explains-rare-moores-law-stumble/

They seem to be aiming for 10 nm chip dies. Dang, I thought my Sandy Bridge CPU was small with a 32 nm die size.
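As a rough back-of-the-envelope sketch of what that shrink buys you (assuming ideal 2D density scales with the inverse square of the feature size, which ignores real design-rule details):

```python
# Rough sketch: ideal 2D transistor density scales as 1/(feature size)^2.
# Real processes deviate from this, so treat the numbers as illustrative only.

def density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal density multiplier when shrinking from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

# Shrinking from a 32 nm process (Sandy Bridge) to 10 nm:
print(density_gain(32, 10))  # ~10.24x more transistors per unit area, in theory
```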
 
2013-12-06 04:39:43 PM
Also FTA:

So, despite the challenges, Holt could not be induced to say there's any looming end to Moore's Law, the invention race that has been a key driver of electronics innovation since first defined by Intel's co-founder in the mid-1960s.
 
2013-12-06 04:44:13 PM
Doomsdayers have been screaming about the looming end of Moore's Law since I started high school, so nothing new here.


It is interesting how much collective work the industry does to innovate and optimize to try to perpetuate Moore's Law, trying to squeeze every dime out of a booming industry.
 
2013-12-06 04:47:58 PM
Well, it's not so much a Law as a rule of thumb.

Eventually I expect it will fall prey to common sense. Consumers will stop buying more powerful machines every 18 months because they will realise that their eyesight is no longer good enough to see extra pixels or colors on screens, to notice the screen flutter on animation, or to detect any increase in speed, even if it is astronomical.

I mean, once you can converse normally over the Internet with sound and video that are barely distinguishable from being there, why would you need double the power or speed each 18 months?

Naturally, the people selling us new crap every 18 months won't be happy if we start keeping our computers for years, as we have done our cars. At one time, you had to have the latest model every year if you were even moderately rich or well-to-do. Now cars last longer and don't look all that different from one year to the next. The market and the psychology have changed.

Moore's Law is not so much a law of physics (that will run up against the laws of physics and fail) as a law of commerce, which is to say, a habit of individuals and masses of people.

Moore's Law has survived several changes of architecture and software and hardware, but it won't survive the limits of the human customers unless robots and androids start buying computers for themselves.

Personally I am sick of changing formats. My VHS tapes are still watchable, my DVDs and CDs and Blu-Rays are just fine. I simply have no room to stack them higher. And as for Netflix, they haven't got anything that Blockbuster or Pay-per-View didn't offer years ago. I have a better selection of the types of movies and TV shows I like to watch than they do. I have nearly as many, perhaps as many shows as they do (in Canada--the US Netflix has three times as many).

Moore's Law has been a remarkable survivor, but like Abe Vigoda, it is not immortal and will eventually die for real. I don't expect it in the next 18 months, though. And I wish Mr. Vigoda well, even if he is totally retired and possibly gaga for all I know.
 
2013-12-06 04:49:10 PM

noobiemcfoob: Doomsdayers have been screaming about the looming end of Moore's Law since I started high school, so nothing new here.


It is interesting how much collective work the industry does to innovate and optimize to try to perpetuate Moore's Law, trying to squeeze every dime out of a booming industry.


It's in the industry's interest to make you believe that the current lineup isn't doomed to obsolescence, like the last computer they sold you.
 
2013-12-06 04:49:40 PM
Today, Moore's Law; tomorrow, ObamaCare!
 
2013-12-06 05:10:15 PM

brantgoose: I mean, once you can converse normally over the Internet with sound and video that are barely distinguishable from being there, why would you need double the power or speed each 18 months?


I totally sympathize with that, but in my experience as long as capacity increases, users find a way to consume it.

Madden sports games, for example -- once you've got the gameplay nailed on a 32bit system, who needs more?  Oh, hey, 3D! We totally need 3D now. Who needs more?  Well, texture maps larger than 64x64 would be nice. That's it! Who needs -- oh, dynamic lighting. Sure! Who -- OK, particle effects. And bump mapping. And blur. And customizable player models. And hair simulation. And --

iml.jou.ufl.edu

media1.gameinformer.com
 
2013-12-06 05:14:09 PM

StopLurkListen: brantgoose: I mean, once you can converse normally over the Internet with sound and video that are barely distinguishable from being there, why would you need double the power or speed each 18 months?

I totally sympathize with that, but in my experience as long as capacity increases, users find a way to consume it.

Madden sports games, for example -- once you've got the gameplay nailed on a 32bit system, who needs more?  Oh, hey, 3D! We totally need 3D now. Who needs more?  Well, texture maps larger than 64x64 would be nice. That's it! Who needs -- oh, dynamic lighting. Sure! Who -- OK, particle effects. And bump mapping. And blur. And customizable player models. And hair simulation. And --


I'm still waiting for games to include smellovision.
 
2013-12-06 05:16:48 PM

roflmaonow: They seem to be aiming for 10 nm chip dies. Dang, I thought my Sandy Bridge CPU was small with a 32 nm die size.


I remember reading, years ago, that 9nm was said to be about where they'd stop being able to build transistors without quantum effects børking everything up.

If that theory is still valid, then they'd be getting close to the physical limits of 2D transistor density. Maybe there really is trouble ahead for the industry.
 
2013-12-06 05:20:02 PM
By the way, our kids will soon think what we consider today to be "barely distinguishable from being there" to be quaint, an equivalent of what we think of black-and-white print photography, silent films and gramophones.

http://p3d.in/3X6cW/subd+spin
 
2013-12-06 05:24:29 PM

StopLurkListen: By the way, our kids will soon think what we consider today to be "barely distinguishable from being there" to be quaint, an equivalent of what we think of black-and-white print photography, silent films and gramophones.

http://p3d.in/3X6cW/subd+spin


Doesn't have fingernails! Totally unrealistic... these graphics suck!
 
2013-12-06 05:25:41 PM
Cthulhu_is_my_homeboy: If that theory is still valid, then they'd be getting close to the physical limits of 2D transistor density. Maybe there really is trouble ahead for the industry.

Hence, various researchers (heavily funded by Intel and others) have begun looking into 3D solutions, stacking dies atop each other, among other ideas (e.g. quantum computing, photonic computing) to circumvent those pesky physical laws.
 
2013-12-06 05:30:29 PM
bbsimg.ngfiles.com
 
2013-12-06 05:31:02 PM

danknerd: StopLurkListen: By the way, our kids will soon think what we consider today to be "barely distinguishable from being there" to be quaint, an equivalent of what we think of black-and-white print photography, silent films and gramophones.

http://p3d.in/3X6cW/subd+spin

Doesn't have fingernails! Totally unrealistic... these graphics suck!


I should have found a better example, there's capture software that pretty impressively creates a painted 3D model.
 https://www.youtube.com/watch?v=Oie1ZXWceqM#t=96


The amazing part will be when it's fully automatic in-camera in real-time.

You'll be able to rotate and zoom in on their ... stuff, while talking with a live person or watching... stuff. Yeah, I can think of a few million horny guys who would be first in line to pay for that kind of ...thing.
 
2013-12-06 05:37:03 PM

StopLurkListen: Madden sports games, for example -- once you've got the gameplay nailed on a 32bit system, who needs more? Oh, hey, 3D! We totally need 3D now. Who needs more? Well, texture maps larger than 64x64 would be nice. That's it! Who needs -- oh, dynamic lighting. Sure! Who -- OK, particle effects. And bump mapping. And blur. And customizable player models. And hair simulation. And --


You're going to have a hard enough time convincing people that motion-tracking headsets--the first step on the road to virtual reality--are worth that kind of investment.  Right now, it seems they only want these technologies if you can make them portable and act as a mere complement to their existing lifestyle.  Maybe the discerning video game fan can see the huge difference and potential for games between the seventh- and eighth-gen game consoles (and the more powerful hardware that will be cropping up for computers in the coming years), but the shiat-tier consumer is necessary to subsidize these games, and they can't be arsed to see the difference between a PS4 Madden and a PS3 Madden.
 
2013-12-06 05:38:14 PM

Cthulhu_is_my_homeboy: roflmaonow: They seem to be aiming for 10 nm chip dies. Dang, I thought my Sandy Bridge CPU was small with a 32 nm die size.

I remember reading, years ago, that 9nm was said to be about where they'd stop being able to build transistors without quantum effects børking everything up.

If that theory is still valid, then they'd be getting close to the physical limits of 2D transistor density. Maybe there really is trouble ahead for the industry.


The WSJ article mentioned that this was not due to problems with their 3D architecture -- so this generation of chips has apparently already passed the 2D density limit.
 
2013-12-06 05:40:54 PM
Moore's law is the observation that, over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years.

They don't have to keep making the transistors smaller, just make the physical chips larger. Problem solved.
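The doubling observation quoted above can be written as a simple exponential; here's a toy projection (illustrative only, using a two-year doubling period and the Intel 4004's transistor count as the starting point):

```python
# Toy projection of Moore's Law: count doubles every `doubling_years`.
def projected_transistors(start_count: int, start_year: int,
                          year: int, doubling_years: float = 2.0) -> float:
    """Transistor count projected forward from a known data point."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Example: ~2,300 transistors on the Intel 4004 in 1971, projected to 2013.
print(f"{projected_transistors(2300, 1971, 2013):.2e}")  # ~4.8e9, roughly in line
                                                         # with real 2013 chips
```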
 
2013-12-06 05:59:37 PM

Hand Banana: Moore's law is the observation that, over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years.

They don't have to keep making the transistors smaller, just make the physical chips larger. Problem solved.


Maybe you should be the CEO of Intel. I think you're onto something. Computers can come full circle, back to filling up your entire room. Yeah, I just got this new all-copper HSF; they're bringing it by helo. Sucker weighs 3 tons... only cost me $1.5M.
 
2013-12-06 06:20:32 PM
Intel is still a full generation ahead of the rest of the industry in process size, right?
 
2013-12-06 06:54:40 PM
No - Intel is scared of Qualcomm!
     They are now frantically playing catch-up!
The future is mobile.....
 
2013-12-06 07:29:23 PM
Moore's Law on predicting the end of Moore's Law: the number of news articles predicting the end of Moore's Law will double approximately every two years.
 
2013-12-06 08:30:50 PM
That's because it's an empirical generalization about economics.  Nothing to see here, move along.
 
2013-12-06 08:43:22 PM
I have my own solution to Moore's Law... More machines.

I realize this doesn't help with playing Crysis, but I do occasionally have some big crunching jobs such as converting wedding videos and 3D rendering. I just send them over to some headless machine and let it crunch away while I surf Fark on my main box.

Of course, my power bill gets correspondingly larger....

danknerd:

Hand Banana: Moore's law is the observation that, over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years.

They don't have to keep making the transistors smaller, just make the physical chips larger. Problem solved.

Maybe you should be the CEO of Intel. I think you're onto something. Computers can come full circle, back to filling up your entire room. Yeah, I just got this new all-copper HSF; they're bringing it by helo. Sucker weighs 3 tons... only cost me $1.5M.


Well, he does sort of have a point, except it's more a "break it into more cores" thing rather than "run one absurdly dense core faster." Which is in fact what they're doing. We've edged up on the limits of how fast you can run any one core, so you make multiple cores on one die.

Unfortunately programming hasn't progressed as fast as it could there, and of course the bigger the die, the more chance you'll have a manufacturing flaw. There have been a number of chips made that started out as 4 or 8 cores, but testing reveals they have dead cores, so they seal off the dead ones through firmware and sell them as 2 or 4 core chips.
 
2013-12-06 09:54:46 PM
Surprising, that FAUX news would not be fully informed about the modern world

*rubbing chin*
 
2013-12-06 11:51:40 PM

brantgoose: Moore's Law is not so much a law of physics (that will run up against the laws of physics and fail) as a law of commerce, which is to say, a habit of individuals and masses of people.


CSB: Moore's law has always been a law of economics, not physics.
 
2013-12-07 01:01:40 AM

StopLurkListen: brantgoose: I mean, once you can converse normally over the Internet with sound and video that are barely distinguishable from being there, why would you need double the power or speed each 18 months?

I totally sympathize with that, but in my experience as long as capacity increases, users find a way to consume it.

Madden sports games, for example -- once you've got the gameplay nailed on a 32bit system, who needs more?  Oh, hey, 3D! We totally need 3D now. Who needs more?  Well, texture maps larger than 64x64 would be nice. That's it! Who needs -- oh, dynamic lighting. Sure! Who -- OK, particle effects. And bump mapping. And blur. And customizable player models. And hair simulation. And --

[iml.jou.ufl.edu image 320x224]

[media1.gameinformer.com image 850x478]


-- and there's still not enough processing power to realistically model certain things, like waves. Even if we had 1000 times more, they'd still find a way to use it.
 
2013-12-07 02:18:32 AM

Cthulhu_is_my_homeboy: roflmaonow: They seem to be aiming for 10 nm chip dies. Dang, I thought my Sandy Bridge CPU was small with a 32 nm die size.

I remember reading, years ago, that 9nm was said to be about where they'd stop being able to build transistors without quantum effects børking everything up.

If that theory is still valid, then they'd be getting close to the physical limits of 2D transistor density. Maybe there really is trouble ahead for the industry.


Meh, not really:
1) Improved chip design would still yield some minor gains.
2) Mature industries have much cheaper products; if they really cannot shrink the chips any smaller, then in theory prices would continue to drop on the last generation of chips. Once the fabs are paid off, the chips get even cheaper to make.
3) The money once spent shrinking to the next node can go into research on whole new directions: quantum, biological, etc.
 
2013-12-07 03:10:08 AM

vonmatrices:


Favorited.
 
2013-12-07 06:04:48 AM
Kind of a moot point when they purposely stagger hardware releases to whatever timeline yields the most profitability... it's not like they actually push the tech out to consumers as fast as they can.
 
2013-12-07 08:59:11 AM
For hard-core computing this of course becomes a concern; however, with the advent of multi-core, GPU computing, parallel programming, etc., there are some workarounds to continue increasing net computing power.

For the casual home user/PC owner, since when has raw "cpu" power been the bottleneck?  Most people would (or will) be shocked just how much faster their machine is when it gets an SSD upgrade, for example.
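The catch with those parallel workarounds is captured by Amdahl's law: the speedup from N cores is capped by whatever fraction of the job stays serial. A quick sketch with illustrative numbers:

```python
def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
    """Overall speedup when only part of a job benefits from extra cores
    (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Even with 90% of the work parallelizable, 16 cores give well under 16x:
print(round(amdahl_speedup(0.9, 16), 2))  # 6.4
```

Which is also why, for the casual user, fixing the actual bottleneck (like swapping in an SSD) often feels like a bigger win than more cores.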
 
2013-12-07 09:06:43 AM
Even if Moore's Law fails to hold, so what? You could double the number of transistors every 10 years instead of every 2 and the computing industry would still be leaps and bounds ahead of most others.
 
2013-12-07 12:25:30 PM
Some technologies are advancing faster than Moore's law:

http://www.genome.gov/images/content/cost_per_genome_apr.jpg
 
2013-12-07 01:15:17 PM

Alonjar: Kind of a moot point when they purposely stagger hardware releases to whatever timeline yields the most profitability... it's not like they actually push the tech out to consumers as fast as they can.


Oh, how I wish that were true.

/13 years as an engineer in the semiconductor industry
//It is insanely competitive and pushing out a schedule can cost billions
 
Displayed 37 of 37 comments

This thread is closed to new comments.