
(MIT Technology Review)   Moore's Law no more?   (technologyreview.com)
    More: Fail, Moore's Law, direct current, Alternating Current, potential energy, wafers, Technology Review, ultraviolet light, Nikon

9428 clicks; posted to Geek » on 16 Jul 2012 at 11:14 AM



39 Comments
   

Archived thread
 
2012-07-16 09:46:43 AM  
Why Fail? Moore's Law was never actually intended to be a real law, just an after-the-fact observation on the amount of transistors you can fit onto a given bit of silicon.

It's pretty much inevitable that Moore's Law will no longer be true at some point, once single atom transistors become possible and widespread.
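
The observation itself is easy to sanity-check numerically. A rough sketch (not Moore's exact wording, which was about cost-optimal component counts; the 1971 Intel 4004 starting point and its ~2,300 transistors are well-documented, and the two-year doubling period is the commonly cited version):

```python
# Starting from the Intel 4004 (1971, ~2,300 transistors) and doubling
# every two years, see how quickly counts reach modern scales.
transistors = 2300
year = 1971
while transistors < 1_000_000_000:  # one billion transistors
    year += 2
    transistors *= 2
print(year, transistors)  # → 2009 1205862400
```

Landing in the late 2000s for billion-transistor chips is roughly where reality ended up, which is why the "law" stuck around as long as it did.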
 
2012-07-16 09:49:25 AM  
Repeat from two years ago and two years before that.
 
2012-07-16 11:24:56 AM  
OMG this is huge! If we wind up breaking Moore's law then...absolutely nothing will happen!

Seriously, did Moore have connections with the Illuminati or something? Why is this a big deal?
 
2012-07-16 11:44:23 AM  
I predict that the number of articles predicting the end of Moore's Law doubles every 18 months.
 
2012-07-16 11:44:41 AM  
So much for it being a "law".

/teach the controversy
 
2012-07-16 11:46:00 AM  
Remember when they said gate leakage with features under 1 micrometer would be the undoing of Moore's law? Good times. Good times.
 
2012-07-16 11:46:13 AM  

RexTalionis: Why Fail? Moore's Law was never actually intended to be a real law, just an after-the-fact observation on the amount of transistors you can fit onto a given bit of silicon.

It's pretty much inevitable that Moore's Law will no longer be true at some point, once single atom transistors become possible and widespread.


Actually, we will have problems long before we get to single-atom transistors. Quantum mechanical phenomena are already limiting the size of components. If you try to confine a single electron, it will very often decide "screw you" and tunnel through the surrounding matter. So, to keep the behavior of the circuit predictable, you need a quorum of a few thousand or so electrons, to ensure you are monitoring a population where the aberrance of a few individuals does not throw off the calculation.
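
The "quorum" point is just counting statistics: the relative fluctuation in a count of N independent charge carriers scales roughly as 1/√N, so a handful of electrons is wildly unpredictable while a few thousand gets you to percent-level stability. A back-of-the-envelope sketch (the 1/√N scaling is standard Poisson statistics; the specific counts are illustrative):

```python
import math

# Relative fluctuation of a count of N independent carriers
# (Poisson statistics): sigma/N = 1/sqrt(N).
def relative_fluctuation(n_electrons):
    return 1.0 / math.sqrt(n_electrons)

for n in (10, 1_000, 100_000):
    print(n, f"{relative_fluctuation(n):.1%}")
# → 10 31.6%
# → 1000 3.2%
# → 100000 0.3%
```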

Also, how many transistors they can pack onto a chip is at this point a dick-size comparison. The really expensive parts of chip manufacturing are packaging and testing. (Thus why the industry is striving for an all-singing, all-dancing, everything-on-one-chip computer.)
 
2012-07-16 11:47:28 AM  
awjeeze.jpg
 
2012-07-16 11:48:06 AM  

Virtuoso80: OMG this is huge! If we wind up breaking Moore's law then...absolutely nothing will happen!

Seriously, did Moore have connections with the Illuminati or something? Why is this a big deal?


The article even acts like Intel follows Moore's Law like it's gospel. Intel follows market trends and tries to stay ahead of the game; it doesn't give two shiats about what Moore thought.
 
2012-07-16 11:48:45 AM  
A/S/M/L?
 
2012-07-16 11:51:35 AM  
Why does the writer think that larger wafers will in any way increase chip speed?
 
2012-07-16 12:00:31 PM  
I predict that the number of people predicting the end of Moore's law will double every two years.
 
2012-07-16 12:16:58 PM  

RexTalionis: Moore's Law was never actually intended to be a real law


Not like Rule 34.
 
2012-07-16 12:30:06 PM  

Arkanaut: RexTalionis: Moore's Law was never actually intended to be a real law

Not like Rule 34.


ishkur.com

/Seemed relevant
 
2012-07-16 12:41:55 PM  
Moore's Law has been commonly interpreted to deal with CPU performance, but regardless of whether that was ever accurate, it is now fairly inaccurate for consumer CPUs:

i158.photobucket.com

I doubt it will continue to be accurate for GPGPUs for any extended period of time in the future.
 
2012-07-16 12:54:05 PM  

Virtuoso80: OMG this is huge! If we wind up breaking Moore's law then...absolutely nothing will happen!

Seriously, did Moore have connections with the Illuminati or something? Why is this a big deal?


Investors.
 
2012-07-16 01:18:45 PM  

Virtuoso80: OMG this is huge! If we wind up breaking Moore's law then...absolutely nothing will happen!

Seriously, did Moore have connections with the Illuminati or something? Why is this a big deal?


Moore is a lot like us. He thinks the whole idea is silly.

He just has the unfortunate luck of being the guy who said the original (and nearly always misquoted) definition.
 
2012-07-16 01:33:49 PM  

Virtuoso80: OMG this is huge! If we wind up breaking Moore's law then...absolutely nothing will happen!


In one sense, yes. In another sense, however, it will mean a big adjustment for the tech industry, which now builds Moore's Law into its planning. It will also mean that, once we reach that point, that's about as good as computers will get for a good, long time. For those of us who are used to trading up to an ever faster and smaller machine every few years, there's going to be an adjustment in perspective.

It also means that, from that point forward, we'll be returning to the era of the mega-machines, since that's your only option if you can't go smaller and faster, although we're probably looking at distributed networks rather than dedicated hardware like the old Crays.
 
2012-07-16 01:56:47 PM  

t3knomanser: I predict that the number of articles predicting the end of Moore's Law doubles every 18 months.


Moore's Meta Law
 
2012-07-16 02:01:52 PM  
In other news, IBM smashes Moore's Law.

3D chip stacking and quantum computing will keep us going. Once the medium reaches its limit, the paradigm changes, e.g. vacuum tubes to transistors. The singularity is coming. . .
 
2012-07-16 02:14:12 PM  
Lately, I double the amount of cores in my machine every 18 months or so.....
 
2012-07-16 02:15:38 PM  
Moore's Suggestion, you mean
 
2012-07-16 02:18:48 PM  

andrewagill: it is now fairly inaccurate for consumer CPUs:


And GFLOPS is a fairly inaccurate metric for overall CPU performance...
 
2012-07-16 02:31:40 PM  

dalovindj: In other news, IBM smashes Moore's Law.

3D chip stacking and quantum computing will keep us going. Once the medium reaches its limit, the paradigm changes, e.g. vacuum tubes to transistors. The singularity is coming. . .

FTA The technology could also someday be applied to tape media.


The singularity will be on tape.
 
2012-07-16 02:44:24 PM  

dready zim: Lately, I double the amount of cores in my machine every 18 months or so.....


Not much software that utilizes those extra cores, though.

:(
 
2012-07-16 03:01:17 PM  

RexTalionis: Why Fail? Moore's Law was never actually intended to be a real law, just an after-the-fact observation on the amount of transistors you can fit onto a given bit of silicon.

It's pretty much inevitable that Moore's Law will no longer be true at some point, once single atom transistors become possible and widespread.


In theory there are things smaller than atoms, and other means to process data beyond physical media. Moore's Law might still hold true so long as development continues.

I think the real rub is how much investors are willing to pay scientists to push the boundaries. Because, in the end, it's money that makes this law ring true.
 
2012-07-16 03:11:16 PM  

way south: I think the real rub is how much investors are willing to pay scientists to push the boundaries. Because, in the end, it's money that makes this law ring true.


I think I mentioned that already. :)
 
2012-07-16 04:23:24 PM  

IronButterfly: Moore's Suggestion, you mean


If we want to be technically accurate, it was more akin to "Moore's observation"
 
2012-07-16 04:44:38 PM  

StoPPeRmobile: dready zim: Lately, I double the amount of cores in my machine every 18 months or so.....

Not much software that utilizes those extra cores, though.

:(


Realistically, would you be more impressed or horrified if you saw Outlook or Firefox eating 4 cores?

To be fair, most software doesn't need to, and the stuff that really benefits from it tends to multi-thread, or at least make valiant attempts at it. What's important is that all those little programs that have no need to multi-thread, and don't consume large amounts of the core they're running on, can all be shoveled into one of your 4/6/8 cores and leave the rest alone for the piggies like video encoders. Even a lot of games are starting to show up with threading, at least the ones that genuinely could benefit from it.

/What's amusing is now that I finally have a wicked fast OCed 2600K to video encode with it turns out there's software that does it way way faster using my 580GTXes instead.
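
The split described above (many small single-threaded programs sharing one core, heavy jobs spread across the rest) is roughly what a process pool gives you for the heavy jobs. A minimal sketch, assuming the work is embarrassingly parallel per frame; `encode_frame` and the frame count are made up for illustration, not a real encoder:

```python
from multiprocessing import Pool

def encode_frame(frame_id):
    # Stand-in for CPU-heavy per-frame work; a real encoder does far more.
    return sum(i * i for i in range(10_000)) + frame_id

if __name__ == "__main__":
    frames = range(8)
    with Pool() as pool:  # one worker per CPU core by default
        results = pool.map(encode_frame, frames)
    print(len(results))  # → 8
```

Processes rather than threads matter here for CPU-bound Python work, since threads in CPython contend for the interpreter lock.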
 
2012-07-16 04:47:21 PM  

RexTalionis: Why Fail? Moore's Law was never actually intended to be a real law, just an after-the-fact observation on the amount of transistors you can fit onto a given bit of silicon.


Didn't Moore himself say not only that it wasn't a law, but that it'll eventually slow and become non-existent at some point in time?
 
2012-07-16 05:41:54 PM  

BumpInTheNight: /What's amusing is now that I finally have a wicked fast OCed 2600K to video encode with it turns out there's software that does it way way faster using my 580GTXes instead.


I haven't researched the issue recently, but last I checked several months ago the GPU-based video encoding solutions all resulted in a noticeably inferior quality result compared to a software encoding method like Handbrake.
 
2012-07-16 06:10:30 PM  

the_sidewinder: IronButterfly: Moore's Suggestion, you mean

If we want to be technically accurate, it was more akin to "Moore's observation"


No, it wasn't an observation. If you're the boss and you tell your employees what to do, how can you call the result of that an "observation"?
 
2012-07-16 07:00:22 PM  
It's been a good run.
 
2012-07-16 07:35:45 PM  

StoPPeRmobile: dready zim: Lately, I double the amount of cores in my machine every 18 months or so.....

Not much software that utilizes those extra cores, though.

:(


After effects....

It doesn't take much to slow 8 cores at 4GHz with 24GB of RAM down to a crawl. The last thing I made took about 4 hours to render 2 minutes of footage.
 
2012-07-16 07:39:16 PM  

poot_rootbeer: BumpInTheNight: /What's amusing is now that I finally have a wicked fast OCed 2600K to video encode with it turns out there's software that does it way way faster using my 580GTXes instead.

I haven't researched the issue recently, but last I checked several months ago the GPU-based video encoding solutions all resulted in a noticeably inferior quality result compared to a software encoding method like Handbrake.


I use the GPU for speed to edit and compose and I do the final render without the GPU. Best of both.
 
2012-07-17 01:19:34 AM  
I wonder what AMD is doing about this...
 
2012-07-17 03:56:28 AM  

Maul555: I wonder what AMD is doing about this...


Making cheap processors for people that mainly play games?
 
2012-07-17 04:58:43 AM  

dready zim: Maul555: I wonder what AMD is doing about this...

Making cheap processors for people that mainly play games?


FTFY, it's not just gamers that sometimes rifle through the discount pile.
 
2012-07-18 06:52:47 PM  

Evil Twin Skippy: RexTalionis: Why Fail? Moore's Law was never actually intended to be a real law, just an after-the-fact observation on the amount of transistors you can fit onto a given bit of silicon.

It's pretty much inevitable that Moore's Law will no longer be true at some point, once single atom transistors become possible and widespread.

Actually, we will have problems long before we get to single-atom transistors. Quantum mechanical phenomena are already limiting the size of components. If you try to confine a single electron, it will very often decide "screw you" and tunnel through the surrounding matter. So, to keep the behavior of the circuit predictable, you need a quorum of a few thousand or so electrons, to ensure you are monitoring a population where the aberrance of a few individuals does not throw off the calculation.

Also, how many transistors they can pack onto a chip is at this point a dick-size comparison. The really expensive parts of chip manufacturing are packaging and testing. (Thus why the industry is striving for an all-singing, all-dancing, everything-on-one-chip computer.)


No, packaging and testing are not the expensive parts of chip making. The scales of packaging are huge, and the process equipment is low tech. Testing just takes time.

Intel likes the all-in-one chip because they would make more money if we didn't buy nVidia, etc.
 
Displayed 39 of 39 comments



This thread is archived, and closed to new comments.
