(CNBC)   Nvidia says Moore's law is so over, but the company Moore co-founded begs to differ   (cnbc.com)

635 clicks; posted to STEM on 29 Sep 2022 at 12:49 PM



18 Comments
 
2022-09-29 12:50:52 PM  
There should be no question about who is right. Plot the number of gates by area over time and see if it's fallen off.  If it's not dead yet, it is dying tho.

Sure as hell we know Intel's ability to be the leader on that curve was given up years ago tho. Fail.
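
Something like the sketch below would settle it (a minimal Python sketch; per-chip counts are the easiest numbers to find, so swap in density figures if you want the strictly per-area version; the transistor counts are rough public figures for a handful of Intel parts, and the last entry is only an order-of-magnitude guess):

# Rough sanity check: plot approximate transistor counts for a few Intel chips
# on a log scale. A straight line means roughly exponential (Moore-ish) growth;
# a visible flattening at the right edge is the "it's dying" argument.
import matplotlib.pyplot as plt

# (year, approximate transistor count): ballpark public figures, not exact
chips = [
    (1971, 2.3e3),   # 4004
    (1978, 2.9e4),   # 8086
    (1985, 2.75e5),  # 386
    (1993, 3.1e6),   # Pentium
    (2000, 4.2e7),   # Pentium 4
    (2006, 2.9e8),   # Core 2 Duo
    (2014, 1.4e9),   # Haswell quad-core, roughly
    (2021, 1.0e10),  # recent desktop part, order-of-magnitude guess
]

years, counts = zip(*chips)
plt.semilogy(years, counts, "o-")
plt.xlabel("Year")
plt.ylabel("Transistors per chip (log scale)")
plt.title("Is the curve still a straight line?")
plt.show()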
 
2022-09-29 12:56:41 PM  
uhhh wait this don't make no sense to me

"In his vision, intense applications like artificial intelligence can run on the specific processor that handles them the best, which would be the graphics processor that Nvidia develops. In other words, there's less need for Intel's specialty."


uhhh that would seem to be the exact opposite of the way they... curiously worded it.

a "specific processor" that "handles it best" would mean a specialty/bespoke processor, one designed for a specific job.

While Intel, making CPUs, is in fact making generalized computational hardware, NOT specialty job-specific processors.


that does not seem like the most accurate or honest statement about the situation, as far as what Intel and Nvidia are each designing and what market space they each fill.
 
2022-09-29 1:14:57 PM  

PvtStash: While Intel, making CPUs are in fact making generalized computational hardware. NOT specialty job specific processors.


That's why Intel has been trying to break into the GPU and neural processing space. x86 is on its way out and they have been working on ARM procs for a decade with nothing to show for it. They are so far behind the curve wasting time on inferior fabs, low-yield processors, and low-margin garbage that I doubt they will ever be a leader in anything at this point.

Apple might as well buy them out, and they nearly have the cash to do just that.
 
2022-09-29 1:46:24 PM  
intel? if i wanted an opinion from the leader in CPU manufacturing i'd go interview AMD.
 
2022-09-29 1:49:07 PM  

Stibium: PvtStash: While Intel, making CPUs are in fact making generalized computational hardware. NOT specialty job specific processors.

That's why Intel has been trying to break into the GPU and neural processing space. x86 is on it's way out and they have been working on ARM procs for a decade with nothing to show for it. They are so far behind the curve wasting time on inferior fabs, low yield processors, and low margin garbage I doubt they will ever be a leader in anything at this point.

Apple might as well buy them out, and they nearly have the cash to do just that.


Doubt Apple has any interest in them, as they seem to be doing pretty well with their own ARM based processors.

Just double checked and the new M2s are based on 5nm as well, so they can't even use Intel as an in-house fabricator without some overhauling.

They're likely better off not even bothering.
 
2022-09-29 2:06:00 PM  

Stibium: x86 is on it's way out and they have been working on ARM procs for a decade with nothing to show for it.


Intel sold off its ARM business in 2006 and focused on Atom. They've been investing in RISC-V for very low power / specialized chips and currently offer it as soft cores in their formerly-Altera products. They plan to offer physical cores for hybrid designs as part of their expanding foundry services, though they will support ARM as well.

Apple would be foolish to buy Intel. Their CPU designs are a means to an end and it's going very well for them.

There doesn't seem to be a contradiction between Nvidia's and Intel's approaches. Both are working on accelerators, packaging, and more efficient use of silicon. Nvidia doesn't make general-purpose CPUs, so its focus will naturally be narrower.
 
2022-09-29 2:24:45 PM  

Stibium: Apple might as well buy them out, and they nearly have the cash to do just that.


Nearly?

Apple has over $200 billion in cash and cash equivalents. They could almost buy them with cash TWICE. In fact, they could issue a few billion in bonds and buy BOTH Intel and AMD right now.
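
Back-of-the-envelope, taking that cash figure at face value (the market caps below are rough late-September-2022 ballparks from memory, not exact quotes):

# Very rough acquisition math; every number here is approximate / illustrative
intel_mkt_cap = 110e9  # ~$110B, ballpark Intel market cap, late 2022
amd_mkt_cap = 105e9    # ~$105B, ballpark AMD market cap, late 2022
apple_cash = 200e9     # the "over $200 billion" figure claimed above

both = intel_mkt_cap + amd_mkt_cap
print(f"Intel + AMD together: ~${both / 1e9:.0f}B")
print(f"Gap to cover with bonds: ~${(both - apple_cash) / 1e9:.0f}B")

(Ignoring the premium over market cap you'd actually have to pay, of course.)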
 
2022-09-29 2:56:10 PM  
"... that Moore's Law, a rule of thumb from Intel's founder dating back to the 1960s, is 'alive and well.' The theory, posited by Gordon Moore ..."

Well which is it? Moore's law was never a "law," and not even a theory. It was an observation, one that Moore didn't even get right the first time. He changed it when he realized his initial observation was wrong. The only reason Intel constantly crammed "Moore's Law" down our throats was marketing. Moore's Law was a marketing ploy to undermine competing RISC chips, which used fewer transistors. If you weren't keeping up with the number of transistors Intel was cramming into its chips, then you were behind the times.
 
2022-09-29 3:06:54 PM  

Stibium: PvtStash: While Intel, making CPUs are in fact making generalized computational hardware. NOT specialty job specific processors.

That's why Intel has been trying to break into the GPU and neural processing space. x86 is on it's way out and they have been working on ARM procs for a decade with nothing to show for it. They are so far behind the curve wasting time on inferior fabs, low yield processors, and low margin garbage I doubt they will ever be a leader in anything at this point.

Apple might as well buy them out, and they nearly have the cash to do just that.



uhhhh, yeah sure, that's all totally correct, and just 100% off topic for what I was posting about.

Did you have any additional insights on the actual topic I was posting about: that Nvidia describes itself as specialty processing while using wording that suggests Intel's CPUs are the same kind of specific processing, when Intel is best known for designing the CPU, i.e. general-purpose processors?

that sure did seem oddly worded, as if to paint a not-so-accurate picture of reality that only careful critical thinking and reading comprehension would catch.
 
2022-09-29 4:27:25 PM  

freakdiablo: Stibium: PvtStash: While Intel, making CPUs are in fact making generalized computational hardware. NOT specialty job specific processors.

That's why Intel has been trying to break into the GPU and neural processing space. x86 is on it's way out and they have been working on ARM procs for a decade with nothing to show for it. They are so far behind the curve wasting time on inferior fabs, low yield processors, and low margin garbage I doubt they will ever be a leader in anything at this point.

Apple might as well buy them out, and they nearly have the cash to do just that.

Doubt Apple has any interest in them, as they seem to be doing pretty well with their own ARM based processors.

Just double checked and the new M2s are based on 5nm as well, so they can't even use Intel as an in-house fabricator without some overhauling.

They're likely better off not even bothering.


Is Apple allowing third parties to build systems with their processors? If not, I'm having a hard time believing that their processors are going to catch on in enterprise environments.
 
2022-09-29 6:07:39 PM  

PvtStash: Stibium: PvtStash: While Intel, making CPUs are in fact making generalized computational hardware. NOT specialty job specific processors.

That's why Intel has been trying to break into the GPU and neural processing space. x86 is on it's way out and they have been working on ARM procs for a decade with nothing to show for it. They are so far behind the curve wasting time on inferior fabs, low yield processors, and low margin garbage I doubt they will ever be a leader in anything at this point.

Apple might as well buy them out, and they nearly have the cash to do just that.


uhhhh, yeah sure that's all totally correct, and just 100% off topic for what i as posting about.

Did you have any additional insights to the actual topic i was posting about, that NVIDIA describing themselves as specialty processing while also using terminology to suggest intel CPUs are of the same nature as specific processing when they are most known as the CPU, or general processor designers?

that sure did seem oddly worded so as to be manipulatively creating a not so accurate depiction of reality that only careful critical thinking and reading compression would discern the questionable nature of the wording there.


"Intel specializes in general-purpose processors."

This doesn't break my brain.

But then, my job title, in full, is "Specialist," and I do all kinds of shiat, so... I see the humor.

/No, not the 1994-movie kind of Specialist.
//No, not the Dead Milkmen-song kind of Specialist, either.
///No, not the E-4 kind of Specialist, either.
////If I told you, I'd have to SLASHIES!
 
2022-09-29 8:50:55 PM  

PvtStash: Did you have any additional insights to the actual topic i was posting about, that NVIDIA describing themselves as specialty processing while also using terminology to suggest intel CPUs are of the same nature as specific processing when they are most known as the CPU, or general processor designers?


It's specialty processors, all the way down. The idea of any digital implementation of an ISA being more or less generic than another is entirely subjective.

My comment had more to do with the application of Moore's law that TFA was referencing, and more specifically with the fabrication technology (and Intel's lack thereof) that allows additional transistor density beyond what circuit simplification can yield. That, as opposed to questioning TFA's syntax for lack of understanding that Intel is not only a processor design company, but also one that fabricates processors using specialized technology.
 
2022-09-29 8:52:43 PM  

madgonad: Stibium: Apple might as well buy them out, and they nearly have the cash to do just that.

Nearly?

Apple has over $200 billion in cash and cash equivalents. They could almost buy them with cash TWICE. In fact, they could issue a few billion in bonds and buy BOTH Intel and AMD right now.


https://www.macrotrends.net/stocks/charts/AAPL/apple/cash-on-hand
 
2022-09-29 11:01:54 PM  

Stibium: madgonad: Stibium: Apple might as well buy them out, and they nearly have the cash to do just that.

Nearly?

Apple has over $200 billion in cash and cash equivalents. They could almost buy them with cash TWICE. In fact, they could issue a few billion in bonds and buy BOTH Intel and AMD right now.

https://www.macrotrends.net/stocks/charts/AAPL/apple/cash-on-hand


"Apple cash on hand for the quarter ending June 30, 2022 was $48.231B, a 21.82% decline year-over-year."

Now I'm sad.

For that kind of money, all they could buy is Dell.

Just to shut it down and give the money back to the shareholders, of course. ;)

/Actually, Dell'd only cost about half that.
 
2022-09-30 12:42:39 AM  

balial: There should be no question about who is right. Plot the number of gates by area over time and see if it's fallen off.  If it's not dead yet, it is dying tho.

Sure as hell we know Intel's ability to be the leader on that curve was given up years ago tho. Fail.


I don't think "per area" even matters. As originally stated, the ICs can get bigger (or stacked, or whatever), the transistors can get smaller, or both... as long as the number of transistors on an IC on one of the denser modern processes goes up.
 
2022-09-30 12:32:21 PM  

raygundan: balial: There should be no question about who is right. Plot the number of gates by area over time and see if it's fallen off.  If it's not dead yet, it is dying tho.

Sure as hell we know Intel's ability to be the leader on that curve was given up years ago tho. Fail.

I don't think "per area" even matters. As originally stated, the ICs can get bigger (or stacked, or whatever), the transistors can get smaller, or both... as long as the number of transistors on an IC on one of the denser modern processes goes up.


It's exactly about area. Moore's law is about density. You're right that you can always make _more_ transistors given more space, but we could always do that. The problem is that you run into thermal and electrical problems. Every time you shrink the process, you save power and hence energy. This is what makes it such a big win.

The area thing is also why this is in no way subjective. Moore's law is very well defined. It's not about general ability to compute or flops or some other metric, it's just transistors per square whatever.
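
For a concrete "transistors per square whatever" number, here's the arithmetic using approximate public figures for Apple's M1 (roughly 16 billion transistors on a die of roughly 120 mm^2, both ballpark):

# Rough density arithmetic; both inputs are approximate public figures
transistors = 16e9    # ~16 billion transistors (Apple's stated figure for M1)
die_area_mm2 = 120.0  # ~120 mm^2 die area (approximate, from die-shot estimates)
print(f"{transistors / die_area_mm2 / 1e6:.0f} million transistors per mm^2")  # ~133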
 
2022-09-30 8:00:21 PM  

balial: It's exactly about area. Moore's law is about density.


It wasn't originally-- Moore's original speculation was that circuits would double in transistor count, whether that happened by density increase or literally just making bigger chips.

It got kinda-sorta-reinterpreted over the years into two related "laws"-- that the speed would double every two years and that the density would double every two years.  But the original was just "there will be twice as many transistors."  The original 1965 quote from Moore is barely recognizable compared to the things people have added on to it over the years:

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year."
In 1975, he revised that to every two years. And while historically the main driver of increased complexity was increasing density... that's not a requirement. It also gets mixed up and combined with Dennard scaling (which stopped working ~15 years ago or so) and some other "laws" of the industry.

But as originally stated, Moore's law can hold even if density improvements completely stop, as long as somebody keeps making larger and larger chips or stacks of chips and so on.

balial: Moore's law is very well defined. It's not about general ability to compute or flops or some other metric, it's just transistors per square whatever.

It is very well defined... you can go look up the original quote from another source if you don't believe me. Density isn't involved except indirectly, just circuit complexity.
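
If you want it as a formula, the 1975 restatement is just "transistors per chip double every two years," with nothing about area. A minimal sketch (the 1993 Pentium count is an approximate public figure, used only as an anchor point):

# Moore's law in its "doubling every two years" form: transistors per chip,
# not transistors per unit area. Density and die size are both just ways
# of getting there.
def moores_law(n0, year0, year):
    """Predicted transistor count, assuming a doubling every two years."""
    return n0 * 2 ** ((year - year0) / 2)

# Project forward from the Pentium's ~3.1 million transistors in 1993.
print(f"{moores_law(3.1e6, 1993, 2022):.2e} transistors predicted for 2022")

That lands around 7e10, which is at least in the same ballpark as the biggest 2022-era chips (Nvidia's H100 is quoted at ~80 billion transistors), for whatever a single anchor point is worth.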
 
2022-09-30 8:04:08 PM  
balial:
The area thing is also why this is in no way subjective. Moore's law is very well defined. It's not about general ability to compute or flops or some other metric, it's just transistors per square whatever.

Sorry about fark eating the formatting up there.  I should know better than to trust the editor at this point.

But just in case the garbled formatting is too screwed up to read... Moore's law is not about density, or speed, or performance, or power use.  Just circuit complexity.
 
Displayed 18 of 18 comments

This thread is closed to new comments.