
Scientists break Moore's Law by creating single atom transistor 8 years ahead of schedule (abc.net.au)
    More: Cool, Moore's Law, atoms, hydrogen atoms, Nature Nanotechnology, transistors, University of New South Wales, qubits, vacuum chambers  

6338 clicks; posted to Geek » on 20 Feb 2012 at 12:07 AM



74 Comments
   
View Voting Results: Smartest and Funniest

Archived thread
 
2012-02-19 08:42:44 PM  
Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.
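The doubling described in the quoted definition can be sanity-checked numerically. A minimal sketch, using the Intel 4004's roughly 2,300 transistors (1971) as an assumed baseline for illustration:

```python
def projected_transistors(start_count, start_year, target_year, doubling_years=2):
    """Project a transistor count forward under Moore's-law doubling."""
    doublings = (target_year - start_year) / doubling_years
    return start_count * 2 ** doublings

# Assumed baseline: Intel 4004, ~2,300 transistors in 1971.
# Projecting to 2012 lands in the low billions, roughly matching
# the transistor counts of high-end 2012-era CPUs.
print(f"{projected_transistors(2300, 1971, 2012):.2e}")
```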
 
2012-02-19 09:43:56 PM  
Welcome to the new age of discovery.
 
2012-02-19 09:44:32 PM  
Oh, and now intel might be able to break the 4Ghz stock speed
 
2012-02-19 11:59:34 PM  
When does it become self-aware?
 
2012-02-20 12:10:38 AM  
Actually, this is right on time. Given 8 more years, the technology will be mature enough to create trillions of them on a single surface cheaply and quickly, and sell on the open market for use. They didn't beat the law, it feels like they're upholding it.
 
2012-02-20 12:17:20 AM  
I suppose computing will mean something unimaginable compared to what "computing" means now, in the near future.

It will be like comparing the computing done on an abacus to the computing done playing an online 3D game.
 
2012-02-20 12:26:39 AM  

cman: Oh, and now intel might be able to break the 4Ghz stock speed


fark that shiat, more cores!
 
2012-02-20 12:28:48 AM  

johnnyrocket: I suppose computing will mean something unimaginable compared to what "computing" means now, in the near future.

It will be like comparing the computing done on an abacus to the computing done playing an online 3D game.


I used to have a poster that said 'if cars evolved as quickly as computing, a Rolls-Royce would go 5,000 mph and cost $1.'

That was 30 years ago, in 1982.
 
2012-02-20 12:35:37 AM  

cman: Oh, and now intel might be able to break the 4Ghz stock speed


No it won't. Heat output of a CMOS circuit scales with frequency. Modern clock speed limits have nothing to do with feature size and everything to do with the inability of air-cooled finned heatsinks to effectively remove more than ~150 watts of heat from a one square centimeter area without sounding like jet engines or having an unacceptable temperature at the hot side, known as the Thermal Brick Wall.

And no, Joe Average is not going to install liquid cooling until it's maintenance-free for life, the same way we expect every other piece of computer hardware to be.
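The frequency argument above follows from the standard dynamic-power relation for CMOS logic, P ≈ α·C·V²·f. A minimal sketch; the activity factor, switched capacitance, and voltage below are illustrative assumptions, not measurements of any real chip:

```python
def dynamic_power(activity, switched_capacitance_f, voltage_v, freq_hz):
    """Approximate CMOS dynamic (switching) power in watts: P = a*C*V^2*f."""
    return activity * switched_capacitance_f * voltage_v ** 2 * freq_hz

# Illustrative values chosen to land near the ~150 W air-cooling limit.
p_4ghz = dynamic_power(0.2, 100e-9, 1.2, 4e9)  # hypothetical chip at 4 GHz
p_8ghz = dynamic_power(0.2, 100e-9, 1.2, 8e9)  # same chip, doubled clock
print(p_4ghz, p_8ghz)  # power doubles with frequency
```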
 
2012-02-20 12:38:43 AM  

gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.


Read it again: many have made a single-atom transistor; they have worked it to a level of precision that is repeatable and useful for manufacturing.
 
2012-02-20 12:46:37 AM  
So how long before we make transistors out of a single electron? Or are we just going to start splitting atoms to squeeze more of them into the processor?
 
2012-02-20 12:46:53 AM  
This story helpfully omits the fact that it has to be something like -200 F to work.
 
2012-02-20 12:48:56 AM  

Electrify: So how long before we make transistors out of a single electron? Or are we just going to start splitting atoms to squeeze more of them into the processor?


The next advancement seems to be carbon and silicon nanotubes. Current two-dimensional processors are inefficient compared to three-dimensional processors.
 
2012-02-20 01:02:57 AM  
No one cares about Moore's law being too speedy, we are more concerned with keeping it going.
 
2012-02-20 01:04:27 AM  

Chiad: gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.

Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.


Nope, placing atoms individually with the tip of an electron microscope isn't inexpensive, and certainly won't scale to the level that you could build an actual computer chip with it.

It's still a cool result, but I wouldn't exactly be popping champagne over it.
 
2012-02-20 01:30:18 AM  
Aaaaand done in the Boobies. Moore's Law is an axiom that refers to what the consumer sees, not what R&D does.
 
2012-02-20 01:32:51 AM  
Am I the only one who fears the inevitable coming of the nano plague?
 
2012-02-20 01:34:50 AM  

PerilousApricot: Chiad: gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.

Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.

Nope, placing atoms individually with the tip of an electron microscope isn't inexpensive, and certainly won't scale to the level that you could build an actual computer chip with it.

It's still a cool result, but I wouldn't exactly be popping champagne over it.



Repeatable and useful, not inexpensive, they still have to work that part out, I'll grant you. But this isn't the first single atom transistor.
 
2012-02-20 01:38:47 AM  

unyon: johnnyrocket: I suppose computing will mean something unimaginable compared to what "computing" means now, in the near future.

It will be like comparing the computing done on an abacus to the computing done playing an online 3D game.

I used to have a poster that said 'if cars evolved as quickly as computing, a Rolls-Royce would go 5,000 mph and cost $1.'

That was 30 years ago, in 1982.


Unless it was an Apple, then it would still cost $1000 more than it needs to.
 
2012-02-20 02:11:42 AM  
SN1987a goes boom

Unless it was an Apple, then it would still cost $1000 more than it needs to.

Go do your homework, junior, the adults are talking here.
 
2012-02-20 02:34:15 AM  

bingethinker: Go do your homework, junior, the adults are talking here.


he's right
 
2012-02-20 03:17:00 AM  
You can tell the article was written with Americans in mind. It includes a link to Google maps showing what an Australia may look like.
 
2012-02-20 03:26:13 AM  
One more step towards singularity.
 
2012-02-20 03:45:19 AM  

erik-k: And no, Joe Average is not going to install liquid cooling until it's maintenance-free for life, the same way we expect every other piece of computer hardware to be.


Take a look at FC-72. It's easy enough to design a self-propelling spray system for it using the same waste heat it's built to remove. Granted, this may not be practical as performance will be affected by motion.

Fiber-based laser cooling may be a decent option though.
 
2012-02-20 03:46:39 AM  
It'll still end up booting from BIOS.
 
2012-02-20 04:02:04 AM  
Meanwhile, Moore and his wife are putting that money to good use. I'd give the man a [Hero] tag if I could.
 
2012-02-20 04:10:40 AM  

Chiad: gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.

Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.


Making it precise enough to be repeatable doesn't mean it's cheap enough to be manufacturable. Totally different goals. At best, within 10 years you'd have other groups repeat the experiment.
 
2012-02-20 04:58:25 AM  
Serious question here, anyone know if the human brain can power these components?
 
2012-02-20 05:13:54 AM  
Meh, wake me up when they start building computers from quarks.
 
2012-02-20 05:20:57 AM  
*cough*

i29.tinypic.com
 
2012-02-20 05:37:48 AM  

erik-k: And no, Joe Average is not going to install liquid cooling until it's maintenance-free for life, the same way we expect every other piece of computer hardware to be.


Joe Average doesn't install his own computer components. He buys a Dell and throws it out when it meets its planned obsolescence in 3-5 years and buys another Dell.

For those who do install their own components (Joe Above-Average), all-in-one coolers like the Corsair H80 are brilliant, and yes, maintenance-free (and much smaller than the giganormous air-cooled sinks you see these days).
 
2012-02-20 05:54:36 AM  
Doesn't Moore's Law allow for a lag phase in an 'old' technology while a 'new' technology is being developed that can then start a new period of exponential evolution? We can't make a 'classic' transistor smaller than one atom, so I would guess we're approaching (or are already in) this lag phase (constraints on working with single atoms or low numbers of atoms, quantum effects and the like). So when can I expect my radical new approach to a personal computer?

/Open the pod bay doors, HAL.
//HAL?
///...Halp!?
 
2012-02-20 06:11:19 AM  

cman: Welcome to the new age of discovery.


Welcome to the age of finger nail sized iPhones.

/An age of discovery requires something to be discovered.
/Can't do much of that down here.
 
2012-02-20 06:32:51 AM  
I was also going to say it, but it was not on a silicon chip, therefore it is not cheating Moore's law.
 
2012-02-20 07:21:53 AM  

BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right


B-b-b-but the stores are so pretty and it's what all the cool kids buy
 
2012-02-20 07:35:17 AM  

phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy



I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vendor. So no, he's not right, and neither are you.
 
2012-02-20 07:52:44 AM  

erik-k: cman: Oh, and now intel might be able to break the 4Ghz stock speed

No it won't. Heat output of a CMOS circuit scales with frequency. Modern clock speed limits have nothing to do with feature size and everything to do with the inability of air-cooled finned heatsinks to effectively remove more than ~150 watts of heat from a one square centimeter area without sounding like jet engines or having an unacceptable temperature at the hot side, known as the Thermal Brick Wall.

And no, Joe Average is not going to install liquid cooling until it's maintenance-free for life, the same way we expect every other piece of computer hardware to be.


Liquid cooling is easy. Just drop your PC into an aquarium filled with mineral oil.
 
2012-02-20 07:59:04 AM  

DarnoKonrad: phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy


I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vendor I stayed at a Holiday Inn Express last night. So no, he's not right, and neither are you.


FTFY
 
2012-02-20 08:01:28 AM  

stonelotus: DarnoKonrad: phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy


I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vendor I stayed at a Holiday Inn Express last night. So no, he's not right, and neither are you.

FTFY


wut?
 
2012-02-20 08:04:17 AM  

BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right


Not in 1982 he wasn't. In 1982, your typical choices were:

The original IBM PC with an Intel 8088 (16-bit, ~4.77 MHz), 16 K of RAM, one floppy drive & no hard drive, running PC-DOS, for $2,945.

An Apple II Plus with a MOS 6502 (8-bit, ~1 MHz), 16 K of RAM, one floppy drive & no hard drive, running Apple DOS, for $1,195.

At the time, Apple's computer was clearly the better choice for home users. This stayed true in 1983, with the release of the IBM PC XT and the Apple IIe.

Where things started to diverge was with the 1984 release of the Macintosh. With PCs stuck on DOS and Apple rolling out a graphical UI, Apple could charge a premium and people would pay gladly. It wasn't until 1992 that Microsoft had a workable competitor with Windows 3.1, but that still ran on DOS. You can't say that Microsoft really caught up until Windows 95, released 11 years after the Mac.
 
2012-02-20 08:06:09 AM  

DarnoKonrad: phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy


I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vendor. So no, he's not right, and neither are you.


A monstrous amalgamation of pinko-commie stooges that cribbed their entire product line from sci-fi movies?

You're right, they are way more than a hardware vendor.
 
2012-02-20 08:06:21 AM  

Chiad: Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.


Sorry, but I've worked in cutting-edge semiconductor R&D. They move fast, but lab-to-market is still more than 5 years, and usually about 10. Moore's Law is intact.
 
2012-02-20 08:23:06 AM  
Patent trolls are already gearing up to file suit in Tyler, Texas.
 
2012-02-20 08:32:36 AM  

Chiad: PerilousApricot: Chiad: gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.

Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.

Nope, placing atoms individually with the tip of an electron microscope isn't inexpensive, and certainly won't scale to the level that you could build an actual computer chip with it.

It's still a cool result, but I wouldn't exactly be popping champagne over it.


Repeatable and useful, not inexpensive, they still have to work that part out, I'll grant you. But this isn't the first single atom transistor.


As an integration engineer who works in cutting-edge semiconductor research, I get a kick out of these articles. Don't get me wrong, it's really cool stuff, but I think most people don't understand the gap between a university lab result and an actual product. Did you notice the size of the sample? Today's work is done on 300mm wafers, with some lookahead work started on 450mm.

The reason? We scale primarily for cost, not for performance. It's not about having smaller chips to fit into smaller packaging, because no consumer product is going to be much smaller than it is now (it still needs to be handled); rather, we're trying to get more chips per wafer. If each generation isn't cheaper, then it's not going to be made. So far the industry has been able to keep the pace, but it's getting much more difficult.

Also, a functioning chip has many other elements, so it would be interesting to see how you would integrate this process with all the passives and other devices.

Anyhow, I'm not trying to pee into anyone's Corn Flakes, just to level-set expectations.
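The "more chips per wafer" economics described above can be sketched with the usual first-order gross-dies-per-wafer estimate (wafer area over die area, minus an edge-loss correction); the 100 mm² die size here is an illustrative assumption:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Common first-order estimate of gross dies per wafer."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

# Illustrative 100 mm^2 die: moving from 300 mm to 450 mm wafers
# more than doubles the dies per wafer, which is the cost argument.
print(dies_per_wafer(300, 100))
print(dies_per_wafer(450, 100))
```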
 
2012-02-20 08:39:30 AM  

cman: Oh, and now intel might be able to break the 4Ghz stock speed


Clock speed is no longer a meaningful measure for a computer's effectiveness.
 
2012-02-20 08:41:45 AM  

bingethinker: SN1987a goes boom

Unless it was an Apple, then it would still cost $1000 more than it needs to.

Go do your homework, junior, the adults are talking here.


Riiight, Apple's products aren't ridiculously expensive...
 
2012-02-20 08:44:52 AM  

The Iron duke: Serious question here, anyone know if the human brain can power these components?


The human brain isn't much of a power source. If you want human-powered computing, a few kinetically powered generators placed on various body parts (knees, hips, feet, upper arms) would be a better choice.
 
2012-02-20 08:49:39 AM  

DarnoKonrad: phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy


I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vendor. So no, he's not right, and neither are you.


Regardless of what business you think Apple is in, their hardware is ludicrously overpriced.

The image I made that demonstrates that is several years old, so I won't bother posting it as it's entirely outdated, but nothing has changed. $1000 extra for a computer, given that you're comparing a PC of similar capabilities, is a very conservative estimate. When last I priced out the difference, hard drives alone were costing more than 300% of what they cost when buying from non-Apple companies. There is no excuse for that, it's just gouging. The exact same HD, which at the time went for some $80-$90 elsewhere, was being sold on Apple's website (when you customize an Apple PC to buy) for over $300. This was about 4 years ago, maybe less.
 
2012-02-20 08:57:02 AM  

SomeAmerican: Not in 1982 he wasn't. In 1982, your typical choices were:

The original IBM PC with an Intel 8088 (16-bit, ~4.77 MHz), 16 K of RAM, one floppy drive & no hard drive, running PC-DOS, for $2,945.

An Apple II Plus with a MOS 6502 (8-bit, ~1 MHz), 16 K of RAM, one floppy drive & no hard drive, running Apple DOS, for $1,195.

At the time, Apple's computer was clearly the better choice for home users.


Unless you were actually a home user in 1982, in which case you would have either had a Texas Instruments TI-99/4A or a Commodore VIC-20. The Commodore sold for about $300; I forget the price tag on the TI.
 
2012-02-20 09:11:28 AM  

LavenderWolf: Regardless of what business you think Apple is in, their hardware is ludicrously overpriced.


They flat out don't sell hardware. They sell an interface that's tied to their hardware. If you don't want the interface, you'd never go to Apple to begin with. That's why it's not comparable. It's a package deal. If someone wants to say their interface is not worth the premium, that's their opinion -- but that's what they're selling, not 'apple brand' hard-drives.

And while people are loath to admit it sometimes, Apple's interface has always been a couple steps ahead of the competition. That's what you're paying for.
 
2012-02-20 09:17:24 AM  
So, if, as people are saying, this is upholding Moore's Law, and these transistors can be used in ~8 years, what do we do in 10? How do you go from 1-atom transistors to 1/2-atom transistors? Or am I being unimaginative? Like there'll be some other breakthrough where a transistor that's 2 atoms can hold more than 2 different states? Or whatever the value would be, as long as the number of atoms is 1 less than the number of states held, even if it's 1BB atoms and 1BB + 1 states.
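On the states-versus-atoms idea above: the information a device stores grows only logarithmically with the number of distinguishable states, so N states buy log2(N) bits. A quick sketch:

```python
import math

def bits_for_states(n_states):
    """Bits of information representable by n distinguishable states."""
    return math.log2(n_states)

print(bits_for_states(2))      # ordinary binary transistor: 1 bit
print(bits_for_states(4))      # a hypothetical 4-state device: 2 bits
print(bits_for_states(10**9))  # even a billion states: under 30 bits
```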
 
2012-02-20 10:10:48 AM  

Mayhem of the Black Underclass: How do you go from 1 atom transistors to 1/2 atom transistors? Or am I being unimaginative?


Current CPUs are 2D: transistors on a flat surface. The next gen will be 3D CPUs with nanotube transistors in three-dimensional constructions. Not sure how Moore's law applies there, but that's the direction we're headed, I think.
 
2012-02-20 10:13:01 AM  
You're all forgetting something:

Phase 3 is Profit!
 
2012-02-20 10:27:50 AM  

LavenderWolf: cman: Oh, and now intel might be able to break the 4Ghz stock speed

Clock speed is no longer a meaningful measure for a computer's effectiveness.


Clock speed hasn't been meaningful in a while. Years ago, my dream processor was the Intel Q6600. To me, it was like sticking 4 of my Pentium 4s on a single chip but getting even better results. Despite having 4 cores running at the same clock speed as my processor, it was 10 times faster.
 
2012-02-20 10:33:17 AM  

Mayhem of the Black Underclass: So, if, as people are saying, this is upholding Moore's Law, and these transistors can be used in ~8 years, what do we do in 10? How do you go from 1-atom transistors to 1/2-atom transistors? Or am I being unimaginative? Like there'll be some other breakthrough where a transistor that's 2 atoms can hold more than 2 different states? Or whatever the value would be, as long as the number of atoms is 1 less than the number of states held, even if it's 1BB atoms and 1BB + 1 states.


Moore's "law" literally has to have an end-point. The atomic scale is probably it, though it could get even smaller. Either way, we're getting nearer to the limit.
 
2012-02-20 10:35:26 AM  

Tobin_Lam: LavenderWolf: cman: Oh, and now intel might be able to break the 4Ghz stock speed

Clock speed is no longer a meaningful measure for a computer's effectiveness.

Clock speed hasn't been meaningful in a while. Years ago, my dream processor was the Intel Q6600. To me, it was like sticking 4 of my Pentium 4s on a single chip but getting even better results. Despite having 4 cores running at the same clock speed as my processor, it was 10 times faster.


Hell, I can just demonstrate it from my last 3 PCs.

The first one I bought myself was a P4 3.2GHz. My second was a 2.8GHz (I think) Core 2. My current clock speed is just 2.6GHz. And yet each time I increased my computer's capabilities many fold.
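The cores-over-clock experience described above is bounded by Amdahl's law: for a workload whose fraction p can run in parallel on n cores, speedup = 1 / ((1 - p) + p/n). A minimal sketch; the parallel fractions are illustrative assumptions:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup from Amdahl's law."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Four cores only approach 4x when the workload is almost entirely parallel,
# which is why per-core improvements still mattered alongside core counts.
for p in (0.5, 0.9, 0.99):
    print(f"p={p}: {amdahl_speedup(p, 4):.2f}x speedup on 4 cores")
```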
 
2012-02-20 10:39:59 AM  

DarnoKonrad: LavenderWolf: Regardless of what business you think Apple is in, their hardware is ludicrously overpriced.

They flat out don't sell hardware. They sell an interface that's tied to their hardware. If you don't want the interface, you'd never go to Apple to begin with. That's why it's not comparable. It's a package deal. If someone wants to say their interface is not worth the premium, that's their opinion -- but that's what they're selling, not 'apple brand' hard-drives.

And while people are loath to admit it sometimes, Apple's interface has always been a couple steps ahead of the competition. That's what you're paying for.


That is the most ridiculous doublespeak I've ever heard.

Apple sells Apple brand computers. When someone buys a mac, they're not "buying the Apple interface" they're buying a computer. And the computers they sell are ludicrously overpriced for the components inside, which are nothing special to begin with. Further, the computers are hobbled either by an operating system which supports about a tenth of the software, or having to boot into another operating system and entirely losing the "couple steps ahead" Apple OS while doing so.

I have no problem with people buying Apple if that's what they want, but to say it's not overpriced is stupid beyond measure.
 
2012-02-20 11:11:26 AM  

LavenderWolf: When someone buys a mac, they're not "buying the Apple interface" they're buying a computer.


*facepalm* Whatever guy. When you buy an Apple, you're buying hardware that's specifically designed to run the Apple OS interface. That's what you're paying for. Not the farking hardware. If you run their OS or software on any other hardware, you're breaking their licensing agreement. If you just want hardware, you don't buy it from Apple, because that's not what they sell.
 
2012-02-20 11:26:15 AM  

LavenderWolf: When someone buys a mac, they're not "buying the Apple interface" they're buying a computer.


TL;DR DEEEERRRRRRRRPPPPPPPPPPP
 
2012-02-20 11:59:49 AM  
2x 4GB PC3 10666 RAM chips for the PC... $43-$38

2x 4GB PC3 10666 RAM chips for the Mac... $400

They stopped being specifically designed for the Mac OS when they stopped using PowerPC-based machines; PCs and Macs are all x86 now...

Wonderful OS, overpriced hardware.
 
2012-02-20 12:27:54 PM  

Gonz: SomeAmerican: Not in 1982 he wasn't. In 1982, your typical choices were:

The original IBM PC with an Intel 8088 (16-bit, ~4.77 MHz), 16 K of RAM, one floppy drive & no hard drive, running PC-DOS, for $2,945.

An Apple II Plus with a MOS 6502 (8-bit, ~1 MHz), 16 K of RAM, one floppy drive & no hard drive, running Apple DOS, for $1,195.

At the time, Apple's computer was clearly the better choice for home users.

Unless you were actually a home user in 1982, in which case you would have either had a Texas Instruments TI-99/4A or a Commodore VIC-20. The Commodore sold for about $300; I forget the price tag on the TI.


Well, speaking from the perspective of a home computer user in 1982, there was a clear difference in my mind at the time between enthusiast computers and personal computers.

PCs were what you used to get things done. Like my dad's IBM PC XT that his work bought for him, or my neighbor's Apple IIe. They were more expensive, but meant for serious use. They competed with mainframes; a PC meant one less person on the mainframe using CPU cycles.

Enthusiast computers were what you played around on if you weren't too serious about it, or gave your engineering-minded kids. Like my TI-99, which I used to program games on. They were cheap, but had no staying power. They competed with home arcade units.

At the end of the day, the most important difference between the two wasn't the hardware but the software that was available for the platform. That's how I remember it, anyway.

But getting back to Apple, you have to understand that the Mac made Apple into a different company. Pre-Mac, they were serious PCs, and cheaper than IBM. Post-Mac, they could charge a premium due to their graphical UI, but lost a lot of business users, as it took a long time for programmers to port their command-line-driven business software to a graphical UI.
 
2012-02-20 12:46:18 PM  
When are memristors going to be widely and cheaply commercially available?
 
2012-02-20 12:58:08 PM  

DarnoKonrad: LavenderWolf: When someone buys a mac, they're not "buying the Apple interface" they're buying a computer.

*facepalm* Whatever guy. When you buy an Apple, you're buying hardware that's specifically designed to run the Apple OS interface. That's what you're paying for. Not the farking hardware. If you run their OS or software on any other hardware, you're breaking their licensing agreement. If you just want hardware, you don't buy it from Apple, because that's not what they sell.


Superjew: LavenderWolf: When someone buys a mac, they're not "buying the Apple interface" they're buying a computer.

TL;DR DEEEERRRRRRRRPPPPPPPPPPP


You are both either dumb, fanboys, or paid shills. Take your pick.

"I just got a new laptop!"
"What kind?"
"Macbook Air"
"Sweet"

"I just got a new desktop computer!"
"What kind?"
"Mac Pro"
"Neat!"

"I just bought into Apple's interface!"
"The fark are you smoking? Speak English."
 
2012-02-20 03:13:47 PM  

DarnoKonrad:

I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vendor. So no, he's not right, and neither are you.


They're a consumer electronics company. A touch broader than "hardware vendor." They put a lot of work into their operating systems and user interfaces for their consumer electronics.
 
2012-02-20 03:36:02 PM  
That's cool and all, but why haven't these guys isolated the DNA to grow a human brain?
 
2012-02-20 04:21:45 PM  
Apple = overpriced garbage.

Buy a pc or install linux, and stop pretending that Steve Jobs is a saint.
 
2012-02-20 04:32:16 PM  

Cyno01: cman: Oh, and now intel might be able to break the 4Ghz stock speed

fark that shiat, more cores!


can't we have both?
 
2012-02-20 04:56:41 PM  
As someone that's currently helping manufacture Ivy Bridge, I got a kick out of some of these replies.

/Please buy Ivy Bridge chips
//It'll help my three bonuses!
 
2012-02-20 07:02:46 PM  

bingethinker: SN1987a goes boom

Unless it was an Apple, then it would still cost $1000 more than it needs to.

Go do your homework, junior, the adults are talking here.


Why don't you take Steve Jobs' decaying dick out of your ass.
 
2012-02-20 09:30:27 PM  

This About That: That's cool and all, but why haven't these guys isolated the DNA to grow a human brain?


Because the human brain, while good in its current application, would be woefully inadequate for modern computing purposes.

Unless we radically rethink computing.
 
2012-02-20 09:31:25 PM  

Cthulhu_is_my_homeboy: erik-k: And no, Joe Average is not going to install liquid cooling until it's maintenance-free for life, the same way we expect every other piece of computer hardware to be.

Joe Average doesn't install his own computer components. He buys a Dell and throws it out when it meets its planned obsolescence in 3-5 years and buys another Dell.

For those who do install their own components (Joe Above-Average), all-in-one coolers like the Corsair H80 are brilliant, and yes, maintenance-free (and much smaller than the giganormous air-cooled sinks you see these days).


Hmm. I was under the impression that they haven't yet managed to produce a single-metal water cooler, and that until they do the accumulation of ionic nasties fouling the water requires replacement of the coolant at some point or another.

I've also read of a very interesting development at Sandia, where they seem to have at least significantly dented the boundary-layer problem without sounding like a tornado. In principle this is also applicable to any heatsink that rejects to air (industrial sinks, your AC, nuclear cooling towers), not just CPU fans - hello, reduced power usage across the board.
 
2012-02-20 10:49:04 PM  

LavenderWolf: radically rethink computing


Or define another paradigm, or application space, or modus operandi, or something for computing. It need not render the Von Neumann model obsolete. In fact the two could be wired together, then, finally, take over the world.
 
2012-02-21 12:57:33 AM  

the_geek: Mayhem of the Black Underclass: How do you go from 1 atom transistors to 1/2 atom transistors? Or am I being unimaginative?

Current CPUs are 2D: transistors on a flat surface. The next gen will be 3D CPUs with nanotube transistors in three-dimensional constructions. Not sure how Moore's law applies there, but that's the direction we're headed, I think.


Unless a way can be found to transfer the heat out and away from between the layers, you won't gain any efficiencies. Best thing I can think of is diamond coating with CO2 and lasers.

see Chemical Vapor Deposition
 
2012-02-21 10:23:18 AM  

LavenderWolf: Mayhem of the Black Underclass: So, if, as people are saying, this is upholding Moore's Law, and these transistors can be used in ~8 years, what do we do in 10? How do you go from 1-atom transistors to 1/2-atom transistors? Or am I being unimaginative? Like there'll be some other breakthrough where a transistor that's 2 atoms can hold more than 2 different states? Or whatever the value would be, as long as the number of atoms is 1 less than the number of states held, even if it's 1BB atoms and 1BB + 1 states.

Moore's "law" literally has to have an end-point. The atomic scale is probably it, though it could get even smaller. Either way, we're getting nearer to the limit.


This discounts both three-dimensional computing and biocomputing. You lack imagination.
 
Displayed 74 of 74 comments
