
Scientists break Moore's Law by creating single atom transistor 8 years ahead of schedule (abc.net.au)
    More: Cool, Moore's Law, atoms, hydrogen atoms, Nature Nanotechnology, transistors, University of New South Wales, qubits, vacuum chambers  

6329 clicks; posted to Geek on 20 Feb 2012 at 12:07 AM



74 Comments

Archived thread
 
2012-02-19 08:42:44 PM
Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.
 
2012-02-19 09:43:56 PM
Welcome to the new age of discovery.
 
2012-02-19 09:44:32 PM
Oh, and now Intel might be able to break the 4 GHz stock speed.
 
2012-02-19 11:59:34 PM
When does it become self-aware?
 
2012-02-20 12:10:38 AM
Actually, this is right on time. Given 8 more years, the technology will be mature enough to create trillions of them on a single surface cheaply and quickly, and to sell them on the open market. They didn't beat the law; it feels like they're upholding it.
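Rough arithmetic behind that point, as a minimal Python sketch; the 2-year doubling period is the figure quoted in this thread, and the 8-year lab-to-market lag is the commenter's assumption, not measured data:

# Back-of-the-envelope Moore's Law arithmetic: with a 2-year doubling period,
# an 8-year head start corresponds to about four doublings (~16x density),
# which is roughly the gap between a lab demo and a shipping process.
DOUBLING_PERIOD_YEARS = 2.0

def density_factor(years: float) -> float:
    """Multiplier in transistor density after `years` of Moore's Law scaling."""
    return 2.0 ** (years / DOUBLING_PERIOD_YEARS)

if __name__ == "__main__":
    for years in (2, 4, 8):
        print(f"{years} years -> ~{density_factor(years):.0f}x transistors per chip")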
 
2012-02-20 12:17:20 AM
I suppose computing in the near future will mean something unimaginable compared to what "computing" means now.

It will be like comparing the computing done on an abacus to the computing done playing an online 3D game.
 
2012-02-20 12:26:39 AM

cman: Oh, and now intel might be able to break the 4Ghz stock speed


fark that shiat, more cores!
 
2012-02-20 12:28:48 AM

johnnyrocket: I suppose computing will mean something unimaginable to what "computing" means now in the near future.

I will be like comparing the computing done on an abacus to the computing done playing an online 3D game.


I used to have a poster that said 'if cars evolved as quickly as computing, a Rolls-Royce would go 5,000 mph and cost $1.'

That was 30 years ago in 1982.
 
2012-02-20 12:35:37 AM

cman: Oh, and now intel might be able to break the 4Ghz stock speed


No it won't. Heat output of a CMOS circuit scales with frequency. Modern clock speed limits have nothing to do with feature size and everything to do with the inability of air-cooled finned heatsinks to effectively remove more than ~150 watts of heat from a one square centimeter area without sounding like jet engines or having an unacceptable temperature at the hot side, known as the Thermal Brick Wall.

And no, Joe Average is not going to install liquid cooling until it's maintenance-free for life, the same way we expect every other piece of computer hardware to be.
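For anyone who wants to see why the heat wall bites, here is a minimal sketch of the usual first-order CMOS dynamic-power relation, P ≈ α·C·V²·f; the capacitance, voltage and activity numbers below are made-up round figures for illustration, not measurements of any real chip:

# First-order CMOS dynamic power: P = alpha * C * V^2 * f
# alpha: switching activity factor, C: switched capacitance, V: supply voltage, f: clock.
def dynamic_power_watts(alpha: float, c_farads: float, v_volts: float, f_hertz: float) -> float:
    return alpha * c_farads * v_volts ** 2 * f_hertz

# Illustrative (assumed) numbers: 100 nF of effective switched capacitance,
# 1.2 V supply, 20% activity. Power grows linearly with clock frequency, so
# pushing past ~4 GHz quickly eats the ~150 W air-cooling budget mentioned above.
for ghz in (3.0, 4.0, 5.0):
    p = dynamic_power_watts(alpha=0.2, c_farads=100e-9, v_volts=1.2, f_hertz=ghz * 1e9)
    print(f"{ghz:.1f} GHz -> ~{p:.0f} W dynamic power")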
 
2012-02-20 12:38:43 AM

gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.


Read it again: many have made a single-atom transistor before; these folks have worked it to a level of precision that is repeatable and useful for manufacturing.
 
2012-02-20 12:46:37 AM
So how long before we make transistors out of a single electron? Or are we just going to start splitting atoms to squeeze more of them into the processor?
 
2012-02-20 12:46:53 AM
This story helpfully omits the fact that it has to be something like -200°F to work.
 
2012-02-20 12:48:56 AM

Electrify: So how long before we make transistors made out of a single electron? Or are we just going to start splitting atoms to squeeze more if them into the processor?


The next advancement seems to be carbon and silicon nanotubes... current two-dimensional processors are inefficient compared to three-dimensional ones.
 
2012-02-20 01:02:57 AM
No one cares about Moore's Law being too speedy; we're more concerned with keeping it going.
 
2012-02-20 01:04:27 AM

Chiad: gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.

Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.


Nope, placing atoms individually with the tip of an electron microscope isn't inexpensive, and it certainly won't scale to the level where you could build an actual computer chip with it.

It's still a cool result, but I wouldn't exactly be popping champagne over it.
 
2012-02-20 01:30:18 AM
Aaaaand done in the Boobies. Moore's Law is an observation about what the consumer sees, not what R&D does.
 
2012-02-20 01:32:51 AM
Am I the only one who fears the inevitable coming of the nano plague?
 
2012-02-20 01:34:50 AM

PerilousApricot: Chiad: gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.

Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.

Nope, placing atoms individually with the tip of a electron microscope isn't inexpensive, and certainly won't scale to the level that you could build an actual computer chip with it.

It's still a cool result, but I wouldn't exactly be popping champaign over it.



Repeatable and useful, not inexpensive; they still have to work that part out, I'll grant you. But this isn't the first single-atom transistor.
 
2012-02-20 01:38:47 AM

unyon: johnnyrocket: I suppose computing will mean something unimaginable to what "computing" means now in the near future.

I will be like comparing the computing done on an abacus to the computing done playing an online 3D game.

I used to have a poster that said 'if cars evolved as quickly as computing, a rolls Royce would go 5000mph and cost $1.'

That was 30 years ago in 1982.


Unless it was an Apple, then it would still cost $1000 more than it needs to.
 
2012-02-20 02:11:42 AM
SN1987a goes boom

Unless it was an Apple, then it would still cost $1000 more than it needs to.

Go do your homework, junior, the adults are talking here.
 
2012-02-20 02:34:15 AM

bingethinker: Go do your homework, junior, the adults are talking here.


he's right
 
2012-02-20 03:17:00 AM
You can tell the article was written with Americans in mind. It includes a link to Google Maps showing what an Australia may look like.
 
2012-02-20 03:26:13 AM
One more step towards the singularity.
 
2012-02-20 03:45:19 AM

erik-k: And no, Joe Average is not going to install liquid cooling until it's maintainence-free for life the same way we expect every other piece of computer hardware to be.


Take a look at FC-72. It's easy enough to design a self-propelling spray system for it using the same waste heat it's built to remove. Granted, this may not be practical as performance will be affected by motion.

Fiber-based laser cooling may be a decent option though.
 
2012-02-20 03:46:39 AM
It'll still end up booting from BIOS.
 
2012-02-20 04:02:04 AM
Meanwhile, Moore and his wife are putting that money to good use. I'd give the man a [Hero] tag if I could.
 
2012-02-20 04:10:40 AM

Chiad: gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.

Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.


Making it precise enough to be repeatable doesn't mean it's cheap enough to be manufacturable. Totally different goals. At best, within 10 years you'd have other groups repeat the experiment.
 
2012-02-20 04:58:25 AM
Serious question here: anyone know if the human brain can power these components?
 
2012-02-20 05:13:54 AM
Meh, wake me up when they start building computers from quarks.
 
2012-02-20 05:20:57 AM
*cough*

i29.tinypic.com
 
2012-02-20 05:37:48 AM

erik-k: And no, Joe Average is not going to install liquid cooling until it's maintainence-free for life the same way we expect every other piece of computer hardware to be.


Joe Average doesn't install his own computer components. He buys a Dell and throws it out when it meets its planned obsolescence in 3-5 years and buys another Dell.

For those who do install their own components (Joe Above-Average), all-in-one coolers like the Corsair H80 are brilliant, and yes, maintenance-free (and much smaller than the giganormous air-cooled sinks you see these days).
 
2012-02-20 05:54:36 AM
Doesn't Moore's Law take into account that it's logical to expect a lag phase in an 'old' technology while a 'new' technology is being developed that can then start a new period of exponential evolution? We can't make a 'classic' transistor smaller than one atom, so I would guess we're approaching (or are already in) this lag phase (constraints on working with single atoms or small numbers of atoms, quantum effects and the like). So when can I expect my radical new approach to a personal computer?

/Open the pod bay doors, HAL.
//HAL?
///...Halp!?
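The 'lag phase' idea above is essentially the S-curve view of technology succession: each approach follows a logistic curve, and the long-run exponential is the envelope of overlapping S-curves. A toy Python illustration; the growth rates, ceilings and midpoints are arbitrary values chosen only to make the plateau visible, not a model of real process nodes:

import math

def logistic(t: float, ceiling: float, rate: float, midpoint: float) -> float:
    """Simple logistic S-curve: slow start, rapid growth, then saturation."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Two successive "technologies": the second has a 100x higher ceiling but only
# starts ramping years later, so total capability plateaus in between (the lag phase).
def capability(t: float) -> float:
    old_tech = logistic(t, ceiling=1.0, rate=1.0, midpoint=5.0)
    new_tech = logistic(t, ceiling=100.0, rate=1.0, midpoint=15.0)
    return old_tech + new_tech

for year in range(0, 21, 2):
    print(f"year {year:2d}: relative capability ~ {capability(year):7.2f}")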
 
2012-02-20 06:11:19 AM

cman: Welcome to the new age of discovery.


Welcome to the age of fingernail-sized iPhones.

/An age of discovery requires something to be discovered.
/Can't do much of that down here.
 
2012-02-20 06:32:51 AM
I was also going to say it, but it was not on a silicon chip, therefore it is not cheating Moore's Law.
 
2012-02-20 07:21:53 AM

BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right


B-b-b-but the stores are so pretty and it's what all the cool kids buy
 
2012-02-20 07:35:17 AM

phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy



I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vendor. So no, he's not right, and neither are you.
 
2012-02-20 07:52:44 AM

erik-k: cman: Oh, and now intel might be able to break the 4Ghz stock speed

No it won't. Heat output of a CMOS circuit scales with frequency. Modern clock speed limits have nothing to do with feature size and everything to do with the inability of air-cooled finned heatsinks to effectively remove more than ~150 watts of heat from a one square centimeter area without sounding like jet engines or having an unacceptable temperature at the hot side, known as the Thermal Brick Wall.

And no, Joe Average is not going to install liquid cooling until it's maintainence-free for life the same way we expect every other piece of computer hardware to be.


Liquid cooling is easy. Just drop your PC into an aquarium filled with mineral oil.
 
2012-02-20 07:59:04 AM

DarnoKonrad: phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy


I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vender I stayed at a Holiday Inn Express last night. So no, he's not right, and neither are you.


FTFY
 
2012-02-20 08:01:28 AM

stonelotus: DarnoKonrad: phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy


I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vender I stayed at a Holiday Inn Express last night. So no, he's not right, and neither are you.

FTFY


wut?
 
2012-02-20 08:04:17 AM

BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right


Not in 1982 he wasn't. In 1982, your typical choices were:

The original IBM PC with an Intel 8088 (16-bit, ~4.77 MHz), 16K of RAM, one floppy drive and no hard drive, running PC-DOS, for $2,945.

An Apple II Plus with a MOS 6502 (8-bit, ~1 MHz), 16K of RAM, one floppy drive and no hard drive, running Apple DOS, for $1,195.

At the time, Apple's computer was clearly the better choice for home users. This stayed true in 1983, with the release of the IBM PC XT and the Apple IIe.

Where things started to diverge was with the 1984 release of the Macintosh. With PCs stuck on DOS and Apple rolling out a graphical UI, Apple could charge a premium and people would gladly pay it. It wasn't until 1992 that Microsoft had a workable competitor in Windows 3.1, but that still ran on DOS. You can't say Microsoft really caught up until Windows 95, released 11 years after the Mac.
 
2012-02-20 08:06:09 AM

DarnoKonrad: phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy


I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vender. So no, he's not right, and neither are you.


A monstrous amalgamation of pinko-commie stooges that cribbed their entire product line from sci-fi movies?

You're right, they are way more than a hardware vendor.
 
2012-02-20 08:06:21 AM

Chiad: Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.


Sorry, but I've worked in cutting-edge semiconductor R&D. They move fast, but lab-to-market is still more than 5 years, and usually about 10. Moore's Law is intact.
 
2012-02-20 08:23:06 AM
Patent trolls are already gearing up to file suit in Tyler, Texas.
 
2012-02-20 08:32:36 AM

Chiad: PerilousApricot: Chiad: gaslight: Manufacturability is another matter, submitter.

Moore's law describes a long-term trend in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Cool but if they can only make one, it doesn't count.

Read it again, many have made a single atom transistor, they have worked it to a level of precision that is repeatable and useful for manufacturing.

Nope, placing atoms individually with the tip of a electron microscope isn't inexpensive, and certainly won't scale to the level that you could build an actual computer chip with it.

It's still a cool result, but I wouldn't exactly be popping champaign over it.


Repeatable and useful, not inexpensive, they still have to work that part out, I'll grant you. But this isn't the first single atom transistor.


As an integration engineer who works in cutting-edge semiconductor research, I get a kick out of these articles. Don't get me wrong, it's really cool stuff, but I think most people don't understand the gap between a university lab result and an actual product. Did you notice the size of the sample? Today's work is done on 300mm wafers, with some lookahead work started on 450mm.

The reason? We scale primarily for cost, not for performance. It's not about making smaller chips to fit into smaller packaging (no consumer product is going to get much smaller than it is now, because it still has to be handled); it's about getting more chips per wafer. If each generation isn't cheaper, then it's not going to be made. So far the industry has been able to keep the pace, but it's getting much more difficult.

Also, a functioning chip has many other elements, so it would be interesting to see how you would integrate this process with all the passives and other devices.

Anyhow, I'm not trying to pee into anyone's Corn Flakes, but just to level-set expectations.
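To put numbers on the 'more chips per wafer' point, here is a minimal sketch using a common gross-dies-per-wafer approximation (it ignores yield, scribe lines and edge exclusion; the die sizes and the per-wafer cost below are made-up round figures, not actual process data):

import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Common first-order estimate: usable wafer area minus an edge-loss term."""
    d, s = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

# Assumed flat processed-wafer cost; shrinking the die is what drops cost per chip.
WAFER_COST_USD = 5000.0
for die_mm2 in (200.0, 100.0, 50.0):
    dies = gross_dies_per_wafer(300.0, die_mm2)  # 300 mm wafer, as mentioned above
    print(f"{die_mm2:5.0f} mm^2 die -> {dies:4d} dies/wafer, ~${WAFER_COST_USD / dies:.2f} per die")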
 
2012-02-20 08:39:30 AM

cman: Oh, and now intel might be able to break the 4Ghz stock speed


Clock speed is no longer a meaningful measure for a computer's effectiveness.
 
2012-02-20 08:41:45 AM

bingethinker: SN1987a goes boom

Unless it was an Apple, then it would still cost $1000 more than it needs to.

Go do your homework, junior, the adults are talking here.


Riiight, Apple's products aren't ridiculously expensive...
 
2012-02-20 08:44:52 AM

The Iron duke: Serious question here, anyone know if the human brain can power these components?


The human brain isn't much of a power source. If you want human-powered computing, a few kinetically powered generators placed on various body parts (knees, hips, feet, upper arms) would be a better choice.
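For scale on the power question: the human brain's metabolic dissipation is commonly put at roughly 20 W, well short of the ~150 W desktop-CPU figure mentioned earlier in the thread. A trivial sketch of that comparison (both numbers are rough, commonly quoted values, not measurements):

# Rough, commonly quoted figures, not measurements.
BRAIN_POWER_W = 20.0        # approximate human brain metabolic dissipation
DESKTOP_CPU_TDP_W = 150.0   # high-end desktop CPU heat output, per the thread above

ratio = DESKTOP_CPU_TDP_W / BRAIN_POWER_W
print(f"A ~{DESKTOP_CPU_TDP_W:.0f} W CPU needs about {ratio:.1f}x the brain's entire power budget.")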
 
2012-02-20 08:49:39 AM

DarnoKonrad: phaseolus: BigJake: bingethinker: Go do your homework, junior, the adults are talking here.

he's right

B-b-b-but the stores are so pretty and it's what all the cool kids buy


I don't even own any Apple products, but I've got enough sense to realize Apple is more than a hardware vender. So no, he's not right, and neither are you.


Regardless of what business you think Apple is in, their hardware is ludicrously overpriced.

The image I made to demonstrate that is several years old, so I won't bother posting it as it's entirely outdated, but nothing has changed. An extra $1000 for a computer, comparing against a PC of similar capabilities, is a very conservative estimate.

When I last priced out the difference, hard drives alone cost more than 300% of what non-Apple companies charged. There is no excuse for that; it's just gouging. The exact same HD, which at the time went for some $80-$90 elsewhere, was being sold on Apple's website (when customizing a machine at purchase) for over $300. This was about 4 years ago, maybe less.
 
2012-02-20 08:57:02 AM

SomeAmerican: Not in 1982 he wasn't. In 1982, your typical choices were:

The original IBC PC with a Intel 8088 (16 bit, ~ 1/2 MHz), 16 K of RAM, one floppy drive & no hard-drive, running PC-DOS, for $2,945.

An Apple II Plus with a MOS 6052 (8 bit, ~ 1 MHz), 16 K of RAM, one floppy drive & no hard-drive, running Apple DOS, for $1,195.

At the time, Apple's computer was clearly the better choice for home users.


Unless you were actually a home user in 1982, in which case you would have either had a Texas Instruments TI-99/4A or a Commodore VIC-20. The Commodore sold for about $300; I forget the price tag on the TI.
 
2012-02-20 09:11:28 AM

LavenderWolf: Regardless of what business you think Apple is in, their hardware is ludicrously overpriced.


They flat out don't sell hardware. They sell an interface that's tied to their hardware. If you don't want the interface, you'd never go to Apple to begin with. That's why it's not comparable. It's a package deal. If someone wants to say their interface is not worth the premium, that's their opinion -- but that's what they're selling, not 'apple brand' hard-drives.

And while people are loath to admit it sometimes, Apple's interface has always been a couple steps ahead of the competition. That's what you're paying for.
 
Displayed 50 of 74 comments




This thread is archived, and closed to new comments.
