
(The Daily Galaxy)   20,000 trillion calculations per second makes it much faster to incorrectly forecast the weather   (dailygalaxy.com)
    More: Interesting, Oak Ridge National Laboratory, U.S. Department of Energy, energy economy, physical systems, GPUs, computational science, nuclear power reactors, Electric energy consumption  

2305 clicks; posted to Geek on 30 Oct 2012 at 4:07 PM



40 Comments
   

Archived thread
 
2012-10-30 03:37:34 PM  
I guess "20 quadrillion" doesn't sound as cool as "20,000 trillion", or something.
 
2012-10-30 04:10:44 PM  
I've always preferred "20 million billion".
 
2012-10-30 04:12:08 PM  
and me with this terrible pain in all the diodes down my left side
 
2012-10-30 04:16:21 PM  
I've always preferred "20 billion million".
 
2012-10-30 04:17:13 PM  
So how many mega-dopplers is that?
 
2012-10-30 04:19:59 PM  
I'm willing to bet that "20,000 Trillion" is just so po-dunk America can figure out what the heck you are talking about. But how many FPS does it get in timedemo? And can it compile pr0n faster? That's all I really need it to do.
 
2012-10-30 04:20:19 PM  
20,000 trillion calculations per second? Meh, call me when it can do the Kessel run in less than 12 parsecs.
 
2012-10-30 04:22:12 PM  

impaler: I've always preferred "20 billion million".


"ExaFLOPS" (the next prefix) is one Billion Billion, or as I like to call it "One Sagan"

Titan is a measly 20 milliSagans.
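A quick check of the arithmetic, taking the headline's 20,000 trillion calculations per second at face value:

    20,000 trillion FLOPS = 20 x 10^15 FLOPS = 20 petaFLOPS
    1 Sagan = 1 exaFLOPS = 10^18 FLOPS
    (20 x 10^15) / 10^18 = 0.020 Sagan = 20 milliSagans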
 
2012-10-30 04:27:43 PM  
One Sagan


I like that.
 
2012-10-30 04:31:42 PM  
big deal, women make that many calculations every time you ask them a question.
 
2012-10-30 04:44:22 PM  
[image: musakazhim.files.wordpress.com]

Approves of headline
 
2012-10-30 04:53:09 PM  
[image: www.moviereviews.co.uk]
Skeptical.

/obscure?
 
2012-10-30 05:08:02 PM  
It still would take a day to render one frame of a Pixar flick.

[image: i.telegraph.co.uk]
 
2012-10-30 05:12:51 PM  
It's too busy trying to calculate how many clones have to die to produce a level six esper.
 
2012-10-30 05:57:14 PM  

Antimatter: It's too busy trying to calculate how many clones have to die to produce a level six esper.


I feel incredibly dirty for getting that reference instantly with no thought whatsoever...
 
2012-10-30 06:00:46 PM  
I was not under the impression they were anywhere this close to completion this summer.
 
jvl
2012-10-30 06:09:55 PM  
16 Core CPU plus GPU on each node? The stupid, it burns.
 
2012-10-30 06:10:57 PM  

jvl: 16 Core CPU plus GPU on each node? The stupid, it burns.


I'm sure you know better than everyone at ORNL how supercomputers should work.
 
2012-10-30 07:18:08 PM  
How fast can it trade stocks?
 
2012-10-30 07:50:35 PM  

jvl: 16 Core CPU plus GPU on each node? The stupid, it burns.


yeah because Integer cores (main CPU) and Shader cores (GPU) are equally useful for all computing task loads.

lemme guess, you're a gamer who doesn't know a damn thing about software engineering?


As for the accuracy jokes about the forecast: the 3-day temperature forecast is within 3°F 95% of the time, and there is a reason precipitation forecasts are given as a % chance.
 
jvl
2012-10-30 08:01:28 PM  

GAT_00: jvl: 16 Core CPU plus GPU on each node? The stupid, it burns.

I'm sure you know better than everyone at ORNL how supercomputers should work.


Gosh: it's almost as if I'm a computer scientician and electrical ingineer and know something about high-end computation. What are the odds of that on the internets?

Look it's pretty simple: do it on a GPU and use a decent CPU for comms? Good idea. Use a high-end core for everything? Good idea.

Here's what obviously happened:

We know from the article that this supercomputer is to be used for many projects, which means many departments fighting like cocks over the spec for the computer. We know from the state of tech that GPU-based systems are awesome. We also know from the state of tech that CPU-based systems are awesome. But which to get?

Here's the guessing part: some of those projects need to run on CPUs (legacy climate-modeling, for example, is unlikely to be capable of using GPUs without major code-base modifications). Some of the projects wanted to have a GPU-based supercomputer for their new one-off stuff because they are faster than CPUs for many things.

Solution: buy two supercomputers and they can be used in parallel! Just kidding: you just know they are authorized to buy exactly one in their budget. No one will let them buy two instead of one, so they compromise and get this supercomputer.
 
jvl
2012-10-30 08:07:08 PM  

Kazan: yeah because Integer cores (main CPU) and Shader cores (GPU) are equally useful for all computing task loads.


Destroyers and tanks are not equally useful in all wars. What the military needs to build is a new 21st Tank which floats.

Kazan: lemme guess, you're a gamer who doesn't know a damn thing about software engineering?

The fail is strong in this sentence.
 
2012-10-30 08:11:25 PM  

jvl: Kazan: yeah because Integer cores (main CPU) and Shader cores (GPU) are equally useful for all computing task loads.

Destroyers and tanks are not equally useful in all wars. What the military needs to build is a new 21st Tank which floats.


What happened to the first 20 tanks?
 
2012-10-30 08:12:13 PM  

jvl: GAT_00: jvl: 16 Core CPU plus GPU on each node? The stupid, it burns.

I'm sure you know better than everyone at ORNL how supercomputers should work.

Gosh: it's almost as if I'm a computer scientician and electrical ingineer and know something about high-end computation. What are the odds of that on the internets?

Look it's pretty simple: do it on a GPU and use a decent CPU for comms? Good idea. Use a high-end core for everything? Good idea.

Here's what obviously happened:

We know from the article that this supercomputer is to be used for many projects, which means many departments fighting like cocks over the spec for the computer. We know from the state of tech that GPU-based systems are awesome. We also know from the state of tech that CPU-based systems are awesome. But which to get?

Here's the guessing part: some of those projects need to run on CPUs (legacy climate-modeling, for example, is unlikely to be capable of using GPUs without major code-base modifications). Some of the projects wanted to have a GPU-based supercomputer for their new one-off stuff because they are faster than CPUs for many things.

Solution: buy two supercomputers and they can be used in parallel! Just kidding: you just know they are authorized to buy exactly one in their budget. No one will let them buy two instead of one, so they compromise and get this supercomputer.


Look, despite the people at ORNL being accurately described as the dumbest smart people in the world (the details of which probably fall under national security, so I'll leave them out), I am aware that these people know what they are doing. Jaguar was the fastest supercomputer in the world at one point and is still running there along with Titan now. So I'm going to trust them over a random Internet person.
 
2012-10-30 08:50:41 PM  

Treygreen13: jvl: Kazan: yeah because Integer cores (main CPU) and Shader cores (GPU) are equally useful for all computing task loads.

Destroyers and tanks are not equally useful in all wars. What the military needs to build is a new 21st Tank which floats.


What happened to the first 20 tanks?


Polish engineer left the screen door in the design.
 
2012-10-30 09:03:48 PM  
So, when does it become self-aware?
 
2012-10-30 09:08:43 PM  
Can it run Crysis?
 
A7
2012-10-30 09:30:44 PM  
WOW!
The government is as tech savvy as World of Warcraft - A few expansions ago.
 
2012-10-30 09:32:27 PM  

jvl: Solution: buy two supercomputers and they can be used in parallel! Just kidding: you just know they are authorized to buy exactly one in their budget. No one will let them buy two instead of one, so they compromise and get this supercomputer.


Your proposed solution is a compromise, too. Many problems can't be neatly pigeon-holed into either the "runs best on CPUs" or "runs best on GPUs" category. Codes for these kinds of problems may do a bunch of work on the CPUs to set up a problem, then squirt a bunch of data and code at the GPU, then pick apart the results and decide what to do next at the CPU level. Your solution would require communication across the network for these kinds of problems, whereas the ORNL solution allows the CPUs and GPUs to talk in a way that is orders of magnitude faster.

By the way, figuring out how best to make these compromises is an active area of research which the DOE is very much engaged in.

/computational scientist
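For anyone wondering what that CPU-sets-up / GPU-crunches / CPU-decides loop looks like in practice, here's a minimal sketch in CUDA. The kernel, problem sizes, and the "decide what to do next" step are all made up for illustration; this is not from any actual ORNL code.

    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    // Hypothetical kernel: the "GPU part" of one iteration (a toy stencil update).
    __global__ void relax(const double* in, double* out, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i > 0 && i < n - 1)
            out[i] = 0.5 * (in[i - 1] + in[i + 1]);
    }

    int main() {
        const int n = 1 << 20;

        // CPU: set up the problem.
        std::vector<double> h(n, 1.0);
        h[0] = 0.0;
        h[n - 1] = 2.0;

        double *d_in, *d_out;
        cudaMalloc(&d_in, n * sizeof(double));
        cudaMalloc(&d_out, n * sizeof(double));
        cudaMemcpy(d_in, h.data(), n * sizeof(double), cudaMemcpyHostToDevice);
        cudaMemcpy(d_out, h.data(), n * sizeof(double), cudaMemcpyHostToDevice); // keep boundary values

        for (int step = 0; step < 100; ++step) {
            // GPU: do the heavy lifting.
            relax<<<(n + 255) / 256, 256>>>(d_in, d_out, n);

            // CPU: pull the results back, pick them apart, decide what to do next.
            cudaMemcpy(h.data(), d_out, n * sizeof(double), cudaMemcpyDeviceToHost);
            if (step % 10 == 0)
                printf("step %d, value near the left edge: %f\n", step, h[1]);

            std::swap(d_in, d_out);
        }

        cudaFree(d_in);
        cudaFree(d_out);
        return 0;
    }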
 
2012-10-30 09:40:21 PM  

jvl: No one will let them buy two instead of one, so they compromise and get this supercomputer.


I'm not seeing how that is so stupid it burns.
 
2012-10-30 10:00:09 PM  

jvl: Solution: buy two supercomputers and they can be used in parallel! Just kidding: you just know they are authorized to buy exactly one in their budget. No one will let them buy two instead of one, so they compromise and get this supercomputer.


And I might add that if ORNL is stupid, they aren't the only ones. The University of Texas at Austin is only a few months away from turning on a machine with a similar architecture (the Intel Xeon Phi coprocessors mentioned in the press release are Intel's answer to GPUs and fulfill the same role), and you might recall this one built by China a year or so ago, which was briefly the fastest in the world.
 
2012-10-30 10:04:09 PM  
Oak Ridge National Laboratory

It's to simulate nukes. They'll also use it to run weather forecasts.
 
2012-10-30 10:24:31 PM  

jvl: GAT_00: jvl: 16 Core CPU plus GPU on each node? The stupid, it burns.

I'm sure you know better than everyone at ORNL how supercomputers should work.

Gosh: it's almost as if I'm a computer scientician and electrical ingineer and know something about high-end computation. What are the odds of that on the internets?

Look it's pretty simple: do it on a GPU and use a decent CPU for comms? Good idea. Use a high-end core for everything? Good idea.

Here's what obviously happened:

We know from the article that this supercomputer is to be used for many projects, which means many departments fighting like cocks over the spec for the computer. We know from the state of tech that GPU-based systems are awesome. We also know from the state of tech that CPU-based systems are awesome. But which to get?

Here's the guessing part: some of those projects need to run on CPUs (legacy climate-modeling, for example, is unlikely to be capable of using GPUs without major code-base modifications). Some of the projects wanted to have a GPU-based supercomputer for their new one-off stuff because they are faster than CPUs for many things.

Solution: buy two supercomputers and they can be used in parallel! Just kidding: you just know they are authorized to buy exactly one in their budget. No one will let them buy two instead of one, so they compromise and get this supercomputer.


Ok...where to start.

First off, obviously (as you mention) each node in the system needs a CPU to run the OS and to communicate over the system network - it's a requirement for the GPUs, which can't run in a system by themselves, just like the one in your PC. There is actually very little reason for them to choose a cheaper CPU, since
1) the difference would make a very small change in the total system cost, and
2) a capable CPU greatly broadens the base of applications that can benefit from the system.

Looking at 2 - let's say for a minute that you're right (and you are) that multiple constituencies are involved in the procurement of the machine. The people purchasing it have, in their job descriptions, the role of serving all of those scientists to the best of their ability. That often leads to compromises so that everyone can get their work done - as opposed to simply building the machine that gets them the biggest dick-measuring number for the Top500 list.

I'd much rather my tax dollars be spent to actually get science done, which Oak Ridge (and Cray) has an excellent track record for with Jaguar.

Lastly, one of ORNL's development goals is to get applications to use both the CPU and the GPU at the same time - leveraging both for performance. In a system where you're required to have both, it's incredibly inefficient to leave even a mild-strength CPU sitting mostly idle. They've already ported some significant linear algebra libraries to do this.

So, yes, they know what they're doing.
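As a rough sketch of that last point (keeping the CPU working while the GPU works, rather than letting it sit idle), here is a toy CUDA example. The kernel and the 50/50 split are made up for illustration and are not the linear algebra ports mentioned above; the point is just that kernel launches return immediately, so the host can sum one half of an array while the device sums the other.

    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    // Hypothetical kernel: the GPU sums its half of the array into *result.
    __global__ void partial_sum(const float* x, int n, float* result) {
        __shared__ float buf[256];
        int tid = threadIdx.x;
        float s = 0.0f;
        for (int i = blockIdx.x * blockDim.x + tid; i < n; i += gridDim.x * blockDim.x)
            s += x[i];
        buf[tid] = s;
        __syncthreads();
        for (int stride = blockDim.x / 2; stride > 0; stride /= 2) {
            if (tid < stride) buf[tid] += buf[tid + stride];
            __syncthreads();
        }
        if (tid == 0) atomicAdd(result, buf[0]);
    }

    int main() {
        const int n = 1 << 24;
        const int half = n / 2;

        std::vector<float> h(n, 1.0f);   // CPU owns the whole problem

        float *d_x, *d_sum;
        cudaMalloc(&d_x, half * sizeof(float));
        cudaMalloc(&d_sum, sizeof(float));
        cudaMemset(d_sum, 0, sizeof(float));
        cudaMemcpy(d_x, h.data(), half * sizeof(float), cudaMemcpyHostToDevice);

        // Kernel launches are asynchronous: the GPU starts chewing on its half...
        partial_sum<<<128, 256>>>(d_x, half, d_sum);

        // ...while the CPU sums the other half at the same time.
        double cpu_sum = 0.0;
        for (int i = half; i < n; ++i)
            cpu_sum += h[i];

        // This copy waits for the kernel to finish before reading the result.
        float gpu_sum = 0.0f;
        cudaMemcpy(&gpu_sum, d_sum, sizeof(float), cudaMemcpyDeviceToHost);

        printf("total = %.0f (expected %d)\n", cpu_sum + gpu_sum, n);
        cudaFree(d_x);
        cudaFree(d_sum);
        return 0;
    }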
 
2012-10-30 11:45:04 PM  
GIGO
 
2012-10-30 11:59:58 PM  
So, cloudy this weekend...or not?
 
2012-10-31 09:25:49 AM  

Ford Perfect: GIGO


Came to say this.

Now they can run inaccurate models even faster.
 
2012-10-31 11:25:17 AM  

Rostin: Your proposed solution is a compromise, too. Many problems can't be neatly pigeon-holed into either the "runs best on CPUs" or "runs best on GPUs" category.


Yep, jvl doesn't have a fuking clue about system architecture or software engineering.

jvl: "computer scientician and electrical ingineer and know something about high-end computation"

Ha ha ha ha! No you're not you lying sack of shat.
 
2012-10-31 01:17:19 PM  
Wow... SLI rocks!
 
2012-10-31 03:19:37 PM  

old_toole: One Sagan


I like that.


Already in use as jargon

http://catb.org/jargon/html/S/sagan.html
 
2012-11-02 07:37:50 AM  

impaler: Rostin: Your proposed solution is a compromise, too. Many problems can't be neatly pigeon-holed into either the "runs best on CPUs" or "runs best on GPUs" category.

Yep, jvl doesn't have a fuking clue about system architecture or software engineering.

jvl: "computer scientician and electrical ingineer and know something about high-end computation"

Ha ha ha ha! No you're not you lying sack of shat.


I like the way you used shat, as in the past tense. It sort of implies a sack of stuff that has been shat, which may be different from actual shiat, which in turn implies objects being put up there to be shat out and then put in a sack...
 
Displayed 40 of 40 comments



This thread is archived, and closed to new comments.
