
(Ars Technica)   And the computer language used for the world's most state-of-the-art calculations is.... ...FORTRAN. surprisedcounter = surprisedcounter + 1;   (arstechnica.com)
    More: Interesting, scientific computing, physical scientist, Fortran, National Center for Atmospheric Research, national language, weather predictions, high-performance computing, modeling  

2685 clicks; posted to Geek on 08 May 2014 at 8:52 AM



93 Comments
   
 
Mithiwithi
2014-05-08 02:46:59 AM
FORTRAN has been the supercomputer programming language of choice for decades, and no one wants to rewrite millions of lines of code of specialized numerical libraries just because FORTRAN's no longer the flavor of the month. Or the decade.

Or isn't it? As TFA notes in passing, the latest versions of FORTRAN include language features oriented toward parallel computing, which is important because supercomputers long since reached the limits of what a single processor can do, and further speed comes from packing in as many processors/cores as you can fit into a small space.
 
ZAZ [TotalFark]
2014-05-08 08:08:42 AM
(set! surprisedcounter (apply (lambda (x) (+ 1 x)) (list surprisedcounter)))
 
nekom
2014-05-08 08:39:22 AM

Mithiwithi: FORTRAN has been the supercomputer programming language of choice for decades, and no one wants to rewrite millions of lines of code of specialized numerical libraries just because FORTRAN's no longer the flavor of the month. Or the decade.

Or isn't it? As TFA notes in passing, the latest versions of FORTRAN include language features oriented toward parallel computing, which is important because supercomputers long since reached the limits of what a single processor can do, and further speed comes from packing in as many processors/cores as you can fit into a small space.


--surprisedcounter;
 
b0rscht
2014-05-08 08:42:58 AM
Well, duh. Anyone who does HPC knows this. FORTRAN is the language of scientific number crunching.

FORTRAN sucks but the HPC vendors supply fast compilers and F95 is a high level language. It gets the job done. F95 added a lot of features compared to F77 and since it's backwards compatible, legacy code works, but you can do things like dynamic memory allocation and fun stuff like frob=0 when frob is a 3D array, etc., etc.

If I had a nickel for each line of FORTRAN I've written I'd have a shiatload of nickels. But they would be stinky nickels because I would rather be writing in another language. I want to punch kittens every time I have to do string manipulation in FORTRAN. My left nut for a FORTRAN sprintf(). And no null string termination... fark me with a chainsaw.

I prefer C and python for analyzing/displaying output but when it comes to running models (what's done on supercomputers), it's mostly F95 (some folks still writing F77). Some folks have C/C++ models but not many.
 
xanadian
2014-05-08 08:44:51 AM
Or:

surprisedcounter += 1
 
2014-05-08 08:47:32 AM

Mithiwithi: FORTRAN has been the supercomputer programming language of choice for decades, and no one wants to rewrite millions of lines of code of specialized numerical libraries just because FORTRAN's no longer the flavor of the month. Or the decade.

Or isn't it? As TFA notes in passing, the latest versions of FORTRAN include language features oriented toward parallel computing, which is important because supercomputers long since reached the limits of what a single processor can do, and further speed comes from packing in as many processors/cores as you can fit into a small space.


MPI and OpenMP are available for both C and FORTRAN. Modern supercomputers generally run fastest when you have one MPI rank per shared-memory node and further parallelize with OpenMP within each node.

The problem is taking legacy MPI code, written back when a node was the same thing as a processor, and converting it to a hybrid MPI+OpenMP model intelligently. It's a major pain. Running "all MPI" is the easiest thing to do: you cross your fingers and hope the compilers and libraries are smart enough to do the intranode MPI exchanges in shared memory.
 
SteakMan
2014-05-08 08:56:39 AM
No love for COBOL?
 
dittybopper
2014-05-08 08:58:10 AM

b0rscht: Well, duh. Anyone who does HPC knows this. FORTRAN is the language of scientific number crunching.

FORTRAN sucks but the HPC vendors supply fast compilers and F95 is a high level language. It gets the job done. F95 added a lot of features compared to F77 and since it's backwards compatible, legacy code works, but you can do things like dynamic memory allocation and fun stuff like frob=0 when frob is a 3D array, etc., etc.

If I had a nickel for each line of FORTRAN I've written I'd have a shiatload of nickels. But they would be stinky nickels because I would rather be writing in another language. I want to punch kittens every time I have to do string manipulation in FORTRAN. My left nut for a FORTRAN sprintf(). And no null string termination... fark me with a chainsaw.

I prefer C and python for analyzing/displaying output but when it comes to running models (what's done on supercomputers), it's mostly F95 (some folks still writing F77). Some folks have C/C++ models but not many.


Could be worse.  Could be COBOL.

Yes, occasionally I have to dig into COBOL code at my work, though we are no longer actively maintaining that code:  If a COBOL program needs to be modified, we re-write it in something a tad more modern instead of modifying the existing code.

Even so, we have well over a hundred of them still being used.
 
2014-05-08 08:58:41 AM

Mithiwithi: which is important because supercomputers long since reached the limits of what a single processor can do


Uh, supercomputers always used multiple processors.  It's just that networking finally got fast enough so you didn't have to slap them all in the same machine, which made the whole idea of HPC vastly cheaper.
 
dittybopper
2014-05-08 08:58:45 AM

SteakMan: No love for COBOL?


No.  None whatsoever.
 
2014-05-08 09:00:02 AM

nekom: Mithiwithi: FORTRAN has been the supercomputer programming language of choice for decades, and no one wants to rewrite millions of lines of code of specialized numerical libraries just because FORTRAN's no longer the flavor of the month. Or the decade.

Or isn't it? As TFA notes in passing, the latest versions of FORTRAN include language features oriented toward parallel computing, which is important because supercomputers long since reached the limits of what a single processor can do, and further speed comes from packing in as many processors/cores as you can fit into a small space.

--surprisedcounter;


You've just introduced a bug. Good job.
 
2014-05-08 09:00:28 AM

xanadian: Or:

surprisedcounter += 1


surprisedcounter++
 
Glockenspiel Hero
2014-05-08 09:01:28 AM

b0rscht: Well, duh. Anyone who does HPC knows this. FORTRAN is the language of scientific number crunching.

FORTRAN sucks but the HPC vendors supply fast compilers and F95 is a high level language. It gets the job done. F95 added a lot of features compared to F77 and since it's backwards compatible, legacy code works, but you can do things like dynamic memory allocation and fun stuff like frob=0 when frob is a 3D array, etc., etc.


Dynamic memory allocation?  Hell, I worked with high performance ab-initio FORTRAN code back in grad school 25 years ago that did that.

Of course it did it by having the first statement in the program be

DOUBLE A[ memory size of machine ]

Every variable was just an offset of the array A.  You needed more space for something?  Just allocate more of A, or free it up when you were done.

*Twitch*
 
Marcus Aurelius
2014-05-08 09:06:45 AM

dittybopper: Could be worse. Could be COBOL


No, sorry.  No real scientist would ever code in COBOL.  They would commit suicide first.
 
2014-05-08 09:20:19 AM
Watfor or watfiv?
 
2014-05-08 09:21:40 AM
ADD 1 TO surprisedcounter.
 
Rapmaster2000
2014-05-08 09:24:19 AM
Pfft.  I only write in Pascal.  You've probably never heard of it.
 
RandomTux
2014-05-08 09:26:19 AM

Glockenspiel Hero: b0rscht: Well, duh. Anyone who does HPC knows this. FORTRAN is the language of scientific number crunching.

FORTRAN sucks but the HPC vendors supply fast compilers and F95 is a high level language. It gets the job done. F95 added a lot of features compared to F77 and since it's backwards compatible, legacy code works, but you can do things like dynamic memory allocation and fun stuff like frob=0 when frob is a 3D array, etc., etc.

Dynamic memory allocation?  Hell, I worked with high performance ab-initio FORTRAN code back in grad school 25 years ago that did that.

Of course it did it by having the first statement in the program be

DOUBLE A[ memory size of machine ]


You mean DOUBLE PRECISION A( memory size of machine / size of double - whatever OS and your app need to, you know, actually run )?

/Haven't touched F77 since 1984
//Hated it back then
///Rather fuzzy on syntax, so might have rotskied myself
 
2014-05-08 09:27:18 AM

Marcus Aurelius: dittybopper: Could be worse. Could be COBOL

No, sorry.  No real scientist would ever code in COBOL.  They would commit suicide first.


COBOL is designed for use in business, not science. It's good for data management, not number-crunching.
 
2014-05-08 09:31:16 AM

dittybopper: SteakMan: No love for COBOL?

No.  None whatsoever.


I've never understood all the Battlestar Galactica hate on Fark.
 
2014-05-08 09:34:31 AM

xanadian: Or:

surprisedcounter += 1


Not in Fortran
 
2014-05-08 09:34:36 AM
Weird. I would have guessed Visual Basic.
 
TasyRadiSkull
2014-05-08 09:35:19 AM
In particle physics we use Root.

It goes down about as well as you might imagine.
 
2014-05-08 09:37:32 AM

Rapmaster2000: Pfft.  I only write in Pascal.  You've probably never heard of it.


I'd wager most people have.

/ Ada FTW!
 
2014-05-08 09:40:27 AM
The mark of a good scientist is that he can make code in any programming language look like Fortran
 
2014-05-08 09:46:06 AM
This is not at all surprising.
 
2014-05-08 09:49:51 AM
Fortran is hot.
 
bark_atda_moon
2014-05-08 09:55:32 AM

SteakMan: No love for COBOL?


COBOL puts food on my table and a roof over my head.  Where I work, the powers that be plan to convert it all to JAVA.  Management announced a few months ago that JAVA programs currently outnumber COBOL programs.  The problem is they were counting JAVA classes to COBOL modules.  A typical JAVA class has 10 - 20 lines of code.  Cobol modules have thousands...  I'm 33 and will be long retired before they finish converting all of this code.
 
2014-05-08 09:59:52 AM
      I had to  learn FORTRAN back in college in the early 90's during my failed attempt at an electrical engineering degree at Tulane.
      At least I no longer put six blanks before each sentence when typing.
 
t3knomanser
2014-05-08 10:03:58 AM

bark_atda_moon: Where I work, the powers that be plan to convert it all to JAVA


I know that COBOL programmers use their Caps Lock key pretty much  all the time, but JAVA is not an acronym. It's simply Java. I used to give crash-courses to COBOL programmers introducing them to Java, which was roughly akin to teaching a fish to ride a bicycle. The paradigms are so different.

bark_atda_moon: A typical JAVA class has 10 - 20 lines of code.


Those are amazingly small classes. That's not a bad thing, per se, but it's amazing considering a typical class usually contains 10-20 lines of pure boiler-plate before you include any implementation code. It's not unusual, or bad, for a class to be hundreds of lines long. There are poorly designed classes that may contain thousands of lines.

What I'm saying is that either your Java programmers are being extremely aggressive about keeping classes small and readable, or they're complete lunatics. That's not an exclusive "or", by the way.
 
2014-05-08 10:04:24 AM
And a lot of the programs are actually hybrid. You can purchase lots of numerical optimization libraries (like NAG) which are coded in Fortran, but you can call them from programs written in C, for instance.
 
2014-05-08 10:05:52 AM

TasyRadiSkull: In particle physics we use Root.

It goes down about as well as you might imagine.


Which is an unbelievable mess of a library. And don't get me started on CINT. But it's the unbelievable mess that we know...  cheers
 
2014-05-08 10:07:16 AM

b0rscht: I want to punch kittens every time I have to do string manipulation in FORTRAN. My left nut for a FORTRAN sprintf(). And no null string termination... fark me with a chainsaw.


I always wondered why the original Adventure was written in FORTRAN. You'd think they'd have used something more suitable for string processing.

/a hollow voice says "Plugh."
 
Muta
2014-05-08 10:13:44 AM

bark_atda_moon: SteakMan: No love for COBOL?

COBOL puts food on my table and a roof over my head.  Where I work, the powers that be plan to convert it all to JAVA.  Management announced a few months ago that JAVA programs currently outnumber COBOL programs.  The problem is they were counting JAVA classes to COBOL modules.  A typical JAVA class has 10 - 20 lines of code.  Cobol modules have thousands...  I'm 33 and will be long retired before they finish converting all of this code.


We have a COBOL program that manages a couple gazillion records gathered over the last 25 years.  We get sub-second returns from queries and I don't think it's gone down in the 15 years I've been using it.  The powers that be want to upgrade to a "modern environment" and get away from the "green screens".  I have a feeling it will be an epic clusterfark.
 
bark_atda_moon
2014-05-08 10:17:35 AM

t3knomanser: Those are amazingly small classes. That's not a bad thing, per se, but it's amazing considering a typical class usually contains 10-20 lines of pure boiler-plate before you include any implementation code. It's not unusual, or bad, for a class to be hundreds of lines long. There are poorly designed classes that may contain thousands of lines.

What I'm saying is that either your Java programmers are being extremely aggressive about keeping classes small and readable, or they're complete lunatics. That's not an exclusive "or", by the way.


I probably underestimated the average number of lines per class, but they keep them small here.  Most of the code was contracted out and the contractors are told not to duplicate code.  Common getters and setters are broken off into their own classes so that they can be reused.  I would rather have the getter and setter in my program than call a method that does something so simple, but I have a COBOL programmer's mentality.
 
bark_atda_moon
2014-05-08 10:22:04 AM

Muta: We have a COBOL program that manages a couple gazillion records gathered over the last 25 years.  We get sub-second returns from queries and I don't think it's gone down in the 15 years I've been using it.  The powers that be want to upgrade to a "modern environment" and get away from the "green screens".  I have a feeling it will be an epic clusterfark.



We are forbidden from creating any more CICS screens.  Instead we build Java front ends and use the CICS transaction gateway to get to all of the backend stuff.  Slowly the green screens are going away, but it will take a decade or 2 to get them all.

What will take forever is converting all of the Batch COBOL to Java Batch.   Not going to happen in my lifetime.
 
studebaker hoch
2014-05-08 10:34:15 AM
Because n00bs are afraid of C.
 
2014-05-08 10:34:24 AM
Back in the day, FORTRAN was for the sissies - the math guys. The real boys, chemists, used MACHINE LANGUAGE. Only the National Science Foundation and your local drug dealer could tell you why.

Now get off my onion field.
 
2014-05-08 10:39:26 AM
surprisedcounter = surprisedcounter++; /* not sure if surprised */
 
2014-05-08 10:42:57 AM
My dissertation research was done in Fortran, so I've been getting a kick out of convincing employers that I know how to program when they see it listed first on my resume.  Most interviewers outside of universities are  only vaguely familiar with it as something that old people do.

My best friend is probably the best programmer I've ever met, but he's never seen Fortran.  I should've taken a picture of his face the time I explained to him about indenting at the beginning of each line and limiting variable names to six characters.  He thought I was shiatting him.
 
Slaves2Darkness
2014-05-08 10:46:15 AM

Rapmaster2000: Pfft.  I only write in Pascal.  You've probably never heard of it.


Pfft. I only write in Fourth. I know you have not heard of it.
 
2014-05-08 10:47:09 AM

dittybopper: b0rscht: Well, duh. Anyone who does HPC knows this. FORTRAN is the language of scientific number crunching.

FORTRAN sucks but the HPC vendors supply fast compilers and F95 is a high level language. It gets the job done. F95 added a lot of features compared to F77 and since it's backwards compatible, legacy code works, but you can do things like dynamic memory allocation and fun stuff like frob=0 when frob is a 3D array, etc., etc.

If I had a nickel for each line of FORTRAN I've written I'd have a shiatload of nickels. But they would be stinky nickels because I would rather be writing in another language. I want to punch kittens every time I have to do string manipulation in FORTRAN. My left nut for a FORTRAN sprintf(). And no null string termination... fark me with a chainsaw.

I prefer C and python for analyzing/displaying output but when it comes to running models (what's done on supercomputers), it's mostly F95 (some folks still writing F77). Some folks have C/C++ models but not many.

Could be worse.  Could be COBOL.

Yes, occasionally I have to dig into COBOL code at my work, though we are no longer actively maintaining that code:  If a COBOL program needs to be modified, we re-write it in something a tad more modern instead of modifying the existing code.

Even so, we have well over a hundred of them still being used.


No, it could be freaking Java, which I'm convinced is just a farking cruel joke perpetrated on us by the 1%ers.
 
Glockenspiel Hero
2014-05-08 10:47:20 AM

RandomTux: Glockenspiel Hero: b0rscht: Well, duh. Anyone who does HPC knows this. FORTRAN is the language of scientific number crunching.

FORTRAN sucks but the HPC vendors supply fast compilers and F95 is a high level language. It gets the job done. F95 added a lot of features compared to F77 and since it's backwards compatible, legacy code works, but you can do things like dynamic memory allocation and fun stuff like frob=0 when frob is a 3D array, etc., etc.

Dynamic memory allocation?  Hell, I worked with high performance ab-initio FORTRAN code back in grad school 25 years ago that did that.

Of course it did it by having the first statement in the program be

DOUBLE A[ memory size of machine ]

You mean DOUBLE PRECISION A( memory size of machine / size of double - whatever OS and your app need to, you know, actually run )?

/Haven't touched F77 since 1984
//Hated it back then
///Rather fuzzy on syntax, so might have rotskied myself


Nope, just checked and you're correct.  Sigh- I probably should have looked it up before posting. It's been a long time since I did FORTRAN.  The application helpfully had a comment right before this giving the proper array size for various machines with differing amounts of core.  (Portable binary?  What's that?)

Looking this up also reminded me of the greatness of the IMPLICIT command.
 
2014-05-08 10:48:08 AM

t3knomanser: bark_atda_moon: Where I work, the powers that be plan to convert it all to JAVA

I know that COBOL programmers use their Caps Lock key pretty much  all the time, but JAVA is not an acronym. It's simply Java. I used to give crash-courses to COBOL programmers introducing them to Java, which was roughly akin to teaching a fish to ride a bicycle. The paradigms are so different.

bark_atda_moon: A typical JAVA class has 10 - 20 lines of code.

Those are amazingly small classes. That's not a bad thing, per se, but it's amazing considering a typical class usually contains 10-20 lines of pure boiler-plate before you include any implementation code. It's not unusual, or bad, for a class to be hundreds of lines long. There are poorly designed classes that may contain thousands of lines.

What I'm saying is that either your Java programmers are being extremely aggressive about keeping classes small and readable, or they're complete lunatics. That's not an exclusive "or", by the way.


They could be using a bunch of frameworks and annotating everything.
 
2014-05-08 10:56:20 AM

studebaker hoch: Because n00bs are afraid of C assembly machine code toggle switches.

 
2014-05-08 10:56:56 AM

Slaves2Darkness: Rapmaster2000: Pfft.  I only write in Pascal.  You've probably never heard of it.

Pfft. I only write in Fourth. I know you have not heard of it.


That's because the programming language is called "Forth" and I know it quite well.
 
2014-05-08 10:57:24 AM

bark_atda_moon: Muta: We have a COBOL program that manages a couple gazillion records gathered over the last 25 years.  We get sub-second returns from queries and I don't think it's gone down in the 15 years I've been using it.  The powers that be want to upgrade to a "modern environment" and get away from the "green screens".  I have a feeling it will be an epic clusterfark.


We are forbidden from creating any more CICS screens.  Instead we build Java front ends and use the CICS transaction gateway to get to all of the backend stuff.  Slowly the green screens are going away, but it will take a decade or 2 to get them all.

What will take forever is converting all of the Batch COBOL to Java Batch.   Not going to happen in my lifetime.


Honestly that's the way to go, creating front ends in flavor-of-the-month-that-management-wants that just talks to the older stuff. Leave the heavy lifting back end in place if it works. I'm doing a lot of that kind of thing replacing some of our old AS400 green screens with web interfaces. It's nowhere near as complicated as what you do but the concept is similar. I'm a C# monkey so most of this stuff is a little over my head.
 
2014-05-08 11:05:43 AM
Hate FORTRAN - hate it hate it hate it

Kludgy, awkward mess that takes 20 lines to do things that sane languages do in 1, on top of the memory management paradigm from hell that meant every non-trivial program implemented its own memory manager.  And you haven't seen a bloody hack until you've seen a memory manager written by a physicist who cut his teeth on the Gemini program.

I still have to work with it a lot because so much of our legacy code is in FORTRAN and there's no time or budget to rewrite & validate it.  Some of this stuff may as well have been handed down by God - i.e., there's nobody still here who was there at the Creation, and it's blasphemous to even suggest that maybe, just maybe, program XYZ isn't all that and we might do better.

And of course the corporate greybeards still write it - mostly F77, because damned if they'll learn anything new, and they did try it once and there was darkness upon the world, dogs and cats living together, etc.  Took a long time to get decent F95 compilers, other than the expensive vendor compilers, and even those had/have ugly quirks between platforms.  Last time I tried, GCC F95 still sucked.

I honestly haven't seen much performance difference between FORTRAN and well-written C/C++ with modern compilers.  The maintainability, extensibility, development time, etc. are infinitely better.  No doubt, you CAN screw the pooch performance-wise much more easily with those languages - FORTRAN simply doesn't allow enough abstraction to really truly fsck things up the way you can with C++.

Anyway, most of the time now I just write a C wrapper for the important bits and move on.  More interested in the problem than the tools - just hate working with kludgy tools.
 
2014-05-08 11:10:32 AM
Granted, I DNRTFA all the way through, but the author is somewhat off.

Yes, there are lots of people who still use FORTRAN for their scientific calculations, but it's probably not as prevalent as he thinks, particularly when people write new code. If you're using a common compiler, FORTRAN and C/C++ are completely interoperable, so there's really no reason to write new FORTRAN code unless you're sticking with FORTRAN for the sake of FORTRAN. I know that anecdotes are stupid, but I've never personally seen anyone write new FORTRAN code (though I'm from a CS department).

Second, the author has some pretty quaint notions of what is going to replace FORTRAN. The people who have traditionally used FORTRAN (or C or C++) have done so because they feel that they need the level of control that such languages provide, and usability and elegance are secondary. Stuff like functional programming is an absolute non-starter, and in general terms any programming language/paradigm that's centered around really optimizing a single use case isn't going to be flexible enough to do low-level, highly optimized system work. For example, some people replace the standard C stack convention with a cactus stack for parallel computing. It makes perfect sense in C and FORTRAN, and while I'm sure you can do it in something like Java or Haskell, it's a pretty objectively bad way to do things.
 
2014-05-08 11:11:44 AM
C++ far exceeds all other languages in optimizable code because of expression templates, which completely remove the unnecessary intermediate copies in large collection operations (like vector ops). C, Fortran, etc. cannot do that in library functions; you'd have to remove the intermediates by hand, expanding out the operations once for every element (which is never done, because you can't easily change data size once done).

I did FORTRAN on my simulations in college too. Nothing surprising. But I certainly use C++ for everything these days because it allows real optimization.

I also would have preferred a mention of OCaml if we are talking performant functional languages. Haskell is nice, but the ML guys really have the market on speed, and OO programming in functional land is best with the duck typing of OCaml.
 
Displayed 50 of 93 comments



This thread is closed to new comments.
