
(Space.com)   USGS has just released the most detailed map of the Martian surface ever   (space.com)
    More: Interesting  

1357 clicks; posted to Geek » on 15 Jul 2014 at 4:28 PM (8 days ago)



20 Comments
   
 
2014-07-15 03:54:26 PM
[image: i65.photobucket.com]
 
2014-07-15 04:51:12 PM
Would be nice without THE farkING BLARING AD THAT COMES OUT WITH MAXIMUM VOLUME

dickbags.
 
2014-07-15 05:09:29 PM
I didn't even know that the Martian surface was part of the US.
 
2014-07-15 05:31:04 PM

SpdrJay: I didn't even know that the Martian surface was part of the US.


[image: marsbymartian.wikispaces.com]
 
2014-07-15 06:02:01 PM

SpdrJay: I didn't even know that the Martian surface was part of the US.



It's not just part of the U.S., it's overwhelmingly Republican.

/red planet, indeed.
 
2014-07-15 06:51:54 PM
Mars

/get your ass to it
 
2014-07-15 08:33:28 PM
[image: 3.bp.blogspot.com]
 
2014-07-15 08:41:33 PM
Will we soon be seeing a Google Mars?
 
2014-07-15 08:52:49 PM
CSB time...

In '96 or so, when I lived in Flagstaff, my roommate wrote this software. At the time, it was a massive project. The computer they ran it on had 4 terabytes of RAM, took half an hour to boot, and three hours if they ran a memory test.

His software was designed to take orbiter and probe data and stitch it together. I'm pretty sure his software was at least the base that Google Earth was built on as well, and since this is USGS, which is who he worked for, I feel pretty confident that this is his software, obviously almost 20 years down the road.

I hope he and his team are still getting money, but he probably just got paid his normal salary and the government keeps the rest.

Back in pre-DVD days, I think this was spread out over something like 32 CDs.
 
2014-07-15 08:53:24 PM
How does it compare to my map of Duna?
 
2014-07-15 11:35:54 PM

Mikey1969: CSB time...

In '96 or so, when I lived in Flagstaff, my roommate wrote this software. At the time, it was a massive project. The computer they ran it on had 4 terabytes of RAM, took half an hour to boot, and three hours if they ran a memory test.


Uh... 4 *TERA* bytes of RAM? You mean gigabytes, right? What computer architecture could address that amount of memory back then?
 
2014-07-16 12:19:29 AM
No planet is more steeped in myth and misconception than Mars.

Earth.

People even think there is intelligent life there.
 
2014-07-16 12:33:18 AM

neilbradley: Mikey1969: CSB time...

In '96 or so, when I lived in Flagstaff, my roommate wrote this software. At the time, it was a massive project. The computer they ran it on had 4 terabytes of RAM, took half an hour to boot, and three hours if they ran a memory test.

Uh... 4 *TERA* bytes of RAM? You mean gigabytes, right? What computer architecture could address that amount of memory back then?


Nope, he specified terabytes. This was when GB hard drives first started hitting the shelves, and he specifically told me a tera was 1,000 GB because I was still amazed at the concept of a friggin' GB. When I had been working with computers in high school, it was all on a floppy drive. Besides, 64-bit will handle more than that (4 petabytes is the theoretical limit, I think). This was a federal gov't project, so they could throw money at it like it was a stripper and all they had was gold bullion.

And a quick Wiki check confirms that 64-bit was bouncing around by the mid-90s. I'm sure it was still a lumbering beast in comparison, but it had 4 terabytes.
 
2014-07-16 01:09:41 AM

Mikey1969: neilbradley: Mikey1969: CSB time...

Uh... 4 *TERA* bytes of RAM? You mean gigabytes, right? What computer architecture could address that amount of memory back then?

Nope, he specified Terabytes. This was when GB hard drives first started hitting the shelves, and he specifically told me a tera was 1,000 GB because I was still amazed at the concept of a friggin' GB. When I had been working with computers in high school, it was all on a floppy drive. Besides, 64 bit will handle more than that(4 Petabytes is the theoretical limit, I think). This was a federal gov't project, so they could throw money at it like it was a stripper and all they had was gold bullion.

And a quick Wiki check confirms that 64 bit was bouncing around by the mid-90s. I'm sure it was still a lumbering beast in comparison, but it had 4 terabytes.


What and where was it? As an aside, just because a computer is "64 bits" doesn't mean the ecosystem can address it all implicitly. There are ecosystem limitations in most cases (e.g., the chipset won't allow higher addressability).
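
For a sanity check on the sizes being debated here, the arithmetic is easy to run. The sketch below is Python using binary units; the address widths are generic examples and say nothing about the actual USGS machine, whose specs aren't given in this thread.

def addressable_bytes(bits):
    """Maximum number of bytes reachable with an address of the given width."""
    return 2 ** bits

for bits in (32, 40, 44, 64):
    size = float(addressable_bytes(bits))
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB"):
        if size < 1024:
            break
        size /= 1024
    print(f"{bits}-bit addresses -> {size:g} {unit}")

# Prints 4 GiB, 1 TiB, 16 TiB, and 16 EiB respectively. So a 64-bit
# instruction set is not the limit by itself; as noted above, the physical
# address lines, chipset, and OS decide how much RAM can really be installed,
# and mid-90s machines typically topped out far below the architectural maximum.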
 
2014-07-16 01:59:03 AM

elchupacabra: Would be nice without THE farkING BLARING AD THAT COMES OUT WITH MAXIMUM VOLUME

dickbags.


Get Ghostery, works wonders for stuff like this.

 
2014-07-16 02:07:26 AM

noblewolf: Will we soon be seeing a Google Mars?


Hmmm....maybe.

http://www.google.com/mars/
 
2014-07-16 05:20:12 AM

Hollie Maea: noblewolf: Will we soon be seeing a Google Mars?

Hmmm....maybe.

http://www.google.com/mars/


Neato! That was fun to play with. Thanks.
 
2014-07-16 08:19:18 AM
[image: static.guim.co.uk]

"I thought you said they'd never see me sunbathing nude, John Carter."
 
2014-07-16 09:16:23 AM

neilbradley: Mikey1969: neilbradley: Mikey1969: CSB time...

Uh... 4 *TERA* bytes of RAM? You mean gigabytes, right? What computer architecture could address that amount of memory back then?

Nope, he specified terabytes. This was when GB hard drives first started hitting the shelves, and he specifically told me a tera was 1,000 GB because I was still amazed at the concept of a friggin' GB. When I had been working with computers in high school, it was all on a floppy drive. Besides, 64-bit will handle more than that (4 petabytes is the theoretical limit, I think). This was a federal gov't project, so they could throw money at it like it was a stripper and all they had was gold bullion.

And a quick Wiki check confirms that 64-bit was bouncing around by the mid-90s. I'm sure it was still a lumbering beast in comparison, but it had 4 terabytes.

What and where was it? As an aside, just because a computer is "64 bits" doesn't mean the ecosystem can address it all implicitly. There are ecosystem limitations in most cases (e.g., the chipset won't allow higher addressability).


This was at a USGS facility in Flagstaff, and it wasn't something off the shelf. I always had the idea that it was a custom build by USGS, specifically for that project and that project only. This was the initial tech to stitch multiple images together to wrap around a globe, exactly what Google Earth did to blow us all away 10 years or so ago. Keep in mind that when these projects go live, "We have the tech, but nobody has made one yet" doesn't usually apply. They announce what they want and then take bids.

He's credited all over the place, but here's the site that mentions his original development:
http://www.mapaplanet.org/about.html
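
For anyone curious what "stitch multiple images together to wrap around a globe" involves, here is a toy sketch of the projection bookkeeping such a pipeline does, in Python. It is purely illustrative and is not the USGS/Map-a-Planet software described above; the mosaic dimensions are made up for the example.

import math

# Toy sketch: relate pixels of a simple-cylindrical (equirectangular) global
# mosaic to latitude/longitude. Real mosaicking software also resamples each
# orbiter frame from its own camera geometry into this common grid, which is
# where the heavy computing (and memory) goes.
MARS_MEAN_RADIUS_KM = 3389.5

def pixel_to_lat_lon(col, row, width, height):
    """Center of pixel (col, row) in a width x height global mosaic, in degrees.
    Longitude runs 0..360 east; latitude runs from +90 (top row) to -90 (bottom)."""
    lon = (col + 0.5) / width * 360.0
    lat = 90.0 - (row + 0.5) / height * 180.0
    return lat, lon

def km_per_pixel_at_equator(width):
    """Approximate ground resolution at the equator for a mosaic of this width."""
    return 2.0 * math.pi * MARS_MEAN_RADIUS_KM / width

w, h = 46080, 23040  # hypothetical mosaic size, not the real product's
print(pixel_to_lat_lon(0, 0, w, h))          # top-left pixel, near the north pole
print(round(km_per_pixel_at_equator(w), 3), "km per pixel at the equator")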
 
2014-07-16 09:48:19 PM

neilbradley: Mikey1969: neilbradley: Mikey1969: CSB time...

Uh... 4 *TERA* bytes of RAM? You mean gigabytes, right? What computer architecture could address that amount of memory back then?

Nope, he specified terabytes. This was when GB hard drives first started hitting the shelves, and he specifically told me a tera was 1,000 GB because I was still amazed at the concept of a friggin' GB. When I had been working with computers in high school, it was all on a floppy drive. Besides, 64-bit will handle more than that (4 petabytes is the theoretical limit, I think). This was a federal gov't project, so they could throw money at it like it was a stripper and all they had was gold bullion.

And a quick Wiki check confirms that 64-bit was bouncing around by the mid-90s. I'm sure it was still a lumbering beast in comparison, but it had 4 terabytes.

What and where was it? As an aside, just because a computer is "64 bits" doesn't mean the ecosystem can address it all implicitly. There are ecosystem limitations in most cases (e.g., the chipset won't allow higher addressability).


I only repeated this 3 or 4 times. It was the computer that this mapping software was originally developed on when things like Google weren't even concepts, let alone software like Google Earth.

http://astrogeology.usgs.gov/
 
Displayed 20 of 20 comments



This thread is closed to new comments.
