
(PC Gamer)   A history of literally changing the game   (pcgamer.com)
    More: Cool, Nvidia, Video cards, ATI Technologies, Graphics processing unit, Scalable Link Interface, GeForce 8 Series, GeForce, Nvidia's pocket

1696 clicks; posted to Geek » on 02 Jul 2020 at 3:10 PM



28 Comments
 
 
2020-07-02 12:31:43 PM  
The RTX 2080 Ti isn't just an overpowered graphics card, it's also the reason why nobody's crying over losing the ability to enable SLI: one's enough, for now.

No, it's more like no one can stomach the idea of paying for two of them when SLI has been left in such a shambles of diminished support from developers and Nvidia itself.  The bonkers price-tag jump for this one vs. previous top-tier cards didn't help either.  4K gaming at 60FPS is more approachable than ever with that behemoth, but reaching higher than that would still take two of the brutes, along with SLI support that largely doesn't exist anymore.

One day, maybe, AMD will come back to competing in the top-end space and force prices down to competitive levels.
 
2020-07-02 3:05:50 PM  
... did I just lose The Game?
 
2020-07-02 3:18:26 PM  
My first graphics card was a Diamond S3 Savage4 GT.

It had terrible support; pretty much only NFS and MotoGP ran well.

Now I have a 2080; I wish I had the Ti.
 
2020-07-02 3:20:58 PM  
Jesus... all I can remember from those years is having to update your drivers to play any new game.
 
2020-07-02 3:21:46 PM  
Huh. This guy's an idiot.

Hercules. Graphics. Adapter.

There. There's your game changer.
 
2020-07-02 3:29:43 PM  
I still have my 3DFX Voodoo in a box in the closet.

/hey it might come in handy someday
 
2020-07-02 3:35:08 PM  
The Radeon 9700 was every bit as game-changing as the GeForce 8800 GT blowdryer was.
 
2020-07-02 3:37:04 PM  

Jedekai: Huh. This guy's an idiot.

Hercules. Graphics. Adapter.

There. There's your game changer.


They don't mention any VESA cards either.
 
2020-07-02 3:38:06 PM  
The list fails by omitting the ATi 9800 Pro. That card was a magnificent beast in its day. It had color depth that Nvidia couldn't hope to match and damn fine acceleration on top of it.
 
2020-07-02 3:51:03 PM  
The first half of the list is accurate. The second half is basically the hot cards of recent years.
 
2020-07-02 4:07:35 PM  
$200 for a 4MB Diamond Monster; IIRC that was the name. Fired up Quake, and it might have been the first time I splooged over something besides Pamela Anderson or Jenny McCarthy.
 
2020-07-02 4:13:54 PM  
My first 3D video accelerator was a 3dfx Voodoo card. I remember the "click" it would make when it activated, and I remember being amazed at how much it improved my framerate in Quake. It also made the PC version of Tomb Raider decidedly superior to the PlayStation version.

I played other games with it before eventually moving on to an Nvidia card of some kind (a TNT2, I believe, or maybe just the original TNT), but those two are the first I really used with the card.
 
2020-07-02 4:26:53 PM  
The list really needed the GeForce 2 MX. It was a great budget GPU for its time.
 
2020-07-02 4:32:14 PM  
Had the Voodoo and Voodoo2, as well as the 8800 GTX. Nowadays, even as an adult with a job, I can't justify the cost of those top-tier cards. I was hoping the list would include more cards that were attainable without a second mortgage.

RTX 2070 now, no complaints
 
2020-07-02 5:04:26 PM  
I worked on the design of the Matrox G400, where we launched the idea of a dual-monitor system.

However, the owners of Matrox, a private company, were a bit of pinch-purses, and they squandered every chance Matrox had to capitalize on that or other technologies. So no patents were filed.

A few years later, Matrox faded from competition, and a few years after that, ATI was bought.
 
2020-07-02 5:07:57 PM  
Lol, they rolled with the 5970, a buggy card that no one bought, that rarely met its full potential, and that used an idea dropped within a couple of generations. Meanwhile they gave only a casual mention to the 4770, which redefined the mid-range, was purpose-built to dominate that market on price/performance, and revived ATI.

I was under the impression changing the game meant introducing ideas and technologies that redefined the game and became the standard for a long time.

Although I guess ruining the game technically counts as changing it.
 
2020-07-02 5:43:35 PM  
[image]
This, plus seeing Quake with OpenGL for the first time.
 
2020-07-02 6:10:17 PM  
I skipped the 3dfx Voodoo and went for the Canopus Pure3D instead. My next 3D setup was Voodoo2 SLI with active cooling, extremely active. Those were the days when I overclocked everything, even though overclocking wasn't the "click a checkbox" affair that it is today. Change jumpers, change voltage, run a wire, toast the BIOS, which meant I needed a new mobo since the BIOS chips weren't removable...
 
2020-07-02 6:13:44 PM  

BumpInTheNight: No, it's more like no one can stomach the idea of paying for two of them


I have two of them. But I don't use them for gaming. 3D rendering is a very expensive hobby.
 
2020-07-02 6:18:00 PM  

Lonestar: I worked on the design of the Matrox G400, where we launched the idea of a dual-monitor system.


Really? I loved my G400. Great card.
 
2020-07-02 6:39:40 PM  

Lonestar: I worked on the design of the Matrox G400, where we launched the idea of a dual-monitor system.


That Matrox TripleHead2Go dongle was my first taste of triple-screen gaming; it was glorious.

Ed Grubermann: I have two of them. But I don't use them for gaming. 3D rendering is a very expensive hobby.


Same, but in two different PCs.  My wife and I play Ark, and Ark is a hungry beast for GPU hardware.

/expensive hobbies indeed
 
zez
2020-07-02 6:51:39 PM  
I had a Riva 128, and the Quake drivers came out when I was halfway through the game.  It looked so different that I couldn't remember where I was on the map.
 
2020-07-02 7:49:12 PM  
S3 ViRGE = Best Ever!
 
2020-07-02 8:08:14 PM  
VGA was what changed everything for the PC, along with the Gravis UltraSound card. With those, the PC was finally able to catch up with competitors like the Amiga and the Falcon.

Dope (Complex, 1995, PC demo) [YouTube: de6P9JPnBVA]
 
2020-07-02 10:23:42 PM  
Extending my rant on the RTX 2080 Ti: we overpaid for this stupid card partly thanks to the vaporware that is Nvidia's implementation of ray tracing showing up in games.  There are what, like three meaningful titles that use it now, vs. a couple dozen that were promised when the series launched, coming up on two years ago?  Another cool tech that could have been great, but pushing more development cycles onto the game creators killed it in the crib, just like various other gimmicks.

If Nvidia had released a GTX 2080 Ti that didn't have any of those features or the built-in cost associated with them, you know in your heart which of the two you'd have bought.
 
2020-07-02 11:00:51 PM  

BumpInTheNight: Extending my rant on the RTX 2080 Ti: we overpaid for this stupid card partly thanks to the vaporware that is Nvidia's implementation of ray tracing showing up in games.  There are what, like three meaningful titles that use it now, vs. a couple dozen that were promised when the series launched, coming up on two years ago?  Another cool tech that could have been great, but pushing more development cycles onto the game creators killed it in the crib, just like various other gimmicks.

If Nvidia had released a GTX 2080 Ti that didn't have any of those features or the built-in cost associated with them, you know in your heart which of the two you'd have bought.


It's not that you won't get games you can run ray tracing on; it's just that Turing isn't powerful enough for it, and it was the only hardware even vaguely capable of running it in real time. So people didn't bother.

Everything (AMD, Nvidia, and the consoles) supports it in the coming gen, and it offers a visible difference, so it will become a thing your GPU is capable of, but only just. Like running a 4K game on an RX 580: you can turn other settings down to do so, but it's not really worth it.

Even if Nvidia's ray tracing dies in favour of an open version, they'll just rewrite the drivers to forward the processing to the tensor cores, so there should be no real difference in which implementation is used, like what happened with G-Sync and FreeSync.

So expect future games to have it, particularly after the new consoles release, and you will technically be able to turn it on, but also expect to turn it off again afterwards. Nvidia really should have halved the size of the tensor package and used that space for more CUDA cores.

On a positive note, that noise-cancelling thing is intended to be offloaded to the tensor cores in a future release, so they're not entirely useless. Just mostly useless.
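
For what it's worth, the "technically able to turn it on" part is exactly how games treat it: DXR support is a single capability tier queried from Direct3D 12, independent of whose silicon (or driver forwarding) backs it. A minimal sketch of that probe in C++ (the function name and bare-bones structure are mine for illustration; a real engine would enumerate adapters and handle errors properly):

    #include <d3d12.h>
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    // Returns true if the default adapter exposes hardware-accelerated
    // ray tracing (DXR tier 1.0 or better). A game uses a check like
    // this to decide whether to show the ray tracing toggle at all.
    bool SupportsHardwareRaytracing()  // hypothetical helper name
    {
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device))))
            return false;

        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5))))
            return false;

        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }

That's also why the implementation fight shouldn't matter much to game code: whichever side wins, the query and the toggle look the same.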
 
2020-07-03 1:35:04 AM  
I have a GTX 1060 3GB in my computer. Despite it being a generation old, I can still max out everything and get at least a solid 30fps, usually 60.

As revolutionary as graphics were in the 90s, those were some expensive growing pains. A $250 card (in 1990s money) could be made completely obsolete within 18 months. It has only been in the last 10-20 years that PC gaming hardware has managed to level off a bit and not become obsolete by the time you get it home.
 
2020-07-03 8:42:28 AM  

dyhchong: BumpInTheNight: Extending my rant on the RTX 2080 Ti: we overpaid for this stupid card partly thanks to the vaporware that is Nvidia's implementation of ray tracing showing up in games.  There are what, like three meaningful titles that use it now, vs. a couple dozen that were promised when the series launched, coming up on two years ago?  Another cool tech that could have been great, but pushing more development cycles onto the game creators killed it in the crib, just like various other gimmicks.

If Nvidia had released a GTX 2080 Ti that didn't have any of those features or the built-in cost associated with them, you know in your heart which of the two you'd have bought.

It's not that you won't get games you can run ray tracing on; it's just that Turing isn't powerful enough for it, and it was the only hardware even vaguely capable of running it in real time. So people didn't bother.

Everything (AMD, Nvidia, and the consoles) supports it in the coming gen, and it offers a visible difference, so it will become a thing your GPU is capable of, but only just. Like running a 4K game on an RX 580: you can turn other settings down to do so, but it's not really worth it.

Even if Nvidia's ray tracing dies in favour of an open version, they'll just rewrite the drivers to forward the processing to the tensor cores, so there should be no real difference in which implementation is used, like what happened with G-Sync and FreeSync.

So expect future games to have it, particularly after the new consoles release, and you will technically be able to turn it on, but also expect to turn it off again afterwards. Nvidia really should have halved the size of the tensor package and used that space for more CUDA cores.

On a positive note, that noise-cancelling thing is intended to be offloaded to the tensor cores in a future release, so they're not entirely useless. Just mostly useless.


No worries; with two decades of buying video cards and following the market, I was well aware of the albatross that the ray-tracing hardware built into the RTX 2xxx series was.  As for the expectation that future games will support it: I imagine it'll be as widely adopted as SLI, as in, it won't be, except by a sliver of major devs pushing classic AAA titles who are bored and looking to add yet another buzzword to their engine's capabilities.

I agree with you that they should have reduced the emphasis on the tensor hardware, but they knew they had no competition in this space, and they didn't even really care; they knew full well how little user-relevant innovation this series offered.
 
Displayed 28 of 28 comments


This thread is closed to new comments.
