
(Guru3D.com) Nvidia finally says the quiet part out loud about their technology (guru3d.com)

1922 clicks; posted to Fandom on 18 Sep 2020 at 12:47 PM



21 Comments
 
 
2020-09-18 12:59:46 PM  
I mean, I want to be disappointed, but I have been building gaming PCs for more than a quarter century and I can't remember having ever actually used SLI/Crossfire.

I've bought mobos that could support it, with the intention that I could add a second graphics card later (at lower prices because it will have been out a few years) once my first one started showing its age. But that's not how it ever worked out. By that time there was pretty much always a newer card out that I could buy for a little more than the older card I'd need to SLI, and the performance of the single newer card would have been better than the two older ones SLI'd - meaning that I'd need to upgrade the two older cards pretty soon anyway, which would make it a waste of money.

For example, my last graphics card upgrade was going from a 780 Ti to an RTX 2080. Two 780 Tis in SLI would probably get crushed by my 2080. Instead I dedicated the 780 Ti to PhysX for a while, then took it out and installed it in one of my servers to assist with transcoding - with no noticeable reduction in speed of my gaming machine.

And seeing how few people IRL actually use SLI... I understand Nvidia not bothering to support it at their driver level anymore, being that newer graphics tech means developers can choose to include it in their games anyway if they think it makes sense.

I think SLI made the most sense to the few people who could afford to buy two of the latest cards at launch for max performance. Most PC gamers don't have that kind of money laying around.
 
2020-09-18 1:18:31 PM  

mongbiohazard: I mean, I want to be disappointed, but I have been building gaming PCs for more than a quarter century and I can't remember having ever actually used SLI/Crossfire.

I've bought mobos that could support it, with the intention that I could add a second graphics card later (at lower prices because it will have been out a few years) once my first one started showing its age. But that's not how it ever worked out. By that time there was pretty much always a newer card out that I could buy for a little more than the older card I'd need to SLI, and the performance of the single newer card would have been better than the two older ones SLI'd - meaning that I'd need to upgrade the two older cards pretty soon anyway, which would make it a waste of money.

For example, my last graphics card upgrade was going from a 780 Ti to an RTX 2080. Two 780 Tis in SLI would probably get crushed by my 2080. Instead I dedicated the 780 Ti to PhysX for a while, then took it out and installed it in one of my servers to assist with transcoding - with no noticeable reduction in speed of my gaming machine.

And seeing how few people IRL actually use SLI... I understand Nvidia not bothering to support it at their driver level anymore, being that newer graphics tech means developers can choose to include it in their games anyway if they think it makes sense.

I think SLI made the most sense to the few people who could afford to buy two of the latest cards at launch for max performance. Most PC gamers don't have that kind of money laying around.


Agreed. The only time I went with SLI was when I got a new 8800GT, then got a 1920x1200 monitor for Christmas. This was when a single upper-mid-range card would struggle at 1080/1200p, and getting a second 8800GT made more sense than going through the pain of selling the first one and stepping up to an Ultra.

These days I just stick to a single upper-tier card, and it's a lot easier.

That being said -

What DirectX 12 games support SLI natively within the game?
DirectX 12 titles include Shadow of the Tomb Raider, Civilization VI, Sniper Elite 4, Gears of War 4, Ashes of the Singularity: Escalation, Strange Brigade, Rise of the Tomb Raider, Zombie Army 4: Dead War, Hitman, Deus Ex: Mankind Divided, Battlefield 1, and Halo Wars 2.


Now it's been a while since I've played Civ, I think IV was the last I played.  But... It's not exactly a series I think of when I imagine "Graphically intense enough to need multiple GPUs".
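
For context, "supports SLI natively within the game" means the DX12 engine itself finds the GPUs and splits the work; the driver no longer does it behind the game's back. Below is a minimal C++ sketch of the first step an engine takes, enumerating adapters through DXGI. It's Windows-only, illustrative rather than any particular engine's code, and device creation and most error handling are omitted:

    // Sketch: how a DX12 engine might list the GPUs in a system before deciding
    // whether to drive more than one (explicit multi-adapter). Link dxgi.lib.
    #include <dxgi.h>
    #include <cstdio>

    int main() {
        IDXGIFactory1* factory = nullptr;
        if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
            return 1;

        IDXGIAdapter1* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            // Skip the WARP software rasterizer; a real engine would also
            // check the D3D12 feature level here.
            if (!(desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE))
                wprintf(L"GPU %u: %s (%zu MB VRAM)\n", i, desc.Description,
                        desc.DedicatedVideoMemory / (1024 * 1024));
            adapter->Release();
        }
        factory->Release();
        return 0;
    }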
 
2020-09-18 1:22:08 PM  

mongbiohazard: I mean, I want to be disappointed, but I have been building gaming PCs for more than a quarter century and I can't remember having ever actually used SLI/Crossfire.

I've bought mobos that could support it, with the intention that I could add a second graphics card later (at lower prices because it will have been out a few years) once my first one started showing its age. But that's not how it ever worked out. By that time there was pretty much always a newer card out that I could buy for a little more than the older card I'd need to SLI, and the performance of the single newer card would have been better than the two older ones SLI'd - meaning that I'd need to upgrade the two older cards pretty soon anyway, which would make it a waste of money.

For example, my last graphics card upgrade was going from a 780 Ti to an RTX 2080. Two 780 Tis in SLI would probably get crushed by my 2080. Instead I dedicated the 780 Ti to PhysX for a while, then took it out and installed it in one of my servers to assist with transcoding - with no noticeable reduction in speed of my gaming machine.

And seeing how few people IRL actually use SLI... I understand Nvidia not bothering to support it at their driver level anymore, being that newer graphics tech means developers can choose to include it in their games anyway if they think it makes sense.

I think SLI made the most sense to the few people who could afford to buy two of the latest cards at launch for max performance. Most PC gamers don't have that kind of money laying around.


I used SLI with two 8800 GTXs; damn, that rig got hot. I also used it to link my GTX 970 and GTX 750 Ti for PhysX, but found it difficult to see much point, especially at the cost of extra heat and power.
 
2020-09-18 1:23:40 PM  
It's still possible for graphics card vendors to roll drivers that make any combination of display adapters look like a single GPU to applications. The real question is whether there's enough TAM (total addressable market) for such a solution to be worth the effort. Personally, I'd rather see efforts focused elsewhere, but at the end of the day, chasing the almighty dollar is still priority one.
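
DX12 does still carry a faint version of that single-GPU illusion: when the driver links cards, the API reports one device with multiple "nodes". A small hedged sketch, assuming an IDXGIAdapter1 obtained as in the enumeration snippet earlier; CountLinkedGpus is a hypothetical helper name, not a real API:

    // Sketch: DX12 exposes driver-linked GPUs (the old SLI model) as a single
    // device with N "nodes". CountLinkedGpus is a hypothetical helper name.
    #include <d3d12.h>
    #include <dxgi.h>

    UINT CountLinkedGpus(IDXGIAdapter1* adapter) {
        ID3D12Device* device = nullptr;
        if (FAILED(D3D12CreateDevice(adapter, D3D_FEATURE_LEVEL_11_0,
                                     __uuidof(ID3D12Device), (void**)&device)))
            return 0;
        UINT nodes = device->GetNodeCount();  // 1 normally; 2+ only when linked
        device->Release();
        return nodes;
    }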
 
2020-09-18 1:27:25 PM  

freakdiablo: Now it's been a while since I've played Civ, I think IV was the last I played. But... It's not exactly a series I think of when I imagine "Graphically intense enough to need multiple GPUs".


Strange Brigade being on the Vulkan SLI list was even weirder to me.  There really isn't anything going on there that should even come close to needing multiples.
 
2020-09-18 1:31:51 PM  
I've tried SLI, even though the cost-to-performance ratio was all wrong, but it was always broken for 3D and doesn't work for VR. So for quite a while now I've just bought the best single card I can afford and upgraded as often as I can.
 
2020-09-18 1:51:11 PM  
I too have been building my own for a long time now.
Did SLI once; never bothered to spend the money on a subsequent build.

Just too much hassle and cost for too little performance gain.

I don't care how the FPS changes at what detail level - how much time and effort do I have to sink into making it work, or into figuring out that disabling it is needed for a game to run well?

Once the FPS and detail level are good enough, I can't be bothered to put in extra effort for better. Either it's push-button, reliably better because I spent twice as much for it, or the hassle is just not worth the gains to be found.
 
2020-09-18 1:52:59 PM  
I think SLI was only relevant for the first year of VR. After that, newer graphics cards were powerful enough that you only needed one for VR.
 
2020-09-18 2:12:18 PM  
SLI mattered when it took a lot of GPU to render good framerates in high quality for 1080p. With cards like the 3090, single GPUs are able to render to 8k displays in high quality... SLI hasn't been relevant for the past two generations of GPUs, and the gains were approaching rapidly diminishing returns for that configuration anyway.

In the next few generations, it will be about adding more specialty cores (and making them more efficient), as well as fine tuning the needs of VR for things like foveated rendering, body and gesture recognition.

Honestly, I think most people can happily desktop game at 1080p these days with Ultra and ray tracing, and be pretty happy. You get that with mid level cards, no need at all for SLI.
 
2020-09-18 2:20:16 PM  
They ended SLI support again??

I think the only game that came out in the past few years that supported it was Far Cry 5. I got it running on Red Dead 2, but you need RivaTuner and it's Vulkan only; it doesn't work on DX12.
Doom Eternal doesn't need it.

I did it just to do it. Totally not worth spending $1,400 on GPUs.
 
2020-09-18 2:26:51 PM  
I'd been a long-time user of SLI since the Voodoo2, but my 1080s were the last ones. Once I'd read that they stripped the really good functionality out of NVLink to make it just a dumb bridge like the SLI connectors, I knew it was dead dead. Not to mention the list of games that supported it well was getting so slim, and skewing away from what I actually wanted to play.

It was also just logistically a royal pain in the ass to manage multiple cards and their heat output, and then juggle SLI settings and tweaks to kind-of-sort-of get things working.
 
2020-09-18 2:53:42 PM  
Spend your money on a better power supply. Good investment.
 
2020-09-18 3:46:40 PM  
I always thought using dual videocards was an ill-advised idea. That's not even counting the lack of game support.

If you spend $150 now on a card, then add a second card later for another $150, you've spent $300 but don't get a $300 card's worth of performance. Your one upside from doing it this way is you spend less upfront and maybe even get the second card after a price drop to let's say $100. You've still spent enough to not get good value in performance for the money.

But the reality is worse. Games get more demanding and videocards get faster over time. Unless you buy that second card relatively soon, you're going to be buying old and outdated tech. So not only will you get less performance per dollar, you're getting an additional hit to performance simply due to the passage of time.

Meanwhile, buying the second card relatively quickly defeats the purpose of spreading the expense out over time. You might as well have just waited until you had the entire $300 and bought the better individual videocard.
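
The math in that argument can be made concrete. A toy C++ calculation using the comment's prices; the 80% SLI scaling efficiency is an assumed, illustrative figure, not a benchmark:

    // Toy model of the buy-now-add-later math above. Prices from the comment;
    // the 0.8 SLI scaling efficiency is an assumed, illustrative number.
    #include <cstdio>

    int main() {
        const double cardNow   = 150.0;  // first card, bought today
        const double cardLater = 100.0;  // same model later, after a price drop
        const double sliEff    = 0.8;    // assumed scaling, not a measured value

        double sliCost = cardNow + cardLater;  // $250 total, spread over time
        double sliPerf = 2.0 * sliEff;         // ~1.6x one old card, on aging tech

        std::printf("SLI route: $%.0f for about %.1fx an old card\n", sliCost, sliPerf);
        std::printf("vs. one $300 card that is newer and faster per dollar\n");
        return 0;
    }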
 
2020-09-18 3:50:05 PM  

LesserEvil: SLI mattered when it took a lot of GPU to render good framerates in high quality for 1080p. With cards like the 3090, single GPUs are able to render to 8k displays in high quality... SLI hasn't been relevant for the past two generations of GPUs, and the gains were approaching rapidly diminishing returns for that configuration anyway.

In the next few generations, it will be about adding more specialty cores (and making them more efficient), as well as fine tuning the needs of VR for things like foveated rendering, body and gesture recognition.

Honestly, I think most people can happily desktop game at 1080p these days with Ultra and ray tracing, and be pretty happy. You get that with mid level cards, no need at all for SLI.


That's where I'm at right now. I have a 1080p monitor I'm gaming on... with a PC that has an i7-8700K, an RTX 2080, and 32GB of RAM. With that machine, there's really nothing I'm going to play that it can't run at the highest graphics settings.

I'm currently deciding whether to upgrade my main display to 1440p or 4K. I'm leaning toward 4K just because monitors tend to stick with you a lot longer than the PCs themselves (for me anyway), so I'm afraid 1440p wouldn't make sense as an upgrade at this point. I can always run it at a lower resolution if need be.
 
2020-09-18 3:58:33 PM  

Birnone: I always thought using dual videocards was an ill-advised idea.


For games, yes. My gaming rigs have always had only one card. For 3D rendering, more cards means more performance. But no rendering software supports SLI, since it treats the video cards as separate entities.
 
2020-09-18 4:02:23 PM  

Birnone: I always thought using dual videocards was an ill-advised idea. That's not even counting the lack of game support.

If you spend $150 now on a card, then add a second card later for another $150, you've spent $300 but don't get a $300 card's worth of performance. Your one upside from doing it this way is you spend less upfront and maybe even get the second card after a price drop to let's say $100. You've still spent enough to not get good value in performance for the money.

But the reality is worse. Games get more demanding and videocards get faster over time. Unless you buy that second card relatively soon, you're going to be buying old and outdated tech. So not only will you get less performance per dollar, you're getting an additional hit to performance simply due to the passage of time.

Meanwhile, buying the second card relatively quickly defeats the purpose of spreading the expense out over time. You might as well have just waited until you had the entire $300 and bought the better individual videocard.


Oh, it was never an economic advantage to do that; anyone trying to justify it that way wasn't thinking things through. What SLI helped the most with was driving very demanding screen resolutions that went beyond the normal. In years past I always ran a triple-screen setup, so I had effectively 3x the pixels and 2-3x the FOV of the target market, and SLI was typically required to maintain good frame rates and settings. Even today I still run into games whose settings go beyond what a top-end card can handle - even a high-clock 2080 Ti can't do RDR2 at 4K/60 without dumping some of the settings down.

I still think next-gen VR requirements sit above what the cards are able to deliver, at least without sacrificing graphics settings, which is a sin that cannot be forgiven, obviously. The 3080s are targeting 4K screens at 60-90fps for a usual experience. Next-gen VR is gunning for 2x 4K, at as fast a refresh rate as you can pull off, with the minimum being 90fps.
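
Those targets work out to roughly three times the raw pixel rate of 4K at 60Hz. A back-of-envelope C++ check using the figures from the comment above:

    // Back-of-envelope: two 4K eye buffers at 90 Hz vs one 4K panel at 60 Hz.
    #include <cstdio>

    int main() {
        const double fourK   = 3840.0 * 2160.0;    // pixels per 4K frame
        const double desktop = fourK * 60.0;       // 4K @ 60 Hz
        const double vr      = 2.0 * fourK * 90.0; // two 4K panels @ 90 Hz
        std::printf("VR needs %.1fx the pixel rate of 4K60\n", vr / desktop); // 3.0x
        return 0;
    }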
 
2020-09-18 4:18:07 PM  

freakdiablo: mongbiohazard: I mean, I want to be disappointed, but I have been building gaming PCs for more than a quarter century and I can't remember having ever actually used SLI/Crossfire.

I've bought mobos that could support it, with the intention that I could add a second graphics card later (at lower prices because it will have been out a few years) once my first one started showing its age. But that's not how it ever worked out. By that time there was pretty much always a newer card out that I could buy for a little more than the older card I'd need to SLI, and the performance of the single newer card would have been better than the two older ones SLI'd - meaning that I'd need to upgrade the two older cards pretty soon anyway, which would make it a waste of money.

For example, my last graphics card upgrade was going from a 780 Ti to an RTX 2080. Two 780 Tis in SLI would probably get crushed by my 2080. Instead I dedicated the 780 Ti to PhysX for a while, then took it out and installed it in one of my servers to assist with transcoding - with no noticeable reduction in speed of my gaming machine.

And seeing how few people IRL actually use SLI... I understand Nvidia not bothering to support it at their driver level anymore, being that newer graphics tech means developers can choose to include it in their games anyway if they think it makes sense.

I think SLI made the most sense to the few people who could afford to buy two of the latest cards at launch for max performance. Most PC gamers don't have that kind of money laying around.

Agreed. The only time I went with SLI was when I got a new 8800GT, then got a 1920x1200 monitor for Christmas. This was when a single upper-mid-range card would struggle at 1080/1200p, and getting a second 8800GT made more sense than going through the pain of selling the first one and stepping up to an Ultra.

These days I just stick to a single upper-tier card, and it's a lot easier.

That being said -

What DirectX 12 games support SLI natively within the game?
DirectX 12 titles include Shadow of the Tomb Raider, Civilization VI, Sniper Elite 4, Gears of War 4, Ashes of the Singularity: Escalation, Strange Brigade, Rise of the Tomb Raider, Zombie Army 4: Dead War, Hitman, Deus Ex: Mankind Divided, Battlefield 1, and Halo Wars 2.

Now it's been a while since I've played Civ, I think IV was the last I played.  But... It's not exactly a series I think of when I imagine "Graphically intense enough to need multiple GPUs".


Well, if you turn all the graphics options to max, every hex is fully 3D, animated, shaded, and occluded by clouds. My old 1050 Ti couldn't handle it on High (it would have melted on Ultra), and that card was new when Civ VI came out.
 
2020-09-18 4:28:08 PM  
I bought a Radeon R9 290X six years back. It still works great. It would probably not be happy with 4K gaming, but my monitor doesn't support that either, so I keep chugging along at 1080p. Newer cards are definitely faster (they ought to be, after nearly a decade), but eh.
 
2020-09-18 4:32:57 PM  
My understanding of dual-card use is that the videocard memory stays the same as far as the software is concerned. So if you use two 4GB VRAM cards, you don't have the equivalent of 8GB of VRAM. What you have is the processing power of those two cards, minus the overhead of it being a two-card setup rather than one card, with only 4GB of VRAM for all practical purposes. Did that change? As you increase resolution you need much more VRAM, or you have to reduce settings in game. Spending money on two cards and still having to play at reduced settings will turn people off.

I never thought about rendering because that's a specialized use that represents a small part of total sales for these cards. I don't think SLI could survive even if everyone who uses it for rendering always bought two cards. Selling cards to gamers is the equivalent of selling SUVs to car buyers; it generates the profit that subsidizes the development and sale of niche products. If it does nothing for gamers, then that tech is going to limp along until it dies.
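
That understanding matches how the classic alternate-frame-rendering mode worked: each card holds a full copy of every resource, so VRAM mirrors rather than pools. A small illustrative C++ model; the 4GB figure is from the comment, while the 80% scaling efficiency is an assumption for illustration:

    // Why mirrored VRAM stings: memory doesn't add up across cards, but cost does.
    #include <cstdio>

    int main() {
        const double perCardVramGB = 4.0;  // from the comment above
        const int    cards         = 2;
        const double scalingEff    = 0.8;  // assumed AFR efficiency, illustrative

        double usableVramGB = perCardVramGB;       // mirrored, NOT cards * 4 GB
        double throughput   = cards * scalingEff;  // relative to one card = 1.0x

        std::printf("Usable VRAM: %.0f GB (not %.0f GB)\n",
                    usableVramGB, (double)cards * perCardVramGB);
        std::printf("Best-case throughput: %.1fx a single card\n", throughput);
        return 0;
    }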
 
2020-09-18 8:50:11 PM  

Birnone: My understanding of dual-card use is that the videocard memory stays the same as far as the software is concerned. So if you use two 4GB VRAM cards, you don't have the equivalent of 8GB of VRAM. What you have is the processing power of those two cards, minus the overhead of it being a two-card setup rather than one card, with only 4GB of VRAM for all practical purposes. Did that change? As you increase resolution you need much more VRAM, or you have to reduce settings in game. Spending money on two cards and still having to play at reduced settings will turn people off.


You're right about the VRAM too; it was duplicated on each card, so that was yet another variable in the equation. In one of my setups I was able to get special doubled-up-memory edition GTX 580s with an astounding 3GB per card as a way to get around that.
 
2020-09-18 10:46:33 PM  
Heck, for those of us who render, you have to specifically turn off SLI to properly use your video cards for Iray rendering anyway.
 

This thread is closed to new comments.
