
(The Verge) Nvidia offers new two-year-old technology to address GPU shortage (theverge.com)
    More: Fail, CUDA, ATI Technologies, Scalable Link Interface, GPGPU, Graphics processing unit, much VRAM, secondary market, couple of low-cost options  
•       •       •

882 clicks; posted to STEM on 02 Dec 2021 at 2:00 PM



39 Comments
 
2021-12-02 12:13:45 PM  
FWIW, I'm on a 2060 right now and I've been blown away with its performance. I can run damn near anything at 2K res and max settings.

When I can't, I drop it to 1080 with max settings and no issues.

I rarely come across a game I can't play beautifully.
 
2021-12-02 12:23:49 PM  

HedlessChickn: FWIW, I'm on a 2060 right now and I've been blown away with its performance. I can run damn near anything at 2K res and max settings.

When I can't, I drop it to 1080 with max settings and no issues.

I rarely come across a game I can't play beautifully.


Yeah, the 2060 and 2070 are the sweet spot for 2K gaming. I have a 2070 gaming laptop from 2019, and short of Battlefield 2042, it does a great job on everything at 2K, and runs a lot of games at 4K with good quality. DLSS has helped performance out a lot.

Oddly enough, BF2042 is so CPU-bound that the 2070 laptop is basically unplayable, but an audio workstation I built with an AMD 5950x and 2 nVidia 1080s in SLI hits 60+ FPS in BF2042 at 4K with a combination of high and medium settings. In my experience, those 2 1080s in SLI were about 90% of the speed of the laptop variety 2070, so the CPU is definitely doing the heavy lifting there.
 
Xai
2021-12-02 2:02:00 PM  
Honestly, this is a fantastic idea because they can fabricate the chips in-house.
 
2021-12-02 2:08:11 PM  
If you're sitting there with a broken video card, do you want 2 year old technology or no card at all?
 
2021-12-02 2:08:20 PM  
But can I mine bitcoin with that?
 
2021-12-02 2:10:20 PM  

HedlessChickn: I can run damn near anything at 2K res and max settings.


Completely unrelated, but it bugs me that people call 2560x1440 "2K" now.  For common TV and PC resolutions, "4K" is "about 4000 pixels wide," usually 3840x2160.  "2K" should also be "about 2000 pixels wide," and 1920x1080 is the closest match.

In cinema standards, they're exactly 2K and 4K pixels wide.  Since the common TV/PC resolutions don't match up exactly, we pick close ones... and for quite a while, that's how the term was used. I don't know exactly when the split happened, and people started using it for 2560x1440, but it's weird.
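To put numbers on it, here's a tiny sketch (my own illustration, not from the article) comparing common display widths against the cinema-style "nK" naming:

```python
# Illustrative only: how close each common resolution's width is to "2K" or "4K".
resolutions = {
    "DCI 2K (cinema)": (2048, 1080),
    "FHD / 1080p":     (1920, 1080),
    "QHD / 1440p":     (2560, 1440),
    "DCI 4K (cinema)": (4096, 2160),
    "UHD / 2160p":     (3840, 2160),
}

for name, (width, height) in resolutions.items():
    # "nK" naming refers to the horizontal pixel count, in rough thousands.
    print(f"{name:16s} {width}x{height}  ~{width / 1000:.1f}K wide")
```

By width, 1920 is ~1.9K and 2560 is ~2.6K, which is the argument above in one loop.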
 
2021-12-02 2:10:59 PM  
Better than being stuck with a GTX 1030 or 710...
 
2021-12-02 2:11:32 PM  
I'm still running on a GTX 1070.  It's not the latest and greatest by any stretch, but nine times out of ten, it's honestly plenty.  I'd still like to upgrade just for those times where my current card can only run on medium settings, though, so this is a pretty nice compromise for a lower-cost option.

That said, this really only relieves the competition from people who already have cards at this level.  The main bottleneck is still scalpers and miners, and I'm skeptical that they won't snatch these up too.  If they're the best currently available, people like those won't turn their noses up just because it's not quite as good as the bleeding-edge cards.  It's a nice gesture on Nvidia's part, but doesn't address the underlying problem.
 
2021-12-02 2:11:41 PM  
I was overdue for a video card when I got my 2080 in late 2019. I'd been using a 780Ti before the upgrade, and I was kinda pissed at myself after buying it because I thought it was a better idea to wait for the 3000 series, but I just couldn't take the 780Ti's performance anymore (fine when it launched, but old by that point).

Best bad decision I ever made. Couldn't have seen THAT shiat coming.

The 2080 has been treating me well. 1080p on Cyberpunk 2077 with ray tracing and all the goodies on, and I still get 60fps in it, if memory serves (played through twice, but that was a while ago). Dips below that at 1440p, but still performs admirably. Sold me on ray tracing too; I thought it wasn't really all that big a deal before I got a chance to see it in action in CP2077. Looks real, real pretty.

GPUs are hard to come by in general. A 2060 is better than nothing by a long shot, and probably a decent upgrade for a lot of people who got screwed by the shortages. If you can get your hands on one at MSRP and it's faster than what you've got... why not?
 
2021-12-02 2:16:56 PM  

mongbiohazard: Best bad decision I ever made. Couldn't have seen THAT shiat coming.


I tripped and fell into the same lucky accident.  Bought a 2080Ti because it was the only card of that generation really capable of running all the raytracing bells and whistles at a playable framerate.  It was a dumb purchase that didn't otherwise justify the price premium over even a 2080, and I'd been regretting it right up until we all got shut in our houses and I played like 5x the video games for 18 months and then eventually sold the card for more than I'd originally paid for it.

Dumb decision at the time, turned into a good value during lockdown and then literally a free upgrade plus profit now that we're into the Great GPU Shortage.  I'll still admit it was a dumb purchase.  I just got lucky.
 
2021-12-02 2:24:21 PM  
Or

You could just buy AMD
 
2021-12-02 2:26:11 PM  

SuperChuck: If you're sitting there with a broken video card, do you want 2 year old technology or no card at all?


Yeah, this is a good card for that.  It's also built in a different fab (TSMC 12nm), which means it really will increase availability-- unlike adding another model in the 3000-series line which will just fight for the same factory time as the other 3000-series cards (Samsung 8nm).
 
2021-12-02 2:28:50 PM  

Linux_Yes: Or

You could just buy AMD


Yes... of course... that's exactly what a cat might say...

[image: external-preview.redd.it]
 
2021-12-02 2:31:53 PM  

raygundan: mongbiohazard: Best bad decision I ever made. Couldn't have seen THAT shiat coming.

I tripped and fell into the same lucky accident.  Bought a 2080Ti because it was the only card of that generation really capable of running all the raytracing bells and whistles at a playable framerate.  It was a dumb purchase that didn't otherwise justify the price premium over even a 2080, and I'd been regretting it right up until we all got shut in our houses and I played like 5x the video games for 18 months and then eventually sold the card for more than I'd originally paid for it.

Dumb decision at the time, turned into a good value during lockdown and then literally a free upgrade plus profit now that we're into the Great GPU Shortage.  I'll still admit it was a dumb purchase.  I just got lucky.


Just think of it as "retroactively smart".
 
2021-12-02 2:35:51 PM  
"it is a premium version of the RTX 2060 6GB and we expect the price to reflect that."

Yaaaaaaaaay, obscenely high video card prices solved.

Thank you Nvidia!
 
2021-12-02 2:54:24 PM  
I just paid about $1500 for a 3080; my computer picked a bad time to commit electrical seppuku. I was hoping to start a new build around the time of release for the 4000 series, but that is months away at best.
 
2021-12-02 3:01:42 PM  

mongbiohazard: raygundan: mongbiohazard: Best bad decision I ever made. Couldn't have seen THAT shiat coming.

I tripped and fell into the same lucky accident.  Bought a 2080Ti because it was the only card of that generation really capable of running all the raytracing bells and whistles at a playable framerate.  It was a dumb purchase that didn't otherwise justify the price premium over even a 2080, and I'd been regretting it right up until we all got shut in our houses and I played like 5x the video games for 18 months and then eventually sold the card for more than I'd originally paid for it.

Dumb decision at the time, turned into a good value during lockdown and then literally a free upgrade plus profit now that we're into the Great GPU Shortage.  I'll still admit it was a dumb purchase.  I just got lucky.

Just think of it as "retroactively smart".


Hooray for our mutual accidental success!
 
2021-12-02 3:08:07 PM  

Linux_Yes: Or

You could just buy AMD


Where?
 
2021-12-02 3:23:07 PM  

delsydsoftware: In my experience, those 2 1080s in SLI were about 90% of the speed of the laptop variety 2070,


What?

The DESKTOP 1080 and 2070 are on par with each other. A single desktop 1080 is certainly going to outperform a mobile 2070. Your 1080 SLI setup is severely broken.
 
2021-12-02 3:30:22 PM  
If miners will pay for expensive cards more than gamers why aren't they making their cards just for the miners and ignoring gaming completely? They're in business to make money aren't they?
 
2021-12-02 3:41:28 PM  
Sheesh.

I balked before paying $206.72 off Amazon for a Radeon RX 580 a little over a year ago.  Now that pansy card is listed there starting at $600.
 
2021-12-02 3:48:00 PM  

raygundan: HedlessChickn: I can run damn near anything at 2K res and max settings.

Completely unrelated, but it bugs me that people call 2560x1440 "2K" now.  For common TV and PC resolutions, "4K" is "about 4000 pixels wide," usually 3840x2160.  "2K" should also be "about 2000 pixels wide," and 1920x1080 is the closest match.

In cinema standards, they're exactly 2K and 4K pixels wide.  Since the common TV/PC resolutions don't match up exactly, we pick close ones... and for quite a while, that's how the term was used. I don't know exactly when the split happened, and people started using it for 2560x1440, but it's weird.


We really should label them by megapixel, especially when considering all of the non-16:9 widescreens out there.

1080p is 2MP
2560x1440 is 3.7MP
4K is 8MP

The number of pixels is going to be the key factor for GPU performance. (well, I guess color depth too)
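If you want to check the math, a quick sketch (rounded figures, my own arithmetic):

```python
# Pixel counts for the resolutions mentioned above, in megapixels.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "4K    (3840x2160)": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    megapixels = width * height / 1_000_000
    print(f"{name}: {megapixels:.1f} MP")
# Prints roughly 2.1, 3.7, and 8.3 MP -- close to the 2 / 3.7 / 8 figures above.
```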
 
2021-12-02 4:12:39 PM  
Like you'd be able to farking get one anyway...
 
2021-12-02 4:20:06 PM  

raygundan: mongbiohazard: Best bad decision I ever made. Couldn't have seen THAT shiat coming.

I tripped and fell into the same lucky accident.


Me three! Went from an i7-2600K/GTX 970 to an R9 3950X/2080 Super at the beginning of 2020. Been able to run everything on my 3440x1440 FreeSync monitor with basically no compromises. Sure came in handy when the lockdowns hit...
 
2021-12-02 4:42:51 PM  

Russ1642: If miners will pay for expensive cards more than gamers why aren't they making their cards just for the miners and ignoring gaming completely? They're in business to make money aren't they?


They do make mining-only cards. Been making them for a few years already.

This $5000 Graphics Card Can't Game
https://www.youtube.com/watch?v=EcGkF9SBuSo
 
2021-12-02 4:46:33 PM  

Randrew: Sheesh.

I balked before paying $206.72 off Amazon for a Radeon RX 580 a little over a year ago.  Now that pansy card is listed there starting at $600.


I tried to replace the ancient ATI Radeon X1650 Pro (yes, I wrote that right - ATI, not AMD) that's in one of my servers with a cheap modern graphics card. Just something good enough for the machine to boot and attach a monitor to. In normal times that would be a $35 issue. A few months ago, when I looked, I decided that maybe ATI can live on just a little bit longer over here. Cheap cards are hard to find; I mostly found low-profile ones, without adapters.
 
2021-12-02 4:52:22 PM  

madgonad: delsydsoftware: In my experience, those 2 1080s in SLI were about 90% of the speed of the laptop variety 2070,

What?

The DESKTOP 1080 and 2070 are on par with each other. A single desktop 1080 is certainly going to outperform a mobile 2070. Your 1080 SLI setup is severely broken.


SLI scaling doesn't work the way you think it does. Two cards in SLI don't give a 100% increase in performance; it's more like 20-40%, depending on the architecture of the card and the game you're running. nVidia doesn't make SLI profiles anymore, having basically dropped support for SLI altogether, so performance can vary wildly between older and modern games. If there isn't a good SLI profile for a game, alternate frame rendering is used, which is not as efficient and causes performance losses. Newer games are also optimized for RTX, and the 1080 was the last of the non-RTX cards. That system benchmarks extremely well, but good benchmarks don't always translate into better real-world performance.
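As a rough illustration (placeholder numbers, just to show the scaling math described above):

```python
# Hypothetical example of SLI scaling as described above; the baseline FPS is made up.
def sli_effective_fps(single_card_fps: float, scaling: float) -> float:
    """A second card adds only a fraction of one card's performance, not 100%."""
    return single_card_fps * (1 + scaling)

baseline = 60.0  # pretend a single GTX 1080 gets 60 fps in some game
for scaling in (0.0, 0.2, 0.4):
    print(f"SLI scaling {scaling:.0%}: ~{sli_effective_fps(baseline, scaling):.0f} fps")
# 0% is what you get with no working SLI profile; 20-40% is the typical best case.
```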
 
2021-12-02 5:48:40 PM  

Russ1642: If miners will pay for expensive cards more than gamers why aren't they making their cards just for the miners and ignoring gaming completely? They're in business to make money aren't they?


Mostly because the differences between a card designed for mining and a card designed for gaming are very small-- just a couple of bucks worth of video connectors you can remove.

But if the prices are only a little bit different, miners will prefer gaming cards because when they're no longer profitable for mining they can still be sold used to gamers.  An unprofitable mining card doesn't have a used market.
 
2021-12-02 6:01:32 PM  

delsydsoftware: madgonad: delsydsoftware: In my experience, those 2 1080s in SLI were about 90% of the speed of the laptop variety 2070,

What?

The DESKTOP 1080 and 2070 are on par with each other. A single desktop 1080 is certainly going to outperform a mobile 2070. Your 1080 SLI setup is severely broken.

SLI scaling doesn't work the way you think it does. Two cards in SLI don't give a 100% increase in performance; it's more like 20-40%, depending on the architecture of the card and the game you're running. nVidia doesn't make SLI profiles anymore, having basically dropped support for SLI altogether, so performance can vary wildly between older and modern games. If there isn't a good SLI profile for a game, alternate frame rendering is used, which is not as efficient and causes performance losses. Newer games are also optimized for RTX, and the 1080 was the last of the non-RTX cards. That system benchmarks extremely well, but good benchmarks don't always translate into better real-world performance.


Yep. SLI was always pretty niche, as only so many people can afford two high-end cards. I'd always plan to add a second one later down the road, when they were cheaper and there were newer cards out, but every single time I got around to doing it, it made more sense to just buy one current card instead. Great in theory, not so much in practice.

And particularly in more recent years, performance gains from doing it weren't all that great. In some specific cases SLI actually harmed performance.
 
2021-12-02 6:02:48 PM  

raygundan: Russ1642: If miners will pay for expensive cards more than gamers why aren't they making their cards just for the miners and ignoring gaming completely? They're in business to make money aren't they?

Mostly because the differences between a card designed for mining and a card designed for gaming are very small-- just a couple of bucks worth of video connectors you can remove.

But if the prices are only a little bit different, miners will prefer gaming cards because when they're no longer profitable for mining they can still be sold used to gamers.  An unprofitable mining card doesn't have a used market.


Check out that LTT video I linked. Seems like there's more to that card than just not including an HDMI connector or two.
 
2021-12-02 6:21:35 PM  

Randrew: Sheesh.

I balked before paying $206.72 off Amazon for a Radeon RX 580 a little over a year ago.  Now that pansy card is listed there starting at $600.


Whoa. That's exactly what I bought back then on Amazon.

1920x1080 is fine for me so it's more than good enough. I'm astounded how much it has gone up in price.
 
2021-12-02 7:13:48 PM  

mongbiohazard: Check out that LTT video I linked. Seems like there's more to that card than just not including an HDMI connector or two.


That one is based on a datacenter deep-learning chip, instead of a gaming chip-- but the idea's the same.  It's got almost no substantial differences between it and the datacenter card it's based on.

But the problem remains-- that $5000 card is only a little bit faster at mining than the biggest gaming card, the 3090 (when they were selling them without attempts to limit mining speed).  158MH/s vs. 125MH/s.

You can buy three 3090s for less than the price of that thing, which will give you roughly double the profit per day mining (this accounts for the lower efficiency of the 3090-- but you have so many of them for the price you come out ahead). And then turn around and sell them to gamers when you're done.

That datacenter-based mining card has the same issue as one based on a gaming SKU does-- it's no good for anything but mining. When it's no longer profitable to mine, nobody will want it.

Long story short, the idea of "mining cards" is mostly not appealing to miners, because they end up with lower profit and no resale.
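Back-of-the-envelope, using the hash rates above (the per-3090 price here is an assumption, roughly its $1,499 MSRP):

```python
# Rough comparison of the $5,000 mining card vs. buying 3090s with the same budget.
# Hash rates are the ones quoted above; power costs and availability are not modeled.
mining_card_price = 5000          # USD
mining_card_hashrate = 158        # MH/s

gpu_price = 1500                  # USD, assumed per-3090 price (~MSRP)
gpu_hashrate = 125                # MH/s per 3090

n_gpus = mining_card_price // gpu_price       # 3 cards for the same money
total_gpu_hashrate = n_gpus * gpu_hashrate    # 375 MH/s

print(f"{n_gpus} x 3090 = {total_gpu_hashrate} MH/s vs. 1 mining card = {mining_card_hashrate} MH/s")
print(f"Raw hashrate ratio: {total_gpu_hashrate / mining_card_hashrate:.1f}x")
# ~2.4x the raw hashrate; knock some off for the 3090s' worse efficiency and you land
# around the "roughly double the profit per day" mentioned above -- plus resale value later.
```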
 
2021-12-02 7:27:28 PM  
mongbiohazard:

Yep. SLI was always pretty niche, as only so many people can afford two high-end cards. I'd always plan to add a second one later down the road, when they were cheaper and there were newer cards out, but every single time I got around to doing it, it made more sense to just buy one current card instead. Great in theory, not so much in practice.

And particularly in more recent years, performance gains from doing it weren't all that great. In some specific cases SLI actually harmed performance.


I got a good deal on the second card, so it was worth it. I had them in SLI in an old AMD 8350 system for years, and they were great. When I built this audio workstation, I thought I might as well throw them in to see how it performed. They're still pretty good. If I could replace them with a single RTX card I would, but that won't happen for a while. It was really just supposed to be a CPU- and memory-heavy workstation for music production, but then I had to install some games on it :)

I had CrossFired ATI 5770s back in the day, and those had something like 80% scaling, so they really worked well. The tradeoff was loads of microstuttering. With FreeSync and G-Sync, it's not even a problem now, but damn, that was annoying.
 
2021-12-02 9:49:15 PM  

basscomm: Linux_Yes: Or

You could just buy AMD

Where?


Microcenter AMD case is always full.

6900xt mines 62. 3060ti mines more than that.
 
2021-12-02 10:50:41 PM  
Hi, I'm a console gamer but I just wanted to say hi and that I don't understand anything in the thread but it's fun to read.
 
2021-12-02 11:38:47 PM  

mr0x: basscomm: Linux_Yes: Or

You could just buy AMD

Where?

Microcenter AMD case is always full.

6900xt mines 62. 3060ti mines more than that.



That would be great if I lived anywhere near a Microcenter. And had $1,500 to blow on a video card.
 
2021-12-02 11:49:43 PM  

king of vegas: Hi, I'm a console gamer but I just wanted to say hi and that I don't understand anything in the thread but it's fun to read.


Hi, I'm not a gamer at all.
 
2021-12-03 3:05:39 AM  

mr0x: basscomm: Linux_Yes: Or

You could just buy AMD

Where?

Microcenter AMD case is always full.

6900xt mines 62. 3060ti mines more than that.


Thank fark I have a Microcenter when I need it.
 
2021-12-03 10:41:47 AM  
[image: globalnerdy.com]
 
Displayed 39 of 39 comments

This thread is closed to new comments.