
(YouTube) Hardware Jesus takes on Nvidia's RTX 3090 8K gaming claims (youtube.com)

667 clicks; posted to Fandom » and STEM » on 24 Sep 2020 at 1:21 PM



53 Comments

 
2020-09-24 11:11:53 AM  
As predicted, it's narrowly more capable for gaming than a 3080, and that headway disappears if you tweak a 3080 to a more suitable OC.  Definitely not worth the price tag vs a 3080 if your primary goal is gaming, like mine is.

So far it's still hard to justify going from a 2080 Ti to either of them, but if I were itching to upgrade from a 10xx or lower, the 3080 looks like a great card.  There are still other iterations that will show up later, like the leaked plans for a 20GB edition of the 3080. I've only met one game myself that would eat that, and that's Ark, but I bet FS2020 would love this stuff too.
 
2020-09-24 1:27:40 PM  
Vettel stuck behind Russell?  Man, those graphics are hyper-realistic!
 
2020-09-24 1:37:57 PM  

BumpInTheNight: As predicted, it's narrowly more capable for gaming than a 3080, and that headway disappears if you tweak a 3080 to a more suitable OC.  Definitely not worth the price tag vs a 3080 if your primary goal is gaming, like mine is.

So far it's still hard to justify going from a 2080 Ti to either of them, but if I were itching to upgrade from a 10xx or lower, the 3080 looks like a great card.  There are still other iterations that will show up later, like the leaked plans for a 20GB edition of the 3080. I've only met one game myself that would eat that, and that's Ark, but I bet FS2020 would love this stuff too.


I'm finding it difficult to justify going from a 2080, because I'd need to buy a 4k monitor to truly get the benefits. On top of that, 27 inch 1440p at 144hz is still the sweet spot, imo, and 4k IPS 144hz monitors are expensive as fark.

Aside from probably not being able to purchase one of these things for months and months, I think the smart move is to wait. The 3090 is not a full-fat chip. It's cut down and will likely get an additional processing unit in some sort of 3090 TI or TITAN within 6 months.

There is a TON of room in the pricing and architecture for both a 3090 TI and a 3080 TI. All they need to do for the 3080 TI is, maybe, activate a few more cores, slap 12gb of memory on it and sell it for $1,000.00. That might be the card to wait for here.
 
2020-09-24 1:39:40 PM  
I'm not buying shiat until I see what Big Navi can do.
 
2020-09-24 1:44:07 PM  
This is good news for me. Built a brand-new top-range PC with the only reused piece being my 1080 Ti. Sounds like I'll put the cherry on top with a new 3080 and be happy for years to come.
 
2020-09-24 1:46:03 PM  

fragMasterFlash: I'm not buying shiat until I see what Big Navi can do.


This is wise, but I don't think AMD will be competitive in this space for a couple of years. Nvidia doesn't have the same issues as Intel. The 3080 is a great card for the price. It was certainly priced that way to kneecap AMD's offerings. We'll see. It would be great to have competition again. I still can't believe I went with AMD for a CPU on my most recent build.
 
2020-09-24 1:47:07 PM  

fragMasterFlash: I'm not buying shiat until I see what Big Navi can do.


Has ATI fixed their driver issues? I hated each and every ATI card I've ever bought due to driver issues; they were a constant pain in the ass, and I never had that issue with Nvidia. Still have nightmares about not being able to use anti-aliasing in CoH because of an ATI card.
 
2020-09-24 2:01:42 PM  

BumpInTheNight: As predicted, it's narrowly more capable for gaming than a 3080, and that headway disappears if you tweak a 3080 to a more suitable OC.  Definitely not worth the price tag vs a 3080 if your primary goal is gaming, like mine is.

So far it's still hard to justify going from a 2080 Ti to either of them, but if I were itching to upgrade from a 10xx or lower, the 3080 looks like a great card.  There are still other iterations that will show up later, like the leaked plans for a 20GB edition of the 3080. I've only met one game myself that would eat that, and that's Ark, but I bet FS2020 would love this stuff too.


Not for mainstream gaming, but this gives people who install high-resolution mods a lot of space for the textures.
I can't justify it, but a hardcore Skyrim VR player can.

I'm hoping for a 3080 Ti that will be worth upgrading to from my 2080 Ti. I might just get the 3080 if a Ti isn't announced soon.
 
2020-09-24 2:12:31 PM  
I'm gonna wait to see what Metal Jesus says.
 
2020-09-24 2:13:30 PM  
I'm more inclined to save for a 3080 since I now have a 1070ti.  Then I'll have room to upgrade my monitors.
 
2020-09-24 2:18:57 PM  

fragMasterFlash: I'm not buying shiat until I see what Big Navi can do.


This is also my position, even as someone who's bought exclusively Nvidia for over a decade (for performance reasons). If this card can actually compete in the halo-tier space, that will mean Nvidia is forced to reduce prices, and that's a good thing for everyone.

drewsclues: I'm finding it difficult to justify going from a 2080, because I'd need to buy a 4k monitor to truly get the benefits.


I get yah; for me I went 4K pretty early, and my 2080 Ti can struggle at times with some games at that resolution just keeping 60 fps, so a 3080 is in my future.  But we'll see if I can hold off long enough for a 3080 Ti.
 
2020-09-24 2:22:25 PM  
I've got a 2080 ti but I think the only game I've used it for is Kerbal Space Program. Otherwise I'm doing CAD stuff. I figure if I can do that for a hobby there are shiatloads more gamers with more money to spend on hardware. Makes sense for Nvidia to market to them. However, nobody's doing 8k anything yet other than maybe video editing for future compatibility, but they're always using proxies so they won't even have 8k monitors. And gamers aren't going to drastically drop their framerates just to go from 4k to 8k.
 
2020-09-24 2:34:37 PM  
The 3080 definitely seems to be the sweet spot for new builds.  Hoping that I can score one (and that the driver issues are fixed) by the time my Reverb G2 preorder arrives. 

So... stick with the 3080 and use the extra cash to go toward a YawVR
 
2020-09-24 2:40:20 PM  
I'm staying with 1080 Tis until there's a non-Threadripper 32-core AMD desktop.
 
2020-09-24 2:41:16 PM  

Luse: fragMasterFlash: I'm not buying shiat until I see what Big Navi can do.

Has ATI fixed their driver issues? I hated each and every ATI card I've ever bought due to driver issues; they were a constant pain in the ass, and I never had that issue with Nvidia. Still have nightmares about not being able to use anti-aliasing in CoH because of an ATI card.


Hate to break it to you, but ATi ain't doing jack these days.  They were bought out by AMD years ago.
 
2020-09-24 2:59:11 PM  
But will it play Crysis?
 
2020-09-24 3:00:19 PM  

BumpInTheNight: As predicted, it's narrowly more capable for gaming than a 3080, and that headway disappears if you tweak a 3080 to a more suitable OC.  Definitely not worth the price tag vs a 3080 if your primary goal is gaming, like mine is.

So far it's still hard to justify going from a 2080 Ti to either of them, but if I were itching to upgrade from a 10xx or lower, the 3080 looks like a great card.  There are still other iterations that will show up later, like the leaked plans for a 20GB edition of the 3080. I've only met one game myself that would eat that, and that's Ark, but I bet FS2020 would love this stuff too.


ordered my 3090 today.  I see the articles and benchmarks saying only 10% increase, but I think I'm future-proofing (I am one of those people on a 1080) some more than with a 3080, and I'll be able to OC the 3090 to keep pace with the OC'd 3080s.  I also just ordered a 48" OLED to use as a monitor and plan to run FS2020 at 4k ultra settings so I'm ok with the cost.  Plus it's all a gift to myself after finishing a grueling 5 year project.
 
2020-09-24 3:01:47 PM  
Was planning to treat myself to a new 4K TV for Christmas, so will likely upgrade the gaming rig attached to it with the 3080 at the same time. It's currently got a 1070 in it, so it's past due for an upgrade... after making sure the motherboard and power supply can actually handle it
 
2020-09-24 3:02:38 PM  

Malenfant: I'm hoping for a 3080 Ti that will be worth upgrading to from my 2080 Ti. I might just get the 3080 if a Ti isn't announced soon.


I doubt there will be a Ti series, but there might be. Right now a 3080 is already roughly 75% more powerful than a 2080Ti, 100% for a 2080. 

I wouldn't get the 3090. A 10-15% fps increase isn't big enough for a 100% price increase. It can do 8K gaming at roughly 30 fps, and the cheapest 8K screens are $4000+.
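
A rough back-of-the-envelope sketch of that price/performance argument, in Python. It assumes the US launch MSRPs ($699 for the 3080, $1,499 for the 3090) and a ~12% average fps uplift, i.e. the middle of the 10-15% range cited above; street prices and per-game results will obviously vary.

# Price/performance sketch for the 3080-vs-3090 argument above.
# Assumed: US launch MSRPs and a ~12% average fps uplift for the 3090.

def perf_per_dollar(relative_fps, price_usd):
    """Relative performance per dollar, with the 3080 normalized to 1.0 fps."""
    return relative_fps / price_usd

rtx_3080 = perf_per_dollar(1.00, 699)    # baseline card
rtx_3090 = perf_per_dollar(1.12, 1499)   # ~12% faster at ~2.1x the price

print(f"3080: {rtx_3080:.5f} relative fps per dollar")
print(f"3090: {rtx_3090:.5f} relative fps per dollar")
print(f"3090 value per dollar vs the 3080: {rtx_3090 / rtx_3080:.0%}")   # ~52%

On those assumptions the 3090 delivers roughly half the frames per dollar of the 3080, which is the gist of the argument above.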
 
2020-09-24 3:05:06 PM  
I realize they tested PCIe 4.0 vs 3.0 on the 3080 and only saw a couple of percent (if that) of gains.... but maybe the 3090 shines on PCIe 4.0? We don't know because they did all their testing on 3.0.

I feel like pointing that out to Gamers Nexus might just make somebody over there pop a blood vessel, before they scrap the whole review to recertify their tests on 4.0. Heh heh.
 
2020-09-24 3:10:18 PM  

PawisBetlog: BumpInTheNight: As predicted, it's narrowly more capable for gaming than a 3080, and that headway disappears if you tweak a 3080 to a more suitable OC.  Definitely not worth the price tag vs a 3080 if your primary goal is gaming, like mine is.

So far it's still hard to justify going from a 2080 Ti to either of them, but if I were itching to upgrade from a 10xx or lower, the 3080 looks like a great card.  There are still other iterations that will show up later, like the leaked plans for a 20GB edition of the 3080. I've only met one game myself that would eat that, and that's Ark, but I bet FS2020 would love this stuff too.

ordered my 3090 today.  I see the articles and benchmarks saying only 10% increase, but I think I'm future-proofing (I am one of those people on a 1080) some more than with a 3080, and I'll be able to OC the 3090 to keep pace with the OC'd 3080s.  I also just ordered a 48" OLED to use as a monitor and plan to run FS2020 at 4k ultra settings so I'm ok with the cost.  Plus it's all a gift to myself after finishing a grueling 5 year project.


That sounds like a great plan.

I wish to warn you about the OLED though:  I bought an OLED for a gaming/daily monitor two years ago, and within 8 months I had burned in my browser since I'd leave it on for like 6-8 hours a day, typically on Fark.  Just a heads up, OLED burn-in (or rather dimming) is very real, and it really sucks.  I'm using a 49" 4K QLED now; not as great, but I won't kill it either.
 
2020-09-24 3:13:36 PM  

LesserEvil: I realize they tested PCIe 4.0 vs 3.0 on the 3080 and only saw a couple of percent (if that) of gains.... but maybe the 3090 shines on PCIe 4.0? We don't know because they did all their testing on 3.0.

I feel like pointing that out to Gamers Nexus might just make somebody over there pop a blood vessel, before they scrap the whole review to recertify their tests on 4.0. Heh heh.


The 3090 on PCIe 4 isn't much improved either; in some games they saw decreases of a few frames. PCIe 4 is much more important for the next-gen storage devices that hit 7000 MB/s speeds (launching in October).
 
2020-09-24 3:16:01 PM  

BumpInTheNight: PawisBetlog: BumpInTheNight: As predicted, it's narrowly more capable for gaming than a 3080, and that headway disappears if you tweak a 3080 to a more suitable OC.  Definitely not worth the price tag vs a 3080 if your primary goal is gaming, like mine is.

So far it's still hard to justify going from a 2080 Ti to either of them, but if I were itching to upgrade from a 10xx or lower, the 3080 looks like a great card.  There are still other iterations that will show up later, like the leaked plans for a 20GB edition of the 3080. I've only met one game myself that would eat that, and that's Ark, but I bet FS2020 would love this stuff too.

ordered my 3090 today.  I see the articles and benchmarks saying only 10% increase, but I think I'm future-proofing (I am one of those people on a 1080) some more than with a 3080, and I'll be able to OC the 3090 to keep pace with the OC'd 3080s.  I also just ordered a 48" OLED to use as a monitor and plan to run FS2020 at 4k ultra settings so I'm ok with the cost.  Plus it's all a gift to myself after finishing a grueling 5 year project.

That sounds like a great plan.

I wish to warn you about the OLED though:  I bought an OLED for a gaming/daily monitor two years ago, and within 8 months I had burned in my browser since I'd leave it on for like 6-8 hours a day, typically on Fark.  Just a heads up, OLED burn-in (or rather dimming) is very real, and it really sucks.  I'm using a 49" 4K QLED now; not as great, but I won't kill it either.


Thanks for the tip!  I have my PC set to time the screen out at 10 minutes, so hopefully that will help.
 
2020-09-24 3:41:35 PM  

PawisBetlog: BumpInTheNight: PawisBetlog: BumpInTheNight: As predicted, it's narrowly more capable for gaming than a 3080, and that headway disappears if you tweak a 3080 to a more suitable OC.  Definitely not worth the price tag vs a 3080 if your primary goal is gaming, like mine is.

So far it's still hard to justify going from a 2080 Ti to either of them, but if I were itching to upgrade from a 10xx or lower, the 3080 looks like a great card.  There are still other iterations that will show up later, like the leaked plans for a 20GB edition of the 3080. I've only met one game myself that would eat that, and that's Ark, but I bet FS2020 would love this stuff too.

ordered my 3090 today.  I see the articles and benchmarks saying only 10% increase, but I think I'm future-proofing (I am one of those people on a 1080) some more than with a 3080, and I'll be able to OC the 3090 to keep pace with the OC'd 3080s.  I also just ordered a 48" OLED to use as a monitor and plan to run FS2020 at 4k ultra settings so I'm ok with the cost.  Plus it's all a gift to myself after finishing a grueling 5 year project.

That sounds like a great plan.

I wish to warn you about the OLED though:  I bought an OLED for a gaming/daily monitor two years ago, and within 8 months I had burned in my browser since I'd leave it on for like 6-8 hours a day, typically on Fark.  Just a heads up, OLED burn-in (or rather dimming) is very real, and it really sucks.  I'm using a 49" 4K QLED now; not as great, but I won't kill it either.

Thanks for the tip!  I have my PC set to time the screen out at 10 minutes, so hopefully that will help.


Yah, I really do live at my computer a lot, so it was getting abused.  It was really the address bar and a nice vertical bar where Fark's comments are a white body.  Oh, and my taskbar icons.  I'd recommend you get some secondary monitor, whatever kind, and put your taskbar and desktop icons on it; I think that'll go a long way toward preserving the life of those screens when used as PC monitors.
 
2020-09-24 3:48:05 PM  

Saiga410: But will it play Crysis? MS Flight Simulator 2020


FTFY.

Early yet, but it's looking like MSFS 2020 is highly scalable and 'future proof' enough to make current PCs chug.

https://www.overclock3d.net/reviews/software/microsoft_flight_simulator_pc_performance_review_and_optimisation_guide/1
 
2020-09-24 3:57:07 PM  
I also think it is a good idea to wait a few months for the revision A versions to come out. They always need to tweak something in the designs. I will look at what is available at Xmas and maybe make a new card a present to myself. My AMD 5700 XT is only 4 months old and fits my needs for now. I rarely play brand-new AAA games.
 
2020-09-24 4:15:58 PM  
I've always bought Intel and Nvidia (or Voodoo, way back when).

I'm struggling to see a good path forward right now.  I have a CPU three generations behind (a 7700K), so it struggles to support the 2080 Ti and has to be OC'd (by 17%, with water cooling) to avoid a bottleneck.

Right now I'm not biting on the 3080 because I can't handle it without a CPU/mobo update.

I've always wanted to give AMD a try.  I think they are cheaper, and I like that they support PCIe 4, but are they hotter and do they consume more power?  Also, is there any hope of next-gen DDR support?

Maybe if Intel drops a new CPU with mobos that can handle PCIe 4 and the next DDR, I'll stick with them and grab a 3080 Ti about the same time.
 
2020-09-24 4:19:59 PM  

freakdiablo: Luse: fragMasterFlash: I'm not buying shiat until I see what Big Navi can do.

Has ATI fixed their driver issues? I hated each and every ATI card I've ever bought due to driver issues; they were a constant pain in the ass, and I never had that issue with Nvidia. Still have nightmares about not being able to use anti-aliasing in CoH because of an ATI card.

Hate to break it to you, but ATi ain't doing jack these days.  They were bought out by AMD years ago.


No worries. After the last AMD/ATI setup I had, I built my Intel/Nvidia one and loved it. The new build is an Intel 9900K, and the card will be a 3080 from what it looks like; going to try to hold out for the Ti version should one drop.
 
2020-09-24 4:29:48 PM  

aungen: I've always bought Intel and Nvidia (or Voodoo, way back when).

I'm struggling to see a good path forward right now.  I have a CPU three generations behind (a 7700K), so it struggles to support the 2080 Ti and has to be OC'd (by 17%, with water cooling) to avoid a bottleneck.

Right now I'm not biting on the 3080 because I can't handle it without a CPU/mobo update.

I've always wanted to give AMD a try.  I think they are cheaper, and I like that they support PCIe 4, but are they hotter and do they consume more power?  Also, is there any hope of next-gen DDR support?

Maybe if Intel drops a new CPU with mobos that can handle PCIe 4 and the next DDR, I'll stick with them and grab a 3080 Ti about the same time.


AMD has announcements early October.  I genuinely believe that this gen coming next month will reach at least parity with Intel's from a gaming-CPU standpoint, and price-wise it'll be cheaper, no doubt.

Heat/power? Nah. In years past, yah, AMD's processors were slow and hot, but Ryzen has changed all of that.

As a long-time Intel proc purchaser, I firmly believe my 9900Ks are the last ones I'll be buying for a while; AMD played a mean game of catch-up and Intel has floundered badly as of late.  We're about to see AMD overtake Intel on the CPU front.  And we might even see AMD approach Nvidia for top-tier GPUs too with Big Navi, though I'm not as hopeful about that one as I am about their CPUs.
 
2020-09-24 5:10:35 PM  

phimuskapsi: Right now a 3080 is already roughly 75% more powerful than a 2080Ti


WHAT?????????  NO. Read the graphs:

https://www.techspot.com/review/2105-geforce-rtx-3090/
 
2020-09-24 5:55:54 PM  

FarkingChas: phimuskapsi: Right now a 3080 is already roughly 75% more powerful than a 2080Ti

WHAT?????????  NO. Read the graphs:

https://www.techspot.com/review/2105-geforce-rtx-3090/


I misread the names. Mea culpa.

"The RTX 3080 was a whopping 43% faster than the 2080 Ti, a massive performance uplift. It's also 83% faster than the 2080 and 92% faster than the 1080 Ti. Those are some of the best margins we've seen so far and certainly appear to be a best case scenario for the RTX 3080. "

43% is still pretty massive, and you have to understand that driver development for these cards is in its infancy; they will continue to improve.
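
Treating those quoted review figures as average relative fps (a simplification; they are aggregate numbers from a single review), a quick sketch shows how the percentages relate and what they imply about the gaps between the older cards:

# Sanity check on the figures quoted above: the 3080 is 43% faster than a
# 2080 Ti, 83% faster than a 2080, and 92% faster than a 1080 Ti.
# Normalize the 3080 to 1.0 and derive the implied gaps between older cards.

rtx_3080    = 1.00
rtx_2080_ti = rtx_3080 / 1.43
rtx_2080    = rtx_3080 / 1.83
gtx_1080_ti = rtx_3080 / 1.92

print(f"2080 Ti over 2080:  {rtx_2080_ti / rtx_2080 - 1:.0%}")    # ~28%
print(f"2080 over 1080 Ti:  {rtx_2080 / gtx_1080_ti - 1:.0%}")    # ~5%
print(f"3080 over 2080 Ti:  {rtx_3080 / rtx_2080_ti - 1:.0%}")    # 43% by construction

In other words, those numbers imply the 2080 Ti was only about 28% ahead of the 2080, which is why a 43% jump over the 2080 Ti reads as a large generational gain.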
 
2020-09-24 6:02:42 PM  

phimuskapsi: FarkingChas: phimuskapsi: Right now a 3080 is already roughly 75% more powerful than a 2080Ti

WHAT?????????  NO. Read the graphs:

https://www.techspot.com/review/2105-geforce-rtx-3090/

I misread the names. Mea culpa.

"The RTX 3080 was a whopping 43% faster than the 2080 Ti, a massive performance uplift. It's also 83% faster than the 2080 and 92% faster than the 1080 Ti. Those are some of the best margins we've seen so far and certainly appear to be a best case scenario for the RTX 3080. "

43% is still pretty massive, and you have to understand that driver development for these cards is in its infancy; they will continue to improve.


43% is still quite generous.  That's the best-case increase I've been seeing; 25-43% is more of a realistic, broader-application delta.  Just going by a bunch of benchmark sites like Tom's Hardware and Guru3D.  I'm only really looking at the 4K results too; beneath that it's less of a delta.  No one in their right mind should buy these to pair with a 1080p display, for instance, hah.

Remember when each generation's halo cards were a good +50-75% faster than the previous?  I do :)  I mean, we're hitting some truly incredible values here regardless, and even though I've got a 2080 Ti, knowing I can buy a 3080 (some day, when stock permits) for less than I paid for the 2080 Ti is nice.
 
2020-09-24 6:27:34 PM  

BumpInTheNight: So far it's still hard to justify going from a 2080 Ti to either of them


For a gamer, yes. For me, it was Tuesday-easy. My main rig sports two 2080 Tis. Why? To run Freecell at 8,000,000,000,000 frames per second? No. To render. The 3090 is an easy choice if your primary usage is 3D rendering. Over twice the CUDA cores, twice the VRAM and 70% of the electrical draw? Easy choice.

The 3090 isn't for gamers. Not really. Yes, the "I want the best at all cost" hipster douchebags will buy them, but really, they're for Blender/Studio/Maya/etc... users like me.
 
2020-09-24 6:30:45 PM  

Ed Grubermann: Over twice the CUDA cores, twice the VRAM and 70% of the electrical draw?


Farked that up. Twice the CUDA cores and VRAM of a single 2080ti, and 70% of the electrical draw of both my 2080ti's (rendering software does not SLI, so card VRAM does not stack).
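
For reference, those ratios roughly check out against the published reference specs (2080 Ti: 4352 CUDA cores, 11 GB, ~250 W; 3090: 10496 CUDA cores, 24 GB, ~350 W). Note that Ampere counts its doubled FP32 units as "CUDA cores", so the core ratio overstates the real-world gap, and board-partner cards draw more; treat this as a rough sketch.

# Rough ratios behind the comparison above, using reference specs.
cores_2080ti, vram_gb_2080ti, watts_2080ti = 4352, 11, 250
cores_3090,   vram_gb_3090,   watts_3090   = 10496, 24, 350

print(f"CUDA cores vs one 2080 Ti: {cores_3090 / cores_2080ti:.1f}x")       # ~2.4x
print(f"VRAM vs one 2080 Ti:       {vram_gb_3090 / vram_gb_2080ti:.1f}x")   # ~2.2x
print(f"Power vs two 2080 Tis:     {watts_3090 / (2 * watts_2080ti):.0%}")  # 70%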
 
2020-09-24 6:56:26 PM  
Honestly... my monitor is 1080p and 60hz, and it's fine. Maybe I just don't know what I'm missing, but I really can't imagine going higher is worth the cost and the fuss.

Maybe if it was taken to the extreme, with 4k at 120 FPS, I'd never be able to go back. So it's just as well that I don't.
 
2020-09-24 7:02:47 PM  

Ed Grubermann: Ed Grubermann: Over twice the CUDA cores, twice the VRAM and 70% of the electrical draw?

Farked that up. Twice the CUDA cores and VRAM of a single 2080ti, and 70% of the electrical draw of both my 2080ti's (rendering software does not SLI, so card VRAM does not stack).


Absolutely, if your professional use case can consume those 3090s they are just ridiculous stats-wise.  Been curious for a while now: are the Quadro cards really getting sidelined in favour of these Titans and now the 3090, or do you still see a substantial benefit (at a massive price jump) in using the genuine 'workstation' cards from the Quadro family anymore?

Server-wise I've packed some with Tesla cards for creating tight GPU-accelerated VDI clusters for heavy-ass users that are doing AutoCAD etc., same for machine learning applications.  But I haven't seen a whole lot of workstation cards talked about at all as of late.  Like, did people finally get fed up with Quadro price points and just pack more Titans into the same box for the same effect?
 
2020-09-24 7:03:38 PM  

Puglio: Honestly... my monitor is 1080p and 60hz, and it's fine. Maybe I just don't know what I'm missing, but I really can't imagine going higher is worth the cost and the fuss.

Maybe if it was taken to the extreme, with 4k at 120 FPS, I'd never be able to go back. So it's just as well that I don't.


I look back at some of my 1080p screenshots... yah, stay there as long as you can, because once you start going higher it's a new form of crack ;)
 
2020-09-24 7:52:43 PM  

BumpInTheNight: AMD has announcements early October. I genuinely believe that this gen coming next month will reach at least parity with Intel's from a gaming-CPU standpoint, and price-wise it'll be cheaper, no doubt.


Leaked rumors put it at a performance level between a 3070 and a 3080, closer to 3080 levels, and priced about $50 under the 3080.
 
2020-09-24 7:55:29 PM  

Abe Vigoda's Ghost: BumpInTheNight: AMD has announcements early October. I genuinely believe that this gen coming next month will reach at least parity with Intel's from a gaming-CPU standpoint, and price-wise it'll be cheaper, no doubt.

Leaked rumors put it at a performance level between a 3070 and a 3080, closer to 3080 levels, and priced about $50 under the 3080.


Aaand I just saw you were talking about CPUs.
 
2020-09-24 8:03:43 PM  

Abe Vigoda's Ghost: BumpInTheNight: AMD has announcements early October. I genuinely believe that this gen coming next month will reach at least parity with Intel's from a gaming-CPU standpoint, and price-wise it'll be cheaper, no doubt.

Leaked rumors put it at a performance level between a 3070 and a 3080, closer to 3080 levels, and priced about $50 under the 3080.


Errrff, so close, so damned close!  Maybe next gen.  Still, it's good to know they're really pushing into the top-tier space again.

(But yah, for the post itself I was suggesting that aungen, with his hang-up about a system overhaul, should wait and see what AMD's next-gen CPUs bring to the table, for sure.)
 
2020-09-24 8:18:20 PM  

Ed Grubermann: BumpInTheNight: So far it's still hard to justify going from a 2080 Ti to either of them

For a gamer, yes. For me, it was Tuesday-easy. My main rig sports two 2080 Tis. Why? To run Freecell at 8,000,000,000,000 frames per second? No. To render. The 3090 is an easy choice if your primary usage is 3D rendering. Over twice the CUDA cores, twice the VRAM and 70% of the electrical draw? Easy choice.

The 3090 isn't for gamers. Not really. Yes, the "I want the best at all cost" hipster douchebags will buy them, but really, they're for Blender/Studio/Maya/etc... users like me.


Bad news for you. No more SLI. They don't have the connector. Nvidia has said SLI support is dead.
 
2020-09-24 9:02:51 PM  

BumpInTheNight: phimuskapsi: FarkingChas: phimuskapsi: Right now a 3080 is already roughly 75% more powerful than a 2080Ti

WHAT?????????  NO. Read the graphs:

https://www.techspot.com/review/2105-geforce-rtx-3090/

I misread the names. Mea culpa.

"The RTX 3080 was a whopping 43% faster than the 2080 Ti, a massive performance uplift. It's also 83% faster than the 2080 and 92% faster than the 1080 Ti. Those are some of the best margins we've seen so far and certainly appear to be a best case scenario for the RTX 3080. "

43% is still pretty massive, and you have to understand that driver development for these cards is in its infancy; they will continue to improve.

43% is still quite generous.  That's the best-case increase I've been seeing; 25-43% is more of a realistic, broader-application delta.  Just going by a bunch of benchmark sites like Tom's Hardware and Guru3D.  I'm only really looking at the 4K results too; beneath that it's less of a delta.  No one in their right mind should buy these to pair with a 1080p display, for instance, hah.

Remember when each generation's halo cards were a good +50-75% faster than the previous?  I do :)  I mean, we're hitting some truly incredible values here regardless, and even though I've got a 2080 Ti, knowing I can buy a 3080 (some day, when stock permits) for less than I paid for the 2080 Ti is nice.


You are comparing a 2080 Ti, which cost as much as the 3090, to a card that costs half the price. Seeing a 43% increase from a halo card to the 'mid-level' next-gen card is impressive.
 
2020-09-24 9:16:08 PM  

phimuskapsi: BumpInTheNight: phimuskapsi: FarkingChas: phimuskapsi: Right now a 3080 is already roughly 75% more powerful than a 2080Ti

WHAT?????????  NO. Read the graphs:

https://www.techspot.com/review/2105-geforce-rtx-3090/

I misread the names. Mea culpa.

"The RTX 3080 was a whopping 43% faster than the 2080 Ti, a massive performance uplift. It's also 83% faster than the 2080 and 92% faster than the 1080 Ti. Those are some of the best margins we've seen so far and certainly appear to be a best case scenario for the RTX 3080. "

43% is still pretty massive, and you have to understand that driver development for these cards is in its infancy; they will continue to improve.

43% is still quite generous.  That's the best-case increase I've been seeing; 25-43% is more of a realistic, broader-application delta.  Just going by a bunch of benchmark sites like Tom's Hardware and Guru3D.  I'm only really looking at the 4K results too; beneath that it's less of a delta.  No one in their right mind should buy these to pair with a 1080p display, for instance, hah.

Remember when each generation's halo cards were a good +50-75% faster than the previous?  I do :)  I mean, we're hitting some truly incredible values here regardless, and even though I've got a 2080 Ti, knowing I can buy a 3080 (some day, when stock permits) for less than I paid for the 2080 Ti is nice.

You are comparing a 2080 Ti, which cost as much as the 3090, to a card that costs half the price. Seeing a 43% increase from a halo card to the 'mid-level' next-gen card is impressive.


Whoa man, no, the 2080 Tis came in much under the price point of the 3090s.  I'm speaking Canadian here, but 2080 Tis were/are 1400-1500 CAD vs a 3090 that's hitting $2200-2500.

Definitely less than the $1000 a 3080 is commanding right now.
 
2020-09-24 9:18:05 PM  

BumpInTheNight: Whoa man, no, the 2080 Tis came in much under the price point of the 3090s.  I'm speaking Canadian here, but 2080 Tis were/are 1400-1500 CAD vs a 3090 that's hitting $2200-2500.

Definitely less than the $1000 a 3080 is commanding right now.


2080Ti's retailed for $1500->1800 USD
3090's retailed for the exact same
3080's retail from $700 to $850-ish

The secondary market, however, is insane right now.
 
2020-09-24 9:19:35 PM  
I'm trying to figure out WHY I would want a 30x0 card in the first place. Ray tracing is nice and all, but why? There really aren't any killer games out there that would mandate this. Those games that are out there are most likely tied to some kind of microtransaction store that really just takes away from the game these days.
Computers are actually feeling pretty boring, and the new graphics cards just don't feel that exciting. Hell, you don't even get big manuals with them like in the old days, which had all kinds of neat information in them. Gotta hit up eBay for old computers like that and hope you can find a complete package or all the pieces, at ungodly prices sometimes.
 
2020-09-24 9:24:14 PM  

phimuskapsi: BumpInTheNight: Whoa man, no, the 2080 Tis came in much under the price point of the 3090s.  I'm speaking Canadian here, but 2080 Tis were/are 1400-1500 CAD vs a 3090 that's hitting $2200-2500.

Definitely less than the $1000 a 3080 is commanding right now.

2080Ti's retailed for $1500->1800 USD
3090's retailed for the exact same
3080's retail from $700 to $850-ish

The secondary market, however, is insane right now.


Are you really sure about that 2080 Ti price?  Because I've bought two, high-end waterblock-preinstalled ones, and after tax they were like $1750 CAD, which would be what, 1400 USD?
https://pcpartpicker.com/trends/price/video-card/

Yah.

Anyways, the price doesn't matter vs the original conversation about performance diffs between generations anyway.
 
2020-09-24 9:33:16 PM  

Nimbull: I'm trying to figure out WHY I would want a 30x0 card in the first place. Ray tracing is nice and all, but why? There really aren't any killer games out there that would mandate this. Those games that are out there are most likely tied to some kind of microtransaction store that really just takes away from the game these days.
Computers are actually feeling pretty boring, and the new graphics cards just don't feel that exciting. Hell, you don't even get big manuals with them like in the old days, which had all kinds of neat information in them. Gotta hit up eBay for old computers like that and hope you can find a complete package or all the pieces, at ungodly prices sometimes.


Because you want more than gaming features and want compute and tensor cores for rendering speeds in viewports or machine learning. That is the only reason to upgrade from a Turing card.
 
2020-09-24 9:33:27 PM  

BumpInTheNight: phimuskapsi: BumpInTheNight: Whoa man, no, the 2080 Tis came in much under the price point of the 3090s.  I'm speaking Canadian here, but 2080 Tis were/are 1400-1500 CAD vs a 3090 that's hitting $2200-2500.

Definitely less than the $1000 a 3080 is commanding right now.

2080Ti's retailed for $1500->1800 USD
3090's retailed for the exact same
3080's retail from $700 to $850-ish

The secondary market, however, is insane right now.

Are you really sure about that 2080 Ti price?  Because I've bought two, high-end waterblock-preinstalled ones, and after tax they were like $1750 CAD, which would be what, 1400 USD?
https://pcpartpicker.com/trends/price/video-card/

Yah.

Anyways, the price doesn't matter vs the original conversation about performance diffs between generations anyway.


Did you actually look at the link? It shows prices from $1000 to $2000, so $1500 is accurate. It also shows the 3090 with prices ranging from $1600 to $2000. This is the initial launch, it'll come down.
 
2020-09-24 9:41:04 PM  

phimuskapsi: Did you actually look at the link? It shows prices from $1000 to $2000, so $1500 is accurate. It also shows the 3090 with prices ranging from $1600 to $2000. This is the initial launch, it'll come down.


Sure did.  It's still not correct, and it still clearly shows that the 2080 Tis never commanded the same price a 3090 is asking, bitcoin assholes aside of course, which neither of us can properly factor in.
 
2020-09-24 10:03:58 PM  

Abe Vigoda's Ghost: BumpInTheNight: AMD has announcements early October. I genuinely believe that this gen coming next month will reach at least parity with Intel's from a gaming-CPU standpoint, and price-wise it'll be cheaper, no doubt.

Leaked rumors put it at a performance level between a 3070 and a 3080, closer to 3080 levels, and priced about $50 under the 3080.


But they can't do that, because Nvidia will counter with a 3070 Ti at $100 below the 3080. AMD knows this; they'll have to come out very aggressive and take advantage of the fact that their node process is likely more profitable, at least for the moment.
 