
(PC Gamer)   Intel finally breaking into the consumer GPU game   (pcgamer.com)
More: Interesting

992 clicks; posted to Geek » on 13 Aug 2020 at 4:58 PM (5 weeks ago)



 
2020-08-13 3:15:58 PM  
Too bad Intel is going through a reorganization, the Xe division is scheduled to be shuttered by 2023, and Raja Koduri is headed out too: link
 
2020-08-13 5:07:02 PM  
This isn't the first time we've heard this.  Intel has always sucked at GPUs, though, and it will take a great deal of investment to get even close.  They should've acquired PowerVR when it went up for sale a few years back or even Matrox.
 
2020-08-13 5:22:55 PM  

bhcompy: This isn't the first time we've heard this.  Intel has always sucked at GPUs, though, and it will take a great deal of investment to get even close.  They should've acquired PowerVR when it went up for sale a few years back or even Matrox.


Yep, and now it's too little, too late. It will take them a decade or more to even start competing. And if we've learned anything about Intel, it's that they make big announcements and then quietly give up six to twelve months later. They don't have the fortitude.
 
2020-08-13 5:26:49 PM  
[Fark user image]
 
2020-08-13 5:30:48 PM  
Sounds like someone needs more investors
 
2020-08-13 5:34:15 PM  
Because what the world really needs is yet another video card in the crowded space where they can kind of, barely play a game at resolutions from four years ago with low/medium settings.  The ones with all the flashy packaging, touted as a gamer card, bought by people who don't understand any of that but insist that their pre-assembled bestbuy or walmart 'rig' that was just 'upgraded' with this card should do better at game X, so obviously that game needs to be 'optimized'.

Sometimes I wish steam would let me filter threads by the hardware specs of the original poster; it would cut down so much noise.  Then I remember I can close that browser tab and do something useful with my life, like post this rant here.
 
2020-08-13 5:45:29 PM  
I would love Intel to get their shiat together and offer more competition in the GPU space.  I assume they won't, but secretly hope they expose me as a floundering moron.
 
2020-08-13 5:51:17 PM  
Looks like AMD buying ATI has worked out in the long run.
 
2020-08-13 5:52:54 PM  
Well, not a fan of Intel and I doubt anything will come of this, but it'd be nice. Having only two competitors for both CPUs and GPUs is no fun.
 
2020-08-13 6:11:05 PM  
*snert*

If they hope to do this they better outsource their fabrication to the same people making AMD silicon. Intel has shiat the bed on that front.
 
2020-08-13 6:18:25 PM  

JasonOfOrillia: Looks like AMD buying ATI has worked out in the long run.


Yeah but it took them a long time to get it right. I used ATI cards for a long time, from the Xpression PC2TV all the way to the Radeon 3870, but after that I switched to Nvidia for a few years until the RX 400 series came out to sway me back to them. I'll never be the person that lays out $700 or more for a GPU, but they are really competitive in the $200 to $400 space. It will be interesting to see what "Big Navi" can do, if it ever comes out.
 
2020-08-13 7:49:09 PM  
Show me the money.
 
2020-08-13 8:22:03 PM  

isamudyson: JasonOfOrillia: Looks like AMD buying ATI has worked out in the long run.

Yeah but it took them a long time to get it right. I used ATI cards for a long time, from the Xpression PC2TV all the way to the Radeon 3870, but after that I switched to Nvidia for a few years until the RX 400 series came out to sway me back to them. I'll never be the person that lays out $700 or more for a GPU, but they are really competitive in the $200 to $400 space. It will be interesting to see what "Big Navi" can do, if it ever comes out.


Intel has been doing integrated chips for a while; this is just 'bigger' in terms of silicon. They also know and understand processor design. I imagine the first couple generations will be a bit rough, but then they'll smooth 'em out. I like it because it could bring those $700+ cards down a peg or two in the price department.
 
2020-08-13 9:14:21 PM  
Every 5 years or so, Intel has announced "big improvements in GPUs".  This is the longest they've kept at it since roughly 2000 (when they eventually shipped boards originally designed by Lockheed Martin), but I'd be shocked if anything is remotely competitive (and of course will be shut down by 2023 regardless of what they have).

JasonOfOrillia: Looks like AMD buying ATI has worked out in the long run.


Barely.  The issue there was that it was a cash buyout of ATI by AMD.  This meant AMD could barely afford to pay the interest on all that debt while producing such hits as Phenom, Bulldozer, and equally uncompetitive Radeon boards*.  I think they are finally in the clear with all the new Ryzen/Epyc money.

Back when they first started putting GPUs and CPUs on the same chip (Bulldozer+polaris?  possibly even weaker GPUs) they announced big plans for "Heterogeneous System Architecture" in 2011.  Unfortunately, nvidia had been quietly building up the real infrastructure (starting with CUDA) since 2008.  If you want to compute on your GPU, you almost certainly want an nvidia.

/* true in practice, but you can get some good deals from AMD
// especially for the basic "1080@60Hz" gaming level
/// don't knock 1080@60Hz.  On typical monitor sizes, the improvements get pretty subtle after that.
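
(Aside, re: the "compute on your GPU" / CUDA point above: for anyone wondering what that actually looks like, here is a minimal vector-add sketch. It isn't from any particular project; the names and sizes are made up for illustration, and it assumes the CUDA toolkit is installed so it can be built with nvcc.)

    // Minimal CUDA sketch: add two float arrays on the GPU.
    // All names and sizes here are illustrative only.
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void addKernel(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;                  // a million floats, just as an example
        const size_t bytes = n * sizeof(float);

        float *a, *b, *c;
        cudaMallocManaged(&a, bytes);           // unified memory, visible to CPU and GPU
        cudaMallocManaged(&b, bytes);
        cudaMallocManaged(&c, bytes);
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        addKernel<<<blocks, threads>>>(a, b, c, n);  // launch on the GPU
        cudaDeviceSynchronize();                     // wait for the kernel to finish

        std::printf("c[0] = %f\n", c[0]);            // expect 3.000000
        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }

Each GPU thread handles one element of the array; that data-parallel pattern is what CUDA has exposed since its early days, which is the head start the comment above is talking about.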
 
2020-08-13 9:43:04 PM  

yet_another_wumpus: Every 5 years or so, Intel has announced "big improvements in GPUs".  This is the longest they've kept at it since roughly 2000 (when they eventually shipped boards originally designed by Lockheed Martin), but I'd be shocked if anything is remotely competitive (and of course will be shut down by 2023 regardless of what they have).

JasonOfOrillia: Looks like AMD buying ATI has worked out in the long run.

Barely.  The issue there was that it was a cash buyout of ATI by AMD.  This meant AMD could barely afford to pay the interest on all that debt while producing such hits as Phenom, Bulldozer, and equally uncompetitive Radeon boards*.  I think they are finally in the clear with all the new Ryzen/Epyc money.

Back when they first started putting GPUs and CPUs on the same chip (Bulldozer+polaris?  possibly even weaker GPUs) they announced big plans for "Heterogeneous System Architecture" in 2011.  Unfortunately, nvidia had been quietly building up the real infrastructure (starting with CUDA) since 2008.  If you want to compute on your GPU, you almost certainly want an nvidia.

/* true in practice, but you can get some good deals from AMD
// especially for the basic "1080@60Hz" gaming level
/// don't knock 1080@60Hz.  On typical monitor sizes, the improvements get pretty subtle after that.


Maybe you want to take the Radeon boards out of there, as they did as well as or better than their Nvidia counterparts (at least with the RX 400 & up) and furthered that with the RX 5000 series. Yes, ray tracing won't come officially until the new cards hit, but it's been out for a bit and you don't exactly see it being the "make or break" feature Team Green thought it would be. Heck, Crytek did a demo for ray tracing in CryEngine that ran on a Vega 56 just fine.

/1080p is just fine
//refresh rate is more important
 
2020-08-13 10:02:54 PM  

neongoats: *snert*

If they hope to do this they better outsource their fabrication to the same people making AMD silicon. Intel has shiat the bed on that front.


Article says they've confirmed they're outsourcing fabrication.
 
2020-08-13 10:26:55 PM  

isamudyson: yet_another_wumpus: Every 5 years or so, Intel has announced "big improvements in GPUs".  This is the longest they've kept at it since roughly 2000 (when they eventually shipped boards originally designed by Lockheed Martin), but I'd be shocked if anything is remotely competitive (and of course will be shut down by 2023 regardless of what they have).

JasonOfOrillia: Looks like AMD buying ATI has worked out in the long run.

Barely.  The issue there was that it was a cash buyout of ATI by AMD.  This meant AMD could barely afford to pay the interest on all that debt while producing such hits as Phenom, Bulldozer, and equally uncompetitive Radeon boards*.  I think they are finally in the clear with all the new Ryzen/Epyc money.

Back when they first started putting GPUs and CPUs on the same chip (Bulldozer+polaris?  possibly even weaker GPUs) they announced big plans for "Heterogeneous System Architecture" in 2011.  Unfortunately, nvidia had been quietly building up the real infrastructure (starting with CUDA) since 2008.  If you want to compute on your GPU, you almost certainly want an nvidia.

/* true in practice, but you can get some good deals from AMD
// especially for the basic "1080@60Hz" gaming level
/// don't knock 1080@60Hz.  On typical monitor sizes, the improvements get pretty subtle after that.

Maybe you want to take the Radeon boards out of there, as they did as well as or better than their Nvidia counterparts (at least with the RX 400 & up) and furthered that with the RX 5000 series. Yes, ray tracing won't come officially until the new cards hit, but it's been out for a bit and you don't exactly see it being the "make or break" feature Team Green thought it would be. Heck, Crytek did a demo for ray tracing in CryEngine that ran on a Vega 56 just fine.

/1080p is just fine
//refresh rate is more important


If refresh rate is more important, then you want an nVidia card. nVidia supports all g-sync monitors (obvs) and 90 free-sync ones. AMD only supports free-sync. 

While the benchmarks are improving, the cost difference reflects the feature gap. nVidia supports more hardware, supports it better, and has solid reliability. If it matters to you, Team Green cards use less power and are quieter (generally) as well. 

I've had both over the years, and nVidia has never given me a problem. I've had three ATI cards die. One in a Mac Pro, one in a laptop and an older GPU for desktops. I still have a 6xx series nvidia that works.
 
2020-08-13 10:43:07 PM  

isamudyson: JasonOfOrillia: Looks like AMD buying ATI has worked out in the long run.

Yeah but it took them a long time to get it right. I used ATI cards for a long time, from the Xpression PC2TV all the way to the Radeon 3870, but after that I switched to Nvidia for a few years until the RX 400 series came out to sway me back to them. I'll never be the person that lays out $700 or more for a GPU, but they are really competitive in the $200 to $400 space. It will be interesting to see what "Big Navi" can do, if it ever comes out.


The 580 for like $169 most days has got to be the best bang for the buck on the market right now.

It's not the best card, but for 1080p gaming it's rock solid for the money.
 
2020-08-13 11:01:19 PM  

yet_another_wumpus: Every 5 years or so, Intel has announced "big improvements in GPUs".  This is the longest they've kept at it since roughly 2000 (when they eventually shipped boards originally designed by Lockheed Martin), but I'd be shocked if anything is remotely competitive (and of course will be shut down by 2023 regardless of what they have).

JasonOfOrillia: Looks like AMD buying ATI has worked out in the long run.

Barely.  The issue there was that it was a cash buyout of ATI by AMD.  This meant AMD could barely afford to pay the interest on all that debt while producing such hits as Phenom, Bulldozer, and equally uncompetitive Radeon boards*.  I think they are finally in the clear with all the new Ryzen/Epyc money.

Back when they first started putting GPUs and CPUs on the same chip (Bulldozer+polaris?  possibly even weaker GPUs) they announced big plans for "Heterogeneous System Architecture" in 2011.  Unfortunately, nvidia had been quietly building up the real infrastructure (starting with CUDA) since 2008.  If you want to compute on your GPU, you almost certainly want an nvidia.

/* true in practice, but you can get some good deals from AMD
// especially for the basic "1080@60Hz" gaming level
/// don't knock 1080@60Hz.  On typical monitor sizes, the improvements get pretty subtle after that.


If it's something really processor intensive like an emulator, you can do 4K60 easily. I've been playing Zelda BotW at 4K60 with my RX5700 (i
 
2020-08-13 11:01:37 PM  
Dammit. Rest of my comment got eaten. Whatever.
 
2020-08-14 8:34:32 AM  

isamudyson: yet_another_wumpus: Every 5 years or so, Intel has announced "big improvements in GPUs".  This is the longest they've kept at it since roughly 2000 (when they eventually shipped boards originally designed by Lockheed Martin), but I'd be shocked if anything is remotely competitive (and of course will be shut down by 2023 regardless of what they have).

JasonOfOrillia: Looks like AMD buying ATI has worked out in the long run.

Barely.  The issue there was that it was a cash buyout of ATI by AMD.  This meant AMD could barely afford to pay the interest on all that debt while producing such hits as Phenom, Bulldozer, and equally uncompetitive Radeon boards*.  I think they are finally in the clear with all the new Ryzen/Epyc money.

Back when they first started putting GPUs and CPUs on the same chip (Bulldozer+polaris?  possibly even weaker GPUs) they announced big plans for "Heterogeneous System Architecture" in 2011.  Unfortunately, nvidia had been quietly building up the real infrastructure (starting with CUDA) since 2008.  If you want to compute on your GPU, you almost certainly want an nvidia.

/* true in practice, but you can get some good deals from AMD
// especially for the basic "1080@60Hz" gaming level
/// don't knock 1080@60Hz.  On typical monitor sizes, the improvements get pretty subtle after that.

Maybe you want to take the Radeon boards out of there, as they did as well as or better than their Nvidia counterparts (at least with the RX 400 & up) and furthered that with the RX 5000 series. Yes, ray tracing won't come officially until the new cards hit, but it's been out for a bit and you don't exactly see it being the "make or break" feature Team Green thought it would be. Heck, Crytek did a demo for ray tracing in CryEngine that ran on a Vega 56 just fine.

/1080p is just fine
//refresh rate is more important


The longevity of the 400/500 series is amazing.  I have a Vega 56, but I think the only reason that card succeeded financially for AMD was the cryptomining craze (the memory was just too expensive for it to make money otherwise.  I only bought one because it was being discontinued and was a relative bargain).

Nvidia's marketing side is even more scarily effective than their engineering side.  Somehow they've managed to extend domination of the high end into domination of the midrange, regardless of the "nvidia tax".

Personally, I'm convinced that the high refresh craze is yet another trick by nvidia marketing to sell even more boards.  I'd admit that there should be some advantages going from 60Hz to 90Hz (especially for VR) to 120Hz, but don't think you can see the difference between 120Hz and 240Hz (or even 160Hz).  I've said that gamers are the new audiophiles, and they seem driven to prove this.

How much resolution you can see depends a lot on your monitor.  At 1920x1080 you can still see edges on your screen with any desktop monitor regardless of the anti-alias settings.  But with good settings (which keep taking less and less of a hit on your GPU) you have to carefully look for them.  I'm using a 43" 4k TV for a monitor: it might be stuck at 60Hz, but you can see all 4k in all its glory.  And more importantly, refresh only matters for gaming: having that resolution+desktop space makes a lot of difference in actual work (like multiple monitors, but without any breaks between them).

/43" 4k TVs are *cheap* (at least compared to other 4k and high refresh 2k monitors).
// but expect to need to pick up the remote and turn them on and set them to "computer" every time you want to use them
/// and don't expect more than 60Hz, at least not at the cheap point, at least until after a year or two of nextgen consoles being out.
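
(Aside: rough numbers behind the refresh-rate and resolution points above. A back-of-the-envelope sketch in plain host-side code; the panel sizes are just example values.)

    // Back-of-the-envelope numbers for frame-time budgets and pixel density.
    #include <cmath>
    #include <cstdio>

    int main() {
        // Per-frame time budget at each refresh rate.
        const double rates[] = {60.0, 90.0, 120.0, 240.0};
        for (double hz : rates)
            std::printf("%6.0f Hz -> %5.2f ms per frame\n", hz, 1000.0 / hz);

        // Pixels per inch: diagonal pixel count divided by diagonal size in inches.
        auto ppi = [](double w, double h, double inches) {
            return std::sqrt(w * w + h * h) / inches;
        };
        std::printf("43\" 4k TV      : %3.0f PPI\n", ppi(3840.0, 2160.0, 43.0)); // ~102
        std::printf("24\" 1080p panel: %3.0f PPI\n", ppi(1920.0, 1080.0, 24.0)); // ~92
        return 0;
    }

Going from 60Hz to 120Hz buys back about 8 ms per frame, while 120Hz to 240Hz buys back only about 4 ms more; and the 43" 4k TV works out to roughly 102 PPI versus roughly 92 PPI for a 24" 1080p panel.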
 
2020-08-14 5:10:16 PM  

yet_another_wumpus: Personally, I'm convinced that the high refresh craze is yet another trick by nvidia marketing to sell even more boards. I'd admit that there should be some advantages going from 60Hz to 90Hz (especially for VR) to 120Hz, but don't think you can see the difference between 120Hz and 240Hz (or even 160Hz). I've said that gamers are the new audiophiles, and they seem driven to prove this.


I haven't got anything about 120Hz to try it with, but this site made me a believer in the value of high refresh rates when it comes to games that pan horizontally, like first person games where you're on foot:
https://www.testufo.com/

And now that I've seen that effect and the difference, I can't unsee it, even when watching movies and it's doing a long panning shot.
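
(Aside: one rough way to see why horizontal panning shows off refresh rate: the per-frame jump of anything moving across the screen shrinks as the rate goes up. The pan speed and screen width here are just example values.)

    // An object panning across a 1920-pixel-wide screen in one second
    // jumps this many pixels between frames at each refresh rate.
    #include <cstdio>

    int main() {
        const double screen_px   = 1920.0;  // example: 1080p width
        const double pan_seconds = 1.0;     // example: full-screen pan in one second
        const double rates[] = {60.0, 120.0, 144.0, 240.0};
        for (double hz : rates)
            std::printf("%6.0f Hz -> %5.1f px jump per frame\n",
                        hz, screen_px / (pan_seconds * hz));
        return 0;
    }

At 60Hz the object skips 32 pixels between frames; at 144Hz it's about 13, which is the kind of difference the testufo rows make visible.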
 
2020-08-14 5:13:31 PM  
Picked up an RX 580 4GB for $150NZD ($100USD). Very happy.

Terribly excited about Ampere, though. The Turing launch seems to have taught Nvidia that there's a limit to screwing your customers and they won't just take anything.

Since Pascal:
Nvidia > AMD on every metric ignoring price
AMD > Nvidia on every metric including price

But, more importantly, FTA:
Intel promises enthusiast gaming GPU to take on AMD and Nvidia in 2021. It's a ray tracing three-way

Who's Ray?
 
2020-08-14 6:16:49 PM  

dyhchong: Picked up an RX 580 4GB for $150NZD ($100USD). Very happy.

Terribly excited about Ampere, though. The Turing launch seems to have taught Nvidia that there's a limit to screwing your customers and they won't just take anything.

Since Pascal:
Nvidia > AMD on every metric ignoring price
AMD > Nvidia on every metric including price

But, more importantly, FTA:
Intel promises enthusiast gaming GPU to take on AMD and Nvidia in 2021. It's a ray tracing three-way

Who's Ray?


I'm guessing some poor guy who fell out of a window and they are seeing who does a better outline?
 
2020-08-14 9:31:24 PM  

BumpInTheNight: yet_another_wumpus: Personally, I'm convinced that the high refresh craze is yet another trick by nvidia marketing to sell even more boards. I'd admit that there should be some advantages going from 60Hz to 90Hz (especially for VR) to 120Hz, but don't think you can see the difference between 120Hz and 240Hz (or even 160Hz). I've said that gamers are the new audiophiles, and they seem driven to prove this.

I haven't got anything about 120Hz to try it with, but this site made me a believer in the value of high refresh rates when it comes to games that pan horizontally, like first person games where you're on foot:
https://www.testufo.com/

And now that I've seen that effect and the difference, I can't unsee it, even when watching movies and it's doing a long panning shot.


If you want a real issue with horizontal panning, check out the Oblivion introduction.  Not only does it have panning, but towers zip by at a different rate (which really look bad).

https://www.youtube.com/watch?v=JGhlg4JqvQw (about 1:56 a tower goes by)

Unfortunately, the framerate in that video is so bad that even normal panning looks rough.  Oblivion works pretty well at 30Hz or so if you just want to wander and drink in the scenery, but the introduction teaches you right away to crank down the eyecandy (well, it did if you were using a 2011 or earlier GPU).
 
2020-08-14 9:45:32 PM  
<--- poking around and very slowly learning CUDA and OpenGL. Does anyone else do that?
 
Displayed 26 of 26 comments


This thread is closed to new comments.
