(wccftech.com)   AMD introduced its Ryzen 7000 series X3D performance processors, presentation brought to you by Apple MacBook Pro and its non-AMD chip
    More: Awkward, X86, MacBook Pro, Macintosh, AMD's CES, Advanced Micro Devices, Twitter, Third-party production teams, Laptop  

630 clicks; posted to STEM » on 06 Jan 2023 at 4:10 AM (12 weeks ago)



31 Comments
 
2023-01-06 6:30:17 AM  
Not as funny when you learn it was a contractor running the slide deck.
 
2023-01-06 7:10:48 AM  

blodyholy: Not as funny when you learn it was a contractor running the slide deck.


Even more so when you consider that CES hosted the event, not AMD or any of the other tech companies that gave presentations during the conference.
 
2023-01-06 7:32:08 AM  
AMD had no say in the computer running the presentation.

Using an Apple does have a positive. If the computer suffered a slowdown or crashed, AMD can use that to hype up their own processor. If it was a computer with an AMD processor and it crashed, that would be a problem.
 
2023-01-06 8:03:58 AM  
I've been pretty happy with my pandemic-purchased 5900X (meaning I had to pay a scalper $$$ off of StockX just to get it), but the 7950X3D looks like a fantastic, no-compromise processor.

Hmmm... let's see how my tax refund shapes up.
 
2023-01-06 8:16:29 AM  
I'm still rocking my 9900K in my gaming desktop. If I were shopping for a new build, though, yes, let's see how these doubled-cache variants do.

Had my fingers crossed that AMD would also show up with a new desktop GPU that could slug it out directly with a 4090, but that was wishful thinking. It's too bad; ah well, I'll be using a 2080 Ti for a while longer.


LesserEvil: I've been pretty happy with my pandemic-purchased 5900X


Last summer I was looking to replace my headless server, and AMD was doing a fire sale on the 5000 series CPUs prior to the release of the 7000s. I got a 5950X for a really decent price and I'm very happy with the thing. Paired it with an EVGA CLC 360mm AIO cooler; I think it hits about 75C under max load. Impressive for sure.
 
2023-01-06 8:54:37 AM  

LesserEvil: I've been pretty happy with my pandemic-purchased 5900X (meaning I had to pay a scalper $$$ off of StockX just to get it), but the 7950X3D looks like a fantastic, no-compromise processor.

Hmmm... let's see how my tax refund shapes up.


What are you doing that uses compute? I've got a 5900x and my triple monitor setup is not CPU limited in any fashion. I've got a 6800xt and it can barely handle 75fps@1440 on 3 screens. Enabling FSR doesn't even tax the CPU or the 64GB of RAM.

/built a working time machine out of popsicle sticks and bubble gum
 
2023-01-06 9:23:02 AM  

blodyholy: Not as funny when you learn it was a contractor running the slide deck.


Why would you need a computer to run a slide deck?

That's like a gas powered pinhole camera...
 
2023-01-06 9:27:15 AM  

LesserEvil: I've been pretty happy with my pandemic-purchased 5900X (meaning I had to pay a scalper $$$ off of StockX just to get it), but the 7950X3D looks like a fantastic, no-compromise processor.

Hmmm... let's see how my tax refund shapes up.


Too little, too late. My furnace died on Christmas morning, and I had to spend this year's discretionary money on a different kind of space heater. Hopefully, I'm good for another 29 years, since the A/C was part of the package, and I had the water heater replaced last fall.
 
2023-01-06 10:28:23 AM  

LesserEvil: I've been pretty happy with my pandemic-purchased 5900X (meaning I had to pay a scalper $$$ off of StockX just to get it), but the 7950X3D looks like a fantastic, no-compromise processor.

Hmmm... let's see how my tax refund shapes up.


Point of order: You probably don't need a CPU upgrade, especially if the most taxing thing you do is play games. Those 16c/32t guys are massive overkill for most purposes. Gamers are generally better off with the highest-clocked models in a series, rather than the highest core count versions. Save the money and blow it on your next GPU.

And you might not need a GPU upgrade either; even the mid-range GPUs these days are as powerful as, or more powerful than, the consoles that AAA games target. A PS5's graphics horsepower is about equal to a 2070ti. More FPS at 4k is nifty and all, but massive shifts in 3D gaming engines mostly tend to coincide with new console generations, and we're still in the middle of the current generation.

It's weird to say this, but we've gotten to the point where gaming isn't a meaningful driver of PC improvements, albeit mostly because everything is SO tied up in console capability.
 
2023-01-06 11:15:20 AM  

eyeq360: AMD had no say in the computer running the presentation.

Using an Apple does have a positive. If the computer suffered a slowdown or crashed, AMD can use that to hype up their own processor. If it was a computer with an AMD processor and it crashed, that would be a problem.


Bill Gates gets the BSOD (YouTube: UjZQGRATlwA)


/set a world record for the most consecutive high fives
 
2023-01-06 11:32:42 AM  

likefunbutnot: LesserEvil: I've been pretty happy with my pandemic-purchased 5900X (meaning I had to pay a scalper $$$ off of StockX just to get it), but the 7950X3D looks like a fantastic, no-compromise processor.

Hmmm... let's see how my tax refund shapes up.

Point of order: You probably don't need a CPU upgrade, especially if the most taxing thing you do is play games. Those 16c/32t guys are massive overkill for most purposes. Gamers are generally better off with the highest-clocked models in a series, rather than the highest core count versions. Save the money and blow it on your next GPU.

And you might not need a GPU upgrade either; even the mid-range GPUs these days are as powerful as, or more powerful than, the consoles that AAA games target. A PS5's graphics horsepower is about equal to a 2070ti. More FPS at 4k is nifty and all, but massive shifts in 3D gaming engines mostly tend to coincide with new console generations, and we're still in the middle of the current generation.

It's weird to say this, but we've gotten to the point where gaming isn't a meaningful driver of PC improvements, albeit mostly because everything is SO tied up in console capability.


This is a thing I wish I would have known when I built out the 5900x. I would have gotten a less expensive CPU that would have gotten maxed out by the games along with the best GPU possible.
 
2023-01-06 12:20:50 PM  
Hey look, it is an Apple fanboi trying to bash AMD about something that has nothing to do with them or the relevant product.

Here is a hint: presentations require almost no processing power. It doesn't matter what laptop you are using (or the processor inside it) for a presentation.
 
2023-01-06 12:39:50 PM  

question_dj: This is a thing I wish I would have known when I built out the 5900x. I would have gotten a less expensive CPU that would have gotten maxed out by the games along with the best GPU possible.


I have an absurd PC because I have a camera that takes 8k video. THAT is something that needs hardware absurdity, but also takes me down some weird paths, like having both an Arc 750 and an RTX3060 installed (the Arc has hardware codec support for the native video out of my camera, which isn't present in $2000 nVidia cards).

If you're chasing gaming highs, the first thing you have to consider at this point is the limitation of your monitor(s). Do you have a Samsung Neo9 or something that approximates 4k 240Hz? Are you planning to game across multiple 4k displays? No? Then you probably don't need that RTX4090. There is little point in trying to futureproof your GPU for gaming, because the next generation is always going to introduce something or other that the previous generation didn't have (e.g. raytracing, DLSS 3). When you buy on the extreme end of GPUs in particular, you're also adding issues with heat and power delivery. My workstation uses about 1100W when everything is running and has to be plugged in on its own breaker; this can make the ambient temperature in the room where I use it as much as 15F higher. PC Hobbyist sites and streamers don't really talk about that part of things. Anyway, buy a midrange GPU unless you're also shooting for an insane display as well. The $400 thing you'll get will smoke a PS5 or whatever the new XBoxs are called and will be perfectly acceptable even at 4k and amazing at 1440p.
 
2023-01-06 12:43:10 PM  

question_dj: LesserEvil: I've been pretty happy with my pandemic-purchased 5900X (meaning I had to pay a scalper $$$ off of StockX just to get it), but the 7950X3D looks like a fantastic, no-compromise processor.

Hmmm... let's see how my tax refund shapes up.

What are you doing that uses compute? I've got a 5900x and my triple monitor setup is not CPU limited in any fashion. I've got a 6800xt and it can barely handle 75fps@1440 on 3 screens. Enabling FSR doesn't even tax the CPU or the 64GB of RAM.


Not everyone games on their setup. I run 3D Studio Max and AutoCAD/ProE on mine w/3 screens and live rendering on. No games, ever.
My current guts can barely keep up, and I will likely need to upgrade this year.
I render aircraft carriers, auto engines, generators and internal machinery of other sorts.
More horses pulling the load would be a good thing.

/Han shot first
 
2023-01-06 1:19:30 PM  
Ahem...

It looks bad when a major technology company is specifically calling out a competitor in a tech presentation, and that very presentation was done on the competitor's hardware you are dissing. This is called optics. It devalues the messaging significantly, even if they didn't deliberately use the competitor's hardware. Maybe it points out how ubiquitous and easy-to-use your competitor's products are? It also leads to massive amounts of publicity for said competitor.

Did I miss anything?

/oh beans!
 
2023-01-06 1:58:35 PM  

question_dj: This is a thing I wish I would have known when I built out the 5900x. I would have gotten a less expensive CPU that would have gotten maxed out by the games along with the best GPU possible.


My working theory is to overbuy on the CPU so that you can upgrade the GPU in a few years, at the point where the CPU and new GPU are balanced at the resolutions you play at. If you don't, you'll end up with a whole new build instead of just a GPU swap.
 
2023-01-06 2:21:41 PM  

avian: Hey look, it is an Apple fanboi trying to bash AMD about something that has nothing to do with them or the relevant product.

Here is a hint: presentations require almost no processing power. It doesn't matter what laptop you are using (or the processor inside it) for a presentation.


Well, that's a switch. Usually it's an Apple Hater trying to bash Apple about something that has nothing to do with them or the relevant product.
 
2023-01-06 2:23:13 PM  

djfitz: Ahem...

It looks bad when a major technology company is specifically calling out a competitor in a tech presentation, and that very presentation was done on the competitor's hardware you are dissing. This is called optics. It devalues the messaging significantly, even if they didn't deliberately use the competitor's hardware. Maybe it points out how ubiquitous and easy-to-use your competitor's products are? It also leads to massive amounts of publicity for said competitor.

Did I miss anything?


Apple isn't a competitor for AMD?
AMD sells chips. Apple sells computers and phones. You can't buy an Apple processor or GPU.
 
2023-01-06 2:28:37 PM  

BumpInTheNight: I'm still rocking my 9900K in my gaming desktop. If I were shopping for a new build, though, yes, let's see how these doubled-cache variants do.

Had my fingers crossed that AMD would also show up with a new desktop GPU that could slug it out directly with a 4090, but that was wishful thinking. It's too bad; ah well, I'll be using a 2080 Ti for a while longer.


LesserEvil: I've been pretty happy with my pandemic-purchased 5900X

Last summer I was looking to replace my headless server, and AMD was doing a fire sale on the 5000 series CPUs prior to the release of the 7000s. I got a 5950X for a really decent price and I'm very happy with the thing. Paired it with an EVGA CLC 360mm AIO cooler; I think it hits about 75C under max load. Impressive for sure.


I upgraded my main system from a 6700K, and the difference was night and day. I recently picked up an RTX 3090 to replace my 3070, which was problematic for running AI models at home. The 3090 breezes through and never complains about VRAM now :D

question_dj: What are you doing that uses compute? I've got a 5900x and my triple monitor setup is not CPU limited in any fashion. I've got a 6800xt and it can barely handle 75fps@1440 on 3 screens. Enabling FSR doesn't even tax the CPU or the 64GB of RAM.


I develop games and do a lot of other things with my main system (including the above mentioned AI work). When I first got it, I was in the middle of processing 16,000+ screen caps for a project, and the extra cores were a godsend. I can always use more processing.

likefunbutnot: Point of order: You probably don't need a CPU upgrade, especially if the most taxing thing you do is play games. Those 16c/32t guys are massive overkill for most purposes. Gamers are generally better off with the highest-clocked models in a series, rather than the highest core count versions. Save the money and blow it on your next GPU.

And you might not need a GPU upgrade either; even the mid-range GPUs these days are as powerful as, or more powerful than, the consoles that AAA games target. A PS5's graphics horsepower is about equal to a 2070ti. More FPS at 4k is nifty and all, but massive shifts in 3D gaming engines mostly tend to coincide with new console generations, and we're still in the middle of the current generation.

It's weird to say this, but we've gotten to the point where gaming isn't a meaningful driver of PC improvements, albeit mostly because everything is SO tied up in console capability.


I wish I could find the time to play games. I did play through a big chunk of Portal RTX last week during break, to check out my 3090, and also played some High On Life. My main system is primarily development, but when I want to play games, it's "no compromises" - A 7950X3D would be dev and gaming nirvana.

All that said, this current generation of video cards is likely a hard pass for me. The RTX 4090 is certainly impressive, but overpriced - and yet it is the best price/performance card of any of the current generation, including the RDNA3 cards. AMD is having issues with the stock coolers on the 7900XT and XTX cards, and Nvidia turned into Scrooge with VRAM... even relabeled as a 4070ti, the "4080 12GB" is really a 4060ti (in a sane world). Jensen doubled down on screwing Nvidia's customers by claiming the RTX 3000 cards were still the "current" stack and RTX 4000 cards were the super-premium cards - just so they could sell the dust collectors they rushed out for crypto miners after the boom had already faded.
 
2023-01-06 2:50:09 PM  

avian: djfitz: Ahem...

It looks bad when a major technology company is specifically calling out a competitor in a tech presentation, and that very presentation was done on the competitor's hardware you are dissing. This is called optics. It devalues the messaging significantly, even if they didn't deliberately use the competitor's hardware. Maybe it points out how ubiquitous and easy-to-use your competitor's products are? It also leads to massive amounts of publicity for said competitor.

Did I miss anything?

Apple isn't a competitor for AMD?
AMD sells chips. Apple sells computers and phones. You can't buy an Apple processor or GPU.


If Apple isn't a competitor, why did AMD take so much time making comparisons to Apple?

What I take away from this is that AMD needs to catch up to Apple, and that current Apple hardware is probably faster than AMD products. That makes AMD look bad, whether or not you want to frame this within some strict definition of competition.
 
2023-01-06 3:21:46 PM  

likefunbutnot: question_dj: This is a thing I wish I would have known when I built out the 5900x. I would have gotten a less expensive CPU that would have gotten maxed out by the games along with the best GPU possible.

I have an absurd PC because I have a camera that takes 8k video. THAT is something that needs hardware absurdity, but also takes me down some weird paths, like having both an Arc 750 and an RTX3060 installed (the Arc has hardware codec support for the native video out of my camera, which isn't present in $2000 nVidia cards).

If you're chasing gaming highs, the first thing you have to consider at this point is the limitation of your monitor(s). Do you have a Samsung Neo9 or something that approximates 4k 240Hz? Are you planning to game across multiple 4k displays? No? Then you probably don't need that RTX4090. There is little point in trying to futureproof your GPU for gaming, because the next generation is always going to introduce something or other that the previous generation didn't have (e.g. raytracing, DLSS 3). When you buy on the extreme end of GPUs in particular, you're also adding issues with heat and power delivery. My workstation uses about 1100W when everything is running and has to be plugged in on its own breaker; this can make the ambient temperature in the room where I use it as much as 15F higher. PC Hobbyist sites and streamers don't really talk about that part of things. Anyway, buy a midrange GPU unless you're also shooting for an insane display as well. The $400 thing you'll get will smoke a PS5 or whatever the new XBoxs are called and will be perfectly acceptable even at 4k and amazing at 1440p.


Oh, I need a 4090 to run 1440p on 3 monitors at 144fps. Waiting for availability to get back to something reasonable. The 6800 XT does not cut it, and FSR is dumb.
 
2023-01-06 3:28:39 PM  

djfitz: avian: djfitz: Ahem...

It looks bad when a major technology company is specifically calling out a competitor in a tech presentation, and that very presentation was done on the competitor's hardware you are dissing. This is called optics. It devalues the messaging significantly, even if they didn't deliberately use the competitor's hardware. Maybe it points out how ubiquitous and easy-to-use your competitor's products are? It also leads to massive amounts of publicity for said competitor.

Did I miss anything?

Apple isn't a competitor for AMD?
AMD sells chips. Apple sells computers and phones. You can't buy an Apple processor or GPU.

If Apple isn't a competitor, why did AMD take so much time making comparisons to Apple?

What I take away from this is that AMD needs to catch up to Apple, and that current Apple hardware is probably faster than AMD products. That makes AMD look bad, whether or not you want to frame this within some strict definition of competition.


What you should take away from the article is that the writer is an Apple fanboi that tried to make the AMD presentation about Apple. When the writer is so obviously biased, the entire article is suspect.
Name a single thing that Apple sells that AMD sells a competitive product for. I'll wait....
 
2023-01-06 3:35:37 PM  

djfitz: current Apple hardware is probably faster than AMD products


It's not nearly so clear-cut as that. M-series hardware is generally comparable with mid-tier Intel and AMD SKUs in each segment where they each offer products. There are a couple of areas where Apple has an advantage, specifically content creation software, where it's easier to target the single fixed hardware configuration of Apple silicon than the many possible combinations of Intel, AMD, and nVidia CPU and GPU architectures; and Apple CPUs are also low power compared to mainstream PC parts. There is a trade-off for that, though, which is that Apple computers no longer have the capability for any sort of internal expansion. They have to be purchased with all the RAM and internal storage they'll ever have, and you have to be OK with Apple's integrated GPU as well. Apple is also allergic to decent cooling on its computers, so while its CPUs are fast-ish, they throttle constantly because of thermal limits.

Apple makes a decent case for its less expensive notebook computers, where an M-whatever will run for 16 hours and actually does have a decent GPU, but a less expensive Apple is still a $1000+ computer, and if you don't care about battery life or actually expect to add RAM or storage, it's not a good choice. It's also true that relatively few people NEED a PC with an i5/i7-equivalent CPU, and that level of performance can be found much, much cheaper.
 
2023-01-06 3:39:39 PM  

question_dj: I need a 4090 to run 1440 on 3 monitors at 144fps


I did qualify my post by addressing gaming interests, although I feel your pain with regard to driving three screens' worth of pixels.
 
2023-01-06 4:42:50 PM  
I had been given to believe that the 5800X3D was the last of the AM4-compatible CPUs.

Is this no longer the case?
 
2023-01-06 5:08:01 PM  

PartTimeBuddha: I had been given to believe that the 5800X3D was the last of the AM4-compatible CPUs.

Is this no longer the case?


New CPU lines will be supporting AM5 socket (and DDR5) moving forward, so the Ryzen 7000 CPUs (Zen4) support that.

That's not to say AMD won't crank out chips for AM4 sockets when they see fit; for example, they may whip up a custom CPU for a large OEM application that uses Zen4 cores and the Zen3 IO chiplet. As I understand it, they did incorporate this capability. Such custom configurations are more likely to be aimed at mobile applications, though, such as laptops, or at consoles, where a large OEM might want some cost savings at the expense of performance.

AM4 was a nice platform, though... and AMD has been selling the heck out of their Ryzen 5000 and 3000 series CPUs.
 
2023-01-06 8:24:10 PM  

LesserEvil: New CPU lines will be supporting AM5 socket (and DDR5) moving forward, so the Ryzen 7000 CPUs (Zen4) support that.


Thanks, but, since AM4 supported 7900x and 7950x processors, does it also support 7900x3d and 7950x3d?
 
2023-01-07 1:19:52 AM  

PartTimeBuddha: LesserEvil: New CPU lines will be supporting AM5 socket (and DDR5) moving forward, so the Ryzen 7000 CPUs (Zen4) support that.

Thanks, but, since AM4 supported 7900x and 7950x processors, does it also support 7900x3d and 7950x3d?


AM4 supports the 5800X3D, 5900X and 5950X in the enthusiast range. The 7xxx series requires an AM5 mobo + DDR5 memory.

/if buying a newer, now discounted AM4 processor as an upgrade
//make sure you can get a BIOS update if your board is more than like 1 year old
 
2023-01-07 3:06:47 AM  

fragMasterFlash: PartTimeBuddha: LesserEvil: New CPU lines will be supporting AM5 socket (and DDR5) moving forward, so the Ryzen 7000 CPUs (Zen4) support that.

Thanks, but, since AM4 supported 7900x and 7950x processors, does it also support 7900x3d and 7950x3d?

AM4 supports the 5800X3D, 5900X and 5950X in the enthusiast range. The 7xxx series requires an AM5 mobo + DDR5 memory.

/if buying a newer, now discounted AM4 processor as an upgrade
//make sure you can get a BIOS update if your board is more than like 1 year old


Nearly all AM4 motherboards can be flashed without a CPU installed. Usually there's one particular USB port where you stick a FAT-formatted flash drive with the firmware update; the board checks it and runs the update process. Definitely check the documentation, because the process is implemented differently by each OEM, but it's not a big deal.
 
2023-01-07 4:20:18 AM  
Meh. Claiming your chip is better doesn't mean your competition's chips are useless. You think every Toyota employee drives a Toyota?

/eel-free hovercraft since '98
 
2023-01-07 7:17:38 AM  

Russ1642: Meh. Claiming your chip is better doesn't mean your competition's chips are useless. You think every Toyota employee drives a Toyota?


I know a couple Toyota employees, and I can tell you that Toyota offers an incredible employee lease program, so the bulk of their employees do drive Toyotas.

Like, all you gotta do is insure it and put gas in it. They take care of everything else.
 
Displayed 31 of 31 comments


This thread is closed to new comments.
