Yeah, I know it chews up a CPU that's below the recommended specs. The listed required CPU really isn't enough. Not even close, at any game settings.
It's an AMD ... uhhhhhh. It isn't an X3D. Can't remember the model, and I'm not at home to look. It's a 12-core/24-thread chip with a max clock speed just short of 6 GHz. Plenty of cache in a CPU like that, of course.
Will an Intel chip with similar numbers really run worse than what I have? I'd expect utilization to run slightly higher, but it should still be well within tolerances, right?
I'd think the Nvidia GPU would be more of an issue. The drivers aren't great yet (though not as bad as the Intel Arc drivers), but we should see serious improvement over the next couple of months of driver updates.
I have 32 GB of DDR5. I was contemplating 64, but no games are going to use more than 32 for at least the next 6 or 7 years. I'll be building a new PC around that time, when the 70-series comes out.
And if an Intel CPU makes the game run much worse, we shouldn't be blaming the SSD, right?