I too hope that AMD comes out with a fantastic 5070 or better competitor. I especially hope this works its way into next gen consoles, assuming they stick with AMD.
That said, I don't understand what seems like a common reviewer focus (not blaming Windows Central) on raster and raw pixel-pushing performance as the only important metric. Is it just because "that's how it's always been"? Granted, for older games that's all that matters, but those games often already run at over 120fps at 4K anyway, so incremental gains there are fairly unimportant.
What I care about are FPS and latency in modern games with full path tracing (path tracing replaces the need for baked-in lighting solutions -- that's the holy grail of gaming graphics, because it frees up the huge portion of development time and effort currently spent on lighting while simultaneously producing more beautiful and realistic-looking graphics). This applies to pretty much all new and upcoming AAA games, where high framerates with full graphical effects enabled are typically the hardest to achieve.
Testing Cyberpunk 2077 in Ray Tracing: Overdrive mode with DLSS and frame generation off, or comparing the 5080 to the 4080 without DLSS 4 and Multi Frame Generation, misses the point. Yes, it can be helpful for the niche purpose of comparing against older cards in older games that don't support modern features, but that's all.
What matters for real-world gaming, today and going forward, is testing these cards with DLSS/FSR and frame generation running at their highest settings -- assuming latency remains good. "Good" latency appears to require a base framerate, before frame generation, of at least about 30fps, maybe a bit higher for competitive gamers. The 120fps+ framerates users care about are NOT needed for low-latency gaming -- that's a visual benefit. If the base framerate without frame generation falls below about 30fps, there is a growing risk that the generated frames won't reflect the user's most recent input (perceptible latency), or that they will miss or distort fast in-game events. At higher base framerates, frame generation only adds frames that improve visuals without adversely impacting gameplay latency.
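To put rough numbers on that: interpolation-based frame generation has to hold back roughly one real frame (it needs frame N+1 before it can insert frames between N and N+1), so the latency it adds scales with the *base* frame time, not the generated framerate. This is a back-of-envelope sketch under that assumption -- exact pipeline delays vary by implementation and vendor:

```python
# Rough sketch: why base framerate, not generated framerate, drives the
# latency cost of interpolation-based frame generation.
# Assumption (mine, illustrative): the generator holds back ~one base frame,
# so added latency ~= one base frame time, regardless of how many
# intermediate frames are inserted.

def base_frame_time_ms(base_fps: float) -> float:
    """Time between real, input-driven frames, in milliseconds."""
    return 1000.0 / base_fps

def approx_added_latency_ms(base_fps: float) -> float:
    # Interpolation can't show the span N -> N+1 until N+1 is rendered,
    # so it delays display by about one base frame time.
    return base_frame_time_ms(base_fps)

for base_fps in (20, 30, 60):
    print(f"{base_fps:>2} fps base -> ~{approx_added_latency_ms(base_fps):.1f} ms added latency")
```

So a 60fps base costs roughly 17ms extra, 30fps roughly 33ms, and 20fps roughly 50ms -- which is why the ~30fps floor matters: below it, that held-back interval gets long enough that the inserted frames visibly lag your inputs.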
If frame generation doubles or triples the framerate, AS LONG AS IT DOES THAT WITHOUT ADDING NOTICEABLE LATENCY OR REDUCING IMAGE QUALITY, that is 100% valid. I don't understand reviewers ignoring this. It's like reviewing an airplane for how fast it can taxi on the runway. These new cards fly because they do things older cards can't do or can't do anywhere near as well.
I know many people disagree with me. I have never heard a good reason as to why. Please beat me up on this if you disagree, but please also explain. If you think I'm wrong on this, then teach me so I understand.