Did you even read the article? Lowering the resolution does not magically give you a higher framerate. Stop placing so much importance on the GPU. The CPU is the bottleneck here, and the Series S and X are almost identical in that respect.
I'm not saying 60FPS on Xbox Series X|S is impossible or can't eventually happen, but it has very little to do with tinkering with the game's visual settings or resolution.
Yes, I read your article, but you clearly didn't read my 5 sentences. I listed several things, like turning down certain effects (particle effects), that would reduce the load on the CPU. Your article reads like a damage control piece. Just my opinion though. As for the CPU being a bottleneck, get out of here. I may be an armchair developer, but I am a systems engineer, so hardware isn't out of my wheelhouse. But I'll humor your CPU bottleneck claim.
Tell me, what are the main things devs do to maintain a stable frame rate?
Dynamic resolution and temporal reconstruction. Both lower the res to give headroom for higher frame rates. The reason for this is last gen: the CPUs on those APUs were tablet-class, so devs had to push a lot more of the processing onto the GPU. Fast forward to now, and the main thing holding performance back on any console is devs refusing to update their rendering pipelines for modern hardware. There are many reasons for this: money, games were already in development, they are using an off-the-shelf engine (Unreal), etc.
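Just to make the dynamic resolution idea concrete, here's a rough C++ sketch of the kind of feedback loop an engine might run. The struct, thresholds, and numbers are all made up for illustration; this isn't code from any actual console engine.

```cpp
// Minimal sketch of dynamic resolution scaling (illustrative only; the
// names, thresholds, and step sizes are assumptions, not any engine's code).
#include <algorithm>
#include <cstdio>

struct DynResScaler {
    double targetFrameMs;   // e.g. 16.6 ms for 60 fps
    double scale = 1.0;     // fraction of native resolution on each axis

    // Call once per frame with the measured GPU frame time.
    void update(double gpuFrameMs) {
        // If the GPU blew the budget, render fewer pixels next frame;
        // if there is headroom, creep back toward native resolution.
        if (gpuFrameMs > targetFrameMs)
            scale -= 0.05;
        else if (gpuFrameMs < targetFrameMs * 0.85)
            scale += 0.02;
        scale = std::clamp(scale, 0.5, 1.0);
    }
};

int main() {
    DynResScaler scaler{16.6};   // aiming for 60 fps
    double simulatedGpuTimes[] = {20.0, 19.0, 17.5, 15.0, 13.0};
    for (double t : simulatedGpuTimes) {
        scaler.update(t);
        std::printf("gpu %.1f ms -> render at %.0f%% resolution\n",
                    t, scaler.scale * 100.0);
    }
}
```

Point being: the GPU side already has a well-worn mechanism for trading pixels for frame time.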
Starfield in particular is on a notoriously poorly optimized engine, and I know they said they modernized it, but we saw the results when they first showed Starfield. You build a house on a shaky foundation and that's what you get. Nobody was expecting to be wowed by the graphics, but we all know that engine chugs.
That being said, simply turning down the resolution would give them more room in their rendering budget. That's just a simple fact; you don't have to know crap about tech to know this. Every PC gamer knows this. Lowering the view distance reduces draw calls, which again gives you more room in your rendering budget (see the sketch below). Lowering the number of particle effects reduces load on the CPU. Todd said they were using GI in the game; you can reduce the number of bounces it calculates, reducing the load on the CPU, and again giving you more room in your rendering budget.
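Here's a toy C++ sketch of the view distance point: the shorter the view distance, the fewer objects survive culling and the fewer draw calls get submitted. The scene, the one-draw-call-per-object simplification, and the numbers are all invented for illustration.

```cpp
// Toy distance-culling sketch: shrinking the view distance shrinks the
// number of objects that get drawn. Not from any real engine.
#include <cstdio>
#include <vector>

struct Object { float distance; };  // distance from the camera, in meters

int countDrawCalls(const std::vector<Object>& scene, float viewDistance) {
    int drawCalls = 0;
    for (const Object& o : scene)
        if (o.distance <= viewDistance)  // cull everything beyond the limit
            ++drawCalls;                 // simplification: one draw call per visible object
    return drawCalls;
}

int main() {
    std::vector<Object> scene;
    for (int i = 0; i < 10000; ++i)
        scene.push_back({static_cast<float>(i % 2000)});  // objects spread over 2 km

    std::printf("view distance 2000 m -> %d draw calls\n", countDrawCalls(scene, 2000.f));
    std::printf("view distance 1200 m -> %d draw calls\n", countDrawCalls(scene, 1200.f));
}
```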
And that stuff is low-hanging fruit, things you can implement fairly quickly with little disruption. They could add FSR 2.0 and VRS. Heavy lifts for sure, but if you were trying to get good frame rates you would have tried to include these things (FSR 2.0 might have been a little late for them to use). They could have made a mode with an unlocked frame rate and just required a TV with VRR to use it. There are so many ways to crack this nut, and Todd had the support of a trillion-dollar company with an army of engineers.
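The VRR-gated unlocked mode is about as simple as it sounds; here's a small C++ sketch of the idea. The DisplayCaps struct and its field are hypothetical, not a real console API.

```cpp
// Sketch of an "unlocked frame rate, VRR only" mode: cap at 30 on
// fixed-refresh displays, uncap when the display reports VRR support.
// DisplayCaps is a made-up stand-in for whatever the platform exposes.
#include <cstdio>

struct DisplayCaps { bool supportsVRR; };

int frameCapFor(const DisplayCaps& display) {
    // 0 means "uncapped"; without VRR an uneven frame rate judders,
    // so fall back to a fixed 30 fps cap.
    return display.supportsVRR ? 0 : 30;
}

int main() {
    DisplayCaps vrrTv{true}, fixedTv{false};
    std::printf("VRR TV:  cap = %d (0 = unlocked)\n", frameCapFor(vrrTv));
    std::printf("60Hz TV: cap = %d\n", frameCapFor(fixedTv));
}
```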
I say all this to drive home the point that if Todd wanted that game to run at 60 fps, it would be running at 60 fps, and nothing about his vision or creative direction would have changed.
But that's my problem with your article. Todd is on record multiple times saying he doesn't care for high frame rates. So it is what it is. That's what he wanted and that's what we are getting. Phil already said it was the creative vision, not a hardware problem, so I don't even know why you wrote this, frankly. It was debunked by the Xbox CEO before you even published it. My opinion on your article is just that, an opinion. But those things they could do to improve frame rate without changing the creative vision of the game are all facts.