I have Windows 11 on a Lenovo Legion 5 Pro AMD laptop with an RTX 3060 discrete GPU. When I'm not gaming and the 3060 is powered down, the laptop uses its Radeon integrated graphics. In that state, a 100 W USB-PD power supply is plenty to keep it charged, and even on battery it lasts a good long time.
The problem is that if I then launch a game - even one that performs passably on integrated graphics - Windows uses the 3060, which draws more power than USB-PD can deliver. The laptop switches over to battery and quickly eats through the charge. It seems that Windows's decision of "should I use the discrete GPU for this app?" is based on "is it a game?" rather than "do I have enough power available?" Yes, I could go into System > Display > Graphics and manually change the game from High Performance to Power Saving each time, then change it back when I'm using the huge power brick, but that's a hassle.
I would *like* Windows to be smart enough to see that I am not using the huge power brick and therefore not spin up the power-hungry 3060. Is there any way (within Windows, or via a third-party app) to automate this?
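For reference, here's the kind of thing I'm imagining. From poking around, the Graphics settings page appears to store per-app preferences as string values under `HKCU\Software\Microsoft\DirectX\UserGpuPreferences` (value name = the exe's full path, data like `GpuPreference=1;`, where 1 = power saving and 2 = high performance). So in principle a script like this rough Python sketch (untested, Windows-only, registry location is my observation rather than documented behavior) could flip a game's preference based on whether the laptop is plugged in:

```python
import ctypes
import sys
import winreg

# SYSTEM_POWER_STATUS structure for kernel32's GetSystemPowerStatus.
class SYSTEM_POWER_STATUS(ctypes.Structure):
    _fields_ = [
        ("ACLineStatus", ctypes.c_ubyte),      # 0 = battery, 1 = AC, 255 = unknown
        ("BatteryFlag", ctypes.c_ubyte),
        ("BatteryLifePercent", ctypes.c_ubyte),
        ("SystemStatusFlag", ctypes.c_ubyte),
        ("BatteryLifeTime", ctypes.c_ulong),
        ("BatteryFullLifeTime", ctypes.c_ulong),
    ]

def on_ac_power() -> bool:
    """Return True if Windows reports an AC power source."""
    status = SYSTEM_POWER_STATUS()
    if not ctypes.windll.kernel32.GetSystemPowerStatus(ctypes.byref(status)):
        raise ctypes.WinError()
    return status.ACLineStatus == 1

def set_gpu_preference(exe_path: str, preference: int) -> None:
    """Write the per-app GPU preference that Settings > Display > Graphics uses.

    0 = let Windows decide, 1 = power saving (iGPU), 2 = high performance (dGPU).
    """
    key_path = r"Software\Microsoft\DirectX\UserGpuPreferences"
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          f"GpuPreference={preference};")

if __name__ == "__main__":
    game = sys.argv[1]  # full path to the game's .exe
    set_gpu_preference(game, 2 if on_ac_power() else 1)
```

The catch is that `GetSystemPowerStatus` only distinguishes plugged-in from battery, and the 100 W USB-PD charger counts as plugged in, so telling the two bricks apart would need something extra (maybe the charge rate from the `BatteryStatus` class in the `root\wmi` WMI namespace, plus a Task Scheduler trigger on power-source change). That's more plumbing than I want to maintain by hand, hence the question.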