
Turning on Variable Refresh Rate Fixes High Idle Power Consumption of RX 7900 XTX


Typically, if you are using a desktop computer with a discrete GPU, the GPU consumes only a few watts when idle but can easily surpass 100W under intensive loads. If your GPU is drawing more than 50 watts at idle, however, something is definitely wrong.

Most of the time it’s the driver’s fault and, unfortunately, we are still facing such problems on modern GPUs. While Nvidia is mostly unaffected, both AMD and Intel GPU users have reported their cards consuming over 50 watts, or sometimes even 100 watts, while doing something as simple as watching a YouTube video.

This problem is more prevalent on higher-end cards like the RX 7900 GPUs from the RDNA 3 lineup, where the GPU can draw over 100 watts at idle on high-refresh-rate monitors. Fortunately, there is a very simple fix, and the setting involved, which some of you may have left disabled without realizing it, is the main culprit behind the high power usage.

As per the findings of ComputerBase, this problem is linked to the Variable Refresh Rate (VRR) feature. Variable Refresh Rate is a technology that syncs your monitor’s refresh rate to the frame rate of your game or other on-screen content. The likely reason it helps at idle is that a fixed high refresh rate forces the card to keep its memory clocks pinned high, while VRR gives the driver room to downclock when little is changing on screen.

It is available in the form of Adaptive Sync, AMD FreeSync, and Nvidia G-Sync, which most gaming monitors now support, and it can be turned on via your GPU software.
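To make the idea concrete, here is a minimal Python sketch of how a VRR panel follows the content’s frame rate. The 48–144Hz window and the sample frame rates are hypothetical illustrations, not ComputerBase’s figures:

```python
def effective_refresh_hz(frame_rate_hz: float,
                         vrr_min_hz: float = 48.0,
                         vrr_max_hz: float = 144.0) -> float:
    """With VRR on, the panel refreshes in step with the content,
    clamped to the monitor's supported VRR window."""
    return max(vrr_min_hz, min(frame_rate_hz, vrr_max_hz))

# A mostly idle desktop repaints only occasionally, so with VRR the
# panel can sit near its minimum refresh instead of a fixed 144 Hz.
for fps in (5, 60, 90, 200):
    print(f"content at {fps} fps -> panel refreshes at "
          f"{effective_refresh_hz(fps):.0f} Hz")
```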

[Table: ComputerBase’s idle power consumption test results]

In the tests conducted, turning VRR on reduced the idle power consumption of the RX 7900 XTX by 81% on a 4K monitor running at a 144Hz refresh rate. Similarly, with dual monitors, power consumption dropped by 71%. That is a big reduction in idle power, which previously sat constantly above 100 watts; at, say, 105 watts, an 81% cut would bring the card down to roughly 20 watts.

It also helped with window movement and YouTube playback, even though the latter saw only a 5% power reduction. Once HDR was enabled, however, it was a different story.

Apparently, the fix does not only benefit the 7900 XTX but also the previous-gen RX 6000 GPUs. The RX 6800 XT saw a whopping 79% power reduction, coming down to only 9 watts from 43 watts previously, and the RX 6700 XT behaved exactly the same.

Other scenarios also saw some reduction in power usage, but unfortunately, VRR was not able to lower the power consumption of the RX 6000 cards in the dual-monitor setup. The RTX 3060 Ti showed a similar, though far less drastic, trend; then again, it did not have this problem to begin with. With other Nvidia GPUs, VRR increased power consumption in about half of the cases and helped in the rest.

Still, there are reports of this happening to some Nvidia GPU owners too, with a few users reporting that their cards were drawing more than 100 watts on a dual-monitor setup. But this is far less common and, as you can see from the table, Nvidia GPUs are much more power efficient at idle.

With Intel, however, things got worse. The Arc A770 saw no improvement at all: it was already drawing more than 40 watts at idle, and with VRR enabled it consumed roughly 10% more. This is a big concern that has existed for a while now and needs to be fixed ASAP.

In conclusion, except for Nvidia for the most part, both AMD and Intel need to work on this problem so that idle power consumption is as low as possible even without VRR. In the meantime, if you are using a monitor that supports VRR, it is recommended that you turn it on through your GPU software and keep an eye on the power consumption.
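If you want to verify the effect yourself on Linux with the amdgpu driver, here is a minimal sketch that polls the GPU’s power sensor. The card0 path and hwmon layout are assumptions that may differ on your system; on Windows, you can simply watch the power readout in AMD’s Adrenalin software or a tool like HWiNFO instead:

```python
# Minimal sketch: poll an AMD GPU's idle power draw on Linux (amdgpu
# driver). Assumes the GPU is "card0" and exposes the hwmon
# "power1_average" sensor, which reports microwatts -- adjust both
# for your system. Run once with VRR off and once with VRR on.
import glob
import time

def read_gpu_power_watts(card: str = "card0") -> float:
    pattern = f"/sys/class/drm/{card}/device/hwmon/hwmon*/power1_average"
    paths = glob.glob(pattern)
    if not paths:
        raise FileNotFoundError(f"no power sensor found for {card}")
    with open(paths[0]) as f:
        return int(f.read()) / 1_000_000  # microwatts -> watts

if __name__ == "__main__":
    for _ in range(12):                   # sample for about a minute
        print(f"{read_gpu_power_watts():.1f} W")
        time.sleep(5)
```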
