Nvidia RTX 4000 rumor raises more fears over power consumption

Nvidia’s next-gen RTX 4090 flagship might have some telling power consumption, at least at first glance according to the latest rumor – but there’s more to it than meets the eye (plus there’s better news further down the RTX 4000 range).

This all comes from a well-established hardware leaker on Twitter, Kopite7kimi, who outlined some alleged power details for Nvidia’s Lovelace chips, from the AD102 (the flagship) down (as spotted by VideoCardz).


So the theory is that the AD102 GPU will have a ‘power cap’ of 800W, which sounds seriously heavy and worrying (and reminiscent of earlier speculation that Lovelace will be very power demanding).

However, what we need to keep in mind here is that this power limit refers to the maximum possible power for the GPU, and in reality, the rated TDP of actual graphics cards will be considerably lower than that. We’ll come back to dig deeper into what that means in a moment.

The better news is that while the AD102 goes all the way up to 800W, Nvidia supposedly drops things considerably with the GPU set to drive the RTX 4080, with the AD103 sitting at 450W (again, this is the maximum possible power). The AD104 is then set at 400W, and the AD106 at 260W.

Kopite7kimi also mentions mobile GPUs, with the AD103 and AD104 mobile chips apparently set to operate with a 175W cap, and with AD106 mobile we’re theoretically looking at 140W.


Analysis: putting things into perspective in terms of power

Okay, so before we get too worried about that whopping 800W figure slapped on the AD102 chip, let’s remember this is just a rumor, and we’ve seen plenty of speculation about where Lovelace’s power consumption could end up. Even if the figure is accurate, though, that doesn’t necessarily mean the RTX 4090 – which is supposed to be the first graphics card Nvidia will launch from the next-gen Lovelace family – will be that monstrous.

As noted above, 800W refers to the maximum possible power for the GPU, and the rated TDP will be less than that, with only more advanced third-party graphics cards – ones pushing things much harder with clock speeds (and powerful cooling) – approaching this ceiling (which also leaves room for overclocking).

The other key point with the AD102 is that the RTX 4090 will use a cut-down version of the GPU, with the grapevine believing it will run with 16,384 CUDA cores (the assumed maximum for the chip is 18,432). There will be other models, most likely an RTX 4090 Ti, and we might even see an RTX Titan for Lovelace above that – and perhaps only the Titan will push to fully exploit that maximum power limit.

When you factor in those two points, the RTX 4090 will likely be much lower in actual power consumption, and it will be the full AD102 models – the fastest high-end cards, in other words – which will push their consumption towards that 800W mark (if indeed it’s correct in the first place).

So don’t start worrying about the RTX 4090 power-wise just yet – the GPU could easily match some of the other recent rumors we’ve heard, such as a 600W TDP, although speculation around a 450W TDP seems shakier in this new light (those TDP figures are also claims from Kopite7kimi, and from RedGamingTech on YouTube in the latter case).

Another interesting point to note here is that the maximum power drops to 450W for the RTX 4080. Very recently, Kopite7kimi said that the 4080 might sit at around 420W for its TDP, but this new information offers a tentative suggestion that it could be a bit lower than that (earlier rumors had us expecting around 400W). And that would be better news for the much larger number of gamers who will want to buy an RTX 4080, compared to the more specialized RTX 4090 with its undoubtedly exorbitant price.

To sum up, let’s not get carried away with worries about possible power demands just yet, although the possibility of necessary PSU upgrades – and indeed better cooling solutions to keep the PC’s interior at a reasonable temperature – clearly remains a concern for those looking at a high-end Lovelace GPU.
