A few years ago, Nvidia's then top-end Titan Xp drew around 250 watts at load, with AMD's competing Radeon VII weighing in at 300 watts. But now, team red's current best, the Radeon RX 6950 XT, has increased to around 330 watts, while Nvidia's RTX 3090 Ti has a TDP of a whopping 450 watts; it may as well come straight out of a flame broiler. Toasted GPU. And the expectation is that the upcoming RTX 4080, featuring Nvidia's new Ada Lovelace architecture, could clock in at around 400 to 500 watts, while the 4090 could suck down as much as 600 watts of power on its own. But why does more power automatically have to be the way we try to get more performance? We'll tell you right after we thank Corsair for sponsoring this video.

If you're looking for a new headset, check out Corsair's HS65 Surround wired gaming headset, featuring high-quality, custom-tuned 50-millimeter neodymium audio drivers to deliver excellent sound with the range you need to hear everything on the battlefield. You'll communicate with your teammates with terrific clarity thanks to the omnidirectional microphone with flip-to-mute functionality. Get out of here! Learn more at the link below.

Now, one big reason that manufacturers might not be paying too much attention to how much power their cards are guzzling is simply that they don't particularly have to. I mean, sure, you can advertise a desktop card as being power efficient or having a good cooling solution, but at the end of the day, the thing that's going to sell cards is performance, so AMD and Nvidia would much rather compete on FPS benchmarks than differentiate themselves from a competing company by saying, "Hey, our GPU uses 15 percent less power."

And this trend might continue due to the rise of chiplets in CPUs and GPUs instead of one big monolithic die. If you don't know, chiplets are modular chip pieces that can be combined to act as a single processor, and they're gaining popularity in fabs because they have better yield, meaning a defect on the wafer will only affect one small chiplet rather than a whole complete processor.
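To get a feel for why that matters, here's a minimal sketch using the classic Poisson yield model. The defect density and die areas below are made-up numbers purely for illustration, not figures from AMD, Nvidia, or any fab:

```python
import math

def die_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Fraction of dies with zero defects under a simple Poisson model."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# Hypothetical numbers, purely for illustration.
defect_density = 0.2   # defects per square centimeter
big_die_area = 6.0     # one 600 mm^2 monolithic GPU die
num_chiplets = 4       # the same silicon split into four 150 mm^2 chiplets

monolithic = die_yield(defect_density, big_die_area)

# A chiplet-based GPU only throws away the individual chiplet that caught
# a defect, not the whole processor, so the fraction of usable silicon is
# the per-chiplet yield rather than the all-four-chiplets-good probability.
per_chiplet = die_yield(defect_density, big_die_area / num_chiplets)

print(f"Monolithic 600 mm^2 die yield: {monolithic:.1%}")   # ~30%
print(f"Per-chiplet (150 mm^2) yield:  {per_chiplet:.1%}")  # ~74%
```

Under those toy numbers, splitting one big die into four chiplets more than doubles the fraction of usable silicon per wafer, which is exactly the effect that makes really big chiplet-based GPUs cheaper to build.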
It's likely that chiplets will allow companies to build bigger GPUs more profitably, something that AMD in particular seems quite interested in, and we can't rule out that Nvidia may move in that direction someday either.

Of course, this doesn't mean there isn't an upper limit to how much power a card of the future will draw. High-wattage power supplies are expensive, and expecting folks to save up for both an expensive GPU and a 200-plus-dollar power supply might just be too much to ask; once you add a 600-watt GPU to a couple hundred watts for the CPU and the rest of the system, plus some headroom for transient spikes, you're probably shopping for a 1,000-watt-class unit. Not to mention, companies that build pre-built PCs won't be too happy about having to spend extra money on nicer power supplies, which they've traditionally cheaped out on. And even though there are plenty of spacious, gaming-oriented cases on the market, an insanely high-wattage card means a super bulky cooling solution that would take up an unpalatable amount of space or spit out unacceptable amounts of heat. Not ideal if you're in a small, poorly ventilated room, like you are right now.

Hey, that's the end of this video, guys. Thanks for watching.