Put it this way: Custom PC had a round-up of graphics cards a month or two ago, and at the end of the group test they had a bar graph listing the power consumption for each system. Note that I said system and not graphics card. From what I can gather they had one computer comprising a Core 2 X6800 @ 3.19GHz, 2GB of PC2-8000 RAM and an Intel D975XBX2 motherboard. The only difference between systems was the graphics card, which I assume they just swapped out.
They found that while running 3DMark06 the Radeon 2900XT system consumed 355 watts at peak and the 8800 Ultra system consumed 312 watts. With the Radeon installed, the system was the most power-hungry of the lot. AMD state on their website that the 2900 series requires a 550W PSU. What is strange, though, is that although the 520W Corsair is 30 watts short of the recommended wattage for a 2900 card, it is still certified for use with one.
Nvidia state that a single 8600GT should have a 350W PSU or better, yet in the same test system as above, the 8600GT configuration consumed only 180 watts.
On this basis we are buying power supplies which provide more power than many of us require, and in general spending more money on larger-capacity units. Having said that, I guess they "recommend" PSUs with a much higher output than is actually needed because they have to cover their asses (which is fair enough) and factor in cheap or ageing power supplies, which lose their ability to deliver a reliable level of output as they get older. It's also worth remembering that the wattage measured at the wall includes the PSU's own inefficiency, so the DC load the PSU actually has to deliver is lower than those figures.
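Just to illustrate the point, here's a rough back-of-the-envelope sketch using the wall-draw figures from the group test. The 80% efficiency figure is my assumption (typical for PSUs of the era), not something from the article:

```python
def dc_load(wall_watts, efficiency=0.80):
    """Estimate the DC load the PSU delivers from measured wall draw.

    Wall-socket draw includes the PSU's own conversion losses, so the
    load the PSU actually supplies is wall draw times efficiency.
    The 80% efficiency here is an assumption, not a measured figure.
    """
    return wall_watts * efficiency

# Peak wall-draw figures from the Custom PC group test
systems = {"Radeon 2900XT": 355, "8800 Ultra": 312, "8600GT": 180}

psu_capacity = 520  # the Corsair unit mentioned above

for card, wall in systems.items():
    load = dc_load(wall)
    headroom = psu_capacity - load
    print(f"{card}: ~{load:.0f}W DC load, "
          f"~{headroom:.0f}W headroom on a {psu_capacity}W PSU")
```

Even the 2900XT system works out to roughly 284W of actual DC load on those assumptions, which goes some way to explaining why the 520W Corsair gets certified despite being "30 watts short".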
Despite all this, when it comes to the power supply I would rather stick to what is recommended and play it safe, so to speak, and have some headroom, rather than rely on something presented in a magazine, no matter how much I enjoy and trust the mag, and even if what they show is correct and proper.