Read more.
Quote:
Industry sources claim falling orders for team red.
It looks like a one-month bump at the time the 970/980 launch was in full swing... I wouldn't draw too much from it yet.
I will start to worry if the new cards do require an LCS though... especially in light of the TDPs for the 900 series.
I do wonder if some of it is an overall AMD image issue. Their shying away from the desktop market may have a larger effect than I expected. Seeing more news of upcoming Kaveri refreshes, but no news of a replacement for the FX line, doesn't bode well in that area either.
Seriously, why is such a big deal being made about this? For comparison, the 290X has a listed 290W TDP and does perfectly well on air cooling. Also worth considering is the fact that AMD's values are essentially never-exceed values as far as gaming is concerned, whereas Nvidia's values have become something more akin to SDP on recent cards; this can be seen even across Nvidia's own generations, where a negligible real-world power saving is accompanied by a massive spec TDP drop. Hence the two TDPs are not comparable, even though a TDP figure is all you have to go on anyway.
Quote:
The rumour mill also suggests a toasty 300W TDP that will, most likely, require a liquid cooling solution as standard.
Also, the whole 300W thing came from a profile page talking about some high-level design aspects - that in no way translates to "I designed the 380X, which draws 300W". Rounding up, most current high-end GPUs fall into the 300W class.
Yikes, 300W TDP... not something likely to fit in with small form factors or quiet computing. The rest of the industry seems to be moving beyond "turn it up to 11, sod the TDP" as a way to improve performance; it seems AMD's graphics wing needs to do the same.
I will say that, among my limited group of PC-building friends, I know many who don't even look at AMD.
Personally I have usually gone for the best value, which has led me to AMD cards for quite a while.
If I were buying right now it would be a 970, but personally I don't see it as a big enough jump from my 7950.
So, in short, I would guess there is a large number of Nvidia fans, a smaller number of AMD fans, and a good number in the middle like myself.
With that said, it would be a huge shame to lose either from the market; one big player would be a disaster.
GTX 970. 145W TDP apparently: http://www.techpowerup.com/reviews/M...Gaming/25.html
Considering you can't buy reference cards, look at the MSI numbers (though even the reference card far exceeds its TDP):
168W average gaming
192W max gaming
213W max Furmark
Yeah. 145W. :rolleyes:
Also see the 290X, which has a listed 290W TDP:
258W average gaming
294W max gaming
328W peak Furmark
300W AMD TDP != 300W Nvidia TDP. And just to reiterate, the 290/290X are already nearly 300W TDP and are fine.
Add about a third to an Nvidia TDP to get what AMD would call TDP.
Edit: Using some back-of-envelope maths, a 300W AMD TDP would be roughly a 225W Nvidia TDP, for what it's worth. Which is incidentally lower than the 250W rumoured TDP of the GM200 GPU. Because we know how accurate these rumours must be, so that must also be true!
192 → 145 is a ~25% decrease, and 300 - 25% = 225.
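For anyone who wants to sanity-check that, here is the same back-of-envelope arithmetic as a quick Python sketch. The inputs are the TechPowerUp measurements quoted above; the ratios are rough illustrations of each vendor's TDP convention, not anything official.
Code:
# Back-of-envelope conversion between AMD-style and Nvidia-style TDPs,
# using the TechPowerUp measurements quoted in this post.

GTX970_SPEC = 145.0    # W, Nvidia spec TDP
GTX970_GAMING = 192.0  # W, measured max gaming draw (MSI card)

R290X_SPEC = 290.0     # W, AMD spec TDP
R290X_GAMING = 294.0   # W, measured max gaming draw

nvidia_ratio = GTX970_GAMING / GTX970_SPEC  # ~1.32: spec sits well below real draw
amd_ratio = R290X_GAMING / R290X_SPEC       # ~1.01: spec is close to never-exceed

# Express the rumoured 300W AMD-style TDP in Nvidia-style terms:
rumoured = 300.0
print(round(rumoured * amd_ratio / nvidia_ratio))  # ~230W, near the ~225W above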
The appearance and packaging of a graphics card seem important to buyers (or else graphics card makers are wasting a fortune on fan shrouds, LEDs and too many pins on power connectors). That being the case, low-cost liquid cooling could be a marketing feature, not a problem. How many AIO CPU watercoolers are actually necessary?
The '290W' Hawaii does fine on air, and I doubt 10W extra, even if true, would change that even slightly. I agree it might appear that way to some people, especially if the rumour mill picks it up as a negative somehow. And there's no defending the stock Hawaii cooler - it was poor - but if all AMD wanted was a reasonably cooled card in future, all they would need to do is allow custom coolers from launch rather than forcing a poor stock cooler.
This appeared at the right time for Nvidia; I hope it's just an accident, like the GTX 970 ROP/L2 counts!
The problem is, no matter how you dress it up, there is still over 100W of difference for a similarly performing GPU, and that is unacceptable for most people. Why buy something that puts out more heat and uses a lot more power? In an age when people are more eco-aware, AMD's approach is extremely wasteful and shouldn't be defended. It is AMD's own denial, letting things get ever worse in this respect, that has led them down this path in the first place. It annoys me, since I'd like to buy an AMD card when I next upgrade to support them (I'm currently an HD 5850 owner), but I'm not going to do so blindly with those kinds of figures.
People were quite happy buying Nvidia cards when they were less efficient, and it's funny how most systems are lucky to see 400W even at the wall.
Do they have working Linux drivers yet?
No?
Well then. Green please.
I was comparing the way in which AMD and Nvidia measure TDP, showing how the rampant hysteria over the 300W rumour (and let's just reinforce the fact that it is a rumour) is silly. It's also worth remembering that while power draw is one component of efficiency, performance is the other, and hence it's remarkably short-sighted to dismiss a card as inefficient before even having performance numbers to hand.
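To make that concrete with some purely invented numbers: efficiency is performance per watt, so the card drawing more power can still be the more efficient one. Neither card below matches any real product.
Code:
# Efficiency = performance / power. Both figures are invented
# purely to illustrate the point; neither matches any real card.
cards = {"Card A": (60, 180),   # fps, watts: lower draw
         "Card B": (95, 250)}   # fps, watts: higher draw, faster

for name, (fps, watts) in cards.items():
    print(f"{name}: {fps / watts:.3f} fps per watt")
# Card B pulls 70W more yet manages 0.380 fps/W vs Card A's 0.333 fps/W,
# so the higher-power card is the more efficient of the two.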
Maxwell is a far newer GPU architecture than the current iteration of GCN, so it would be a surprise were it not more efficient. Further to what CAT said, AMD have frequently had the more efficient cards (and have historically nearly always been first onto a new manufacturing process, often by a considerable margin), and yet I don't remember the same slating of Nvidia cards in those days; power measurements, if present at all, were a barely-mentioned footnote in reviews. Progress has been slower over the past few years, with things like lithography challenges slowing efficiency improvements, but despite it being a long cycle, it's not all that different to the leapfrogging we're used to in the GPU market.
E.g. the 580 was Nvidia's flagship when the 7970 was released: http://www.techpowerup.com/reviews/AMD/HD_7970/29.html
Heck, the 5870, on the same 40nm node as the 580, was already far more efficient despite being out long before the 580.
I don't see how it can be classed as defending some "wasteful approach": this architecture was designed long before Maxwell and was fine in terms of efficiency at its debut - far more efficient than Nvidia's incumbent flagship, and really quite close to what Nvidia released in response. OTOH, if the 290X had been released this week as a response to Maxwell, then it would be a fair criticism. But it wasn't. So it's not.
I wonder if those figures include laptops, where Nvidia has traditionally been well ahead, and whether the desktop/laptop ownership ratio has changed at all over the years (mine has had to, due to mobility demands).