Excellent article, and especially great to see the breakdown of per-frame performance as that is crucial knowledge for this kind of setup.
I was wondering, though, why you didn't do dual-card runs to show a comparison with a real counterpart? The 690 and 7990 are dual cards, sure, but they're not really the competition for this; they're old hat. I was expecting 780 Ti SLI and R9 290X CrossFire to see how this card compares to its competition.
Although I wonder how the 780 Ti will fare with only a 3GB frame buffer; even basic FSAA at 4K would use a huge chunk of that.
This is a great card considering the performance; if I were in the business of heavy graphics I would go with two of these!
AMD scored high in every test, and I mean EVERY test, but at that price I won't be looking at it for a good 5-6 years. So good guy AMD, nice job, but I'm not going to buy.
Great review Hexus, love the detail around smoothness and not just speed.
Congrats to AMD on building a dual-GPU card that has none of the usual problems of a dual-GPU card (micro-stuttering, heat, noise) - a genuinely impressive and innovative piece of engineering. Never in a million years going to buy one though!!
"I want to be young and wild, then I want to be middle aged and rich, then I want to be old and annoy people by pretending that I'm deaf..."
my Hexus.Trust
TBH, I don't blame them; I've never seen any product other than computer equipment where such large cables are required for the amount of power (I know it's LV, high current, but even so...), considering I'm well within spec wiring my multiple-kW oven with 6mm^2 cable while PSUs typically use 18 AWG (IIRC that's about 0.85mm^2). 2x 8-pin is 10 ground wires and 6 12V wires, giving 5.1mm^2 of current-sourcing wire and 8.5mm^2 of current-sinking wire.
If you then add the fact that using multiple wires increases the surface area allowing better heat dissipation and thus increased current carrying capacity, it seems to me like AMD were fairly logical about it.
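If anyone wants to sanity-check those numbers, here's a quick back-of-the-envelope in Python. The 0.85mm^2-per-conductor figure is the one quoted above; the 7kW/230V oven is just an illustrative assumption (the post only says "multiple kW"), and I'm using the in-spec 2x150W share for the cables rather than whatever the card actually draws:

[CODE]
# Rough comparison of current density in the card's 12V feed vs. the oven-cable example.
WIRE_MM2 = 0.85                 # per-conductor cross-section quoted above (18 AWG, approx)
PINS_12V = 6                    # 2x 8-pin PCI-E = 6 x 12V conductors
CONNECTOR_POWER_W = 300         # 2 x 150W, the in-spec share carried by the cables

gpu_amps = CONNECTOR_POWER_W / 12.0
gpu_density = gpu_amps / (PINS_12V * WIRE_MM2)
print(f"GPU 12V feed: {gpu_amps:.0f} A over {PINS_12V * WIRE_MM2:.1f} mm^2 -> {gpu_density:.1f} A/mm^2")

# Assumed oven for comparison: 7 kW at 230 V over the 6 mm^2 cable mentioned above
oven_amps = 7000 / 230.0
oven_density = oven_amps / 6.0
print(f"Oven: {oven_amps:.0f} A over 6.0 mm^2 -> {oven_density:.1f} A/mm^2")
[/CODE]

That works out to roughly 25 A at ~4.9 A/mm^2 for the card versus ~30 A at ~5.1 A/mm^2 for the oven, so at the in-spec load the cabling is in the same ballpark as a heavily loaded mains circuit, which is roughly the point being made.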
Thinking about it, the number of cables required seems even stranger when you think about the PCB. There's no way the power traces have the same cross-sectional area as the wires carrying the power to the card. The PCI-E spec seems quite strange on power requirements to me, but I'd imagine it's so the cheapest PSUs with dodgy cables can still provide what they're supposed to.
I wonder what the implications are for budget PSUs though - i.e. the ones where they're only barely meeting the spec.
I appreciate that no-one will be buying a card this expensive and pairing it with a £40 1kW PSU, but you have to assume that more and more cards will start to use more than 375W as time goes on.
That's fine where they give the actual power requirements - we can spec accordingly and rightly call out bad supplies that don't make it. It's where they spec 3x the requirement 'just in case' that makes it confusing for all. Can you imagine the AMD/nVidia recommended system PSU if they followed their usual pattern of recommending something like a 600W supply for a small, efficient card?
Just think, guys - give it 2 years, and there'll be a single-chip card running at this spec for £500. It's how it always is with these crazy "halo" GPUs; this is the tech demo for the future. And, looking at the results, I can't wait for it to be now.
When you consider the cost of the Titan, which isn't a patch on this, this isn't that expensive for a very high-end card in comparison (yes, I know it's out of most people's pockets, mine especially). Seems the Titan isn't the best card for gaming either.
Well, of course, but that's nothing new! My point is that over-speccing can mitigate against low-quality components that can't actually achieve the necessary sustained power requirements, but will it be sufficient to mitigate against cables that can't take the required current?
I'm sceptical on that latter point. Are low-spec PSU manufacturers, even at the 1kW bracket, going to put in wires of sufficient gauge to carry double the standard current?
I think it might be time for AMD and nVidia to maybe think about reducing the wattage of their cards.
The power and cooling requirements are getting kinda ridiculous really.
Even sticking to the PCI-E spec, 375W for a single component? Even the top line of enthusiast-grade CPUs don't use more than about 125W, and the gamer-grade ones are generally under 80W.
They should have fitted this with 3x 8-pin connectors. Strangely enough, it seems the card only pulls 26W from the PCI-E slot according to computerbase (http://www.computerbase.de/2014-04/a...chmark-test/8/ - the first time I've seen actual card consumption measured rather than total system draw, AFAIK).
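For reference, the in-spec budget arithmetic is just 75W from the slot plus 75W per 6-pin and 150W per 8-pin connector; a trivial sketch:

[CODE]
# PCI-E spec power limits per source
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def budget(six_pin=0, eight_pin=0):
    """Maximum in-spec board power for a given connector loadout."""
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(budget(eight_pin=2))   # 375 W - what the card actually ships with
print(budget(eight_pin=3))   # 525 W - what 3x 8-pin would allow in-spec
[/CODE]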
According to W1zzard (http://www.techpowerup.com/reviews/A...295_X2/28.html), the 3D clocks are running at 1.15V, so it doesn't look like AMD did any binning. I can't understand why AMD do so little binning whereas Nvidia seem to bin like crazy. Of course, often that is to our advantage (no way Nvidia would ship an R9 290 which could be unlocked to an R9 290X, for instance), but for halo products like this with a very limited run, I don't see why they couldn't have binned the chips and saved 30-50W or whatever.
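For a rough feel of what binning might be worth: assuming dynamic power scales roughly with V^2 at fixed clocks (a rule of thumb - leakage doesn't follow it) and taking an illustrative ~500W board power at the stock 1.15V (an assumption, not a figure from the review):

[CODE]
STOCK_V = 1.15        # stock 3D voltage per the techpowerup link above
BOARD_W = 500.0       # illustrative board power assumption, not from the review

for target_v in (1.10, 1.05):
    scaled = BOARD_W * (target_v / STOCK_V) ** 2
    print(f"{target_v:.2f} V: ~{scaled:.0f} W (saves ~{BOARD_W - scaled:.0f} W)")
# 1.10 V: ~457 W (saves ~43 W)
# 1.05 V: ~417 W (saves ~83 W)
[/CODE]

That lands in the same 30-50W-plus region mentioned above, so binned silicon at a slightly lower voltage doesn't look unreasonable for a limited-run halo part.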