AMD unleashes high-end Cayman on NVIDIA's Fermi. Find out who wins in this in-depth review.
Something really wants me to say "First!", since I've never done it before.
But on a more relevant note, I'm disappointed that they couldn't push something better out. I was hoping there'd be a decent contender for the GTX580 to push prices down, since I'm in the market, but it doesn't look that way.
The only question now is what to go for ¬.¬
The fact that TSMC cancelled the 32nm process meant AMD had to re-use the existing 40nm process and scale back the very ambitious plans for this GPU, which has obviously had a major effect.
Will be interested to see what the proper Northern Islands chips can do in 2011, when they move to the 28nm process at GlobalFoundries. They should end up cheaper as well, what with AMD owning half of GF, and TSMC allegedly giving sweeter deals to NVIDIA on wafers.
Originally Posted by Winston Churchill
The 6970 has a lower Bang4Watt than the year-old 5870? With only a 10-20% increase in power? What happened? The shader count makes no sense either; I was very much expecting 3,200 shaders:
3870 (320 shaders) -> 4870 (800 shaders) -> 5870 (1600 shaders) -> 6970 (1536 shaders)?
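To put that progression in numbers - a quick Python sketch using only the shader counts listed above, so treat it as illustrative arithmetic and nothing more:

cards = [("3870", 320), ("4870", 800), ("5870", 1600), ("6970", 1536)]
for (prev_name, prev_count), (name, count) in zip(cards, cards[1:]):
    # Generation-over-generation scaling factor in shader count
    print(f"{prev_name} -> {name}: x{count / prev_count:.2f}")
# 3870 -> 4870: x2.50
# 4870 -> 5870: x2.00
# 5870 -> 6970: x0.96

Two straight doublings (or better), then a slight step backwards.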
Very disappointing and very underwhelming on both the green and the red teams.
Why base the 5870 pricing for comparison at £235? It's been available to buy for £195 from Ebuyer for weeks now, and they STILL have 115 or so in stock, so it's not as if the price is based on a handful of units - it's readily available at that price, which makes it more attractive than the 6950 if you ask me.
Not that I blame you (ridiculous naming schemes and all), but there are a couple of instances on the final page where you refer to the 6850/6870 when (I think!) you mean the 6950/6970.
Hopefully stocks will be good and I can pick up a 570 or 6950 in the new year
So in summary, at common resolutions of 1920x1200 or below, the 6970 is within +/-10% of the GTX 570 depending on the game. The large variation makes deciding which card to go for a headache.
Will you be doing a CrossFire review? Other sites are showing the 6950 scaling pretty well, with a 75-100% increase at common resolutions.
Error found in review btw:
http://www.hexus.net/content/item.ph...=27983&page=16 - the 6950 compared to the GTX 570 should be -16.6%, not +16.6%
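For anyone puzzling over the sign: these deltas are relative to the baseline card, so a slower card should read negative. A quick Python sketch with made-up FPS figures - the numbers here are hypothetical, not taken from the review:

def rel_perf(card_fps, baseline_fps):
    # Percent difference of one card relative to a baseline
    return (card_fps - baseline_fps) / baseline_fps * 100

gtx570_fps = 60.0   # hypothetical
hd6950_fps = 50.0   # hypothetical, the slower card
print(f"6950 vs GTX 570: {rel_perf(hd6950_fps, gtx570_fps):+.1f}%")   # -16.7%

# The CrossFire scaling mentioned above works the same way, dual vs single card:
single_fps, crossfire_fps = 50.0, 92.0   # hypothetical
print(f"CrossFire scaling: {rel_perf(crossfire_fps, single_fps):+.0f}%")   # +84%

The minus sign is the whole point: it tells you which card is trailing.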
Yup, CrossFire examination on its way
I was hoping the 6950 was going to be stiff competition for the 570 so that the 570 would come down in price...
Based on this review it isn't, and the extra for the 570 is well worth it.
Still, there's no massive rush yet, as there aren't any games my GTX 280 can't handle - at least none where I'd see any difference.
The thing is that the HD6950 2GB is most likely going to compete against the GTX560. It would not surprise me if it drops under £200.
Even if the GTX560 is based on a GF104 with all its shaders enabled, I suspect the HD6950 would still be faster in most cases.
On top of this, the GTX570 is still a 520mm² part with 3 billion transistors, whereas the HD6970 is a 2.64-billion-transistor part measuring 389mm².
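Taking those figures at face value, the density gap is easy to see. A rough Python sketch - the inputs are just the numbers quoted above, so treat the output as ballpark arithmetic:

chips = {
    "GTX 570 (GF110)": (3.00e9, 520),   # transistors, die area in mm^2
    "HD 6970 (Cayman)": (2.64e9, 389),
}
for name, (transistors, area_mm2) in chips.items():
    # Millions of transistors per square millimetre of die
    print(f"{name}: {transistors / area_mm2 / 1e6:.1f}M transistors/mm^2")
# GTX 570 (GF110): 5.8M transistors/mm^2
# HD 6970 (Cayman): 6.8M transistors/mm^2

On those numbers Cayman packs its transistors noticeably tighter, which is part of why the smaller die should be cheaper to make.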
Originally Posted by the HEXUS review
We like to benchmark 'out-of-the-box' performance with driver-default settings, but there's been plenty of hoo-hah over AMD reducing driver-default image quality for post-Catalyst 10.9 drivers. The situation is documented here. Bearing this in mind, we've manually adjusted the image-quality settings for the Radeon HD 6950 - disabled Catalyst A.I. and slid filtering quality to High Quality - and shown it in all the graphs, identified by the IQ suffix. This, we hope, provides an apples-to-apples comparison to NVIDIA's default image quality.

Didn't bother reading the rest of it after that. Sorry to say that Hexus will be getting removed from my review reading list. How much more are you going to let nVidia push you around on reviews?
FYI.
NONE of nVidia's IQ settings change a damn thing.
http://forum.beyond3d.com/showthread.php?t=59133
The game that started all this - Trackmania - has turned out to have worse IQ on nVidia cards:
http://forums.guru3d.com/showpost.ph...&postcount=197
Shameful. What next? How much further are you going to bend over for this lot?
Erm, they're not? The highlighted bar throughout the review is the one with AMD's default settings; the one with the increased IQ is an additional bar which, to my mind, shows a site that's willing to go above and beyond in providing detailed analysis of the cards it reviews. If Hexus hadn't benchmarked the 6950 with both sets of settings, people would accuse them of AMD bias. The review also comments at least once on how little difference the increased IQ settings make to the card's performance. And Hexus has published several articles explicitly stating that they've compared the two sets of IQ settings extensively and can't see any difference themselves. So let's have a little credit for the extra effort of running a card through the full set of benchmarks twice, huh?
As to my own point: were people really expecting the 6970 to challenge the GTX580 for single-GPU supremacy? When there was a rumour of 1,920 shaders it seemed possible, but once it was pretty much confirmed to be a 1,536-shader card, I think any chance of it putting the GTX580's nose out of joint was consigned to the dustbin. This is clearly a pragmatic card: it may have had a month or so's delay due to manufacturing issues, but that's nothing compared to the delays NV had with the massive Fermi dies. AMD very sensibly architected a card that would give a performance boost over the incumbent cards at the same launch prices. I'm not sure why we keep expecting miracles with each new launch: hardware development goes in cycles, and we had a huge leap in bang-for-buck from ATI just a couple of years ago with the 4 series - it'll be a while before we get another leap like that...
Er... you should have just read the rest of the review. The vast majority of the results are done with default settings (as ATI intended), while they have also thrown in some 'IQ'-setting results to show the effect of moving the slider on the 6950. If anything this should annoy nVidia more, as the default results use a (slightly) lower-quality setting.
The changes, along with statements from relevant parties, are already discussed in the article linked to from the part you quoted.
OK, I was overly hasty, and obviously Tarinder has tried to reach a reasonable compromise by only doing it for the 6950, but that isn't really the point. What about the next review? All HQ settings or not? Does this just become another unfair advantage that nVidia has done NOTHING to deserve?
Don't assume that just because it's only 1-5fps being lost here, it will be that small in all cases. nVidia is absolutely insidious and will use everything it can to get an unfair advantage.
This is just the latest in a long line. So yeah, I understand it isn't easy with the pressure they put on, but if it isn't stopped now it's just going to get worse.