i pay good money to be "professionally excited"
Originally Posted by Ephesians
Wow, that was a bit of a disappointment, but then I suppose I was expecting a huge leap in performance rather than the minor improvement over a 9800 GX2.
Let's hope ATi/AMD can shine brightly at this particular moment, just to give NV a kick up the backside.
Wait until they make a dual gpu version of this... then you will see the jump
The reason why they haven't released a dual gpu version already is a business reason; it won't maximise expected profit...
we can only pray and hope
I feel most are upset with the price; I think this card is very good as a single-card solution.
Decidedly underwhelming from my point of view - I had high hopes for this card.
It gives the 9800GTX a thorough spanking but then that was a load of rubbish anyway...
The only significant upside is that the idle power draw is impressive, and let's face it that's what really matters in a flagship high-performance card, right??
That is the main area where it has an advantage over the GX2; however, I still don't believe it warrants the insane price premium.
Remember though, these are initial sale prices, where (as has been mentioned) all the enthusiasts jump on the bandwagon and buy it to be the first ones with the new tech. Surely they will fall by 40-60 in a month or so?
I reckon the GTX260 is the way forward. Good performer, fairly low power, and it seems to sit in a much more sensible price bracket, although we will have to wait and see.
Well the bang per buck graph at the end makes me glad I have a 3870...
I'd be interested in seeing the minimum framerate figures; it seems like more than just average framerate is needed to differentiate cards that appear to perform similarly.
the electricity savings would take quite a few years to offset that price premium, so it's not really worth it. unless you're gonna use that card for like 10 years..
This'll have to drop £150, and the GTX260 £100 for them to be big sellers.
This could be NVidia's Pentium 4. Do AMD (ATI) have a chip that can take advantage of NVidia's mistake (sticking at 65nm) like they did with Intel? It would be nice if they did, because it would force NVidia to do what they should have already done.