Yes
No
Waiting for ATi
Will wait for price drop
Maybe I'll upgrade to a 9800 after they have all launched their new cards and the price of the older 9800 has dropped. But for now, my Ti4200 is fine.
No.
http://www.hardocp.com/article.html?art=NjA2LDQ=
What we were most surprised by is the FarCry performance. We simply expected better considering this is a 16 pipeline card that is able to execute 2 shader instructions per cycle and is supposed to have increased shader performance. It was disappointing that the 6800Ultra was only able to go up in quality by one AA level over the 9800XT in this game. We were hoping to see a larger improvement in performance over the 9800XT, but in FarCry this just wasn’t the case.
Like for like, it's 25% faster than the 9800XT in FarCry, which is good, but it's not that good IMO. I can't help but feel a little disappointed; I was hoping to see a 50%+ improvement. I'll wait for ATI's 16-pipeline part, the X800 XT.
No chance at those prices.
Buying a top-of-the-range card is nearly pointless. I've just ordered a 9600 Pro - and I bet it'll last as long as its 9800 brothers and sisters. Sure, it won't be quite as fast, but technologically they are similar, and so as new D3D revisions are made they'll all start to choke.
Still, I guess if you have the money there's a lot of eye candy to be had.
Also, people should wait to see reviews of retail boards before making a final decision.
Originally Posted by iMc
M8 - this card isn't very good at LOW resolutions; it slaughters everything there is @ 1600*1200!
I will be getting one but won't be paying
BTW - that review was probably one of Hexus's weaker ones; it only covered 1024*768 and totally skipped the fact that the card is amazing at high resolutions (1600*1200).
G4 PowerMac - Tiger 10.4 - 512MB RAM
MacBook - 2GHz - 1GB RAM - 120GB HDD
Rotel RC970BX | DBX DriveRack | 2x Rotel RB850
B&W DM640i | Velodyne 1512
Originally Posted by kez
I totally agree with that, and I'm still more than happy with my 9500 Pro.

Too much disposable income, I s'pose.
Having looked at a few more reviews... I was a bit shocked to find that it didn't really show a very big improvement over the 9800XT. Like some people above, I really expected it to absolutely blow away the now-older generation of cards - and the fact that it actually gets beaten by the 9800XT says a lot.
Not too keen on spending over £150 really... so come September I'll have to look at the all-round prices and maybe pick up a second-hand R9800 Pro...
| XP1600-M | ASUS A7N8X Deluxe | R9700 Pro | 2x512MB PC3700 |
Originally Posted by blockers
"Compared to the GeForce 6800 Ultra, the former high-end models Radeon 9800XT and FX 5950 Ultra often seem like nothing more than cheap mainstream cards..."

We must be reading different reviews, mate.
And remember, this is a very early look at the card, with driver improvements aplenty to come.
Waiting for the ATi *RESULTS*
I'm not doubting ATI will produce a strong reply, but I must say that, unlike with the 5900, I am very impressed with the 6800 results.
3D Mark 2k1 - 20661
If you get a customer, or an employee, who thinks he's Charles Bronson, take the butt of your gun and smash their nose in.
Sorry about talking a load of tosh earlier!
I think that for what you're getting, though, there isn't enough improvement for the money.
Having said that, I'll snap one up if anyone wants to get me one. And a new PSU :s
HEXUS|iMc
I will save up and get one, seeing as they are loads faster than my ageing FX5800 Ultra.
The figures look great - I still love my CRTs (won't be parted from 1600x1200) - so the high-resolution numbers are exactly what I care about.
What's worrying me is still the Nvidia driver 'optimisations'. In FiringSquad's review they mention that they still couldn't turn off the 'brilinear' filtering in UT2004 - the sky wouldn't even render in their test level.
The 480W figure makes me wonder. Nvidia apparently says it's the 12V line that the cards need. My old, stuck-in-a-cupboard, unused Qtec 550W provides 14A on the 12V rail, while my 480W TrueBlue provides 22A - so what do they mean? Are they just saying 480W to cover the Qtecs (who quote 'peak' usage, not max), or do they mean a real 480W supply?
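A quick back-of-the-envelope comparison of those two rails (only the 14A and 22A figures come from this post; the assumed system draw on the 12V line is a made-up number for illustration):

```python
# Compare PSUs by 12V-rail capability rather than the wattage label.

psus = {
    "Qtec 550W": 14.0,      # amps on the 12V rail (figure from the post)
    "TrueBlue 480W": 22.0,  # amps on the 12V rail (figure from the post)
}

# Hypothetical combined 12V draw (card + drives + fans) - an assumption
# for illustration, not a measured value.
system_12v_draw_a = 16.0

for name, amps in psus.items():
    watts = amps * 12.0  # usable power on the 12V rail
    verdict = "fine" if amps >= system_12v_draw_a else "marginal"
    print(f"{name}: {amps:.0f}A x 12V = {watts:.0f}W -> {verdict}")

# The '550W' unit offers only 168W on 12V, the '480W' unit 264W -
# the bigger label is the weaker supply where it actually matters.
```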
Now go away before I taunt you a second time.
I'd say they are just covering their ass - even a cheap 480W should be able to power it, and a quality 360W probably won't have any difficulties either.
They state such a large PSU because smaller ones would have the OCP (over-current protection) trip when the power comes on: the start-up current can be a lot more than the normal running current. This happens with Shuttle PSUs and 9800 Pro cards for some people - they have to get around it by delaying the card's power by two seconds, either manually with the power plug or with a timing circuit (which is on the bit-tech forums if anyone needs it).
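A toy model of that trip-at-power-on behaviour (every number here is invented for illustration; only the delay-the-card trick comes from the paragraph above):

```python
# Why staggering start-up helps: inrush currents add up, so powering
# everything at once can trip an OCP limit that steady-state running
# never approaches.

ocp_limit_a = 18.0  # hypothetical 12V over-current threshold

# (inrush_amps, steady_amps) - illustrative guesses, not measurements
loads = {
    "board+cpu": (10.0, 7.0),
    "drives":    (6.0, 2.0),
    "gfx card":  (8.0, 5.0),
}

all_at_once = sum(inrush for inrush, _ in loads.values())

# Delay the card ~2s: its inrush now lands after everything else has
# settled down to steady-state draw.
others_steady = sum(s for _, s in loads.values()) - loads["gfx card"][1]
staggered = max(all_at_once - loads["gfx card"][0],
                others_steady + loads["gfx card"][0])

print(f"all at once: {all_at_once:.0f}A vs {ocp_limit_a:.0f}A limit")  # 24A -> trips
print(f"staggered:   {staggered:.0f}A vs {ocp_limit_a:.0f}A limit")    # 17A -> fine
```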
The two molexes, however, are pointless: there is no way the card draws more power than the cable is rated for. The only good reason I can think of is that the card's PCB cannot handle that much current if only one connector is used - in which case there are better ways than using two molexes, and I expect a few people will get out Mr. Soldering Iron and join the two together so only one molex is used.
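A rough sanity check on that claim (the card's total draw, the AGP slot's contribution and the per-wire rating are all assumptions for illustration, not spec figures):

```python
# Could one molex carry the card's external power on its own?

card_total_w = 110.0  # assumed total draw for a top-end card of the era
agp_slot_w   = 40.0   # assumed power delivered through the AGP slot
molex_safe_a = 8.0    # assumed safe continuous current per 12V wire

external_w  = card_total_w - agp_slot_w  # what must come via molex
amps_at_12v = external_w / 12.0          # worst case: all of it at 12V

print(f"~{external_w:.0f}W via molex = ~{amps_at_12v:.1f}A at 12V "
      f"(assumed safe limit {molex_safe_a:.0f}A)")
# ~70W -> ~5.8A, comfortably under the assumed limit - consistent with
# the second connector being about PCB current distribution (or simple
# belt-and-braces) rather than cable capacity.
```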
I think people are expecting far too much from this card (I blame Nvidia's marketing department - they are evil, and the reason I hate Nvidia).
The core speed is ONLY 400MHz, and it's made on a 0.13-micron low-k process, so it WILL clock up to about 750MHz+ depending on the PCB design - which is what I've found to limit my 9700np to 450MHz. I don't expect Nvidia to redesign their card any time soon; instead I think they will gradually raise the clock speeds until it hits the point where yields get so low it's uneconomical with that design (probably about 600-650MHz).
The card is overpriced for what it is: just another speed bump, with no noticeably good new features - just speed, the result of the new features. I expected them to fix their IQ problems (their AA and AF are _ _ _ _ _ compared to ATI's). Also, they should remove the trilinear optimisations (which are not worth the performance gain compared to the image-quality sacrifice) and remove the cheating crap from their drivers...
14k in 3DMark03 is, however, very impressive, considering it's not THAT much better in the other tests.
The R420 will no doubt be much faster...
Last edited by SilentDeath; 15-04-2004 at 01:43 AM.