An acquaintance is looking at upgrading from his Radeon 9000. I advised him towards the 9600 Pro, with a view to just overclocking that, but another guy's telling him to go for the FX5700 Ultra. Who's right?
Why not advise him to get the AIW 9800SE, which is as fast as a 9600 Pro, has the All-In-Wonder features, and has a chance of turning into a full 9800?
I suppose it depends on what he wants. I would get a 9600 Pro, as ATI cards seem to work better with DirectX 9 titles from what I have read.
I can't remember where, but I've read a review of the 9600XT vs the 5700 Ultra, and the 9600XT wins hands down. I think I just googled for the 5700 Ultra and a review came up.
As Loyal said, go for the 9800SE AIW, as it's as fast as the 9600XT and you have the chance of softmodding it to a 9800 Pro AIW.
Better bang for buck, in my opinion!
PC World are doing a Leadtek FX5900 for £189 atm. I'd suggest going for one of those, because that's what I'm after!
Note the country. Importing is too difficult (Online dealers are usually unwilling to use normal postage on international orders, if they support international at all, and then, on top of that, there's the 20% import duty), so it really isn't worth it.
Well, the 9600XT supposedly isn't available ANYWHERE. The 5700U isn't bad, but you can find 5900s (non-Ultra) for the same price in a lot of places. I like eVGA's 5900s, which are clocked at 400/850 stock.
Originally posted by eldren
Note the country. Importing is too difficult (Online dealers are usually unwilling to use normal postage on international orders, if they support international at all, and then, on top of that, there's the 20% import duty), so it really isn't worth it.
Never noticed that.
Anyway, from what I've heard the 5700U ain't all that, so I'd go for the 9600XT.
If you look on the net, you should be able to get a 9600 Pro for about £120.
Early indications are that the FX5700 Ultra is faster than the 9600XT (which is little more than a 9600 Pro with a slightly overclocked core). However, the 5700U is still based on the flawed FX architecture, and that may hurt you, especially in the future, as nVidia and game developers aren't always going to modify code and drivers for FX owners. Not only that, but the FX5700U uses a lot more power, is bigger, produces a lot more heat and won't overclock as well as the 9600s are known to. As also said, price is a big factor here: the 5700U is around 30%+ more than a 9600 Pro, and you can often get a 5900 for around the same price, which is easily better.
So there is no right or wrong. From what we know, the FX5700U is slightly faster than the 9600XT (or a lightly overclocked 9600 Pro), but it costs more and has a few downsides. When you factor in prices, the 9600 Pro seems the clear choice, unless you can find a good deal on a 9500 Pro or any 9700.
Originally posted by Austin
... based on the flawed FX architecture ...
I've heard this a couple of times now. What exactly is wrong with it?
Put simply, the way in which it handles things (especially the all-important DX9 code) is simply poorer than on ATI's DX9 cards. The GF-FX seems to require a lot of special treatment from the game developer and nVidia in order to perform anywhere near its Radeon counterpart, which is generally cheaper, cooler running, less power hungry and doesn't require such fast clock speeds. The key parts behind DX9 are the programmable pixel and vertex shaders, and ATI's are in general simply more efficient. Not only that, but nVidia decided to give their hardware two DX9 modes using 64-bit or 128-bit precision, which in theory sounds good. However, ATI's 96-bit precision has turned out to be a great choice, as you get the same quality of output (indistinguishably so) as 128-bit but closer to the speed of 64-bit. nVidia need to keep chopping and changing within a game, but basically the 64-bit mode looks a little poor while the 128-bit mode is needlessly slow. ATI's DX9 cards have also been out longer, are more compatible and more mature. You can kind of see the problems nVidia must have had when you consider that the slowest performance card of their previous generation (GF4 Ti4200) can easily take on their current high-mid-range offering, the FX5600 Ultra; needless to say, it seems nVidia no longer produce the GF4 Ti. The FX5700 Ultra, which still isn't available, should address that though. That is pretty simplified but carries the principles.
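To get a feel for why the per-channel precision matters, here's a rough Python sketch. It uses the standard library's 16-bit and 32-bit IEEE float formats as stand-ins for the FX's low- and high-precision shader modes (the actual hardware formats differ, so treat this as an illustration only):

```python
import struct

def round_trip(fmt, x):
    """Round a Python float through a packed IEEE format:
    'e' = 16-bit half precision, 'f' = 32-bit single precision."""
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

value = 0.123456789              # some intermediate shader result
fp32 = round_trip('f', value)    # stand-in for 32-bit-per-channel mode
fp16 = round_trip('e', value)    # stand-in for 16-bit-per-channel mode

# The 16-bit format throws away far more of the value than 32-bit does,
# and a shader chains many of these operations, so the error compounds.
print(f"fp32 error: {abs(fp32 - value):.1e}")
print(f"fp16 error: {abs(fp16 - value):.1e}")
```

One rounding step like this is invisible on its own; it's the long chains of shader maths at low precision that produce the visible banding people complained about.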
The GF4 Ti4200 was the mid to mid-high range card in their last generation. Remember the GF4MXs? They were the slowest in the last generation.
ATI seems to have a more powerful pixel shader, especially at PS2.0.
Also, game developers like Carmack and Microsoft's DX team had been asking for 128-bit precision, which is probably overkill, but based on the way things have always doubled... 16, 32, 64, 128 was the presumable next step. ATI made a wise (maybe lucky) decision to go with 96-bit, which is perfectly adequate but offers much better performance because of the size (workload) increase as you go up in precision.
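Those 64/96/128-bit labels are just four colour channels (R, G, B, A) at 16, 24 or 32 bits each, so the per-pixel cost falls straight out of the arithmetic. A quick sketch (my own back-of-envelope numbers, not vendor figures):

```python
# Four colour channels per pixel: "64/96/128-bit" is just
# 4 x the bits-per-channel (16, 24 or 32).
pixel_bits = {bpc: 4 * bpc for bpc in (16, 24, 32)}
for bpc, total in pixel_bits.items():
    print(f"{bpc}-bit channels -> {total}-bit pixels ({total // 8} bytes)")

# ATI's 96-bit format moves less data per pixel than full 128-bit:
saving = 1 - pixel_bits[24] / pixel_bits[32]
print(f"96-bit vs 128-bit saving: {saving:.0%}")
# → 96-bit vs 128-bit saving: 25%
```

So every pixel shaded at 96-bit moves a quarter less data than at 128-bit, which is one simple reason the 24-bit-per-channel choice sits so comfortably between the two nVidia modes.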
But I do think that, with nVidia's relationships with game developers (and coding specifically for the FX), the FX will stay competitive (5700U or higher) until the next generation. Who knows what will happen then, but my guess is nVidia will take back the crown they held for a long time before the 9700 came out. Then it will be a nice and easy upgrade for me: update drivers and swap cards.
P.S. I think nVidia screwed up their naming this time. Last time, most people saw the clear difference between the Tis and MXs. This time, producing low-end cards like the FX5200 and FX5600 under the FX branding brings down the perception of their high-end cards (FX5900+), because people say the FX line sucks. But the FX5700 Ultra and higher are pretty good cards.
Last edited by chrisf6969; 10-11-2003 at 06:59 PM.
I know what you mean about the 4200 being the mid-range offering, but I can't really call a moderately enhanced GF2 (i.e. the GF4MX) their last generation, even if they did label it GF4. Really, the 'last generation' was the GF3, as the GF4 was simply an improved GF3, so I suppose a lot depends on what people class as a new generation? Anyway, it is true that the FX5700 Ultra and up are pretty good cards overall, but until nVidia can address the FX's design shortfalls the Radeons certainly remain the better choice. People always said 3dfx would regain the title, but they never did. However, I agree that nVidia will improve, though it will take time, as it seems they need to go back to the drawing board... and with how quickly things change in the graphics industry, we may see a surprise entry take the performance initiative...
My guess is the NV40 will be an 8x2 (up to 16, or a minimum of 8, texel fills per pass) architecture and possibly blow the doors off ATI, compared to the 4x2 of the current FX generation and the 8x1 of the current ATI cards.
The low-end GF4MX460 was comparable to the GF2 Ultra (and GF3 Ti200) in many cases, which is decent performance. And the 440/420 were more along the lines of a GF2 GTS.
P.S. I'm really getting sick of this "ATI vs. nVidia, which card is better"... it's so overdone. FOR EVERYONE OUT THERE: IT DEPENDS... ON THE GAMES YOU PLAY AND THE PRICE YOU'RE LOOKING TO PAY.
You really can't go wrong with anything higher than a 9500+ or a 5700+ from either camp. Well, except for the 9800SE; they kinda suck.
I'm surprised there hasn't been more FX vs P4 yapping lately.
ATI should be in a better position to release the next big step in graphics card performance. All they need to do is implement 0.13-micron on a 9800 Pro/XT and add either very fast DDR or some DDR-II, which should all be very easy for them. The DX9 Radeons generally use lower clocks and often give superior performance to their nVidia counterparts, again giving ATI the advantage. They also have the added bonuses of having had their hardware out for much longer than nVidia, and of not having to work very hard at all to get the cards to perform optimally. Who'll be first to an increase in pipes (etc.) is anyone's guess, but even if nVidia do that, they may find it fails to exceed a 9800 Pro/XT on 0.13-micron.
As for the GF4MX, they not only increased speed (with faster clocks) but also fixed a number of weaknesses the GF2 architecture had. Specifically, the GF4MX significantly improved '2D' image quality, dual display and hardware DVD playback, and standardised the implementation of TV-out. In terms of raw speed the GF4MX440 was equal to the GF2 Ultra, while the MX460 took it a notch further. The MX420 was actually comparable to a GF2 GTS! The MX460 was still quite a way behind a GF3 Ti200, and those, unlike the MX460, were VERY overclockable and sported MUCH better AA+AF, as well as DX8.
I understand what you mean about the endless debates over 'nVidia vs ATI' and 'AMD vs Intel', but there are plenty of fresh things to add much of the time. Of course you always get some mindless fanboys, but most of us are simply interested in weighing up each part's weak and strong points, and there's nothing wrong with that. In the GF3 vs Radeon 8500 days things were very close, and although they went about it in very different ways, it was easy to draw parallels, especially in pure performance. In the end the GF4 Ti sorted that one out; they were easily the best choice for a good while. Things are now much more complex, however, but it is abundantly clear that ATI have the superior hardware in practically every way, INCLUDING the all-important pricing.
The 9800SE hardly sucks... with the 256-bit DDR 380/340 version (which seems pretty common) you get performance easily above the 9600 Pro and on par with the new 9600XT, usually for a lower price too. Sure, unlike the 9600 cards, overclocking is virtually nil and the 256-bit DDR is almost totally wasted, BUT there is a chance of turning the 9800SE into a 9800 Pro/XT, and whether that's a 1% chance or a 50% chance, it's a feather in its cap. The 9800SE is as good as the 9600; I'd prefer a 9600 (series) myself, but the 9800SE is FAR from a bad choice!