http://www.ngohq.com/home.php?page=A...read&arc_id=46
Personally, I kinda doubt it - from what I've seen there's nothing to prompt nVidia to produce one, bar perhaps adding a 512MB card to the 7800 range.
Well, AFAIK the X1800XT beats the 7800GTX by a good margin, but maybe I've seen an isolated result. If ATi have taken the crown back across the higher range, then I don't imagine it'll take Nvidia long to retake pole position by retaliating with the 7800U.
I've read a number of benchmarks now and I see no huge gains tbh (not overall anyway; in isolation, yes). Plus of course the XT won't appear for another month (!). I'll worry when my GTX doesn't cut it in something - so far, I've yet to see it happen!
Quote:
Originally Posted by specofdust
http://www.theinquirer.net/?article=26720
So far nVidia has a cooler, smaller, cheaper, available card - ATI has.. er.. another month to go.
Remember ATI promised this:
http://www.theinquirer.net/?article=26617
Interesting reads there. While I'm disappointed that the 7800 beat the X1800 in most tests (well, in some tests the X1800 was slightly better, but in others the 7800 was over 10FPS ahead, so the 7800 is probably marginally better overall), it at least gives fans of ATi a high-end option that won't disappoint. I'd been hoping that by the time I get round to a new graphics card ATi would have something that not only supported SM3 but that kicked ass, and the X1800s seem to do that, even if their performance is roughly the same as the 7800's.
Quote:
Originally Posted by specofdust
Don't get me wrong - I've got no desire to see nVidia win anything (I chop and change; my last card was a 6800 Ultra and before that a 9700 Pro). I'd rather see them constantly fighting it out, as it can only benefit us as end users. I think ATI may have an edge in HDR+AA in games, but it's largely unproven in real-world games right now and seems to depend on programming technique. In many ways I'd expect the ATI boards to be quicker than they appear to be in benchmarks right now - they're clocked very high, after all. I firmly believe you shouldn't sit in one camp or the other - just go for the best solution for the money. Right now, I'm still thinking nVidia have the edge on performance/price.
When a new generation of core comes out and they have to clock it high from the start, I get concerned. Like, if you remember, the 9600 Pros are clocked at 400MHz by default, but the 9700/9800 Pros were able to be clocked at like 270 or something, because the cores were so good. I know newer manufacturing processes let them clock the cores higher overall, but a high core clock at launch often seems to be an indication that the card can't go much higher.
As for the actual performance of the X1800s, I don't think we'll know for sure until a few new driver releases have been done, the next-gen games like Quake 4 and FEAR are out, and we see the cards hit shelves in real numbers.
I'm guessing that Quake 4 will be nVidia's domain (OpenGL) as per usual. FEAR is a whole new ballgame and I've no idea which way it'll swing. It's probably the only game making me worry about whether it'll be OK @ 1600x1200. I've not tried the demo..
Yeah, I just hope there aren't any sneaky "optimisations" that basically negate the well-engineered things in ATi cards in order to bring them down in line with Nvidia cards.
I have no (that's a lie, I have a little) problem with optimisations, when they're actually optimisations. But forcing all cards to do stupid things - like using look-up tables when it's totally unnecessary for ATi cards, which have a faster and better way of doing it - is just wrong. You're gimping my card's performance 'cos another company paid you money.
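Just to illustrate the look-up table point with a rough sketch - plain C rather than real shader code, and every name in it is mine, not from any actual game or driver:
Code:
/* Two ways to get the same specular term. A look-up table suits hardware
 * with fast texture fetches; direct math suits hardware with ALU power to
 * spare. Forcing the table path on everything is the 'gimping' I mean. */
#include <math.h>

#define LUT_SIZE 256
static float pow_lut[LUT_SIZE];          /* precomputed pow(x, e) table */

/* Bake the table once, like an engine baking a lookup texture. */
void build_pow_lut(float exponent) {
    for (int i = 0; i < LUT_SIZE; i++) {
        float x = (float)i / (float)(LUT_SIZE - 1);
        pow_lut[i] = powf(x, exponent);
    }
}

/* Path A: table lookup - one fetch, like a dependent texture read.
 * Assumes n_dot_h is already clamped to [0,1]. */
float specular_lut(float n_dot_h) {
    return pow_lut[(int)(n_dot_h * (LUT_SIZE - 1))];
}

/* Path B: direct math - what a card with fast ALUs would rather do. */
float specular_math(float n_dot_h, float exponent) {
    return powf(n_dot_h, exponent);
}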
True, though if ATI/nVidia are concerned about it, all they have to do is write an exception into the next drivers to force it to use the better way :p
Yeah, hopefully so. Gimping cards is seriously immoral (well, not seriously, but it is).
Not sure about Nvidia drivers, but thankfully ATi drivers are released often, so any badness can be undone. I also believe there was an ATi Doom 3 patch released, so that could be ported to Quake 4.
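That "exception in the drivers" idea, sketched very loosely in C - the structure here is entirely invented, real driver profiles are far more involved:
Code:
#include <string.h>

enum spec_path { SPEC_LOOKUP_TABLE, SPEC_DIRECT_MATH };

/* Hypothetical per-game override: honour what the game asked for unless
 * it's a known title where the other path is faster on this hardware. */
enum spec_path pick_specular_path(const char *app_name,
                                  enum spec_path requested) {
    if (strcmp(app_name, "doom3.exe") == 0 ||
        strcmp(app_name, "quake4.exe") == 0)
        return SPEC_DIRECT_MATH;   /* faster on cards with strong ALUs */
    return requested;
}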
Yeah, that does suck. In mitigation, once a game is out, driver updates can usually 'work around' such problems. I don't think the D3 engine excludes ATI cards in any way - it's just that Carmack basically specced nVidia's card for them! The same wasn't true for Half-Life 2 (I think), but certainly a lot of money made sure that the ATI path was 'optimal'. At one time we all thought 6800 performance with Half-Life 2 was going to be abysmal (hell, people were saying my Ultra would suck at it) but it didn't.
We will see a 512MB NVIDIA card, and that should clear it all up.
Yes, and only cost 500 quid... eeeek
Agreed - I've been quite amazed at the lack of a 512MB 7800 card. I mean, it's the card that could use 512MB most; considering there are quite a few other 512MB cards, it just seems stupid to me that the 7800 hasn't got a 512MB version.
Until there's a definite need for an Everest-like 512MB frame buffer, bringing out a 512MB card isn't really smart unless it was already planned as part of the product roadmap.
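For what it's worth, some back-of-envelope numbers (my own rough assumptions, not figures from any review) on what the buffers themselves actually eat at 1600x1200 with 4xAA:
Code:
#include <stdio.h>

int main(void) {
    const long w = 1600, h = 1200;
    const long bpp = 4;            /* 32-bit colour: 4 bytes per pixel   */
    const long samples = 4;        /* 4x AA                              */

    long colour = w * h * bpp * samples;   /* multisampled colour buffer */
    long depth  = w * h * bpp * samples;   /* 24-bit Z + 8-bit stencil   */
    long front  = w * h * bpp;             /* resolved front buffer      */

    printf("colour: %ld MB\n", colour / (1024 * 1024));
    printf("depth:  %ld MB\n", depth / (1024 * 1024));
    printf("front:  %ld MB\n", front / (1024 * 1024));
    printf("total:  ~%ld MB\n", (colour + depth + front) / (1024 * 1024));
    return 0;
}
Even with 4xAA that's only around 65MB of actual buffers - the real case for 512MB is high-resolution textures, not the frame buffer itself.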
It's really not surprising to see ATI leading - they do have the clock advantage... a VERY big clock advantage. I'm eager to see whether the production yield at such niche clocks will actually stack up for a successful hard launch in a month's time.
Whether the 7800 Ultra actually arrives will really depend on whether ATI can bring their production up and prices down. If ATI gets their XT to within maybe 20 quid of a 7800GTX, that would be enough of a threat for Nvidia to bring their 'rumoured' flagship card out. Currently there's very little reason to introduce a 7800 Ultra (especially with all the pre-overclocked GTXs around).
Personally, I also don't buy the idea that the 7800 Ultra will just be a super-overclocked GTX. The yield just wouldn't stack up... G71 would be very different, in my opinion that is.