http://www.hexus.net/content/item.php?item=3668
Now those are the kind of beanz I could eat all day long
Sorry about the emotional meltdown...
...but I am feeling very emotional right now
.
.
I kept 6 trusted serving men, they taught me all I knew.
Their names were what and where and why and how and when and who.
(I also had the HEXUS forums on speed dial just in case)
That is more like how the card SHOULD perform in all games. Glad to hear it. Now the competition will have to step up to the mark!
Not bad. How about a situation where the MC enters a 'learn mode' when you start a new application? It would be very cool if, during this learn mode, the usage were analysed and the MC auto-optimised for that application!
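To make the 'learn mode' idea concrete, here is a minimal, entirely hypothetical sketch in Python: while a new application runs, the driver would sample its memory-access behaviour, classify the pattern, and then pick controller settings to match. Every name, threshold, and setting in this table is invented for illustration; real memory controllers expose nothing like this to user code.

```python
# Hypothetical 'learn mode' sketch: classify an application's memory-access
# trace, then pick memory-controller settings that suit the observed pattern.
# All function names, thresholds, and settings are invented for illustration.

def learn_profile(access_trace):
    """Classify a trace as 'sequential' or 'random' by measuring how often
    consecutive addresses are adjacent (stride of exactly 1)."""
    adjacent = sum(1 for a, b in zip(access_trace, access_trace[1:])
                   if b - a == 1)
    ratio = adjacent / max(len(access_trace) - 1, 1)
    return "sequential" if ratio > 0.5 else "random"

def choose_mc_settings(profile):
    # Invented settings table: long bursts favour streaming access;
    # short bursts with more open banks favour scattered access.
    table = {
        "sequential": {"burst_length": 8, "open_banks": 2},
        "random":     {"burst_length": 2, "open_banks": 8},
    }
    return table[profile]

# Example: two mostly-sequential runs of addresses classify as 'sequential'.
trace = [100, 101, 102, 103, 500, 501, 502, 503]
print(choose_mc_settings(learn_profile(trace)))
```

The point of the sketch is the shape of the feedback loop (observe, classify, reconfigure), which is also roughly what a generic per-application optimiser would need, rather than hand-tuned profiles shipped with each driver release.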
Yeah, agreed - they need to make this more 'generic' so that per-application optimisation isn't necessary (as it usually lags behind game releases by quite a bit). As I've always said, I expected much more from this card given the huge clock-speed advantage it has over the GTX. That said, the gains still seem marginal compared to a much lower-clocked GTX running a beta driver. It still seems that ATI's high prices don't reflect the best bang for buck - at least not to me.
So ATM we're waiting for the new catalyst drivers, and the new detonators for both cards to provide performance gains. I don't think they'll be much in it once the dust settles (yet again).
Note to fan-boyz: I have no brand loyalty; the card I own is based on best performance per buck, and I own both ATI and nVidia cards!
I'm not particularly impressed with the new ATI offerings. They cost a fortune in comparison to their Nvidia counterparts.
I'm also not 100% sure why they are working on increasing performance in a game that no-one is really all that interested in now. It's a bit like working to increase 3DMark05 performance; we really have moved beyond it.
Originally Posted by dangel
In general I think that this is a very good sentiment
(although there are some cases where I am sure 'individual attention' is warranted)
It has been the source of much confusion over the years as to just how much engineering effort nVidia would put into making a single game work better in a very specific configuration
A more general approach is much better - hence the OpenGL fix (remember that this performance boost works for all OpenGL games - not just one application)
The guys who came up with this have confirmed that it is the 'first of many' such changes that have been made possible by having a seriously programmable memory controller
The last time I can recall a boost anywhere near as large was with FarCry (Cat 5.1 to 5.2 if I remember correctly) where performance in certain maps jumped by over 20%
That was also achieved through enhanced memory management
However, in those days, it was a serious effort to implement a change of that scale
IIRC the HEXUS tests ran a series of five 'The Way It's Meant To Be Played' games, and the X1800XT scored wins in 4 out of 5 (at the gamers' resolution of 1600x1200 with 4x AA / 8x AF)
The only test that the 7800GTX won was Chronicles of Riddick and - with the updated memory technique from ATI - even that lead is much smaller than it was
One failure with Chronicles of Riddick as a benchmark test is that you cannot 'force a shader model mode'
Above SM2.0, it switches to the 2.0++ path
In an ideal world, this would enable SM3 paths for both ATI and nVidia
However, in my experience, it will not run the same path for both cards - and that is a serious failing - especially because ATI claim that the X1000 series is 'Shader Model 3 done right'
The later/more advanced SM3-enabled games show just where the money has been spent:
As far as price goes - if you offer the fastest product on the planet...
...then surely you can charge a small (£50?) premium?
Originally Posted by Bania
Yes, but - back on the topic you quoted me talking about - the issue is how ATI's optimisation works, and it isn't generic, which is a problem if ATI can't push out optimisations as fast as games are released. I don't care whether Chronicles of Riddick plays well on card X - what I want to know is how it plays with the next big game coming out.
Originally Posted by Bania
Well, precisely - but in ATI's case the reason they've never done well in OpenGL wasn't hardware related (and the same is true of Linux support) - and it's taken all these years for them to get around to doing something about it.
Originally Posted by Bania
Again, great - but the issue still remains: can they make this generic so that I don't have to wait for them before playing my shiny new game?
Originally Posted by Bania
Hurrah! So where can I buy it then? Oh, darn it. Not again. TBH, if ATI _hadn't_ managed to beat a card that's sat on the market for as long as the 7800GTX has, they'd be in deep doo-doo. Personally, I was more shocked they didn't pull out a massive lead from the get-go on their die-shrink and clock-speed advantage alone.
Originally Posted by Bania
...See previous comments on how much I care about Riddick (which, BTW, I wasn't particularly impressed with!)
Originally Posted by Bania
...and nVidia claim ATI don't, etc., etc. Personally, if MS are happy with how something works (and, after all, it's them making the decisions about certification) and the games work well, then I'd ignore the FUD if I were you. nVidia and ATI take different approaches to SM3.0 and both have 'workarounds' for anything that counts (and I'm well aware of ATI's counter-claims, so please spare me). Both these companies act like children towards each other.
Originally Posted by Bania
Hmm, FEAR - that's one game that perplexed me. On the one hand the gameplay was fun, but I really didn't like the engine (and not just because it favours ATI hardware) because, visually, it didn't strike me as amazing. The only bit of the game that impressed me graphically was the ending (which I guess I'd better keep quiet about for fear of upsetting people) - otherwise it wasn't any more impressive than Half-Life 2 (which had better, more complex architecture, nicer outdoor stuff(tm) and far better physics) or Doom 3/Quake 4 (graphically better again, especially with all that lovely organic stuff, if we ignore its limited outdoor sections) - both of which perform much better (on both ATI and nVidia architectures). Is it a good game engine? Not convinced. Anyway, sorry - yes, I can see it performs better on the new ATI card, which is out soon. But again, I've already played and finished the game.
Originally Posted by Bania
Well, yes, if you *offer* it... And when it's actually out we'll see what the price premiums are - no doubt nVidia will respond with cunning pricing (hey, they can - they've probably already recouped the engineering costs of the 7800 series) and perhaps even a die-shrink of the 7800. Me? Well, I've had months of gaming on a great card already, and I expect dropping another one in the machine when they're cheap will keep me going until the real battle commences - i.e. both nVidia's and ATI's new parts next year. I think that, like for many other people, the ATI card came too late to the table and doesn't justify the expense of swapping out (particularly if you want CrossFire and not SLI in the future) for what gains there are. For those looking to buy in a month, the decision will be harder - but it's not one I have to make.