
Thread: Doom3 Update: X1800XT takes 7800GTX at 1600x1200 with 4x/8x

  1. #1
    Senior Member Andrzej's Avatar
    Join Date
    Oct 2005
    Location
    London, UK
    Posts
    621
    Thanks
    0
    Thanked
    4 times in 3 posts

    Doom3 Update: X1800XT takes 7800GTX at 1600x1200 with 4x/8x

    http://www.hexus.net/content/item.php?item=3668


    Now those are the kind of beanz I could eat all day long






    Sorry about the emotional meltdown...

    ...but I am feeling very emotional right now

    I kept 6 trusted serving men, they taught me all I knew.
    Their names were what and where and why and how and when and who.


    (I also had the HEXUS forums on speed dial just in case)

  2. #2
    not posting kempez's Avatar
    Join Date
    Aug 2005
    Location
    Basingstoke
    Posts
    3,204
    Thanks
    0
    Thanked
    0 times in 0 posts
    That is more like how the card SHOULD perform in all games. Glad to hear it. Now the competition will have to step up to the mark
    Check my project <<| Black3D |>>
    Quote Originally Posted by hexah
    Games are developed by teams of talented people and sometimes electronic arts

  3. #3
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    30,749
    Thanks
    1,788
    Thanked
    3,288 times in 2,647 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish
    Not bad. How about a situation where the MC enters a 'learn mode' when you start a new application? It would be very cool if, during this learn mode, the usage were analysed and the MC auto-optimised for that application
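    The 'learn mode' idea above can be sketched in software: watch the access pattern during a learning window, then pick controller settings to match. This is purely illustrative - the function and parameter names (`classify_accesses`, `choose_policy`, `burst_length`) are invented for the sketch and have nothing to do with how ATI actually programs the X1800's memory controller.

    ```python
    from collections import Counter

    def classify_accesses(addresses, block=64):
        """Learn mode: count sequential vs. random accesses in an address trace."""
        stats = Counter()
        prev = None
        for addr in addresses:
            if prev is not None:
                # Accesses within one block of the last address count as sequential
                stats["sequential" if abs(addr - prev) <= block else "random"] += 1
            prev = addr
        return stats

    def choose_policy(stats):
        """Auto-optimise: long bursts suit streaming, short bursts suit random access."""
        if stats["sequential"] >= stats["random"]:
            return {"burst_length": 8, "prefetch": True}
        return {"burst_length": 2, "prefetch": False}

    # A perfectly sequential trace yields a streaming-friendly policy
    trace = [i * 64 for i in range(100)]
    policy = choose_policy(classify_accesses(trace))
    print(policy)  # {'burst_length': 8, 'prefetch': True}
    ```

    A real implementation would of course live in the driver or firmware and tune far more than two knobs, but the loop is the same: observe, classify, reconfigure.
    
    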

  4. #4
    Lovely chap dangel's Avatar
    Join Date
    Aug 2005
    Location
    Cambridge, UK
    Posts
    8,398
    Thanks
    412
    Thanked
    459 times in 334 posts
    • dangel's system
      • Motherboard:
      • See My Sig
      • CPU:
      • See My Sig
      • Memory:
      • See My Sig
      • Storage:
      • See My Sig
      • Graphics card(s):
      • See My Sig
      • PSU:
      • See My Sig
      • Case:
      • See My Sig
      • Operating System:
      • Windows 10
      • Monitor(s):
      • See My Sig
      • Internet:
      • 60mbit Sky LLU
    Yeah, agreed - they need to make this more 'generic' so that per-application optimization isn't necessary (as it usually lags quite a bit behind game releases). As I always said, I expected much more from this card given the huge clock speed advantage it has over the GTX. That said, the gains still seem marginal compared to a much lower clocked GTX running a beta driver. It still seems that ATI's high prices don't reflect the best bang for buck - at least to me anyway.

    So ATM we're waiting for the new Catalyst drivers, and the new Detonators, for both cards to provide performance gains. I don't think there'll be much in it once the dust settles (yet again).

    Note to fan-boyz: I have no brand loyalty - the card I own is based on best performance per buck, and I own both ATI and nVidia cards!
    Crosshair VIII Hero (WIFI), 3900x, 32GB DDR4, Many SSDs, EVGA FTW3 3090, Ethoo 719


  5. #5
    Senior Member
    Join Date
    Sep 2003
    Posts
    593
    Thanks
    0
    Thanked
    1 time in 1 post
    I'm not particularly impressed with the new ATI offerings. They cost a fortune in comparison to their equivalent Nvidia models.

    I'm also not 100% sure why they are working on increasing performance in a game that no-one is really all that interested in now. It's a bit like working to increase 3DMark05 performance; we really have moved beyond it.

  6. #6
    Senior Member Andrzej's Avatar
    Join Date
    Oct 2005
    Location
    London, UK
    Posts
    621
    Thanks
    0
    Thanked
    4 times in 3 posts
    Quote Originally Posted by dangel
    ...per-application optimization isn't necessary...
    In general I think that this is a very good sentiment
    (although there are some cases where I am sure 'individual attention' is warranted)

    It has been the source of much confusion over the years as to just how much engineering effort nVidia would put into making a single game work better in a very specific configuration

    A more general approach is much better - hence the OpenGL fix (remember that this performance boost works for all OpenGL games - not just one application)

    The guys who came up with this have confirmed that it is the 'first of many' such changes that have been made possible by having a seriously programmable memory controller

    The last time I can recall a boost anywhere near as large was with FarCry (Cat 5.1 to 5.2 if I remember correctly) where performance in certain maps jumped by over 20%

    That was also achieved through enhanced memory management

    However, in those days, it was a serious effort to implement a change of that scale


    IIRC the HEXUS tests ran a series of five 'The Way It's Meant To Be Played' games and the X1800XT scored wins in 4 out of 5 (at the gamers' resolution of 1600x1200 with 4x/8x)

    The only test that the 7800GTX won was Chronicles of Riddick and - with the updated memory technique from ATI - even that lead is much smaller than it was

    One failure with Chronicles of Riddick as a benchmark test is that you cannot 'force a shader model mode'

    After 2.0 - it switches to 2++

    In an ideal world, this would enable SM3 paths for both ATI and nVidia

    However, in my experience, it will not run the same path for both cards - and that is a serious failing - especially because ATI claim that the X1000 series is 'Shader Model 3 done right'

    The later/more advanced SM3-enabled games show just where the money has been spent:-




    As far as price goes - if you offer the fastest product on the planet...

    ...then surely you can charge a small (£50?) premium?

    I kept 6 trusted serving men, they taught me all I knew.
    Their names were what and where and why and how and when and who.


    (I also had the HEXUS forums on speed dial just in case)

  7. #7
    Lovely chap dangel's Avatar
    Join Date
    Aug 2005
    Location
    Cambridge, UK
    Posts
    8,398
    Thanks
    412
    Thanked
    459 times in 334 posts
    • dangel's system
      • Motherboard:
      • See My Sig
      • CPU:
      • See My Sig
      • Memory:
      • See My Sig
      • Storage:
      • See My Sig
      • Graphics card(s):
      • See My Sig
      • PSU:
      • See My Sig
      • Case:
      • See My Sig
      • Operating System:
      • Windows 10
      • Monitor(s):
      • See My Sig
      • Internet:
      • 60mbit Sky LLU
    Quote Originally Posted by Bania
    In general I think that this is a very good sentiment
    (although there are some cases where I am sure 'individual attention' is warranted)

    It has been the source of much confusion over the years as to just how much engineering effort nVidia would put into making a single game work better in a very specific configuration
    Yes, but - back on the topic you quoted me talking about - the issue is how ATI's optimization works, and it isn't generic, which is a problem if ATI can't push out optimizations as fast as games are released. I don't care whether Chronicles of Riddick plays well on X card - what I want to know is how it plays with the next big game coming out

    Quote Originally Posted by Bania
    A more general approach is much better - hence the OpenGL fix (remember that this performance boost works for all OpenGL games - not just one application)
    Well, precisely - but in ATI's case the reason they've never done well in OpenGL wasn't hardware related (and the same is true of Linux support) - and it's taken all these years for them to get around to doing something about it.

    Quote Originally Posted by Bania
    The guys who came up with this have confirmed that it is the 'first of many' such changes that have been made possible by having a seriously programmable memory controller
    Again, great - but the issue still remains: can they make this generic so that I don't have to wait for them before playing my shiny new game?


    Quote Originally Posted by Bania
    IIRC the HEXUS tests ran a series of five 'The Way It's Meant To Be Played' games and the X1800XT scored wins in 4 out of 5 games (at the gamers' resolution of 1600x1200 with 4x/8x)
    Hurrah! So where can I buy it then? Oh, darn it. Not again. TBH, if ATI _hadn't_ managed to beat a card that's sat on the market for as long as the 7800GTX has, they'd be in deep do-do. Personally, I was more shocked they didn't pull a massive lead with their die-shrink and clock speed advantage alone from the get-go.

    Quote Originally Posted by Bania
    The only test that the 7800GTX won was Chronicles of Riddick and - with the updated memory technique from ATI - even that lead is much smaller than it was
    ...See my previous comments on how much I care about Riddick (which, BTW, I wasn't particularly impressed with!)

    Quote Originally Posted by Bania
    However, in my experience, it will not run the same path for both cards - and that is a serious failing - especially because ATI claim that the X1000 series is 'Shader Model 3 done right'
    ...and nVidia claim ATI don't, etc. etc. Personally, if MS are happy with how something works (and, after all, it's them making the decisions about certification) and the games work well, then I'd ignore the FUD if I were you. nVidia and ATI take different approaches to SM3.0 and both have 'workarounds' for anything that counts (and I'm well aware of ATI's counter-claims, so please spare me). Both these companies act like children towards each other.

    Quote Originally Posted by Bania
    The later/more advanced SM3-enabled games show just where the money has been spent:-
    Hmm, FEAR - that's one game that perplexed me. On the one hand the gameplay was fun, but I really didn't like the engine (and not just because it favours ATI hardware) because, visually, it didn't strike me as amazing. The only bit of the game that impressed me graphically was the ending (which I guess I'd better keep quiet about for fear of upsetting people) - otherwise it wasn't any more impressive than Half Life 2 (which had better, more complex architecture, nicer outdoor stuff(tm) and far better physics) or Doom 3/Quake 4 (graphically better again, esp. with all that lovely organic stuff, if we ignore their limited outdoor sections) - both of which perform much better (on both ATI and nVidia architectures). Is it a good game engine? Not convinced. Anyway, sorry - yes, I can see it performs better on the new ATI card which is out soon. But again, I've already played and finished the game

    Quote Originally Posted by Bania
    As far as price goes - if you offer the fastest product on the planet...

    ...then surely you can charge a small (£50?) premium?
    Well, yes, if you *offer* it... And when it's actually out we'll see what the price premiums are - no doubt nVidia will respond with cunning pricing (hey, they can - they've probably already recouped the costs of engineering the 7800 series) and perhaps even a die-shrink of the 7800. Me? Well, I've had months of gaming on a great card already, and I expect dropping another one in the machine when they're cheap will keep me going until the real battle commences - i.e. both nVidia's and ATI's new parts next year. I think that, like many other people, I feel the ATI card came too late to the table and doesn't justify the expense of swapping out (particularly if you want CrossFire and not SLI in the future) for what gains there are. For those looking to buy in a month, the decision will be harder - but it's not one I have to make
    Crosshair VIII Hero (WIFI), 3900x, 32GB DDR4, Many SSDs, EVGA FTW3 3090, Ethoo 719

