
Thread: Interesting Article On The Inquirer About The ATi \ Nvidia High End Cards.

  1. #17
    Senior Member Richdog's Avatar
    Join Date
    Sep 2003
    Location
    Global
    Posts
    573
    Thanks
    4
    Thanked
    13 times in 11 posts
    Hmm, very interesting article, but I don't believe ATI is going to die, not at all. ATI has made too big a name for itself over the last couple of years to collapse; it has gained a very large fanbase and a group of hardcore users who will buy its cards no matter what (I'm not saying that's always a good or an intelligent thing).

    What the author of that article doesn't take into account is that a large number of developers have also taken ATI's 3Dc compression tech on board, and when used in games it provides a 20%-30% performance boost, and looks awesome to boot. HL2 will use it, Far Cry will use it (via patch), Serious Sam 2, and undoubtedly lots more.

    And does he say the ATI cards don't support geometry instancing (I skimmed it and it seemed like he said that, correct me if I'm wrong)? Well, they do... you can enable it in Far Cry, for gawd's sake...

    Geometry Instancing
    This release of CATALYST introduces the support of Geometry Instancing. This new feature provides assistance when a game has to render many objects where the geometry is highly similar. Geometry Instancing allows the VPU to create multiple objects from a single geometric model, rather than passing an entire new model for each item on the screen. This increases the rendering speed of images such as leaves, or grass. http://www2.ati.com/drivers/Catalyst...ase_Notes.html
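    For anyone wondering what that feature actually looks like from a game's point of view, here's a rough sketch of how an application requests hardware instancing through the Direct3D 9 stream-frequency API. This is just an illustration under my own assumptions (device, buffers and a vertex declaration spanning both streams already set up) - it's not ATI's driver code or Far Cry's actual implementation:

    // A minimal sketch (my assumptions, not ATI's or Crytek's code): drawing many
    // copies of one mesh with Direct3D 9 hardware instancing instead of issuing
    // one draw call per object. Assumes the device, the vertex/index buffers and
    // a vertex declaration covering both streams have already been created.
    #include <d3d9.h>

    struct MeshVertex   { float x, y, z, nx, ny, nz; };  // shared model geometry
    struct InstanceData { float world[12]; };             // per-copy 4x3 world matrix

    void DrawInstanced(IDirect3DDevice9* device,
                       IDirect3DVertexBuffer9* meshVB,
                       IDirect3DIndexBuffer9*  meshIB,
                       IDirect3DVertexBuffer9* instanceVB,
                       UINT vertexCount, UINT triCount, UINT instanceCount)
    {
        // Stream 0: the single model, which the GPU repeats instanceCount times.
        device->SetStreamSource(0, meshVB, 0, sizeof(MeshVertex));
        device->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | instanceCount);

        // Stream 1: per-instance data (e.g. a world matrix), advanced once per copy.
        device->SetStreamSource(1, instanceVB, 0, sizeof(InstanceData));
        device->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);

        device->SetIndices(meshIB);

        // One call renders every blade of grass or leaf, rather than instanceCount calls.
        device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, vertexCount, 0, triCount);

        // Restore default frequencies so later, non-instanced draws behave normally.
        device->SetStreamSourceFreq(0, 1);
        device->SetStreamSourceFreq(1, 1);
    }

    The point is that the model geometry goes over the bus once and the per-object data is tiny, which is where the speed-up on things like foliage comes from.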
    Now let's think about PS3.0 vs PS2.5... until the R500 is released, will the difference be all that great? The R500 is supposed to be released in Q1 or Q2 2005... are you trying to tell me that in the space of a few measly months graphics are going to change so much that Nvidia hardware will look vastly different to ATI hardware? I'm betting not a chance. Doom 3 shows that they can't even fully utilize DX9 yet, never mind fully exploiting PS3.0 within such a short time. Graphics tech always progresses about 2x faster than game tech (i.e. the point where we actually see the new features being used), don't forget that.

    All I'm saying is that ATI may have missed PS3.0, but the gap until the next tech isn't great, and it still has a few tricks up its sleeve like the aforementioned 3Dc tech... have you seen the screenshots of how freaking awesome that looks? Give me super-detailed game worlds over a few fancy PS3.0 effects any day. PS4.0 won't be too long in coming, and when it arrives ATI will undoubtedly be ready with all guns blazing and a GPU that fully supports it.

    Oh, and here's our thread over at EOCF if anyone wants to see some more viewpoints. http://forums.extremeoverclocking.co...0&page=1&pp=20
    ASUS ROG G751 w. 980M

  2. #18
    Member
    Join Date
    Aug 2004
    Posts
    127
    Thanks
    0
    Thanked
    0 times in 0 posts
    I'm going to stick with my X800 XT order. I was thinking of a 6800 Ultra for a while, but I've gone off the idea because of its high power usage; also, its main "advantage" (SM3.0) is unlikely to be active in any games in the near future. All you need to do is look at ATi's Ruby video to see how good 2.0 looks, which also makes you question how long it will be until we see SM3.0. Plus, if ATi's new OpenGL overhaul improves its performance, you can't really go wrong.

    Another thing is that games have ONLY just started to use DX9 and SM2.0, and not even fully, so it seems the video card technology is way ahead of the game graphics technology, as always, which I think some people forget most of the time while reading the crappy threads they start.

    Both cards own, but I think I'll go with the X800 XT this time round and upgrade in another few years when the game engine technology has passed my X800 by.

  3. #19
    Registered+
    Join Date
    Aug 2004
    Posts
    35
    Thanks
    4
    Thanked
    0 times in 0 posts
    *wonders whether Charlie Whatisface placed an order with his broker to buy ATI shares just before posting the article*

    :-D

  4. #20
    Sublime HEXUS.net
    Join Date
    Jul 2003
    Location
    The Void.. Floating
    Posts
    11,819
    Thanks
    213
    Thanked
    233 times in 160 posts
    • Stoo's system
      • Motherboard:
      • Mac Pro
      • CPU:
      • 2*Xeon 5450 @ 2.8GHz, 12MB Cache
      • Memory:
      • 32GB 1600MHz FBDIMM
      • Storage:
      • ~ 2.5TB + 4TB external array
      • Graphics card(s):
      • ATI Radeon HD 4870
      • Case:
      • Mac Pro
      • Operating System:
      • OS X 10.7
      • Monitor(s):
      • 24" Samsung 244T Black
      • Internet:
      • Zen Max Pro
    IIRC, missing out on SM3.0 was a tactical manoeuvre on the part of ATI: they knew virtually nothing would make any use of it yet, and including support for it in the GPU would add to the transistor count needlessly.

    By the time SM3.0 is being used, their next chipset will be out, and its smaller manufacturing process will mean the overhead won't have anywhere near as much of a design impact.

    Nvidia stole a march on ATi mainly through superior driver development (the hardware will always be roughly comparable, assuming nobody produces a clanger like the FX series again), something ATi is now doing a lot to amend.

    Who will win? The consumer, that's who. Whatever either company comes up with is going to push graphics card development onward and upward whilst driving prices down further.

    The only way I can see something radically changing is if a third party comes in with something completely new and upsets the applecart, much like AMD have done with the A64 and nVidia did with the TNT2 chipset in a fairly stagnant market.

    Good times ahead for most of us, bad times ahead for our wallets
    (\__/)
    (='.'=)
    (")_(")

  5. #21
    KDH
    Member
    Join Date
    Oct 2003
    Posts
    120
    Thanks
    0
    Thanked
    0 times in 0 posts
    Quote Originally Posted by Byatt
    Read that last night; it's an interesting article with a lot of valid points to make, but I think there are a couple of points which are overstated - namely the Doom 3/Source thing. I think it's a little pre-emptive to say that Doom 3 will dominate in terms of being used by developers, and in the same way there's that big OpenGL driver overhaul expected from ATi in September.

    Secondly - on ATi being scared of SLI and having no answer - this doesn't really matter; the number of people with SLI setups is going to be comparable to those with P4EEs and FX-53s, and I really don't think it's going to hurt them that much.

    Thirdly - ATi not competing in the low-mid range area - well, he said a lot about ATi not being able to deliver, which is interesting. I'd consider the 9800 Pro midrange now, and the 9600 and below to be low-end - and they're both good cards (a superb card in the case of the 9800). I'm really not convinced that ATi loses out in this area... although there's currently no ATi answer to the vanilla 6800.

    He also doesn't mention the considerable brand loyalty that ATi has built up with retail consumers - especially in the enthusiast market - just look at these boards - we'll almost always recommend a 9800 Pro over a 5900XT, or an XT PE over a 6800u.

    I thought he was spot on with the PS3.0 thing though - and I agree with him - it does seem very much as if Nvidia have won this round in the retail market (by how much is not yet clear), but ATi's reputation due to the R300 will see them through, much like Nvidia's Ti4x00 series saved them from the disaster of the FX series.
    Just picking at this - but the 6600 so far seems to be head and shoulders above the 9800 Pro/XT, while the X600's results are desultory at best.

  6. #22
    Comfortably Numb directhex's Avatar
    Join Date
    Jul 2003
    Location
    /dev/urandom
    Posts
    17,074
    Thanks
    228
    Thanked
    1,026 times in 677 posts
    • directhex's system
      • Motherboard:
      • Asus ROG Strix B550-I Gaming
      • CPU:
      • Ryzen 5900x
      • Memory:
      • 64GB G.Skill Trident Z RGB
      • Storage:
      • 2TB Seagate Firecuda 520
      • Graphics card(s):
      • EVGA GeForce RTX 3080 XC3 Ultra
      • PSU:
      • EVGA SuperNOVA 850W G3
      • Case:
      • NZXT H210i
      • Operating System:
      • Ubuntu 20.04, Windows 10
      • Monitor(s):
      • LG 34GN850
      • Internet:
      • FIOS
    The Unreal 3 engine lives & breathes SM3.0; Unreal 3-based games _WILL_ look worse on ATI.

  7. #23
    Sublime HEXUS.net
    Join Date
    Jul 2003
    Location
    The Void.. Floating
    Posts
    11,819
    Thanks
    213
    Thanked
    233 times in 160 posts
    It will look worse on the X800 series, you mean - by the time Unreal 3 comes out, the R500-based cards, which will support SM3.0, will be available.
    (\__/)
    (='.'=)
    (")_(")

  8. #24
    Junior Senior Member Aaron's Avatar
    Join Date
    Jul 2003
    Location
    London, England
    Posts
    1,516
    Thanks
    0
    Thanked
    1 time in 1 post
    I think the point the article was trying to make, though, was that the 6800 series beats the X800 series if you don't intend to buy the latest graphics card with every revision: the 6800 is a better medium- to long-term buy, since most people only buy a new graphics card every couple of years.

  9. #25
    Sublime HEXUS.net
    Join Date
    Jul 2003
    Location
    The Void.. Floating
    Posts
    11,819
    Thanks
    213
    Thanked
    233 times in 160 posts
    Yeah, but Unreal 3 isn't even due until 2006, by which point the entire argument is pointless.
    (\__/)
    (='.'=)
    (")_(")

  10. #26
    Senior Member
    Join Date
    Jul 2003
    Location
    London
    Posts
    888
    Thanks
    9
    Thanked
    4 times in 4 posts
    Quote Originally Posted by KDH
    Just picking at this - but the 6600 so far seems to be head and shoulders above the 9800 Pro/XT, while the X600's results are desultory at best.
    Yeah, I'd agree with that - the X600 is a horrible card in my opinion; I'm hoping ATi will find some time to make something like an X700 - a part with a 256-bit memory interface and 8-12 pipes... where are you getting numbers from to compare it to the 9800 Pro? I can only find the Nvidia guff about it destroying the X600 XT - quelle surprise, it's double the pipes.

    I'd also be interested to see what price the 6600 launches at.

