
Thread: NVIDIA's GeForce 6800 and 6800 GT GPUs

  1. #1
    Rys
    Tiled
    Join Date
    Jul 2003
    Location
    Abbots Langley
    Posts
    1,479
    Thanks
    0
    Thanked
    2 times in 1 post

    NVIDIA's GeForce 6800 and 6800 GT GPUs

    NVIDIA's NV40 GPU launched in April, with the 6800 Ultra the first product based on it to be evaluated, and it's taken some time to get the rest of the range out for everyone to look at. I managed to snag both the 6800 GT and the plain 6800 reference boards, both AGP, to compare to the Ultra and ATI's current X800 lineup in our usual reference board examination. Here's a snippet.

    What strikes me most is the performance advantage the entire product range from both IHVs has over the outgoing generation of parts. They all make the 9800 XT and 5950 Ultra look very, very silly indeed. Something for everyone from £200 upwards.
    Check it out in full here.

    Rys
    MOLLY AND POPPY!

  2. #2
    Registered User
    Join Date
    Jan 2004
    Posts
    8
    Thanks
    0
    Thanked
    0 times in 0 posts
    This review was a little lame imo.

    GeForce 6800 Ultra cards are stock clocked at 400MHz, not 450MHz. The GeForce 6800 Ultra Extreme Edition is 450MHz, which shouldn't be allowed IMO, because finding one is even less likely than finding an X800 XT, and if you do find one it'll cost you an extra £100. So what were the clocks of the Ultra card you were using?

    You also forgot to include Far Cry benchmarks, which would show the X800 still dominates. Other factors influencing people's buying decisions include heat and power consumption, and the fact that GeForce 6800 Ultra owners need two Molex connectors to power the card and an extra PCI slot for decent cooling (forget ASUS' substandard cooler).

    Why was 1600x1200 only done at 4xAA and 8xAF? Some people would argue that 1600x1200 doesn't even need 4xAA. Plus you're forgetting temporal AA, which ATI has and which would give further improvements.

    And the omission of 1280x1024 is noticeable too. A lot of people have 17" and 19" gaming monitors which support it as their native resolution.

    The X800 series also wipes the floor with the 6800 series in the RTHDRIBL v1.2 test, which is a true indication of DirectX 9 performance. Hopefully HL2 will prove to ATI owners that they purchased the superior card.

    Vis-à-vis the Doom 3 OpenGL argument (pre-emptive strike), ATI is developing its OpenGL driver to fully utilise the power of its X800 core.

    edit:

    Oh, I also forgot: it's quite a bit louder than the X800 and it definitely won't fit in a Shuttle without some modding.
    Last edited by venom1969; 23-07-2004 at 12:22 AM.

  3. #3
    HEXUS.net Webmaster
    Join Date
    Jul 2003
    Location
    UK
    Posts
    3,108
    Thanks
    1
    Thanked
    0 times in 0 posts
    This review was a little lame imo
    The rest of your post was relatively constructive and objective; that first sentence was completely useless. Please engage your brain before posting again.

  4. #4
    Registered User
    Join Date
    Jan 2004
    Posts
    8
    Thanks
    0
    Thanked
    0 times in 0 posts
    I did engage my brain. I stated what I thought of it, then followed that up with the reasons why.

    The main thing that's bugging me is what clock speed the Ultras were running at. Can anyone verify this?

  5. #5
    Rys
    Tiled
    Join Date
    Jul 2003
    Location
    Abbots Langley
    Posts
    1,479
    Thanks
    0
    Thanked
    2 times in 1 post


    Quote Originally Posted by venom1969
    This review was a little lame imo.

    GeForce 6800 Ultra cards are stock clocked at 400MHz, not 450MHz. The GeForce 6800 Ultra Extreme Edition is 450MHz, which shouldn't be allowed IMO, because finding one is even less likely than finding an X800 XT, and if you do find one it'll cost you an extra £100. So what were the clocks of the Ultra card you were using?
    I'll quote from the review, first page. "Currently, NVIDIA state that a 6800 Ultra is a full four quad NV40 (AGP part) running between 400 and 425MHz, with 256MB of GDDR3 memory running between 1100 and 1200MHz on a 256-bit memory bus, with a pair of DVI ports. Any 6-series board that meets those specs is a 6800 Ultra."

    So an Ultra is anywhere between 400 and 425MHz, and there's no set clock in that range; it's up to the AIB. The Ultra used in the review was clocked at 400/1100.

    Quote Originally Posted by venom1969
    You also forgot to include Far Cry benchmarks, which would show the X800 still dominates. Other factors influencing people's buying decisions include heat and power consumption, and the fact that GeForce 6800 Ultra owners need two Molex connectors to power the card and an extra PCI slot for decent cooling (forget ASUS' substandard cooler).
    I didn't forget; we don't use Far Cry in reviews of reference boards. Your retail board considerations are quite valid. I covered the heat and power requirements of the Ultra in the Ultra review. This review was for the GT and the plain 6800, and I cover their coolers and power requirements; the review title and the article content give that away somewhat. Retail board reviews cover the specifics of each AIB's chosen cooler, so that's where you need to look for that information.

    Quote Originally Posted by venom1969
    Why was 1600x1200 only done at 4xAA and 8xAF? Some people would argue that 1600x1200 doesn't even need 4xAA. Plus you're forgetting temporal AA, which ATI has and which would give further improvements.
    Look at the Ultra review for some 1600x1200 numbers with 8xAA and 16xAF, if you want to see it struggle; 1600x1200 certainly does need geometry anti-aliasing in some game titles. I cover temporal AA in the X800 article. Again, it's a review of GeForce 6-series boards, not ATI hardware; they got their own article.

    Quote Originally Posted by venom1969
    And the omission of 1280x1024 is noticeable too. A lot of people have 17" and 19" gaming monitors which support it as their native resolution.
    Again, read the Ultra's own review. I use 1280x1024 there. You can find that here.

    Quote Originally Posted by venom1969
    The X800 series also wipes the floor with the 6800 series in the RTHDRIBL v1.2 test, which is a true indication of DirectX 9 performance. Hopefully HL2 will prove to ATI owners that they purchased the superior card.
    I take it you're a fan of ATI hardware? RTHDRIBL isn't a true indication of DX9, since it ostensibly uses hand-written shaders, something that's not too common. It's a nice test that shows off what can happen when shaders cause resource management issues on certain hardware.

    Quote Originally Posted by venom1969
    Vis-à-vis the Doom 3 OpenGL argument (pre-emptive strike), ATI is developing its OpenGL driver to fully utilise the power of its X800 core.
    And I sincerely hope they create a brilliant OpenGL driver, since consumers of their hardware should have the OpenGL performance the hardware is capable of.

    edit:

    Quote Originally Posted by venom1969
    Oh, I also forgot: it's quite a bit louder than the X800 and it definitely won't fit in a Shuttle without some modding.
    That depends on the AIB's cooler and again, it's a reference board review, not retail. It depends on the Shuttle too; their P-series chassis take a 6800 Ultra without a problem.

    I get the impression that you took the intended focus of the article the wrong way. It's a review of the 6800 GT and the plain 6800, nothing more. The 6800 Ultra and ATI's X800 cards got their own articles where they were the focus; you can find them using the site navigation. Things you seemed to think were missing from this review are present elsewhere.

    We can't cram everything into one big article, but we do cover it all elsewhere (hopefully).

    You obviously like ATI's gear, and rightly so; their hardware is awesome. But at the same time we can't show that kind of bias when writing; we need to stay impartial.

    Anyway, welcome to the forums. Stick around, we could use more thoughtful posters like you!

    Rys
    MOLLY AND POPPY!

  6. #6
    Registered User
    Join Date
    Jan 2004
    Posts
    8
    Thanks
    0
    Thanked
    0 times in 0 posts
    Quote Originally Posted by Rys
    Anyway, welcome to the forums. Stick around, we could use more thoughtful posters like you!
    Thanks for the welcome!


    While I appreciate the reasoning behind why you can't cram everything into a single article, most people will be basing their judgement on reading this single review, not comparing it to other reviews you have completed.

    It could be the fact that I bought an X800 XT PE just a few days before the Doom 3 benchmarks came out. It's a cracking card, but everyone who owns one will be slightly miffed that a card £100 cheaper is kicking its butt.

    And the fact that the card was at 400/1100 clocks bodes even better for 6800 owners. Most have been able to push their cards beyond that spec.

    Which card will you be buying for Doom 3?

  7. #7
    Sublime HEXUS.net
    Join Date
    Jul 2003
    Location
    The Void.. Floating
    Posts
    11,819
    Thanks
    213
    Thanked
    233 times in 160 posts
    • Stoo's system
      • Motherboard:
      • Mac Pro
      • CPU:
      • 2*Xeon 5450 @ 2.8GHz, 12MB Cache
      • Memory:
      • 32GB 1600MHz FBDIMM
      • Storage:
      • ~ 2.5TB + 4TB external array
      • Graphics card(s):
      • ATI Radeon HD 4870
      • Case:
      • Mac Pro
      • Operating System:
      • OS X 10.7
      • Monitor(s):
      • 24" Samsung 244T Black
      • Internet:
      • Zen Max Pro
    None, I wouldn't ever spend that much money just to play one game.

    Anyone who knows about the ATI hardware and the D3 performance will probably be waiting for the OpenGL patch anyway and won't be too worried.

    Nice review there, Rys; it gives people in the market for a new card a lot to think about.
    (\__/)
    (='.'=)
    (")_(")

  8. #8
    Rys
    Tiled
    Join Date
    Jul 2003
    Location
    Abbots Langley
    Posts
    1,479
    Thanks
    0
    Thanked
    2 times in 1 post
    I'd get a 6800 GT at this point in time, if I were spending my own money. Spending more than £300 on a video card is something I swore off a long time ago, and it's hard to justify when the £300-level cards are doing so well.

    I try to link back to the other existing articles in any new one, to help the reader absorb as much info as they'd like to take in.

    Cramming them all together just isn't feasible, for so many reasons (time and effort, page count (nobody reads 30+ page reviews in full, no matter how much I'd like them to), time we have with the cards, time needed to review other products).

    I think we get the balance right. I don't think any other site did just one sole article on next-gen GPU performance, for the same reasons. Almost everyone split it up like we did.

    And your X800 XT PE is an awesome buy, so be happy with it. It'll play Doom 3 just fine. I wouldn't be miffed at HardOCP's results at all; I'd be very pleased with the X800 XT PE's performance, since it's nothing to be sniffed at, and you just know it whoops serious ass in other games too.

    Rys
    MOLLY AND POPPY!

  9. #9
    Cyber whore
    Join Date
    Oct 2003
    Location
    the seven seas
    Posts
    519
    Thanks
    0
    Thanked
    0 times in 0 posts
    • bouncin's system
      • Motherboard:
      • Abit an8 32x
      • CPU:
      • x2 4400 @2.7
      • Memory:
      • 2gb ocz el platinum
      • Storage:
      • 2 x 74GB Raptors RAID 0, 1 x 250GB Caviar
      • Graphics card(s):
      • Gainward bliss 7950gx2
      • PSU:
      • OCZ PowerStream 600W
      • Case:
      • Eclipse
      • Operating System:
      • xp
      • Monitor(s):
      • samsung LE40A565
      • Internet:
      • 20MB virgin media
    Yeah, I think everyone is basing too much on these few top games being released.

    I play a lot of games other than the Doom 3 types, and hopefully I'll find my X800 XT performs as well as the NVIDIA card in these, if not better. I mean, isn't Doom 3 one of those 'The Way It's Meant To Be Played' titles?

    But I've got to say fair play to NVIDIA, because it looks like they have finally got their act together and people may actually be getting performance closer to what they pay for.

  10. #10
    Registered User
    Join Date
    Jan 2004
    Posts
    8
    Thanks
    0
    Thanked
    0 times in 0 posts
    Whoa, Rys has done it again, this time with the AOpen 6800 Ultra review.

    I ask you, why bother including the X800 Pro in benchmarks against the 6800 Ultra? It doesn't make sense; these products aren't even in the same league (except in the Far Cry benchmarks). The way you're presenting the review gives NVIDIA an overwhelming win against the X800.

    Hmm... I wonder why? The X800 Pro is a castrated X800 XT PE; it only has 12 pipes versus the 16 of the GT and Ultra. Be fair, Rys, and do a proper apples-to-apples comparison: ATI vs NVIDIA, with the flagship products used in reviews. There's no point benchmarking the lower-class ATI product against NVIDIA's top dog; it gives buyers an unbalanced point of view. It's the average Joe I'm concerned about.
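
    To put rough numbers on the pipe difference: theoretical pixel fill rate is just pipes × core clock. Here's a quick sketch using the commonly quoted stock clocks; the figures are from memory, so treat them as ballpark:

    Code:
    // Rough theoretical pixel fill rate: pipes * core clock.
    // Clock figures are the commonly quoted stock speeds; treat as approximate.
    #include <cstdio>

    int main() {
        struct Card { const char* name; int pipes; int mhz; };
        const Card cards[] = {
            {"X800 Pro",   12, 475},
            {"X800 XT PE", 16, 520},
            {"6800 GT",    16, 350},
            {"6800 Ultra", 16, 400},
        };
        for (const Card& c : cards)
            std::printf("%-11s %2d pipes x %3dMHz = %.1f Gpixel/s\n",
                        c.name, c.pipes, c.mhz, c.pipes * c.mhz / 1000.0);
        return 0;
    }

    On paper that puts the Pro closer to the GT than to the Ultra, which is exactly my point about matching class for class.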

  11. #11
    Rys
    Tiled
    Join Date
    Jul 2003
    Location
    Abbots Langley
    Posts
    1,479
    Thanks
    0
    Thanked
    2 times in 1 post
    We use retail boards in our retail card reviews; reference boards aren't used. We haven't had an XT PE retail example yet. The minute one appears, it gets included.

    We do so for many reasons. However, the XT PE is benchmarked against the 6800 Ultra in a number of reference board articles at HEXUS; read those if you wish to see how they stack up.

    That we haven't had a sample of the XT PE from an AIB is quite telling in some respects.

    Rys
    MOLLY AND POPPY!

  12. #12
    Sublime HEXUS.net
    Join Date
    Jul 2003
    Location
    The Void.. Floating
    Posts
    11,819
    Thanks
    213
    Thanked
    233 times in 160 posts
    • Stoo's system
      • Motherboard:
      • Mac Pro
      • CPU:
      • 2*Xeon 5450 @ 2.8GHz, 12MB Cache
      • Memory:
      • 32GB 1600MHz FBDIMM
      • Storage:
      • ~ 2.5TB + 4TB external array
      • Graphics card(s):
      • ATI Radeon HD 4870
      • Case:
      • Mac Pro
      • Operating System:
      • OS X 10.7
      • Monitor(s):
      • 24" Samsung 244T Black
      • Internet:
      • Zen Max Pro
    Yeah, they're too busy selling the few they have to give one away
    (\__/)
    (='.'=)
    (")_(")

  13. #13
    Registered User
    Join Date
    Aug 2004
    Posts
    1
    Thanks
    0
    Thanked
    0 times in 0 posts

    GPU vs. CPU

    Dear members,

    I am desperate to discover what sort of impact the CPU/memory bottleneck will have on the newer GPUs.

    My question comes from the fact that I don't have enough money to buy an awesome, all-out new AMD FX processor with super-low-latency RAM etc., but I can scrape together enough to buy a new 6800 GT. What sort of impact will a slower CPU/memory config have on the new graphics cards? How heavily do they rely on CPU and memory? Nobody out there has done a test showing the newer graphics cards' framerates on slower systems...


    So what's the deal, yo yo... ?

    Stewie Griffin.

    P.S. Don't you think it's crazy that we're all out there buying new systems, or spending £300, just to play Half-Life 2 or Doom 3? Hehe... craziness.

  14. #14
    Registered User
    Join Date
    Jan 2004
    Posts
    8
    Thanks
    0
    Thanked
    0 times in 0 posts
    It depends on what CPU you currently own. Doom 3 and HL2 will excel on newer graphics cards. Also, have you read Doug Lombardi's latest comments? He says the X800s are beating the equivalent NVIDIA model by 30%. So have a long, hard think before you go to the dark side.
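
    As a rough mental model for the CPU question: each frame costs the CPU some milliseconds (game logic, driver overhead) and the GPU some milliseconds (rendering), and your framerate is set by whichever is slower. A toy sketch of that model, with all the numbers made up purely for illustration:

    Code:
    // Toy bottleneck model: frame time is set by the slower of CPU and GPU.
    // All figures below are invented for illustration, not measurements.
    #include <algorithm>
    #include <cstdio>

    int main() {
        struct Config { const char* name; double cpu_ms; double gpu_ms; };
        const Config configs[] = {
            {"fast CPU + 6800 GT",  8.0, 10.0}, // GPU-bound: faster card helps
            {"slow CPU + 6800 GT", 20.0, 10.0}, // CPU-bound: GPU idles half the frame
            {"slow CPU + old card", 20.0, 25.0} // GPU-bound again at high settings
        };
        for (const Config& c : configs) {
            double frame_ms = std::max(c.cpu_ms, c.gpu_ms);
            std::printf("%-20s -> ~%.0f fps (CPU %.0fms, GPU %.0fms)\n",
                        c.name, 1000.0 / frame_ms, c.cpu_ms, c.gpu_ms);
        }
        return 0;
    }

    The upshot: raising resolution and AA grows the GPU cost per frame while the CPU cost stays roughly fixed, so a 6800 GT on a slower CPU still pulls ahead once you crank the settings, even though it'll be CPU-limited at lower resolutions.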

    If you want a graphics card that will play DirectX 9 games to their full potential, IMO there is only one choice. Far Cry, and soon HL2, will show ATI dominating. The STALKER engine will also be using DX9, and a lot of games will be spawned from these engines. id has been very successful in the past with licensing engines, but this time round I foresee several problems. The 60Hz tic factor, limiting fps and physics, doesn't bode well for id's latest incarnation. Throw in a limit on simultaneous models on screen (not more than five or six) and you have a fairly limited next-gen engine.

    HL2 will not have a 60Hz limit or frame capping; even Far Cry doesn't contain this limit. It's part of the engine's structure and it's a mistake IMO. So hedge your bets wisely: you don't want to spend a lot of money and not get the best-value card. Currently the 6800 GT is the best pound-for-pound card, but when you're thinking of only one game with a limited engine and a limited frame rate, plus all the reasons I gave above, to me the choice is obvious.
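
    To illustrate the tic point: an engine with a fixed 60Hz tick advances game logic in 1/60-second steps, so however fast the card renders, the visible game state only changes 60 times a second. A generic sketch of that loop structure follows; it's the usual fixed-timestep pattern, not id's actual code:

    Code:
    // Minimal fixed-timestep loop: logic ticks at 60Hz regardless of render speed.
    // A generic sketch of the pattern, not code from any particular engine.
    #include <chrono>
    #include <cstdio>

    int main() {
        using clock = std::chrono::steady_clock;
        const double tick_dt = 1.0 / 60.0;   // fixed 60Hz logic step
        double accumulator = 0.0;
        long ticks = 0, frames = 0;
        auto prev = clock::now();
        const auto end = prev + std::chrono::seconds(2);

        while (clock::now() < end) {
            const auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - prev).count();
            prev = now;
            while (accumulator >= tick_dt) { // advance game state in fixed steps
                ++ticks;                     // physics/game logic would run here
                accumulator -= tick_dt;
            }
            ++frames;                        // "render" once per outer iteration
        }
        std::printf("%ld logic ticks, %ld rendered frames in 2s\n", ticks, frames);
        return 0;
    }

    Run it and the tick count stays at roughly 120 over the two seconds no matter how many frames come out the other end; the extra frames just redraw the same state.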

  15. #15
    Real Ultimate Power! Grey M@a's Avatar
    Join Date
    Oct 2003
    Location
    Newcastle
    Posts
    4,625
    Thanks
    52
    Thanked
    156 times in 139 posts
    • Grey M@a's system
      • Motherboard:
      • Gigabyte Z97X Gaming 7
      • CPU:
      • i7 4790K (With H100i cooling)
      • Memory:
      • Corsair Vengeance Pro 16GB DDR3 (2 x 8GB)
      • Storage:
      • Samsung 840 Pro 128GB SSD, 1TB Caviar Black WD HD, 4TB Caviar Black WD HD
      • Graphics card(s):
      • MSI R9 390X Gaming Edition 8GB
      • PSU:
      • SuperFlower Leadex GOLD 850W Fully Modular
      • Case:
      • Corsair 650D
      • Operating System:
      • Windows 8.1 Pro x64
      • Monitor(s):
      • 24" LG 24GM77-B 144Hz
      • Internet:
      • 100MB Virgin Media Cable
    Vis-à-vis the Doom 3 OpenGL argument (pre-emptive strike), ATI is developing its OpenGL driver to fully utilise the power of its X800 core.
    The proof is in the pudding, isn't it? No matter what ATi card I have tried or bought in the past (and returned), all had shoddy, if not downright disgraceful, OpenGL performance, and as the majority of the classics are in OpenGL I won't touch an ATi card till it's fixed. The remark about ATi having to do driver work to make OpenGL perform shouldn't have to be the case; the core's logic should be able to handle it without the need for hacking and tweaking drivers.

    Basically it boils down to ATi for DirectX and, at this moment in time, NVIDIA for OpenGL performance.

  16. #16
    Registered User
    Join Date
    Jan 2004
    Posts
    8
    Thanks
    0
    Thanked
    0 times in 0 posts
    Well, ATI are repaying their users' faith with improved performance across all OpenGL titles in the 3.9 beta Cats. Users have reported large fps gains in OpenGL games such as Call of Duty and RTCW, and those OpenGL titles are dominated by ATI. The single and only reason ATI aren't doing so well is the extreme use of stencil shadows in Doom 3. Turn the stencil shadows off and you'll see that ATI will take the lead again.
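
    For anyone wondering why stencil shadows in particular hurt: each light's shadow volume gets drawn twice with colour and depth writes off, purely to bump the stencil buffer up and down, which eats fill rate. The classic depth-fail passes look roughly like the sketch below; it's generic OpenGL, not id's code, it assumes an existing GL context and a depth buffer already filled by an ambient pass, and drawShadowVolumes()/drawLitScene() are hypothetical stand-ins:

    Code:
    #include <GL/gl.h>

    // Hypothetical stand-ins for the app's own geometry submission.
    static void drawShadowVolumes() { /* extruded volume geometry for one light */ }
    static void drawLitScene()      { /* scene geometry with this light's shading */ }

    // Depth-fail ("Carmack's reverse") stencil shadow passes for one light.
    void renderShadowedLight() {
        // Pass 1: mark shadowed pixels in the stencil buffer only.
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
        glDepthMask(GL_FALSE);
        glEnable(GL_STENCIL_TEST);
        glStencilFunc(GL_ALWAYS, 0, ~0u);
        glEnable(GL_CULL_FACE);

        glCullFace(GL_FRONT);                   // back faces of the volume
        glStencilOp(GL_KEEP, GL_INCR, GL_KEEP); // increment where depth test fails
        drawShadowVolumes();

        glCullFace(GL_BACK);                    // front faces of the volume
        glStencilOp(GL_KEEP, GL_DECR, GL_KEEP); // decrement where depth test fails
        drawShadowVolumes();                    // same geometry, drawn a second time

        // Pass 2: add the light only where stencil == 0 (unshadowed pixels).
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        glStencilFunc(GL_EQUAL, 0, ~0u);
        glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
        drawLitScene();

        glDisable(GL_STENCIL_TEST);
        glDepthMask(GL_TRUE);
    }

    That second draw of the same volume geometry, per light, is the cost ATI's hardware is apparently eating; it's stencil and fill-rate work that simply isn't there once shadows are off.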
