
Thread: nVidia - Caught cheating?

  1. #17
    Senior Member
    Join Date
    May 2004
    Location
    Rochester, NY
    Posts
    1,041
    Thanks
    4
    Thanked
    8 times in 8 posts
    • oralpain's system
      • Motherboard:
      • DFI "Blood Iron" P35-T2RL
      • CPU:
      • Intel Pentium E2140 @ 400x8 (3.2GHz), 1.375v
      • Memory:
      • Crucial Ballistix DDR2 800 CL4 @ 500MHz (DDR 1000), 4-4-4-12-T2, 2.3v
      • Storage:
      • 2x Seagate ST3250410AS
      • Graphics card(s):
      • NVIDIA 8800GTS (G92) 512 @ 783MHz core, 1836MHz shader, 1053Mhz memory, stock cooling 70% fan speed
      • PSU:
      • Seasonic SS-500GB
      • Case:
      • Antec P182, with some small modifications
      • Monitor(s):
      • ASUS VW222U
      • Internet:
      • Time Warner "Road Runner" Cable - 16 megabit downstream, 1 megabit upstream
    Everybody optimises; it's a good idea, until the optimisations bring about a noticeable loss in quality.

    I had a 6800GT @ 420/1128 but moved to a hard modded X800pro > X800XT when I went PCI-E. The NVIDIA card had slightly better AF quality at the settings I used, but I did run with most of the optimisations off, and performance did suffer a bit because of it. Either way I don't think it's a big deal as long as you can choose to run with fewer optimisations. Some games looked exactly the same but ran much faster with the optimisations on. Some looked like crap and needed the optimisations turned off to look good.

    I've never used a G70, but it seems all NVIDIA has to do is allow the optimizations to be disabled in the control panel like they did with the NV40s.

  2. #18
    Treasure Hunter extraordinaire herulach's Avatar
    Join Date
    Apr 2005
    Location
    Bolton
    Posts
    5,618
    Thanks
    18
    Thanked
    172 times in 159 posts
    • herulach's system
      • Motherboard:
      • MSI Z97 MPower
      • CPU:
      • i7 4790K
      • Memory:
      • 8GB Vengeance LP
      • Storage:
      • 1TB WD Blue + 250GB 840 EVo
      • Graphics card(s):
      • 2* Palit GTX 970 Jetstream
      • PSU:
      • EVGA Supernova G2 850W
      • Case:
      • CM HAF Stacker 935, 2*360 Rad WC Loop w/EK blocks.
      • Operating System:
      • Windows 8.1
      • Monitor(s):
      • Crossover 290HD & LG L1980Q
      • Internet:
      • 120mb Virgin Media
    Although there probably is a reason, at least in part it's likely down to the driver not knowing what architecture it's running on and so making some bad optimisation decisions, or no optimisation decisions whatsoever. For example, for all we know there could be a new way of buffering something or other that's driver-enabled but not used here, since the driver doesn't know what card it's running on.
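    As a rough illustration of the kind of decision being described here, below is a minimal Python sketch of a driver choosing optimisation paths from a device-ID table; the IDs, names and code paths are entirely invented for illustration and are not taken from any real driver.

    ```python
    # Hypothetical sketch only: the device IDs, names and optimisation lists
    # below are invented to illustrate the idea, not taken from any real driver.

    KNOWN_OPTIMISATIONS = {
        0x0040: ["fast_buffer_path", "aniso_sample_reduction"],    # an NV40-class part
        0x0090: ["fast_buffer_path", "aniso_sample_reduction",
                 "new_buffering_scheme"],                           # a G70-class part
    }

    def select_optimisations(device_id):
        """Return the optimisation list for a recognised GPU.

        An unrecognised (or masked) device ID falls back to a conservative path
        with no optimisations at all - the situation speculated about above."""
        return KNOWN_OPTIMISATIONS.get(device_id, [])

    print(select_optimisations(0x0090))   # full optimisation set
    print(select_optimisations(0x00F0))   # unknown ID -> [] (no optimisations)
    ```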

  3. #19
    ATI Technologies exAndrzej's Avatar
    Join Date
    Dec 2004
    Location
    London, UK
    Posts
    555
    Thanks
    0
    Thanked
    0 times in 0 posts
    I have to say, with all due respect, that I think you guys might be missing the key point

    Forget anything to do with history or the 'ATI Vs nVidia' situation - we are competitors in a great market and we both enjoy pushing as hard as we can - it is genuinely fun !


    In this case, all you need to look at is (a) how the 6800 and 7800 series cards decide where the right place is to change detail level (i.e. drop off quality), and (b) how the sampling works (or otherwise) from a user's point of view

    Everything else is smoke & mirrors

    If you could implement the 'shimmer driver' effect on a 6800 series card - how many more frames a second would it deliver ?

    In certain cases, some sites have made the argument that the improvement in frame rate caused by these definite quality changes could be as high as 30%

    The focus then becomes: how many more frames a second does a user expect to get in order to spend £400 on an upgrade ?

    If they are basing their purchasing decision on a gap that the review sites show...

    ...but in fact it is not an apples-2-apples comparison because the newer/more expensive card is (apparently) doing measurably less work...

    ...then would they have bought the new card if the gap were smaller ?

    Ultimately, would a user have spent the money if the 'real world' gap between two nVidia cards were smaller ?


    Side issue... if it was a bug - and being worked on in future driver releases - then it would be in the driver release notes - yes ?
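    To make point (a) above concrete, here is a minimal Python sketch of the textbook mip LOD calculation (roughly as the OpenGL spec words it), showing how a small LOD bias moves the detail-level transition earlier and onto a cheaper mip; the derivative values and the bias are made up for illustration and are not claimed to match either vendor's hardware.

    ```python
    import math

    def mip_lod(dudx, dvdx, dudy, dvdy, tex_size, lod_bias=0.0):
        """Textbook isotropic LOD selection, roughly as the OpenGL spec words it:
        lambda = log2(rho) + bias, where rho comes from the texture-coordinate
        derivatives across a pixel, scaled by the texture size."""
        rho = max(math.hypot(dudx * tex_size, dvdx * tex_size),
                  math.hypot(dudy * tex_size, dvdy * tex_size))
        return math.log2(max(rho, 1e-6)) + lod_bias

    # Same pixel, same geometry; only a (hypothetical) driver-applied bias differs.
    # A bias of +0.5 pushes this pixel onto a coarser, cheaper mip level sooner.
    for bias in (0.0, 0.5):
        lod = mip_lod(0.0037, 0.0, 0.0, 0.0037, tex_size=1024, lod_bias=bias)
        print(f"bias={bias:+.1f}  lod={lod:.2f}  base mip={math.floor(lod)}")
    ```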
    .
    "X800GT... snap it up while you still can"
    HEXUS
    ......................................August 2005

  4. #20
    Senior Member
    Join Date
    Oct 2004
    Location
    United Kingdom
    Posts
    254
    Thanks
    0
    Thanked
    0 times in 0 posts
    Well, considering that it has already been cleared up in a beta driver that nVidia has released, where the shimmering has gone completely without losing the mysterious 30% performance, I would say that it was something made from nothing.


    Beta driver 78.03. Again, here are some links that disprove this whole fiasco:

    http://www.hardforum.com/showthread.php?t=947072

    http://www.nvnews.net/vbulletin/showthread.php?t=55812

    Remember the bilinear/trilinear problems that the other company had last year with the X800 cards?
    Because of Munich, God gave us Georgy Best

  5. #21
    Senior Member Tobeman's Avatar
    Join Date
    Apr 2005
    Location
    IN YOUR FRIDGE, AWPIN' YOUR NOOBS
    Posts
    1,823
    Thanks
    34
    Thanked
    11 times in 11 posts
    Quote Originally Posted by Mesce
    THEY'RE REFERENCE CARDS, as in, not cards you can buy on the market. You might get lucky and find one here and there, but 2 reference gt's and 2 reference 7800gtx's?.. Not going to happen.
    You mean like stock nVidia cards? There are plenty of them on eBay.

  6. #22
    Rys
    Tiled
    Join Date
    Jul 2003
    Location
    Abbots Langley
    Posts
    1,479
    Thanks
    0
    Thanked
    2 times in 1 post
    Ignoring Andrzej's spin, this was brought up long before Fudo decided to catch up with the rest of us.

    Understanding of the issue would go a long way to allow insightful and informative commentary on it, and keep this thread interesting. I don't see much of that at all.
    MOLLY AND POPPY!

  7. #23
    Rys
    Tiled
    Join Date
    Jul 2003
    Location
    Abbots Langley
    Posts
    1,479
    Thanks
    0
    Thanked
    2 times in 1 post
    Quote Originally Posted by Mesce
    Well, considering that it has already been cleared up in a beta driver that nVidia has released, where the shimmering has gone completely without losing the mysterious 30% performance, I would say that it was something made from nothing.

    Beta driver 78.03. Again, here are some links that disprove this whole fiasco:

    http://www.hardforum.com/showthread.php?t=947072

    http://www.nvnews.net/vbulletin/showthread.php?t=55812

    Remember the bilinear/trilinear problems that the other company had last year with the X800 cards?
    It's not gone completely, and your previous assertion that it only happened in Quality mode and with AF off is completely wrong, too. You're right that it's not 30% of performance, but it's not a free fix either.

    And the trilinear/bilinear issues are optimisations common to both IHVs in currently shipping WHQL drivers.
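    For readers unfamiliar with that class of optimisation, here is a minimal Python sketch of the commonly described "brilinear"-style reduction: full trilinear blends between two mip levels across the whole LOD range, while the optimised path only blends in a narrow band around each crossover and otherwise samples a single mip. The band width and placement are illustrative only, not either IHV's actual values.

    ```python
    def trilinear_weight(lod):
        """Full trilinear: blend weight of the coarser mip is simply frac(lod)."""
        return lod - int(lod)

    def brilinear_weight(lod, band=0.3):
        """A 'brilinear'-style reduction as commonly described: only blend inside
        a narrow band around the mip crossover; elsewhere sample a single mip
        (plain bilinear), saving texel fetches. Band width/placement here are
        illustrative, not any vendor's real values."""
        f = lod - int(lod)
        lo, hi = 0.5 - band / 2, 0.5 + band / 2
        if f <= lo:
            return 0.0                    # stay entirely on the finer mip
        if f >= hi:
            return 1.0                    # switch entirely to the coarser mip
        return (f - lo) / (hi - lo)       # short blend across the transition

    for lod in (2.1, 2.4, 2.5, 2.8):
        print(f"lod={lod}: trilinear={trilinear_weight(lod):.2f}  "
              f"brilinear={brilinear_weight(lod):.2f}")
    ```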
    MOLLY AND POPPY!

  8. #24
    Xcelsion... In Disguise. Xaneden's Avatar
    Join Date
    Nov 2004
    Location
    United Kingdom
    Posts
    1,699
    Thanks
    0
    Thanked
    0 times in 0 posts
    Nvidia are definitely going to be in the doghouse then (somewhat unfairly).

    *smirks behind ATi banner *
    New Sig on the Way...

  9. #25
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,039
    Thanks
    1,880
    Thanked
    3,379 times in 2,716 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish
    We will see. ATI have always been keen to be compared in the image quality sense as well as outright speed. I have no doubt that thorough reviews will surface with the R520 et al that compare both speed and image quality with the G70 and previous generation cards with the latest drivers. Then it will be up to us, the informed purchasers, to choose the card which best fits our demands of price, speed and image quality.

    Personally I'm still waiting for the competition so that such a comparison can be carried out. I don't care about optimizations (I *love* Z-culling!) per se. ATI don't need to jump on these accusations, just produce a card that is better and let the reviewers show us just how much better it is.
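    As an aside on the kind of optimisation nobody objects to (the Z-culling mentioned above), here is a toy Python sketch of early-Z rejection: hidden fragments are skipped before the expensive shading step, so the final image is unchanged while the work done drops. It is purely illustrative and implies nothing about any vendor's actual hardware.

    ```python
    import math

    # Toy software-rasteriser fragment loop illustrating early-Z rejection:
    # hidden fragments are rejected *before* the expensive shading step, so the
    # final image is identical but far fewer shader invocations run.

    WIDTH = HEIGHT = 4
    depth_buffer = [[math.inf] * WIDTH for _ in range(HEIGHT)]
    color_buffer = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)]
    shaded = 0

    def expensive_shader(x, y):
        global shaded
        shaded += 1                              # count real shading work
        return (x * 60 % 256, y * 60 % 256, 128)

    def draw_fragment(x, y, z):
        if z >= depth_buffer[y][x]:              # early-Z: test depth first...
            return                               # ...and skip occluded fragments
        depth_buffer[y][x] = z
        color_buffer[y][x] = expensive_shader(x, y)

    # Far quad, then a near quad covering it, then the far quad again.
    for z in (0.9, 0.1, 0.9):
        for y in range(HEIGHT):
            for x in range(WIDTH):
                draw_fragment(x, y, z)

    print("fragments submitted:", 3 * WIDTH * HEIGHT)   # 48
    print("fragments shaded:   ", shaded)               # 32 - the rest were culled
    ```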

  10. #26
    Xcelsion... In Disguise. Xaneden's Avatar
    Join Date
    Nov 2004
    Location
    United Kingdom
    Posts
    1,699
    Thanks
    0
    Thanked
    0 times in 0 posts
    I agree, Kalniel. I doubt ATi will make any comments over this, for fear of prior 'incidents' with their cards being brought up. All they need to do is make the R520 a speed demon, and they're set to make a comeback.

    Also, it makes them look polite and somewhat the more sophisticated party involved. A brawl over this wouldn't look good for them.
    Last edited by Xaneden; 31-08-2005 at 03:01 PM.
    New Sig on the Way...

  11. #27
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,039
    Thanks
    1,880
    Thanked
    3,379 times in 2,716 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish
    Quote Originally Posted by Andrzej
    I have to say, with all due respect, that I think you guys might be missing the key point

    Forget anything to do with history or the 'ATI Vs nVidia' situation - we are competitors in a great market and we both enjoy pushing as hard as we can - it is genuinely fun !


    In this case, all you need to look at is (a) how the 6800 and 7800 series cards decide where the right place is to change detail level (i.e. drop off quality), and (b) how the sampling works (or otherwise) from a user's point of view

    Everything else is smoke & mirrors

    If you could implement the 'shimmer driver' effect on a 6800 series card - how many more frames a second would it deliver ?

    In certain cases, some sites have made the argument that the improvement in frame rate caused by these definite quality changes could be as high as 30%

    The focus then becomes: how many more frames a second does a user expect to get in order to spend £400 on an upgrade ?

    If they are basing their purchasing decision on a gap that the review sites show...

    ...but in fact it is not an apples-2-apples comparison because the newer/more expensive card is (apparently) doing measurably less work...

    ...then would they have bought the new card if the gap were smaller ?

    Ultimately, would a user have spent the money if the 'real world' gap between two nVidia cards were smaller ?


    Side issue... if it was a bug - and being worked on in future driver releases - then it would be in the driver release notes - yes ?
    Is this a case of kettle, pot, black, calling, the [rearrange]? Are ATI doing exactly the same? "ATI doesn’t even list it in the release notes for their Catalyst drivers"

    http://www.hardwareanalysis.com/cont.../article/1812/

  12. #28
    ATI Technologies exAndrzej's Avatar
    Join Date
    Dec 2004
    Location
    London, UK
    Posts
    555
    Thanks
    0
    Thanked
    0 times in 0 posts
    Quote Originally Posted by kalniel
    ...kettle, pot...
    Quote Originally Posted by Xcelsion
    ...doubt ATi will make any comments...


    I do love you guys...

    ...but almost everyone seems to have missed the real point completely !!!

    My earlier post pointed out AS CLEARLY AS POSSIBLE that this is not an ATI vs nVidia issue

    The question is not 'who is best' or 'who cheated' and I thought I stressed that strongly

    I will say it again, when comparing the 6800 and 7800 (at launch) you must focus only on :-
    1) Where does the change happen (i.e. why does the shimmering effect start in different places between 6800 and 7800) ?
    2) Why is it as pronounced as it is ?

    Quote Originally Posted by Andrzej
    ...Everything else is smoke & mirrors...
    Some of the stories that I have read on this subject are simply comical

    Calls for 'mass refunds' on the basis of shimmering are plain daft - as are people who think that this is something that will get solved with driver updates

    Some shimmering happens naturally because of what the game asks the card/driver to do

    Imagine a game scene with some 'noisy' area - possibly black and white pebbles - that is not on the main/primary level of detail (i.e. you are not staring at it close up)

    As you move through this area of the game, many variables are constantly changing

    As a result - with no optimisations at all (vendor is irrelevant here) - you can still get shimmering because of the way that the underlying texture is changing

    THIS IS JUST HOW IT IS - AND IS NOT LIKELY TO CHANGE - YOU NEED TO ACCEPT IT AND MOVE ON
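    To make that concrete, here is a tiny numerical Python sketch (the signal, footprint, sample counts and movement step are all invented) of why undersampling a 'noisy' texture shimmers as you move, while a well-filtered sample of the same texture stays stable:

    ```python
    import math

    def pebble_texture(u):
        """A 'noisy' high-frequency pattern - think black and white pebbles."""
        return 1.0 if math.sin(6000.0 * u) > 0 else 0.0

    def pixel_value(offset, samples):
        """Average the texture across one pixel footprint using N sample taps."""
        footprint = 0.01
        return sum(pebble_texture(offset + i / samples * footprint)
                   for i in range(samples)) / samples

    # Move the viewpoint a little each frame and watch one pixel's value.
    for frame in range(5):
        offset = frame * 0.0023                       # small per-frame movement
        under = pixel_value(offset, samples=2)        # too few taps: jumps around
        filtered = pixel_value(offset, samples=64)    # enough taps: stays ~0.5
        print(f"frame {frame}: undersampled={under:.2f}  filtered={filtered:.2f}")
    ```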

    That said, by looking at the different points where the shimmering takes place between the 6800 and 7800 (at launch) you can see that a 'weaker' detail level has been used earlier on the 7800

    The only question is 'With all that additional horsepower available in the new design - why does it appear that the AF workload was reduced ?'

    That is it

    No more - just focus on that

    To quote a member of the HEXUS Hierarchy...

    "Why is it - that whenever it’s discovered that NVIDIA seems to have a 'bug' in its drivers, which negatively affects image quality - that this seems to dramatically improve the benchmark performance of its products?
    As the author of the original James Bond novels, Ian Fleming wrote:
    Once is happenstance. Twice is coincidence. Three times is enemy action..."
    Last edited by exAndrzej; 13-09-2005 at 09:28 AM.
    .
    "X800GT... snap it up while you still can"
    HEXUS
    ......................................August 2005

  13. #29
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,039
    Thanks
    1,880
    Thanked
    3,379 times in 2,716 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish
    We love you too Andrzej, even if I have no idea how to pronounce your name; mine is the English version of the same, I think

    I probably misread what you were saying before - thinking you were calling NVidia out for not posting about it in the driver release notes, when in fact they have (I think) now posted about it in release notes where ATI have not (yet - and possibly irrelevant anyway).

    In your comparisons you seem to be saying that NVidia used this optimization to increase the perceived performance gain over the last generation of NVidia cards. I had missed your point that they seem to be applying *more* optimization to the 7800 than the 6800. I'm still not sure if you are saying they actually have, or just that the drivers at the time of launch for each of the two are different.

    If the same driver applies more optimization to the 7800 than the 6800 then you have a point, but I think it's a bit unfair to compare across different drivers. I have no qualms about NVidia or ATI increasing the speed of their drivers over time, as I think most review sites compare video cards under the same drivers for fairness' sake.

    The answer to your question "With all that additional horsepower available in the new design - why does it appear that the AF workload was reduced ?" is simply that they believed such a reduction in workload would increase performance without affecting image quality. If anyone thinks they are wrong about that, all they need to do (as said a few posts ago) is produce a rival card/driver that has better image quality.

  14. #30
    ATI Technologies exAndrzej's Avatar
    Join Date
    Dec 2004
    Location
    London, UK
    Posts
    555
    Thanks
    0
    Thanked
    0 times in 0 posts
    Quote Originally Posted by kalniel
    ...they believed such a reduction in workload would increase performance...
    That is cool

    However, in terms of making a buying decision, not having the 'shimmer driver' optimisation (i.e. reduced workload) on the 6800 series...

    ...means that the gap is bigger than it might have been if the release driver for 7800 also reduced the workload for the 6800 series

    Does that make sense ?
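    Here is a quick bit of toy Python arithmetic, with invented fps figures (none of these are real measurements), showing how a workload reduction applied only to the newer card inflates the gap a buyer sees in reviews:

    ```python
    # Toy arithmetic with invented fps figures - none of these are measurements.
    fps_6800 = 50.0                  # hypothetical older card, full workload
    fps_7800_full_work = 65.0        # hypothetical newer card, same workload
    reduction_speedup = 1.30         # the "as high as 30%" figure quoted above

    fps_7800_as_reviewed = fps_7800_full_work * reduction_speedup

    print(f"apples-to-apples gap: {fps_7800_full_work / fps_6800 - 1:.0%}")    # 30%
    print(f"gap shown in reviews: {fps_7800_as_reviewed / fps_6800 - 1:.0%}")  # 69%
    ```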

    That is why I included the quote:-
    Once = It just happens
    Twice = Coincidence
    Thrice = Deliberate action

    You asked about a 'fix' for ATI - but that would indicate a fix were needed

    We work closely with developers to make sure we do things in a way that pleases them (and they are the authors !)

    We also provide a 'switch' for Catalyst AI so that you can turn off optimisations and see what would happen in a 'vanilla' state

    Our guys tried it with the scenes in question - and they still exhibited the shimmer (i.e. no change from Catalyst AI enabled to Catalyst AI off - indicating that Catalyst AI was giving exactly the single-frame image that the game was trying to generate)

    The reason, as I tried to explain, is that when you pull samples out of a 'noisy' background while you are on the move - you will naturally get this effect

    That is why focusing on the effect is a mistake (as I pointed out)

    Focus instead on where it happens - that will explain what is really going on

    It is possible that there is a perfectly logical reason why the workload appears to have been reduced on the 7800 with the launch driver...

    ...and I have an open ear/mind should anyone have a few minutes to explain it to me...

    ...but this discussion has been running for a while and I have still not seen a post to explain it

    I am here to be shot at/discuss views/help with issues etc - and I am definitely open to alternative interpretations... but I have not heard any yet on this subject



    BTW: It is pronounced An-Zhay - or Anjay - or AJ - depending on how lazy you want to be... but anything that makes me turn around is considered a 'hit' when you have spurious 'z' letters in the middle of your name
    .
    "X800GT... snap it up while you still can"
    HEXUS
    ......................................August 2005

  15. #31
    Banhammer in peace PeterB kalniel's Avatar
    Join Date
    Aug 2005
    Posts
    31,039
    Thanks
    1,880
    Thanked
    3,379 times in 2,716 posts
    • kalniel's system
      • Motherboard:
      • Gigabyte Z390 Aorus Ultra
      • CPU:
      • Intel i9 9900k
      • Memory:
      • 32GB DDR4 3200 CL16
      • Storage:
      • 1TB Samsung 970Evo+ NVMe
      • Graphics card(s):
      • nVidia GTX 1060 6GB
      • PSU:
      • Seasonic 600W
      • Case:
      • Cooler Master HAF 912
      • Operating System:
      • Win 10 Pro x64
      • Monitor(s):
      • Dell S2721DGF
      • Internet:
      • rubbish
    Quote Originally Posted by Andrzej
    ...means that the gap is bigger than it might have been if the release driver for 7800 also reduced the workload for the 6800 series

    Does that make sense ?
    It does... but what confuses me is that the article suggests the workload is reduced for the 6800 series cards as well:

    Quote Originally Posted by inquirer
    It's interesting to note that older Geforce 5800 Ultra won't suffer from this, just the new cards that [ are] 6800 or 7800 based.
    (bold and grammatical correction mine)

    PS - please don't anyone think I'm an NVidia fanboy - I love both companies; the fact I have a green card in my computer is solely down to the fact that ATI cards are so good people don't part with them so easily
    Last edited by kalniel; 13-09-2005 at 01:31 PM.

