Nvidia Cheating In Benchmark Results

TenSix, Join Date: 2002-11-09, Member: 7932, Members
<div class="IPBDescription">Blamed on "Driver Bug"</div> Slashdot Article:
http://slashdot.org/articles/03/05/15/0620209.shtml?tid=152&tid=185&tid=137

The Article:
http://www.extremetech.com/article2/0,3973,1086025,00.asp

The Screenshots of "Garbled" Frames:
http://www.extremetech.com/article2/0,3973,1086862,00.asp

SUMMARY:
Nvidia may be coding in clipping planes to reduce the workload of the GeForce in certain demos and benchmarks. In 3DMark03, these clipping planes were strategically placed around the path the camera takes through the demo scene, but when the camera was moved off its normal path, graphical glitches were encountered. These graphical glitches were caused by the clipping planes.

Now, the same thing could have been done in the Doom 3 demo as well. It was a demo, right? Meaning the player is on a "track", so Nvidia could have possibly put in clipping planes around the track, making the workload of the card immensely lower by not rendering what the player wouldn't see, and giving it much higher benchmark scores.
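For readers unfamiliar with the term: a clip plane is just a plane equation, and anything on the "wrong" side of it can be discarded before the GPU ever shades it. The sketch below is a minimal, hypothetical illustration of that idea; it is not Nvidia's driver code, and the plane values and objects are invented purely to show how cheap the test is.

```cpp
#include <cstdio>

// A plane in the form ax + by + cz + d = 0.
// Points where (a*x + b*y + c*z + d) < 0 lie "behind" the plane.
struct Plane { float a, b, c, d; };
struct Point { float x, y, z; };

// Signed distance from a point to a plane.
float SignedDistance(const Plane& p, const Point& v) {
    return p.a * v.x + p.b * v.y + p.c * v.z + p.d;
}

int main() {
    // Hypothetical static clip plane hand-placed along a benchmark's camera rail:
    // everything beyond z = 50 is assumed to be invisible from the rail and skipped.
    Plane rail_clip = { 0.0f, 0.0f, -1.0f, 50.0f };

    Point objects[] = { { 0, 0, 10 }, { 5, 2, 49 }, { 0, 0, 80 } };

    for (const Point& obj : objects) {
        if (SignedDistance(rail_clip, obj) < 0.0f) {
            // Behind the static plane: never sent to the pixel shaders at all.
            std::printf("object at z=%.0f culled by static clip plane\n", obj.z);
        } else {
            std::printf("object at z=%.0f rendered normally\n", obj.z);
        }
    }
    return 0;
}
```

Legitimate engines run the same kind of test against the camera's actual view frustum every frame; the accusation here is that the planes were fixed in advance for one known camera path, which is exactly why they fall apart the moment the camera leaves the rail.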

They also used a method of not clearing the frame buffer to increase FPS results; the problem is that this causes major artifacts when going "off the track". See the quote below:
QUOTE:
Clearing the buffers (color and Z) involves writing zeros to all memory locations in the new back buffer and Z-buffer so that the GPU can begin drawing the next scene with a "clean slate." But since not all memory locations will contain visible pixels, parts of the buffers remain dirty -- based on the assumption that the camera's unvarying movement path won't catch the errors. The advantage to not clearing buffers is the bandwidth savings of avoiding those writes to memory. This frees up precious memory bandwidth to be working on other things, like texture/texel reads and writes of visible parts of the scenes.
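To make the quote concrete, here is a small, purely illustrative sketch of why skipping the per-frame clear saves memory writes, and why it leaves stale data wherever the new frame does not overdraw. The software framebuffer, resolution, and "bottom half only" drawing are assumptions made up for the example, not anything taken from Nvidia's driver.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    const int width = 640, height = 480;
    std::vector<uint32_t> color(width * height, 0xDEADBEEF); // leftovers from the previous frame
    std::vector<float>    depth(width * height, 0.123f);

    bool clear_buffers = false; // the alleged shortcut: skip the clear entirely

    if (clear_buffers) {
        // A full clear touches every pixel of both buffers:
        // 640*480*(4 bytes color + 4 bytes depth) is roughly 2.3 MB of writes per frame.
        for (size_t i = 0; i < color.size(); ++i) { color[i] = 0; depth[i] = 1.0f; }
    }

    // Pretend the new frame only draws the bottom half of the screen
    // (e.g. the camera rail guarantees the sky is always covered by geometry).
    for (int y = height / 2; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            color[y * width + x] = 0x00FF00; // freshly drawn pixel
            depth[y * width + x] = 0.5f;
        }

    // Any pixel the new frame did not cover still holds last frame's data.
    int stale = 0;
    for (uint32_t c : color)
        if (c == 0xDEADBEEF) ++stale;

    std::printf("stale pixels left on screen: %d of %d\n", stale, width * height);
    return 0;
}
```

On the fixed benchmark path every pixel happens to be overdrawn anyway, so the missing clear is invisible and the saved bandwidth shows up as extra FPS; move the camera off the rail and those stale pixels are the "garbled" frames in the screenshots.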




QUOTES:
QUOTE:
The problem is that, because a given workload is identical from run to run, hardware vendors can carefully study it and make optimizations to maximize performance, some legitimate, and others unsavory. In these types of deterministic tests, the camera movement is essentially "on a rail," (think of the opening credits sequence in Half-Life where you ride the tram into the depths of the base). And because the camera's every move is a known quantity, hardware vendors are able to figure out exactly what will, and won't be visible to the camera. This is where vendors can look for places to cut corners.

A developer version of 3DMark03 version 3.2 allows the tester to pause playback, and then move freely through the scene -- in much the same way you'd move through a first-person shooter like Unreal Tournament 2003. nVidia, because they are not a member of FutureMark's beta program, did not have access to that developer version when testing the 5900 Ultra with 3DMark2003.

QUOTE:
During our analysis of Game Test 4, we paused the benchmark, went into the free-look mode, and moved "off the rail" in the 3D scene. Once we did that, serious drawing errors were readily apparent. We interpreted these results to indicate that nVidia was adding static clip planes into the scene. These static clip planes reduce the amount of sky that the GPU has to draw, thus reducing the pixel shader workload and boosting performance.

QUOTE:
We discussed our findings with nVidia before we published this article, and the company's Direct3D driver team is currently investigating the problem. At press time, the company's engineers believe that the anomalies we're seeing are the result of a driver bug.

Draw your own conclusions from the article... I know 3D card makers have cheated on benchmarks before. But still, they weren't just tweaking the card to perform better; they were outright altering the benchmark to increase scores!

Comments

  • Dubers (Pet Shop Boy, Edinburgh, UK), Join Date: 2002-07-25, Member: 998, Members
    Man they must be worried if they are fixing results.
  • Cereal_KillR, Join Date: 2002-10-31, Member: 1837, Members
    I think it's sad... I had a lot of faith in nVidia and was hoping they would be able to topple ATI's superiority. Looks like they aren't all that honest.
  • GWAR, Join Date: 2002-11-01, Member: 2297, Members, Contributor
    My overclocked GeForce 2 GTS has served me well, and if I were to ever upgrade it would be to a GeForce4 Ti.
  • Confused (Wait. What?), Join Date: 2003-01-28, Member: 12904, Members, Constellation, NS2 Playtester, Squad Five Blue, Subnautica Playtester
    ATI is your god:)

    but yeah, this sucks. All those benchmarks I pore over are not just inflated now, they are flat-out irrelevant. I will have to go back to the tried and true "if it's cheaper, buy it" method :)
  • Dubers (Pet Shop Boy, Edinburgh, UK), Join Date: 2002-07-25, Member: 998, Members
    What I don't understand is why they did it when it was blatantly obvious what they were up to.
  • Speed_2_Dave, Join Date: 2002-11-15, Member: 8788, Members
    QUOTE (Dubers @ May 15 2003, 01:53 PM):
    What I don't understand is why they did it when it was blatantly obvious what they were up to.
    I believe he mentioned that nVidia lacked the trick that Slashdot used; they used a different "viewer" or some such. Also, if you can cheat cost-efficiently, wouldn't you? How many people will refute the claim of them cheating merely because they are nVidia fanboys and will buy it anyway?

    All these benchmarks do is help confuse new buyers into buying something they think is superior to the competition. They succeeded.
  • Error404, Join Date: 2002-11-19, Member: 9353, Members
    This could be really damaging to their reputation.
  • eaglec, Join Date: 2002-11-25, Member: 9948, Members, Constellation
    tbh, most of us know that all companies 'cheat' to get the best marketing campaign for their product. Look at any list of benchmarks and you are going to get misled one way or another. My old Kyro 2 scored higher than a GeForce 3 on some benchmarks, but in game... not a chance.

    Nvidia has accused ATI of cheating when ATI cheated better than Nvidia, ATI has accused Nvidia, and 3dfx accused them both. Take any benchmark with a large pinch of salt (about a kilo), especially if the manufacturer promotes their card based on that benchmark.

    Really, neither company has a higher moral standing; they're all after your money, and if they can't get it legitimately they will happily trick you into it. That's business. Remember, it all comes down to whether one guy, or a small group of guys, gets their sports car/boat/holiday for Christmas.

    I'm an honest guy, but if any company offered me two grand to say their card was fab or to help them 'cheat' a benchmark, I'd do it.

    Yes, I am a cynical old b@stard.

    Ten years ago I wasn't. I was a cynical young b@stard.


    So where did I put my money... well, the AMD 450 got a Kyro 2, the P3-1GHz a GeForce3 Ti200, and the P4-2GHz a GeForce4 Ti 4800. I have a TNT2 Ultra sitting in a box. Would I like an ATI 9700 Pro? Of course I would, but it's too damn expensive.
  • DOOManiac (Worst. Critic. Ever.), Join Date: 2002-04-17, Member: 462, Members, NS1 Playtester
    I think we should all wait before rioting.

    Something about all this just seems fishy to me.

    QUOTE:
    Meaning the player is on a "track", so Nvidia could have possibly put in clipping planes around the track, making the workload of the card immensely lower by not rendering what the player wouldn't see, and giving it much higher benchmark scores.

    Games don't render in 360 degrees. They render only what the player should see; it's been this way since DOOM (only rendering what the player could see was one of DOOM's big technological breakthroughs). 3D stuff would be unplayably slow any other way. I don't know what to think about the above statement other than "huh?" (a rough sketch of that kind of visibility test follows at the end of this post).

    I say we just wait for more information from more reliable sites that we have heard of before, rather than jumping to conclusions because Junky191 says there's wrongdoing. Besides, don't you think the people running these tests would notice if the thing was reporting differently from what it was running?
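
    A rough sketch of the kind of per-frame visibility test described above (field-of-view culling; the camera, angles, and object positions are made up for illustration, and real engines use a full 3D frustum plus structures like BSP trees or portals):

    ```cpp
    #include <cmath>
    #include <cstdio>

    // Minimal 2D field-of-view test: is an object inside the camera's view cone?
    struct Vec2 { float x, y; };

    float Dot(Vec2 a, Vec2 b) { return a.x * b.x + a.y * b.y; }
    float Length(Vec2 a) { return std::sqrt(Dot(a, a)); }

    bool InView(Vec2 camera_pos, Vec2 camera_dir, float fov_degrees, Vec2 object_pos) {
        Vec2 to_object = { object_pos.x - camera_pos.x, object_pos.y - camera_pos.y };
        float len = Length(to_object);
        if (len == 0.0f) return true;
        // Compare the angle between the view direction and the object
        // against half of the field of view.
        float cos_angle = Dot(camera_dir, to_object) / (Length(camera_dir) * len);
        float cos_half_fov = std::cos(fov_degrees * 0.5f * 3.14159265f / 180.0f);
        return cos_angle >= cos_half_fov;
    }

    int main() {
        Vec2 cam = { 0, 0 };
        Vec2 dir = { 0, 1 };        // camera looks down +Y
        float fov = 90.0f;          // a typical FPS field of view

        Vec2 in_front = { 1, 10 };  // roughly ahead of the camera: drawn
        Vec2 behind   = { 0, -5 };  // behind the camera: skipped every frame anyway

        std::printf("in_front visible: %d\n", InView(cam, dir, fov, in_front));
        std::printf("behind visible: %d\n", InView(cam, dir, fov, behind));
        return 0;
    }
    ```

    The key difference from the alleged cheat is that this test adapts to wherever the camera actually points each frame, whereas a clip plane baked in for one fixed camera rail does not.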
  • Salty, Join Date: 2002-11-05, Member: 6970, Members
    Isn't rendering stuff beforehand what buffering is?
  • Okabore, Join Date: 2002-11-21, Member: 9505, Members
    Of course a card should only render what you see, but the problem, as I understood it, is that they are dumping info to save bandwidth, so when you want to view things from a different angle it goes fubar.
  • airyK, Join Date: 2002-12-19, Member: 11126, Members
    QUOTE (eaglec @ May 15 2003, 03:03 PM):
    tbh, most of us know that all companies 'cheat' to get the best marketing campaign for their product. Look at any list of benchmarks and you are going to get misled one way or another. My old Kyro 2 scored higher than a GeForce 3 on some benchmarks, but in game... not a chance.

    Nvidia has accused ATI of cheating when ATI cheated better than Nvidia, ATI has accused Nvidia, and 3dfx accused them both. Take any benchmark with a large pinch of salt (about a kilo), especially if the manufacturer promotes their card based on that benchmark.

    Really, neither company has a higher moral standing; they're all after your money, and if they can't get it legitimately they will happily trick you into it. That's business. Remember, it all comes down to whether one guy, or a small group of guys, gets their sports car/boat/holiday for Christmas.

    I'm an honest guy, but if any company offered me two grand to say their card was fab or to help them 'cheat' a benchmark, I'd do it.

    Yes, I am a cynical old b@stard.

    Ten years ago I wasn't. I was a cynical young b@stard.

    So where did I put my money... well, the AMD 450 got a Kyro 2, the P3-1GHz a GeForce3 Ti200, and the P4-2GHz a GeForce4 Ti 4800. I have a TNT2 Ultra sitting in a box. Would I like an ATI 9700 Pro? Of course I would, but it's too damn expensive.
    You've nailed it perfectly; in this game everyone tries to cheat. That's the one thing everyone should know.
  • TychoCelchuuu (Anememone), Join Date: 2002-03-23, Member: 345, Members
    QUOTE (DOOManiac @ May 15 2003, 05:48 PM):
    I think we should all wait before rioting.

    Something about all this just seems fishy to me.

    QUOTE:
    Meaning the player is on a "track", so Nvidia could have possibly put in clipping planes around the track, making the workload of the card immensely lower by not rendering what the player wouldn't see, and giving it much higher benchmark scores.

    Games don't render in 360 degrees. They render only what the player should see; it's been this way since DOOM (only rendering what the player could see was one of DOOM's big technological breakthroughs). 3D stuff would be unplayably slow any other way. I don't know what to think about the above statement other than "huh?".

    I say we just wait for more information from more reliable sites that we have heard of before, rather than jumping to conclusions because Junky191 says there's wrongdoing. Besides, don't you think the people running these tests would notice if the thing was reporting differently from what it was running?
    One of the next big things possible, according to that guy who made DOOM, is retinal tracking. Someday, companies will package retinal trackers with the games, and then your computer only renders high levels of detail wherever you are looking.

    This is sort of like that. Or they're just cheatin' weenies.
  • TenSix, Join Date: 2002-11-09, Member: 7932, Members
    QUOTE:
    Games don't render in 360 degrees. They render only what the player should see; it's been this way since DOOM (only rendering what the player could see was one of DOOM's big technological breakthroughs). 3D stuff would be unplayably slow any other way. I don't know what to think about the above statement other than "huh?".

    I say we just wait for more information from more reliable sites that we have heard of before, rather than jumping to conclusions because Junky191 says there's wrongdoing. Besides, don't you think the people running these tests would notice if the thing was reporting differently from what it was running?

    http://www.extremetech.com/article2/0,3973,1086860,00.asp

    From the article:
    QUOTE:
    During our analysis of Game Test 4, we paused the benchmark, went into the free-look mode, and moved "off the rail" in the 3D scene. Once we did that, serious drawing errors were readily apparent. We interpreted these results to indicate that nVidia was adding static clip planes into the scene. These static clip planes reduce the amount of sky that the GPU has to draw, thus reducing the pixel shader workload and boosting performance.

    From what I understand after reading the article again, in the game the whole scene is there, but the card only really "renders" what the player is looking at; the rest isn't detailed/bump mapped/pixel shaded or whatever until the player looks at it. But it's still there. What Nvidia did, to my understanding, was put in clipping planes so that these unseen areas would not be rendered at all, thus lessening the overall workload. You can do this in Half-Life: tell it not to render certain things at all.
  • Cereal_KillR, Join Date: 2002-10-31, Member: 1837, Members
    QUOTE (TychoCelchuuu @ May 16 2003, 02:38 PM):
    One of the next big things possible, according to that guy who made DOOM, is retinal tracking. Someday, companies will package retinal trackers with the games, and then your computer only renders high levels of detail wherever you are looking.
    what if you close your eyes? two people looking? cross-eye? :P