Driverheaven Source Benchmark

Comments

  • dhakbar Join Date: 2004-08-01 Member: 30305, Members
    QUOTE (coil @ Aug 19 2004, 10:59 AM):
      Frankly, what I love about Source as opposed to DOOM3 is the subtlety of it. This pic (http://www.darkbeforedawn.net/external/css1.jpg) that Uncritical posted in the CS:Source thread is a perfect example -- the tiles look real. The lighting looks real. DOOM is scary and claustrophobic and cool, but with Source you could make and show me a room that was convincing enough to be real.
    This has nothing to do with the engine. What you are impressed by is not the engine, but the art assets. The capabilities that make the lighting and the tiles look realistic in the HL2 engine are already in the Doom 3 engine.

    Someone could make convincing-looking tiles and convincing lighting with the Doom 3 engine... you all seem to think that Doom 3 the game is all that Doom 3 the engine can do, and that's preposterous. Wait until somebody makes a game using the Doom 3 engine that isn't dark and claustrophobic.
  • dhakbar Join Date: 2004-08-01 Member: 30305, Members
    QUOTE (coil @ Aug 19 2004, 10:59 AM):
      However, I'm not surprised by these results. Everything I've read on the subject, both from Valve and from independent sites, says that Source is, if nothing else, a DirectX-centric engine. DirectX was created as and is fundamentally a standard. It's like standardizing electrical outlets or HTML coding (the W3C standard) or anything - the idea is to make something that is easy to deal with - theoretically, on both the software and the hardware side of things.

      The problem, as I have been led to understand, is that NVidia likes to cut corners with its cards, and DirectX isn't really designed with corner-cutting in mind. The FX series is a classic example; they were so flawed (as someone said in the CS:Source thread, only capable of supporting Shader 1.4 instead of the DX9-standard Shader 2.0) that the Source engine actually defaults them to DX8.1 instead of DX9. It appears that the 6x series is cutting similar corners.
    You think that DirectX is "fundamentally a standard"?!

    Ever heard of OpenGL? The industry standard that is governed by a standards group composed of various companies and industry representatives, NOT one large company that is also a known monopolist... That's a standard. It is used in applications ranging from rendering CGI movies to gaming on various platforms, including Sony's consoles and PCs running Windows and Unix operating systems.
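
A minimal sketch, with invented names and capability values (this is not Valve's actual code), of the kind of DX-level fallback the quoted post describes: a card reporting only ps1.x-class pixel shader support gets routed to the DX8.1 path instead of the full DX9 path.

    #include <cstdio>

    // Hypothetical capability report from the driver; not a real DirectX structure.
    struct GpuCaps {
        int pixelShaderMajor;
        int pixelShaderMinor;
    };

    enum class RenderPath { DX81, DX90 };

    // Pick the highest render path the reported pixel shader version can drive.
    RenderPath ChooseRenderPath(const GpuCaps& caps) {
        if (caps.pixelShaderMajor >= 2)
            return RenderPath::DX90;  // full Shader Model 2.0 path
        return RenderPath::DX81;      // ps1.x-class hardware falls back to the DX8.1 path
    }

    int main() {
        const GpuCaps fxClass {1, 4};   // assumption: an FX-class card treated as ps1.4
        const GpuCaps sm2Class{2, 0};   // assumption: a full SM2.0 card

        std::printf("ps1.4 card -> %s\n", ChooseRenderPath(fxClass)  == RenderPath::DX90 ? "DX9 path" : "DX8.1 path");
        std::printf("ps2.0 card -> %s\n", ChooseRenderPath(sm2Class) == RenderPath::DX90 ? "DX9 path" : "DX8.1 path");
    }
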
  • RandomEngy Join Date: 2002-11-03 Member: 6146, Members, Reinforced - Shadow
    QUOTE (t20 @ Aug 19 2004, 04:37 PM):
      QUOTE (Travis Dane @ Aug 19 2004, 10:16 AM):
        Then I suppose I'm the only one seriously doubting these benchmark results ;)
      You are right to be suspicious, Travis :)

      The benchmark is quite simply BS. This one seems about right, showing much more reasonable results: http://www.vr-zone.com/?i=1181&s=1

      Also have a look at http://www.hardforum.com/showthread.php?t=796256&page=1&pp=20 - it seems the gap is approx 20% with the 61.77 drivers and 10% with the beta 65.62s.
    About that first link:
    About that first link:

    1) The site is terrible, with pop-ups built into the page to circumvent blockers.
    2) The frame rate is obviously capped at 75 fps, which is why all the benchmarks looked the same (close to, but not over, 75 fps).

    It looks like a pretty worthless review.
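
A small illustration, with made-up numbers and a hypothetical helper, of the capped-benchmark point above: if every card lands just under the monitor's refresh rate and none exceed it, the run is probably vsync-limited and says little about relative GPU speed.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // True if every result sits in a narrow band just below the refresh rate,
    // which suggests the benchmark was capped by vsync rather than GPU-limited.
    bool LooksVsyncCapped(const std::vector<double>& fpsResults, double refreshHz) {
        return std::all_of(fpsResults.begin(), fpsResults.end(), [refreshHz](double fps) {
            return fps <= refreshHz && fps > refreshHz - 5.0;
        });
    }

    int main() {
        const std::vector<double> suspicious{74.2, 73.8, 74.9, 74.5};  // made-up scores
        const std::vector<double> plausible {52.0, 71.3, 88.6, 64.1};  // made-up scores
        std::printf("suspicious run capped at 75 Hz? %s\n", LooksVsyncCapped(suspicious, 75.0) ? "yes" : "no");
        std::printf("plausible run capped at 75 Hz?  %s\n", LooksVsyncCapped(plausible, 75.0) ? "yes" : "no");
    }
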
  • UnCritical Join Date: 2002-01-25 Member: 73, Members, Constellation
    QUOTE (dhakbar @ Aug 19 2004, 06:47 PM):
      You think that DirectX is "fundamentally a standard"?!

      Ever heard of OpenGL? The industry standard that is governed by a standards group composed of various companies and industry representatives, NOT one large company that is also a known monopolist... That's a standard. It is used in applications ranging from rendering CGI movies to gaming on various platforms, including Sony's consoles and PCs running Windows and Unix operating systems.
    Both DirectX and OpenGL are standards... it's not like you can only have one standard.

    Example:
    1 meter is a standard. So is a foot. They both do the same thing (measure distances).
    DirectX is a standard. So is OpenGL.
  • Mullet Join Date: 2003-04-28 Member: 15910, Members, Constellation
    If that graph showed ATI with the lower FPS, I would still buy an ATi, because I've been screwed over by two nVidia cards. I lost over 300 dollars on them... bast***s.
  • Wheeee Join Date: 2003-02-18 Member: 13713, Members, Reinforced - Shadow
    It's not like Microsoft is gangbeating its DX standards into the industry; they take a lot of input from various graphics hardware companies. IIRC, nVidia dropped out of the DX9 project because they couldn't force Microsoft to add some specifications that would have favored them.
  • Jim_has_Skillz Join Date: 2003-01-19 Member: 12475, Members, Constellation
    Doom 3 has great visual quality to it, and you will see some pretty kick-**** games in the near future using its engine. Doom 3 even has a built-in Ultra setting, which is meant for new video cards yet to come out with 512 MB of VRAM. The quality of detail looks orgasmic if you turn this setting on, even if there is a very large drop in your fps.
  • Swiftspear Custim tital Join Date: 2003-10-29 Member: 22097, Members
    A. The bar-graph picture in this thread IS NOT part of the benchmark series. They compared the cards at full settings after the full benchmarks, just to see what it would look like. The GeForce's rough 8xS AA stole a lot of speed from the card, but even when comparing at equal AA levels, the ATI card showed an average 30% increase in FPS performance over the nVidia card.

    B. The second benchmark link on this page, the one from the site whose pop-ups circumvent pop-up blockers (thank God for Firefox), benchmarks CS:S, not HL2. These games run the same engine, but there are noticeable differences; it would be like benchmarking HL against NS... HL2 has much more poly-intensive models, and does a lot more work with map realism and lighting than CS:S does. This could easily account for the difference in results between the two benchmarks...
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    edited August 2004
    However, considering that ATi's '8x' can be TFSAAd to run at 16x with NO loss in framerate, nVidia gets another jackboot to the face.

    The benchmarks are real. We've had Doom3, HEAVILY optimized for nVidia cards. ATi kept up at a decent rate, using the default OpenGL path.
    nVidia instead used a customized backend (or one reintegrated into the drivers' backend) that dropped colour calculations to half or one-third of what they SHOULD have been, crappifying the visual quality to synthetically boost fps... all so they can psych what few fanboys they have left into thinking they are actually performing well. So don't even talk about nVidia doing better with OpenGL until you run the tests on 'clean' games, meaning non-optimized ones. ATi cards will whup the crap out of 'em.

    Now we have Source coming out, a game that utilizes a different standard. One that nVidia hasn't had time to code in cheats for yet. They do abysmally... but the nVidia fanboys rush in and call the benchmarks 'unfair', because it shows how they ACTUALLY perform, outside a horrifically stacked and... er... 'optimized' *coughcheateredcough* game.


    Face it. The 6800s are lacking. The only way they can even seem to keep up with the X800 XTPE is when they cheat, so they only do half to one third the work.
  • NumbersNotFound Join Date: 2002-11-07 Member: 7556, Members
    QUOTE (Talesin @ Aug 19 2004, 06:15 PM):
      However, considering that ATi's '8x' can be TFSAAd to run at 16x with NO loss in framerate, nVidia gets another jackboot to the face.

      The benchmarks are real. We've had Doom3, HEAVILY optimized for nVidia cards. ATi kept up at a decent rate, using the default OpenGL path.
      nVidia instead used a customized backend (or one reintegrated into the drivers' backend) that dropped colour calculations to half or one-third of what they SHOULD have been, crappifying the visual quality to synthetically boost fps... all so they can psych what few fanboys they have left into thinking they are actually performing well. So don't even talk about nVidia doing better with OpenGL until you run the tests on 'clean' games, meaning non-optimized ones. ATi cards will whup the crap out of 'em.

      Now we have Source coming out, a game that utilizes a different standard. One that nVidia hasn't had time to code in cheats for yet. They do abysmally... but the nVidia fanboys rush in and call the benchmarks 'unfair', because it shows how they ACTUALLY perform, outside a horrifically stacked and... er... 'optimized' *coughcheateredcough* game.

      Face it. The 6800s are lacking. The only way they can even seem to keep up with the X800 XTPE is when they cheat, so they only do half to one third the work.
    Sources? Examples? You've been saying this forever and have never once shown how an nVidia card looks worse in-game than an ATI card, because I just don't see it.
  • Caboose title = name(self, handle) Join Date: 2003-02-15 Member: 13597, Members, Constellation
    QUOTE (t20 @ Aug 19 2004, 10:37 AM):
      QUOTE (Travis Dane @ Aug 19 2004, 10:16 AM):
        Then I suppose I'm the only one seriously doubting these benchmark results ;)
      You are right to be suspicious, Travis :)

      The benchmark is quite simply BS. This one seems about right, showing much more reasonable results: http://www.vr-zone.com/?i=1181&s=1

      Also have a look at http://www.hardforum.com/showthread.php?t=796256&page=1&pp=20 - it seems the gap is approx 20% with the 61.77 drivers and 10% with the beta 65.62s.
    I'd be willing to bet that if a mod were to look at t20's IP, they would find it to be the same as Travis's. :D
  • CForrester P0rk(h0p Join Date: 2002-10-05 Member: 1439, Members, Constellation
    edited August 2004
    And a tense air blew through the thread as all remained silent, secure in the knowledge that Talesin had once again kicked their butts.

    [EDIT:] Damn. Not so silent.
  • CommunistWithAGun Local Propaganda Guy Join Date: 2003-04-30 Member: 15953, Members
    CForrester's member title should be Son of Meatloaf, not porkchop ;p
  • Status_Quo Join Date: 2004-01-30 Member: 25749, Members
    More benchmarks for various games & cards, if you're interested: http://www20.graphics.tomshardware.com/graphic/20040504/ati-x800-12.html
  • dhakbar Join Date: 2004-08-01 Member: 30305, Members
    QUOTE (UnCritical @ Aug 19 2004, 01:31 PM):
      QUOTE (dhakbar @ Aug 19 2004, 06:47 PM):
        You think that DirectX is "fundamentally a standard"?!

        Ever heard of OpenGL? The industry standard that is governed by a standards group composed of various companies and industry representatives, NOT one large company that is also a known monopolist... That's a standard. It is used in applications ranging from rendering CGI movies to gaming on various platforms, including Sony's consoles and PCs running Windows and Unix operating systems.
      Both DirectX and OpenGL are standards... it's not like you can only have one standard.

      Example:
      1 meter is a standard. So is a foot. They both do the same thing (measure distances).
      DirectX is a standard. So is OpenGL.
    You entirely miss my point.

    DirectX is not as "standard" as OpenGL. That is a fact. OpenGL is usable on a huge variety of hardware. DirectX is usable on Windows PCs. DirectX is one of the reasons that PC games are primarily released for Windows... MS made DirectX and did all they could to get developers to use it to create their games so that those games would only run on the Windows platform.

    It's just standard Microsoft practice: making sure they can maintain their desktop monopoly. If more developers would develop with OpenGL, Linux and OS X users would have a much better chance of playing games, since OpenGL is cross-platform. By using DirectX, game developers perpetuate their reliance upon Windows, which perpetuates everyone's reliance upon Windows, because it is the only platform most PC games are released for. That means less competition for the desktop, which means Microsoft can keep charging high prices for a sub-par product, and you have to keep using it because you have no other options.

    OpenGL is cross-platform, making it a much more consumer-friendly standard.
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    edited August 2004
    QUOTE (404NotFound @ Aug 19 2004, 03:22 PM):
      QUOTE (Talesin @ Aug 19 2004, 06:15 PM):
        However, considering that ATi's '8x' can be TFSAAd to run at 16x with NO loss in framerate, nVidia gets another jackboot to the face.

        The benchmarks are real. We've had Doom3, HEAVILY optimized for nVidia cards. ATi kept up at a decent rate, using the default OpenGL path.
        nVidia instead used a customized backend (or one reintegrated into the drivers' backend) that dropped colour calculations to half or one-third of what they SHOULD have been, crappifying the visual quality to synthetically boost fps... all so they can psych what few fanboys they have left into thinking they are actually performing well. So don't even talk about nVidia doing better with OpenGL until you run the tests on 'clean' games, meaning non-optimized ones. ATi cards will whup the crap out of 'em.

        Now we have Source coming out, a game that utilizes a different standard. One that nVidia hasn't had time to code in cheats for yet. They do abysmally... but the nVidia fanboys rush in and call the benchmarks 'unfair', because it shows how they ACTUALLY perform, outside a horrifically stacked and... er... 'optimized' *coughcheateredcough* game.

        Face it. The 6800s are lacking. The only way they can even seem to keep up with the X800 XTPE is when they cheat, so they only do half to one third the work.
      Sources? Examples? You've been saying this forever and have never once shown how an nVidia card looks worse in-game than an ATI card, because I just don't see it.
    Take a look at the screenshots posted by EEK in the 6600 announced thread (http://www.unknownworlds.com/forums/index.php?showtopic=77620&st=30). The topmost shows the shadow-banding on the hand, and to a lesser extent, on the walls and zombies.
    Also noticeable is the horrible texture/normal-map artifacting, especially along the top of the AR and in the zombies' clothes. Also note that the nVidia 'answer' posts later in that thread do not feature any shots of the AR or hand in a position where the shadow-banding would be seen blatantly, but some banding is still faintly noticeable around the knuckles of the hand holding the flashlight up. :D

    nVidia does NOT comply with the OpenGL standard, nor does it render the game as it was meant to be. They take shortcuts, doing half to one-third the work of an ATi card, sacrificing showing the game as it's meant to be seen in favor of propping up their flagging framerates, to give what few fanboys they have left some small hope that they might come back from second place.
  • Soylent_green Join Date: 2002-12-20 Member: 11220, Members, Reinforced - Shadow
    edited August 2004
    Am I the only one who thinks 8xS AA and 6x MSAA should not under any circumstances be compared? 8xS is MUCH MUCH harder for the hardware to do; it's something like 4x RGMS + 2x SS. Honestly, this is like running the GeForce at 1600x1200 and the Radeon at 1024x768 and going "see, the ATi pwns nVidia by 67%!!!11"

    QUOTE:
      However, considering that ATi's '8x' can be TFSAAd to run at 16x with NO loss in framerate, nVidia gets another jackboot to the face.

    Except it only works at high framerates, and then only when things are still (or the illusion won't work; it's when you're still that you see aliasing most easily, so this isn't much of a problem). Only 6x AA is being applied, but the sample pattern changes to trick your brain into seeing something that looks better. This won't work if you are moving or your FPS is too low (it needs to be quite high or it will look bad; that's why it reverts to regular AA if you don't meet a certain FPS). Also, I would MUCH rather have 8xS AA: it AA's textures, which means that alpha-test rendered things are AA'ed too! (things like wire mesh and { textures in HL1). MSAA is edges only.
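
A rough, back-of-the-envelope cost model for the comparison above, under the simplifying assumption that the supersampling part of 8xS multiplies per-pixel shading work while multisampling mostly multiplies stored coverage samples; the multipliers are illustrative, not measured.

    #include <cstdio>

    // Simplified, hypothetical cost model: SSAA shades every pixel multiple times,
    // MSAA shades once per pixel but stores/resolves several coverage samples.
    struct AaMode {
        const char* name;
        double shadingMultiplier;  // times each pixel's shader runs
        double sampleMultiplier;   // coverage samples per pixel (memory/bandwidth)
    };

    int main() {
        const AaMode modes[] = {
            {"no AA",                    1.0, 1.0},
            {"6x MSAA (ATi)",            1.0, 6.0},
            {"8xS (~4x MSAA + 2x SSAA)", 2.0, 8.0},  // the 2x supersample part doubles shading
        };
        for (const AaMode& m : modes) {
            std::printf("%-26s shading x%.1f, samples x%.1f\n",
                        m.name, m.shadingMultiplier, m.sampleMultiplier);
        }
    }
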
  • Jim_has_Skillz Join Date: 2003-01-19 Member: 12475, Members, Constellation
    QUOTE (Talesin @ Aug 19 2004, 05:29 PM):
      QUOTE (404NotFound @ Aug 19 2004, 03:22 PM):
        QUOTE (Talesin @ Aug 19 2004, 06:15 PM):
          However, considering that ATi's '8x' can be TFSAAd to run at 16x with NO loss in framerate, nVidia gets another jackboot to the face.

          The benchmarks are real. We've had Doom3, HEAVILY optimized for nVidia cards. ATi kept up at a decent rate, using the default OpenGL path.
          nVidia instead used a customized backend (or one reintegrated into the drivers' backend) that dropped colour calculations to half or one-third of what they SHOULD have been, crappifying the visual quality to synthetically boost fps... all so they can psych what few fanboys they have left into thinking they are actually performing well. So don't even talk about nVidia doing better with OpenGL until you run the tests on 'clean' games, meaning non-optimized ones. ATi cards will whup the crap out of 'em.

          Now we have Source coming out, a game that utilizes a different standard. One that nVidia hasn't had time to code in cheats for yet. They do abysmally... but the nVidia fanboys rush in and call the benchmarks 'unfair', because it shows how they ACTUALLY perform, outside a horrifically stacked and... er... 'optimized' *coughcheateredcough* game.

          Face it. The 6800s are lacking. The only way they can even seem to keep up with the X800 XTPE is when they cheat, so they only do half to one third the work.
        Sources? Examples? You've been saying this forever and have never once shown how an nVidia card looks worse in-game than an ATI card, because I just don't see it.
      Take a look at the screenshots posted by EEK in the 6600 announced thread (http://www.natural-selection.org/forums/index.php?showtopic=77620&st=30). The topmost shows the shadow-banding on the hand, and to a lesser extent, on the walls and zombies.
      Also noticeable is the horrible texture/normal-map artifacting, especially along the top of the AR and in the zombies' clothes. Also note that the nVidia 'answer' posts later in that thread do not feature any shots of the AR or hand in a position where the shadow-banding would be seen blatantly, but some banding is still faintly noticeable around the knuckles of the hand holding the flashlight up. :D

      nVidia does NOT comply with the OpenGL standard, nor does it render the game as it was meant to be. They take shortcuts, doing half to one-third the work of an ATi card, sacrificing showing the game as it's meant to be seen in favor of propping up their flagging framerates, to give what few fanboys they have left some small hope that they might come back from second place.
    WTH are you TALKING ABOUT? Jeez, dude, I don't think anybody sees what you are talking about. I am looking at the screenshots and I have no idea where they are missing half the picture, because I really see all of it. Nor do I see any errors in the picture. Even if there were something wrong with the picture, we don't know where it came from! I would like to see someone recognized in the community as a decent benchmarker run some fair tests on these new cards. Maybe Tom's Hardware??
  • Har_Har_the_Pirate Join Date: 2003-08-10 Member: 19388, Members, Constellation
    edited August 2004
    I do believe that the results are biased as hell, but ATI still probably beats nVidia in HL2, just not by 70%. It's kinda sad that people are such fanboys as to write up a biased review like that (nVidia has fanboy writers too), since some people believe the first thing that they see and say "omg pawnzored".

    I, for one, don't take too much to heart from people who are biased one way or another, and I try to "read between the lines" before saying "omg pawnzored" (no offence, but Talesin is pretty hardcore ATI, for example, although he has a lot of facts supporting his points).
  • Crono5 Join Date: 2003-07-22 Member: 18357, Members
    QUOTE (Jim has Skillz @ Aug 19 2004, 10:44 PM):
      WTH are you TALKING ABOUT? Jeez, dude, I don't think anybody sees what you are talking about. I am looking at the screenshots and I have no idea where they are missing half the picture, because I really see all of it. Nor do I see any errors in the picture. Even if there were something wrong with the picture, we don't know where it came from! I would like to see someone recognized in the community as a decent benchmarker run some fair tests on these new cards. Maybe Tom's Hardware??
    :X
  • TenSix Join Date: 2002-11-09 Member: 7932, Members
    Some comparisons between this screen (ATI) (http://www.unknownworlds.com/forums/uploads//post-10-1092399371.jpg) and this screen (GeForce) (http://www.unknownworlds.com/forums/uploads//post-10-1091515728.jpg). There are other image quality differences, but I ran out of room. The textures have some weird things going on with them in the GeForce screenie, and just generally seem to be less detailed.

    [image: http://helios.acomp.usf.edu/~eboston/d3_geforce.jpg]
  • NumbersNotFound Join Date: 2002-11-07 Member: 7556, Members
    O_o

    Wow, that is pretty bad. Still haven't seen anything like that in game, though. My 5900XT seems to do a pretty good job.
  • Marik_Steele To rule in hell... Join Date: 2002-11-20 Member: 9466, Members
    QUOTE (TenSix @ Aug 19 2004, 11:35 PM):
      Some comparisons between this screen (ATI) (http://www.unknownworlds.com/forums/uploads//post-10-1092399371.jpg) and this screen (GeForce) (http://www.unknownworlds.com/forums/uploads//post-10-1091515728.jpg). There are other image quality differences, but I ran out of room. The textures have some weird things going on with them in the GeForce screenie, and just generally seem to be less detailed.

      [image: http://helios.acomp.usf.edu/~eboston/d3_geforce.jpg]
    As an ATI owner I'm not trying to put this evidence down, but I'd be curious to see a similarly composed comparison shot where the gun isn't being fired in the ATI screenie, lighting up the hand. It may help solidify the claim that nVidia's been slacking on the shadow calculations.
  • Xyth Avatar Join Date: 2003-11-04 Member: 22312, Members
    edited August 2004
    Umm, I don't think that screenshot is fair evidence... Correct me if I'm wrong, but wasn't it taken on a really crappy card (Ti something?) that had some kind of glitch that allowed it to run on ultra detail but looked really strange?

    Guess I should back up my claim; I did a search and found it here: http://www.unknownworlds.com/forums/index.php?showtopic=76580&st=150
    A Ti 4600, not exactly a top-of-the-line card :/
  • Jim_has_Skillz Join Date: 2003-01-19 Member: 12475, Members, Constellation
    QUOTE (Crono5788 @ Aug 19 2004, 08:13 PM):
      QUOTE (Jim has Skillz @ Aug 19 2004, 10:44 PM):
        WTH are you TALKING ABOUT? Jeez, dude, I don't think anybody sees what you are talking about. I am looking at the screenshots and I have no idea where they are missing half the picture, because I really see all of it. Nor do I see any errors in the picture. Even if there were something wrong with the picture, we don't know where it came from! I would like to see someone recognized in the community as a decent benchmarker run some fair tests on these new cards. Maybe Tom's Hardware??
      :X
    Umm, yeah, that is horrible, but I haven't seen ANYTHING LIKE THAT with my GeForce card. So you guys need to stop saying one screenshot should encompass all GeForce cards. For all we know, that card could be bugged or the drivers could be WAY out of date. You don't know, and none of the GeForce card owners here have seen ANYTHING like this.
  • kuperaye Join Date: 2003-03-14 Member: 14519, Members, Constellation
    Jim is wrong. All the time.

    Sorry, but I would agree with the Tale man.
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    Or they have, and just overlook it because they don't want to admit that their cards have to pull this crap to appear competitive. ^_^
  • Wheeee Join Date: 2003-02-18 Member: 13713, Members, Reinforced - Shadow
    QUOTE (Soylent green @ Aug 19 2004, 09:23 PM):
      Am I the only one who thinks 8xS AA and 6x MSAA should not under any circumstances be compared? 8xS is MUCH MUCH harder for the hardware to do; it's something like 4x RGMS + 2x SS. Honestly, this is like running the GeForce at 1600x1200 and the Radeon at 1024x768 and going "see, the ATi pwns nVidia by 67%!!!11"

      QUOTE:
        However, considering that ATi's '8x' can be TFSAAd to run at 16x with NO loss in framerate, nVidia gets another jackboot to the face.

      Except it only works at high framerates, and then only when things are still (or the illusion won't work; it's when you're still that you see aliasing most easily, so this isn't much of a problem). Only 6x AA is being applied, but the sample pattern changes to trick your brain into seeing something that looks better. This won't work if you are moving or your FPS is too low (it needs to be quite high or it will look bad; that's why it reverts to regular AA if you don't meet a certain FPS). Also, I would MUCH rather have 8xS AA: it AA's textures, which means that alpha-test rendered things are AA'ed too! (things like wire mesh and { textures in HL1). MSAA is edges only.
    Yes, and I'll tell you why:

    ATi's 6x implementation looks just as good as (if not better than) the nVidia cards' 8x implementation. That just points to a gross inefficiency in the nVidia algorithm. You should be complaining to nVidia for not optimizing or redoing their AA, rather than saying that nVidia is treated unfairly because they have a bigger sampling pattern set.
  • t20 Join Date: 2004-08-19 Member: 30718, Members
    Talesin, can you find one major reputable review site that finds a significant difference in image quality between the 6800 series and the X800 series? That screenshot looks like it was taken on low quality settings on an old card with old drivers. I just had a quick read through all their reviews, and not one has complaints or sees any difference you'd notice in-game. The worst I've found so far:
    QUOTE (HardOCP):
      In our in-game evaluation, we did not notice any differences in image quality. However, when we started poring over the screenshots, we did notice some differences. If you look on the floor to the right side in the first comparison above, you will notice that the ATI Radeon X800XT-PE has slightly more aliased floor lines than the BFGTech GeForce 6800Ultra OC. Both pictures were taken at 1600x1200 at High Quality, which enables 8X anisotropic filtering and offers no antialiasing. The BFGTech GeForce 6800Ultra is clearly doing a better job filtering the floor tiles. It's important to note though that we did not notice this when we were actually in the game playing it, only afterward when we put these screenshots side-by-side.

    And Wheeee, they both use the same algorithm for AA; nVidia switched to a rotated-grid antialiasing scheme in the 6800 (except for 8xS, which is like ultra AA and is on a par with 16x AA on ATi cards) :)
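
A toy comparison, using invented sub-pixel offsets, of why a rotated-grid sample pattern matters: with the same four samples, a rotated grid exposes more distinct horizontal (and vertical) offsets than an ordered grid, so near-vertical and near-horizontal edges get more intensity steps.

    #include <cstdio>
    #include <set>

    // Toy comparison of 4-sample ordered-grid vs rotated-grid AA patterns.
    // More distinct x offsets means more intensity steps on near-vertical edges.
    int main() {
        // Sample offsets within a pixel, on a hypothetical 8x8 sub-pixel grid.
        const int ordered[4][2] = {{2, 2}, {6, 2}, {2, 6}, {6, 6}};
        const int rotated[4][2] = {{6, 1}, {2, 3}, {7, 5}, {3, 7}};  // rough rotated-grid layout

        auto distinctX = [](const int (&pattern)[4][2]) {
            std::set<int> xs;
            for (const auto& sample : pattern) xs.insert(sample[0]);
            return xs.size();
        };

        std::printf("ordered grid: %zu distinct x offsets\n", distinctX(ordered));  // 2
        std::printf("rotated grid: %zu distinct x offsets\n", distinctX(rotated));  // 4
    }
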
  • Soylent_green Join Date: 2002-12-20 Member: 11220, Members, Reinforced - Shadow
    edited August 2004
    QUOTE:
      Yes, and I'll tell you why:

      ATi's 6x implementation looks just as good as (if not better than) the nVidia cards' 8x implementation. That just points to a gross inefficiency in the nVidia algorithm. You should be complaining to nVidia for not optimizing or redoing their AA, rather than saying that nVidia is treated unfairly because they have a bigger sampling pattern set.

    9800s can do supersampling; can you guess why it isn't enabled in the driver?

    Supersampling is much more inefficient than MSAA, but it AA's textures too. Without it, alpha-test transparency looks awful; this is a step back in image quality for the sake of massive performance gains. I'd love to have it enabled in the ATi drivers so I could use it for older games (like HL), presumably without that bad of a hit, but thanks to people who like to do unfair comparisons between MSAA and SSAA, ATi has chosen not to enable it (it is enabled on 9800s for Mac).

    It would be much better to compare the MSAA modes that both the 6800 and the ATi card have, compare image quality there, and then list the remaining modes (6x MSAA for ATi and 8xS for nVidia) and their performance without directly comparing them, just noting their merits (or lack thereof) and moving on. They don't do the same thing, and the results are much different (for better or worse).
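
A toy illustration of why MSAA leaves alpha-tested cut-outs (wire mesh, HL1 '{' textures) aliased while supersampling smooths them: MSAA runs the shader, and with it the alpha test, once per pixel, so all of that pixel's samples pass or fail together, while supersampling evaluates each sample separately. The alpha function and sample positions are invented for the example.

    #include <cstdio>

    // Hypothetical alpha lookup across one pixel: left half opaque, right half cut out.
    double AlphaAt(double x) { return x < 0.5 ? 1.0 : 0.0; }

    int main() {
        const int samples = 4;

        // MSAA: one shader invocation at the pixel centre decides for every sample.
        const bool centrePasses = AlphaAt(0.5) >= 0.5;
        const double msaaCoverage = centrePasses ? 1.0 : 0.0;

        // SSAA: each sample runs the alpha test at its own position across the pixel.
        double ssaaCoverage = 0.0;
        for (int i = 0; i < samples; ++i) {
            const double x = (i + 0.5) / samples;
            if (AlphaAt(x) >= 0.5) ssaaCoverage += 1.0 / samples;
        }

        std::printf("MSAA coverage on the cut-out edge: %.2f (all or nothing -> hard edge)\n", msaaCoverage);
        std::printf("SSAA coverage on the cut-out edge: %.2f (partial -> blended edge)\n", ssaaCoverage);
    }
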