Driverheaven Source Benchmark

spinviper Join Date: 2003-05-08 Member: 16151, Members
ATI Radeon X800 XT PE vs GeForce 6800 Ultra
LINK: http://www.driverheaven.net/showthread.php?p=423812#post423812

Woohoo!

Comments

  • username Join Date: 2004-06-22 Member: 29473, Members
    edited August 2004
    haha! radeon ftw!

    geez, look at that!

    [image: http://www.driverheaven.net/V3/16_12_max.JPG]

    Also, make sure you scroll down to the in-game pics; the Radeon looks noticeably better. :)
  • CommunistWithAGun Local Propaganda Guy Join Date: 2003-04-30 Member: 15953, Members
  • spinviper Join Date: 2003-05-08 Member: 16151, Members
    ?? ::hive::
  • RaVe Join Date: 2003-06-20 Member: 17538, Members
    Ouch. Talk about being pwnt on a major scale o_O
  • DY357LX Playing since day 1. Still can't Comm. England Join Date: 2002-10-27 Member: 1651, Members, Constellation
    Which graphics card did Doomy buy last week?
  • CommunistWithAGunCommunistWithAGun Local Propaganda Guy Join Date: 2003-04-30 Member: 15953Members
    The one with the little bar, but the non-Ultra version.
  • Aldaris Join Date: 2002-03-25 Member: 351, Members, Constellation
    edited August 2004
    Not in the slightest bit surprised. Of course the Radeons were going to run faster than the GeForces on Source, just as the GeForces ran faster in Doom 3.
  • Tequila Join Date: 2003-08-13 Member: 19660, Members
    Agreed, this is no shock.
  • Birdy Join Date: 2003-05-29 Member: 16825, Members, Constellation
    AFK, ordering an ATI :P
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    Except the ATIs didn't need to cut every corner possible to get the framerate that high. :)

    Without the nVidia 'optimizations', the benchmark scores would look pretty similar for Doom3. Heck, look at the early comparatives. Then look at the horrible shadowbanding and normal/texture corruptions on the nVidias.

    The 6800 just got served, before they could crappify the rendering to push their framerates up. :D
  • CommunistWithAGun Local Propaganda Guy Join Date: 2003-04-30 Member: 15953, Members
    If I am disappointed with the Radeon 9800 Pro I ordered, you shall pay.
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    Oop. I am mistaken. That's WITH the nVidia optimizations enabled. :D
  • Snidely Join Date: 2003-02-04 Member: 13098, Members
    Maybe I ought to give my little sister my GeForce 4800 Ti and upgrade to an ATI before I get Doom 3...then again, that would involve spending money. I always feel guilty spending money in large quantities.
  • Rue Join Date: 2002-10-21 Member: 1564, Members
    I want to know how the 9800 Pros are doing :)

    I did read the other unofficial benchmarks, but they only use Counter-Strike as a benchmark... :P
  • TenSix Join Date: 2002-11-09 Member: 7932, Members
    ATI cards seem to run Doom 3 just as well as GeForce cards. Not as well as the 6800s, but not as badly as the 6800s do with HL2. Dear God, that's just abysmal for such a high-end card.
  • esuna Rock Bottom Join Date: 2003-04-03 Member: 15175, Members, Constellation
    I'm so freakin' happy I have a Radeon.

    /me hugs his Radeon
  • Sirus Join Date: 2002-11-13 Member: 8466, Members, NS1 Playtester, Constellation
    Hrm, I'll have the money for a new videocard in the future. I think I know which one now...
  • Travis_Dane Join Date: 2003-04-06 Member: 15249, Members
    Then I suppose I'm the only one seriously doubting these benchmark results ;).

    I did expect the Radeon series to have somewhat of a lead over the GeForce series, but 60%? That's just plain preposterous. Previous benchmarks across a wide array of games have shown that the GF6 and X800 usually match each other in performance.
    Differences are seen, but nothing too big (Doom 3 probably being the worst case).

    When differences this big appear, I don't think you can keep blaming the graphics cards. The GF6 clearly demonstrated its power in Doom 3 and various other games, where it easily matched an X800 (GT vs. Pro in particular). I think I'm going to go out on a limb here and blame this on poor development of the Source engine.

    I really do think people are starting to lose the bigger picture with games lately. Bad performance? Blame it on the hardware! Loaded with graphical glitches? Blame it on the drivers! It just so happens that people can actually make <u>BAD</u> engines. Now, I don't dare call the Source engine crappy, but I do have serious doubts regarding its development, Valve's sponsorship by ATi in particular.

    I don't think it's unlikely that Valve has seriously optimized their goodies for the ATi franchise. Word has it that Valve put a lot of effort into optimizing for the GeForce FX; well, that's mighty generous of them, but unfortunately people are looking at the GF6 now, not the previous generation of (badly designed) graphics cards.

    Now, in the end, I certainly wouldn't buy an ATi (if I didn't already own a GF6) based on its apparent (and maybe <u>temporary</u>) good performance with HL2. In fact, if I were going to base the decision on an engine, the Source engine would be my last decisive factor, mostly because we haven't seen <u>anything</u> fancy from Valve. A Quake 1/2 hybrid is the only thing we can go by to assume Valve will do a good job on its engines; trustworthy? Not really. Rather, I'll be looking at companies that have proven themselves at delivering great engines. id, anyone? They use OpenGL, though, so it's hard to make good comparisons, but you get the general idea.

    PS: This post might just be loaded with plain nonsense and faulty logic, as I can't be arsed to make a lot of effort at the moment :D.
  • Charge Join Date: 2003-02-05 Member: 13144, Members
  • coil Amateur pirate. Professional monkey. All pance. Join Date: 2002-04-12 Member: 424, Members, NS1 Playtester, Contributor
    Travis, did you look at the screenshots? If those aren't beautiful, I really don't know what is. I agree that Valve has never been a terribly visionary level-design firm, but the tools they've given us have AMAZING potential. Frankly, what I love about Source as opposed to DOOM 3 is its subtlety. This pic (http://www.darkbeforedawn.net/external/css1.jpg) that Uncritical posted in the CS:Source thread is a perfect example: the tiles look real. The lighting looks real. DOOM is scary and claustrophobic and cool, but with Source you could make and show me a room that was convincing enough to be real.

    Regarding the performance, it should definitely be noted that both cards are set to their *respective* maximum settings. For the nVidia card, that's 8x AA; the ATI card only goes up to 6x. So the nVidia card is trying to do more than the ATI, which accounts for some of the framerate hit.

    However, I'm not surprised by these results. Everything I've read on the subject, both from Valve and from independent sites, says that Source is, if nothing else, a DirectX-centric engine. DirectX was created as, and fundamentally is, a standard. It's like standardizing electrical outlets or HTML (the W3C standard) or anything else: the idea is to make something that is easy to deal with, theoretically on both the software and the hardware side of things.

    The problem, as I have been led to understand, is that NVidia likes to cut corners with its cards, and DirectX isn't really designed with corner-cutting in mind. The FX series is a classic example; they were so flawed (as someone said in the CS:Source thread, only capable of supporting Shader 1.4 instead of the DX9 standard Shader 2.0) that the Source engine actually defaults them to DX8.1 instead of DX9. It appears that the 6x series is cutting similar corners.
    _____

    I don't think that this is a death knell for NVidia owners. People were dismayed by ATI's performance on DOOM3, and within weeks a patch for the Catalyst drivers was available that increased FPS in DOOM by as much as 30%. NVidia has always had great drivers, so it's likely that they will be able to update their Detonators to address the problem.

    I did hear that Valve actually spent more time working with the FX series than they did with ATI's cards... that says to me that the FXs *needed* more help, while the ATI cards were more flexible. Form your own conclusions.
    _____

    Last comment, regarding the screenshots posted further down the thread. The two water shots weren't similar enough to easily compare them, but the shots of the elevator mine shaft were. Frankly, the only real difference I could see in visual quality was that the NVidia card seemed to do a worse job with the rock texture at the very *bottom* of the tunnel - it wasn't as clean as the ATI's work. But motion is the real kicker, and it seems the NVidia suffered pretty hard there.

    We'll see what happens... but frankly, I know what the next card I buy will be.
  • Travis_Dane Join Date: 2003-04-06 Member: 15249, Members
    edited August 2004
    QUOTE (coil @ Aug 19 2004, 04:59 PM): (coil's post, quoted in full above)
    Nowhere have I claimed that the Source engine doesn't deliver high-quality graphics; my doubts are about its performance.

    I know the GF6 was tested with 8x FSAA and the X800 with 6x. I'm not entirely sure how much of a performance impact that would have had, but I doubt it decreased performance by as much as 60%.

    QUOTE (coil): It appears that the 6x series is cutting similar corners.
    Please elaborate.

    QUOTE (coil): I did hear that Valve actually spent more time working with the FX series than they did with ATI's cards... that says to me that the FXs *needed* more help, while the ATI cards were more flexible. Form your own conclusions.
    One conclusion you could draw is that Valve actually based their engine around the Radeon series (the 9800 Pro in particular), and as you know, the X800 has a very similar architecture. So the fact that they optimized for the GeForce FX shouldn't really be something to be thankful for, as they ignored it earlier in the development stage (mind you, this is all speculation).

    QUOTE (coil): Last comment, regarding the screenshots posted further down the thread... (quoted in full above)
    I've seen people struggling to get any texture to display decently on a GeForce with CS:Source, so that could possibly be blamed on driver issues. IQ differences have always been a blown-up issue. Far Cry had IQ issues on the GeForce series, and those were resolved without the slightest FPS drop...

    PS: About the screenshot you posted at the beginning, the lighting on the floor shines even when you're behind the crate...
  • NumbersNotFound Join Date: 2002-11-07 Member: 7556, Members
    QUOTE (Talesin @ Aug 19 2004, 08:50 AM): Except the ATIs didn't need to cut every corner possible to get the framerate that high. :) ... The 6800 just got served, before they could crappify the rendering to push their framerates up. :D
    I still don't see this image-quality thing you're talking about... the two screenies look exactly the same to me.
  • coil Amateur pirate. Professional monkey. All pance. Join Date: 2002-04-12 Member: 424, Members, NS1 Playtester, Contributor
    404: as I said, the only difference I could find was in the screenshot in the link, of the elevator shaft. If you look at the rocks at the base of the shaft, the ATI-rendered shot looks a little (IMO) smoother. Between those shots, however, that was the only solid difference I could find.
  • NumbersNotFound Join Date: 2002-11-07 Member: 7556, Members
    QUOTE (coil @ Aug 19 2004, 11:28 AM): 404: as I said, the only difference I could find was in the screenshot in the link... (quoted in full above)
    Yeah, they look pretty similar to me.



    Also, anyone got any lower-res benchmarks? My monitor doesn't even support 1600x1200 lol
  • t20 Join Date: 2004-08-19 Member: 30718, Members
    QUOTE (Travis Dane @ Aug 19 2004, 10:16 AM): Then I suppose I'm the only one seriously doubting these benchmark results ;).
    You are right to be suspicious, Travis :)

    The benchmark is quite simply BS. This one seems about right, showing much more reasonable results: http://www.vr-zone.com/?i=1181&s=1

    Also have a look at http://www.hardforum.com/showthread.php?t=796256&page=1&pp=20. Seems the gap is approx 20% with the 61.77 drivers and 10% with the beta 65.62s.
  • NumbersNotFound Join Date: 2002-11-07 Member: 7556, Members
    edited August 2004
    I knew that last benchmark smelled fishy: QUOTE: "Driverheaven biased BS strikes again. They benched nVidia's 8xS supersampling against ATI's lesser multisampling and used old drivers. Worthless benchmarks."

    I remember reading benchmarks of the 6800 (can't find them now) that showed 8xS results in a HUGE speed drop. We're talking about a third the speed of the normal 8x AA technique.

    found the pic:

    [image: http://graphics.tomshardware.com/graphic/20040414/images/image064.gif]
  • Zel Join Date: 2003-01-27 Member: 12861, Members
    X800 XT PE: $550 - $600
    6800 GT/Ultra: $400 - $550

    That's a 37% to 9% price difference, which should merit a large performance increase. Maybe if they'd benched cards of equal price against one another, it would be fair.

    And I don't care if those are the best offerings from both companies, because nobody buys based on the most powerful card in the world, or they'd be benching two nVidia cards in SLI on a dual Opteron against that same Radeon X800.
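    Zel's percentages line up with the prices quoted above; as a quick sanity check of the arithmetic (the dollar figures are the thread's quoted ranges, not verified prices):

    ```python
    # Price premium of the X800 XT PE over the 6800, using the price ranges
    # quoted in the post above ($550-600 vs. $400-550). Illustrative only.
    def premium_pct(ati_price: float, nvidia_price: float) -> float:
        """Return ATI's price premium as a percentage of the nVidia price."""
        return (ati_price - nvidia_price) / nvidia_price * 100

    # Low end of each range: $550 X800 vs. $400 6800.
    print(round(premium_pct(550, 400), 1))  # 37.5
    # High end of each range: $600 X800 vs. $550 6800.
    print(round(premium_pct(600, 550), 1))  # 9.1
    ```

    So the "37% to 9%" spread holds at both ends of the quoted ranges.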
  • Align Remain Calm Join Date: 2002-11-02 Member: 5216, Forum Moderators, Constellation
    I saw this earlier and thought "WTH, that can't be right; I get ~20 fps at max settings with a freaking GeForce 3, the newest card can't possibly get the same fps", and then realized I was on 1024x960 while the test was at 1600x1200. Still, it doesn't seem like it would make THAT big a difference, upping the res two steps...
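    "Two steps" is bigger than it sounds: comparing raw per-frame pixel counts (and assuming, as a rough rule, that fill-rate-bound framerates scale inversely with pixels drawn) nearly doubles the workload:

    ```python
    # Compare the per-frame pixel counts of the two resolutions mentioned above.
    # Fill-rate-limited FPS scales roughly inversely with pixels per frame.
    def pixel_count(width: int, height: int) -> int:
        return width * height

    ratio = pixel_count(1600, 1200) / pixel_count(1024, 960)
    print(f"1600x1200 draws {ratio:.2f}x the pixels of 1024x960")  # 1.95x
    ```

    So, all else being equal, ~20 fps at 1024x960 would land around 10 fps at 1600x1200.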
  • TommyVercetti Join Date: 2003-02-10 Member: 13390, Members, Constellation, Reinforced - Shadow
    edited August 2004
    Um, I found somewhere I can get an X800 XT PE for $500, but it's not coming out until September 20th. Also, I'm glad I'm getting an ATi card...

    Second one from the top: http://www.compusa.com/products/products.asp?N=0&Ntt=radeon&Ntk=All&Nty=1&D=radeon
  • Quaunaut The longest seven days in history... Join Date: 2003-03-21 Member: 14759, Members, Constellation, Reinforced - Shadow
    The only thing I'm wondering about with both engines is why the hell they don't use detail textures. Their textures look like crap up close; UT2k4's don't. Maybe they should implement them?