Nvidia Users Need Not Worry About HL2, Just Yet

DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462 Members, NS1 Playtester
as seen on halflife2.net
Note: This is an almost direct cut & paste from my post (http://www.halflife2.net/forums/showthread.php?s=&postid=142009) over at the halflife2.net forums (http://www.halflife2.net/forums/).

<b>Disclaimer:</b> There seems to be a massive amount of fanboy-ism on both sides of the camp here on these forums. Please, let's try to avoid that in this thread. I'm not out to praise or condemn one or the other. This thread is simply to provide current and potential future NVidia or ATI customers with more information, so they can make a better-informed decision about the video card they purchase next.

As you all know, Valve (who is partnered with ATI) came out today with some questionable benchmarks that show ATI products completely creaming NVidia products. Shortly afterwards NVidia issued a statement; the quote below is an excerpt:
QUOTE (NVIDIA statement): The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.

If you would like to read the whole press release, click here: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/003.htm

Here's what I think:

1) NVidia can't afford for Half-Life 2 to run like crap. They'll get it running great; they'll have to in order to stay in business.
2) After ATI screwed up with id, they were probably very desperate to find another software partner...
(To clarify, I do not mean to imply that Valve would be unfairly biased towards ATI simply because they were partnered with them, I mean merely to state that ATI would be especially anxious to get HL2 performing best on ATI products)
3) Valve would be silly to have over 50% of their customer base (last time I checked NVidia still had the majority of the market) run their game like crap...
4) As much as you may want to believe the contrary, Half-Life 2 is not the only game coming out. If you plan on playing an array of games, I suggest you do more research (including waiting until said games have shipped and you can get actual benchmarks rather than estimates) and get the best all-around card, rather than one that is tailored to a specific game. (To those who like to take things the negative way, this doesn't just mean "ATI for Half-Life 2" but also means "NVidia for DOOM 3".)

5) The vast majority of people (read: ones not posting on or reading these forums) aren't going to upgrade their system to play Half-Life 2 if what they have is within reason (say, 1.8GHz with a GeForce 3). So don't let the benchmarks scare you completely if you have an older system: benchmarks by nature are meant to stress hardware to its limits, not give an approximation of performance during actual gameplay.

I think Anandtech said it better than I can:
QUOTE (Anandtech): It's almost ironic that the one industry we deal with that is directly related to entertainment, has been the least exciting for the longest time. The graphics world has been littered with controversies surrounding very fickle things as of late; the majority of articles you'll see relating to graphics these days don't have anything to do with how fast the latest $500 card will run, instead we're left to argue about the definition of the word "cheating", we pick at pixels with hopes of differentiating two of the fiercest competitors the GPU world has ever seen, and we debate over 3DMark.

What's interesting is that all of the things we have occupied ourselves with in recent times, have been present throughout history. Graphics companies have always had questionable optimizations in their drivers, they have almost always differed in how they render a scene and yes, 3DMark has been around for quite some time now (only recently has it become "cool" to take issue with it).

As for myself, I'm going to keep my current video card (it's about a year old) until after Half-Life 2 and a few other games I plan on playing have all shipped, and then I'm going to check with independent third-party sites (HardOCP, Anandtech, etc.) to find out which one I should choose.

<b>Bonus Note:</b> At QuakeCon there were representatives from HardOCP, NVidia, and ATI who casually mentioned that some big, big video card price cuts were coming around November 15th. I would recommend waiting until that date before making <i>any</i> video card purchase, unless you've got extra money to blow or are on an unbearably crappy card already.

<i>Again, before you post I'd like to remind you that this is supposed to be a flameless, fanboy-less thread. If you can't post without saying "X sucks, Y rocks" then please do not post.</i>

Comments

  • Psycho-Kinetic_Hyper-Geek Join Date: 2002-11-18 Member: 9243 Banned, Constellation
    X freaking rocks man, I totally dig on it cause its almost always used as the variable in any equation. X RULES!
  • Nil_IQ Join Date: 2003-04-15 Member: 15520 Members
    edited September 2003
    Well frankly X and Y both suck, B is the king. :D

    I'm buying a graphics card. TODAY. Now you have made me all confus0red. It will be an ATI card either way, and nothing you can say will stop me, MWAHAAHAAHAA!

    If it doesn't run Half-Life I will be very sad, but I can't see why it wouldn't. Well, as (the now-banned, I believe) Evil the cat put it, "it's not going to only render beautiful 3D environments and then see a Half-Life polygon and say 'OMG! F**K THIS!'".

    And for those who said ATI have driver issues, yeah, in the 90's they did. Get with the times...

    Wow. Referring to the 90's in the past tense makes me feel old :D
  • RyoOhki Join Date: 2003-01-26 Member: 12789 Members
    Well I don't know much about all this kinda stuff, but I have a simple question:

    My system specs are an Athlon XP 1800, 512 meg of DDR ram and a Winfast Geforce 4 Ti4200 vid card. Will HL2 run fine on my system?
  • CommunistWithAGun Local Propaganda Guy Join Date: 2003-04-30 Member: 15953 Members
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462 Members, NS1 Playtester
    *bump*

    trying to bump this one to the top so all the other "less unbiased" threads aren't all people see...
  • Renegade Old school Join Date: 2002-03-29 Member: 361 Members
    QUOTE (Nil_IQ @ Sep 12 2003, 01:03 AM): will be an ATI card either way, and nothing you can say will stop me, MWAHAAHAAHAA!
    I don't see what the obsession is with Video Card loyalty. If a new Nvidia card comes out that is faster than the latest ATI card on the market, I'm gonna get it. Likewise, if a new ATI card comes out that is faster than the fastest Nvidia card on the market, I'm gonna get it. Cards should be rated by specs, not the company that creates them.
  • p4Samwise Join Date: 2002-12-15 Member: 10831 Members
    QUOTE (Ryo-Ohki @ Sep 12 2003, 01:34 AM): Well I don't know much about all this kinda stuff, but I have a simple question:

    My system specs are an Athlon XP 1800, 512 meg of DDR ram and a Winfast Geforce 4 Ti4200 vid card. Will HL2 run fine on my system?
    YOU WILL DIE!!!!!!

    *starts beating Ryo with a halibut*
  • Wheeee Join Date: 2003-02-18 Member: 13713 Members, Reinforced - Shadow
    QUOTE (Renegade @ Sep 12 2003, 11:54 AM): QUOTE (Nil_IQ @ Sep 12 2003, 01:03 AM): will be an ATI card either way, and nothing you can say will stop me, MWAHAAHAAHAA!
    I don't see what the obsession is with Video Card loyalty. If a new Nvidia card comes out that is faster than the latest ATI card on the market, I'm gonna get it. Likewise, if a new ATI card comes out that is faster than the fastest Nvidia card on the market, I'm gonna get it. Cards should be rated by specs, not the company that creates them.
    Well, most of us can't afford to go out and buy every new video card that comes out, so we have to argue about which one is the best deal... and I have no idea why people are defending nVidia recently. ATi has come out with better cards; accept it and move on. Maybe the next batch of card releases will put nVidia back on top, but for now they are pretty much outclassed in the middle-high price range, so stop bickering.
  • MonsieurEvil Join Date: 2002-01-22 Member: 4 Members, Retired Developer, NS1 Playtester, Contributor
    Before you debate much more, read this article: http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/hl2_followup/001.htm

    It is pretty well written, and even has screenshots from the benchmarks showing the quality differences as the cards run DirectX8, 8.1, and 9. It also points out some other interesting things from a technology and historical perspective with the card manufacturers themselves.
  • CommunistWithAGun Local Propaganda Guy Join Date: 2003-04-30 Member: 15953 Members
    I uh...don't notice a difference...and I'm serious...is that odd?
  • Zel Join Date: 2003-01-27 Member: 12861 Members
    my opinion:
    ATI and Nvidia top-end cards are, in all practicality, the same. One uses a faster chip and the other uses optimized drivers, but they equate to the same goodness.

    Nvidia had always used optimized drivers, but one day someone saw a shortcut in the 3DMark bit of the driver and all hell broke loose. Suddenly there were massive flamewars about how Nvidia cheats and ATI puts out better cards.

    Okay, whatever, but in a real game, playing in a real-life situation, the cards will perform alike. I'm gonna stick with the brand I have used before because it doesn't really matter.

    The 50.xx driver set will make Nvidia shine on HL2, while some fanboy somewhere will pick it apart and see that it's only because it crops antialiasing on objects out of the user's focus area or off screen or some **** like that, where it won't affect the user but just purely fuel the forumites' flames.
  • moultano Creator of ns_shiva. Join Date: 2002-12-14 Member: 10806 Members, NS1 Playtester, Contributor, Constellation, NS2 Playtester, Squad Five Blue, Reinforced - Shadow, WC 2013 - Gold, NS2 Community Developer, Pistachionauts
    QUOTE (CommunistWithAGun @ Sep 12 2003, 01:50 PM): I uh...don't notice a difference...and I'm serious...is that odd?
    Look at the water reflections. But yeah, it's negligible.
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462 Members, NS1 Playtester
    QUOTE (CommunistWithAGun @ Sep 12 2003, 12:50 PM): I uh...don't notice a difference...and I'm serious...is that odd?
    Nope, I can't tell a difference either, though I'm told a couple pixels in the water are different...
  • MonsieurEvil Join Date: 2002-01-22 Member: 4 Members, Retired Developer, NS1 Playtester, Contributor
    You can't tell the difference between 77 FPS and 14 FPS? The difference is, that screenshot is exactly how your game will look on an FX 5200 card with current drivers - no movement at all. :P

    As I have said previously, I very much doubt Nvidia users have too much to worry about with HL2. They are still firmly entrenched as the majority card owners. However, once upon a time I was firmly in the majority as a 3DfX Voodoo2 owner... and where are they now, after once having 98% 3D card market share? If a few more top games like Doom3 and DeusEx2 and such come out and clearly run better on ATI, Nvidia will have some hard decisions to make with their engineering doctrine and future...
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462 Members, NS1 Playtester
    I was talking about picture quality between the pics in the article :P
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710 NS1 Playtester, Forum Moderators
    QUOTE (Zel @ Sep 12 2003, 11:20 AM): my opinion:
    ATI and Nvidia top-end cards are, in all practicality, the same. One uses a faster chip and the other uses optimized drivers, but they equate to the same goodness.

    Nvidia had always used optimized drivers, but one day someone saw a shortcut in the 3DMark bit of the driver and all hell broke loose. Suddenly there were massive flamewars about how Nvidia cheats and ATI puts out better cards.

    Okay, whatever, but in a real game, playing in a real-life situation, the cards will perform alike. I'm gonna stick with the brand I have used before because it doesn't really matter.

    The 50.xx driver set will make Nvidia shine on HL2, while some fanboy somewhere will pick it apart and see that it's only because it crops antialiasing on objects out of the user's focus area or off screen or some **** like that, where it won't affect the user but just purely fuel the forumites' flames.
    Actually, the cheating thing was that the drivers included manually-entered clip planes to tell the card what to, and what not to, render. If they were dynamically generated, it'd be a fair optimization. But they weren't... they were put in specifically, by hand, to speed up that benchmark and give the general populace the feeling that nVidia can still sell the same old cards with new sectional pipelines slapped in.
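    (Purely as a hypothetical illustration of that distinction, not anything from any actual driver: the question is whether the cull plane is typed in by hand for one benchmark's fixed camera path, or derived from whatever view the application submits each frame. A rough C++ sketch:)

        // Hypothetical sketch only; real driver internals are not public.
        struct Vec3  { float x, y, z; };
        struct Plane { float a, b, c, d; };   // plane: ax + by + cz + d = 0

        float side(const Plane& p, const Vec3& v) {
            return p.a * v.x + p.b * v.y + p.c * v.z + p.d;
        }

        // "Cheat" style: a plane entered by hand because the benchmark camera is
        // known never to look past it. Breaks as soon as the camera leaves its rails.
        const Plane kHandPlacedClip = { 0.0f, 0.0f, -1.0f, 500.0f };   // made-up numbers

        // Fair-optimization style: derive the plane from the view the application
        // actually submits this frame, so it stays valid for any camera.
        Plane farPlaneFromView(const Vec3& camPos, const Vec3& camFwd /*unit length*/, float farDist) {
            Vec3 n  = { -camFwd.x, -camFwd.y, -camFwd.z };
            Vec3 p0 = { camPos.x + camFwd.x * farDist,
                        camPos.y + camFwd.y * farDist,
                        camPos.z + camFwd.z * farDist };
            return { n.x, n.y, n.z, -(n.x * p0.x + n.y * p0.y + n.z * p0.z) };
        }

        bool shouldDraw(const Plane& clip, const Vec3& objectCenter) {
            return side(clip, objectCenter) >= 0.0f;   // skip anything behind the plane
        }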

    There's a limit to optimized drivers, honestly. When you have to code in a program specifically to make a given brand happy... that's it. Unless you're directly accessing the card itself, essentially building your own drivers into the application, you should be able to hand it a standardized interface (say... OpenGL) and have it run. If more needs to be done than that, it's a fault of the drivers.

    (Oh, and you can't antialias things off-screen... AA is accomplished by rendering the image much larger, and then sampling it down. A post-render function. And again... if it does it for everything, automatically, then it's quite fair. The cheating only comes in when you intentionally create synthetic numbers.)
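    (And roughly what "render much larger, then sample it down" means in practice, as a minimal, hypothetical sketch of supersampled AA on a plain pixel buffer. Real hardware does this very differently, but the idea is the same.)

        #include <cstdint>
        #include <vector>

        struct Pixel { std::uint8_t r, g, b; };

        // The scene was rendered at (factor x factor) times the target resolution;
        // each output pixel is the average of a factor-by-factor block of samples,
        // which is what smooths jagged edges. Applied uniformly to the whole frame.
        std::vector<Pixel> downsample(const std::vector<Pixel>& big,
                                      int outW, int outH, int factor) {
            std::vector<Pixel> out(outW * outH);
            const int bigW = outW * factor;
            for (int y = 0; y < outH; ++y) {
                for (int x = 0; x < outW; ++x) {
                    unsigned r = 0, g = 0, b = 0;
                    for (int sy = 0; sy < factor; ++sy)
                        for (int sx = 0; sx < factor; ++sx) {
                            const Pixel& p = big[(y * factor + sy) * bigW + (x * factor + sx)];
                            r += p.r; g += p.g; b += p.b;
                        }
                    const unsigned n = factor * factor;
                    out[y * outW + x] = { std::uint8_t(r / n), std::uint8_t(g / n), std::uint8_t(b / n) };
                }
            }
            return out;
        }

    (So 4x supersampling is factor = 2: four rendered samples averaged into each final pixel. There are simply no samples for off-screen pixels in the first place, which is the point above.)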



    Back on-topic, it's pretty easy to see the difference between the three, at least on my LCD monitor. In the first one, the water appears to be more of a slush-mud; you can't see through it. In DX8.1 you can see through it a very small amount, and what you see appears to be mostly surface reflection. In the DX9 one, the water is almost clear and has full refractions.
    I can't help but wonder how they're going to handle the enemies that rely on visual distortion to give away their position, with anything but a DX9 part that can HANDLE running in DX9 mode. It'll either be incredibly easy to spot them (think v1.0 L1/2 cloaking), or so difficult as to make them nearly unspottable.
  • Venmoch Join Date: 2002-08-07 Member: 1093 Members
    TBH I must be blind or something, but today I played SOF2 on a GeForce FX (not my own, BTW) and it looked alright, but not much better than my GeForce 2 MX. But then I never cared about graphics. Sure, it's nice if you have graphical flashiness, but who cares if a game looks like real life if it sucks?

    Practical Example #1: UT2003

    Yes, the graphics are nice, but there is wafer-thin gameplay that is fun for a while but soon gets grating and horrid. (It's also nothing like UT, which isn't a good thing.) Epic concentrated on the graphics rather than making a good killer app. Of course I sense people about to post "But HL2 has good graphics and is supposedly really good!" But we haven't played it yet (unless of course you're an illegal asshat, but that's not my area).

    So I wish everyone would get their heads out of their proverbial graphics card crap and at least stop the "OMG nVIDIA SUK0RS", because it's crap like that that makes me want to become a console gamer again...
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710 NS1 Playtester, Forum Moderators
    edited September 2003
    Pardon? Last I knew, no one had said it quite that basically. Nor do nVidia cards 'SUK0R'... the only thing we're debating at this point is which is the fastest, and which company feels the need to cheat to wedge it into the average consumer's mind that they can still consistently come out on top in the mid-high end arena.

    If I was making a mid-low end machine, I'd go with a GF4, most likely, over a Radeon 9200/9000 Pro. While the 9000 might give a good showing, due to the fact that it's just a cut-down R8500, it has some oddities that an average user would get frustrated with... like occasionally forgetting to turn back on the VGA port, but leaving the TV-Out running just fine after certain resolution changes.

    Mid level though, I have to go with ATI on the 9500/9600 Pro. They're the best bang-for-buck out there right now, with the cancellation of the K3.

    High end (ruling out Matrox cards) would be an R9800 Pro, or Pro2. I've got some concerns about the DDR2 RAM on the Pro2 though, which could theoretically make it run slower than the standard Pro in some instances.



    All in all, we can wait for the video card prices to drop as both companies unveil their new products... from hearsay, I can stipulate around Nov. 15th. Now if only I had money to blow. :P
  • Hand_Me_The_Gun_And_Ask_Me_Again Join Date: 2002-02-07 Member: 178 Members
    After looking at the screenshots, I'm even more confused.

    Yes, they're definitely slightly different, but not <i>hugely</i> so. I have a feeling it'll be easier to tell the different DirectX thingies apart when stuff's actually moving - like how Doom 3 looks pretty dull in screenshots but awesome in motion. If the apparent similarity shows anything, it's that Valve haven't ignored people with older hardware - it's not a 'this game will look cack unless you have such and such a card'...

    I definitely have to get some kind of new graphics card, though, as my old Matrox G400 is really showing its age. Bah. I think I'll get the cheapest true GeForce 4 that has dual-head, and appreciate the better Linux drivers...