Bad News For Nvidia Users

ViPr Resident naysayer Join Date: 2002-10-17 Member: 1515, Members
no HL2 for you

Oh my god, did you hear the news on PlanetHalfLife? Valve gave a presentation saying, basically, that Nvidia graphics cards are not working with HL2. Oh well, too bad, that sucks. Good thing I waited, so now I know to buy an ATI.

Comments

  • JohnnySmash Join Date: 2003-08-04 Member: 18870, Members
    Fudge. I have an NVidia.

    -JohnnySmash
  • Haunted Join Date: 2003-03-01 Member: 14178, Members
    Heh, this report was the final nail in the coffin. I know what card I'm getting when I finally get a new machine.
  • Tyrain Join Date: 2003-01-03 Member: 11746, Members
    edited September 2003
    That's just not true. ATI users will have a performance advantage, but it will work. Heck, Valve would be cutting its own throat - 30% of all PC users have an Nvidia card.
  • Frogg2 Join Date: 2002-11-02 Member: 4867, Members, Constellation
    QUOTE:
    In his speech, Gabe outlined several key points of both personal and professional frustration:

    Valve takes serious issue with "optimizations" from NVIDIA as of late.

    In no uncertain terms, Valve is highly disappointed with current NV3x hardware as a high-performance DX9 accelerator.

    Valve will have Half-Life 2 treat NV3x hardware as DX8.1 hardware for the default settings.

    According to Gabe, the rumors and speculation of ATI paying them off are nothing but bull - he said Valve's top priority is simply wanting HL2 players to have the best experience. After doing some early benchmarking between NVIDIA and ATI, the choice of who to partner with was clear from Valve's standpoint.

    Microsoft's DirectX team was on hand to give its full blessing to Valve's upcoming HL2 benchmark - in fact, it's being referred to as the most complete DX9 benchmark to date.
  • interiot Join Date: 2003-01-22 Member: 12586, Members
    edited September 2003
    OK, that's a bit of an extreme reading.

    From here (dated yesterday): http://www.gamersdepot.com/interviews/gabe/002.htm

    QUOTE:
    GD: What's your relationship with NVIDIA been like in light of all the recent ATI press over HL2?

    Gabe: Valve and NVIDIA both know that we have a lot of shared customers, and we've invested a lot more time optimizing that rendering path to ensure the best experience for the most customers.

    It sounds like ATI cards are clearly better from a programmer's standpoint, but they're not going to leave NVidia customers out in the cold - far from it.
  • bert Join Date: 2003-02-11 Member: 13433, Members
    QUOTE (Frogg2 @ Sep 11 2003, 08:20 PM):
    According to Gabe, the rumors and speculation of ATI paying them off are nothing but bull - he said Valve's top priority is simply wanting HL2 players to have the best experience. After doing some early benchmarking between NVIDIA and ATI, the choice of who to partner with was clear from Valve's standpoint.

    LOL, if he really thought only of Half-Life 2 players, he wouldn't make it a monthly cost to play online! :(

    rawr!

    bert!
  • Tyrain Join Date: 2003-01-03 Member: 11746, Members
    QUOTE:
    We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.
  • Marik_Steele To rule in hell... Join Date: 2002-11-20 Member: 9466, Members
    *Phased* to Off-Topic.
  • Themanwithnoname Join Date: 2003-08-26 Member: 20233, Members
    QUOTE (bert @ Sep 11 2003, 08:23 PM):
    LOL, if he really thought only of Half-Life 2 players, he wouldn't make it a monthly cost to play online! :(

    rawr!

    bert!

    Half-Life 2 is NOT pay-to-play, damn it!
  • ssjyoda Join Date: 2002-03-05 Member: 274, Members, Squad Five Blue
    It will still play, but not as well. Vice versa with Doom 3: Doom 3 will run better on Nvidia and worse on ATI. It's just the way the card is structured and the way the game engine is coded. It's not done on purpose.
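    As a minimal illustration of that point - hypothetical C++ with invented names, not either engine's actual code - a renderer picks whichever code path the detected hardware runs best, so the "winner" of a benchmark depends on which path a vendor's silicon favors:

        #include <iostream>

        // Hypothetical capability report for the installed GPU.
        struct GpuCaps {
            bool supportsPs20;   // exposes DirectX 9 pixel shaders (PS 2.0)
            bool fastPs20;       // ...and executes them at full speed
        };

        enum class RenderPath { Dx81, Dx9 };

        // Prefer the full DX9 path only where PS 2.0 is present AND fast;
        // otherwise fall back to the DX8.1 path with fewer effects.
        RenderPath choosePath(const GpuCaps& caps) {
            if (caps.supportsPs20 && caps.fastPs20)
                return RenderPath::Dx9;    // e.g. Radeon 9x00-class hardware
            return RenderPath::Dx81;       // e.g. the reported GeForce FX default
        }

        int main() {
            GpuCaps fx{true, false};       // supports PS 2.0, but slowly
            std::cout << (choosePath(fx) == RenderPath::Dx9 ? "DX9\n" : "DX8.1\n");
        }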
  • ViPr Resident naysayer Join Date: 2002-10-17 Member: 1515, Members
    My friend is gonna be so ****. He just got a new computer with a GeForce FX to get ready for HL2, and now this report says he's gonna get like 14 frames per second. Who can play a game at that frame rate?
  • Quaunaut The longest seven days in history... Join Date: 2003-03-21 Member: 14759, Members, Constellation, Reinforced - Shadow
    SSJYoda speaks once again, and whenever he does, wisdom follows. Now, since not one of you pea-brains (sorry to anyone who actually read anything) has actually followed HL2, despite the fact that you keep screaming "OMG HL2 WILL BE TEH H4X!" and such, here I am to straighten you ALL out.

    Basically, it goes like this: until they find a way to work with the new NVidia drivers, it's going to have a performance decrease of 20%-40% compared to other games with the same graphical capability.

    The whole pay-to-play rumor is full-on bull crap. OK? So stop bringing it up. Planethalf-life.com is getting left in the dust, and in a desperate attempt to get more hits (gamespy.com doesn't run it themselves, they just host it), they keep putting up BS news stories, and it's been going on like this for a while.

    Half-Life 2:

    Will run on all graphics cards with 64mb (but with VERY decreased detail [or unless said not to run on them, which their official site will say])

    Will not be pay-to-play multiplayer

    Will not ship in 3 different versions (this was a big rumor that many believed, and right now VALVe is basically trying to find every site that's posted it, so they can sue them for fraud and the like, since no one has any proof and it's full-on BS)

    Will be out on shelves by September 31st (the reason it won't be on shelves September 30th is that most stores are too stupid to put them on the shelves the second they get the shipment, not to mention all the late shipments)

    Will ship with *MOST* mod tools (the newest version of Hammer [the map editor] will be included, plus the Speech Unit [for lip synching] and Steam. Through Steam, they plan to offer a modeling program you can buy called XSI [models let you make characters, guns, vehicles, and the like])

    Will run with little or no bugs (not sure yet, but that's what the official recorded word from Gabe Newell says)

    If any of this is incorrect, and you can point me to something that proves it - do it. I've been following this, and just about everything they say is unfounded. I bet most of planethalf-life.com's stuff comes from the IRC channel #halflife2 (which is a fake, BS channel).
  • Jammer Join Date: 2002-06-03 Member: 728, Members, Constellation
    Apparently, NVidia is raising a stink because Valve isn't using the optimized drivers for their testing (http://www.planethalflife.com/news/nvidia.shtml). Performance will be much better than ATI is letting on.
  • Quaunaut The longest seven days in history... Join Date: 2003-03-21 Member: 14759, Members, Constellation, Reinforced - Shadow
    And I'm proved right again.

    Probably half a day after it's out, they're gonna patch HL2 to run great with NVidia cards.
  • Gliss Join Date: 2003-03-23 Member: 14800, Members, Constellation, NS2 Map Tester
    This is the point where I say something clever like, "Oh, HL2 is so useless, who needs it!". But inside, my heart is screaming, "NNNNNNNNNNNNNNNNOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!".
  • ViPr Resident naysayer Join Date: 2002-10-17 Member: 1515, Members
    QUOTE (Jammer @ Sep 12 2003, 01:59 AM):
    Apparently, NVidia is raising a stink because Valve isn't using the optimized drivers for their testing. Performance will be much better than ATI is letting on.

    Maybe those drivers cheat?
  • Grimm Join Date: 2003-04-13 Member: 15448, Members
    QUOTE (Quaunaut @ Sep 11 2003, 07:00 PM):
    And I'm proved right again.

    Or, maybe you're *proven* right again. :p

    (Hehe, couldn't resist)

    Anyway, I've had quite enough of people spreading so many rumors about the game. It's just common sense that Valve would lose too much of its fanbase if Half-Life 2 didn't run on NVidia cards. Advice to all you rumor starters/spreaders: look for and read the solid facts first, before you start posting wildly about something that isn't even true.
  • ViPr Resident naysayer Join Date: 2002-10-17 Member: 1515, Members
    Damn, I just had a look at those Doom 3 tests, and Nvidia did beat ATI in them - and ATI with the latest drivers was broken too. Now I'm back where I started, unsure of which card to get. I'm gonna have to wait till they do some more tests with new drivers. It's the war of the drivers.

    BTW, what's interesting is that in general, Doom 3 has better frame rates than Half-Life 2. This totally contradicts reports I've heard before.
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462, Members, NS1 Playtester
    NVidia has already responded:

    QUOTE:
    The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

    In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

    We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.
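    For the curious, the "automatic shader optimizer" NVIDIA mentions is a driver-side rewrite pass over shader programs. A toy sketch of the idea in C++ - invented names, not NVIDIA's code - is to demote an operation to partial (16-bit) precision when nothing but the 8-bit framebuffer can ever observe the extra bits:

        #include <iostream>
        #include <string>
        #include <vector>

        // One instruction of a (toy) pixel shader program.
        struct ShaderOp {
            std::string opcode;    // "_pp" suffix = partial-precision variant
            bool feedsColorOnly;   // result only ever reaches the 8-bit framebuffer
        };

        // Demote full-precision ops to partial precision where the extra
        // bits can never be observed in the final image.
        void demoteToPartialPrecision(std::vector<ShaderOp>& program) {
            for (auto& op : program)
                if (op.feedsColorOnly && op.opcode.find("_pp") == std::string::npos)
                    op.opcode += "_pp";
        }

        int main() {
            std::vector<ShaderOp> program{{"mul", true}, {"dp3", false}};
            demoteToPartialPrecision(program);
            for (const auto& op : program)
                std::cout << op.opcode << '\n';   // prints: mul_pp, dp3
        }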
  • ViPr Resident naysayer Join Date: 2002-10-17 Member: 1515, Members
    OK, but the difference in performance is shocking. Could drivers really make that much difference? And you also have to worry that if they make drivers that run the game faster, it might be by lowering the quality and doing other cheat-like things, hoping we won't look carefully at the images.
  • DOOManiac Worst. Critic. Ever. Join Date: 2002-04-17 Member: 462, Members, NS1 Playtester
    QUOTE (ViPr @ Sep 11 2003, 09:32 PM):
    could drivers really make that much difference?
    Yes. Oh yes.

    And also, HL2 is NOT going to be crap on NVidia cards. Valve would be stupid to let over 50% of their customers have crappy performance... (NVidia still holds market dominance last time I checked)
  • Mercior Join Date: 2002-11-02 Member: 4019, Members, Reinforced - Shadow
    I hereby declare this thread "stupid"
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    edited September 2003
    QUOTE (DOOManiac @ Sep 11 2003, 09:01 PM):
    QUOTE (ViPr @ Sep 11 2003, 09:32 PM):
    could drivers really make that much difference?

    Yes. Oh yes.

    And also, HL2 is NOT going to be crap on NVidia cards. Valve would be stupid to let over 50% of their customers have crappy performance... (NVidia still holds market dominance last time I checked)

    Actually, ATI holds market dominance (fighting with Diamond/S3) in a much larger field... the OEM market. :D
    nVidia *does* hold the budget card share quite well (mostly due to their marketing department running a smear campaign to kill the PVR Kyro2, which also killed the K3 even before it could be put in a card), and some of the mid-range.
    Mid-high falls to ATI pretty overwhelmingly (if you need speed and image quality, and don't want to lay out $20,000 for a vid card, you're getting an ATI), and the high end is (of course) dominated by Matrox and niche cards.

    HL2 will *run acceptably* on a GFFX, but it will default to DX8.1 mode (no Pixel Shader 2.0 support, which is required to see all the pretty-pretties the way VALVe intended). You can shift it into full DX9 mode; just expect *huge* framerate hits due to nVidia's slipshod approach to architecture updates.
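    Assuming the retail build keeps the -dxlevel launch option that Source engine builds expose - an assumption on my part, VALVe hasn't confirmed the final switches here - shifting between those modes would be a command-line flag rather than a menu toggle:

        hl2.exe -dxlevel 90
        hl2.exe -dxlevel 81

    The first forces the full DX9 path; the second stays on the DX8.1 default.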

    QUOTE (PHL @ nVidia Press Release):
    Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation.

    Hmmm. Let's see. Admitting outright that their cards are hard for programmers to work with (requiring specific rendering pipelines to get any kind of performance), making excuses as to why the full new (standardized) way is no better than their current incomplete implementation of it, and finally... ACTUALLY EXPECTING people to *believe* that there is no image quality degradation when you're losing *16 bits of precision*!
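    To put a number on that complaint: an fp32 value keeps a 23-bit mantissa, fp16 only 10. The toy C++ below - illustrative only; it truncates mantissa bits and ignores fp16's narrower exponent range - shows the rounding error the demotion introduces:

        #include <cstdint>
        #include <cstdio>
        #include <cstring>

        // Keep only the 10 mantissa bits a half-precision float stores,
        // dropping the low 13 of fp32's 23 (exponent-range loss ignored).
        float narrowMantissa(float v) {
            std::uint32_t bits;
            std::memcpy(&bits, &v, sizeof bits);
            bits &= ~((1u << 13) - 1);
            std::memcpy(&v, &bits, sizeof v);
            return v;
        }

        int main() {
            float texcoord = 0.123456789f;          // a typical shader value
            float demoted  = narrowMantissa(texcoord);
            std::printf("fp32   : %.9f\nfp16ish: %.9f\nerror  : %.9f\n",
                        texcoord, demoted, texcoord - demoted);
        }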
  • CommunistWithAGun Local Propaganda Guy Join Date: 2003-04-30 Member: 15953, Members
    I wish they could make a game that didn't limit its fun by how much loot you have. At first I thought that's the way HL2 was; now I'm second-guessing.
  • RyoOhki Join Date: 2003-01-26 Member: 12789, Members
    First I posted this:

    QUOTE:
    Well, I don't know much about all this kind of stuff, but I have a simple question:

    My system specs are an Athlon XP 1800, 512 meg of DDR RAM and a Winfast GeForce 4 Ti4200 vid card. Will HL2 run fine on my system?

    And got an answer of "Yes". Thank you, CWAG :D

    But now I'm after more specific information. I wish to know what kind of HL2 experience I will be receiving. Could I have all graphical options on? Could I have full pixel shading etc. *insert all those weird funky techy thingys that cards do these days*? Will the game run smoothly? Will I have to tone down some of the graphical options? If someone knows the answers to these questions, please reply, because I can't make anything out of the tangled web this thread has become :p
  • CommunistWithAGun Local Propaganda Guy Join Date: 2003-04-30 Member: 15953, Members
    Well, you can, because it's better than a 700MHz DX6 card....

    Like I said, second-guessing now. ATI is just out to make money, and I can't afford to keep spending small-car money on video cards.
  • fribble Join Date: 2003-09-11 Member: 20744, Members
    Basically, this is one of two things:

    either Valve/Nvidia is stirring the pot for kicks and giggles

    or

    they are farking well serious, and I might just have to have a word with Gabe and make some eloquent arguments (i.e. I'm gonna slap him like a ****)

    Whatever - HL2 is as sweet as a very sweet thing, so it might be worth it anyway.
  • Majin Join Date: 2003-05-29 Member: 16829, Members, Constellation
    edited September 2003
    Don't be stupid, ppl.
    If you have a GF4 card or higher, you're gonna be able to play HL2 just fine, but you won't be getting the SEX you would with a new ATI.
    The game will be playable, just not full of the SEX you would get from the Radeon.
  • MonsieurEvil Join Date: 2002-01-22 Member: 4, Members, Retired Developer, NS1 Playtester, Contributor
    QUOTE (Ryo-Ohki @ Sep 12 2003, 10:17 AM):
    Could I have all graphical options on? Could I have full pixel shading etc. *insert all those weird funky techy thingys that cards do these days*? Will the game run smoothly? Will I have to tone down some of the graphical options?

    You will not be able to have all graphical options on, as you do not have a fully DirectX 9-optimized card - many of the cool effects in the game require that. The GeForce 4 was mainly optimized for 8.1; grabbing a late-model ATI 9x00 or Nvidia FX-class card will get you full DirectX 9 compatibility...

    As for the rest of this nonsensical thread (complete with 'Le Fanbois de ATI'), just wait and see. It's not in Valve's best interest to run badly on the half of modern video cards that Nvidia makes. I'm sure the performance differences at the high end will be rather negligible to the human eye, just like in all games. This sort of dopey thread happens with every big-name game ever released, and it always ends up being a moot point.
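    That capability point is checkable in code. A small sketch using the standard Direct3D 9 caps query - the PS 2.0 threshold mirrors what this thread describes; the framing around it is my own illustration:

        #include <d3d9.h>
        #include <cstdio>

        int main() {
            // Create the D3D9 interface and ask the primary adapter for its caps.
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) return 1;

            D3DCAPS9 caps;
            d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

            // Pixel Shader 2.0 is the gate for the "full DX9" effects path.
            bool fullDx9 = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
            std::printf("Full DX9 (PS 2.0) path: %s\n",
                        fullDx9 ? "available" : "not available - fall back to 8.1");

            d3d->Release();
            return 0;
        }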
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    It may not be in their best interests, but it's also not in their best interests to create a sub-group specifically to write an nVidia-specific rendering pipeline when the ATI card is scooting along just fine with standard OpenGL. That's what the nVidia press release means... ATI cards perform junky when subjected to an nVidia-specific pipeline, and nVidia cards run like crud in standard OpenGL applications. :p