Video Card Bias?


Comments

  • Eddie Join Date: 2004-10-22 Member: 32412
    ^^ Actually, ATI's HD 4870 X2 outperforms the GTX 280 and 285 now. Nvidia's GTX 295 is top dog, though, for 50-100 dollars more.

    Definitely going to look into getting myself some ATI tri-sli action going with that 4870 X2 :D
  • rebirth Join Date: 2007-09-23 Member: 62416
    Any bias just comes down to how the engine works.
    It has nothing to do with what hardware the devs are using or who gives them money (most of the time, at least).

    ATI and Nvidia cards do their calculations in different ways, and depending on what the engine favors (shaders or pure clock speed), one of the two performs better.
    At least it's something along those lines, with more high-tech babble involved that I don't understand.

    But it's not like devs go "oh, we love ATI and all our PCs have ATI GPUs, so let's make ATI cards perform better!" Most of the time it's just the nature of the engine that decides which cards perform better, and how the engine works mostly gets decided by what's easiest to work with, not by which GPU performs best with it.
  • WhiteZero Join Date: 2004-06-24 Member: 29511
    QUOTE (rebirth @ Jun 17 2009, 03:10 AM): "Any bias just comes down to how the engine works. It has nothing to do with what hardware the devs are using or who gives them money [...] most of the time it's just the nature of the engine that decides which cards perform better."
    Think about it: while you're debugging and optimizing an engine and you're all using one brand of GPU, those optimizations are going to be geared more towards the GPU you all have equipped.
    You keep optimizing your game until you see noticeable differences at runtime, and a certain percentage of those optimizations are mainly going to apply under the same hardware the devs were using.
  • cerberus414 Join Date: 2005-05-07 Member: 51098
    I think I understand what WhiteZero is trying to say here, and I agree with him. If UWE is only using ATI or Nvidia and not both, there is a good chance that the engine will be better optimized for that brand, since that's the brand the tweaking will be done on. The engine does not directly influence how well one video card performs versus another. They all support the standard APIs (OpenGL and D3D), but how well each card performs those API functions internally is decided during the design and manufacturing of the GPU. If the engine mainly relies on the functions that are better geared towards one brand, then that brand will be the winner.

    On a side note, I think UWE will use both brands to tweak NS2, primarily for debugging purposes. The OpenGL and D3D implementations are huge and therefore still have bugs and issues, so when you write a piece of code it might work on one GPU and not another. The NS community is roughly split between ATI and Nvidia, so when UWE debugs the game they need to have both GPU brands in-house to reproduce bugs (a minimal sketch of the kind of vendor-specific workaround this leads to is appended after the thread). It's not a matter of preference which card will come out on top (unless endorsed), it's just a matter of applying good programming practices.
  • juice Join Date: 2003-01-28 Member: 12886
    Yeah, "bias" is probably the wrong word; any difference in performance would not be a goal or preference of the developers.
  • SentrySteve Join Date: 2002-03-09 Member: 290
    Throughout my gaming career I've owned Voodoo, Nvidia, and ATI cards, and the two ATI cards I've had were the only video cards to die on me due to manufacturer error or some bull###### "wear and tear." Nvidia cards don't suck.
  • WhiteZero Join Date: 2004-06-24 Member: 29511
    QUOTE (juice @ Jun 17 2009, 02:13 PM): "Yeah, "bias" is probably the wrong word; any difference in performance would not be a goal or preference of the developers."
    Yes, bias was probably a bit strong of a word to use for the subject.
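
To make the "works on one GPU, breaks on another" point above concrete, here is a minimal sketch, in C, of the kind of vendor check game code tends to accumulate. It is not taken from UWE or any real engine: it assumes an OpenGL context is already current, and the use_slow_fbo_path flag and the ATI/AMD workaround it toggles are purely hypothetical.

    /* Minimal sketch: detect the GPU vendor at startup and flip a
     * hypothetical workaround flag.  Assumes a current OpenGL context. */
    #include <stdio.h>
    #include <string.h>
    #include <GL/gl.h>

    static int use_slow_fbo_path = 0;   /* hypothetical fallback flag */

    void detect_gpu_vendor(void)
    {
        const char *vendor   = (const char *)glGetString(GL_VENDOR);
        const char *renderer = (const char *)glGetString(GL_RENDERER);

        if (vendor == NULL || renderer == NULL) {
            fprintf(stderr, "No current GL context; cannot detect GPU\n");
            return;
        }
        printf("GPU: %s / %s\n", vendor, renderer);

        /* The kind of per-vendor branch an engine ends up with: a feature
         * that misbehaves on one driver gets routed to a fallback path. */
        if (strstr(vendor, "ATI") != NULL || strstr(vendor, "AMD") != NULL)
            use_slow_fbo_path = 1;   /* purely illustrative workaround */
    }

Branches like this usually appear only after a bug has been reproduced on a particular card, which is why having both brands in-house for debugging matters far more than any deliberate preference.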