Nvidia or ATI?

TommyVercetti Join Date: 2003-02-10, Member: 13390 (Members, Constellation, Reinforced - Shadow)
[FX 6800 vs. X800] I'm in the market for a new PC, and I feel like designing my own. I hear that the ATi X800 gets better framerates at high resolutions and settings, but does not support the advanced DX9 effects. So, can you geniuses (I'm serious) help me out with the video card decision? The games I'm most likely going to be playing on it are Doom III, Half-Life 2, STALKER: Shadow of Chernobyl, and others like them. So, which would you recommend? Keep in mind I really love the eye candy.

Comments

  • Swiftspear (Custim tital) Join Date: 2003-10-29, Member: 22097 (Members)
    And thus we pave the way for Talesin once again!

    Haven't you been reading any of the other topics? ATI owns NVidia in every way.
  • Quaunaut (The longest seven days in history...) Join Date: 2003-03-21, Member: 14759 (Members, Constellation, Reinforced - Shadow)
    He isn't lying. I've got an NVidia, and frankly, just TOUCHING an ATi makes you realize how superior they are.

    Don't touch NVidia: terrible cards, terrible drivers, terrible service.
  • NumbersNotFound Join Date: 2002-11-07, Member: 7556 (Members)
    I've never had an Nvidia driver go bad or incompatible on me... and I've been using Nvidia for 5 years now.
  • Talesin (Our own little well of hate) Join Date: 2002-11-08, Member: 7710 (NS1 Playtester, Forum Moderators)
    edited July 2004
    Actually Tommy, you have that backward. ATi cards excel at DX9. OpenGL they lag slightly, because they still use full colour calculations (while nVidia cards cut the back-end calcs, making the image look crappy to anyone not half-blind).

    Perhaps what you're thinking of is 'Pixel Shader 3.0'... which is mostly a marketing scam, as any card that can run actual PS2.0, and has an on-board hardware T&L unit (aka: any ATi card after the R9500) can do them with minor driver handle-modifications by the coders.
    Oddly, nVidia is boasting that they can do them... when the 6800 series is the first nVidia card to actually handle ANYTHING past PS1.4 :)


    Short version, the x800 series kicks the living crap out of the 6800 series. The XTPE is the fastest consumer card on the market ($500 MSRP). The 6800 Ultra Extreme ($700 MSRP) is a 'golden sample' card, which means they find the BEST ones out of their stock and clock them higher. They STILL can't keep up with the x800 XTPE, draw two DEDICATED molex power connectors, and hog two expansion slots. They also go back to the 'dustbuster' cooling system that was mocked so much on the FX5800.



    Go for the ATi. Your eyes, ears and wallet will thank you later.
  • NumbersNotFound Join Date: 2002-11-07, Member: 7556 (Members)
    Uhh, they don't use the dustbuster at all.

    The images look nearly identical to me on both cards unless you get out a magnifying glass (and even then it isn't much).

    The benchmarks are actually pretty similar. The two cards are neck and neck.
  • Quaunaut (The longest seven days in history...) Join Date: 2003-03-21, Member: 14759 (Members, Constellation, Reinforced - Shadow)
    QUOTE (404NotFound @ Jul 18 2004, 06:33 PM):
        I've never had an Nvidia driver go bad or incompatible on me... and I've been using Nvidia for 5 years now.
    I'm not saying they go bad.

    I'm saying they CAN'T DO CRAP.
  • NumbersNotFound Join Date: 2002-11-07, Member: 7556 (Members)
    edited July 2004
    QUOTE (Quaunaut @ Jul 18 2004, 08:50 PM):
        QUOTE (404NotFound @ Jul 18 2004, 06:33 PM):
            I've never had an Nvidia driver go bad or incompatible on me... and I've been using Nvidia for 5 years now.
        I'm not saying they go bad.
        I'm saying they CAN'T DO CRAP.
    Example?

    I'd also like to know about the "terrible service" claim, since Nvidia doesn't actually make the cards.
  • JHunz Join Date: 2002-11-15, Member: 8815 (Members, Constellation)
    QUOTE (Quaunaut @ Jul 18 2004, 08:50 PM):
        QUOTE (404NotFound @ Jul 18 2004, 06:33 PM):
            I've never had an Nvidia driver go bad or incompatible on me... and I've been using Nvidia for 5 years now.
        I'm not saying they go bad.
        I'm saying they CAN'T DO CRAP.
    Much like ATI's Mobility or Linux drivers, I hear.
  • Talesin (Our own little well of hate) Join Date: 2002-11-08, Member: 7710 (NS1 Playtester, Forum Moderators)
    No, the 6800 UE requires a beefier cooling system than the standard 6800 Ultra, and it IS a dustbuster. Bearable during normal 2D use, but HORRIBLY loud once you get into an actual 3D game. The 6800 GT uses a single slot and a single connector, is fairly quiet, and gets slapped around like a five-year-old in a Hells' Angels bar by the x800 Pro. :)

    And JHunz, the Mobility series uses the same drivers as the standard desktop boards.
    The Linux drivers are just about as difficult to install as any other module (easier, actually, given that they come with an option to compile a custom module based upon your source tree, rather than a static-linked binary)... the only kvetch I have with them is that they don't have a 64-bit Linux version available yet.
  • TommyVercetti Join Date: 2003-02-10, Member: 13390 (Members, Constellation, Reinforced - Shadow)
    Thanks for the advice. I had always planned on getting an X800, but after seeing some screenshots of Far Cry with high dynamic range rendering and hearing that the X800 couldn't do it, I was concerned. Looks like ATi is getting my hard-earned dollars.
  • NumbersNotFound Join Date: 2002-11-07, Member: 7556 (Members)
    QUOTE (Talesin @ Jul 18 2004, 09:09 PM):
        No, the 6800 UE requires a beefier cooling system than the standard 6800 Ultra, and it IS a dustbuster. Bearable during normal 2D use, but HORRIBLY loud once you get into an actual 3D game. [...]
    I've not heard a single source say it's like the old "dustbuster". They say it CAN be, but it never got that hot.
  • 7Bistromath Join Date: 2003-12-04, Member: 23928 (Members, Constellation)
    QUOTE (TommyVercetti @ Jul 18 2004, 08:21 PM):
        I'm in the market for a new PC, and I feel like designing my own. I hear that the ATi X800 gets better framerates at high resolutions and settings, but does not support the advanced DX9 effects.
    This isn't really the place to go for balanced advice. There are too many fanboys on either side.

    Really, you can say that of any internet forum. Best bet is to just flip a coin, I say.
  • TheChuckster Join Date: 2003-09-20, Member: 21056 (Members)
    I've had nothing but good experiences with NVidia, both as a gamer and a game developer. The driver quality is excellent (especially their Linux drivers). Even John Carmack says, "When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault." The frame rates are fast and the graphics quality is excellent. ::gorge::
  • TommyVercetti Join Date: 2003-02-10, Member: 13390 (Members, Constellation, Reinforced - Shadow)
    Oh ****. I have to do some real research now. Damn you fanboys!
  • EEK Join Date: 2004-02-25, Member: 26898 (Banned)
    edited July 2004
    QUOTE (TheChuckster @ Jul 18 2004, 09:18 PM):
        I've had nothing but good experiences with NVidia, both as a gamer and a game developer. The driver quality is excellent (especially their Linux drivers). Even John Carmack says, "When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault." The frame rates are fast and the graphics quality is excellent. ::gorge::
    So what? Go to Alienware's page and read their 'testimonials'. Guess who got paid the most?

    Secondly, you are NOT a game developer. If you care to prove me wrong, give me a title, name, and publishing company, but the ONLY thing I've ever seen you able to do is run your Linux emulators.

    Third, even though you CLAIM to be a game developer (news flash - coding your little Flash applets doesn't make you a developer on par with John Carmack), you say that frame rates are superb. Yeah, sure; just like Pentiums, NVidia appeals to bigger numbers rather than better quality, so all the idiot kiddies out there think their penises are bigger (these are the same people who think refresh rates are bunk and brag that they get 100 fps in Half-Life).

    Fourth, and going back:

    QUOTE:
        I'd also like to know about the "terrible service" claim, since Nvidia doesn't actually make the cards.

    That sounds like reason enough to me why they're ****. You get an ATI made by ATI, you know what you're getting. You go to buy an nvidia card, you have to check which manufacturers are better and which will screw you, and I got SCREWED on my Geforce 4 (the thing melted its own cooling unit).
  • Jim_has_Skillz Join Date: 2003-01-19, Member: 12475 (Members, Constellation)
    Oh come on Talesin, don't say that Nvidia's pixel shader technology is just a marketing scam; it's actually a good way of doing business. Yes, it's true that most games out there will only run PS 2.0 for a while, but since Geforce supports 3.0, it will allow developers to develop now instead of later, and guess which cards they're going to use.

    Also, don't forget about PCI Express now, Talesin. If you want the best money can buy, you can put 2 video cards into your system, doubling the power and easily destroying ATi. There is a lot Geforce is doing for the market and they keep getting better and better. Yes, ATi is damn good (which is great, it creates competition which means lower prices), but if you truly want the best, get a pair of Geforce 6800 Ultras with PCI Express. Beware if you do that: it means you're going to have to get a motherboard that has PCI Express slots (which unfortunately takes up 2 of your regular PCI slots; wouldn't matter for me though, cuz I don't use any PCI slots).

    If you want more reviews and ranting on this topic, you should check some previous threads that are in Off-Topic.
  • EEK Join Date: 2004-02-25, Member: 26898 (Banned)
    edited July 2004
    3DFX invented the dual card system.



    3DFX is dead for a variety of reasons, mainly because they tried to pull that same **** - pass off lower cards and say 'well you should buy two to do it best' or something.



    That said:

    http://www.nordichardware.com/reviews/graphiccard/2004/r420/index.php?ez=10


    Now flip through - the 6800 beat the X800 in only a couple of games, or did it by a difference of less than 8 FPS. Whereas when the X800 beat the 6800, it did it by a massive amount.


    1600 4xAA 8xAF - the X800 did far more than THREE TIMES as well as the 6800. That is freaking PATHETIC.


    Also, this article is HIGHLY nvidia biased - they pass off graphical errors and corruption on the NVidia cards (read the end of that Far Cry page for one) and say 'But we'll just wait for the next patch'. Then read the Temporal AA page - they lambast it, saying it's crap since it doesn't work at under 60 FPS. What's worse - the NV40 drivers that GIVE ERRORS, or a feature that is working just as intended but not how THEY want it? Apparently the latter.


    As for the pixel shader - consider how many games have been created for FUTURE cards using FUTURE technology lately. Sorry, but right now the only thing we have to go on is Unreal 3. And frankly? The super-ultra-vast majority of games do NOT use 'potential' technology unless they're ULTRA POSITIVE that they'll sell. Look at games like Call of Duty - it was built on existing technology. Games like Doom3 raise the bar, but you're fooling yourself if you think that within a year more than a handful of games will be using pixel shader 3.0 - and by the time they do, it won't matter that NVidia had it first.


    Dreamworks and Pixar have ultra powerful machines, superior rendering systems. Do game developers make games that look like those? No, nearly never. Because if they did right away, wow look, a dozen people are able to run the game.

    Buying the 6800 for PS3.0 is sheer idiocy. If you did, you're an utter fool.
  • Swiftspear (Custim tital) Join Date: 2003-10-29, Member: 22097 (Members)
    Pretty much the only advantage NVidia has over ATI is their dual card scam. If you have a couple thousand bucks lying around, go for it. Otherwise, get the bang for your buck: the X800.
  • 7Bistromath Join Date: 2003-12-04, Member: 23928 (Members, Constellation)
    edited July 2004
    QUOTE (Jim has Skillz @ Jul 18 2004, 09:46 PM):
        If you want more reviews and ranting on this topic, you should check some previous threads that are in Off-Topic.
    Don't you think he knows? He's been in pretty much every one. He's known in other planes of reality for hating nVidia with a burning passion.
  • illuminex Join Date: 2004-03-13, Member: 27317 (Members, Constellation)
    If Half Life 2 is your #1 game, get the X800. If Doom III is your #1 game, wait for the benchmarks to come out, but you'll probably want to be using the 6800. If neither is your game of choice, get the X800.

    The "advanced technologies" of the 6800 are not going to be used a whole lot for at least 2-3 years, at which point those technologies will be standard on every card, and will be more efficient than they are now. When you buy the 6800, you're investing in future technology that will be done better in less than a year, on lesser cards.

    Be smart; wait for the PCI Express mobos and X800s to come out and buy those. That will be worth the time and money.
  • Jim_has_Skillz Join Date: 2003-01-19, Member: 12475 (Members, Constellation)
    QUOTE (EEK @ Jul 18 2004, 06:55 PM):
        3DFX invented the dual card system. [...] Buying the 6800 for PS3.0 is sheer idiocy. If you did, you're an utter fool.
    Yeah, you're right, 3DFX did develop PCI Express first, and then Nvidia bought 3DFX, thus allowing them to update the hardware and resell it.

    EEK, you should really do some research on this, but since it's obvious you didn't do your homework, I will give you a basic rundown of how PCI Express works (see the sketch below).

    Basically, you have two cards (right now Nvidia is saying they should be of the same type, for instance two Geforce 6800 Ultras instead of, say, a Geforce 6800 Ultra and a Geforce 6800 GT). The graphics work to be processed is divided in two, and each card is given an equal amount of work to do for the final output. Once the processing is done, the two outputs become one final output which gets displayed.

    If you still don't understand how it works, go look around for a couple articles on it.
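    For what it's worth, here is a minimal, purely illustrative sketch of the idea being described: split a frame in two, let each card render its own half, then stitch the halves back together. It's a toy model in Python, not NVIDIA's actual SLI interface; the two "cards" below are just ordinary functions standing in for real hardware.

        # Toy model of split-frame rendering: two hypothetical "GPUs" each shade
        # half of the scanlines, and the halves are recombined into one frame.
        # Illustration only; no real GPU, driver, or SLI API is involved.

        def shade_pixel(x, y):
            # Stand-in for real per-pixel work (e.g. a pixel shader).
            return (x * 3 + y * 7) % 256

        def render_rows(width, rows):
            # One "card" renders only the scanlines it was assigned.
            return [[shade_pixel(x, y) for x in range(width)] for y in rows]

        def render_split_frame(width, height):
            split = height // 2
            top = render_rows(width, range(0, split))          # work for card 0
            bottom = render_rows(width, range(split, height))  # work for card 1
            return top + bottom                                # recombine halves

        if __name__ == "__main__":
            frame = render_split_frame(8, 8)
            print(len(frame), "rows x", len(frame[0]), "columns rendered")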

    That being said, one test does not mean that a certain card is better than the others. This has already been described in previous threads, but I will say it again. We already know that the Geforce gets its **** kicked in high anti-aliasing tests (that's what ATi is good at!!). But guess what, whenever OpenGL comes around, guess whose **** is getting kicked (you probably thought Nvidia, and you were horribly wrong). Nvidia does a fine job of kicking ATi's **** in OpenGL, therefore giving them the edge in that market.

    And about the pixel shaders: READ MY FREAKIN POST, MAN. I said that Nvidia has the technology to allow GAME DEVELOPERS to begin work on games that actually use this technology by using their hardware. By no way did I say that the GAMERS would benefit from this card (except that they can look forward to seeing nicer-looking games a lot sooner than they probably would have).

    My point is that both cards are similar and you can't give one test to show that one card is exponentially better than the other, because then you're not giving a fair review! You have to give the whole picture, and when the whole picture is drawn in this manner, you will find that both cards are very similar except that one developer has the technology to allow 2 video cards in one system instead of just one.

    It is a fact, 2 is better than 1, you can't dispute that.
  • EEK Join Date: 2004-02-25, Member: 26898 (Banned)
    And? I can theoretically put in 2 AMD 1600s and, using your utter lack of logic, say that that means the 1600 is better than the 2800.
  • Talesin (Our own little well of hate) Join Date: 2002-11-08, Member: 7710 (NS1 Playtester, Forum Moderators)
    Uh... Jim? Do a bit of research yourself, please. What you're thinking of is:
    SLI, or Scan-Line Interleaving... the system that uses a pair of bridged cards.
    PCI-Express, or PCI-X, is a replacement for the AGP port on your motherboard. It offers far greater bandwidth for large amounts of data-transfer (usually textures are the largest part).

    However, SLI is only available (last I saw from their page) on the PCI-X version of a hand-selected crop of 6800 Ultra Extremes, with custom PCBs.

    Short version, they will be as rare as hen's teeth, require a standalone PSU to run them, be noisy as all crap, and deliver approximately 150%-190% of the current 6800 UE performance numbers. Which will leave the X800 XTPE in the dust. However, the cost adds up: at least $700 per card (MSRP for a 6800 UE), plus a markup for the PCI-X version, plus a markup for the SLI bridge PCB, plus a markup for the motherboard that has the slots to support it.


    If you're willing to spend approximately $2000 for the absolute cream of the crop consumer-grade 3D card, then that's fine with me.

    However.
    You could probably liquid or vapor-cool an X800 XTPE, get a custom BIOS from ATi, and overclock it to hell and back for about half the cost, the same performance, and a little bit of risk. ;) Oh, and you wouldn't have to switch out your motherboard.
    Now, are we both done talking about fantasy-cards that few, if any, will ever realize?

    Short version. The XTPE is $500, and is faster than the UE at $700. And let's not even talk about how easy it is to walk down to the local Fry's and pick up an XTPE, as opposed to trying to find one of the rare 'Golden Sample' 6800 UE cards... it'll only go downhill. :)
  • TommyVercetti Join Date: 2003-02-10, Member: 13390 (Members, Constellation, Reinforced - Shadow)
    QUOTE (illuminex @ Jul 18 2004, 10:10 PM):
        If Half Life 2 is your #1 game, get the X800. If Doom III is your #1 game, wait for the benchmarks to come out, but you'll probably want to be using the 6800. If neither is your game of choice, get the X800. [...]
    Since my ceiling is about $3,500, I'm going for the X800 XT P.E. Two 6800s is simply not worth it.

    And to answer your question, these games are the most important to me:
    1. STALKER: Shadow of Chernobyl
    2. Half-Life 2
    3. Doom III
  • Quaunaut (The longest seven days in history...) Join Date: 2003-03-21, Member: 14759 (Members, Constellation, Reinforced - Shadow)
    QUOTE (404NotFound @ Jul 18 2004, 06:50 PM):
        QUOTE (Quaunaut @ Jul 18 2004, 08:50 PM):
            QUOTE (404NotFound @ Jul 18 2004, 06:33 PM):
                I've never had an Nvidia driver go bad or incompatible on me... and I've been using Nvidia for 5 years now.
            I'm not saying they go bad.
            I'm saying they CAN'T DO CRAP.
        Example?
        I'd also like to know about the "terrible service" claim, since Nvidia doesn't actually make the cards.
    Far Cry? Splinter Cell: Pandora Tomorrow? ANY game using pixel shaders (other than Doom III - it's OpenGL based, so they'll be pretty close to the same).
  • Jim_has_Skillz Join Date: 2003-01-19, Member: 12475 (Members, Constellation)
    QUOTE (Talesin @ Jul 18 2004, 08:18 PM):
        Uh... Jim? Do a bit of research yourself, please. What you're thinking of is SLI, or Scan-Line Interleaving... the system that uses a pair of bridged cards. PCI-Express, or PCI-X, is a replacement for the AGP port on your motherboard. [...]
    Sorry, yeah, it's called SLI, but from my knowledge and the reviews I have read, it won't be rare and it certainly isn't going to cost that much. It also isn't JUST FOR the cream of the crop Geforce cards. I think the lowest cards they will support are the Geforce 6800 GTs.

    As for you EEK, I hope that's not your rebuttal... Of course I didn't mean two chips of much lesser value than one big one. We are talking about 2 cards that are different in many ways but really similar in benchmarks, etc. I hope you aren't saying that the X800 card is more than 2 times better than the Geforce equivalent.
  • Talesin (Our own little well of hate) Join Date: 2002-11-08, Member: 7710 (NS1 Playtester, Forum Moderators)
    Jim, look at how often it was used back when 3Dfx tried it. You could buy two of their cards and hook 'em together with a proprietary bridge cable (afaik, you'll have to buy the nVidia SLI as a package deal).
    They sold less than a thousand of the cables in the consumer market, which were *required* for SLI.
    It was a bomb. A goose-egg. A cash-sink that helped to drag them downward. I only find it amusing how many of 3Dfx's mistakes nVidia is DUPLICATING, SLI only being the latest.

    In the case of nVidia, the cards must be SLI-specific... which the early review boards are NOT. They don't have the headers. The average consumer board may not have the headers, either.
    See prior note about 'jack up price for X feature'.

    Regardless, you're going to be paying at least $800 (assuming a pair of GTs) for a monstrous space heater that will dominate your case and give only marginally better performance (if that) than the ATi solution. You'll also be giving up two full, dedicated power leads from your PSU, which leaves most people with one to run all of their HDDs, optical drives, and any accent lighting.
    That's right. For EACH of these cards, nVidia demands the sacrifice of one PSU lead... on the 6800 UE, make that two PSU leads PER CARD. You'll literally have to put in a second PSU *just to run the video cards*.
  • Marik_Steele (To rule in hell...) Join Date: 2002-11-20, Member: 9466 (Members)
    edited July 2004
    QUOTE (Talesin @ Jul 19 2004, 12:58 AM):
        [...]
        You'll also be giving up two full, dedicated power leads from your PSU, which leaves most people with one to run all of their HDDs, optical drives, and any accent lighting.
        That's right. For EACH of these cards, nVidia demands the sacrifice of one PSU lead... on the 6800 UE, make that two PSU leads PER CARD. You'll literally have to put in a second PSU *just to run the video cards*.
    This is where 3DFX went absolutely nuts. On their very rare high-end model of the Voodoo5 (I believe it was called the "6000") their prototypes required something different.

    You know that cable going straight from your wall outlet/surge protector to the back of your computer? Yeah. Think of having a 2nd one of those cables plugged directly into the back of the video card. I didn't believe it either until I saw it: http://www.sysopt.com/articles/21stCenturyTrends/v5_6000.png


    In any case, it's evident that nVidia's engineers aren't pushing as hard as ATi's in terms of power consumption and heat.
  • Defiance Join Date: 2003-12-01, Member: 23847 (Members)
    After tons of garbage driver problems, and incompatibility with certain VIA chipsets, I will never buy another ATI card for as long as I live. RADEON 9600XTs do not work with VIA KT600 chipsets, regardless of the motherboard model or manufacturer. I'm not saying their cards are bad in terms of hardware and how well they can run games, I'm saying their cards are bad in terms of garbage drivers and compatibility issues. The most you can get out of that card in that situation is 4x AGP mode - half of what it can really do.

    All I know is I've never had problems with any Nvidia cards I own.

    And after looking at a lot of benchmarks, since I have to get a new video card now, most of the higher-end ones, like others have said, are alike in many ways, but each card has its own advantages.

    Personal preference... buy a card for what you want it to do, and make sure to research it thoroughly to make sure it will work with all your current hardware.
  • NumbersNotFound Join Date: 2002-11-07, Member: 7556 (Members)
    QUOTE (Talesin @ Jul 19 2004, 12:58 AM):
        Regardless, you're going to be paying at least $800 (assuming a pair of GTs) for a monstrous space heater that will dominate your case and give only marginally better performance (if that) than the ATi solution. You'll also be giving up two full, dedicated power leads from your PSU, which leaves most people with one to run all of their HDDs, optical drives, and any accent lighting.
        That's right. For EACH of these cards, nVidia demands the sacrifice of one PSU lead... on the 6800 UE, make that two PSU leads PER CARD. You'll literally have to put in a second PSU *just to run the video cards*.
    Am I the only one that has a PSU that is overflowing with molex connectors? I must have like 8 of the things, totally in excess.