Radeon Hits Back

Comments

  • DragonMech Join Date: 2003-09-19 Member: 21023 Members, Constellation, Reinforced - Shadow
    edited May 2004
    QUOTE: "Second of all, the X800 XT cards have not come out. I said the cheaper card (and I will look this up for you to be more specific), the Radeon X800 Pro (not the XT), has come out and can be bought in stores."
    Yeah, it comes out at the end of the month. So? I'd be very willing to wait less than three weeks for a card as hot as the X800 XT PE.

    QUOTE: "Third of all, you're basing your review of the GF6 on one aspect of their card, which is FSAA. I can do that also: ATi's card is incomparable to the GF6 because the GF6 is 50 fps above the ATi card in all the benchmarks on Call of Duty. See how that's ONE aspect of the card where it clearly dominates."
    Ok, it's better at one game. Woo hoo. On average the X800 beats it in just about every other game. And since when is this (http://www.tomshardware.com/graphic/20040504/ati-x800-19.html), this (http://www.tomshardware.com/graphic/20040504/ati-x800-18.html) or this (http://www.tomshardware.com/graphic/20040504/ati-x800-17.html) a 50 FPS difference? At best it's ~30 FPS higher. And it's really hard to tell a difference above 100 FPS (which is the case for nearly all of the CoD tests). Most monitors can't even handle above 80 Hz at UXGA resolution (1600x1200).

    QUOTE: "Fourth of all, there are a lot of better benchmarks to use as your bias, considering the one you used is comparing the Geforce in 8xFSAA compared to the 6xFSAA of the ATi card in one part of the test. BTW, if you didn't notice, Nvidia won overall in that aspect vs ATi."
    Since when does 84.2 FPS lose to 22.4 FPS? Or (although it's not really that much of a difference) 104.5 FPS to 102.9? In only one test does the GF6 win, and that's by less than 6 FPS.


    And on one last note, take a look at the card Alienware recommends (even though it's more than $200 cheaper!). The arrow & circle tell all:
  • Har_Har_the_Pirate Join Date: 2003-08-10 Member: 19388 Members, Constellation
    edited May 2004
    This is kinda off topic, but can someone tell me why the hell you would put AA on at any resolution? I see no difference ever. Soooo, wth?

    And why do they still AA the whole image and not just the edges of the polygons, similar to how the Parhelia works?
  • Jim_has_Skillz Join Date: 2003-01-19 Member: 12475 Members, Constellation
    QUOTE (Dragon_Mech @ May 4 2004, 09:40 PM): [quoted in full above]
    Alienware doesn't have the true prices yet, but yes, the Ultra card will be more expensive than the X800, and that's because they are only going to make so many. The ATi card is recommended only because it will be cheaper, even though it will perform a little worse than the limited-edition Geforce.

    BTW this is the end of that section from which you took that screenie.

    The GeForce 6800 Ultra wins this discipline, albeit by a slim margin. ATI, on the other hand, is trying to answer this new threat to its traditional turf with its new Temporal AA (see above). Unfortunately, due to the obligatory V-Sync, this AA mode cannot be benchmarked correctly. Also, as we explained before, the effect can't be seen in screenshots.

    Geforce 6 won Overall.

    But again, this discussion will get nowhere as it stands. There are too many changes in store for BOTH cards to say which one is the better pick, and both will be priced the same. The Geforce 6 can easily compete with the ATi card as it stands, but there are a couple of things up Nvidia's sleeve which you might have missed. One is the 3.0 shader support they offer in their card, where the ATi card only offers 2.0 shaders. Now, I won't be biased right here, because the ATi team is playing it safe: there aren't going to be any games that support 3.0 shaders until next year, but Nvidia is getting into that market now, so they will attract all the companies looking forward to using this technology. Expect to see great things come from this company.
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710 NS1 Playtester, Forum Moderators
    Actually, Jim... you've got that a bit backward. The GF6 scored LOWER overall, excepting one or two games, which it ran ~30 frames faster in. The rest ran anywhere between 5 and 100 fps faster on the ATi card. nVidia is still second-best, overall. And still takes up TWO slots, excepting their budget card. :D

    And as to the person complaining that nVidia was using 8x FSAA as opposed to ATi's 6x... the problem is, ATi's 6x FSAA is visually superior to nVidia's 8x mode. And STILL runs faster. You do an IQ (image quality) comparison, and the Radeon is achieving a more complete AA solution, even at 6x versus 8x. It's truly sad how poorly nVidia performs under *quality* conditions, i.e. when it isn't cutting color precision, disabling the end-user's ability to override 'optimizations' (such as the 'hidden' color precision cuts that make the game look like ****), and doing improper anisotropic filtering.


    As for the driver issue: remember, you'll likely never see the drivers they used. They'll be improved all around by the time they go fully public. But then again, that could be a downside... certain companies HAVE recently tried to cheat by putting static clip planes into certain benchmarks (*COUGH*), to try yet another underhanded tactic to get back the 'top speed' crown, then removed them when the beta drivers went public and they were caught. But only after strongarming the benchmarker to try and force 'em to say that pulling that crap was legal.
    Let's see... cut color calculations, not allowing the user to turn off 'optimizations', cheating at benchmarks, inefficient operation, requiring a huge-**** power supply, crappy AA/Aniso performance, crappy (comparative) IQ, taking up two slots... versus raw, beautiful power, using only one molex connector.

    Well, I made my pick a while back. Those who feel like staying with nVidia... well.. hope you have the cash to burn. Because that's essentially what you're doing with it.
  • Venmoch Join Date: 2002-08-07 Member: 1093 Members
    Seriously guys, stop the frikkin point scoring. It's doing my damn nut in.

    Each card will have an advantage over the other.

    I don't know what they are but there will be.

    Seriously you all look like a bunch of whiners going on about "olo this one goez fastar!" or "olo this one takes up less space" or whatever.

    So what do you buy a graphics card for? Playing games, right? Well, in that case PLAY THE FRIKKIN GAMES INSTEAD OF BITCHING ABOUT WHAT CARD IS BETTAR! And anyway, who needs such a high FPS? The human eye can only see a max of 30 FPS; any more is just a stupid waste of time.
  • Beast Armonkyi Join Date: 2003-04-21 Member: 15731 Members, Constellation
    QUOTE (Venmoch @ May 5 2004, 09:17 AM): [quoted in full above]
    Seconded.
    I would like to say though, the max FPS you can really get anyway is your monitor's refresh rate... unless you want choppy frames. o.o
    As far as I know, anyway.
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710 NS1 Playtester, Forum Moderators
    Not if you're doing stereoscopic rendering. That essentially cuts the framerate in half, as it has to draw an image for each eye. :)

    And the Radeon is faster, takes up less space, draws less power, runs cooler, and gives a better picture quality. There's no quibbling on that part, aside from one or two games. And even then, that's ONLY on the fps rate. The Radeon still spanks the GF6 for image quality. :)

    Venmoch, that 30fps figure assumes film. Which, as with most analog media, is very forgiving... it doesn't capture crisp single frames. It captures the blurring of motion. Which is essentially what fools the eye. Video cards don't do that blur. They give crisp frames... it's the same kind of problem you get with a digital video camera. Once you get up to 60-80fps though, THEN it gets more difficult to tell.

    However, those fps ratings aren't necessarily saying that you're going to be RUNNING it at that speed. It's a benchmark of how powerful the card is, i.e. how well it handles complexity under differing conditions and load.
    What it says is that with the new Radeon, you can run almost all current games at 1600x1200x32, with max FSAA and anisotropic filtering, and STILL get silky-smooth motion.
  • Creepie Join Date: 2003-02-19 Member: 13734 Members
    The new cards from both manufacturers will bring existing prices down. Good news for the upcoming Doom3/HL2/Stalker/etc. releases.
  • CommunistWithAGun Local Propaganda Guy Join Date: 2003-04-30 Member: 15953 Members
    QUOTE (404NotFound @ May 4 2004, 04:56 PM): "I, for one, am glad to have two this close in competition... can we say 'capitalism in action?'"
    Not yet, we need the starving children :(
  • TacOne Join Date: 2002-11-05 Member: 7070 Members
    Damn! Now I've GOT to get me an X800XT :( :p

    Now the only thing I need is money...


    Better go to the bank [pumps shotgun]
  • TychoCelchuuu Anememone Join Date: 2002-03-23 Member: 345 Members
    QUOTE (dirtygabbsnevada @ May 5 2004, 02:24 AM): "and why do they still AA the whole image and not just the edge of the polygon, similar to how the Parhelia works"
    I sure as heck wish I knew, but I have a feeling that only doing it on the edge would take a lot of recoding that nobody wants to do.
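    A minimal toy sketch of the full-scene approach (Python, illustrative only, not any vendor's actual pipeline): supersampling just renders at a higher resolution and averages blocks of pixels back down, so it needs no knowledge of where the polygon edges are. That's exactly why the whole image gets the treatment; an edge-only scheme like the Parhelia's needs the hardware to identify edge fragments first.

```python
# Toy sketch of ordered-grid supersampling (2x2 downsample). The "scene" is a
# hard diagonal edge; the AA pass never needs to know where that edge is.

def downsample_2x(hires):
    """Average each 2x2 block of the high-res render into one output pixel."""
    out = []
    for y in range(0, len(hires), 2):
        row = []
        for x in range(0, len(hires[0]), 2):
            block = (hires[y][x] + hires[y][x + 1] +
                     hires[y + 1][x] + hires[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# Render the edge at twice the target resolution: 1.0 above it, 0.0 below.
hires = [[1.0 if x > y else 0.0 for x in range(8)] for y in range(8)]
for row in downsample_2x(hires):
    print(row)   # edge pixels come out as 0.25 / 0.5 / 0.75: the smoothing
```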
  • coil Amateur pirate. Professional monkey. All pance. Join Date: 2002-04-12 Member: 424 Members, NS1 Playtester, Contributor
    "Humans can only see 30fps max."

    True, and not true.

    Film played at less than 30fps looks choppy. A still image from a film will be blurry, as it captures motion. It's like taking a photograph while accidentally moving the camera. When you put these 30 motion-blurred frames per second together, the human eye sees fluid movement. Increasing the FPS will not increase quality by any discernible amount.

    Frames in a video game are always static images. Screenshots contain no motion blur. To my knowledge no video card or engine has yet effectively produced motion blur. Therefore, 30fps is actually choppy in our eyes because there is no blur to round out the images.

    Imagine you're watching a game running at 60fps. You can only see 30fps. So what do you see? For every frame *you* can see, what you actually see is a blur of two frames. Motion blur, produced by your eye. Run at 90fps, and each frame your eye sees is a blur of *three* frames. More effective motion blur.
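    A toy model of that last point, assuming (purely for illustration) that the eye blends everything shown within one 1/30 s "perceived" frame: the higher the render rate, the more sub-positions of a moving object get averaged into each perceived frame, which is the effective motion blur.

```python
# Toy model: a dot crosses `width` cells in one second. Group the rendered
# frames into 1/30 s "perceived" frames and average where the dot was within
# each group. One rendered frame per perceived frame = one crisp position;
# three rendered frames per perceived frame = a smear over three positions.

def perceived_frames(render_fps, perceive_fps=30, width=360):
    group = render_fps // perceive_fps        # rendered frames per perceived frame
    out = []
    for start in range(0, render_fps, group):
        cells = {}
        for f in range(start, start + group):
            pos = f * width // render_fps     # dot position in this rendered frame
            cells[pos] = cells.get(pos, 0.0) + 1.0 / group
        out.append({p: round(w, 2) for p, w in cells.items()})
    return out

print(perceived_frames(30)[1])   # {12: 1.0}                      -- crisp
print(perceived_frames(90)[1])   # {12: 0.33, 16: 0.33, 20: 0.33} -- "blurred"
```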
  • aaarrrgh Join Date: 2003-10-20 Member: 21812 Members
    QUOTE (Talesin @ May 5 2004, 01:40 AM): "Well, I made my pick a while back."
    And by this single remark, you've admitted to being biased towards one company. How can anyone base their purchase on advice from someone like that? I'm all for these threads, but if people don't bring objective views into them (something you haven't done in the past either when it comes to graphics cards, at least on IRC), the discussions will be nothing but cesspools of hate and fanboyism.

    And for this post, I will take a stance defending Nvidia, not because I prefer it (it's too early for that yet; drivers and price are both factors important to me), but simply because the discussions have been littered with oddities and factual mistakes that I'm incredibly curious about (and I promise not to make any hasty conclusions; contact me again when both cards are on the market and aren't using beta drivers, and I'll be able to make a fair comment). Again, I think these have more to do with one's bias than anything else, but I may be mistaken. Given the fact that I study, and therefore don't have a lot of money, I will focus primarily on the cheaper models. That'd make more sense to 90% of you in here, as I bet loads of you don't have a lot of money either. Thirdly, please correct any factual mistakes I make, as I'm not an expert by any means. Lastly, I'll base anything I say on Anandtech's tests. To me, they stand out as the smartest and fairest of the reviewers. And unlike Tom's Hardware, they aren't littered with obvious factual mistakes (their Far Cry 1.1 / GF6800 coverage springs to mind). And opposite HardOCP, they compare apples to apples. While HardOCP's system does make sense, it's also easily confusing, which is the reason I won't even touch it here.

    General misconceptions:
    - First up, only the GF6800 Ultra (I think; bloody naming schemes) takes two Molex connectors. This is their top card, Nvidia's equivalent of the 9800 XT. This is the card that only the select elite will end up buying. So if you're looking for a top-of-the-line card, please ignore this post.

    - ATI has the faster cards. What? Go read Anandtech's tests again. They're almost equal. ATI has the lead in a couple of tests (Far Cry, Homeworld 2 (a big one here), EVE, FFXI, UT2K4 with AA on, etc.), but so does Nvidia (UT2K4 without AA, that F1 game, X2 at higher resolutions, Jedi Knight, Wolfenstein, NWN). It's interesting to see that several of these games aren't typical benchmark games, but in my opinion that's for the better. Game performance varies, and as such it's important to test on a wide array of POPULAR games. Which they've done.

    What's obvious is that Nvidia has the lead in OpenGL games (by a larger margin), whereas ATI leads in the DX9 games (slightly). Given that I can't think of any OpenGL games coming out in the near future (Doom III excluded), I'd say that holding the DX throne is naturally more important. So ATI wins this one. At least until final drivers are released.

    Personally, I think that each card has one notable problem. In ATI's case, it's the lack of PS 3.0. While we probably won't see any games taking proper advantage of this for several months, it's a setback, the same way the previous generation of Nvidia cards was limited. Nothing more, nothing less.

    Nvidia's obvious problem is the power consumption. A 400W PSU might be somewhat expensive, but then again, the 9800 Pro in my mate's machine won't run properly without 350W. Still, I imagine it gets expensive if you're forced to upgrade that as well.

    Likewise, I think that each card has several obvious advantages. Nvidia, unlike ATI, has a $299 card. While it may be slower than any of the other cards, it's still this generation's Radeon 9600. We'll have to wait for benchmarks and power consumption specs before a final comment can be made on this. In short, it's the card that might potentially offer the best performance per price.

    But as I've mentioned, there are still two factors needed before anyone can even claim that you're blowing your money on card xx or yy: price and drivers. Nvidia has been known to pump out new and improved drivers at a much faster rate in the past. But given the new and close competition, I expect ATI to step up their game and at least offer the same.

    To a similar extent, it's important to weigh the price. $30-$40, possibly even $50, might not seem like much, but it is to many, including me.

    My conclusion? Anyone deciding yet is either a fanboy or really, really rich (in which case they should probably go for ATI's offering). But for everyone else: keep a cool head and give each card a month to mature. It's for the better.
  • DragonMech Join Date: 2003-09-19 Member: 21023 Members, Constellation, Reinforced - Shadow
    QUOTE (aaarrrgh @ May 5 2004, 12:51 PM): "[quoting Talesin: 'Well, I made my pick a while back.'] and by this single remark, you've admitted to being biased towards one company."
    Duh. That's because we think the product (or in ATI's case, products) they make outperforms the competition. I like ATI because their cards outperform Nvidia's in a wide range of areas. Their cards handle AA/AF far better, and seem to get higher FPS on average. Also, their engineering is far more efficient: they can, with only 160 million transistors, rival and even surpass the performance of a 222 million transistor card.
  • aaarrrgh Join Date: 2003-10-20 Member: 21812 Members
    edited May 2004
    QUOTE (Dragon_Mech @ May 5 2004, 02:11 PM): [quoted in full above]
    And that's why you made up your mind 'a while back', obviously before the card was even benchmarked?

    ...ok...

    Oh, and go read those Anandtech benchmarks again. They're as equal as it gets when it comes down to framerates.

    Again, I'm defending Nvidia, but only because it seems to me that people are showing an obvious bias. As to the AA, I've never studied this intensely. I've figured that if my card does 1600x1200, then it'd be good enough for me. Perhaps you could provide a link or two to a non-biased source that discusses/shows/displays/whatever this, as it is a valid point.
  • DragonMech Join Date: 2003-09-19 Member: 21023 Members, Constellation, Reinforced - Shadow
    edited May 2004
    Nooo... I made up my mind when I saw the GF6 get 22.4 FPS vs the X800's 80+ FPS.

    For some good unbiased benchmarks, try Tom's Hardware. A link to the X800 article is provided at the first post of this thread.
  • aaarrrgh Join Date: 2003-10-20 Member: 21812 Members
    Again, given Tom's Hardware's track record, I'm completely unwilling to trust them. Sadly.
  • Nemesis_Zero Old European Join Date: 2002-01-25 Member: 75 Members, Retired Developer, NS1 Playtester, Constellation
    Friendly note from a grumpy admin: Keep it civil, folks.
  • DragonMech Join Date: 2003-09-19 Member: 21023 Members, Constellation, Reinforced - Shadow
    edited May 2004
    While I've never heard anything but positive things about TH, you can just as easily go to any of the other five review sites linked at the top of this thread.
  • Cereal_KillR Join Date: 2002-10-31 Member: 1837 Members
    I'm already counting the Molexes in my case (don't want it to blow up on me). And my current card doesn't even have an aux power connector.

    Oh, and quality > raw speed atm, for such a close difference.
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710 NS1 Playtester, Forum Moderators
    aaarrrgh, expect a $199 part out of ATi once the top-end buying frenzy has settled. Also, doesn't nVidia recommend a 480W PSU, not just a 400? Those get pretty expensive pretty quick. Tack on an additional $120-250 depending on where you go, plus the hassle of swapping your PSU out.

    Though I do have to agree with you about Tom's. They've been heavily biased in the past. The funny part is... they've always been nVidia/Intel-biased. Interesting that either their bias has changed, or they HAVE no negative statistics to put up. :)
    Also, note that Anandtech was using an FIC motherboard. Not a really large point of contention, but they aren't known for the best design. MSI, followed closely by ASUS, hold that crown. Hell, I believe Soyo comes in before FIC in that regard.

    Aaarrrgh, I made my decision when I saw that the nVidia solution would hog space in my machine, pump out unreasonable amounts of heat, and eat two DEDICATED Molex connectors. You'd expect that for something that greedy, it'd blow everything else out of the water.
    Benchmarks only strengthened this feeling, showing that the GF6 is markedly slower, on the whole.

    And though HardOCP may not seem to compare 'apples to apples', remember that in the AA/AF tests, the nVidia solution is LESS effective at maximum settings than the ATi, though ATi only uses 6x as their max AA setting, and 16x for AF.
    Welcome to marketroidism. The idiot will pick the card that can do '8x FSAA', even if it looks like crap compared to the competitor's 6x FSAA. It has a bigger number, after all! Do research, become an educated consumer. Don't fall into the semantics trap.
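    A toy illustration of that trap (the sample positions below are hypothetical, not ATi's or nVidia's real patterns): an ordered grid with eight samples but only two distinct vertical offsets can produce just three coverage levels on a near-horizontal edge, while a sparse six-sample pattern with six distinct offsets produces seven, i.e. smoother edge gradients from fewer samples.

```python
# Toy model (hypothetical sample positions, not the vendors' real patterns):
# count the distinct coverage levels each pattern can produce as a horizontal
# edge sweeps through one pixel. More distinct levels = smoother edge gradients.

def coverage_levels(samples, steps=1000):
    """Sweep an edge y = t across the unit pixel; record the fraction of
    sample points below it at each position."""
    levels = set()
    for i in range(steps + 1):
        t = i / steps
        below = sum(1 for (_, sy) in samples if sy < t)
        levels.add(below / len(samples))
    return sorted(levels)

# Ordered 4x2 grid: eight samples, but only two distinct vertical offsets.
ordered_8x = [((i + 0.5) / 4, (j + 0.5) / 2) for i in range(4) for j in range(2)]

# Sparse 6-sample pattern: every sample sits at its own vertical offset.
sparse_6x = [(0.10, 0.05), (0.55, 0.20), (0.90, 0.40),
             (0.25, 0.60), (0.70, 0.80), (0.40, 0.95)]

print(len(coverage_levels(ordered_8x)), "levels from the ordered 8x grid")    # 3
print(len(coverage_levels(sparse_6x)), "levels from the sparse 6x pattern")   # 7
```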
  • Jim_has_Skillz Join Date: 2003-01-19 Member: 12475 Members, Constellation
    QUOTE (Dragon_Mech @ May 5 2004, 12:58 PM): "Nooo... I made up my mind when I saw the GF6 get 22.4 FPS vs the X800's 80+ FPS. For some good unbiased benchmarks, try Tom's Hardware. A link to the X800 article is provided at the first post of this thread."
    Please, Dragon Mech, that benchmark is not even an equal comparison to the ATi's benchmark. True, the ATi card does perform better with antialiasing on, but that's not the ONLY thing you should look at in a card; if perhaps that's all you care about, then go get an ATi card.

    For each card's sake, I suggest this thread just gets locked and reopened when both cards come out. I have seen way too many pure hate threads against the opposing video card when each card is merely about the same as the other.

    Anyways, I am not gonna post in this thread anymore. Even if you post unbiased information, no one who has a set mind will look in your direction.
  • Cereal_KillR Join Date: 2002-10-31 Member: 1837 Members
    I remember THG saying something like "ATI being like AMD and Nvidia the Intel" back in the 8500 days, relating to how they were the rising "star".

    I don't feel they're being so biased at all; then again, I never read only one review to base my thoughts on ???
  • aaarrrgh Join Date: 2003-10-20 Member: 21812 Members
    edited May 2004
    QUOTE (Talesin @ May 5 2004, 03:21 PM): "aaarrrgh, expect a $199 part out of ATi once the top-end buying frenzy has settled. Also, doesn't nVidia recommend a 480W PSU, not just a 400? Those get pretty expensive pretty quick. Tack on an additional $120-250 depending on where you go, plus the hassle of swapping your PSU out."
    That's only for the Ultra Extreme edition (or whatever it's called; I give up). Tests (although I cannot honestly remember from where; if it WAS Tom's, then shoot me) have shown that the 6800 "only" consumes 50-60W more than the Radeon.

    Also, I guess I'll just follow "jim has skillz" and be back when the market has matured a bit. Until then, keep your head up, and don't inhale all that glue!
  • DragonMech Join Date: 2003-09-19 Member: 21023 Members, Constellation, Reinforced - Shadow
    edited May 2004
    QUOTE (Jim has Skillz @ May 5 2004, 02:24 PM): "[quoting my post above] Please Dragon Mech, that benchmark is not even equal to the ATi's benchmark. True, the ATi card does perform better with antialiasing on, but that's not the ONLY thing you should look at in a card; if perhaps that's all you care about, then go get an ATi card."
    Of course that's not the only thing I'm looking at. I'm looking at (in no particular order):

    1) Performance. Mostly FPS at a given resolution/color depth. Etc.
    2) Size / weight. (The next computer I get is going to be a SFF, so size matters - the smaller the better.)
    3) Power. SFF PCs don't have a lot of room for huge power supplies. I don't think a standard SFF PC could run a GF6U since it takes so much power.
    4) Frills. Stuff like AA & AF, smoothshading, etc.
    5) $$$ - da mula!

    In most of those areas, ATI takes the cake.
  • Epidemic Dark Force Gorge Join Date: 2003-06-29 Member: 17781 Members
    edited May 2004
    *NUKED.* Tom's doesn't appreciate bandwidth theft either.
  • Cereal_KillR Join Date: 2002-10-31 Member: 1837 Members
    Actually, whoever wants an ATI: get an ATI.
    Whoever wants an Nvidia: get an Nvidia.

    I know I'm going to get an ATI for my own personal reasons, but if people want a 6800, they can go ahead. My father uses a Parhelia (Matrox) and swears it is the best graphics card out there. I can't disagree, but I wouldn't use a Matrox given my standards.
  • DragonMech Join Date: 2003-09-19 Member: 21023 Members, Constellation, Reinforced - Shadow
    edited May 2004
    QUOTE (Epidemic @ May 5 2004, 02:38 PM): "Dragonmech, I have one thing to say, suuuuucccckkk it bbbabbbyyy :p"
    1) THG doesn't allow bandwidth theft.
    2) I'll add this when I find out what picture you were linking to.

    [EDIT] That would be the 3rd CoD screen (not available ATM), but I remember that the GF6 outdid the X800 by ~20 FPS in CoD. So what? 99% of monitors can't handle above 80 Hz at UXGA resolution. I said that before. Most of the CoD tests were at 100-110+ FPS.

    [EDIT 2] Ok, now that I can see the image, I have to ask again: so what? Virtually no monitor on the market (if any) can display that many FPS. A game makes a poor benchmark when you're comparing incredibly high FPS; a 25 vs. 45 FPS comparison is a heck of a lot more useful. All that your image tells me is that CoD is an easy game to run.
  • Epidemic Dark Force Gorge Join Date: 2003-06-29 Member: 17781 Members
    THG can kiss my ****.
    Anyway, yes, Nvidia has problems with 8x FSAA (or something). Perhaps it will be ironed out with a new driver; you saw how much the new drivers jumped from the old drivers?
    And Tom says: QUOTE: "NVIDIA's 8xS mode, a combination of SuperSampling and MultiSampling, offers superior quality. Unfortunately, it also incurs a major performance hit, bringing the frame rates crashing down."
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710 NS1 Playtester, Forum Moderators
    edited May 2004
    Also... don't tell anyone, but the RX800 XT PE only uses 56W on its own. 60W more for the 6800 Ultra means more than DOUBLE the power consumption (56W plus roughly 60W is about 116W, just over twice as much). Quite a bit more than 'just a little'. And the only 6800 I've seen that has a single Molex is the 'budget' version. The standard still requires two.

    As well, I'm holding comment on the TAA tech until I see it for myself. But personally the outlook is good, given that ATi realized how to effectively double their AA sampling rate, so long as the framerate is high enough (a rough sketch of the idea follows at the end of this post). :) I'm just happy that I'll be able to play around with it on my R9500 Pro, given that it's being backported completely as a function of the driver software. :D

    When you get into running tests at 1600x1200x32 and still can't declare a clear victor, quality is all you have left. ATi pumps the frames AT quality, rather than cutting it down to keep up, as nVidia has been forced to do. They pump them more effectively still under AA/AF. They stay in their own damn slot, as a bonus.


    (edit) And 'superior quality' is comparing it against the FX5950's 8xFSAA. They still fall behind ATi, even with that 'advancement'. :) (/edit)
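    A rough sketch of the Temporal AA idea mentioned above, with made-up sample offsets (not ATi's actual patterns): each frame only pays for two samples, but because even and odd frames use different offsets, the eye's blend of two consecutive frames effectively sees four, producing coverage levels neither single frame could show on its own. That's also why it needs V-Sync and a high enough framerate: at low fps the alternation shows up as shimmer instead of blending.

```python
# Hedged sketch of temporal AA: two hypothetical 2-sample patterns, alternated
# between even and odd frames. (sx, sy) positions are illustrative only.

PATTERN_A = [(0.25, 0.125), (0.75, 0.625)]   # used on even frames
PATTERN_B = [(0.75, 0.375), (0.25, 0.875)]   # used on odd frames

def pixel_coverage(edge_y, samples):
    """Fraction of sample points lying below a horizontal edge at height edge_y."""
    return sum(1 for (_, sy) in samples if sy < edge_y) / len(samples)

edge_y = 0.7                            # an edge cutting through this pixel
a = pixel_coverage(edge_y, PATTERN_A)   # 1.0 -- one 2-sample frame can only say 0, 0.5 or 1.0
b = pixel_coverage(edge_y, PATTERN_B)   # 0.5
print("eye's blend of two consecutive frames:", (a + b) / 2)   # 0.75
```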