<!--QuoteBegin--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> </td></tr><tr><td id='QUOTE'><!--QuoteEBegin-->in game benchmarks are the real key... i never believe any of this 3dmark crap. <!--QuoteEnd--></td></tr></table><span class='postcolor'><!--QuoteEEnd-->
Obviously they think enough people DO care about 3DMark tests to warrant putting their reputation on the line <!--emo&:p--><img src='http://www.unknownworlds.com/forums/html/emoticons/tounge.gif' border='0' style='vertical-align:middle' alt='tounge.gif'><!--endemo-->
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
Okay... here we go.
Yes, nVidia cheated on the benchmark. Wanna know why they didn't think they'd be caught? <b>They didn't own a full version of 3DMark2003 at the time.</b> They didn't /know/ you could move the camera off the rails. They were caught flat-out, bald-faced cheating to try and market their cr*ppy cards as 'fast'.
That graph is amazing... 10fps? Wow. They must have tested their card with the full version ONLY, and dumped the old tech demo on the R9800. As well as used the new Cats, which are known to have problems. When running with Catalyst 3.1 drivers, the GFFX 5900 gets STOMPED by the R9800 Pro by 10-20%. And that's without the Pro version, which will be coming out in a month or two with DDR-2 and a higher core speed.

In addition, I hope they shrink the die process on the 9800... then they'd be able to match the core speed of the GFFX, and MORE THAN DOUBLE its shoddy performance. nVidiot had, at one point, been decent. However, they fell behind and are relying on their marketers' smear campaigns, outright CHEATING, and brute-force, huge-*ss-fan extreme core speeds to try and eke out a few more bucks.

And shall we even go into the GFFX's lackadaisical excuse for FSAA? If you're spending THAT much for a card, you'd best expect to run it super-high and super-pretty.
Oh.. and shall we see how long it takes THIS card to get to store shelves, when the R9800 Pro was there before ANY of the GFFXes were even shipped? <!--emo&:D--><img src='http://www.unknownworlds.com/forums/html/emoticons/biggrin.gif' border='0' style='vertical-align:middle' alt='biggrin.gif'><!--endemo-->
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
nVidia has Dawn... but they don't look to have fully worked out the subcutaneous glow problem. <!--emo&;)--><img src='http://www.unknownworlds.com/forums/html/emoticons/wink.gif' border='0' style='vertical-align:middle' alt='wink.gif'><!--endemo-->
Though I guess if you want T&A, nVidia has an argument for the lowbrow. Personally, I'll never understand the draw.
Marik_Steele | To rule in hell... | Join Date: 2002-11-20 | Member: 9466 | Members
edited May 2003
The way I see it, nVidia fell back in the tech race when they spent time and human/financial resources making the video card systems for the Xbox and the nForce chipset. Both fulfill their respective purposes. But the result is that ATI has (to my knowledge) been concentrating only on video cards for the past five years, and understandably has a superior product.
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
I still can't help but find it funny that a mobo with an nForce or nForce2 chipset has problems when running an nVidia graphics card. If they can't get two products that they themselves manufacture to work together properly, much less seamlessly... it just pointedly shows that they have severe issues.
realityisdead | Employed by Raven Software after making ns_nothing | Join Date: 2002-01-26 | Member: 94 | Members, NS1 Playtester, Contributor
edited May 2003
What kind of problems, exactly, Talesin? I've seen you mention it before, but really never heard of anything about it anywhere else. Are there some sources you can point me to that demonstrate or confirm this? Like I mentioned in a previous post a bit ago, I plan to put together a computer eventually here, and had/have the intention of pairing an nForce2 based motherboard with something in the GeForce line of video cards. If it's an actual problem, I'd certainly like to look into it before throwing quite a bit of money down the drain, heh.
On topic, does anyone know of a comparison table/benchmark setup which shows how the 5900 Ultra stands in comparison to something earlier, but once top-of-the-line, like a GeForce4 Ti 4600? Or an idea of how a GeForce4 would hold up in next-gen DirectX 9 games?
Also, does anyone have a brand name preference regarding nVidia cards? Leadtek, PNY? Does it matter? :)
<!--QuoteBegin--ken20banks+May 20 2003, 09:45 PM--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> (ken20banks @ May 20 2003, 09:45 PM)</td></tr><tr><td id='QUOTE'><!--QuoteEBegin--> Also, does anyone have a brand name preference regarding nVidia cards? Leadtek, PNY? Does it matter? <!--emo&:)--><img src='http://www.unknownworlds.com/forums/html/emoticons/smile.gif' border='0' style='vertical-align:middle' alt='smile.gif'><!--endemo--> <!--QuoteEnd--> </td></tr></table><span class='postcolor'> <!--QuoteEEnd--> Well, some brands make better-looking cards, more efficient cooling options (leave the leaf-blower jokes out), better onboard RAM on the card, etc. It all depends on the manufacturer. Maybe nVidia has fallen behind for a while, that may be true, but their cards work great for me and I wouldn't want anything less. Yes, I've used a Radeon before and I wasn't impressed one bit: more trouble than I want. Biased or whatever, to me a good GeForce is all I need to play my games, regardless of the pointless benchmarks out there that exist just to sell more games.
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
Unfortunately I can't link any articles as I'm a bit lazy. The long and short of it is that at least until recently (I'm not certain if they've figured out why and fixed it or not.. I avoid shoddy hardware like the plague) installing an nVidia GeForce of any revision on a motherboard carrying the nForce or nForce2 chipsets would result in unexplained errors and general instability in the system. Some machines work fine with it. Others have no end of problems, to the point of being forced to either return the mobo for another model, or swap out for a different brand of video card... in either case resulting in the errors vanishing instantly.
I'd recommend the KT400 chipset, myself. Through personal experience, they're stable, fast, and the few VIA-based bugs are known and have workarounds available... none of which cause problems to the extent of crashing the entire machine, or freezing it solid. It's used in top-grade boards... the ASUS A7V8X (which I now own, and have put together two Alienware-killer machines based upon) and the Soyo Dragon Platinum/Platinum Deluxe, which a number of my friends swear by (as opposed to at, in the case of nForce boards).
<!--QuoteBegin--Marik_Steele+May 20 2003, 11:18 PM--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> (Marik_Steele @ May 20 2003, 11:18 PM)</td></tr><tr><td id='QUOTE'><!--QuoteEBegin--> But the result is that ATI has been (to my knowledge) only concentrating on video cards for the past five years, and understandably have a superior product. <!--QuoteEnd--> </td></tr></table><span class='postcolor'> <!--QuoteEEnd--> ATI helped make the GameCube's onboard graphics chip.
<!--QuoteBegin--Venmoch+May 21 2003, 12:56 AM--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> (Venmoch @ May 21 2003, 12:56 AM)</td></tr><tr><td id='QUOTE'><!--QuoteEBegin--> <!--QuoteBegin--Marik_Steele+May 20 2003, 11:18 PM--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> (Marik_Steele @ May 20 2003, 11:18 PM)</td></tr><tr><td id='QUOTE'><!--QuoteEBegin--> But the result is that ATI has been (to my knowledge) only concentrating on video cards for the past five years, and understandably have a superior product. <!--QuoteEnd--></td></tr></table><span class='postcolor'><!--QuoteEEnd--> ATI helped make the GameCube's onboard graphics chip. <!--QuoteEnd--> </td></tr></table><span class='postcolor'> <!--QuoteEEnd--> I know. When I was buying my GameCube I almost didn't get it when I saw that sticker. Really, I had put it down and was walking towards the door. But then I thought "Now now, Metroid Prime. Think of Samus! You need her!" so I broke down and bought one anyway. :)
<!--QuoteBegin--Talesin+May 20 2003, 03:08 PM--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> (Talesin @ May 20 2003, 03:08 PM)</td></tr><tr><td id='QUOTE'><!--QuoteEBegin--> Though I guess if you want T&A, nVidia has an argument for the lowbrow. Personally, I'll never understand the draw. <!--QuoteEnd--> </td></tr></table><span class='postcolor'> <!--QuoteEEnd--> Are you referring to the lovely Dawn as merely "T&A"? Knave! I should strike you down where you stand.
<!--QuoteBegin--Zel+May 15 2003, 01:14 AM--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> (Zel @ May 15 2003, 01:14 AM)</td></tr><tr><td id='QUOTE'><!--QuoteEBegin--> so what if ATI packs more megahertz onto their card? that makes them spiffier in many benchmark softwares, but heck, if this card performs as that graph says, who cares about specifications?
sadly, i think that graph is fake. <!--QuoteEnd--> </td></tr></table><span class='postcolor'> <!--QuoteEEnd--> *nVIDIA* packs more megahertz into their cards. The GFFX 5900 Ultra is clocked at 450/450 as I recall, and the GFFX 5800 Ultra is clocked at 500/500.
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
edited May 2003
Oh... and a recent development. <a href='http://www.rage3d.com/articles/atidawning/' target='_blank'>Dawn has been cheating on nVidia... with ATI!</a> And she moves more smoothly, even AFTER being passed through a wrapper written by <u>college students</u>, than on the native hardware the demo was written specifically to showcase. XD
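Since several posts here hinge on what that "wrapper" actually is: it's a translation layer sitting between the demo and the driver, catching the vendor-specific calls the demo makes and re-issuing them as calls the other card understands. Here is a minimal conceptual sketch in Python; all class and method names are invented for illustration (the real Dawn wrapper worked at the level of nVidia-specific OpenGL extensions, not anything this simple):

```python
# Conceptual sketch of an API wrapper: expose the interface one program
# expects, but forward each call to a different backend. Every name here
# is hypothetical; this only illustrates the translation-layer idea.

class GenericDriver:
    """Stand-in for the standard API path the other card supports."""
    def draw_standard(self, mesh):
        return f"drew {mesh} via standard path"

class Wrapper:
    """Exposes the vendor-style interface, forwards to the generic one."""
    def __init__(self, backend):
        self.backend = backend
        self.calls_translated = 0  # each forwarded call costs a little overhead

    def draw_with_nv_extension(self, mesh):
        # Translate the vendor-specific call into its standard equivalent.
        self.calls_translated += 1
        return self.backend.draw_standard(mesh)

# The demo thinks it is talking to vendor hardware, but every call is
# being intercepted and re-routed.
demo_driver = Wrapper(GenericDriver())
print(demo_driver.draw_with_nv_extension("dawn_wing_mesh"))
```

The per-call translation overhead is exactly why a wrapped demo running faster is notable: the card on the far side of the wrapper wins despite paying that extra cost on every call.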
I'm going to say this just once, as it so very well applies.
IMHO, there is no need for graphics card fanboys; that's a bit like having fanboys for different types of drafting programs. I think both CAD and Rhino are equally good, but Rhino is more "user-friendly".
<!--QuoteBegin--[2iD]EriC[LdR]+May 15 2003, 03:54 AM--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> ([2iD]EriC[LdR] @ May 15 2003, 03:54 AM)</td></tr><tr><td id='QUOTE'><!--QuoteEBegin--> driver issue ? most likely but its fun taking a stab at ati .. ati fans do this all the time <!--QuoteEnd--> </td></tr></table><span class='postcolor'> <!--QuoteEEnd--> That's because the ATi fans are RIGHT. <!--emo&;)--><img src='http://www.unknownworlds.com/forums/html/emoticons/wink.gif' border='0' style='vertical-align:middle' alt='wink.gif'><!--endemo--> Once Doom 3 is released, newer Catalyst drivers will be out and the 9800 will eat FX 5900s for breakfast. <!--emo&:)--><img src='http://www.unknownworlds.com/forums/html/emoticons/smile.gif' border='0' style='vertical-align:middle' alt='smile.gif'><!--endemo--> NVidia screwed up with their FX line, plain and simple. It took up double the space, tripled the PC's noise, and cost more than the Radeon 9700. Radeon > GeForce, and it always will be, unless ATi decides to hire idiot technicians and NVidia hires better techies. The FX 5900 is pretty much the good player they put on the "special" team to make the rest look better. If NVidia were just a tad smarter, they'd probably have released it as a new brand. Most people are avoiding any video card with "FX" stamped on the box.
A warning for other people:
Don't post on how ATI sucks unless you have RELIABLE evidence.
Don't post on how NVidia sucks unless you have RELIABLE evidence.
Reliable evidence: <a href='http://www6.tomshardware.com/graphic/20030512/index.html' target='_blank'>Click Here</a>
Several basic tests, showing that the fx5900 is indeed the better card at the moment.
Something that makes all these stupid arguments seem all the more stupid:
fx5900 owns the Radeon 9800
Radeon 9900 will wipe the floor with the fx5900
fx6000 will wipe the floor with the Radeon 9900
Radeon 10000 will wipe the floor with the fx6000
Anyone notice a pattern?
<!--QuoteBegin--83457+May 28 2003, 05:42 PM--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> (83457 @ May 28 2003, 05:42 PM)</td></tr><tr><td id='QUOTE'><!--QuoteEBegin--> A warning for other people:
Don't post on how ATI sucks unless you have RELIABLE evidence.
Don't post on how NVidia sucks unless you have RELIABLE evidence.
Something that makes all these stupid arguments seem all the more stupid:
fx5900 owns the Radeon 9800
Radeon 9900 will wipe the floor with the fx5900
fx6000 will wipe the floor with the Radeon 9900
Radeon 10000 will wipe the floor with the fx6000
Anyone notice a pattern? <!--QuoteEnd--> </td></tr></table><span class='postcolor'> <!--QuoteEEnd--> That depends. The 9700, which was much older than the original FX line, owned it. Once the Radeon gets to 10000, nVidia might as well go out of business.
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
edited May 2003
Especially with Tom's (*cough*) slight sellout factor.
And again, as noted. With the 9800 Pro running on the pre-3.4 Catalysts, and the GFFX 5900 running without their cheats, the R9800 Pro lands in the lead. The Dawn demo running FASTER on an ATI card *with* a wrapper in between slowing it down should tell you that. A whopping 15% ahead against an NV30, and still ahead (albeit by a slimmer margin) against an NV35.
nVidia has had their top-of-the-line card beaten, on their own turf, through an interpreter, by ATI.
If THAT isn't enough 'reliable evidence', then what is? They couldn't even beat the R9800 Pro when they WROTE the demo, and were ALLOWED to cheat as much as possible.
AllUrHiveRblong2us | By Your Powers Combined... | Join Date: 2002-12-20 | Member: 11244 | Members
I fail to see how it matters which is better. They are both good, and after playing so many games on a Voodoo 3 for so long, I have realised that the specs don't matter, as long as it works. Arguing about it is pointless. Will they both run HL? Yes. Will either of them explode when under strain? No. Can I afford either? No. Now it's settled.
You guys *did* read John Carmack's Slashdot post about how the benchmarks both companies are claiming they run at are completely and utterly pointless, correct?
<!--QuoteBegin--DuBERS+May 14 2003, 10:47 PM--></span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td><b>QUOTE</b> (DuBERS @ May 14 2003, 10:47 PM)</td></tr><tr><td id='QUOTE'><!--QuoteEBegin--> ATI will release their next gen card and blow NVIDIA away. <!--QuoteEnd--> </td></tr></table><span class='postcolor'> <!--QuoteEEnd--> <!--emo&:D--><img src='http://www.unknownworlds.com/forums/html/emoticons/biggrin.gif' border='0' style='vertical-align:middle' alt='biggrin.gif'><!--endemo--> That goes without saying. It is a game of leapfrog. One releases the newest, latest, and greatest. Then the other does the same. I doubt that is going to end anytime soon.
Talesin | Our own little well of hate | Join Date: 2002-11-08 | Member: 7710 | NS1 Playtester, Forum Moderators
Only problem is, nVidia's been missing the 'jump over' part recently, as they haven't had to really work those leg muscles, resting on their laurels for so long. Instead they've started to land short, and are yelling and spewing **** to try and distract people from that fact.
Comments
I wonder if it's true, lol.
That's only in synthetic benchmarks, though...
<img src='http://www.nvidia.com/docs/IO/3648/SUPP/large02.jpg' border='0' alt='user posted image'>
ATI = pwned.
<b>Do It For Her!</b>
But the ogre will do that for me.
<img src='http://www.nvidia.com/docs/IO/3649/SUPP/large02.jpg' border='0' alt='user posted image'>
nVidia: 2
ATI: 0
She's my honey!
^_^
<img src='http://members.cox.net/doomaniac/meandsamus.jpg' border='0' alt='user posted image'>
Compare: Radeon 9700 Pro, stock clock 375/375. Radeon 9800 Pro, stock clock 380/380.
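To put the stock core clocks quoted in this thread side by side, the gap works out as simple arithmetic (clock speed alone doesn't decide performance, which is rather the point of the benchmark arguments above):

```python
# Core clocks (MHz) as quoted in this thread; the 450/450 and 500/500
# figures were given "as I recall", so treat them as approximate.
clocks = {
    "GeForce FX 5800 Ultra": 500,
    "GeForce FX 5900 Ultra": 450,
    "Radeon 9700 Pro": 375,
    "Radeon 9800 Pro": 380,
}

def percent_faster(a, b):
    """How much higher card a's core clock is than card b's, in percent."""
    return 100.0 * (clocks[a] - clocks[b]) / clocks[b]

gap = percent_faster("GeForce FX 5800 Ultra", "Radeon 9800 Pro")
print(f"{gap:.1f}%")  # -> 31.6%
```

So the FX 5800 Ultra's core runs roughly a third faster than the 9800 Pro's, which is exactly why the in-game and wrapped-demo results discussed above, rather than raw megahertz, carry the argument.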
Thanks for the heads up, Talesin.
nVidia == <span style='color:yellow'><b>*PWEENED*</b></span>