QUOTE (Swiftspear @ Aug 14 2004, 01:15 AM): "The fact that the eye can intelligently process an image flashed for 1/200th of a second does not mean that your eye is refreshing its picture at that FPS."
The whole point of that article is that the brain doesn't have a refresh rate.
It's funny, and yet very sad, watching people argue that one side is better than the other. Whatever nVidia card EEK used to take those screenshots from Doom 3 sure as hell wasn't a 6800 series card. I run D3 at 1280x1024, high quality (8x AF), with all the special effects turned on, and the game runs smooth and looks incredible.
QUOTE (Talesin @ Aug 15 2004, 01:10 PM): "Jim, you are defending a flawed position. Look at the texture and normal corruption on the AR. Look at the HAND. Look at the terrible, horrible shadow-banding that is a tell-tale of the corners nVidia has been forced to cut, just so their fanboys will have some small measure of hope... like a homeless family huddled around a last candle."
I looked at the picture again and it does have flaws, but I have an nVidia and I don't get anything like that, AND I play the game on low. My advice would be to upgrade your drivers, because I don't get anything like that with my card, and it's a GeForce 3...
Talesin, you really shouldn't call people fanboys when you are more... vocal about your allegiances than anyone else here. Your input helped me, and I will be getting an ATI, but you just seem to get too angry. ???
coil (Amateur pirate. Professional monkey. All pance. Joined: 2002-04-12, Member: 424, Members, NS1 Playtester, Contributor)
The "humans can see 30 fps" myth comes from film. Film is shot at 35fps (I think) because it's the least number of fps that produce a smooth, consistant image.
The catch here is the difference between film and computer graphics. If you take full-motion video footage of a guy jumping and then look at a single frame of the reel, that single frame will be blurred. *Every* frame is blurred. The human eye melds the motion-blurred images together and creates a smooth, cohesive movement in your brain. Anything less than 30-35fps, and it starts to look choppy.
Compare that to a computer game. Any screenshot you take has *zero* motion in it - it is a screenshot, a photograph, a still image. This is comparable to stop-motion animation in movies like *Jason & The Argonauts* or *The Nightmare Before Christmas*. The motion is choppy, because there is no blur to fill in the gaps for your brain. Computers make up for it by upping the FPS - at 90 fps, each frame of film's 30 equates to three rendered frames. Your eye blends those three frames, producing a single motion-blurred image for your brain to then meld with the previous and the next one. The result: smoother movement.
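To make that blending arithmetic concrete, here is a minimal sketch in Python (numpy assumed; `render_frame` is a hypothetical renderer for illustration, not anything from Doom 3): averaging several rendered sub-frames per displayed frame approximates the motion blur that film captures for free.

```python
import numpy as np

def motion_blurred_frame(render_frame, t, frame_time, subsamples=3):
    """Average `subsamples` renders spread across one displayed frame."""
    acc = None
    for i in range(subsamples):
        # Sample times spaced evenly inside [t, t + frame_time).
        sub_t = t + frame_time * i / subsamples
        img = render_frame(sub_t).astype(np.float32)
        acc = img if acc is None else acc + img
    return (acc / subsamples).astype(np.uint8)

# Toy usage: a "renderer" that draws a fast-moving white pixel on a 1x8 strip.
def render_frame(t):
    img = np.zeros((1, 8, 3), dtype=np.uint8)
    img[0, int(t * 240) % 8] = 255
    return img

# Three sub-frames smear into one blurred frame, e.g. [85 0 85 0 0 85 0 0].
print(motion_blurred_frame(render_frame, t=0.0, frame_time=1/30)[0, :, 0])
```

With subsamples=3 and a 1/30 s frame time, this is exactly the "90 rendered fps shown at 30" case described above.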
At some point in cinema, the technique of stop-motion animation reached a technical level where motion-blurring was possible (the technique was called go-motion; I believe the movie was *Dragonslayer*, or something like that). Eventually, we can hope, video cards will reach the same level of ability and then fps won't be the Holy Grail goal that it is today.
QUOTE (Talesin @ Aug 15 2004, 04:10 PM): "Jim, you are defending a flawed position. Look at the texture and normal corruption on the AR. Look at the HAND. Look at the terrible, horrible shadow-banding that is a tell-tale of the corners nVidia has been forced to cut, just so their fanboys will have some small measure of hope... like a homeless family huddled around a last candle."
I ask once again: what specific video card was that taken on? Not a single review site has mentioned these kinds of IQ problems with the 6xxx series.
Are you THAT upset that even the top of the line ATIs were *murdered* in Doom 3, even after the image fudging that Carmack himself pointed out?
QUOTE (Talesin @ Aug 15 2004, 04:10 PM): "Jim, you are defending a flawed position. Look at the texture and normal corruption on the AR. Look at the HAND. Look at the terrible, horrible shadow-banding that is a tell-tale of the corners nVidia has been forced to cut, just so their fanboys will have some small measure of hope... like a homeless family huddled around a last candle."
I somehow don't see a problem: http://rb6rs.homestead.com/files/doom3.png
That's 1024x768 @ high quality, no AA. I've got a Leadtek 5900XT.
QUOTE: "btw, apparently the x600 line will suxor (hearsay from a friend), as will the x300 series."
My friend has a Dell with an x300. The thing sucks. It couldn't even push Q3 above 100 or so FPS, while my 5900 is right at 300fps with the same settings.
QUOTE (coil @ Aug 16 2004, 04:28 AM): "The 'humans can see 30 fps' myth comes from film. Film is shot at 35fps (I think) because it's the lowest framerate that produces a smooth, consistent image."
That's wrong, I think. NTSC video is shot at 29.97 FPS, NTSC film at 23.976 FPS, and PAL at 25 FPS, if I remember correctly. With motion blurring, motion looks smooth even at 15 FPS, as long as it's fairly slow.
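For what it's worth, those odd NTSC numbers aren't arbitrary: they are the nominal 30 and 24 fps rates scaled by 1000/1001, a correction introduced with NTSC color broadcast. A quick check in Python:

```python
# NTSC rates are the nominal rates multiplied by 1000/1001.
for nominal in (30, 24):
    print(f"{nominal} fps -> {nominal * 1000 / 1001:.3f} fps")
# 30 fps -> 29.970 fps
# 24 fps -> 23.976 fps
```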
QUOTE (Wheeee @ Aug 17 2004, 12:33 AM): "btw, apparently the x600 line will suxor (hearsay from a friend), as will the x300 series."
My friend says that there is a million dollars buried under my house. I'm thinking of digging up the floor and pulling it out...
QUOTE (Swiftspear @ Aug 17 2004, 01:27 AM), quoting Wheeee's x600 post: "My friend says that there is a million dollars buried under my house. I'm thinking of digging up the floor and pulling it out..."
Only dead corpses under my floorboards.
Might sound nubish, but how the hell are you guys getting your FPS to show up in Doom 3? I've tried a bunch of commands in the console, but I'm unfamiliar with Doom 3 commands, so...
QUOTE (Invader Scoot @ Aug 17 2004, 09:54 PM): "Might sound nubish, but how the hell are you guys getting your FPS to show up in Doom 3? I've tried a bunch of commands in the console, but I'm unfamiliar with Doom 3 commands, so..."
com_showfps 1
QUOTE (Invader Scoot @ Aug 17 2004, 09:54 PM): "Might sound nubish, but how the hell are you guys getting your FPS to show up in Doom 3? I've tried a bunch of commands in the console, but I'm unfamiliar with Doom 3 commands, so..."
com_showfps 1
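If you don't want to retype that every session, idTech 4 games will also pick it up from a config file or the command line. A sketch, assuming a default Doom 3 install (verify the path on yours):

```
// base/autoexec.cfg - executed automatically at startup
seta com_showfps "1"
```

You can also launch with "doom3.exe +set com_showfps 1" to the same effect.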
QUOTE: "I'm having trouble understanding why the X800 costs almost as much as my mobo + CPU + RAM + case together."
Because your mobo + CPU + RAM + case won't give you eye-gasms like an X800.
QUOTE (CommunistWithAGun @ Aug 17 2004, 05:17 PM): "I'm still stuck in a time where RAM and CPU speed mattered just as much as video"
Yeah, the recent performance bottleneck has been video card performance. CPU and RAM max out much more easily than they used to.
I will conclude this post with the new icon I just noticed ::hive::
I always go for 60fps, it's my favorite number.