OK, maybe one of you can help me. My father-in-law likes playing some games but ends up with HL running at 30 fps on a:
P3 500 MHz
16 MB Riva TNT 2 (Viper 550)
WinXP
256 MB of RAM
Running OpenGL (D3D is very buggy)
Running at 640x480 in 16-bit color
It's not just NS, it's everything, including DoD and CS. Now SURELY such a computer must be able to get better fps. If I misposted, I'm sorry, but you peeps seem to know yer beeswax. Thanx!
Anyone who can truthfully say "the human eye can't distinguish anything over 35 fps" has obviously never played a game at greater than 35 fps. Go try it; it'll blow your mind.
Regarding video captured with a camera: it can be played at much lower fps and look fine because each frame of film is (essentially) an average of everything the camera could see for 1/24 of a second (some time between frames is lost to processing/reading/resetting of the optics), giving objects that were moving during that time the phenomenon known as motion blur. They appear stretched in their direction of motion because the object moved while the shutter was open. Because one frame starts averaging close to where (in time) the previous one left off, where the stretch on one frame stops, the stretch on the next frame starts soon after. This creates one continuous path that your eye sees.
However, each "frame" of a computer game is the static, instantaneous state of the world at the moment the frame is created, without regard for whatever happened since the last frame: no motion blur. Moving objects are in one location one frame and in a new position the next, with nothing connecting the two locations. The human eye really doesn't like that, as it's used to a world where everything is continuous, not discrete. As a result, it takes lots of frames per second to con the eye/mind into thinking it's seeing motion (more frames per second mean that for an object at a given velocity, its discrete positions are closer together, so its path is closer and closer to being continuous).
So in essence, the distance between successive positions of objects is what determines whether or not the eye perceives something as motion. Motion blur bridges that gap easily, but it can also be done by jacking up your fps.
I hope that's a good enough explanation.
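To put rough numbers on that "distance between successive positions" idea, here is a small sketch in Python. The on-screen speed is an assumed, purely illustrative figure, not something from the posts above:

# How far apart are an object's discrete positions at a given speed and fps?
# Smaller jumps between successive frames read as smoother motion.
def gap_per_frame(speed_px_per_sec, fps):
    # Distance (in pixels) the object moves between two successive frames.
    return speed_px_per_sec / fps

for fps in (24, 30, 60, 85, 120):
    gap = gap_per_frame(1000, fps)  # assume something crossing ~1000 px/sec
    print(f"{fps:3d} fps -> {gap:5.1f} px jump per frame")

# A film camera averages the scene over its exposure time, smearing that whole
# jump into a blur streak; a game shows only the two endpoints, so it needs a
# higher fps to make the jumps small enough to read as continuous motion.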
60 fps maxed, vsynced, at 1280x960 (I'd play at a higher refresh but I have a 17"; I'd run it at a lower resolution, but then it doesn't fit the screen :/). Never drops below 60... unless there's a smoke grenade (CS - drops to 30 fps, errrrr) or a million bloody turrets in one room. :)
1024x768x16 at ca. 70 fps, GF4 MX440, PIII 700, 512 MB, 98SE, 17" CRT at 85 Hz - works fine for me :)
The "Human I specs" somewhere up in this thread were right (to my knowledge) and to add: Military simulations have (had once?) the absolute minimum of 60 fps. AFAIK over 60 fps your brain doesn't make a difference between "seperate images" and "real life movement" - to a fly 60 fps still is a slide-show <!--emo&;)--><img src='http://www.unknownworlds.com/forums/html/emoticons/wink.gif' border='0' valign='absmiddle' alt='wink.gif'><!--endemo-->
Also there were some "experimental"-cinematic technologies doing 60 fps to increase the intenseness of the movie experience (dunno, never was in such a cinema <!--emo&???--><img src='http://www.unknownworlds.com/forums/html/emoticons/confused.gif' border='0' valign='absmiddle' alt='confused.gif'><!--endemo-->)
The 24 fps in cinemas use the persistence/"inertia" of your eye in dark rooms. Ex. stare at a brighter point some time and then look at a wall... you will still see the bright image (inverted)
I can see the flickering of monitors up to 75 Hz. (If you don't, try to look a litte right or left asides your monitor - you will see it then, as the human eye is more frequency-receptive at its borders (results from evolution, to track moving objects which may attack you)). So a refresh of 85 Hz is an absolute min. for me - 60 Hz gives headache!
I find framedroppers at any rate more irritating in handling a game than a steady "slow" stream of say 30 or 40 fps... and "speed" outnumbers "antialiasing"... <!--emo&;)--><img src='http://www.unknownworlds.com/forums/html/emoticons/wink.gif' border='0' valign='absmiddle' alt='wink.gif'><!--endemo-->
Hellbilly:
QUOTE (.eLiMiNaToR. @ Nov 27 2002, 04:46 AM): Hellbilly, it finally let me post! By the way, I'll give my specs in the morning; for now I'm gonna go pass out, ltr. So much posting to do, in so little time! ::skulk:: ::sentry::
Nice to see you around mate!
GeForce4 Ti4400 OC'd to Ti4600 speeds, Athlon XP 2000.
No slowdowns for me... I would run at 1600x1200, but my monitor (even though it's big - 19") is crap and flickers like **obscenity** at 1600x1200.
As for framerate, it is true that the human eye only really sees at about 35 fps. However, what that means is that we see an average of whatever light has been coming into our eyes for the past 1/35 of a second. If your computer screen is running at precisely 35 fps, you are only seeing one perfectly sharp frame at a time. This looks unrealistic. When you are running at 85 fps (the max refresh rate on most monitors), you are seeing an average of roughly three frames, which looks a lot better. A movie only refreshes at 24 Hz, but it has motion blur, so it looks a LOT better.
However, with gaming, what is often more important is control responsiveness. If your computer is churning out 35 frames per second, it will take 1/35 of a second (about 29 msec) for your controls to respond to any input. Although human reflexes are considerably slower than that (100-200 msec), your brain can easily tell that the computer screen is not responding as fast as it should. (Note: highly skilled Quake gamers may have reflexes considerably faster than that - even the 12 msec delay at 85 fps becomes intolerable, so they universally despise vsync.) However, a lot of this is a trained response; one person may not notice the control lag at 30 fps, while another may easily tell the difference between an 85 Hz refresh rate and a 100 Hz refresh rate.
Before I got my previous computer, I ran games at low framerates on a crappy Radeon 7000, and I literally could not tell the difference between 30 fps and 60 fps - they all seemed acceptable to me. However, once I got my GeForce4 rig and everything ran at 85 fps (I like vsync, and leave it on most of the time), I became incredibly sensitive to even a small framerate drop - 60 fps seemed sluggish compared to 85 fps, and 30 fps is nigh unplayable. Going back to my secondary computer, I can't play anything at all - the framerate is way too low. Even Starcraft and Diablo, which seemed really good before, now seem really jerky.
The ability to discern framerate is really a trained response. I've heard of experienced game benchmarkers (at FiringSquad, AnandTech, etc.) becoming so sensitive to framerate that they can tell the difference between a game running at 120 fps and one running at 150 fps (even though the monitor only refreshes at 85-100 Hz!). However, most of us have much less experience watching high-fps benchmarks, and would be completely unable to tell the difference.
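The frame-time numbers quoted above are just 1000 divided by the fps. A quick, purely illustrative sketch in Python for anyone who wants to check the arithmetic (the fps values are the ones mentioned in the posts; nothing else here comes from them):

# Rough frame-time / input-delay arithmetic.
def frame_time_ms(fps):
    # One frame stays on screen for 1/fps seconds; this is also roughly the
    # minimum delay before an input can show up in the next rendered frame.
    return 1000.0 / fps

for fps in (24, 30, 35, 60, 85, 100):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# 35 fps -> ~28.6 ms (the "29 msec" above); 85 fps -> ~11.8 ms (the "12 msec").
# With vsync on, frames can only be shown on a monitor refresh, so the
# effective frame time snaps to a multiple of the refresh interval.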
My graphics card has probably nearly melted itself by now and it's very, very **obscenity**.
At 1024x768 or above I get big fat multicoloured rectangles flashing over my screen now and then, and it freezes whenever a special effect tries to go off near me (like when a health pack is dropped on me). This happens in all HL mods and sometimes in SoF2, but not AvP2 (a whole different kind of problem there: when anyone types a message, or when someone gets killed, the scores and the player name when you pass the crosshair over them don't show up. That's extremely bad when you're playing Marines vs Corps).
So I have to play in 800x600.
On my other PC I have it set to the max of what the monitor can handle and it runs most spanktastically. But the RAM is fubared, so I get a lot of BSODs and random crashes.
I only use 1024x768, mainly because my computer is 6 years old and I need a new one, but I'm keeping this one around because it's not dead just yet. Really, I have like a Voodoo3, 128 MB RAM, 700 MHz... ergh.
QUOTE (Foggy @ Nov 26 2002, 11:21 PM): my monitor is too small. (17")
*COUGH* I play on a 15" monitor, with a P4 2.1A GHz, 512 MB DDR RAM, 82 GB HD, GeForce4 Ti 4200 w/a3.
I want to play at 12x10, but the HL engine doesn't support it.
I can't play at 16x12 because my LCD monitor doesn't support it.
QUOTE (Zarkark @ Nov 27 2002, 04:33 AM): Also there were some "experimental" cinema technologies doing 60 fps to increase the intensity of the movie experience (dunno, never been in such a cinema)
Forget what they called that... UltiMovie or something stupid like that was the pilot name. "Movies at the speed of life" was the tagline, IIRC.
In the still shots, it looked painful, like you were watching the outtakes and expected a boom mike to hit someone upside the head at any moment. :D
In the motion shots ... oh sweet holy crap, it was beautiful. What I would give to have the projector and reel from a demo they had of captured racing ... no motion blur needed to fake the speed effect there. It just <b>screamed</b> by. Utterly astounding.
60 fps film is still used in things like those "movie rides" at theme parks. There was one that was some manner of Star Wars-esque space combat, involving inversions (the SEAT moved, not the film) and a knockoff of the trench run on the Death Star. <b>O. M. F. G.</b>
- M4H