Do you care if NS2 would support 240hz monitors?
e_Ty
Join Date: 2017-07-31 Member: 232158
Recently I discovered that NS2 has a framerate cap of 200fps.
This is insufficient for gaming monitors with a 240Hz refresh rate.
This poll aims to gauge how much people care about 240Hz support.
Pros
+ for a fast-paced game, motion clarity is important
+ people who own a 240Hz monitor want to make full use of it
Cons
- 240fps would not be easy to sustain during the endgame
- extra work for the devs, and the difference is "only" 40fps or 20% (not very noticeable)
See 240hz in action: https://youtube.com/watch?v=pUvx81C4bgs
Comments
I'm yet to see a PC that can handle 200fps constantly in NS2.
I agree it should be uncapped if the user decides, but if I recall correctly, it fucks with the physics and makes movement slower the higher the fps gets.
Percentage increases still scale nicely enough, so I imagine going from 120 to 240 is not too far from going from 60 to 120. Personally, due to bandwidth and performance limits, I'd rather increase my resolution instead.
The thing is, though, the higher the frequency, the less significant the effective gain in milliseconds per frame is. Pretty sure 240Hz is starting to go into Commander Data territory. But hey, uncapped FPS is simply a must-have. And if it screws up the internal engine clock, they might want to fix that first.
Yeah, we'd probably need new CPU tech before that will ever be a thing.
How many people could even manage 144fps, let alone 240? It would be such a microscopically tiny portion of the community that would benefit. Anyone who can manage 240 is already playing at 144 and getting a huge advantage over people stuck at 60; do we really need to throw away dev time and resources to improve their fps by such a tiny amount?
So many other things they could focus on would vastly improve the game (such as further optimizing it so servers don't require as much hardware to give decent performance).
If the dips occur, say, 10-20% of the time, there will still be plenty of time left where the extra frames can be enjoyed.
Use adaptive sync, as the framerate will always dip when there are lots of entities.
There are slight diminishing returns, it's true, but we're far from the limits of perception.
On blurbusters.com they say we can see individual frames up to 480fps, but it's only around 18,000Hz that we can't differentiate it from real life, so yeaaah... still some room for improvement.
The votes show there is definite interest. This shouldn't come as a surprise, since it's not Football Manager we're talking about.
People who own a 240Hz monitor bought it for that feature alone, so it's a bit irking to play a game that doesn't support it.
Sorry if you can't afford a new monitor, but ask yourself: if you can't go to a party because you don't have a car (or whatever reason), should we all cancel just because of you? Because that's the essence of your argument here. And to be fair, high refresh rate monitors aren't all that expensive anymore; you should really try to get one if you value your gaming time.
Thanks for your input
To be entirely fair, the majority of NS2 players aren't coming here to vote...
Even with a small sample size (not even enough to fill a single server) you still get more no votes. Plus most of the people who frequent this forum are no doubt high-skill veterans who are already benefiting from having 144Hz...
240Hz monitors are not as cheap as you imply, nor is a rig that can push 240fps. To make your "party" analogy work, you'd have to be limiting everyone to 60Hz (basically, we can all make it to the party, but some just have to walk there).
It's more like the vast majority of the party-goers are under 18 and you're throwing away money on booze that only a small minority of the people there can enjoy, instead of spending it on something everyone could enjoy (like a DJ).
This...
But what kind of DJ? I loathe pop/RnB/mumble rap music!
DJ Bach and DJ Mozart of course, gotta get OG up in this house!
https://www.blurbusters.com/
Personally though, I think there isn't really any reason to go past 240Hz.
Just to clarify: we will only remove the maximum restriction of the maxfps console command, so players can use it to set any fps limit they want at their own risk. That change took only 5 minutes of my time. You will run into certain issues when you go beyond 300fps, but fixing those is not really a priority for us.
It is possible with an i7-8700K @ 4.8GHz. But even this CPU drops to between 110 and 130fps in the lategame.
@e_Ty
So praising a video which shows a stone-age game engine that can easily hold 300fps on a toaster and comparing that with NS2 is kinda stupid.
Just out of curiosity, what kind of issues? I remember UT2k4 would have serious timing issues after 1000 fps.
This is the sort of faulty logic people use with mouse polling rates as well.
"500Hz vs 1000Hz is a difference of 2ms - 1ms = 1ms. Who on planet Earth can utilize a 1ms shorter reaction time? 1000Hz is useless."
No. That's not the way to think about this. A 1000Hz polling rate updates TWICE as often as a 500Hz one. This is absolutely noticeable. It's not about making you react 1ms faster; it's about the smoothness and the responsiveness.
FPS and refresh rates are the same. It's not about the millisecond improvements.
Would I be able to tell a 240Hz monitor apart from a 480Hz one? No idea; put me in front of one and I'll tell you. But the idea of breaking it down into millisecond improvements is dumb.
It's not the total number of frames per second that determines the smoothness of motion we see; it's the time delay between the frames. Yes, that number depends on Hz/FPS, but that's exactly the point: the more frames you cram into a second, the less time there is between each of them. You know, those diminishing returns. At some point it seems highly unlikely you'll notice the increasingly small differences in how quickly frames refresh on your screen. Where that point lies remains to be seen, of course...
It's obvious the ~16ms between frames at 60Hz is a lot more noticeable than the ~4ms between frames on a 240Hz monitor. Like I said, I'm sure the ever-smaller delays between frames will eventually reach a point where the difference becomes negligible and any perceived improvement is marketing placebo.
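To put rough numbers on those diminishing returns, here's a quick sketch (just arithmetic, nothing engine-specific):

```cpp
// Frame time at common refresh rates, and the gain from each doubling.
// Plain arithmetic only; no engine specifics assumed.
#include <cstdio>

int main() {
    const double rates[] = {60.0, 120.0, 240.0, 480.0};
    for (int i = 0; i < 4; ++i) {
        double ms = 1000.0 / rates[i];
        printf("%6.0f Hz -> %5.2f ms/frame", rates[i], ms);
        if (i > 0) {
            double saved = 1000.0 / rates[i - 1] - ms;
            printf("  (%.2f ms less than the previous step)", saved);
        }
        printf("\n");
    }
    return 0;
}
```

Going from 60 to 120 buys ~8.33ms per frame, while 240 to 480 buys only ~2.08ms, which is why each doubling feels less dramatic than the last.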
No, not if you desire to keep NS2 at the level at which it was sold.
???
If I could trade resolution for the ability to see aliens move instead of warp, then I surely would,
but I wager that the problem is more about netcode and update rate than a 60Hz TV.
How about adjusting your settings and seeing if you really do that terribly at 60Hz?
Thanks for starting me on the research... I believe I'll be moving from 60 to 120, as my fps can support it. LOL at 500, at least for me, because I'm pretty sure my lone Nvidia 560 is far from pushing 500fps.
In older game engines, like those of the Quake lineage, the engine calculated important data between frames besides rendering the frames themselves. Things like netcode, prediction, sound, and user input needed to happen before the next frame was rendered. If the time between frames got too short, these tasks couldn't complete in time and would either overflow into the active frame rendering time or get cut off, and things stopped working properly. The inverse was also true: if the time between frames got too long, it could likewise cause erratic behavior.
More modern game engines can run into the same problems, plus more complex ones like race conditions if the engine is SMP/SMT capable.
Interesting to note: not too many people understand what a race condition is, but one caused issues in the Therac-25 that led to injuries and deaths. It's a decent topic to cover in computer ethics. In a paper, I blasted the company for replacing a hardware safety switch with a software one, then compared it to trusting a software safety on a pistol and pulling the trigger at your temple.
If the game can run into race conditions, deadlocks, etc. because the framerate is too high, it is programmed wrong. Yes, it is damned easy to make mistakes in multithreaded applications, but that's a straight-up bug solved by better coding practices and debugging.
The key problem is that the time step of the simulation alters the behaviour of the simulation for a variety of reasons, and this has a relatively easy solution that is commonly used in games today.
In-game movement is a lot like numerical integration of a differential equation. You've got forces acting on the player (gravity, jetpacks, whatever amount of strafing you allow in the air, etc.), and these can be time- and space-dependent. What most games do is nothing more complex than the craptastic Euler method, rather than attempting RK4 or something decent. The problem with using RK4 in a game is that there are weird exceptions like collisions, jumping, or a bullet imparting some momentum that you don't necessarily know about on the frame where you'd like to apply it. RK4 relies on knowing things about future times, which you would for a simple equation, but you don't in a game where user inputs may alter the future.
To make things worse, games like Quake also had constants in the movement calculations that were expressed as a max amount per frame rather than a max amount per second. That explicitly creates a framerate dependence.
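A toy example of that framerate dependence (all constants are made up for illustration; this is not NS2's or Quake's actual movement code): apply a damping factor once per frame instead of scaling it by the timestep, and the same second of game time plays out differently at different framerates.

```cpp
// Naive explicit Euler with a per-frame damping constant: the same one
// second of simulated time gives different speeds at different framerates.
// All constants here are illustrative, not taken from any real engine.
#include <cstdio>

double speed_after_one_second(int fps) {
    const double accel = 10.0;             // units/s^2, e.g. holding forward
    const double frictionPerFrame = 0.99;  // applied once per FRAME (the bug)
    const double dt = 1.0 / fps;
    double v = 0.0;
    for (int i = 0; i < fps; ++i) {
        v += accel * dt;       // explicit Euler step, correctly scaled by dt
        v *= frictionPerFrame; // NOT scaled by dt, so more frames = more drag
    }
    return v;
}

int main() {
    const int rates[] = {60, 125, 250, 500};
    for (int fps : rates)
        printf("%3d fps -> speed %.3f after 1 s of game time\n",
               fps, speed_after_one_second(fps));
    return 0;
}
```

Note that the higher framerates come out slower, which is exactly the "movement gets slower the higher the fps" symptom mentioned earlier in the thread.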
There are also quantization and rounding errors if things like firing an automatic weapon are implemented wrong. Say my frame timer has a resolution of whole milliseconds, and I add up milliseconds until I reach 100ms, fire a bullet, then subtract 100ms from the timer. The problem is that a frame time of 6.49ms gets rounded down to 6, so I fire much slower, while a frame time of 5.51ms gets rounded up to 6, so I fire faster. There are high-resolution timers in modern Windows that solve this problem; querying them is quite costly, but that is negligible if you only do it once per frame.
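Here's a small sketch of that rounding bug (the 100ms fire interval and the 6.49ms/5.51ms frame times come from the example above; everything else is made up):

```cpp
// Fire-rate quantization sketch: accumulating an integer-millisecond frame
// time distorts a 100 ms fire interval, while a precise accumulator does not.
// The 6.49 ms / 5.51 ms frame times mirror the example in the post above.
#include <cstdio>

int shots_in_ten_seconds(double frameMs, bool roundToWholeMs) {
    double cooldown = 0.0;  // ms accumulated toward the next shot
    double t = 0.0;         // real elapsed time in ms
    int shots = 0;
    while (t < 10000.0) {
        double dt = roundToWholeMs ? (double)(int)(frameMs + 0.5) : frameMs;
        cooldown += dt;
        if (cooldown >= 100.0) {  // one shot per 100 ms
            ++shots;
            cooldown -= 100.0;
        }
        t += frameMs;  // the real frame time, regardless of what the timer saw
    }
    return shots;
}

int main() {
    printf("6.49 ms frames: %d shots rounded vs %d precise\n",
           shots_in_ten_seconds(6.49, true), shots_in_ten_seconds(6.49, false));
    printf("5.51 ms frames: %d shots rounded vs %d precise\n",
           shots_in_ten_seconds(5.51, true), shots_in_ten_seconds(5.51, false));
    return 0;
}
```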
The "easy" solution I mentioned is to break the gameplay loop apart so that the framerate ("tickrate") of the simulation and the framerate of the graphics are separate things that run in parallel, with the simulation running at a fixed 60 ticks per second or whatever. This is typical in modern games and only causes problems when the hardware is far too slow to sustain the required tickrate. It is especially common in multiplayer games, where you need interpolation/extrapolation of enemy players' animations, movement, and everything else anyway.
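For the curious, a minimal sketch of that fixed-timestep pattern (the function names and the stand-in game logic are mine, not anything from NS2's Spark engine):

```cpp
// Fixed 60 Hz simulation tick, rendering as fast as the hardware allows.
// simulate/render are placeholders for real game logic and drawing.
#include <algorithm>
#include <chrono>
#include <cstdio>

static double now_seconds() {
    using namespace std::chrono;
    return duration<double>(steady_clock::now().time_since_epoch()).count();
}

static double g_pos = 0.0;

static void simulate(double dt) { g_pos += 1.0 * dt; }  // stand-in game logic
static void render(double alpha) { (void)alpha; }       // draw, blending the
                                                        // last two ticks by alpha

int main() {
    const double tick = 1.0 / 60.0;  // fixed simulation step
    double accumulator = 0.0;
    double previous = now_seconds();
    while (g_pos < 1.0) {            // run ~1 simulated second, then exit
        double current = now_seconds();
        accumulator += std::min(current - previous, 0.25);  // clamp huge hitches
        previous = current;
        while (accumulator >= tick) {  // catch up in fixed-size steps
            simulate(tick);
            accumulator -= tick;
        }
        // Render at an uncapped rate; alpha in [0,1) tells the renderer how
        // far we are between the last two ticks, for interpolation.
        render(accumulator / tick);
    }
    printf("done: pos=%.2f\n", g_pos);
    return 0;
}
```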
The mouse view is allowed to be completely free and is calculated on each graphics frame, whereas ordinary movement may be calculated at the server tickrate. Things like firing and movement are only predicted locally; the server second-guesses everything and subtly pulls you back into sync so that error does not build up unless there is catastrophic latency.