I've been getting decreased performance in general ever since Reinforced came out. My FPS is still 30 lower in most places than it used to be. There have been no noticeable improvements.
It's always tempting to blame it on the server. Pass the problem on and all that.
Still, isn't the performance indicator on the server selection screen supposed to account for that? If you've connected to a 90% server with under 50 ping, there shouldn't be a problem, no?
Most servers always run 100% for the first 10-20min. The majority of the "bad servers" you see with poor performance are games that are 20min+ in, since the majority of servers take a dump after that point. The overclocked ones last like 5min more...
There's a couple constantly populated NA east servers that last about 2min before they're unplayable. Dunno why people keep joining them
You and me must have different definitions of the word crippled...
There should be a small improvement in performance this patch, in all renderers. Samusdroid's comparison screenshots help back this up. (40 fps vs 65 etc.. on avg a 10-15 fps increase)
The hitches are known and being worked on, mostly, but most are pretty benign.
I think a performance drop of over 100 frames during the course of a round is unacceptable.
Argh, this is what I get for taking your words out of context.
"I think a performance drop of over 100 frames during the course of a round is unacceptable."
Not if you have 400 fps, it's not! /shields self
Yes, I know there's a cap of 200... my exaggerated point simply being that it's all relative.
For example: when you only get 30 fps, a 20 fps boost is a world of difference. When you have 200 fps, losing 20 fps is not remotely as impactful.
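Put in frame times, which is what you actually feel, the asymmetry is stark. A quick sketch:

```python
def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given fps."""
    return 1000.0 / fps

# Gaining 20 fps at the low end saves roughly 24x more frame time
# than losing 20 fps at the high end costs.
low_gain = frame_time_ms(30) - frame_time_ms(50)     # ~13.33 ms saved per frame
high_loss = frame_time_ms(180) - frame_time_ms(200)  # ~0.56 ms lost per frame
print(round(low_gain, 2), round(high_loss, 2))
```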
So no, I would not consider 60 fps to be "Crippling performance".
But yes, I would concur that losing 100 fps during the course of a match is terrible (and I realize that is actually what you were referring to when you used that descriptor).
This game's performance will inherently always start off better than how it ends, due to the increasing complexity and entity counts - nothing new here, and no way to solve this as long as Lua is used. If the low end of the performance spectrum for you still yields 60 fps, and the frames aren't jumping wildly 100 fps at a time, and mouse input has been significantly improved... I would disagree with labeling the performance as "crippling". I might settle for "passable" or "needs improvement".
That being said, I'd never recommend any developer use a scripting language for all of the game logic in a twitch competitive FPS with mountains of RTS complexities. The fact that it's as good as it is now constantly amazes me.
edit: and @locklear has some great points, too. Would be really nice to be able to plot actual frame jitter that isn't smoothed/averaged.
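On plotting unsmoothed jitter: raw per-frame times expose hitches that an averaged fps counter hides. A toy sketch with made-up timestamps, not real captures:

```python
import statistics

def frame_stats(timestamps_ms):
    """Per-frame times from raw frame timestamps; averaged fps hides hitches."""
    times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_fps = 1000.0 / statistics.mean(times)
    worst_ms = max(times)
    jitter_ms = statistics.pstdev(times)  # spread of frame times
    return avg_fps, worst_ms, jitter_ms

# A smooth ~60 fps capture vs the same capture with a single 100 ms hitch:
smooth = [i * 16.6667 for i in range(61)]
hitchy = smooth[:30] + [t + 100 for t in smooth[30:]]
print(frame_stats(smooth))  # avg ~60 fps, worst frame ~16.7 ms
print(frame_stats(hitchy))  # avg still ~54.5 fps, but one ~116.7 ms frame
```

The averaged counter barely moves, while the worst-frame column shows exactly the kind of spike people describe as "reads 60, feels like 10".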
Exactly why I still don't understand why people run 20+ player servers... I've rarely run into problems with 16-18 player servers, yet the vast majority of people still choose to play on poorly performing servers just because they have the most people. Can't server admins check their server performance and adjust their player count for the best experience? Please... I'm getting sick of jumping from server to server trying to find one that doesn't suck 15 minutes into the game.
Asraniel (Playtest Lead, NS2 Playtester):
Actually, if you look at the profiler you will notice that Lua is no longer the bottleneck; the GPU (renderer) is. And since the renderer is all new, there are certainly ways to improve it late game. That said, my fps drops at most 20-30% late game, something I would consider totally normal.
Input lag is finally gone for me, at all resolutions. Biodome feels smoother.
Still get those stutters where it looks like the game is almost hanging, yet my FPS counter shows 30-45fps. Happens in big fights, especially those in Marine bases.
Here's the thing with the new NS2 Linux compatibility:
Something is borked with the rendering, causing > 0 ms of waiting on the GPU in situations where that never used to happen. That's why 60 fps feels like 25 fps, and why you need 100+ fps minimum to play the game without that weird mouse feeling.
It just shows that the more objects you add, the more time they take to process. Some objects may also be processed separately for things like cyclic animations. For example, all extractors/harvesters could share the same animation cursor (the position between start and end), if they don't already; you'd only notice when two extractors are visible at once, like in Nano. I'm not sure it would solve many problems, though.
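The shared-cursor idea could look something like this (a hypothetical sketch, not NS2's actual animation API):

```python
class AnimationClock:
    """One shared looping cursor (0.0-1.0) driving every instance of the
    same cyclic animation, instead of a per-object timer each frame."""
    def __init__(self, loop_ms):
        self.loop_ms = loop_ms

    def cursor(self, now_ms):
        # Position between start (0.0) and end (1.0) of the loop.
        return (now_ms % self.loop_ms) / self.loop_ms

# Every extractor/harvester samples the same clock each frame.
extractor_clock = AnimationClock(loop_ms=2000)
print([extractor_clock.cursor(t) for t in (0, 500, 1000, 2500)])
# -> [0.0, 0.25, 0.5, 0.25]
```

The trade-off is that all instances animate in lockstep, which is exactly why you'd only notice with two extractors on screen at once.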
I'm no programmer, but does "entity" mean physical structures too (e.g. shade/crag etc.)? Of course there are more entities as the game progresses, but at any given time, aren't players exposed to only 2 or 3 physical structures plus 2 or 3 other players? If someone is fighting in Repair, why is their framerate affected by something in Central?
Was in a pug lastnight and about 2 minutes into the game my 60 fps started looking like 60 fps. it still read 10 fps. Had to have someone replace me. It was bad.
That much microstutter was unheard of for me before this patch. To have a reading of 100 (effectively 60) fps while visually dealing with 5-10 is nuts.
Umm... did you mean your 60 fps started looking like 10 fps, but still read 60 fps?
I play at max resolution (1920x1080), and in large 24-player matches I start to get terrible stutters and slowdowns in some parts of the map. When I look at r_stats, my FPS is around 70-80. This doesn't make any sense, as it feels like my game has slowed down to 20-30 fps. Either r_stats is giving me the wrong FPS information, the frame draw time is swinging wildly from high to low making the average value useless, or mouse input is once again screwed when more stuff is going on in the game.
I'm also in the lose 100fps group, where I start games at 170fps, and then drop down to 70fps near the end.
The last time I had microlag was during the beta, whenever someone started a fight. Very annoying. Then it was gone and hadn't reappeared for me until now. It's not the same lag, it appears totally random, but it's no less annoying. Besides the stutter, performance seems better for me in the early game. But I can't tell you about the late game, because the last 4-5 (already?) builds crash anywhere from 6 seconds to a few minutes in.
I just hope the crashes will be fixed soon and that it was all worth the trouble, because they will optimize the new system in the end. At least that is what I expect...
Entities are typically a lot more than physical structures: they include weapons, projectiles, structures, players, lights, sounds, animated parts of the maps (cinematics), etc.; basically anything that isn't part of the walls, floor, or ceiling. Hundreds (up to and exceeding 1000) of entities are not uncommon in NS2.
Another issue is that, due to the nature of occlusion culling, the game renders/processes more entities than you can technically see, so you don't run around a corner and have structures/players magically appear. In fact, a big part of map optimization is trying to minimize this.
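As a toy illustration of why someone fighting in Repair can still pay for entities elsewhere (made-up room graph and counts, not real NS2 data):

```python
# Toy potentially-visible-set: everything in rooms the player's room might
# see gets processed, not just what's on screen right now, so nothing
# pops into view when you round a corner.
visibility = {  # hypothetical room adjacency
    "repair":  {"repair", "hallway"},
    "hallway": {"hallway", "repair", "central"},
    "central": {"central", "hallway"},
}
entity_count = {"repair": 12, "hallway": 5, "central": 40}

def rendered_entities(player_room):
    return sum(entity_count[r] for r in visibility[player_room])

print(rendered_entities("hallway"))  # 57, far more than what's on screen
```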
The game runs quite well now. Of course my 50-60 fps doesn't feel like 50-60 fps, more like around 30, but it's playable. I think the fps stats are ****** up. I've got a feeling that the actual fps performance/smoothness is much lower than it shows.
I think loading is quicker now, which is great; the pre-caching before took literally 5-7 minutes on my machine. OK, mine isn't the best spec, but of all the games I play online it was the slowest ever. I do get a weird issue where, after it has finished loading and I'm in the Ready Room, there's a laggy stage: the textures are not fully loaded, and for about another minute, even if I join a team, it's slowwww.
Other than that, the game has improved for me since I started using DX11.
Keep up the good work guys, love this game always have since NS1.
Check with an external app like Fraps. For me, the in-game fps counter and Fraps agree all the time.