fps reports/rants
natostanco
Join Date: 2011-02-15 Member: 81912Members
Main menu:
DX11 lowest settings / 195-205 FPS
DX9 lowest settings / 130-140 FPS
DX11 highest settings / 100-115 FPS
DX9 highest settings / 95-110 FPS
Settings impact in main menu (FPS cost of each setting):
Reflections / 5 FPS
Ambient occlusion / 30 FPS
Anisotropic filtering / 5 FPS
Atmospherics / 25 FPS
AA / 10 FPS
Shadows / 20 FPS
Sandbox mode - Docking - marine side:
DX11 lowest settings / 130-140 FPS
DX9 lowest settings / 120-130 FPS
DX11 highest settings / 110-120 FPS
DX9 highest settings / 90-100 FPS
Settings impact in sandbox:
nope.
I suppose (guess? imagine? blabber about?) that in the main menu the difference between high/low settings and DX9/DX11 is so huge because you are purely GPU-limited: you are not on a server, so there are no CPU/networking shenanigans, and graphics settings and DX level are all about the GPU. In sandbox mode I got the results above, which show a much tighter spread, because there is most likely a hint of CPU limitation in action. On actual servers the difference between high and low settings is basically nonexistent: I have seen FPS drop to 40 in combat, peak around 130, and hover at 60-90 in alien bases with infestation everywhere. I was tempted to say DX11 helps more on high settings than it does on low settings, but what do I know :> my numbers say otherwise. Plus, since everyone in this game is basically CPU-limited, DX11 didn't really help anyone. (:s)
OpenGL doesn't even work.
The only time I have seen NS2 use all the cores was during load, the one-time rendering work: that stuff filled all the damn cores, but was done in about 10 seconds.
Am I the only one who finds it ironic that they left the most CPU-dependent engine of all time (Source) only to build one just as CPU-hungry as the one they discarded?
I might have just written a bunch of bullshit, but whatever; I have to justify my shattered dream of a steady 120 FPS somehow.
System:
AMD FX-9370 @ 5.1 GHz
R9 290
PS: please treat every number as approximate (imagine it surrounded by a field of tildes ~~~~~).
Comments
The issue is that the longer a game goes on, the more models are added, and the harder the servers and clients have to work. The Spark engine is very efficient in and of itself; it is the game code that is causing the problems. There is just too much going on.
If you remember, NS pushed the absolute boundaries of the HL1 engine; unsurprisingly, NS2 pushes the absolute boundaries of the Spark engine. The engine itself is solid, NS2 is just a very big game.
Also, do not discount your original findings with AO and atmospherics... they destroy FPS, and the former (AO) comes with inherent mouse input delay (all AO does).
Oh, and lastly: fullscreen DX9 is the only officially supported rendering method; everything else is still a work in progress. DX11 comes with better frame times, so it "feels" smoother, but it does not match the mouse input delay of DX9 (which is equal to Quake 3's three-frame delay).
I haven't touched a single implementation of AO that hasn't had the same issue (even Far Cry 3's custom AO).
I ASSUME it's because post-processing costs an additional frame.
If you're not sensitive to that sort of thing, then yeah, all that matters is beefier hardware to swallow that 20 FPS cost (though AO on medium isn't that taxing).
Wiki:
"Instead of rendering 3D objects directly to the display, the scene is first rendered to a buffer in the memory of the video card. Pixel shaders are then used to apply post-processing filters to the image buffer before displaying it to the screen."
Basically, from my understanding, it uses per-pixel depth to decide where to apply the effect, treating the rendered frame like a texture and modifying it in a buffer before sending it to you. I could be way off, though.
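To make the quoted idea concrete, here is a minimal, self-contained C++ sketch of that pattern: the "scene" is assumed to already sit in color and depth buffers, and a pixel-shader-style pass then reads the buffered frame and darkens pixels whose neighbours are closer to the camera (a very crude stand-in for SSAO). Every name, buffer size, and constant here is invented for illustration; this is not Spark/NS2 code.

```cpp
// Hypothetical sketch of post-processing: the scene is already rendered
// into color + depth buffers, and a second pass reads those buffers
// (treating the frame "like a texture") instead of touching 3D geometry.
#include <algorithm>
#include <cstdio>
#include <vector>

const int W = 8, H = 8; // tiny made-up "framebuffer" for demonstration

// One post-process "pixel": neighbours nearer the camera occlude us a bit.
float occlusion(const std::vector<float>& depth, int x, int y) {
    float occ = 0.0f;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            int nx = std::clamp(x + dx, 0, W - 1);
            int ny = std::clamp(y + dy, 0, H - 1);
            occ += std::max(0.0f, depth[y * W + x] - depth[ny * W + nx]);
        }
    return std::min(1.0f, occ); // 0 = fully lit, 1 = fully occluded
}

int main() {
    std::vector<float> color(W * H, 1.0f); // scene already rendered here
    std::vector<float> depth(W * H, 0.5f);
    depth[3 * W + 3] = 0.2f; // one object poking toward the camera

    // The post-process pass runs over the buffered frame, not the scene.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            color[y * W + x] *= 1.0f - 0.5f * occlusion(depth, x, y);

    printf("corner: %.2f  beside object: %.2f\n",
           color[0], color[3 * W + 4]); // 1.00 vs 0.85: darkened edge
}
```

The key point the sketch shows: the extra work happens after the frame exists in memory, which is why it is "free" of scene complexity but not free of latency.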
Edit: P.S. newer algorithms are coming out that perform up to 7x better and even look better: http://floored.com/blog/2013/ssao-screen-space-ambient-occlusion.html
(Also, I haven't confirmed yet, but I believe the person who wrote the research paper on this at NVIDIA is Max's brother. :P)
Why do you think all AO comes with inherent mouse input delay?
It actually depends on the type of AO, from what I understand: whether or not it uses previous-frame information to generate a more accurate depth buffer to prevent shimmering. And even then, it would be rare for a game to purposefully delay a frame just to increase the temporal stability of the next one.
Unless NS2 is purposefully waiting on the next frame to draw its AO, it would only otherwise increase input latency by however many extra milliseconds the frame takes to render. In NS2 that would be about 30% more ms on average (the difference in feel between 120 FPS and 90).
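(Not from the thread, just a quick sanity check of that 30% figure, assuming frame time is simply 1000 / FPS:)

```cpp
// Frame-time arithmetic behind the "30% more ms" claim above.
#include <cstdio>

int main() {
    double t120 = 1000.0 / 120.0; // ~8.33 ms per frame at 120 FPS
    double t90  = 1000.0 /  90.0; // ~11.11 ms per frame at 90 FPS
    printf("%.2f ms vs %.2f ms: +%.0f%% (+%.1f ms per frame)\n",
           t120, t90, (t90 / t120 - 1.0) * 100.0, t90 - t120);
    // Prints: 8.33 ms vs 11.11 ms: +33% (+2.8 ms per frame)
}
```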
Someone with more technical knowledge can correct me, but I cannot think of many post-processing techniques that would purposefully delay a frame... screen-space post-processing techniques usually exist to decrease rendering time; the non-post-processed versions of those effects are otherwise too expensive.
Just going off of what Max told me...
You can definitely feel the input delay though if you test for it and are sensitive to that sort of thing.
After doing some research, I've found some interesting tidbits about how traditional AO (made by Crytek) is a pixel shader, and "a pixel shader is the only kind of shader that can act as a postprocessor or filter for a video stream".
But like you said, there are now multiple solutions... and the only place where I can find info about a delay is the post-processing done to the high-frequency noise to make it look shaded:
In regards to performance, it will still be significantly better than NS2 with a comparable entity count. Hell, the hardware that can run 24 people on NS can run 64 players plus mods in CSS, even when the entity count is similar and at roughly the same tick rate, and on the client you may even get ridiculous FPS rates.
Ent numbers are useless on their own. What do the ents include? Dropped weapons? Fragmented models?
The difference you have to remember is that ents in NS2 have a HUGE amount of logic. Not only do they have physics code (as I have found out on GorgeCraft, physics is a huge resource hog), but they also have their abilities, so every ent on the map is thinking every tick:
Crags seeing if you are within range and need healing.
Shades seeing if you are in range and need cloaking.
Whips seeing if you are within range for a slap or bomb.
Sentries seeing if you are in range for a few bullets in the face.
I could go on, but you get the point: there are very few entities in NS2 that don't have masses of work to do every tick. You see, NS2 doesn't just have entities, it has dynamic entities, and many of them (I'm looking at you, infestation) have a lot of responsibility and take up a lot of CPU work; the sketch below shows the general shape of that per-tick cost.
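As a purely hypothetical illustration (NS2's real game logic is Lua running on Spark, and every name below is made up), here is why per-tick range checks add up: each structure re-scans every player it might affect on every server tick, so the cost grows with structures times players.

```cpp
// Hypothetical per-tick entity logic; not NS2 code, names are invented.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

float distance(const Vec3& a, const Vec3& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

struct Player { Vec3 pos; float health; };

// Each structure scans every player on every tick; this is the work
// that piles up as a round goes on and more entities hit the map.
struct Crag {
    Vec3 pos;
    void OnTick(std::vector<Player>& players) {
        for (auto& p : players)
            if (distance(pos, p.pos) < 10.0f && p.health < 100.0f)
                p.health += 1.0f; // heal anyone hurt and in range
    }
};

int main() {
    std::vector<Player> players(24, {{0, 0, 0}, 50.0f});
    std::vector<Crag> crags(30, {{1, 0, 0}});

    // One server tick: every crag checks every player. 30 crags * 24
    // players = 720 range checks for this one ability alone; whips,
    // shades, sentries, infestation etc. all do their own versions.
    for (auto& c : crags) c.OnTick(players);

    printf("player 0 health after one tick: %.0f\n", players[0].health);
}
```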
Please detail the CS entities so we can compare and see what each of those entities has to do each round, rather than just saying this game has X entities vs. Y entities.
What I don't get is why AO is on by default. (It still is, right?) As you wrote, it causes a big FPS decrease, and if it also adds input lag, that's even worse. How is trading FPS and responsiveness for some eye candy a good idea?
If the standard response to "help I got crappy FPS" is turning off a default setting, something seems wrong.
Try switching to windowed mode with alt-enter before alt-tabbing, then go fullscreen again after tabbing back.
It works well for tabbing out. My only issue is that I'm often left wondering whether I'm actually back in fullscreen, as there are times when you can't toggle it (map loading, etc.), and there is no visual indication.