Water distorts weapon/welder HUDs over it

RejZoR (Slovenia)
Image attached for better presentation. I don't think the water shader should interact with elements that are meters away from it on the Z axis. Seems like a Z-buffer issue. It only affects the weapon and welder HUDs, though.

[screenshot attachment]

Comments

  • BeigeAlert (Texas, NS2 Developer)

    The issue is that emissive things are rendered before refractions are calculated. Since purely emissive surfaces (like holograms, particles, etc.) aren't fully opaque, they don't write to the depth buffer, which means that later, when the renderer calculates the refraction mask, it doesn't know there's a big, bright emissive thing in front of the refracting surface that should be blocking it.
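    To make the failure concrete, here's a tiny numeric sketch of that refraction-mask depth test. The depth values, names, and the standalone C++ framing are invented for illustration; this is not Spark code.

    ```cpp
    #include <cstdio>

    int main() {
        // Hypothetical normalized depths for one screen pixel (0 = near, 1 = far).
        float opaqueDepth = 0.8f;  // wall behind the water; it wrote to the depth buffer
        float hudDepth    = 0.2f;  // welder HUD; emissive, so it wrote nothing
        float waterDepth  = 0.5f;  // refracting water surface

        // The refraction mask can only depth-test against what opaque geometry
        // wrote, so the water still counts as unobstructed here...
        bool waterInMask = waterDepth < opaqueDepth;    // true

        // ...even though the HUD is actually in front of the water, which is
        // exactly the information that pass never receives.
        bool hudBlocksWater = hudDepth < waterDepth;    // true, but unknown to the pass

        printf("water in refraction mask: %d, HUD actually in front: %d\n",
               (int)waterInMask, (int)hudBlocksWater);
        return 0;
    }
    ```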

    Now I suppose it might be possible to draw emissive stuff _after_ the refractions are performed, but that may actually end up looking worse... as that means nothing emissive would ever be distorted. The emissive pass isn't just for holograms and particles -- it's also for any texture that "emits" light, which can include stuff that's part of the world, like the texture of a light prop or some blinking computer lights. In that case, any time something refractive is placed in front of such a background, the lights would appear to separate from their surroundings because they wouldn't distort with everything else. Is that worse than the interface distorting? Maybe, maybe not.

    I'll look into this.

    EDIT: Hmm, I just tried it out and noticed it actually doesn't distort particles; I guess they're drawn after refractions. This makes me think it might be okay to switch the ordering of the render passes to fix this. There IS an alternative method to get the best of both worlds, but it comes at a performance cost.
  • meteru
    Why not give the stuff which has no depth per se a "virtual" depth? Like the HUD gets 0, the player's own stuff gets 1, the world gets 2, and the skybox gets 3?
    The bug would still happen on world surfaces, but it would be less noticeable.
  • BeigeAlert (Texas, NS2 Developer)
    It actually already does this... sort of. The range of depth values is separated into three "zones": the view model zone, the world zone, and the skybox zone.
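    For illustration, here's a minimal sketch of that zoning idea in plain C++; the zone boundaries and the zoneDepth function are made up for this example and are not Spark's actual values.

    ```cpp
    #include <cstdio>

    // Hypothetical depth "zones": the view model always sorts in front of the
    // world, and the world in front of the skybox, regardless of true distances.
    enum Zone { ViewModel, World, Skybox };

    // Remap a zone-local depth in [0, 1] into that zone's slice of the full
    // depth-buffer range. Slice boundaries are invented for illustration.
    double zoneDepth(Zone zone, double localDepth) {
        switch (zone) {
            case ViewModel: return 0.00 + localDepth * 0.10; // [0.00, 0.10)
            case World:     return 0.10 + localDepth * 0.80; // [0.10, 0.90)
            case Skybox:    return 0.90 + localDepth * 0.10; // [0.90, 1.00]
        }
        return 1.0;
    }

    int main() {
        // A "far" view-model pixel still wins the depth test against a "near"
        // world pixel, because the zones never overlap.
        printf("view model, far: %.2f\n", zoneDepth(ViewModel, 0.9)); // 0.09
        printf("world, near:     %.2f\n", zoneDepth(World, 0.1));     // 0.18
        return 0;
    }
    ```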
    The issue isn't the depth testing itself; it's that there's no way on current hardware to record depth for transparent/emissive stuff like the welder HUD.
    See, the way real-time graphics works is that any time an opaque object is rendered into the frame buffer, it also leaves behind its depth in the depth buffer. Every subsequent object is then tested against the depth buffer as it is rendered, so only pixels that are closer to the camera get written into the frame buffer. This works great for opaque objects, because we know for a fact that we will never be able to see anything behind an opaque object... because it's opaque.
    You can kind of think of it like one of these things:
    [image: a pin art toy]

    See, there's exactly one layer of "depth" given there.
    Now, when drawing transparent stuff like particles or holograms, we test against the depth buffer to see if the pixel is visible, but since it isn't an opaque object, we _DO NOT_ write to the depth buffer. This allows holograms and particles to "stack up" over each other (otherwise particles would look like black hexagons with a particle texture on top, with no background visible... completely unacceptable).
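    Here's a bare-bones, single-pixel toy model of those two rules (test-and-write for opaque, test-only for transparent); it's an illustration in plain C++, not the engine's code.

    ```cpp
    #include <cstdio>

    float depthBuffer = 1.0f;  // one pixel's depth entry, cleared to "far"

    // Opaque: depth test, and on success also write the new depth.
    bool drawOpaque(float depth) {
        if (depth >= depthBuffer) return false;
        depthBuffer = depth;
        return true;
    }

    // Transparent/emissive: depth test only, never write, so later transparent
    // draws behind this one can still show through and "stack up".
    bool drawTransparent(float depth) {
        return depth < depthBuffer;
    }

    int main() {
        drawOpaque(0.8f);                       // a wall
        printf("%d\n", drawTransparent(0.5f));  // hologram in front of the wall: 1
        printf("%d\n", drawTransparent(0.6f));  // particle behind the hologram but
                                                // in front of the wall: still 1
        printf("%d\n", drawTransparent(0.9f));  // behind the wall: 0
        return 0;
    }
    ```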

    So currently, the way Spark renders a frame is (a rough code sketch follows the list):
    1. Render all opaque objects.
    2. Render all emissive objects (at this point we know the depth buffer isn't going to change anymore, because we're done rendering all opaque objects).
    3. Render a distortion map by running all material shaders that write to the distortion outputs. These are depth-tested (so we don't see water distorting a wall in front of it).
    4. Distort the scene based on the distortion map.
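    Roughly, in code (the pass names are invented for illustration and are not Spark's actual renderer API):

    ```cpp
    // Each stub stands in for a whole render pass.
    void renderOpaquePass()    {}  // 1. writes color and depth
    void renderEmissivePass()  {}  // 2. depth-tested, but writes no depth
    void renderDistortionMap() {}  // 3. its depth test only sees step 1's geometry
    void applyDistortion()     {}  // 4. warps the scene using step 3's map

    void renderFrame() {
        renderOpaquePass();
        renderEmissivePass();
        renderDistortionMap();
        applyDistortion();
    }

    int main() { renderFrame(); }
    ```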

    So in step 3, you can see where the problem is: we generate the distortion map, but the depth-testing involved doesn't know about the holograms!
    It may be possible to fix the distortion by moving step 2 to after step 4, but this means if you were to, say... look at a lit-up wall texture through something with distortion (e.g. that giant window in Atmo on Biodome), the lights wouldn't be distorted at all, and it could be quite jarring.

    Another solution is to render out a second depth map for JUST distorting objects. Then do two separate passes for particles and holograms -- one before the refraction pass for particles/holograms that are behind refracting surfaces, and another after the refraction pass for particles/holograms in front of refracting surfaces. I've already tried this out, and it works quite well in practice... save for the fact that it creates a ~10% drop in frame rate... I've got some ideas for how to optimize this, but just haven't had the time to experiment further.
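    A sketch of how that could look, again with invented names; the split is done per object here for brevity, whereas in practice it would be a per-pixel depth test against the refraction-only depth map, and none of this is the actual implementation.

    ```cpp
    #include <vector>

    struct TransparentDraw { float depth; };

    // Hypothetical pass stubs, not Spark's API.
    void renderOpaquePass() {}
    void renderRefractorDepthMap() {}  // second depth map: refracting surfaces only
    void renderTransparents(const std::vector<TransparentDraw>&) {}
    void applyRefraction() {}
    float refractorDepthAt(const TransparentDraw&) { return 0.5f; }

    void renderFrame(const std::vector<TransparentDraw>& transparents) {
        renderOpaquePass();
        renderRefractorDepthMap();

        // Split transparent draws by whether a refracting surface sits in front of them.
        std::vector<TransparentDraw> behind, inFront;
        for (const TransparentDraw& t : transparents)
            (t.depth > refractorDepthAt(t) ? behind : inFront).push_back(t);

        renderTransparents(behind);   // these get distorted along with the scene
        applyRefraction();
        renderTransparents(inFront);  // e.g. the welder HUD: drawn after, never distorted
    }
    ```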