Fade Objects

CaptainFearless CO, US Join Date: 2016-12-14 Member: 224941 Members
So apparently it is in game now, and I really like it. It has its ups and downs, but the only big issue is that sometimes it loads things in from the inside out. For example, the Precursor array: it loads the inside first, then the outside. A new player would get completely spoiled!

Comments

  • Jacke Calgary Join Date: 2017-03-20 Member: 229061 Members
    See, when graphics programming is done correctly, the GPU is only "thinking" about (as in drawing - or rendering, in technical parlance) the stuff you can see, but the game still has to keep track of everything in the world. Thus, if your character is standing outside of a building, the game has to "know" what's inside, but it doesn't have to actually render any of it yet because you can't see it anyway. (There's a sketch of this split below.)
    This is one of the many reasons why multiplayer, especially PvP, is massively more complex than single-player. Since the client program "knows" what's inside the building even though the player can't see it, hacking the client lets the player find out what's inside. And that starts to break the simulation, at least as far as PvP and even PvE go.
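    To make that split concrete, here's a minimal sketch - everything in the world gets simulated every tick, but only what the camera can see gets handed to the renderer. GameObject, Camera, and canSee() are hypothetical stand-ins, not any real engine's API:

        // Minimal sketch: the whole world is simulated, but only objects the
        // camera can see are drawn. All types here are illustrative.
        #include <vector>

        struct GameObject {
            bool hiddenInsideBuilding = false;    // state the client still tracks
            void updateSimulation(float dt) { /* AI, physics, timers... */ }
            void render() const { /* issue draw calls to the GPU */ }
        };

        struct Camera {
            // Stand-in visibility test; a real engine would use frustum
            // culling and occlusion queries here.
            bool canSee(const GameObject& o) const { return !o.hiddenInsideBuilding; }
        };

        void gameTick(std::vector<GameObject>& world, const Camera& cam, float dt) {
            for (GameObject& o : world)
                o.updateSimulation(dt);    // every object is "known" and updated...
            for (const GameObject& o : world)
                if (cam.canSee(o))
                    o.render();            // ...but only visible ones are drawn
        }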
  • cdaragorn Join Date: 2016-02-07 Member: 212685 Members
    @scifiwriterguy a very good read. I just want to add that while it is better to avoid rendering things the player can't actually see, it is not a zero-sum game. The "z-buffer" you keep talking about is not actually something that exists on the GPU. Anything you hand to the GPU will get rendered.

    The work of figuring out what should be rendered and what can't be seen (and therefore shouldn't be) is entirely on the CPU. It's also not very easy to figure out. I can only assume this z-buffer is a concept that exists in Unity as a way to take care of doing that work for you; I'm only familiar with implementing it myself. It's absolutely worth doing, but if you don't have a decent CPU you will still see a lot of pop-in and other side effects because of it.
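    A common form of that CPU-side work is frustum culling: before anything is submitted to the GPU, each object's bounding volume is tested against the six planes of the camera's view volume. A minimal sketch, with illustrative types and the convention that plane normals are normalized and point into the frustum:

        #include <array>

        struct Plane  { float nx, ny, nz, d; };   // nx*x + ny*y + nz*z + d = 0
        struct Sphere { float cx, cy, cz, r; };   // an object's bounding volume

        // Signed distance from the sphere's center to a (normalized) plane.
        float signedDistance(const Plane& p, const Sphere& s) {
            return p.nx * s.cx + p.ny * s.cy + p.nz * s.cz + p.d;
        }

        // Cull an object only when its bounding sphere lies entirely outside
        // at least one of the six frustum planes.
        bool isVisible(const std::array<Plane, 6>& frustum, const Sphere& bounds) {
            for (const Plane& p : frustum)
                if (signedDistance(p, bounds) < -bounds.r)
                    return false;   // fully outside this plane: skip it
            return true;            // inside or straddling every plane: draw it
        }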
  • 0x6A7232 US Join Date: 2016-10-06 Member: 222906 Members
    More info, for those interested (with maths!):

    https://en.wikipedia.org/wiki/Z-buffering
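
    The core idea fits in a few lines: one depth value per pixel, and a fragment only lands on screen if it's closer than whatever was drawn there before. A software sketch of the algorithm from that article (not how a GPU literally implements it):

        #include <cstdint>
        #include <limits>
        #include <vector>

        struct Framebuffer {
            int width, height;
            std::vector<uint32_t> color;   // packed RGBA per pixel
            std::vector<float>    depth;   // the z-buffer: nearest z seen per pixel

            Framebuffer(int w, int h)
                : width(w), height(h), color(w * h, 0),
                  depth(w * h, std::numeric_limits<float>::infinity()) {}

            // Called for every fragment (candidate pixel) of every triangle.
            void writeFragment(int x, int y, float z, uint32_t rgba) {
                const int i = y * width + x;
                if (z < depth[i]) {   // closer than anything drawn here so far?
                    depth[i] = z;
                    color[i] = rgba;
                }                     // farther fragments are simply discarded,
                                      // so draw order doesn't matter
            }
        };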
  • ShuttleBug USA Join Date: 2017-03-15 Member: 228943 Members
    Sooo much knowledge O_o
  • pie1055 Join Date: 2016-12-05 Member: 224603 Members
    Sounds related to seeing objects inside your base while inside your Cyclops, unless that got fixed in the last couple of months.
  • CaptainFearless CO, US Join Date: 2016-12-14 Member: 224941 Members
    No, in the stable build that still happens to me.
  • scifiwriterguy Sector ZZ-9-Plural Z-α Join Date: 2017-02-14 Member: 227901 Members
    cdaragorn wrote: »
    @scifiwriterguy a very good read. I just want to add that while it is better to avoid rendering things the player can't actually see, it is not a zero-sum game. The "z-buffer" you keep talking about is not actually something that exists on the GPU. Anything you hand to the GPU will get rendered.

    The work of figuring out what should be rendered and what can't be seen (and therefore shouldn't be) is entirely on the CPU. It's also not very easy to figure out. I can only assume this z-buffer is a concept that exists in Unity as a way to take care of doing that work for you; I'm only familiar with implementing it myself. It's absolutely worth doing, but if you don't have a decent CPU you will still see a lot of pop-in and other side effects because of it.

    Correct to an extent. :) You're absolutely right that it's not a zero-sum process, and maybe I could've been clearer on that. It's about resource budgeting, not getting something for free.

    Z-buffering isn't a Unity thing; it's been a 3D graphics thing since the pre-Doom era. Also, the z-buffer is now frequently part of the GPU as a subprocess.

    You doubtless already know this, but for everybody else: the GPU isn't a monolithic object; it's effectively a mini-motherboard with a dedicated purpose. A GPU runs multiple simultaneous pipelines, allowing incredible power and flexibility - more so than the single-channel days of flat monochrome graphics. Back when a computer had a few megabytes of system memory (or less) and either very, very little or no dedicated video memory, everything had to be handled by the CPU with the exception of actually rendering (drawing) objects. With the advent of GPUs with massive amounts of cache memory, the dynamic has changed. It had to; the demands were skyrocketing.

    Different engines handle the process in different ways, but it's become commonplace for z-buffering to be an engine runtime process that's GPU-resident. (Back when you had so little video memory you had to dole it out with an eyedropper, z-buffs lived entirely in the CPU and system RAM - these days, well, devs are getting sloppy.) It still burns up a few compute cycles and some cache memory, but nothing like what running it through the renderer (also a GPU process) would do. OpenGL was one of the first systems to shove the z-buff over to the GPU, and most design principles have fallen in line over the years since. The problem with a CPU-resident z-buff is transit time; to get from a CPU z-buff to the GPU renderer, the data has to cross the system bus, and that creates a bottleneck. For games that want 40+ frames/sec with no drops, the z-buff had to move; the mobo is just too dang slow. Many GPU cards now actually include memory specifically set aside for z-buffering to improve performance - and push up those all-important benchmark scores.
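
    For what that looks like from the programmer's side, here's roughly the classic desktop OpenGL setup: the application just toggles state, and the driver keeps the depth buffer in video memory next to the framebuffer. (The header path varies by platform; this assumes plain desktop GL.)

        #include <GL/gl.h>   // classic desktop GL header; location varies by platform

        void setupDepthTesting() {
            glEnable(GL_DEPTH_TEST);   // let the GPU reject occluded fragments itself
            glDepthFunc(GL_LESS);      // keep a fragment only if it's nearer
        }

        void beginFrame() {
            // Clear color AND depth every frame; stale depths from the last
            // frame would otherwise occlude the new one.
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            // ...draw calls go here; per-pixel depth comparisons then happen
            // on the GPU with no round trip over the system bus...
        }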

    Of course, if the GPU is too slow or - more commonly - video memory is too tight, then the z-buff and its cache get shoved over to the system board. That's when pop-ins become absolutely insane; you're seeing the actual lag time of data being sent from the system board to the GPU and then running through the renderer. When all that data is on the video card to start with, pop-ins only happen with bad programming or engine faults. When the z-buff lives on the CPU and its cache - or worse - though...enjoy the pop-ins. ;)

    (Yes, I still have flashbacks from OpenGL programming trauma. Manual configuration of memory, buffers, and drawing by equation will do that to you. I hated OpenGL. Still do, truth be told.)

    In the end, it comes down to a simple-sounding but actually complex question that determines where the z-buffer is going to live: who's doing less work? If the GPU is overloaded, the z-buffer goes to the CPU, because the CPU has resources available. Otherwise, it gets "elevated" to the GPU, particularly if the CPU is too busy running the calculations needed to make the game itself work and is running harder than the GPU.

    That said, you're still right; some engines still run the z-buffer only as a CPU-resident process thread, but that's most likely going to go completely extinct before too long. :) Demands for 60+ frames/sec and insanely detailed game worlds are making that system bus a real pain in development. Back when you were only shoveling 65kB texture files and 100kB sprites around, eh, you could make it work without a huge headache. Doom did it, and did it pretty well. (Very well for the era.) But these days, with multi-megabyte textures and ultra-high polygon count objects, it's just too many vertices and skin graphics to be flinging back and forth over the comparatively slow system bus.
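
    To put rough numbers on that (asset counts and sizes assumed just for illustration):

        #include <cstdio>

        int main() {
            const double fps    = 60.0;
            const double oldTex = 64e3;                    // ~64 kB, a Doom-era asset
            const double newTex = 4096.0 * 4096.0 * 4.0;   // ~67 MB, uncompressed 4K RGBA

            // Cost of re-uploading ten such textures every single frame:
            std::printf("old: %.1f MB/s\n", 10 * oldTex * fps / 1e6);   // ~38 MB/s: trivial
            std::printf("new: %.1f GB/s\n", 10 * newTex * fps / 1e9);   // ~40 GB/s: beyond even
                                        // a PCIe 3.0 x16 link (~16 GB/s), so keep it in VRAM
        }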