Get The Latest Compile Tools


Comments

  • Cagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829 Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    QUOTE (Edgecrusher @ Sep 3 2003, 03:32 PM)
    meh i downloaded the p10 tools and tried to use them in hammer (beta 3.5).

    this is what i get when i try to compile


    ** Executing...
    ** Command: Change Directory
    ** Parameters: C:\SIERRA\Half-Life


    ** Executing...
    ** Command: Copy File
    ** Parameters: "C:\SIERRA\Half-Life\Ns\maps\ns_proxima6.map" "C:\SIERRA\Half-Life\ns\maps\ns_proxima6.map"


    ** Executing...
    ** Command: C:\DOCUME~1\NEWMIC~1\MYDOCU~1\MAPPIN~1\CAGEYS~1\hlcsg.exe
    ** Parameters: "C:\SIERRA\Half-Life\ns\maps\ns_proxima6"


    ** Executing...
    ** Command: C:\DOCUME~1\NEWMIC~1\MYDOCU~1\MAPPIN~1\CAGEYS~1\hlbsp.exe
    ** Parameters: "C:\SIERRA\Half-Life\ns\maps\ns_proxima6"


    ** Executing...
    ** Command: C:\DOCUME~1\NEWMIC~1\MYDOCU~1\MAPPIN~1\CAGEYS~1\hlvis.exe
    ** Parameters: "C:\SIERRA\Half-Life\ns\maps\ns_proxima6"


    ** Executing...
    ** Command: C:\DOCUME~1\NEWMIC~1\MYDOCU~1\MAPPIN~1\CAGEYS~1\hlrad.exe
    ** Parameters: "C:\SIERRA\Half-Life\ns\maps\ns_proxima6"

    it doesn't actually seem to bother itself about compiling... any ideas what's wrong?

    Download the dlls in the first post and add them to the directory where you've placed the p10 tools.
    XP-Cagey

    Recommended Reading: NS Mapping Guidelines | Mapping Forum FAQ
    Nostalgia: Power Cells Thread
  • Olmy Join Date: 2003-05-08 Member: 16142 Members, NS1 Playtester, Contributor, NS2 Developer, NS2 Map Tester, Reinforced - Diamond Posts: 1,444 Advanced user
    edited September 2003
    thanks cagey
  • Zazi Join Date: 2002-05-26 Member: 672 Members, NS1 Playtester, Contributor Posts: 491
    Here's an interesting error I procured last night while compiling Widow...

    I'm using Cagey's p10 build of tools, along with that -subdivide tool provided by LumpN. Anyway, whenever I compile, I get an error like, "Bad surface extents, face 7" followed by the coordinates of a few brushes. A few things to note:

    -Bad surface extents are caused by stretching/scaling a texture too much. No texture in Widow has been scaled beyond 2.5. The texture in question (I'll get to 'texture' later) is not scaled at all.
    -It only lists one texture as the culprit, and that had never changed through various test compiles to see if it'd work. There is nothing wrong with the texture. It is a perfect 128x128 texture.

    This happens whenever I have the -subdivide flag set, no matter the value. Presently, I have it at
    -subdivide 512 and it does not work. According to the tools, and Cagey himself, 512 is the maximum size for -subdivide.

    If anyone has any sort of advice on this matter, please let me know as I will love you forever if I can get this fixed.
  • Cagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829 Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    QUOTE (Zazi @ Sep 17 2003, 12:10 PM)
    According to the tools, and Cagey himself, 512 is the maximum size for -subdivide.

    512 is pegged as the maximum in the tools, but the practical maximum for lit faces appears to be 256 * scaled texture units on the face, so values over 256 don't appear to be very useful.

    Posting the error here for reference:
    CODE


    BuildFacelights:
    Error:
    for Face 7 (texture nos_port_wall22) at
    (1456.000 -1728.000 -384.000) (1456.000 -1728.000 -480.000) (1168.000 -1728.000 -480.000) (1168.000 -1728.000 -384.000)
    Error: Bad surface extents (17 x 6)
    Check the file ZHLTProblems.html for a detailed explanation of this problem



    Bad texture scaling is one way to arrive at this error, but the error itself is caused by having too many lighting patches along the face in a single direction. A face can support a maximum of 16 patches in either direction, and the face in question has been divided up into a 17 x 6 patch square, triggering the error.

    Since patches are 16*texture scale units wide by default, 256 is the largest face size that can support unscaled textures when the texture is aligned and rotated to match the face. If you're going to use subdivide 512, I think you'll have to scale up your textures to 2x normal for this section of code to work properly. It doesn't appear to care what patch scale settings are set when it makes this check; the inconsistency could probably be labeled a bug.

    LumpN's tool is designed to turn slightly misaligned plane information output from Hammer (with nonzero components under the magnitude of the tools' error tolerance) into axial planes by zeroing out tiny coordinate values. This can mean the difference between a face that is exactly 256 units wide and one that is a minuscule amount over 256 units when -subdivide 256 is selected, triggering the error.

    Here is exactly what's going on:

    First, RAD is calculating the max and min S,T coordinate pairs for each point in the face's winding. The texture vectors have already been scaled and rotated into place.
    CODE

           for (j = 0; j < 2; j++)
           {
               val = v->point[0] * tex->vecs[j][0] +
                   v->point[1] * tex->vecs[j][1] + v->point[2] * tex->vecs[j][2] + tex->vecs[j][3];
               if (val < mins[j])
               {
                   mins[j] = val;
               }
               if (val > maxs[j])
               {
                   maxs[j] = val;
               }
           }


    Then, it's dividing by 16 to get the relative patch locations - texsize becomes the difference in the number of patches.
    CODE

       for (i = 0; i < 2; i++)
       {
           l->exactmins[i] = mins[i];
           l->exactmaxs[i] = maxs[i];

           mins[i] = floor(mins[i] / 16.0);
           maxs[i] = ceil(maxs[i] / 16.0);

           l->texmins[i] = mins[i];
           l->texsize[i] = maxs[i] - mins[i];
       }


    If the face is to be lit, and the number of patches in either direction is > 16, it throws the error you're getting:
    CODE

    if (!(tex->flags & TEX_SPECIAL))
    {
        if ((l->texsize[0] > 16) || (l->texsize[1] > 16))
        {
            <snip>
            Error( "Bad surface extents (%d x %d)\nCheck the file ZHLTProblems.html for a detailed explanation of this problem", l->texsize[0], l->texsize[1]);
        }
    }


    The face above is 288 units wide, so the 17 patch width count is accurate assuming that the texture is aligned to the face (or world since the face is an axial wall) -- you'll need to either up the scale or drop the subdivide to 256. This section of code needs to be reworked to be aware of the texscale patch options, but I'm not going to hack it at this point; I'll have a better solution in the new tools.

    If you're still seeing the error with a subdivide of 256 on normally scaled faces, double check to see that the texture is aligned and rotated to match the face -- you can use the formula above to double check the exact value that is causing the error.
  • anty Join Date: 2003-02-05 Member: 13143 Members Posts: 87
    Great job XP-Cagey!
    I've used your tools and I like them. But I need one thing:
    There is a zhlt_noclip key, but it only works with brush-based entities. Could you please add support for point-based entities?
    It would help me a lot
    anty.info - Webdevelopment Blog
  • Kage Join Date: 2002-11-01 Member: 2016 Members Posts: 410
    Point based entities don't clip.
  • Cagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829 Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    QUOTE (anty @ Sep 19 2003, 07:22 AM)
    Great job XP-Cagey!
    I've used your tools and I like them. But I need one thing:
    There is a zhlt_noclip key, but it only works with brush-based entities. Could you please add support for point-based entities?
    It would help me a lot

    If you're being blocked by a point entity in game, it's a function of the mod's code--like Kage said, point entities don't have intrinsic clip information stored in the BSP. Instead of a change in the tools, you'll need to petition the mod designers for an additional flag on the entity to turn off any imitation clip code when the entity loads.

    Out of curiosity, what point entity for what mod is causing the problem?
  • anty Join Date: 2003-02-05 Member: 13143 Members Posts: 87
    I have the problem that I can't add a model in NS without getting blocked ingame...
  • Cagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829 Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    QUOTE (anty @ Sep 19 2003, 03:24 PM)
    I have the problem that I can't add a model in NS without getting blocked ingame...

    Which point entity are you using to represent the model? env_sprite? monster_furniture?
  • anty Join Date: 2003-02-05 Member: 13143 Members Posts: 87
    I'm using cycler, because monster_furniture doesn't exist in 2.0 anymore, and env_sprite will crash VHE because it wants to show you the model as a sprite. Also I think you mean cycler_sprite, which also crashes the editor...
    The only way would be to copy the mdl into another folder and move it back after mapping to compile. But I think there must be a better method...
  • Ollj our themepark-stalking nightmare Fade Join Date: 2002-12-12 Member: 10696 Members Posts: 3,844
    I guess I hit the max map planes of the p10 compile tools.
    Most planes are in the void.
    Any chance to increase it further?
    "What are we going to do tonight, Brain?"
    "The same thing we do every night, Pinky, trying to launch Steam."

  • Cagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829 Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    QUOTE (Ollj @ Sep 20 2003, 04:15 PM)
    I guess I hit the max map planes of the p10 compile tools.
    Most planes are in the void.
    Any chance to increase it further?

    HLCSG uses the BSP format to pass plane information to HLBSP, which is the first point where the tools know what's in the void. Since the BSP format stores plane numbers using 16 bits, 2^16 or 64K is the highest number of planes that the BSP format can theoretically store. The only way to increase the count would be to change the method HLCSG uses to report plane information to HLBSP, which is more effort than I want to spend on the old tools. The new tools don't use BSP format for intermediate steps, so they don't have this problem.
  • Reve Join Date: 2003-09-23 Member: 21142 Members Posts: 64

    Cagey, have been following this post for quite a while but finally registered to actually post something smile.gif

    Firstly, thanks for some very cool tools. I've been using them for compiling Counter-Strike maps (and not particularly taxing ones at that), but I've been finding some of the improvements since custom 17 (and in Merl's tools too) useful. I find opt_plns a very nice tool (although I can't for the life of me work out how to make it work correctly using QuArK's compile system; I have to use it independently).

    Any progress being made recently on the tools? I'm quite interested in your plans for integrating opt_plns functionality into the main compile tools (frankly it seems like it's patching a bug produced in maps by the original compile tools being messy but this being sorted is obviously a good thing).

    How about an update to your homepage -- a sexier design and a page on these compile tools?

    Keep up the good work!

    Reve
  • CageyCagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    QUOTE (Reve @ Sep 23 2003, 02:48 PM)
    Any progress being made recently on the tools? I'm quite interested in your plans for integrating opt_plns functionality into the main compile tools (frankly it seems like it's patching a bug produced in maps by the original compile tools being messy but this being sorted is obviously a good thing).

    p12 needs to be tested before release, but it's about ready to go -- the "SDF::4" should-have-been-a-developer message mentioned a few posts ago is gone, I commented out the accidental inclusion of experimental HLRAD code that p11 uses (so Kage's error report should be taken care of), and I'm shifting default MAX_MAP_LIGHTING back to 6MB.

    I'm actually spending most of my time on the replacement toolset--I've made some significant progress with it, but it's not ready for testing yet. The new tools are actually better than the "p" series when it comes to allowing planes in a map--there isn't a cap on the total number as long as the final count after opt_plns style optimization is under 32K, because I'm not passing a BSP file from stage to stage.


    QUOTE
    How about an update to your homepage -- a sexier design and a page on these compile tools?


    I've spent some time on this in the last few days--a new homepage will hopefully be up sometime in the next two weeks.
  • Reve Join Date: 2003-09-23 Member: 21142 Members Posts: 64
    Cool, I'm looking forward to it -- it sounds exciting

    Do you know if the p series have been used to compile any of the official NS maps?

    Once you get the new tools to beta stage, you ought to release them as XPCHLT or XHLT (hrm, that sounds better) with a version numbering of 3.0.0 or 2.6.0 to properly differentiate them from the older series of tools (am I correct in thinking that you won't be able to mix some tools from the custom builds/original zhlt with the new tools because you no longer use BSP for the intermediate compile steps? In this case, perhaps 3.0.0 numbering is the best bet).

    I noticed that you've said that you've tried contacting Merl about getting these improvements included in the custom builds but not had a response. Perhaps once the new tools get stable enough, you could try and get them made the official ones (as Merl doesn't seem to be doing anything any more). You said somewhere that you wanted to use the new tools for your CV -- well, I don't see a better way than having them made the official tools. You can keep a p series of the new tools going on this forum for those who want to play with the cutting edge (and help report bugs) and keep the stable ones (with bugfixes) as the official.

    Anyway, keep up the good work! I'm sure I'm not the only one who is eagerly awaiting the new tools you're writing.

    Full documentation bundled in the zip would also be nice

    Reve

    ps: In the first post, you have the link to the tools as the p10 ones, but incorrectly also have the link to the p11 ones in red nearer the end of the post. Might want to change that
  • NerdIII Join Date: 2003-04-05 Member: 15230 Members Posts: 822
    Cagey, I don't code in C++, but I guess it is not possible to load the dlls dynamically to be able to give out a message if they don't exist, is it?

    Reve, including opt_plns in QuArK is the same as with the other compile tools: just add it as a fifth compile tool and then activate it for all compile modes (Half-Life -> customise menu...)
    All you ever wanted to know about -texchop & -subdivide:
    Texture dimensions vs. appearance, memory, and r_speeds
  • Cagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829 Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    edited September 2003
    QUOTE (NerdIII @ Sep 24 2003, 12:05 AM)
    Cagey, I don't code in C++, but I guess it is not possible to load the dlls dynamically to be able to give out a message if they don't exist, is it?

    Well, this appears to be one of those forehead-slapping moments: I learned tonight that using /MT with the compiler would have killed the issue outright--it appears that it's possible to distribute without the dll requirement after all, but when I tried to figure that out several months back I was looking in the wrong properties page of the Visual Studio project dialog.

    Unfortunately, the static runtime versions of the tools have a large enough increase in exe file sizes that I'd be better off redistributing the runtime dlls anyway (Microsoft has given permission to do this):

    HLCSG: 92K -> 268K
    HLBSP: 64K -> 124K
    HLRAD: 108K -> 192K
    HLVIS: 48K -> 104K
    RIPENT: 24K -> 76K

    Total increase: 428K

    I haven't wanted to include the 459K pair of DLLs (multithreaded runtime for .Net and Microsoft's C++ standard library implementation for .Net) within the zip file since I didn't want the bandwidth hit, but since MS is giving a green light to distribute and the size is comparable, I'll probably begin offering two download links with p12 - an "upgrade from p series" link without the dlls present, and a "full" link that includes the dlls embedded in the zip so that people who didn't get the tools from this thread will be able to use them immediately.

    To answer the original question -- if I compile using the settings I've been using, the multithreaded runtime library is actually being loaded on demand, and the crash is happening the first time I call a function that requires something over the DLL boundary. Microsoft doesn't publish a list of safe functions, but it appears that the header code doesn't require anything, so if Win32's LoadLibrary function doesn't itself require the runtime DLL (and it really shouldn't), it might be possible to exit with a genuine error message before the program attempts to load the runtimes.

    I'm also about to wipe my programming box and upgrade to Windows XP Pro / .Net 2003 from Windows 2000 Pro / .Net (original) -- we'll have to see if 2003 has yet another version of the runtimes. I plan on asking a few people if they'd be willing to test compile using p12 in the next few days to see if my release candidate is ready for publication--it'll be the last version compiled under the original MSVC.Net.

    EDIT:

    QUOTE
    ps: In the first post, you have the link to the tools as the p10 ones, but incorrectly also have the link to the p11 ones in red nearer the end of the post. Might want to change that


    Thanks Reve, fixed it
  • Reve Join Date: 2003-09-23 Member: 21142 Members Posts: 64

    Nerd, I am familiar with the QuArK customise menu and tried adding it that way, but I couldn't get it to work. Have you tried this and got it to work? What were the additional parameters you used (the full list of what you used in each of the boxes)? What version of QuArK?

    Cheers

    Reve
  • watch_me_die Join Date: 2002-11-10 Member: 8107 Members Posts: 562
    Reve, if you use Batch Compiler to compile, there are already presets for it that include all of the latest tools

    I'm unsure of the method that QuArK uses for compiling, but if it continues to run in the background like Hammer does, this would be faster as well
  • Reve Join Date: 2003-09-23 Member: 21142 Members Posts: 64

    I know Batch Compiler does support the new tools, but I would like to use QuArK as it means I can just compile straight from the editor. You can't just stick the path to opt_plns as the last compile tool, as it seems to try running it on the .map file (if I remember correctly) rather than the bsp. Duh

    Perhaps opt_plns could be designed such that if it detects a .map file being passed to it rather than a .bsp, it realises this and processes the named bsp instead. Perhaps you could add a switch to it such as -map or -acceptmap or something so it doesn't have a problem with this. Any chance?

    Perhaps I'll try talking to the QuArK guys about it too, see if I can get any joy from them.

    Although of course, once the new tools are out, this won't matter any more smile.gif

    Reve
  • Reve Join Date: 2003-09-23 Member: 21142 Members Posts: 64

    OK, I hope this isn't considered off-topic (I've only just got into writing on gaming forums).

    I have a map for a mod (in this case CS, but it could just as easily be NS). I want to compile using custom textures not included with Half-Life or the mod. I can use wadinclude to do this, but if I also use a texture which IS included in the mod but NOT included in Half-Life itself, it seems that I must wadinclude that texture too

    So my question is, do the compile tools support excluding specific wad files and/or all mod folders (ie all the wad files in that mod folder) from being wadincluded but still allow the map to be compiled?

    Also, if nowadtextures is used, or wadinclude zhlt.wad (ie trying to include the clip, null, bevel etc textures in the bsp), does the compiler automatically stop these being included in the bsp? It ought to really.

    Reve
  • Kage Join Date: 2002-11-01 Member: 2016 Members Posts: 410
    Just use -wadinclude on the .wads that you want to be included (the ones that don't come with ns). Avoid using -nowadtextures, as it includes all of the textures in all of the wads that you are using.

    You really should use either Batch Compiler or make your own batch files. Mapping programs use up a load of RAM, and if that RAM were free, it could help speed up the compile process.
  • NerdIII Join Date: 2003-04-05 Member: 15230 Members Posts: 822
    Reve, I don't want to go too much off topic, but before someone says to him/herself "Good thing I'm using Hammer...", I will post it here.
    If you don't specify 'fixed command-line arguments', QuArK will automagically add 'maps\mapname.map' to the command line so the tools know what to do. Just hover over the edit field and a help text will appear that tells you how to add the bsp file
  • anty Join Date: 2003-02-05 Member: 13143 Members Posts: 87
    I just found a possible bug in the compiler tools: when you use hlrad with the -extra flag, you still get the max patch error, but if you don't use it, everything works. Don't know if this is a bug...
  • Kage Join Date: 2002-11-01 Member: 2016 Members Posts: 410
    -extra automatically divides the texture chop value by two at runtime.

    I got this from Batch Compiler's description.
  • anty Join Date: 2003-02-05 Member: 13143 Members Posts: 87
    that would "reset" the increased limit again, so that would be a reason for that error
  • WolfWings NS_Nancy Resurrectionist Join Date: 2002-11-02 Member: 4416 Members Posts: 595
    Just a quick sporking of the thread, to see if there's any news from XP-Cagey about his new compile toolset? =^.^=

    Also, regarding that, I was wondering how efficiently it will handle 'opaque' entities, compared to the current toolset that, to be perfectly frank, hurls up all over the place and is about as speedy as a broken-down car when you enable opaque entities...
  • Cagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829 Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    edited October 2003
    QUOTE (WolfWings @ Oct 2 2003, 05:15 AM)
    Just a quick sporking of the thread, to see if there's any news from XP-Cagey about his new compile toolset? =^.^=

    Also, regarding that, I was wondering how efficiently it will handle 'opaque' entities, compared to the current toolset that, to be perfectly frank, hurls up all over the place and is about as speedy as a broken-down car when you enable opaque entities...

    Heh - this is the reason why I didn't commit to a Summer release date.

    I'm currently getting a website/forum set up for suggestions and early feedback -- once that's done I'm going to grab a small team of people from the mapping community so that I can start finalizing the featureset of each stage's initial implementation. I don't have $80 to pay IPB for SQL Server support, and PHPBB2 has some pretty serious limitations, so I'm rolling my own forum in ASP.Net and SQL Server, which has taken an extra week or so. The site will be up "soon".

    Re: opaque entities -- The current HLRAD compares every traceline to every opaque face in the potential zone of intersection individually--it's brute force and a very slow way to do things. If you convert items back and forth from opaque entities to worldspawn in the new toolset, there won't be any apparent difference in speed--and the shadows will be perfectly identical, too (unlike the current tools). For lighting purposes, opaque entities ARE solid walls in the worldspawn under the new tools. I haven't forgotten about toggled shadows--finding the best way to integrate them is part of my current to-do list.

    EDIT: from from -> forth from... must not be awake yet
  • WolfWings NS_Nancy Resurrectionist Join Date: 2002-11-02 Member: 4416 Members Posts: 595
    I figured it must be making some hellacious exponentially-expensive calculation for opaque entities, as Nancy is at the point that it's faster to compile with -chop 64 -texchop 32 -extra -noopaque than with -chop 256 -texchop 256 at this point. And I have every intention of having a -extra compile when Nancy rolls out the door to the masses. :-)

    Thanks for the update, Cagey, and good luck with the forums. I've seriously considered abusing LiveJournal as a developer-oriented forum before, simply because it has so much power for interlinked groupings of posts that multiple independent people can post, reply, and discuss. Yeah, I'd need to reword most of the text it displays first, but at the core I always see LJ more as a shared news and discussion forum than a 'Journal' in the classic sense.

    Still surprised 'opaque' entities weren't just pushed into a 'fake WorldSpawn' but I guess the biggest limitation was the simple fact that the existing HLT has to run entirely to and from a BSP file.
    user posted image
  • Cagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829 Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    QUOTE (WolfWings @ Oct 2 2003, 08:25 AM)
    Still surprised 'opaque' entities weren't just pushed into a 'fake WorldSpawn' but I guess the biggest limitation was the simple fact that the existing HLT has to run entirely to and from a BSP file.

    I think the primary concern would be properly intersecting the opaque entity vis hulls with the worldspawn when you're building the hybrid--I'm sure there are good BSP union algorithms out there, but I haven't found one yet--building the new hull from primitives seems the easiest approach at the moment. I'll be using the original brushes from the entities to build my temporary lighting BSP hull, which HLRAD can't currently access.

    I'm considering using the same temporary BSP hull method of entity inclusion with VIS to have VIS-blocking entities... They wouldn't cut the vis hull, so they'd be less efficient at blocking than actual walls (fewer nodes, meaning larger granularity for VIS control), but it'd still offer a significant improvement over what's available now.

    The current tools are built on straightforward functional (vs. OO) code, which makes reuse without an overhaul difficult. I'll basically be redoing the BSP assembly stage at least two extra times in the latter part of the compile with different sets of brushes, and that'd mean some extensive changes to the current system. Few things screw up a codebase faster than propagation of functionality via cut-and-paste, which is one of the main reasons I'm starting over with an eye on encapsulation.