Zhlt 3.0 Beta


Comments

  • BulletHeadBulletHead Join Date: 2004-07-22 Member: 30049Members Posts: 2,530
    edited October 2004
    BALEETED for being BAD ADVICE to follow on what the actual problem was (I must learn to read CAREFULLY)
    QUOTE (DragonMech @ Jul 9 2005, 09:19 PM)
    QUOTE (Sonic @ Jul 9 2005, 06:49 PM)
    I wish my butcheeks could propel me up flights of stairs are terrifying speeds.

    I sense a custom title right there... :D
  • AnpheusAnpheus Join Date: 2004-09-30 Member: 32013Members Posts: 63
    edited October 2004
    Excuse me, but are you bloody ****************ing insane? I censored that myself.


    Listen: Infinite loop = it will consume processor and virtual memory until there is nothing left. Nada! What happens when it runs out is beyond my current knowledge. Hell, that's if it doesn't crash.

    Giving it Realtime status would not give it 95% of my CPU usage, it would give it 100%. I would not be able to use my computer short of shutting it down and praying.




    Now, with that burst of anger at the very, very poor reasoning and very, very poor advice that NONE of you should EVER follow out of the way... Here is what I got after running it for an hour and five minutes of CPU time:


    BuildFacelights:
    10%...20%...30%...40%...50%...60%...70%...80%...90%... (210.56 seconds)
    BuildVisLeafs:
    10%...20%...30%...40%...50%...60%...70%...80%...90%... (1844.45 seconds)
    visibility matrix : 81.1 megs
    MakeScales:
    10%...20%...30%...40%...50%...60%...70%...80%...90%... (2184.77 seconds)
    SwapTransfers:
    10%...20%...30%...40%...50%...60%...70%...80%...90%... (2966.42 seconds)
    Transfer Lists : 267441920 : 267.44M transfers
    Indices : 69598060 : 66.37M bytes
    Data : 1069767680 : 1020.21M bytes
    Bounce 1 GatherLight:
    10%...20%...30%...40%...50%...60%...70%...80%...90%... (2045.59 seconds)
    Bounce 2 GatherLight:

    I killed the process because by then it had allocated 1.8GB of memory usage.

    Edit: Yes, I know the numbers do not add up to 1.8 GB, but that was the combined usage of all programs running at the time. I'd say... 1.6GB was wholly because of HLRad. And no, HLRad wasn't the only slow process... all the compile tools ran very slowly. But not nearly on the scale of HLRad.
  • BulletHeadBulletHead Join Date: 2004-07-22 Member: 30049Members Posts: 2,530
    edited October 2004
    ah, aha! Sorry, misread your original post

    I thought it was just taking forever to compile! I didn't realize you meant it kept looping

    How do you know for certain it is looping? Do you have a RAM recovery tool (MemTurbo or MaxMem)? Have you defragged in a while? (diskkeeper)? Have you run a hard drive cleanup (accessories/systemtools/diskcleanup)?

    If the first 2 are no, go to www.downloads.com and search for the program names I put in parentheses. If #3 is a no, then do it.

    It COULD help... especially the defragging. That will make it 300% easier for your computer to find the stuff it needs! I recommend DiskKeeper, as it has never given me a problem in the 2 or so years I've used it.

    Keeping your RAM free is important too... Windows XP Home ITSELF takes about 75 meg of RAM to load... XP Pro takes about 100. However, when it is done loading up, they both free... about... 75% of that back up... so XP Home is still keeping about 15 - 20 meg of RAM in a "void" per se, while XP Pro is keeping around 25 meg. This can be cleared up. MaxMem forces your PC to write any info kept in the RAM and virtual memory to the hard disk, thus freeing it up... however, it will require 100% of your CPU for a few moments to a few minutes to do this.

    Mem Turbo, on the other hand, just aids Windows in clearing, freeing, and defragging unused RAM. A handy tool, as it will continually monitor and adjust your RAM cache, as well as, when needed, perform intensive "sweeps" to clear up as much RAM as it can.


    Another good tool is CacheBooster... it comes pre-set with many different configurations for your PC's memory cache and CPU clock priorities based on what you set it to (to my knowledge... I have no way of proving that it actually does just that). I do know for certain this has helped me squeeze some extra speed from my PC, so it's worth a shot. Again, this is available at www.downloads.com


    Hope some of this helps!



    **EDIT- what is your RAM size? Your Virtual Memory? CPU speed? Front Side Bus? Clock Speed?

    Have you tried just letting it run for around 2 hours? I often have compile times well into 3 hours (I often just let it run overnight) but then again I'm only on a 2.6GHz P4 with only 512MB of DDR SDRAM... I have over 10GB of virtual memory allocated as the max, and 5GB as the minimum... so yeah... lol
  • FragBait0FragBait0 Join Date: 2003-12-16 Member: 24431Members Posts: 58
    I love the theory behind installing more programs to get rid of old ones and their mess.

    Doesn't Windows have that defrag thingy with it... I don't like the wasted CPU time on the stupid graphs myself so I run it on the command line

    Those memory programs have a tendency to have Windows simply clear out its cache of files - this will end up giving you drive flog. They use up a lot of memory and thus force Windows to do this. The programs themselves don't do jack ****.

    If RAD needs more RAM then Windows will clear files out of the cache and make some room - all by itself. Why push DLLs that RAD requires out of RAM only for them to be loaded up again when it is run?

    Sorry, but all these "system maintenance" tools are a load of crap - but that's just my opinion.

    Last is that the proper site is download.com - although trying downloads.com will redirect you to the download.com page
  • BulletHeadBulletHead Join Date: 2004-07-22 Member: 30049Members Posts: 2,530
    edited October 2004
    yeah, I've always known it as downloads.com, always worked for me

    Nah, the Windows built-in one takes FOREVER to run... and isn't very efficient. I have heard this from MANY of my dad's friends at work, most of whom are skilled at building their own PCs, one who is a beta tester for Microsoft, and one who is a skilled network technician / computer programmer. They all said that their preference was DiskKeeper, mostly because it's very fast and efficient.

    SpyBot is free as well as good, and it complements Mozilla Firefox VERY well... it will often catch incoming viruses / spyware / adware minutes before Norton or McAfee does. It will also provide experienced users easy access to the internals of their PC to work with tough problems

    Trust me, I love these programs... they are all very small (I think SpyBot is the largest, at around 12 or so meg... most of that is information updates, as he tries to update his virus / spyware definitions around once a week... yes, it's like, 2 people that work on this... and most is just one guy...)


    Also, yes, Windows WILL clear up RAM... eventually. It is VERY inefficient at doing it, as it goes from RAM, to virtual memory, to the paging file, and then finally to a permanent info slot on your hard disk... whereas MaxMem goes straight from RAM to the HD. Also, Mem Turbo simply helps Windows organize / track your RAM
  • AnpheusAnpheus Join Date: 2004-09-30 Member: 32013Members Posts: 63
    edited October 2004
    QUOTE (BulletHead @ Oct 29 2004, 11:34 PM)
    [...]
    **EDIT- what is your RAM size? Your Virtual Memory? CPU speed? Front Side Bus? Clock Speed?

    Have you tried just letting it run for around 2 hours? I often have compile times well into 3 hours (I often just let it run overnight) but then again I'm only on a 2.6GHz P4 with only 512MB of DDR SDRAM... I have over 10GB of virtual memory allocated as the max, and 5GB as the minimum... so yeah... lol

    No, it obviously isn't looping. It's simply stuck for such a stupendous amount of time that it literally runs out of virtual memory allocation ability (it just 'freezes' at this point) and then it will sit there for hours with 0-20% CPU usage (nothing else is taking the CPU... so I guess it's just performing simple calculations)

    I use MaxMem, and I had about 400MB free of 512 when I started it, and it very quickly consumes this memory.

    Defragged last week.

    RAM: 512
    Virtual Memory: Irrelevant; when Windows runs out of virtual memory it simply allocates more to the page file until it exceeds a certain threshold or runs out of hard drive space. There appears to be no maximum, as every time I compile the map it reaches some unpredictable amount between 1.3 and 1.8GB and then stops allocating more.
    CPU Speed: 1.3GHz (non-overclocked)
    Front Side Bus: Unaware.



    Please listen very carefully: It will compile if I leave it running. It isn't an infinite loop like I thought, this is evident by the log I posted. You are being very condescending and I ask that you please stop. I know how to run a computer, ok? Now let's be clear: your suggestions are neither helpful nor do they present any new information.

    QUOTE
    Sorry, but all these "system maintenance" tools are a load of crap - but that's just my opinion.


    Very, very incorrect. Windows comes with a very, very basic set of system maintenance tools. I would change to a different defragmentation program, such as Symantec's, if I had the money.


    Now, here's the thing: this huge compile time is entirely from a recent edit to my map. I have the map and I am waiting for a programmer here to profile it and figure out why it is getting 'stuck'. For example: compiling without the new addition takes, oh, 10-15 minutes total. Adding this part of the map multiplied compile time by, oh, seven times. And it multiplied the vismatrix and data by what appears to be over five times.

    It's ridiculous.


    Another Edit:

    Hey Amckern, I was thinking... since hlvis already calculated which leafs can see each other... why create a vismatrix that consists of every patch in the entire map when you can create a vismatrix for each leaf? If the leaf has no other leafs visible, and it contains no light sources, then there is, simply put, no reason to calculate radiosity in it. Likewise, if a light source in Leaf A exists, and Leaf A can see Leaf B, but neither can see Leaf C, there is no reason for them to share a vismatrix. Here are some examples:

    Leaf A can see Leaf B, and Leaf B can see Leaf C... but C and A cannot see each other. As a result, if there are zero bounces, a light in Leaf A will only illuminate Leaf A and Leaf B. If there is one bounce, it may illuminate Leaf C as well, because it is possible for light in Leaf B to bounce into Leaf C.

    Leaf A <-> Leaf B <-> Leaf C <-> Leaf D... C cannot see A, D cannot see A or B. If -bounce 2 is used, light from Leaf A can reach Leaf D.


    The point is this: if you make a matrix consisting of every single patch in your entire map, there is a very high probability that there is significant overlap or waste. An example is the Ready Room in most NS maps. The Ready Room is completely separate from the rest of the map most of the time. So any calculation that occurs in the Ready Room should consist only of radiosity calculations between leafs that connect to each other. Understand? There is no reason to create all the null data for patches that cannot view other patches because the leafs have already separated them. It's performing the same calculation twice. First HLVis splits the map into sections that can see each other, then HLRad checks whether or not a patch in Leaf A can see a patch in Leaf B... does it ever check whether or not Leaf A can see Leaf B? It's redundant to perform these calculations.
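    The leaf-culling idea above can be sketched as a cheap rejection pass. This is a hypothetical illustration, not ZHLT's actual code: the Patch struct, the PVS bitmask layout, and the function names are all invented here. A patch pair only needs a full transfer computed if the leaf-level PVS that HLVis already produced says their leafs can see each other.

```cpp
#include <cassert>
#include <cstdint>
#include <cstddef>
#include <vector>

// Illustrative only: a patch just records which BSP leaf it lives in.
struct Patch { int leaf; };

// pvs[a] is a bitset row: bit b set => leaf a can see leaf b.
bool leafVisible(const std::vector<std::vector<uint8_t>>& pvs, int a, int b)
{
    return (pvs[a][b >> 3] >> (b & 7)) & 1;
}

// Count the patch pairs that survive leaf-level culling; only these
// would need the expensive ray test / form-factor calculation.
size_t countTransfers(const std::vector<Patch>& patches,
                      const std::vector<std::vector<uint8_t>>& pvs)
{
    size_t n = 0;
    for (size_t i = 0; i < patches.size(); ++i)
        for (size_t j = i + 1; j < patches.size(); ++j)
            if (leafVisible(pvs, patches[i].leaf, patches[j].leaf))
                ++n;
    return n;
}
```

    In the Ready Room example, every patch pair spanning the two disconnected regions is rejected by the bitmask test alone, with no ray casting and no null entries stored in the vismatrix.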


    Edit AGAIN: With some edits to my map I was able to compile it in under an hour, but it's still ridiculously high for what little I added.

    But now I have a new problem:

    ** Executing...
    ** Command: F:\VALVEH~1\ZHLT\hlbsp.exe
    ** Parameters: "f:\steam\steamapps\layeredr31"

    hlbsp v2.5.3 rel Custom Build 1.7p15 (Jun 3 2004)
    Zoner's Half-Life Compilation Tools -- Custom Build
    Based on code modifications by Sean 'Zoner' Cavanaugh
    Based on Valve's version, modified with permission.
    Submit detailed bug reports to (webmaster@xp-cagey.com)

    BEGIN hlbsp

    Command line: F:\VALVEH~1\ZHLT\hlbsp.exe f:\steam\steamapps\layeredr31

    Current hlbsp Settings
    Name | Setting | Default


    threads [ 1 ] [ Varies ]
    verbose [ off ] [ off ]
    log [ on ] [ on ]
    developer [ 0 ] [ 0 ]
    chart [ off ] [ off ]
    estimate [ off ] [ off ]
    max texture memory [ 4194304 ] [ 4194304 ]
    priority [ Normal ] [ Normal ]

    noclip [ off ] [ off ]
    nofill [ off ] [ off ]
    noopt [ off ] [ off ]
    null tex. stripping [ on ] [ on ]
    notjunc [ off ] [ off ]
    subdivide size [ 240 ] [ 240 ] (Min 64) (Max 512)
    max node size [ 1024 ] [ 1024 ] (Min 64) (Max 8192)


    Error: ReadSurfs (line 3329): 16593 > g_numplanes



    END hlbsp



    Huh?


    HLCSG had no problems.


    Edit again: You know what, forget it. It works again. Don't ask me how.
    Post edited by Unknown User on
  • amckernamckern Join Date: 2003-03-03 Member: 14249Members, Constellation Posts: 970 Fully active user
    Hey alls

    It seems the major issue is my compiler

    If I compile the "error free" 1.7 code, I still get the same crashes

    So here we go, this is the latest build of the ZHLT 3.0 SDK (Source) http://ammahls.com/zhlt/zhlt30_beta4-src.rar (481kb)

    Changes Include
    New Rad Code by -Nem-
    Inclusion of Cagey tools (Count_ents / opt_plns)
    Readme file
    todo.txt
    No Frils!
  • FragBait0FragBait0 Join Date: 2003-12-16 Member: 24431Members Posts: 58
    Anpheus, that BSP problem you had there... could it have been caused by the CSG changes?
    If you changed the map and recompiled then we may not find out, but if you didn't then who knows.
    As for the system tools... well, you have your opinion and I have mine
    I digress.


    Amckern, I tried to build the beta 4 source...only got BSP, VIS and RAD to build.

    Now I guess it may be because I don't have the new CSG, but RAD got broken with a bad surface extents error.
    So I tried changing the subdivide in BSP (to 192) and suddenly RAD works just fine

    Of course, it's a bit nasty on the polycounts to use a subdivide of 192, and I'm sure the bad surface extents error message is not supposed to happen anyway

    I'm using VC++ .NET 2003 if it helps any.
    I also have VC++ 6 but it all opened in the newer one when I double-clicked, so there it is..
  • AnpheusAnpheus Join Date: 2004-09-30 Member: 32013Members Posts: 63
    I was using p15 with the exception of RAD, which was p13.
  • The_Real_NemThe_Real_Nem Join Date: 2002-12-16 Member: 10900Members Posts: 53
    QUOTE (amckern @ Oct 31 2004, 06:51 AM)
    Hey alls

    It seems the major issue is my compiler

    If I compile the "error free" 1.7 code, I still get the same crashes

    So here we go, this is the latest build of the ZHLT 3.0 SDK (Source) http://ammahls.com/zhlt/zhlt30_beta4-src.rar (481kb)

    Changes Include
    New Rad Code by -Nem-
    Inclusion of Cagey tools (Count_ents / opt_plns)
    Readme file
    todo.txt

    Does this include the p15 mathutil.cpp? (I'm not on my development computer right now.)

    BTW, you can call me Nem, not -Nem- (Nem was taken when I registered).

    Nem
  • amckernamckern Join Date: 2003-03-03 Member: 14249Members, Constellation Posts: 970 Fully active user
    yes this version has the p15 mathutil

    you can download the p10 mathutil from http://www.unknownworlds.com/forums/in...post&id=1296562 - this is the pre-"broken" rad code; the breakage started about p11

    Also, the original .sln files were made for .NET 2003 (Version = 7.10), but I edited them so they would open in .NET 2002 (Version = 7.0), as that's what I use; the .sln files would not make the code broken

    @Nem, are you Ryan Gregg? AKA of Batch Compile, GCF, Mapview, and Terrain Gen fame?

    Adam
    No Frils!
  • MendaspMendasp I touch maps in inappropriate places Valencia, Spain Join Date: 2002-07-05 Member: 884Members, NS1 Playtester, Contributor, Constellation, NS2 Playtester, Squad Five Gold, NS2 Map Tester, Reinforced - Shadow, WC 2013 - Shadow, Community Dev Team Posts: 4,175 Fully active user
    QUOTE (amckern @ Nov 1 2004, 04:02 AM)
    @Nem, are you Ryan Gregg? AKA of Batch Compile, GCF, Mapview, and Terrain Gen fame?

    Yes, it's him, just check his profile, or look at the threads he has started...
  • ReveReve Join Date: 2003-09-23 Member: 21142Members Posts: 64
    edited November 2004
    amckern - your todo.txt lists "Fix light_env bug" - I have that marked down as fixed in beta 1 on the docs site, is that incorrect?
  • amckernamckern Join Date: 2003-03-03 Member: 14249Members, Constellation Posts: 970 Fully active user
    it's in, it's out

    it's part of why rad was crashing, but now I don't know why it has a crash

    CAN SOMEONE PLEASE COMPILE THE TOOLS AND SEND ME A ZIP OF THE BINARIES (The .sln files default to building Release)
    No Frils!
  • The_Real_NemThe_Real_Nem Join Date: 2002-12-16 Member: 10900Members Posts: 53
    I would but the solution files are now for Visual Studio .NET 2005? Why is this? Older versions are backwards compatible, not forwards compatible.

    Nem
  • amckernamckern Join Date: 2003-03-03 Member: 14249Members, Constellation Posts: 970 Fully active user
    edited November 2004
    strange

    that download SHOULD WORK for 2002 and above

    I might as well just copy the .sln files back across from the source code zip, after I do a reinstall of 2002

    SORRY, I just thought that downsizing the number in the Version = xx section of the .sln files did the job; looks like it has not

    anyway, I found the rad crash bug

    CODE
    {
        unsigned        x;
        vec3_t          point;
        const dplane_t* plane;
        const Winding*  winding;
        int             returnVal = -1;
        int             facenum;
        vec_t*          endpoint;
        bool            blocked = false;

        *endpoint = *p2;
    }


    mathutil.cpp

    Line 190

    warning C4700: local variable 'endpoint' used without having been initialized
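    The warning is the whole bug: endpoint is a pointer that is never pointed at any storage, so *endpoint = *p2 writes through garbage (undefined behavior, hence the crash) and would in any case copy only the first component of p2. A minimal fix sketch, under the assumption that the intent is to copy the point p2 into a local - the helper name is invented, and vec_t/vec3_t follow Half-Life's usual float typedefs:

```cpp
#include <cassert>

// Half-Life's conventional math typedefs (assumed here).
typedef float vec_t;
typedef vec_t vec3_t[3];

// Give the endpoint real storage and copy all three components,
// instead of dereferencing an uninitialized vec_t*.
void copyEndpoint(const vec3_t p2, vec3_t endpoint)
{
    endpoint[0] = p2[0];
    endpoint[1] = p2[1];
    endpoint[2] = p2[2];
}
```

    In the original function, that means replacing `vec_t* endpoint;` with a `vec3_t endpoint;` local (or passing in a caller-owned buffer) before anything is stored through it.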

    This error is past my know-how, and I'm also getting sick of code, code, code, all day at TAFE, all night at home, but it's almost done, almost

    @Nem - I have also run through all my files and updated your name to just plain old Nem - and also added your web address to the docs.doc file
    No Frils!
  • MendaspMendasp I touch maps in inappropriate places Valencia, Spain Join Date: 2002-07-05 Member: 884Members, NS1 Playtester, Contributor, Constellation, NS2 Playtester, Squad Five Gold, NS2 Map Tester, Reinforced - Shadow, WC 2013 - Shadow, Community Dev Team Posts: 4,175 Fully active user
    so the new version will have a not-crashing RAD? \o/
  • amckernamckern Join Date: 2003-03-03 Member: 14249Members, Constellation Posts: 970 Fully active user
  • FragBait0FragBait0 Join Date: 2003-12-16 Member: 24431Members Posts: 58
    I could only open them in .NET 2k3 after changing the version numbers in the .vcproj files.
    Meh.

    Amckern, I got CSG to build - some MAX_VALUE thingy was causing it to die. Put a #ifndef there and it went away... *shrug*
    Ah, statreg.h, line 120 or so.
    Anyway you can get the CSG, BSP, VIS and RAD binaries I built here:
    http://users.tpg.com.au/dplevi/zhlt3_beta4...d_fragbait0.rar
    As the link suggests, there are some serious issues.
    CSG is reporting a huge amount of brush errors - beta 2 says none.
    BSP gets stuck in an eternal loop... well, I left it for about 5 minutes... after the first hull but before the second starts... according to the log
    I didn't bother with VIS or RAD right now.

    Netvis and ripent wouldn't compile and I CBFed looking at it.
  • The_Real_NemThe_Real_Nem Join Date: 2002-12-16 Member: 10900Members Posts: 53
    Thanks amckern, it seems what was screwing it up for me was that netvis.vcproj was still 8.0 (everything else was 7.0).

    Nem
  • amckernamckern Join Date: 2003-03-03 Member: 14249Members, Constellation Posts: 970 Fully active user
    fragbait WTH?

    Please explain that with more thought to your words

    And yes, netvis is going to be tossed back to 1.7, because for some reason I can't find, nor can anyone at that, the original files that netvis was built with (and get them to work)

    if you go to the ERC's MHLT page, you can find the link in the text at the top of the page
    No Frils!
  • FragBait0FragBait0 Join Date: 2003-12-16 Member: 24431Members Posts: 58
    I thought it was pretty clear.

    CSG wouldn't build until I added a #ifndef around the MAX_VALUE constant on line 120 or so of statreg.h
    CSG is telling me I have a heap of brush errors despite beta 2 reporting none.
    BSP seems to eternally loop... okay, it prints out all the info for SolidBSP [hull 0] but hangs before the "BSP generation successful, writing portal file" message comes up.
    VIS seems fine.
    RAD prints out weird messages about faces being too big.

    Netvis and ripent would not compile.

    And the files I did get compiled are available here:
    http://users.tpg.com.au/dplevi/zhlt3_beta4...d_fragbait0.rar

    Why is that hard to follow?
  • AJenboAJenbo Join Date: 2004-06-14 Member: 29298Members Posts: 30
    sorry that I have been away for so long, but the forum stopped mailing me when there were updates

    to make the code compile under 2005, run the auto-converter wizard (it will auto-launch) and then you have to comment some parts of the code out

    part one is in cmdlib.cpp
    QUOTE
    char* ptr = strrchr(path,'.');
    if(ptr == 0)
    { *extension_position = -1; }
    else
    { *extension_position = ptr - path; }

    ptr = max(strrchr(path,'/'),strrchr(path,'\\'));
    if(ptr == 0)
    { *directory_position = -1; }
    else
    {
      *directory_position = ptr - path;
      if(*directory_position > *extension_position)
      { *extension_position = -1; }
     
      //cover the case where we were passed a directory - get 2nd-to-last slash
      if(*directory_position == strlen(path) - 1)
      {
        do
        {
          --(*directory_position);
        }
        while(*directory_position > -1 && path[*directory_position] != '/' && path[*directory_position] != '\\');
      }
    }


    next open filelib.cpp and comment out
    QUOTE
    time(&start);

    and
    QUOTE
    time(&end);


    that should be it.

    if you want, I can zip up the beta 3 code and you can have a look at it.
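    An alternative to commenting that cmdlib.cpp block out is making the one non-portable line compile everywhere: `max()` applied to two `char*` results is what newer compilers reject. A sketch of the same "find the last path separator" logic with an explicit comparison instead - the function name and return convention here are invented, not ZHLT's exact code:

```cpp
#include <cassert>
#include <cstring>

// Return the index of the last '/' or '\\' in path, or -1 if none.
// When both separators occur, both pointers point into the same string,
// so comparing them directly is well-defined.
int lastSeparator(const char* path)
{
    const char* s1 = strrchr(path, '/');
    const char* s2 = strrchr(path, '\\');
    const char* ptr = s2 ? ((s1 && s1 > s2) ? s1 : s2) : s1;
    return ptr ? (int)(ptr - path) : -1;
}
```

    This keeps the directory/extension-splitting behavior intact rather than disabling it, and needs no compiler-specific min/max macros.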
  • AnpheusAnpheus Join Date: 2004-09-30 Member: 32013Members Posts: 63
    Here's a legitimate feature request:


    For all lights I want three new fields to potentially replace the light color field, and possibly a switch to determine which method to use (0 or null of course meaning the default). These methods would add variation to the lights at compile time so they are not all the same brightness & color.

    Fields & descriptions for Light* entities
    _lighttype - 0 is static, 1 is range color, 2 is range brightness, 3 is range color & brightness
    _lightmin - Determines minimum color, brightness, or both
    _lightmax - Determines maximum color, brightness, or both

    When RAD runs, it inserts values within the range to apply to the light.

    Fields & descriptions for the info_texlights entity should be similar, except it should be something like [texturename][some character that can't be placed in a texture name][field]

    Like +0random~_type, !random_min and the like.


    Either that or make a new entity with a field that allows you to specify all of these attributes in one key.
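    A sketch of how RAD might pick a value inside the proposed _lightmin/_lightmax range. Hashing the entity index (rather than calling a random generator) keeps compiles deterministic, so the same map always lights the same way. All names here are hypothetical, not an actual ZHLT API:

```cpp
#include <cassert>
#include <cstdint>

// Map an entity index to a stable pseudo-random value in [lo, hi].
// Knuth's multiplicative hash spreads consecutive indices apart.
float rangeValue(float lo, float hi, int entityIndex)
{
    uint32_t h = (uint32_t)entityIndex * 2654435761u;
    float t = (h & 0xFFFF) / 65535.0f;   // normalize to 0..1
    return lo + (hi - lo) * t;
}
```

    _lighttype would then just select which of color, brightness, or both get passed through this sampling before the patches are lit.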

  • CageyCagey Ex-Unknown Worlds Programmer Join Date: 2002-11-15 Member: 8829Members, Retired Developer, NS1 Playtester, Constellation Posts: 1,751
    Since I'm no longer part of ZHLT development, could you please replace the bug reporting email address with one that will be helpful to users for the next release? I'm still getting asked questions about the tools and I haven't looked at the source in four months.

    Just search and replace webmaster@xp-cagey.com in the source. Thanks smile-fix.gif.
    XP-Cagey

    Recommended Reading: NS Mapping Guidelines | Mapping Forum FAQ
    Nostalgia: Power Cells Thread
  • AnpheusAnpheus Join Date: 2004-09-30 Member: 32013Members Posts: 63
    Actually, a more intelligent option would be _dither R G B L (luminosity), and correspondingly for textures [texname]_dither R G B L

    The dither option would be simpler and present a range. So if I had lights that I wanted to average at #0080FF (a nice light blue) but I wanted to add some heavy variation to those lights, as though they were not all functioning properly, I could specify the _dither field.



    This would create a lot more realism, as specifying that each texlight have the exact same properties is a horrendous approach.
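    The _dither idea reduces to jittering each channel of the base color by up to +/- the dither amount and clamping to the valid 0-255 range. A minimal sketch, with all names hypothetical; t stands in for whatever deterministic per-light value RAD would choose:

```cpp
#include <cassert>

// Keep a channel inside the legal lightmap range.
int clamp255(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

// t in [-1, 1] selects where in the dither range this light falls:
// 0 leaves the base color untouched, +/-1 hits the extremes.
int ditherChannel(int base, int dither, float t)
{
    return clamp255(base + (int)(dither * t));
}
```

    For the #0080FF example, each of R, G, B (and the luminosity L) would get its own dither width, so poorly functioning lights can vary in tint as well as brightness.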
  • amckernamckern Join Date: 2003-03-03 Member: 14249Members, Constellation Posts: 970 Fully active user
    @ cagey

    That was done back in beta 1. I don't know if people are asking you about the p set and did not know there was a beta set, but because the p set is more stable than beta 2 (because of the known issues that are getting fixed for beta 4), I refer people to use the p13/15 tools and replace the tool that is broken

    @Anpheus

    This sounds good; as it is, with one of my old maps I had to use point lights, which then bled all over each other and gave a poor result

    I'll look into it, seeing as I have now done my job application - I now have more time (I was going to send in the tools, but sent an old mod I was working on, because of all the bugs in the tools)

    @Anpheus
    This won't be part of beta 4; there are too many bugs to work out still, so it might be part of beta 5, or 3.0 final

    @myself (amckern)
    Haha, these tools are starting to sound like the NS Betas

    Adam
    No Frils!
  • amckernamckern Join Date: 2003-03-03 Member: 14249Members, Constellation Posts: 970 Fully active user
    Update Update
    Q. Will there be any ZHLT-style tools for HL2?
    A. At this point in time, I am talking with Alfred @ Valve, and seeing if I can obtain the compile tools source before the HL2 SDK source is out. If anyone would like to work on these tools, please contact amckern@yahoo.com and place ZHLT somewhere in the message, or post
    No Frils!
  • amckernamckern Join Date: 2003-03-03 Member: 14249Members, Constellation Posts: 970 Fully active user
    It looks like my site is not working for some people; I hope it's a DNS attack (I have always wanted one)

    It should fix itself up within the next day or so
    No Frils!
  • ReveReve Join Date: 2003-09-23 Member: 21142Members Posts: 64
    edited November 2004
    w00t! - www.zhlt.info is number 3 on Google for zhlt, number 4 on Yahoo! for Zoner's Half-Life Tools (bit further down on Google for that) and 10 for zhlt, and number 1 on the new msn tech preview for zhlt and 4 for Zoner's Half-Life Tools.

    amckern, how is progress on beta 4? I read on your site that it's pretty much done... do you want to bung me the changelog/release notes by email? (Actually, if you don't mind, I could do with the change logs for all the previous beta versions, as I've noted a few bits have changed, like that beta 3 thing being pulled.)