.cfg For Max Graphics
SDJason
Join Date: 2003-05-29, Member: 16841
<div class="IPBDescription">want EYE CANDY</div> OK.. first off, i just jumped from a 1.8ghz 256meg ram Geforce 2 mx/400 computer, to a 3.2ghz 800fsb 2gb ram Geforce 6800 YAY FOR ME
BUT... what I want is... ALL the eye candy in NS/HL. I know there are configs to squeeze every frame out of this engine; I want the reverse, I wanna squeeze out every bit of quality...
I know glow effects aren't available, but I want everything else!! All of it!!! GIMME EYE CANDY.. haha
Can someone help me??
TY in advance
~Jason
Comments
By the way.. why did you get a 6800?
what console commands?
Try mp_decals 4096; that allows up to 4096 decals, like blood sprays and bullet holes. Also check around the customization forum for some high-quality models.
"By the way.. why did you get a 6800?"
Because the 6800 is the best card available, with more instructions, and the X800 cheats on anti-aliasing too. Also, NVIDIA makes the best video cards for playing Half-Life, considering almost everyone uses OpenGL.
Word. That's the best way to make Half-Life look purty.
mp_decals "4096" // allow up to 4096 decals (bullet holes, blood sprays)
r_mmx "1" // use MMX instructions where available
r_shadows "0" // player blob shadows (0 = off)
r_novis "1" // ignore VIS culling; render everything
r_mirroralpha "1" // reflective mirror surfaces on supporting maps
gl_cull "0" // also draw back-facing polygons
gl_clear "1" // clear the framebuffer every frame
gl_keeptjunctions "1" // avoid sparkling seams at polygon t-junctions
gl_texsort "1"
gl_overbright "1" // overbright lighting
gl_polyoffset "0.1" // depth offset so decals don't z-fight
gl_flipmatrix "0"
gl_wateramp "0" // water wave height
gl_ztrick "0" // old z-buffer shortcut off; avoids artifacts
gl_dither "0" // dithering
gl_smoothmodels "0"
gl_spriteblend "1" // blend sprite edges
gl_lightholes "1"
gl_monolights "0" // keep normal lightmaps instead of uniform lighting
gl_round_down "0" // don't round texture sizes down
gl_picmip "0" // full-resolution world textures
gl_playermip "0" // full-resolution player skins
gl_texturemode GL_LINEAR_MIPMAP_LINEAR // trilinear filtering
gl_max_size "1024" // maximum texture dimension
gl_palette_tex "1"
precache "1"
hpk_maxsize "1.0" // cap custom.hpk (spray cache) at 1 MB
fastsprites "0" // full-quality smoke sprites
d_spriteskip "0"
All I can think of atm. The difference is minimal anyway.
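If you don't want to punch all of that in by hand every session, save it as a .cfg and exec it. A minimal sketch (quality.cfg is just a name I made up; the only engine behavior assumed is that GoldSrc runs autoexec.cfg from the mod folder at startup):

// autoexec.cfg -- goes in your Half-Life\ns\ folder (adjust to your install path)
// quality.cfg is assumed to contain the cvar list above, one command per line
exec quality.cfg
echo "quality.cfg loaded"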
Right-click on your desktop and click Properties. Go to Settings, then hit the Advanced button. Under the advanced options, hit the OpenGL tab. There should be a slider that says:
|--------------|---------------|----------------|--------------|
Performance                  Balanced                    Quality
Just slide the bar over to 'Quality.' The friend who told me about this said the difference was noticeable, but I haven't tested it myself.
Hope that helps!
"Because the 6800 is the best card available, with more instructions, and the X800 cheats on anti-aliasing too. Also, NVIDIA makes the best video cards for playing Half-Life, considering almost everyone uses OpenGL."
If you are running a machine like that, you should not be looking at how the card performs in Half-Life. Look at HL2, where all NVIDIA cards suck.
And don't tell anyone.. shh... the 6800 is still second place, shown up by the X800. :D
In any case, for the best graphical candy you'll probably want to snag the detail texture pack. A number of forumites figured out the math behind setting up a 'reinforcing' pattern that makes the existing textures look better without looking overly tiled. There has been talk of high-def polypacks, but none of them have really held true to the original NS look and feel... most go for a cyber-dragon-ninja-Z-warrior or special-ops-ninja-pajama-boy-Z-operative look and feel.
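If you're wondering how those detail textures hook in: the Steam build of HL reads a per-map maps/<mapname>_detail.txt that lists which world texture gets which detail overlay, and you switch the feature on with r_detailtextures 1. A rough sketch of the format, one line per world texture: world texture name, detail texture path (relative to gfx/), then x and y tiling scales. The names below are invented examples, not entries from the actual pack:

WALL_METAL01 detail/metal_grain 10.0 10.0
FLOOR_PLATE03 detail/rough_noise 12.0 12.0

Save something like that as maps/ns_nothing_detail.txt (for ns_nothing); the scale values control how densely the overlay repeats over the base texture, which is the math the forumites tuned so it reinforces the texture without obvious tiling.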
If you were on an ATi card, I'd suggest grabbing the TruForm-friendly model pack... but that won't really do you much good on an NVIDIA board.
Turning on AA and AF would probably help... but then again, with NVIDIA not conforming to the AF spec even when you tell it to turn its 'optimizations' off, I can't really say one way or the other whether it'd look better. :)
Now, HL2 indeed ran badly on the 5xxx series, but that wasn't because they were manufactured by NVIDIA or because the game wasn't optimized for NVIDIA cards. It was solely that those cards SUCKED at the PS 2.0 / VS 2.0 used by the DX9 standard. That problem has been fixed by the 6xxx series, so I can't see why HL2 would run badly on the newer cards; please enlighten me, or check up on the info before posting nonsense.
PS: I'm no ATi or NVIDIA fanboy; I just get the card that gives me the most performance per dollar.