Dude, if you have a GeForce4, your FPS should never, EVER dip below 60 in Half-Life. I don't even have a video card and I stay above 40 at all times.
There is a way to boost your fps with GeForce cards, but I forget it; it has something to do with vsync, I think. Right-click on the desktop > Properties > Settings > Advanced > then your card's tab, and see what you find. If anyone else knows what I'm trying to get across, I believe you can explain it better.
QUOTE (|SemperFi| @ Dec 24 2002, 11:06 PM):
    QUOTE (Fopher @ Dec 25 2002, 03:57 AM): What's the best card for its price out lately? Something along the lines of running flawlessly smooth on UT2K3.
    That would probably be the GeForce2. What are your system specs? Maybe other things are slowing your card down. Not to mention yours is the MX series; I think the best one is the Ti series. Not 100 percent sure, though.
A GF2 runs UT2k3 nowhere near "flawlessly smooth", at least not with a 1.2 GHz CPU. Get at least a GeForce4 Ti 4200 or a Radeon 9500 Pro.
Truly, the HL engine is the worst possible benchmark for the modern cards out now (of engines that are actually still played, of course). The only visible difference between a GF2 and a GF4 would be anti-aliasing, and that hardly does anything noticeable, especially at high res. There's no reason to run lower than 1024x768. Heck, with HL, there's no reason to run below 1280x960 if you've got at least a GF2.
My GF3 Ti200 gets... well, I don't know how high the frames get. My monitor only goes up to 60 Hz at 1280x960, and for some reason HL will just tell me 60, not how many frames it could actually process. Meh. I don't run AA, though... it's only worth using in the HL and Q3 engines, and I never feel like turning it on and off when I change games. Meh.
QUOTE (BedwettingType @ Dec 25 2002, 05:15 AM): Here we go again... measuring each other's processor size. ;)
Yeah, well, what did you expect to see with that topic title? :D
QUOTE (|SemperFi| @ Dec 24 2002, 11:58 PM): Dude, if you have a GeForce4, your FPS should never dip below 60 in Half-Life. [...] There is a way to boost your fps with GeForce cards, but I forget it; it has something to do with vsync, I think.
Vsync caps your maximum FPS at whatever your monitor's refresh rate is set to. I.e., if you have a 60 Hz refresh and a GeForce 100000000 but vsync is on, you will NEVER go over 60 fps. Disable it to get your one-trillion-fps reading :D That's what you mean, right?
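For the curious, here's a toy model of what that cap actually does (plain C++, made-up numbers, nothing HL- or driver-specific). With vsync on and ordinary double buffering, a finished frame can only be shown at the next vertical blank, so your frame rate doesn't just cap at the refresh rate; it snaps down to refresh/1, refresh/2, refresh/3, and so on:

[code]
#include <cmath>
#include <iostream>

// With vsync on (double buffering), a frame that isn't ready at a vertical
// blank waits for the next one, so the effective rate is the refresh rate
// divided by the whole number of refresh intervals each frame occupies.
double vsyncFps(double rawFps, double refreshHz) {
    double frameTime   = 1.0 / rawFps;     // seconds to render one frame
    double refreshTime = 1.0 / refreshHz;  // seconds per monitor refresh
    double intervals   = std::ceil(frameTime / refreshTime);
    return refreshHz / intervals;
}

int main() {
    std::cout << vsyncFps(100.0, 60.0) << " fps\n"; // 100 fps card: capped at 60
    std::cout << vsyncFps(50.0, 60.0)  << " fps\n"; // 50 fps card: snaps down to 30
}
[/code]

That second line is the nasty part: a card that could manage 50 fps gets locked to 30 on a 60 Hz monitor, which is why turning vsync off can feel faster even when you're below the refresh rate.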
Just turn down your GF's anti-aliasing and filtering and whatnot and your fps will get a MAJOR increase, and it's not even that noticeable in HL.
As far as I've heard, UT2003 can't do any kind of bump mapping, and if it could, people would be saying, "Doom 3? What's that? Oh, who cares."
The best graphics cards are just going to make it easier to run the Half-Life engine with 32-bit color, trilinear interpolation, and I think AA, and maybe calculate things to higher precision so you see less blockiness from rounding errors when you look closely at walls. Most of the hardware in the latest cards will be wasted, though, because Half-Life is so old and would need to explicitly use those new features. Half-Life doesn't even use textures above 8-bit.
EDIT: Damn, how could I forget to mention resolution.
I run HL in OpenGL on an Athlon XP 1600+ with 256 MB DDR, a GeForce4 Ti4400, and an SB Audigy, at 1024x768 with 32-bit color, 8x anisotropic filtering, texture sharpening enabled, and 2x FSAA (though 4x makes no noticeable difference). HL runs at a constant 85 fps, which is my fps_max, set to match my refresh rate.
Vsync is actually a very useful feature, in that it makes sure the picture on the screen is consistent at all times.
Without vsync, your card updates the framebuffer as many times as it can (for argument's sake, let's say 100 fps). What exactly does this mean for the average gamer?
Simply this. As your Radeon 9700 kicks into overdrive and pumps out a solid 100 fps, your monitor still only refreshes at 60 Hz (cycles per second). This means the picture in the buffer is changing more often than it can be drawn on the screen. As HL renders each frame, your brand-new video card will have changed the image slightly, and you get an effect known as tearing (some call it shearing): the image you're looking at on your screen is actually a composite of several different frames pulled from the video buffer. This is especially noticeable when you're making fast movements.
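If you want to see where that composite comes from, here's a toy model of a single refresh (plain C++, made-up numbers, nothing HL-specific): the monitor scans out 768 lines over one 60 Hz refresh while a vsync-off renderer swaps in a new frame every 10 ms. Each scanline shows whichever frame happened to be in the buffer when that line was read out:

[code]
#include <iostream>

int main() {
    const int    lines     = 768;             // scanlines per refresh
    const double refreshMs = 1000.0 / 60.0;   // one refresh takes ~16.7 ms
    const double frameMs   = 1000.0 / 100.0;  // renderer swaps every 10 ms

    int last = -1;
    for (int line = 0; line < lines; ++line) {
        double t = line * refreshMs / lines;        // when this line is scanned
        int frame = static_cast<int>(t / frameMs);  // frame in the buffer then
        if (frame != last) {                        // report each seam
            std::cout << "scanline " << line << ": frame " << frame << '\n';
            last = frame;
        }
    }
}
[/code]

It prints a seam at scanline 461: everything above it is frame 0, everything below is frame 1, two different images on one screen. In a real game the swap times drift, so the seam jumps around from refresh to refresh, which is exactly why it catches your eye during fast movement.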
You're left with two choices, right? Play at the standard 60 fps and have the game run smoothly, or jump to the 100fps HL can support, and deal with a choppy screen.
Both options suck.
The thing to do, for people in the know, is to raise your monitor's refresh rate along with your fps. With vsync on and the refresh rate matching the fps you want, you get an incredibly smooth, very playable game at a frame rate you love and a picture you can tolerate.
On older OSes this refresh-rate control is sometimes available right in the video drivers, but most often you'll have to track down a separate hack.
I found mine, "RefreshFix", for my ATI Radeon 7500 at guru3d.com.
Remember: gamers in the know use vsync, and use it well.
I remember going to a friend's dorm (back in college) and watching him play Counter-Strike. He was going, "Look at my frame rate, it's a constant 100!" Back then I guess that was kind of cool, but then I gasped: "Was that tearing I just saw?"
"What do you mean? What are you talking about?" he replied. So I told him to strafe back and forth and look at a corner (pointy end towards him). And sure enough, there's a slice right across the middle for a split second. As soon as I pointed it out, he started noticing it everywhere and asked me how to get rid of it. So I told him: "Vsync, my friend. Vsync."
Problem is, he had a crappy monitor that only did 60 Hz at 1024 res, so he was stuck with either tearing at 100 fps or a strobing, tear-free screen at 60. Haha, sucker.
The moral of the story: do everything possible to get a monitor with at least a 75 Hz refresh rate at every resolution you use, and turn vsync on all the time. No strobing, great fps, good times.
QUOTE (tojo13 @ Dec 24 2002, 09:37 PM): Only 800x600, but I can get my fps up to 500+.
500+ FPS? Er... you do know that the human eye doesn't register much beyond 60 FPS or so, right? Hell, I doubt your monitor's refresh rate is even that fast. (Not that getting an insanely high FPS at 800x600 in the frickin' Half-Life engine is a spectacular feat per se...)
QUOTE (uranium - 235 @ Dec 25 2002, 02:39 AM): (Psst: this is the part where someone with a TNT2 says "The human eye can only see blah-de-blah, so you wasted your money")
I have a GeForce4 myself.
QUOTE (NickBlasta @ Dec 25 2002, 01:53 AM): Here are some. Copy-paste the links, as the host doesn't support displaying images on this forum.
    My rig: Athlon XP 2400+, Abit KR7-A RAID, 768 MB Crucial PC2100, retail Radeon 8500. Shots are 1600x1200x32, no AA, 16x aniso.
    http://upload.ipaska.com/122002/unfunf.jpg
    http://upload.ipaska.com/122002/eggjump.jpg
    http://upload.ipaska.com/122002/pimpwalkin.jpg
    http://upload.ipaska.com/122002/marineidiots.jpg
    Oh, and since people seem to be posting UT2k3 shots, here's one of mine: http://upload.ipaska.com/122002/nbut11.jpg
haha, trying to use amdmachine's hosting here, lol.
QUOTE (QuantumSlip @ Dec 25 2002, 04:24 AM): haha, trying to use amdmachine's hosting here, lol.
Yeah... that's why I said copy-paste. I used that hosting to show those shots over at genmay, so I figured I'd just post the URLs here.
QUOTE (Error404: @ Dec 25 2002, 06:24 AM): What's the maximum resolution anyone has run NS at with a decent framerate?
    I'm sure that if you have an Nvidia-based GPU and you get the latest drivers, it does make a difference to OpenGL gaming.
I run HL at 1600x1200, and yeah, at times I don't get that great FPS, to be honest... the max is probably around 70-something, but at times it can drop below 30, which is not nice :(
The reason I can't bear to play at a lower res, though, is that the text displayed in "hive sight" becomes too obstructive :/ At 1600x1200 the text is small and doesn't get in the way.
...anyway, the framerate doesn't drop that low unless there are transparent effects like umbra on the screen.
BTW, I only have a GF2 GTS, and to be honest, if you have any type of GF4 or Radeon 9000+, I doubt there's any reason to play at any resolution other than 1600x1200, if your monitor can support it, that is...
QUOTE: Note: Some of the color was changed during conversion to .jpg, so bear with it.
    http://home.insightbb.com/~bghsurge/images/nsscreenshot.jpg
AHAHAHA, that was an AWESOME PIC!
QUOTE (RazorClaw @ Dec 25 2002, 07:58 AM): I run HL at 1600x1200, and yeah, at times I don't get that great FPS... max is probably around 70-something, but at times it can drop below 30 :(
    BTW, I only have a GF2 GTS; if you have any type of GF4 or Radeon 9000+, I doubt there's any reason to play at anything other than 1600x1200, if your monitor can support it.
I'm running a GeForce2 Ti. I've never tried running the mods at 1600x1200, not because it lowers the framerate, but because it makes the text too small to read. I suppose that doesn't matter in NS, though.
QUOTE (NickBlasta @ Dec 25 2002, 07:53 AM): Oh, and since people seem to be posting UT2k3 shots, here's one of mine: upload.ipaska.com/122002/nbut11.jpg
Ha, BR r teh winnar!1 Here's where I had a little mishap trying to jump through the gate thingy. Suffice to say, my teammates weren't happy. Oh, and ragdoll physics own you.
BTW, to all those people who say either use vsync, or leave it off and live with tearing, or cap fps_max and live with lag: there is one simple solution. Use triple buffering and vsync combined with fps_max 9999. Although I'm not sure HL supports triple buffering...
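For anyone wondering what triple buffering actually changes, here's a rough sketch of the buffer rotation (generic C++, not HL or driver code; as the post above says, whether the HL engine supports it is another question). With one front buffer and two back buffers, the renderer always has a free buffer to draw into, so it never stalls waiting for the vertical blank the way vsynced double buffering does, and the display still only flips to complete frames, so there's no tearing:

[code]
#include <array>
#include <iostream>

// One front buffer being scanned out, two back buffers for the renderer.
// Buffer indices are 0, 1, 2, so the "third" buffer is 3 - front - ready.
struct SwapChain {
    std::array<int, 3> frameIn{{-1, -1, -1}}; // which frame each buffer holds
    int front   = 0;   // buffer currently on screen
    int drawing = 1;   // buffer the renderer is filling
    int ready   = -1;  // newest completed buffer, if any

    void finishFrame(int frameNumber) {  // renderer completed a frame
        frameIn[drawing] = frameNumber;
        ready = drawing;
        drawing = 3 - front - ready;     // draw into the remaining free buffer
    }
    void vblank() {                      // display refresh boundary
        if (ready != -1) { front = ready; ready = -1; }
        std::cout << "showing frame " << frameIn[front] << '\n';
    }
};

int main() {
    SwapChain sc;
    int frame = 0;
    for (int refresh = 0; refresh < 3; ++refresh) {
        sc.finishFrame(frame++);  // renderer outpaces the display:
        sc.finishFrame(frame++);  // two frames finish per refresh, no stall
        sc.vblank();              // display grabs the newest complete frame
    }
}
[/code]

This prints frames 1, 3, and 5: the display always gets the newest finished frame, the renderer never waited, and the in-between frames are simply dropped. The cost is one more screen-sized buffer in video memory, which mattered a lot more on the cards of this era.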
To the Batmobile!
dadadadadadadadad BATMAN!!....
I need some sleep.
I've been told size doesn't matter.
Perhaps I was lied to?
Proof: http://www.clan187.net/images/100fps.jpg
I've got a "3D Prophet 7500 DDR 64 MB".
I'll give it a go.