ATI Radeon 9200

UltimaGecko hates endnotes Join Date: 2003-05-14 Member: 16320, Members
and its graphics options

As stated in the title, I have an ATI Radeon 9200. My FPS in Half-Life never drops below 30 and goes up to 70, but for a 128MB graphics card I was expecting something a bit higher. During a game of DoD someone said that AA and pixel shaders can slow a GPU to a crawl (in an attempt to get... lots of fps! :D ), so I went to my options to try and lower the settings, but here's the problem...

I have no idea what the stuff means really.
SMOOTHVISION is Anti-Aliasing, but I don't know about the rest of it. When I set it lower, the text in game gets weird black outlines.

There's Anisotropic Filtering, no idea what it does, and it's set to Application Preference for OpenGL and D3D.
There's Smoothvision: Disabled for OpenGL, Application Preference (Quality) for D3D.

Then there are sliders for Texture Detail, Mip Map lighting and vertical sync, and I have no idea what anything but Texture Detail does. Someone said I should set AA to 2x or 3x.

So how would I go about making it efficient rather than just good-looking, or fast? I want a serene blend of quality and performance :D ... and I don't think the main slider is going to cut it, since Application Preference basically means it runs as high as possible. Any help?



Listening to: Runaway Train, by Tom Petty
...unless I have the artist wrong...

Comments

  • Marik_Steele To rule in hell... Join Date: 2002-11-20 Member: 9466, Members
    edited August 2003
    My recommendation is that you get the latest drivers from Omegacorner (<a href='http://www.omegacorner.com' target='_blank'>http://www.omegacorner.com</a>). They're not just any drivers; they're the official ones with some behind-the-scenes changes for better visual quality and/or performance. The effect is the same as if you had a ton of so-called "video card optimization programs," without the risk that one of those third-party programs causes more problems than it solves.

    Be sure to read the entire readme as well, just to make sure you don't do something that negates the effect of these drivers from the start.
  • Dubbilex Chump Join Date: 2002-11-24 Member: 9799, Members
    edited August 2003
    QUOTE (Marik_Steele @ Aug 17 2003, 10:13 PM): My recommendation is that you get the latest drivers from Omegacorner (<a href='http://www.omegacorner.com' target='_blank'>http://www.omegacorner.com</a>). They're not just any drivers; they're the official ones with some behind-the-scenes changes for better visual quality and/or performance. The effect is the same as if you had a ton of so-called "video card optimization programs," without running the risk that one of those 3rd party programs causes more problems than it solves.

    Be sure to read the entire readme as well, just to make sure you don't do something that negates the effect of these drivers from the start.

    He ain't lyin' :P


    Fo' Sho.'
  • UltimaGecko hates endnotes Join Date: 2003-05-14 Member: 16320, Members
    As far as I can tell, the FPS are the same as before with this driver. Except now there are about five more options to mess with in the display settings... You've succeeded in confusing me more, Marik :D . Maybe I'll just tinker with it until my computer burns out and stops working...

    This might have fixed the text thing, but I'm too tired to test another game beyond the FPS check I just did. Just not sure what the AA should be at... 3x Performance, 2x Quality, 2x Performance... no idea. I'll thank you for the extra display options though, even if I have no idea what they do.

    Does anyone know what Vertical Sync, Anisotropic Filtering, Fast Write (which the readme says I should never use anyway, which makes you wonder why it's there) and Truform are supposed to do? Darn me and my graphics-card illiteracy. Eventually I'll find out that driver was the best thing to ever happen to my computer... so I'll just thank you now, even though I don't see a difference yet... I've only tried Half-Life though.



    Listening to: When I Come Around, by Green Day
    ...freshly typed for every post!
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    edited August 2003
    Vertical Sync - Keeps your video card from drawing too fast, resulting in 'tearing'... essentially the video card trying to draw frames faster than your monitor can fully display them. For optimal visual quality, turn this on.

    Anisotropic Filtering - When viewing a texture at an angle, it gets kind of blurry and unpleasant... the more oblique the angle, the worse it looks. Aniso samples the texture more properly, resulting in a clean, crisp appearance on severe-angle surfaces (such as when your viewpoint is nearly parallel to a wall). This is a BIG slowdown, but results in very good visual quality. I'd turn it off.

    Fast Write - A method for a program to write directly to the video card. It works in some configurations, where the system integrator can balance and tweak things to perfection. If you don't know what it is, make sure it's turned OFF, as it can simply slow things down, if not cause instability (crashes).

    Truform - An ATI enhancement, adding detail to models transparently. Easily THE BIGGEST SLOWDOWN of all. If you know what you're doing, enable it. Otherwise, leave it turned off... especially on the budget ATI cards, which do not handle Truform on-card, but offload it onto your CPU (which is even slower). It essentially splits all the polygons given and rounds them, resulting in a more natural, organic appearance.. but only in games with support for Truform.
    Half-Life *does* have TF support, and limits it to models. But as noted, it'll drop your framerate in the gutter even if it looks beautiful. (also: in HL, set ati_npatch to 0, and ati_subdiv to 0 just to be certain)
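    For reference, the two console variables mentioned above can go straight into a config file. A minimal sketch, assuming a standard Half-Life install where autoexec.cfg is executed at startup:

```
// autoexec.cfg -- keep TruForm off in Half-Life,
// using the two cvars named in the post above
ati_npatch "0"   // disable ATI n-patch (TruForm) tessellation
ati_subdiv "0"   // no subdivision of model polygons
```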


    'Application Preference' does NOT mean 'maximum setting', by the way. It actually means 'default to OFF, unless the program tells me otherwise'. On a 9200 without tweaking, you'll want to set Anisotropic Filtering and Smoothvision to 'Application Preference' for the best framerate. You're running with 2x FSAA enabled, which WILL slow down the frames renderable. Setting this to Application Preference will give an immediate and noticeable boost in speed, as the card no longer has to render at essentially double the resolution and then sample it down to your selected one.
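    The "render at double the resolution, then sample down" idea above can be sketched as a toy model in Python (just the arithmetic of 2x supersampling, not how the driver actually implements it):

```python
# Toy model of 2x supersampling anti-aliasing (FSAA):
# render at twice the resolution, then average each 2x2 block
# of samples down to one final pixel.

def downsample_2x(hi_res):
    """Average 2x2 blocks of a hi-res grid into one low-res pixel each."""
    h, w = len(hi_res), len(hi_res[0])
    return [
        [
            (hi_res[y][x] + hi_res[y][x + 1] +
             hi_res[y + 1][x] + hi_res[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A hard black/white edge rendered at 4x4 instead of the final 2x2:
hi = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 1, 1, 1],
]
lo = downsample_2x(hi)
print(lo)  # the jagged edge pixel becomes an intermediate shade
```

    The card does four times the pixel work for the same output size, which is why FSAA costs so much fill rate.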

    Additionally: how many megs of RAM are on the video card means jack squat in terms of speed. They don't make it run any faster, other than by avoiding swapping out textures. That's pretty much all those gobs of real estate are for... texture files. Newer games may use up those 128MB, but Half-Life runs just fine on a 32MB vid card, with space to spare. :b


    In HL, the maximum FPS defaults to 72. Unless you raise this value, it'll never go over that. And the hard limit the engine itself can handle is 100fps... it doesn't matter if you set the cap to 105, 300, or fnord. You'll only get 100, tops. All of which matters little to none if you turn on VSync as recommended, as most monitors cap at 60Hz due to Microsoft stupidity, so you'll get a hard limit at 60fps with it on. The important part is that it should be a nice, STABLE 60fps... meaning smoother gameplay with fewer unexpected frameskips when moving into a heavily populated area (or less severe ones, at least). And with the FPS-dependence removed in v2.0, it doesn't matter how many frames per second you're pumping... your guns fire just as fast, JPs give as much thrust, and you build just as quickly at 15fps as you do at 100fps.
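    The interacting caps described above (fps_max defaulting to 72, the engine's 100fps limit, and vsync at the monitor refresh) combine as a simple minimum. A toy calculation, assuming the card itself could render faster than any of the caps:

```python
ENGINE_LIMIT = 100  # hard limit of the Half-Life engine, per the post

def effective_fps_cap(fps_max=72, vsync=False, refresh_hz=60):
    """Return the lowest cap that applies; fps_max defaults to 72 in HL."""
    caps = [fps_max, ENGINE_LIMIT]
    if vsync:
        caps.append(refresh_hz)
    return min(caps)

print(effective_fps_cap())                         # default install: 72
print(effective_fps_cap(fps_max=100))              # raised fps_max: 100
print(effective_fps_cap(fps_max=300))              # engine still caps it: 100
print(effective_fps_cap(fps_max=100, vsync=True))  # vsync at 60 Hz wins: 60
```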
  • Marik_Steele To rule in hell... Join Date: 2002-11-20 Member: 9466, Members
    edited August 2003
    [edit]Talesin beat me to it with some better/more accurate explanations for some things, but I'll leave the rest of my post here for the links to comparison screenies.[/edit]

    You probably already know what anti-aliasing/smoothvision is and what it does, but in case you don't, this should explain it nicely. Click on "No AA," "2x," etc. for comparison screenshots.
    <a href='http://www.nvnews.net/reviews/prolink_gf3_ti200/mp_antialiasing.shtml' target='_blank'>http://www.nvnews.net/reviews/prolink_gf3_...ialiasing.shtml</a>
    Look at relatively straight lines, like the sides of Max's pant legs and jacket sleeves; see those jagged edges with no AA on? Now see 2x and 4x (QC may be an nvidia-card-only thing, but it's effectively their 3x). The higher the number for anti-aliasing, the fewer jagged edges you'll see, and the lower your framerate will be. Personal preference. If I run a game at 800x600, I'll use 2x; if I run it at 1024x768, I'll turn it off.

    Vertical sync (as far as I know) makes sure that when your screen changes from one image/frame to another, the whole screen will change at once (rather than, say, the top half refreshing and then the bottom half refreshing, resulting in some odd sawed-in-half look somewhere in the middle for a frame or few). It lowers framerate, but for some people having vertical sync off is more irritating to their eyes than a low framerate. Personal preference.

    Anisotropic filtering is a way of making sure that textures on walls and characters look good, even when they're far away. Go to any long hallway or large room, and you'll notice that at a certain distance, the floor/walls suddenly take on a lower-quality look. The game engine assumes that at distances that long, you don't really care. But in some games (including NS, where distinguishing a skulk from a wall at long distance means life or death) you may appreciate having it on. Omegacorner has some comparison screenshots at the bottom of their page for <a href='http://www.omegacorner.com/nvidia.htm' target='_blank'>Nvidia drivers</a>... hmm, from the looks of it, just having the Omega drivers instead of the standard ones already looks better than anisotropic filtering with the official drivers. Play around with it and see if it makes a difference to you.

    Fast writes: good, you read the readme.

    Truform: this is something specific to ATI, but unsupported in many/most games. According to CSnation, Valve has (reportedly) been working on adding support for it since 2001, but I don't know if it made it into any of the publicly released patches. They may have given up on it to work harder on HL2. But some comparison screenshots of internal tests are here: <a href='http://www.anandtech.com/video/showdoc.html?i=1517&p=6' target='_blank'>http://www.anandtech.com/video/showdoc.html?i=1517&p=6</a>. Look at the character's knees, elbows, head, back of the shoulders, or any other surface that is supposed to be curved but simply isn't when Truform is off. You'll have to look on Google to find out which of your games support it.
  • Necrotic Big Girl's Blouse Join Date: 2002-01-25 Member: 53, Members, NS1 Playtester
    Um, I have the 9200, and with the out-of-the-box drivers and no tweaking I get 100fps... I know this may sound condescending, but it's the only problem I had (because I'm that damn smart): have you got fps_max and fps_modem set to 100? ;D
  • UltimaGecko hates endnotes Join Date: 2003-05-14 Member: 16320, Members
    With fps_max and fps_modem at 100 I get 40-60 instead of 25-50. I want to set Smoothvision (AA) to 2x based on performance, and the AF to 4x, but when I do, the text in game gets a weird black outline... sort of like the text isn't the same resolution as the display. It looks really annoying. At the defaults I don't get it, and I'm not sure what it's from... I think AF, but... bah.



    Listening to: Bohemian Rhapsody, by Queen
    ...Scaramouch, scaramouch will you do the fandango?
  • TenSix Join Date: 2002-11-09 Member: 7932, Members
    edited August 2003
    Anisotropic Filtering = Better texture quality

    Smooth Vision or whatever = Makes polygons look more 'meshed', can really make crap look weird too.

    Anti-Aliasing (AA) = Basically blurs your graphics very slightly. I recommend 2x at most; at its highest setting it looks kind of ugly.

    V-Sync = Turn it on and you lose some FPS, but things look much smoother. Turn it off and you gain FPS, but you will notice some jerky animations.

    Of course I have a GFX5200, but the above options are universal. Don't worry much about turning any of them to max; the only thing that would impact your FPS in a game like HL would perhaps be AA. But nowadays there's no reason to turn off stuff like V-Sync or Anisotropic Filtering unless you want 400fps in Quake 3 instead of 350.
  • OttoDestruct Join Date: 2002-11-08 Member: 7790, Members
    AF really isn't worth it IMO, just use AA 2x.
  • Talesin Our own little well of hate Join Date: 2002-11-08 Member: 7710, NS1 Playtester, Forum Moderators
    edited August 2003
    Smoothvision == Antialiasing

    Antialiasing == Large framerate hit, even at 2x. Default is 'Application Preference' which means 'off unless the program specifically tells me to turn it on', which very few have.

    It's the AA and Aniso that are hurting your framerate. Set them both to Application Preference, which is the default, which is OFF.



    (edit) Amazing how many people think they know what their video card options do, but are dead wrong... (/edit)
  • TenSix Join Date: 2002-11-09 Member: 7932, Members
    On any high-end card there's no need to tweak for FPS on a five-year-old game like Half-Life. Crank everything up and make it look perty before HL2 comes out and you're reduced to playing in 800x600 with everything set to low :)
  • Soylent_green Join Date: 2002-12-20 Member: 11220, Members, Reinforced - Shadow
    Talesin, I run XP, and I changed my settings with a program called nVidia Refresh Rate Fix MKII when I had a GF4 MX440 (more appropriately called a GeForce 2 super duper ultra), to disallow all refresh rates except the ones I wanted (basically, Windows thinks the only safe refresh rate is the one I want it to run). I can't remember what little third-party hack I used for my Radeon 9600 Pro to force it to use the max refresh rate.

    60 Hz simply flickers like mad. You don't have to look at it with the "corner" of your eye, which is more sensitive to changes in the intensity of light (and worse at colors); you can look straight at it and see it flicker. Conveniently, DX tries to default to 75 Hz (much more pleasant) if it detects that setting. Why might that be, huh? :angry:
  • Soylent_green Join Date: 2002-12-20 Member: 11220, Members, Reinforced - Shadow
    TenSix, why would that be? High-end cards seem to manage HL2 without any problems. The Radeon 9800 Pro 128 MB version was used at the E3 demo, and it held a constant 60 FPS at max in-game settings (at some resolution like 800x600 or possibly 1024x768; I guess projectors don't flicker at 60 Hz, vsync on), with FSAA and AF turned off. The game is currently being optimized and is going to work even better by the time it is out. The minimum specs are very low (not yet completely determined, but it should run on something like a PIII 700 MHz with DX6 hardware and 128 MB of memory); I suspect it will look horrid, but it might not run THAT badly.
  • Soylent_green Join Date: 2002-12-20 Member: 11220, Members, Reinforced - Shadow
    edited August 2003
    Sorry for posting so much but addressing more than one person in each post isn't good.

    fps_modem does nothing at all anymore; don't bother with it. Do bother with forcing WinXP to run OpenGL at a refresh rate higher than 60 Hz if you are using a CRT, and set fps_max to match it. Don't turn off vsync; all that does is cause frame tears. If your monitor has a refresh rate of 60 Hz and HL says your fps is 100, you should take the hint: HL isn't showing all 100 of those frames, it is only drawing them in memory and doing the computations, but it cannot display them all.

    With vsync on, your frame rate can only take values that are roughly the refresh rate divided by an integer. This is good: a framerate that varies up and down like a roller coaster is much more noticeable than an even framerate, even if the even one is slightly lower, and it's free of hideous frame tears. If you are CPU-limited, you should see your framerate vary sporadically even with vsync on, since the CPU load is much more variable and pretty independent of what you look at, unless there are dynamic lights or such. If you are CPU-limited, you should also see little or no difference in framerate from increasing the resolution.
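    The "refresh rate divided by an integer" rule can be sketched like this: with vsync on, a frame that misses one refresh waits for the next, so each frame occupies a whole number of refresh intervals. A simplified model (ignoring triple buffering):

```python
import math

def vsync_fps(render_fps, refresh_hz=60):
    """Displayed frame rate with vsync on: each frame occupies a whole
    number of refresh intervals, so fps snaps down to refresh_hz / n."""
    frame_time = 1.0 / render_fps    # time the card needs per frame
    refresh_time = 1.0 / refresh_hz  # one monitor refresh interval
    # Number of refresh intervals each frame consumes (at least one);
    # the small epsilon guards against float rounding at exact multiples:
    intervals = math.ceil(frame_time / refresh_time - 1e-9)
    return refresh_hz / intervals

for fps in (100, 59, 40, 25):
    print(fps, "->", vsync_fps(fps))  # snaps to 60, 30, 30, 20
```

    Note how 59 rendered fps displays at 30: just barely missing a refresh halves the displayed rate, which is the roller-coaster effect described above.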

    My old MX440 could do a pretty constant 85 fps (my refresh rate) at 1024x768 with AA set to 2x when I gave it enough CPU power (as well as memory bandwidth, if that was a problem) :0 . However, my old PC (PIII 600 MHz, 320 MB SDRAM) and the previously mentioned MX440 only managed a meager 70-ish fps when looking at a clean wall, dipping as low as 15 fps or so in battles.
  • UltimaGecko hates endnotes Join Date: 2003-05-14 Member: 16320, Members
    So I guess I'll just leave it how it is then... funny that Necrotic gets 100fps and I don't... maybe it's because I have a 2.0 GHz (well, 1998 MHz) AMD and only 256 MB of RAM (working on fixing that...)... I don't think those would drop my framerate though...

    Still works better than my ATI Rage Pro Turbo 8MB one upstairs: 10-30fps with all the settings at low (for DoD and NS anyway). It has 386MB of RAM though... I stole the mic from that one... wonder if the bus speed is the same... 512 MB of RAM couldn't hurt this one, heh.

    The fact that default can be off might explain why the text resolution differs from normal resolution when AA and AF are on. Thanks for the info on what the stuff does everybody.



    Listening to: Runaway Train, by (It says Tom Petty but it doesn't sound like him...)
    ...maybe I'll ask my sister.
  • ANeM Join Date: 2003-05-13 Member: 16267, Members, Constellation
    edited August 2003
    Yeah... running a 2 GHz machine without at least 512 MB of RAM will screw you up quite a bit...
    512 MB is pretty much the bare minimum for a 2+ GHz computer. Any less and it's little wonder you have problems... most recent Windows builds take 50-100 MB of RAM just to run, even more if you are running XP with all the bubbly effects and the animations... that junk can take 10-40 MB alone. That doesn't leave much for NS. For best results (on XP), turn off all the effects until it looks ugly and sharp like Windows 98. With corners sharp enough to slice open an HA and fatally wound an onos.
  • pardzh Join Date: 2002-10-25 Member: 1601, Members
    Hmm, increased performance from these Omegacorner drivers?

    But are they stable and reliable? I'm hesitant to install third party drivers, even if it does gimme a bump...

    I'm not really concerned about AF or AA in HL... my 9800 Pro handles AA at 6x and AF at 4x at 100fps, hehe. :D
  • Soylent_green Join Date: 2002-12-20 Member: 11220, Members, Reinforced - Shadow
    As for the old MHz debate, let me try to give some insight into why it is wrong to compare different processors solely by clock frequency. This is from memory and thus slightly on the 'ranting' side.

    GHz means very little to performance; if you specify only the frequency, you say absolutely nothing. Currently the world's fastest computer uses a big cluster of CPUs operating at 800 MHz, and it has 10 TB of RAM (weeeeee :D ). Even when you specify that it is a single-CPU system and its CPU is operating at some frequency, you still say nothing. Performance depends on the functional units inside the processor, on efficient use of resources, on memory bandwidth, on the application at hand, and on the amount and types of cache.

    A very good example of this is that an Athlon XP 1700+ (that's 1467 MHz I think, with 256 KB of L2 cache and 128 KB of L1 cache) can beat a Celeron overclocked to 3.5 GHz in a game benchmark. This is a rare example and I would sadly have a hard time finding it again :( , but what happened is that the smaller cache of the Celeron was just not big enough to fit the data needed, so it had to keep reading it from RAM, which slows things down considerably because the CPU can do nothing but sit idle and wait for data much of the time. This was one of the more CPU-intensive UT2k3 benchies, with a graphics card fast enough to shift the bottleneck to the CPU.
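    The cache effect described above can be put into a toy cost model (all numbers invented for illustration): once accesses start missing cache, the core mostly stalls waiting on RAM, and a lower-clocked chip with a better hit rate can come out ahead.

```python
def avg_cycles_per_access(hit_rate, hit_cycles=2, miss_penalty=200):
    """Average cost of a memory access: cache hits are cheap, misses
    stall the core for a RAM round-trip (illustrative numbers)."""
    return hit_rate * hit_cycles + (1 - hit_rate) * miss_penalty

# A 3.5 GHz core whose working set misses cache 10% of the time vs.
# a 1.5 GHz core with a big enough cache to hit 99% of the time:
fast_core = 3.5e9 / avg_cycles_per_access(hit_rate=0.90)
slow_core = 1.5e9 / avg_cycles_per_access(hit_rate=0.99)
print(f"high-clock core: {fast_core:.2e} accesses/s")
print(f"low-clock core:  {slow_core:.2e} accesses/s")
# The lower-clocked core with the better hit rate wins.
```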

    In some applications a Celeron is just a lovely processor, though, but not in newer games. It is as good as a regular P4 at media encoding and such, if you give it the same memory bandwidth (which is not always an easy thing to do).

    Pentium 4 and Celeron both have very deep floating-point and integer pipelines. This lets you ramp up the frequency, since each stage requires less work to complete but there are more stages; the upside is that every stage can be kept busy with a different computation, and you get things done more quickly if you can keep the machinery busy and avoid mistakes. If, however, the processor suddenly encounters a conditional branch, it must have good branch prediction to guess the outcome, or it will have to wait until the relevant instruction completes. Modern processors take the guessing approach: if the guess is wrong, the processor has done some faulty calculations based on it, and it basically just throws them out the window and starts over. The longer the pipeline, the more you lose when a branch prediction is wrong.
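    The same kind of toy model (invented numbers again) shows why a deeper pipeline pays more for each wrong guess: a misprediction flushes roughly pipeline-depth cycles of work.

```python
def effective_ips(clock_hz, pipeline_depth, mispredict_rate,
                  branch_fraction=0.2):
    """Instructions/second assuming an ideal 1 instruction per cycle,
    minus pipeline flushes on mispredicted branches (toy model)."""
    # Extra cycles per instruction lost to flushes:
    flush_cpi = branch_fraction * mispredict_rate * pipeline_depth
    return clock_hz / (1.0 + flush_cpi)

clock = 3.0e9
for depth in (10, 20):
    good = effective_ips(clock, depth, mispredict_rate=0.02)  # predictable code
    bad = effective_ips(clock, depth, mispredict_rate=0.20)   # branchy code
    print(f"depth {depth}: keeps {bad / good:.0%} of its speed on branchy code")
```

    The deeper pipeline retains a smaller fraction of its throughput once branches stop predicting well, which is the trade-off described above.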

    The Athlon processor has 3 floating-point units; a P4 has only one, but it operates at twice the frequency of the rest of the processor. In addition, the P4 has deeper pipelines, which makes some applications suffer. To further complicate things, some Pentium 4s have Hyper-Threading, which is actually a disadvantage in most games, which are typically single-threaded applications (though the performance hit is small). New P4s have very high memory bandwidth, making them more appropriate for some applications. The new Opterons and Athlon 64s are 64-bit processors; that does not necessarily mean they are twice as fast as a similarly architected 32-bit processor. It means they can handle 64-bit data as quickly as a 32-bit processor handles 32-bit data: you can handle higher precision or larger numbers with the same ease. This is not the only advantage; you can also address more memory (a 32-bit processor can 'only' address 4 GB).
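    The "4 GB" figure is just arithmetic: an n-bit address can name 2^n distinct bytes, so each extra address bit doubles the addressable memory.

```python
# Addressable memory for an n-bit byte address, in bytes:
def addressable_bytes(bits):
    return 2 ** bits

GIB = 2 ** 30  # one gibibyte
print(addressable_bytes(32) / GIB)  # 4.0 -> the 4 GB limit of 32-bit
print(addressable_bytes(64) / GIB)  # 17179869184.0 GiB (16 EiB)
```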

    There are so many things that are more important than pure clock frequencies. Clock frequencies are only an accurate description of performance when comparing different processors with the same architecture.

    As it happens, an Athlon XP at 2 GHz is roughly equivalent to a 2.4 GHz P4 in many games and floating-point-intensive applications. Since the Athlon XP lacks SSE2 support, it is generally poor in heavily SSE2-optimized applications. It is very difficult to give a direct comparison between two processors without going into special cases and giving the results of numerous real-world tests as well as benchmarks.