Take some work off my GPU's shoulders

AlunduynAlunduyn Italy Join Date: 2014-12-05 Member: 199963Members
edited December 2014 in Technical Support
There have been tons of threads about performance in NS2. I've read many of them, and the main problem usually seems to be the CPU. Not in my case, though.

I have awful FPS drops, from 50 down to 2 at any moment, mostly in fights but not exclusively. My settings are already super low (decal lifetime at 0, everything off except physics multithreading, GPU texture RAM at 1 GB, VSync) and I downscaled the resolution from 1920x1080 to 1600x900. I won't whine "OMG I get 9k FPS in *insert game*", but I really need a hint right now. The game is unplayable as it is, and I'd really like to keep playing (14 hrs on record).

The bottleneck is totally the GPU: my wait time is 0 to 5 ms during the smooth stretches, and it grows to 20 or even 50 ms, making the framerate drop to 5-10 FPS.

I was thinking about lighting: r_stats always reports 13 lights rendered. Is there any way to manually remove some of them? My GPU (GTX 260M) is pretty outdated, even though its VRAM is 1 GB, so texture handling should not be the problem. I guess that removing things to calculate would make its life easier. Simplifying effects could also help. I could not find any such option, either in game or in the config.

Is there any way to act on these directly? If not, I don't see much chance of playing this jewel well. Reducing the resolution only ever increases the FPS by about 5 (if at 1920 I go from 45 to 3, at 1600 I go from 50 to 8), so playing at 800x600 would probably make the drops just temporary nuisances, but I'd really rather not go below 1600x900.

Thanks for reading

edit: the CPU is an Intel i7 at 1.6 GHz x4. Not essential to know, I guess, but sharing doesn't hurt.

Comments

  • GhoulofGSG9GhoulofGSG9 Join Date: 2013-03-31 Member: 184566Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Squad Five Silver, Reinforced - Supporter, WC 2013 - Supporter, Pistachionauts
    You might have misunderstood something here: waiting for world update = waiting for the CPU!

    And your CPU is way below the minimum requirements. Believe me, playing NS2 is mostly impossible without at least 2 GHz per core.

    Try enabling physics multi-threading (it might help a little bit), and if you want further support, please take a screenshot of your r_stats.
  • NordicNordic Long term camping in Kodiak Join Date: 2012-05-13 Member: 151995Members, NS2 Playtester, NS2 Map Tester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Shadow
    I agree with Ghoul: your CPU is the bottleneck, and your GPU is a bit on the low side too.

    One additional way you might pull a few more FPS is by playing on servers that run the NS2+ mod. With it you can set a low-lights option, reduce atmospherics, and make some other minor performance tweaks. I really don't think this will be enough to make the game playable, but NS2+ has a ton of other cool features too.

    Also, I would recommend playing on smaller servers (16 players) if possible. The more players, the worse your performance can be in those large battles, especially in the late game. I recommend this purely for performance, not gameplay. Even with that, I don't think it would be enough.
  • IronHorseIronHorse Developer, QA Manager, Technical Support & contributor Join Date: 2010-05-08 Member: 71669Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    edited December 2014
    Yeah, your GPU performs worse than an 8800 GTX, which came out almost 9 years ago.

    Also, just for your information, the minimum required specifications for the game are almost universally agreed to be too low.
    I'd consider the "Recommended" specs the bare minimum; a Core 2 Duo at 3 GHz is still going to be slow, and together with that 450 GPU you won't even be able to hold 60 FPS.
    So, in general, I would say that your laptop is just too weak to play NS2 enjoyably, unfortunately. You probably have issues running other modern games (that aren't on the Source engine) like the new Assassin's Creeds, The Witcher 2, Crysis, Metro, Shadow of Mordor, etc.?

    My condolences if you got tricked into buying that laptop; I really hate retailers who sell "gaming laptops" to people who are unfamiliar with hardware, generally with low-clockrate CPUs and bottom-of-the-current-generation GPUs. So many friends of mine have had this happen to them.

    If you end up needing any help building a new rig, feel free to ask :)


  • AeglosAeglos Join Date: 2010-04-06 Member: 71189Members
    Hijacking the thread, but I'd like to ask for help with my rig. My bottlenecks are the GPU and the world update thread (what is that, anyway?).

    Currently have this
    Processor Intel® Core™ i5-2320 CPU @ 3.00GHz
    Video Card NVIDIA GeForce GTX 550 Ti
    Memory 8.2 GB

    and am ordering a GTX 970.

    Do I need to get a new processor? Will an SSD help with FPS? I load pretty quickly despite my HDD.
  • NordicNordic Long term camping in Kodiak Join Date: 2012-05-13 Member: 151995Members, NS2 Playtester, NS2 Map Tester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Shadow
    Aeglos wrote: »
    Hijacking the thread, but I'd like to ask for help with my rig. My bottlenecks are the GPU and the world update thread (what is that, anyway?).

    Currently have this
    Processor Intel® Core™ i5-2320 CPU @ 3.00GHz
    Video Card NVIDIA GeForce GTX 550 Ti
    Memory 8.2 GB

    and am ordering a GTX 970.

    Do I need to get a new processor? Will an SSD help with FPS? I load pretty quickly despite my HDD.
    World update is CPU. You do not need a new processor. An SSD never helps FPS in any game, but it will make the OS and loading times much nicer. I load in 10-12 seconds on an SSD.
  • AeglosAeglos Join Date: 2010-04-06 Member: 71189Members
    Well yeah, Ghoul said so too. But what's the difference between that and waiting for the CPU?

    Also, if that is a bottleneck, why don't I need a new processor?
  • GhoulofGSG9GhoulofGSG9 Join Date: 2013-03-31 Member: 184566Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Squad Five Silver, Reinforced - Supporter, WC 2013 - Supporter, Pistachionauts
    Aeglos wrote: »
    Well yeah, Ghoul said so too. But what's the difference between that and waiting for the CPU?

    Also, if that is a bottleneck, why don't I need a new processor?

    There is no "waiting for cpu" in r_stats anymore, because NS2 now somewhat supports multi-threading.

    So "waiting for world update" = waiting for the CPU to calculate the world state for the current tick (basically, working out what is going on based on incoming network packets, client input, etc. in-game) and pass it to the GPU so that the GPU can start rendering.

    Hope that answers your question ;)
  • ATFATF Join Date: 2014-05-09 Member: 195944Members
    Overclocking it to 3.6 shouldn't take more than 10 minutes.

    As for the laptop CPU, that is indeed trickery.
    Intel's site says the turbo clock for the i7-720QM is 2.8 GHz. If the BIOS allows it (it probably won't), a straight overclock to 3.2 (with an optional slight undervolt) might help. As others have said, getting even that to playable levels... tough.
  • AeglosAeglos Join Date: 2010-04-06 Member: 71189Members
    Thanks, Ghoul. I don't understand the technical details, but I got the gist of it.

    @ATF‌ I'm not familiar or comfortable with overclocking, though. I'm not really technically competent. Would it not be safer to get a new processor? I'm worried about overheating and stuff.

    I don't know; I feel like fumbling around installing factory parts is easier than modifying existing ones.
  • GhoulofGSG9GhoulofGSG9 Join Date: 2013-03-31 Member: 184566Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Squad Five Silver, Reinforced - Supporter, WC 2013 - Supporter, Pistachionauts
    Aeglos wrote: »
    Thanks, Ghoul. I don't understand the technical details, but I got the gist of it.

    @ATF‌ I'm not familiar or comfortable with overclocking, though. I'm not really technically competent. Would it not be safer to get a new processor? I'm worried about overheating and stuff.

    I don't know; I feel like fumbling around installing factory parts is easier than modifying existing ones.

    That's true to some extent, depending on whether you built your PC with OC'ing in mind or not. Meaning: do you have a Z77 board, what CPU cooler do you use, etc.?

    Overall, overclocking a non-K i5 CPU is never the best idea these days, as the gain versus the extra power draw and heat is not very effective.

    So in your case, I guess OC'ing wouldn't really help you a lot at all ;)
  • AeglosAeglos Join Date: 2010-04-06 Member: 71189Members
    I have an MSI P67A-C45. Cooler? No idea; just regular fans, probably. So I didn't build it myself, I had someone else help me, but I suppose I should start doing things myself.
  • matsomatso Master of Patches Join Date: 2002-11-05 Member: 7000Members, Forum Moderators, NS2 Developer, Constellation, NS2 Playtester, Squad Five Blue, Squad Five Silver, Squad Five Gold, Reinforced - Shadow, NS2 Community Developer
    NS2 has a somewhat complex performance profile.

    Main sequence of steps in generating a frame:
    1. Check if the GPU is overloaded (has too many outstanding frames of data), waiting until the GPU catches up
    2. Setup for rendering (draining stuff from the Lua world into the Spark engine)
    3. Split off a WorldUpdateJob in its own thread, which will use the Lua VM to prepare data for the next step
    4. Do the actual rendering pipeline work (which does not use the Lua VM, it's all C++)
    5. Wait for the Update Job to finish.
    6. Finish off the frame (using the Lua VM)
    7. Present it to the GPU and wait until the GPU has accepted it (driver pushing data to the graphics card)

    Note that as the Lua VM is not multithreaded, it has to be handed off between threads, with only one thread using it at a time.

    There are three wait points here, which are shown in r_stats:
    - waiting for GPU: how much time the thread spends waiting for the GPU in #1 + #7
    - wait for renderer > 0: the time between the end of the UpdateWorldJob and when the render finished, i.e. step #4 is the bottleneck
    - wait for update > 0: time spent in step #5, i.e. the UpdateWorldJob is the CPU bottleneck

    Note that while in any one frame only one of wait for update/render can be > 0, what is shown is the average. So a little bit of flickering between renderer/update shows that they are balanced.

    The UpdateWorldJob is normally the CPU bottleneck in the late game, while the renderer is usually the CPU bottleneck in the early game.

    You can use these to figure out where your bottleneck is and what you can do to improve performance.

    Only measure during late-game combat; measuring while the game is idling is useless.

    1. If the GPU limits you during the late game, try dropping graphics settings.
    2. If the render thread is the limit during the late game, drop graphics settings.
    3. If the update world job is the limit... make sure you have physics multithreading on; that's the only option that affects the world update job's speed. Then increase your graphics options until things are balanced.
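    The seven steps above can be sketched as a toy loop. This is a hypothetical Python stand-in for the real C++/Lua Spark code; every name in it is invented for illustration:

```python
import threading
import queue

def world_update_job(state):
    # Steps 3/5: Lua-VM work preparing the next tick, done off the main thread.
    return {"tick": state["tick"] + 1}

def run_frame(state, gpu_queue, max_outstanding=2):
    # 1. Throttle if the GPU already has too many outstanding frames.
    while gpu_queue.qsize() >= max_outstanding:
        gpu_queue.get()  # stand-in for "wait until the GPU catches up"
    # 2. Setup for rendering: drain game state into the renderer's input.
    render_input = dict(state)
    # 3. Split off the world update job on its own thread.
    result = {}
    job = threading.Thread(target=lambda: result.update(world_update_job(state)))
    job.start()
    # 4. Rendering pipeline work (no Lua VM involved), using the snapshot.
    frame = ("frame", render_input["tick"])
    # 5. Wait for the update job to finish ("wait for update" in r_stats).
    job.join()
    # 6. Finish off the frame with the update job's results.
    state.update(result)
    # 7. Present: hand the frame to the GPU/driver queue.
    gpu_queue.put(frame)
    return frame

state = {"tick": 0}
gpu = queue.Queue()
frames = [run_frame(state, gpu) for _ in range(3)]
```

    Note how only one thread touches `state` (the stand-in for the Lua VM) at a time: the render step works from a snapshot while the update job runs, matching the handoff described above.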
  • AlunduynAlunduyn Italy Join Date: 2014-12-05 Member: 199963Members
    GhoulofGSG9 wrote: »
    You might have misunderstood something here: waiting for world update = waiting for the CPU!

    And your CPU is way below the minimum requirements. Believe me, playing NS2 is mostly impossible without at least 2 GHz per core.

    Try enabling physics multi-threading (it might help a little bit), and if you want further support, please take a screenshot of your r_stats.

    Indeed, I'm waiting for my GPU; all the other values aren't that bad (1-2 ms in the worst cases).
    I attached 3 screenshots of my r_stats, taken while spectating. The first one is in an easy landscape: https://us.v-cdn.net/5019629/uploads/FileUpload/c7/79fb2adfe1e199c35b575ca2a013c2.jpg
    The second one is in a fight, but smooth: https://us.v-cdn.net/5019629/uploads/FileUpload/aa/90ef60850ac7d3f9e14b8b9f46e9cc.jpg
    The third one is just after a big spike (I wasn't able to capture it exactly in the middle because of the spike itself). The GPU wait time is going back to 0, but a 22 is visible, showing that it had been even higher. FPS is still 10, and the other two "waiting for" values are not that bad (7 in world update, because the CPU is not great, but nowhere near the 22+ of the GPU): https://us.v-cdn.net/5019629/uploads/FileUpload/2c/72add7c0b59a772365f3dda240f37d.jpg
    Hope these help.

    Nordic wrote: »
    I agree with Ghoul: your CPU is the bottleneck, and your GPU is a bit on the low side too.

    One additional way you might pull a few more FPS is by playing on servers that run the NS2+ mod. With it you can set a low-lights option, reduce atmospherics, and make some other minor performance tweaks. I really don't think this will be enough to make the game playable, but NS2+ has a ton of other cool features too.

    Also, I would recommend playing on smaller servers (16 players) if possible. The more players, the worse your performance can be in those large battles, especially in the late game. I recommend this purely for performance, not gameplay. Even with that, I don't think it would be enough.

    I will check out NS2+; I was already considering adding it across the board since the features looked quite awesome, and this definitely convinced me.
    IronHorse wrote: »
    Yeah, your GPU performs worse than an 8800 GTX, which came out almost 9 years ago.

    Also, just for your information, the minimum required specifications for the game are almost universally agreed to be too low.
    I'd consider the "Recommended" specs the bare minimum; a Core 2 Duo at 3 GHz is still going to be slow, and together with that 450 GPU you won't even be able to hold 60 FPS.
    So, in general, I would say that your laptop is just too weak to play NS2 enjoyably, unfortunately. You probably have issues running other modern games (that aren't on the Source engine) like the new Assassin's Creeds, The Witcher 2, Crysis, Metro, Shadow of Mordor, etc.?

    My condolences if you got tricked into buying that laptop; I really hate retailers who sell "gaming laptops" to people who are unfamiliar with hardware, generally with low-clockrate CPUs and bottom-of-the-current-generation GPUs. So many friends of mine have had this happen to them.

    If you end up needing any help building a new rig, feel free to ask :)


    I read about the minimum specs not being enough, but I don't feel like I'm on an extremely low-end rig. I know it's a laptop and it doesn't stand a chance against a desktop; I knew that when buying, but portability and price convinced me. I wasn't misinformed, and I don't regret the choice at all: being able to move it is priceless, and I need to move a lot. Besides, yes, my PC is 5 to 6 years old now, but it was a high-end piece of hardware when it came out. The problem might just be hardware failure, but that seems weird because I don't have any problems with other new games (at low settings, obviously; I can't afford to play The Witcher 2 at high, but I don't suffer insane FPS drops. Metro 2033 is at mid graphics too, and I get a great 60 FPS experience. I don't own many other games, but I could mention Dota 2, Warframe, PoE, and Sanctum 2 being played at low-mid settings without problems. Oh, and the resolution was 1920x1080 in every one).
    About the CPU: is it really that crucial? Does the game make so little use of multiple cores? The minimum spec states 4 GHz of computation required; with my 4 cores I get 6.4 GHz (1.6x4), but I never thought the 3rd and 4th cores might not even be used. Is it really necessary to have high clocks on fewer cores? Thanks for pointing that out.
  • NordicNordic Long term camping in Kodiak Join Date: 2012-05-13 Member: 151995Members, NS2 Playtester, NS2 Map Tester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Shadow
    The CPU is that crucial; I would refer you to matso's post. I don't think it is a hardware failure; NS2 is just that much of a performance eater.
  • matsomatso Master of Patches Join Date: 2002-11-05 Member: 7000Members, Forum Moderators, NS2 Developer, Constellation, NS2 Playtester, Squad Five Blue, Squad Five Silver, Squad Five Gold, Reinforced - Shadow, NS2 Community Developer
    edited December 2014
    Alunduyn wrote: »
    About the CPU: is it really that crucial? Does the game make so little use of multiple cores? The minimum spec states 4 GHz of computation required; with my 4 cores I get 6.4 GHz (1.6x4), but I never thought the 3rd and 4th cores might not even be used. Is it really necessary to have high clocks on fewer cores? Thanks for pointing that out.

    The client on average uses about 2.5 cores. It makes use of up to 4 cores with physics multithreading on, but only in short bursts. Most of the time, the speed of the game will scale linearly with GHz. Performance improves a little with a third core, not much with a fourth core, and not at all beyond that.

    The idea that a quad core at 2 GHz == 1 core @ 8 GHz... does not square with reality.

    It doesn't hold for most programs, and most definitely not for NS2.
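    The gap between "summed GHz" and real throughput can be sketched with Amdahl's law: adding cores only speeds up the parallelizable fraction of the work. The function below is a toy model, and the 30% parallel fraction is an illustrative guess, not a measured NS2 number:

```python
def effective_ghz(base_ghz, cores, parallel_fraction):
    """Throughput relative to one core at base_ghz, via Amdahl's law."""
    serial = 1.0 - parallel_fraction
    speedup = 1.0 / (serial + parallel_fraction / cores)
    return base_ghz * speedup

# Hypothetical 30%-parallel workload (illustrative only).
quad_low_clock = effective_ghz(1.6, 4, 0.3)   # 4 cores @ 1.6 GHz -> ~2.06
dual_high_clock = effective_ghz(3.0, 2, 0.3)  # 2 cores @ 3.0 GHz -> ~3.53
```

    Nowhere near the naive 1.6 x 4 = 6.4 GHz: when most of the frame work is serial, clock speed dominates core count.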

  • Soul_RiderSoul_Rider Mod Bean Join Date: 2004-06-19 Member: 29388Members, Constellation, Squad Five Blue
    During the beta for NS2, I had a temporary rig with an NVidia 250GT and an i3 530 processor. That is more powerful than your laptop, and I maxed out at 26 FPS if I was lucky. Although NS2's performance has improved, I honestly don't see low-end machines running NS2 without major issues.
  • Kouji_SanKouji_San Sr. Hινε Uρкεερεг - EUPT Deputy The Netherlands Join Date: 2003-05-13 Member: 16271Members, NS2 Playtester, Squad Five Blue
    It's really of no use comparing the Spark engine to other game engines; the fact of the matter is that it is indeed a resource hog, and those recommended specs are, IMHO, the absolute minimum specs.

    Basically, for NS2 your laptop's specs are way too low to brute-force it; other engines simply aren't as dependent on sheer CPU brute force. Spark might be one of the most moddable engines out there and perfect for small dev teams, but it pays the price of high requirements at minimum settings. Also, NS2's content is mostly what causes this, AFAIK: remove the power grid/infestation and the engine suddenly runs a lot smoother (NS2: Combat, for instance).
  • AlunduynAlunduyn Italy Join Date: 2014-12-05 Member: 199963Members
    edited December 2014
    Kouji_San wrote: »
    It's really of no use comparing the Spark engine to other game engines; the fact of the matter is that it is indeed a resource hog, and those recommended specs are, IMHO, the absolute minimum specs.

    Basically, for NS2 your laptop's specs are way too low to brute-force it; other engines simply aren't as dependent on sheer CPU brute force. Spark might be one of the most moddable engines out there and perfect for small dev teams, but it pays the price of high requirements at minimum settings. Also, NS2's content is mostly what causes this, AFAIK: remove the power grid/infestation and the engine suddenly runs a lot smoother (NS2: Combat, for instance).
    I'm not complaining about the engine; mods make games live longer, so that's definitely worth some performance. Apart from that, given how customizable the engine is, isn't there really anything that can be cut from the graphics? Specific lights (compensated by increased brightness), complicated models, rooms rendered behind walls...? The Steam Workshop had a "super low" texture pack with almost solid-color textures to take load off the VRAM; it's not working anymore. That's mostly the kind of thing I'm looking for.
    Soul_Rider wrote: »
    During the beta for NS2, I had a temporary rig with an NVidia 250GT and an i3 530 processor. That is more powerful than your laptop, and I maxed out at 26 FPS if I was lucky. Although NS2's performance has improved, I honestly don't see low-end machines running NS2 without major issues.
    "Powerful" is not a great word to describe performance behavior. The fact that it's 50 FPS most of the time with sudden drops to 5 FPS makes me wonder how much less powerful it really is, or how much performance Unknown Worlds has gotten out of their game. I'd like to steer the topic back to "what to do", not "how big is your rig". I already said that I know it's outdated hardware and that I can't expect 360 FPS at high; I'm asking about removing stuff from the render queue to take work off my GPU. Or CPU. I was quite sure the problem was the former, but I'm not so sure anymore.
    matso wrote: »
    The idea that a quad core at 2 GHz == 1 core @ 8 GHz... does not square with reality.
    I know the story about cores, but 1 core @ 3 GHz > 4 cores @ 1.6 GHz for NS2? That's what I gather from these posts, but it sounds quite insane, even granting that the engine isn't high-FPS-oriented.

  • matsomatso Master of Patches Join Date: 2002-11-05 Member: 7000Members, Forum Moderators, NS2 Developer, Constellation, NS2 Playtester, Squad Five Blue, Squad Five Silver, Squad Five Gold, Reinforced - Shadow, NS2 Community Developer
    Alunduyn wrote: »
    matso wrote: »
    The idea that a quad core at 2 GHz == 1 core @ 8 GHz... does not square with reality.
    I know the story about cores, but 1 core @ 3 GHz > 4 cores @ 1.6 GHz for NS2? That's what I gather from these posts, but it sounds quite insane, even granting that the engine isn't high-FPS-oriented.

    How many cores a program uses depends on the program. The NS2 client was designed to use two cores, because that was the target platform. Considering how small the UWE NS2 team was, and how difficult it has been to get the game to run reasonably well on two cores, it made no sense whatsoever to spend resources making it run better on quad cores.

    A 2-core 3 GHz platform will run NS2 "reasonably" well (for a generous value of reasonable). A 4-core 1.6 GHz platform... won't.

    That being said, I'd guess that a quad 1.6 GHz runs the game pretty much as well as a 2x3 GHz did at release. Which is to say, not very well... things have improved a lot since release (no Lua JIT _at all_ back then, plus a whole load of tiny performance improvements).

    It is highly unlikely that things ever will improve enough to make a quad 1.6Ghz a viable platform though.

    Sorry, but that's just reality.
  • IronHorseIronHorse Developer, QA Manager, Technical Support & contributor Join Date: 2010-05-08 Member: 71669Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    edited December 2014
    Well, I am glad you did not get tricked :) but just FYI:
    Alunduyn wrote: »
    Besides, yes, my PC is 5 to 6 years old now, but it was a high-end piece of hardware when it came out.
    Sorry to be the bearer of bad news, but it wasn't :-/ It was still pretty entry-level, at least as far as gaming goes.*

    Mobile GPUs better than yours at that time: GTX 275, GTX 280, GTX 285, GTX 295 (not counting AMD)
    Mobile CPUs better than yours at that time (yours was actually at the bottom of the available CPUs): Core i7-740QM, Core i7-820QM, Core i7-840QM, Core i7-920XM, Core i7-940XM, and 11 more in the "Arrandale" category.

    Two very useful sites that help compare CPUs and GPUs (prices, performance, value):
    http://videocardbenchmark.net/
    http://cpubenchmark.net/

    *Laptops will always be inferior when it comes to gaming, unfortunately, just due to the power requirements and heat dissipation issues they face.
    That's besides the actual difference in similar hardware, too. Here's a good example: your card versus the desktop version.

    I've owned (needlessly wasted $ on) an Alienware laptop and built my own... so let me just say that if you're looking for bare-bones systems that give you a lot of power for the money, check out http://www.sagernotebook.com/
    You won't get Dell-level customer support, but you'll get good prices for the hardware (and they are less likely to offer BS configurations).
  • AlunduynAlunduyn Italy Join Date: 2014-12-05 Member: 199963Members
    edited December 2014
    matso wrote: »
    Alunduyn wrote: »
    matso wrote: »
    The idea that a quad core at 2 GHz == 1 core @ 8 GHz... does not square with reality.
    I know the story about cores, but 1 core @ 3 GHz > 4 cores @ 1.6 GHz for NS2? That's what I gather from these posts, but it sounds quite insane, even granting that the engine isn't high-FPS-oriented.

    How many cores a program uses depends on the program. The NS2 client was designed to use two cores, because that was the target platform. Considering how small the UWE NS2 team was, and how difficult it has been to get the game to run reasonably well on two cores, it made no sense whatsoever to spend resources making it run better on quad cores.

    A 2-core 3 GHz platform will run NS2 "reasonably" well (for a generous value of reasonable). A 4-core 1.6 GHz platform... won't.

    That being said, I'd guess that a quad 1.6 GHz runs the game pretty much as well as a 2x3 GHz did at release. Which is to say, not very well... things have improved a lot since release (no Lua JIT _at all_ back then, plus a whole load of tiny performance improvements).

    It is highly unlikely that things ever will improve enough to make a quad 1.6Ghz a viable platform though.

    Sorry, but that's just reality.

    I'm not whining; the game is great as it is. I was just digging for more specific information, and this is it. Thanks for sharing. I will consider overclocking, though; maybe getting those cores to 2 GHz may help a little.
    IronHorse wrote: »
    Well, I am glad you did not get tricked :) but just FYI:
    Alunduyn wrote: »
    Besides, yes, my PC is 5 to 6 years old now, but it was a high-end piece of hardware when it came out.
    Sorry to be the bearer of bad news, but it wasn't :-/ It was still pretty entry-level, at least as far as gaming goes.*

    Mobile GPUs better than yours at that time: GTX 275, GTX 280, GTX 285, GTX 295 (not counting AMD)
    Mobile CPUs better than yours at that time (yours was actually at the bottom of the available CPUs): Core i7-740QM, Core i7-820QM, Core i7-840QM, Core i7-920XM, Core i7-940XM, and 11 more in the "Arrandale" category.

    Two very useful sites that help compare CPUs and GPUs (prices, performance, value):
    http://videocardbenchmark.net/
    http://cpubenchmark.net/

    *Laptops will always be inferior when it comes to gaming, unfortunately, just due to the power requirements and heat dissipation issues they face.
    That's besides the actual difference in similar hardware, too. Here's a good example: your card versus the desktop version.

    I've owned (needlessly wasted $ on) an Alienware laptop and built my own... so let me just say that if you're looking for bare-bones systems that give you a lot of power for the money, check out http://www.sagernotebook.com/
    You won't get Dell-level customer support, but you'll get good prices for the hardware (and they are less likely to offer BS configurations).

    Let's not take the discussion out of context. It's a laptop, and because of that the hardware will always face more trouble than a desktop's (namely space and heat), so a "high tier" desktop graphics card will always beat a laptop one. But comparing a desktop and a laptop makes no more sense than comparing a cluster supercomputer with a really high-end desktop. They are completely different platforms (not exactly, to be totally honest, since those are just clusters, but I'm sure you get the point). Alienware laptops are not exactly what a gaming rig should be, but as I said before, portability was not optional, and assembling my own laptop was out of the question, considering that the heat dissipation and space issues would still be present. I'll take the blame and rephrase my sentence: "Besides, yes, my PC is 5 to 6 years old now, but it was a high-end piece of laptop hardware when it came out." As for whether I spent my money well or not, I'd say it was worth it. Thanks for the sites, though; I will check them, but I really don't think I'll be upgrading hardware or buying a desktop anytime soon, mostly because I can't afford both a hardcore desktop and a mobile laptop.

    P.S.: I hate BBCode. I really do. I had to rewrite all my posts at least once. Sad me is sad.
  • ATFATF Join Date: 2014-05-09 Member: 195944Members
    @Alunduyn, I saw you on Wooza's the day before yesterday. That 42-player server doesn't just stress the server hardware; it's tough on clients too. While I am well aware how much fun that place is, as long as you're running NS2 on a potato, consider playing on <20-slot servers. There are decent ones with likable people: Hellarious Basterds, YOClan, and Survival of the Fattest.

    Aeglos wrote: »
    @ATF‌ I'm not familiar or comfortable with overclocking, though. I'm not really technically competent. Would it not be safer to get a new processor? I'm worried about overheating and stuff.
    I don't know; I feel like fumbling around installing factory parts is easier than modifying existing ones.
    Might as well reply, as the question was directed at me personally:
    DEXTER JETTSTER: It depends.
    OBI-WAN: On what, Dex?
    Dexter grins.
    DEXTER JETTSTER: On how good your manners are... and how big your pocketbook is...
  • AeglosAeglos Join Date: 2010-04-06 Member: 71189Members
    ATF wrote: »
    Aeglos wrote: »
    @ATF‌ I'm not familiar or comfortable with overclocking, though. I'm not really technically competent. Would it not be safer to get a new processor? I'm worried about overheating and stuff.
    I don't know; I feel like fumbling around installing factory parts is easier than modifying existing ones.
    Might as well reply, as the question was directed at me personally:
    DEXTER JETTSTER: It depends.
    OBI-WAN: On what, Dex?
    Dexter grins.
    DEXTER JETTSTER: On how good your manners are... and how big your pocketbook is...

    I suppose I can outsource the technical difficulties away, but then I probably don't need manners.

    Kidding aside, I have been recommended a CPU below my maximum budget and advised to watch out for bottlenecking my GPU as well. Thanks for the help, though. Appreciate it.

  • IronHorseIronHorse Developer, QA Manager, Technical Support & contributor Join Date: 2010-05-08 Member: 71669Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    nizb0ag wrote: »
    In console type r_sync 1 and disable minimal on top right of marines, tell me how it goes :) just test it.
    This will incur an additional frame of buffering and will increase your mouse input delay.
    I do not recommend this advice.
  • ArchieArchie Antarctica Join Date: 2006-09-19 Member: 58028Members, Constellation, Reinforced - Supporter, WC 2013 - Supporter
    edited December 2014
    Fine, for that small input lag that won't be added at all, enjoy your low FPS.
  • IronHorseIronHorse Developer, QA Manager, Technical Support & contributor Join Date: 2010-05-08 Member: 71669Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    edited December 2014
    16 to 32 ms of extra mouse input delay for 1 extra buffered frame (based on 60 to 30 fps).
    It's the exact same amount as VSync, but instead of capping frames it can give a few more fps in some cases, so the delay amount will vary.
  • ArchieArchie Antarctica Join Date: 2006-09-19 Member: 58028Members, Constellation, Reinforced - Supporter, WC 2013 - Supporter
    edited December 2014
    What are you using as a test bench, what hardware, and where are you getting the 16-32ms of extra input lag from? o.O That's very unusual, considering 32ms of extra mouse input lag is similar to using a large TV as your monitor, or even a console gamepad to play.

    Actually, just about everyone in AusNS who has used it on Australian servers isn't reporting this mysterious 32ms of input lag, and neither am I, so I'm wondering why it's so laggy for you and not for us at r_sync 1. Anything above 1 will give you larger input lag and won't actually help your fps; it lowers it in some cases, which is why I said it was broken on February 2nd this year (4 patches before the latest update).

    "Yep, which is why I mentioned that if you're already getting higher FPS this will only add to your input lag, and why I recommended just disabling it.

    The input lag is minimal if your fps is around your refresh rate, I've found. That means even with my refresh rate currently at 93 (modified with CRU 1.0), at 120 fps the input lag is barely noticeable; any higher, though, and it feels like mouse smoothing is on because of how it interacts with the frames."

    Anyway, whatever man, just let the OP decide what he wants; your hardware is different from his.

    http://esreality.com/post/2640619/input-lag-tests-ql-csgo/
  • AsranielAsraniel Join Date: 2002-06-03 Member: 724Members, Playtest Lead, Forum Moderators, NS2 Playtester, Squad Five Blue, Reinforced - Shadow, WC 2013 - Shadow, Subnautica Playtester, Retired Community Developer
    Calm down. Fact is, r_sync adds one frame of input lag; that's just how it works. Double buffering/VSync works the same way and adds the same amount of input lag, and I hope you don't dispute that.

    The 16-32ms figure is simple to calculate: 16ms is for 60 fps (1 second / 60 ≈ 16.7 ms) and 32 is for 30 fps (1 second / 30 ≈ 33.3 ms). That's also simple math.

    Now the debate is of course whether those 16-32ms of ADDITIONAL input lag (added on top of your monitor delay) are acceptable or not. For some they are, for some they're not; that's why some people use double buffering and some don't.

    So while your suggestion is interesting, as the r_sync command was previously unknown to me, just don't forget that by its nature it adds input lag.

    You can test it yourself by setting r_sync to a higher value; then you will have very noticeable input lag.
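    The arithmetic in this post generalizes: each queued frame holds your input for roughly one full frametime, so the extra delay is about (queued frames) × (1000 / fps). A back-of-envelope sketch, with function names of my own invention (this is not NS2 engine code, just the math from the post):

```python
def frame_time_ms(fps):
    """Duration of one rendered frame, in milliseconds."""
    return 1000.0 / fps

def buffered_input_delay_ms(fps, queued_frames=1):
    """Approximate extra input delay from holding `queued_frames` frames
    in the render queue; each queued frame delays input by one frametime."""
    return queued_frames * frame_time_ms(fps)

# One extra buffered frame (r_sync 1 / double buffering):
print(buffered_input_delay_ms(60))     # ~16.7 ms at 60 fps
print(buffered_input_delay_ms(30))     # ~33.3 ms at 30 fps
# Higher r_sync values multiply the delay, which is why they feel sluggish:
print(buffered_input_delay_ms(60, 3))  # ~50 ms at 60 fps
```

    This also shows why the same setting hurts more at low framerates: the per-frame penalty doubles when fps halves.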
  • ArchieArchie Antarctica Join Date: 2006-09-19 Member: 58028Members, Constellation, Reinforced - Supporter, WC 2013 - Supporter
    edited December 2014
    AMD/NVIDIA drivers by default have the buffer set to 3 (3 frames); adding a fourth with the NS2 engine can build on that. Of course it should be 16-32ms, but at the end of the day most people aren't noticing the input lag because they're already getting worse FPS.

    Test case: most gamers are using more than the 125Hz default mouse polling rate, usually something closer to 500Hz-1000Hz. When it all adds up, the engine itself uses around 10ms. r_sync at default engine values allows for rendering at 3-4% in quiet environments; with the flip queue set to 1 instead of 3 (AMD and NVIDIA both default to 3) you get 1% rendering but also more skipping, while the mouse input delay is technically non-existent. The difference between flip queue 1 and 3 is barely noticeable, except that you lose 30-50fps depending on your setup.

    At the core of things, input lag only affects the mouse, and 500Hz-1000Hz polling rates (which most gamers use) are enough to bypass this, together with other mouse fixes. Whoever uses 125Hz deserves to be shot in the foot, tbh.

    Edit: Did I mention that after changing the flip queue in the registry, my FPS wasn't dropping like crazy? Beforehand it was dropping by 30 fps and back up; now the dips are pretty minute.
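    For scale, the polling rates traded back and forth in this exchange work out simply: the interval between mouse reports is 1000 / polling rate, so a 125Hz mouse can sit on a movement for up to 8ms before the OS even sees it, on top of any frame queuing. A quick illustrative sketch (the function name is mine, not from any driver API):

```python
def polling_interval_ms(polling_hz):
    """Worst-case delay before the OS receives a mouse report, in ms."""
    return 1000.0 / polling_hz

for hz in (125, 500, 1000):
    print(f"{hz:>4} Hz -> up to {polling_interval_ms(hz):.0f} ms per report")
# 125 Hz adds up to 8 ms; 500 Hz up to 2 ms; 1000 Hz up to 1 ms.
```

    That 8ms at 125Hz is in the same ballpark as half a frametime at 60 fps, which is why polling rate matters once you start counting buffered frames.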