Q for your computer setup


Comments

  • Cannon_FodderAUS Brisbane, AU Join Date: 2013-06-23 Member: 185664, Members, Squad Five Blue, Squad Five Silver, Reinforced - Shadow
    Soul_Rider wrote: »
    I see everyone quoting FPS but not Resolution. One is useless without the other.

    I have a 560ti with a 4670k and I get constant 200fps until mid-game where I can drop as low as 150fps.

    The crucial factor here is that I only have a Samsung LCD TV connected over HDMI. The TV natively defaults to 1360x768, which is why I get such a high framerate. Some games will run at the 1920x1200 that HDMI supports, but not NS2. Strangely, it does support that resolution in Future Perfect... :)

    I actually run the 1360 res, as it is the lowest res that keeps my monitor's aspect ratio. It does run faster than 60 on my i5-4690k + GTX 650, but I would rather have a constant fps than dips during the late game, so I max_fps it to 60 (it's constant and smooth). I tried the 1920x1080 res with the lowest settings and didn't get constant, smooth frame rates (dips during battles).
    @MrPink‌ I limit the fps to 60 with the console command max_fps to get a smoother feel to the game. If I don't limit the fps, it will jump from 150+ to as low as 50 when there is a big battle in the hive room with lots of lifeforms, structures, etc. So, to make the game seem to run smoother, I cap it at 60fps; if it then dips from 60 to 50, it isn't so jarring.
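
    A rough sketch of why the cap feels better (illustrative Python; the frame times below are invented numbers, not NS2 measurements):

        # Toy model: the "jarring" feel tracks frame-to-frame swings in frame
        # time, not the average fps. All numbers are made up for illustration.
        capped = [1000 / 60] * 6  # steady 60 fps -> ~16.7 ms every frame
        uncapped = [1000 / f for f in (150, 160, 50, 55, 150, 140)]  # battle dip

        def max_swing(frametimes_ms):
            # Largest jump between consecutive frame times, in milliseconds.
            return max(abs(a - b) for a, b in zip(frametimes_ms, frametimes_ms[1:]))

        print(max_swing(capped))    # 0.0 -> perfectly even pacing
        print(max_swing(uncapped))  # ~13.8 ms -> the dip you feel as stutter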
  • Frozen New York, NY Join Date: 2010-07-02 Member: 72228, Members, Constellation
    G/Free-Sync on a 60hz monitor
  • DC_Darkling Join Date: 2003-07-10 Member: 18068, Members, Constellation, Squad Five Blue, Squad Five Silver
    Gsync doesn't come on 60hz monitors.
    While gsync works fine at lower fps, higher fps still has the advantage when gsync cannot work. Also, gsync only works up to the monitor's max hz (so do not run 200fps on a 144hz monitor; limit it below 144).
  • Frozen New York, NY Join Date: 2010-07-02 Member: 72228, Members, Constellation
    DC_Darkling wrote: »
    Gsync doesn't come on 60hz monitors.
    While gsync works fine at lower fps, higher fps still has the advantage when gsync cannot work. Also, gsync only works up to the monitor's max hz (so do not run 200fps on a 144hz monitor; limit it below 144).

    They don't make 2160p 144hz gsync monitors. They do make them at 60hz, though.
  • Yojimbo England Join Date: 2009-03-19 Member: 66806, Members, NS2 Playtester, NS2 Map Tester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Shadow
  • G_Lock Playtester_ FL Join Date: 2013-04-03 Member: 184624, Members, NS2 Playtester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Shadow
    The monitor's max refresh rate becomes the maximum fps when g-sync mode is enabled in the Nvidia control panel. If your screen is 144Hz, it will never output over 144 frames while g-sync is enabled; the same goes for 60Hz. 144fps g-synced blows 200fps non-g-synced out of the water anyway, so it doesn't matter that it's capped at 144.
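
    A minimal sketch of that clamp (Python; a simplified model, not how the driver actually implements it):

        # With g-sync enabled, the panel's refresh rate is a hard ceiling on
        # displayed frames; below the ceiling, the panel tracks the GPU.
        def displayed_fps(render_fps, panel_hz):
            return min(render_fps, panel_hz)

        print(displayed_fps(200, 144))  # 144 -> never outputs over the refresh rate
        print(displayed_fps(90, 144))   # 90  -> variable refresh follows the GPU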
  • Frozen New York, NY Join Date: 2010-07-02 Member: 72228, Members, Constellation
    I'd say 60 fps g-sync probably looks smoother than 200 fps too. I haven't tested that myself, but the idea is that 60 perfect frames will look smoother than 200 where some tear. After all, rendering more frames than the refresh rate just causes frames to be skipped.
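
    Back-of-the-envelope arithmetic for that (Python; illustrative only):

        # At 200 fps on a 60 Hz panel, several rendered frames compete for
        # every single refresh.
        fps, hz = 200, 60
        print(fps / hz)  # ~3.33 frames generated per refresh
        # Unsynced, one refresh can show slices of 3-4 different frames (tearing);
        # with vsync, the surplus frames are simply discarded (skipped).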

    In the Half-Life engine it actually made you move faster, so developer 1 was ftw
  • dePARA Join Date: 2011-04-29 Member: 96321, Members, Squad Five Blue
    Just wait till John Carmack makes Asynchronous Time Warp ready for all games, independent of Oculus.
    This could be the end of these far too expensive monitors.
    The sad thing is that Nvidia and AMD (freesync) want to earn money, so maybe they will never implement this into their drivers for all games.

    http://www.dsogaming.com/news/john-carmack-is-pushing-hard-for-asynchronous-time-warp-on-the-pc-best-thing-coming-from-mobiles/
  • Nordic Long term camping in Kodiak Join Date: 2012-05-13 Member: 151995, Members, NS2 Playtester, NS2 Map Tester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Shadow
    edited January 2015
    dePARA wrote: »
    Just wait till John Carmack makes Asynchronous Time Warp ready for all games, independent of Oculus.
    This could be the end of these far too expensive monitors.
    The sad thing is that Nvidia and AMD (freesync) want to earn money, so maybe they will never implement this into their drivers for all games.

    http://www.dsogaming.com/news/john-carmack-is-pushing-hard-for-asynchronous-time-warp-on-the-pc-best-thing-coming-from-mobiles/
    AMD doesn't make money with Freesync the same way Nvidia does with Gsync. It has been built into the DisplayPort standard.
  • IronHorse Developer, QA Manager, Technical Support & contributor Join Date: 2010-05-08 Member: 71669, Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    edited January 2015
    @Nordic‌
    False.

    #1 AMD makes money because only their latest GPUs work with Freesync, and even then just specific ones.
    #2 They also make money because they make partnership deals with monitor vendors who include their firmware and scaler, because current desktop monitors (like mine) are not compatible with Freesync, so I have to buy a new one.

    But let's forget about them making money for a second and consider that it's just an inferior implementation in comparison anyway. Blur Busters has already proven the overhead from Gsync isn't even a frame, so AMD's claims about latency are false. ("The average input latency with vsync off was 54ms ((53+55) / 2). The average input latency with gsync on was 55.5ms ((59+52) / 2).")
    And currently, when Freesync goes below a certain FPS it actually makes the screen *flicker* instead of just letting tearing occur, due to the panel technology involved.

    Edit: You edited your absolutist statement; my post now makes less sense. *shakes fist*



  • sotanaht Join Date: 2013-01-12 Member: 179215, Members
    IronHorse wrote: »
    @Nordic‌
    False.

    #1 AMD makes money because only their latest GPUs work with Freesync, and even then just specific ones.
    #2 They also make money because they make partnership deals with monitor vendors who include their firmware and scaler, because current desktop monitors (like mine) are not compatible with Freesync, so I have to buy a new one.

    But let's forget about them making money for a second and consider that it's just an inferior implementation in comparison anyway. Blur Busters has already proven the overhead from Gsync isn't even a frame, so AMD's claims about latency are false. ("The average input latency with vsync off was 54ms ((53+55) / 2). The average input latency with gsync on was 55.5ms ((59+52) / 2).")
    And currently, when Freesync goes below a certain FPS it actually makes the screen *flicker* instead of just letting tearing occur, due to the panel technology involved.

    Edit: You edited your absolutist statement; my post now makes less sense. *shakes fist*



    An average of only 2 data points, where the deviation between the two points is greater than the difference between the two standards? Anecdotal evidence at its finest.
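
    Putting that objection in numbers (Python, using only the two quoted samples per mode):

        # With two samples per mode, the within-mode spread dwarfs the gap
        # between the means, so a 1.5 ms difference settles nothing.
        vsync_off = [53, 55]  # ms, from the quoted Blur Busters figures
        gsync_on = [59, 52]

        def mean(xs): return sum(xs) / len(xs)
        def spread(xs): return max(xs) - min(xs)

        print(mean(vsync_off), spread(vsync_off))  # 54.0 2
        print(mean(gsync_on), spread(gsync_on))    # 55.5 7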
  • IronHorse Developer, QA Manager, Technical Support & contributor Join Date: 2010-05-08 Member: 71669, Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    I agree that test was not thorough enough.
    I happily await the Gsync vs Freesync tests coming over the next few months.
  • Soul_Rider Mod Bean Join Date: 2004-06-19 Member: 29388, Members, Constellation, Squad Five Blue
    edited January 2015
    IronHorse wrote: »
    @Nordic‌
    False.

    #1 AMD makes money because only their latest GPUs work with Freesync, and even then just specific ones.
    #2 They also make money because they make partnership deals with monitor vendors who include their firmware and scaler, because current desktop monitors (like mine) are not compatible with Freesync, so I have to buy a new one.

    You know FreeSync is an open standard, don't you? Well, the portion that goes into the displays is DisplayPort Adaptive-Sync, something AMD created and made part of the DisplayPort standard. Therefore, AMD receives no 'royalties' when someone uses the standard in their monitor.

    Also, as an open standard, Nvidia could freely add FreeSync support to their products and would not have to pay AMD anything. So you could buy a Samsung monitor and, in future, maybe an Nvidia gfx card, and AMD won't make any real money on it.

    Remember, Nvidia is all about cash generation through proprietary tech, while AMD are more about open standards. It is all an attempt to make money, but AMD are pushing forward game graphics tech in general, whereas Nvidia are pushing it for themselves.

    Everything Nvidia have done, AMD have done, except that in most cases AMD made their version an open standard:
    OpenCL, Adaptive-Sync, etc. The only one they haven't opened up is Mantle, although, apparently, Nvidia GPUs could also be modified to run Mantle without too much effort.

    Nvidia keep their API (I forget what it is called) generally secret, and you only see it baked into games that belong to the 'The Way It's Meant to Be Played' campaign.


    Edit ----

    Just had a look to check my info is correct and found this interview, which is worth a read:

    http://www.sweclockers.com/artikel/18798-amd-om-dynamiska-uppdateringsfrekvenser-med-project-freesync/2#pagehead

    Here is a quote about FreeSync/Adaptive-Sync:
    Is it correct to say that FreeSync is a direct answer to Nvidia G-Sync?

    – The engineering timeline of the Radeon R9 and R7 Series, which feature Project FreeSync-compatible display controllers, establishes that FreeSync predates G-Sync by a healthy margin. Both technologies aim to deliver similar user benefits, but Project FreeSync will accomplish its goals with open industry standards that don’t require any licensing fees or contracts from participating parties. History has more or less proven that this strategy enables technologies to proliferate faster and cost less, and we think that’s the right way for everyone.
  • DC_Darkling Join Date: 2003-07-10 Member: 18068, Members, Constellation, Squad Five Blue, Squad Five Silver
    Actually you HAVE to limit fps BELOW 144 when using gsync.
    This is (again, tested by Blur Busters) because if you hit 144fps on a 144hz gsync monitor, it starts to behave like v-sync, with real input lag.
    Keeping the max fps below the max hz avoids the problem altogether. (I just tend to lock it at 120fps.)
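
    The arithmetic behind that cap (Python; a simplified model of g-sync's operating range):

        # G-sync can stretch a refresh to match a slow frame, but a frame that
        # arrives faster than the panel's minimum refresh interval cannot be
        # shown early, so behaviour degrades to vsync-like queuing.
        panel_hz = 144
        min_refresh_ms = 1000 / panel_hz  # ~6.94 ms

        for cap in (200, 144, 120):
            frame_ms = 1000 / cap
            in_gsync_range = frame_ms > min_refresh_ms
            print(cap, round(frame_ms, 2), in_gsync_range)
        # 200 -> 5.0 False; 144 -> 6.94 False (right at the edge); 120 -> 8.33 True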

    Also, in theory Nvidia's technique is more stable, but let's wait for the freesync monitors to actually be released before we speculate.
  • IronHorse Developer, QA Manager, Technical Support & contributor Join Date: 2010-05-08 Member: 71669, Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    edited January 2015
    @Soul_Rider‌
    Yeah, I commend AMD for always going the route of open standards, definitely.
    And good point regarding royalties from open standards.

    However, they are definitely making money from this due to requiring their specific GPUs.
    Also, OpenCL was not their baby but Apple's; it was then developed with AMD, Intel, IBM, and Nvidia. Nvidia has shipped OpenCL drivers since it was released. (Even their Tesla chips use it today.)
    But let's not pretend those open versions alone ever make AMD worth choosing, because actual game-industry adoption of them always lags in comparison (PhysX, Nvidia's 3D, etc).
    Whether that's due to Nvidia throwing its industry weight around or not, I do not know, and frankly I don't care... I just want to see the better shiny particle effects in my Batman game - so I choose Nvidia.

    Short of their Eyefinity stuff, I've never seen a reason to choose AMD unless it was for someone else who was on a budget. *shrug*
    Again though, I definitely await benchmark comparisons between the tech.


    Edit: I think we've sufficiently derailed this topic... lol
  • unrendered Finland Join Date: 2013-11-07 Member: 189137, Members, Reinforced - Supporter, WC 2013 - Supporter

    sotanaht wrote: »
    Cannon_FodderAUS wrote: »
    Happy holidays to all,
    Just a question on setup. I just upgraded my modest gaming rig from:
    Intel E8400 dual core 3.0GHz
    8G ram, Nvidia GTX650

    to:
    i5-4690k 3.5GHz
    8G ram, Nvidia GTX650

    What res and other options do you recommend to play at a stable 60fps?

    I am currently running it on 1377x768 (or something like that) with everything minimal or off.
    Tried 1920x1080 last night, and it couldn't keep 60fps.
    Thanks.
    PS. On the minimal settings I am used to from my old comp, I get the required 60fps on the new comp (I set maxfps to 60), and it was awesome to bite marines that no longer teleport the way they did when my frame rate tanked at the 15 minute mark. I just want a bit more eye candy.

    You bought the best gaming CPU available... to go with a two-generation-old, budget-tier graphics card. You dun goofed.

    I have a 4670k (overclocked) and a gtx770 and I can get 90-120FPS with every setting I *want* maxed out. A gtx760 would have done you a solid 60, but that x50-level crap just won't cut it for a game like ns2.

    Yeah right, never dips below 90fps with a gtx770... I have a gtx970 and an i5 4690k, and it easily dips to 60fps on max settings at 1080p on 24-slot servers in the end game.
  • Soul_Rider Mod Bean Join Date: 2004-06-19 Member: 29388, Members, Constellation, Squad Five Blue
    sotanaht wrote: »

    I have a 4670k (overclocked) and a gtx770 and I can get 90-120FPS with every setting I *want* maxed out. A gtx760 would have done you a solid 60, but that x50-level crap just won't cut it for a game like ns2.

    unrendered wrote: »
    Yeah right, never dips below 90fps with a gtx770... I have a gtx970 and an i5 4690k, and it easily dips to 60fps on max settings at 1080p on 24-slot servers in the end game.

    Now, read those two next to each other.

    The first guy says he has the settings he wants set to max, and you reply calling him out with an example of you putting ALL settings to max.

    Maybe get a few facts first - like which settings he has set to max - then set your own system to the same and test, before you scoff at his input.

    This is the problem with the modern advertising age: everyone is throwing around numbers, and you know what they say about

    Lies, Damn Lies and Statistics

    ... it's the foundation of modern marketing
  • IronHorse Developer, QA Manager, Technical Support & contributor Join Date: 2010-05-08 Member: 71669, Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    If you want more performance @unrendered‌ then disable Ambient Occlusion. You'll see a ~20 fps increase and lower input delay. Also, only run in Fullscreen DX9.
    Personally, I also disable Atmospherics because it ends up just obscuring things and is also costly - but I keep shadows on because, well, that's half the atmosphere of this game to me.
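
    For context, the same "+20 fps" frees up more frame time the lower your starting fps (quick Python; the 100 -> 120 example is hypothetical, and the ~20 fps figure is the estimate above):

        # Convert an fps gain into the per-frame time it frees up.
        def frametime_ms(fps):
            return 1000 / fps

        print(round(frametime_ms(100) - frametime_ms(120), 2))  # ~1.67 ms saved
        print(round(frametime_ms(60) - frametime_ms(80), 2))    # ~4.17 ms saved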
  • Nordic Long term camping in Kodiak Join Date: 2012-05-13 Member: 151995, Members, NS2 Playtester, NS2 Map Tester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Shadow
    unrendered wrote: »
    sotanaht wrote: »
    Cannon_FodderAUS wrote: »
    Happy holidays to all,
    Just a question on setup. I just upgraded my modest gaming rig from:
    Intel E8400 dual core 3.0GHz
    8G ram, Nvidia GTX650

    to:
    i5-4690k 3.5GHz
    8G ram, Nvidia GTX650

    What res and other options do you recommend to play at a stable 60fps?

    I am currently running it on 1377x768 (or something like that) with everything minimal or off.
    Tried 1920x1080 last night, and it couldn't keep 60fps.
    Thanks.
    PS. On the minimal settings I am used to from my old comp, I get the required 60fps on the new comp (I set maxfps to 60), and it was awesome to bite marines that no longer teleport the way they did when my frame rate tanked at the 15 minute mark. I just want a bit more eye candy.

    You bought the best gaming CPU available... to go with a two-generation-old, budget-tier graphics card. You dun goofed.

    I have a 4670k (overclocked) and a gtx770 and I can get 90-120FPS with every setting I *want* maxed out. A gtx760 would have done you a solid 60, but that x50-level crap just won't cut it for a game like ns2.

    Yeah right, never dips below 90fps with a gtx770... I have a gtx970 and an i5 4690k, and it easily dips to 60fps on max settings at 1080p on 24-slot servers in the end game.

    I play on 8v8 rather than 12v12 servers, so I would get more fps than you just from that. But with my overclocked 3570k and overclocked 970, my lowest drops are into the 100 fps range, and I usually don't drop below 120 fps with everything maxed except Ambient Occlusion and atmospherics, and with minimal infestation. This is somewhat map dependent too: on maps like Summit I get the most FPS, and on maps like Kodiak the least.
  • Cannon_FodderAUS Brisbane, AU Join Date: 2013-06-23 Member: 185664, Members, Squad Five Blue, Squad Five Silver, Reinforced - Shadow
    IronHorse wrote: »
    @Soul_Rider‌


    Edit: I think we've sufficiently derailed this topic... lol

    Touché. But nice to see discussion all the same. It is the new year, and there isn't much else to do (yet).
  • Hivelord Join Date: 2003-06-21 Member: 17567, Members, Reinforced - Shadow
    edited January 2015
    My latest setup is an i7 3770k with a gtx970, at 1920x1080 144hz (no gsync), with high textures and models and every other setting turned off. From what I've seen the game is almost completely CPU bound, and the FPS can still dip down to ~120-130 occasionally.
  • Cr4zyb4st4rd United Kingdom Join Date: 2012-08-09 Member: 155200, Members, NS2 Map Tester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Gold, Reinforced - Diamond, Reinforced - Shadow
    Let's see.

    i7 5960x @ 4.2Ghz
    SLI GTX980
    1080
    everything LOW including all NS2+ features

    100FPS max, avg less than 60, drops to 40.

    I tried 1440p but that was even worse; the game performed better when I only had a 680 and a 4770k. thx for the perf increases cdt....
  • Cannon_FodderAUS Brisbane, AU Join Date: 2013-06-23 Member: 185664, Members, Squad Five Blue, Squad Five Silver, Reinforced - Shadow
    Cr4zyb4st4rd wrote: »
    Let's see.

    i7 5960x @ 4.2Ghz
    SLI GTX980
    1080
    everything LOW including all NS2+ features

    100FPS max, avg less than 60, drops to 40.

    I tried 1440p but that was even worse; the game performed better when I only had a 680 and a 4770k. thx for the perf increases cdt....

    How is this possible? Before my upgrade, I was on an E8400 with a gtx650 and was getting better frame rates with the latest build (with everything turned off, mind) compared to previous builds. And with my upgrade (see OP), it runs even better still (of course).

  • sotanaht Join Date: 2013-01-12 Member: 179215, Members
    Soul_Rider wrote: »
    sotanaht wrote: »

    I have a 4670k (overclocked) and a gtx770 and I can get 90-120FPS with every setting I *want* maxed out. A gtx760 would have done you a solid 60, but that x50-level crap just won't cut it for a game like ns2.

    unrendered wrote: »
    Yeah right, never dips below 90fps with a gtx770... I have a gtx970 and an i5 4690k, and it easily dips to 60fps on max settings at 1080p on 24-slot servers in the end game.

    Now, read those two next to each other.

    The first guy says he has the settings he wants set to max, and you reply calling him out with an example of you putting ALL settings to max.

    Maybe get a few facts first - like which settings he has set to max - then set your own system to the same and test, before you scoff at his input.

    This is the problem with the modern advertising age: everyone is throwing around numbers, and you know what they say about

    Lies, Damn Lies and Statistics

    ... it's the foundation of modern marketing

    I even italicized the 'want' for emphasis. I would have specified which settings I ran, but I couldn't find a way to say that without it sounding like rambling and distracting from my key point.

    Anyway, I run with ambient occlusion, bloom, and atmospherics OFF, and with minimal infestation - all because that's the way I prefer the game to look rather than for performance; I'm fairly confident I could get adequate performance in all situations even with those on. The OP wasn't interested in maxing out everything so much as simply getting a stable 60FPS, so my advice (an x60-series or equivalent card) should be correct.

  • RadimaX Join Date: 2013-02-05 Member: 182840, Members
    If you want a smooth experience, try a quad-core i7 + quad-SLI Titans :)
  • wooza Switzerland Join Date: 2013-11-21 Member: 189496, Members, Squad Five Blue
    Cr4zyb4st4rd wrote: »
    Let's see.

    i7 5960x @ 4.2Ghz
    SLI GTX980
    1080
    everything LOW including all NS2+ features

    100FPS max, avg less than 60, drops to 40.

    I tried 1440p but that was even worse; the game performed better when I only had a 680 and a 4770k. thx for the perf increases cdt....

    Try without SLI. I get better results when I play with one AMD 7970 instead of two.

  • Frozen New York, NY Join Date: 2010-07-02 Member: 72228, Members, Constellation
    edited January 2015
    Cr4zyb4st4rd wrote: »
    Let's see.

    i7 5960x @ 4.2Ghz
    SLI GTX980
    1080
    everything LOW including all NS2+ features

    100FPS max, avg less than 60, drops to 40.

    I tried 1440p but that was even worse; the game performed better when I only had a 680 and a 4770k. thx for the perf increases cdt....

    Jumping on the DDR4 train this early is just that: early. It's cool and all, but not as practical.

    I basically grabbed the end of the DDR3 train with an i7 4790k and a 980. I get over 100fps in late-game fights, and the last update only helped. You should just get a G-sync monitor to hold you over until things are optimized for your setup :D
  • Cr4zyb4st4rd United Kingdom Join Date: 2012-08-09 Member: 155200, Members, NS2 Map Tester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Gold, Reinforced - Diamond, Reinforced - Shadow
    I have Gsync at 1440p, and DDR4 should in no way be causing my fps to be over 60fps lower than before.

    Gsync doesn't help when the FPS is as jumpy and stuttery as NS2's. I've given up playing it now anyway; it's too broken to try.
  • Cannon_FodderAUS Brisbane, AU Join Date: 2013-06-23 Member: 185664, Members, Squad Five Blue, Squad Five Silver, Reinforced - Shadow
    @Cr4zyb4st4rd‌ do you mean NS2 or Gsync? With your setup, I would do a max_fps 60 or something so the fluctuation of fps isn't so jarring. It works very smoothly for me at 60fps. I shoot so much better with the higher fps (even when I am on 400 ping on the EU servers).
  • Nordic Long term camping in Kodiak Join Date: 2012-05-13 Member: 151995, Members, NS2 Playtester, NS2 Map Tester, Reinforced - Supporter, Reinforced - Silver, Reinforced - Shadow
    Cr4zyb4st4rd wrote: »
    I have Gsync at 1440p, and DDR4 should in no way be causing my fps to be over 60fps lower than before.

    Gsync doesn't help when the FPS is as jumpy and stuttery as NS2's. I've given up playing it now anyway; it's too broken to try.
    Your hardware is better than mine. You might want to start a tech support thread and maybe it can be resolved.