How to enable SLI (driver WHQL 306.97)

Aix | Join Date: 2010-12-02 | Member: 75409 | Members
Hello all,

If you're running SLI and find that you're not getting the performance that you should be, it's likely because NS2 does not yet have a proper SLI profile. As you can see below, leaving everything on auto means the driver will only utilize GPU1 to render:

<img src="http://i.imgur.com/C6AJG.jpg" border="0" class="linked-image" />

If you go into your Nvidia Control Panel and under SLI Performance Mode select "Force Alternate Frame Rendering 2", you should see a nice bump in performance:

<img src="http://i.imgur.com/W5crO.jpg" border="0" class="linked-image" />

<img src="http://i.imgur.com/P76Gp.jpg" border="0" class="linked-image" />

Hope it helps.

Comments

  • Davil | Florida, USA | Join Date: 2012-08-14 | Member: 155602 | Members, Constellation
    edited November 2012
    I agree, this does work. However, for my setup Alternate Frame Rendering 1 is the better option, so try both.
  • Aix | Join Date: 2010-12-02 | Member: 75409 | Members
    QUOTE (Davil @ Nov 2 2012, 09:29 PM):
    > I agree, this does work. However, for my setup Alternate Frame Rendering 1 is the better option, so try both.

    How strange, what driver are you using? For me, AFR1 caused both GPUs to sit at around 50% utilization and the game was choppy. AFR2 brought them both to max and was much smoother.
  • Phade | Join Date: 2012-10-03 | Member: 161376 | Members
    edited November 2012
    My max FPS in explore mode on Summit after joining Marines and not moving:

    Global Setting (Nvidia Recommended): 141 FPS
    Single GPU: 140 FPS
    AFR1: 43 FPS
    AFR2: 145 FPS

    AFR2's gains were negligible in my brief and limited test, but one thing is for sure: AFR1 was terrible.

    Specs:
    - Core i7-3770K @ 4.2GHz
    - 8GB of RAM
    - GeForce GTX 590 w/ Driver 310.33
    - OCZ Summit SSD

    Graphics Settings:
    - 1920x1080
    - Fullscreen
    - Vsync Disabled
    - Medium Textures
    - Infestation Minimal
    - Anti-Aliasing, Bloom, Atmospherics, Anisotropic Filtering, Ambient Occlusion, Shadows & Texture Streaming: All OFF
    - Multicore Rendering On
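Phade's numbers reduce to a quick scaling check; a minimal sketch (the FPS values are the measurements quoted above, the helper function is just illustrative):

```python
# SLI gain/loss relative to a single GPU, from Phade's GTX 590 readings above.
single_gpu_fps = 140
afr1_fps = 43
afr2_fps = 145

def gain_percent(multi_fps: float, single_fps: float) -> float:
    """Percentage FPS change versus running on a single GPU."""
    return (multi_fps / single_fps - 1) * 100

print(f"AFR1: {gain_percent(afr1_fps, single_gpu_fps):+.1f}%")  # -69.3%
print(f"AFR2: {gain_percent(afr2_fps, single_gpu_fps):+.1f}%")  # +3.6%
```

So on this dual-GPU card, AFR2 bought only a few percent while AFR1 cost over two thirds of the framerate.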
  • Davil | Florida, USA | Join Date: 2012-08-14 | Member: 155602 | Members, Constellation
    QUOTE (Aix @ Nov 2 2012, 06:36 PM):
    > How strange, what driver are you using? For me, AFR1 caused both GPUs to sit at around 50% utilization and the game was choppy. AFR2 brought them both to max and was much smoother.

    The newest drivers. It's my understanding the only difference between the two is which GPU renders the odd frames and which renders the even frames. I am using a pair of GTX 580s, so maybe it's just the cards you're using.
  • Davil | Florida, USA | Join Date: 2012-08-14 | Member: 155602 | Members, Constellation
    QUOTE (Phade @ Nov 2 2012, 07:58 PM):
    > AFR2's gains were negligible in my brief and limited test, but one thing is for sure: AFR1 was terrible.

    You won't notice much of a difference with a single card that's doing "SLI". Realistically you're still only using the one channel from the PCIe slot. If you have two cards it's a much bigger difference due to having two channels.
  • Aix | Join Date: 2010-12-02 | Member: 75409 | Members
    QUOTE (Davil @ Nov 3 2012, 12:27 AM):
    > You won't notice much of a difference with a single card that's doing "SLI". Realistically you're still only using the one channel from the PCIe slot. If you have two cards it's a much bigger difference due to having two channels.

    That is not at all how multi-GPU setups function; saying a dual-GPU card is only as good as a single-GPU card because it only uses 1 PCIe slot is patently incorrect. The bandwidth on PCIe slots is more than enough to handle dual-GPU rendering. Dual-GPU cards often have lower reference clocks to keep power draw (and heat generation) down, which leads to lower scores vs their dual-card counterparts, but dual-GPU cards are certainly capable of SLI/CF...otherwise there would be no point in buying them.

    I'll have to try those 310.33 beta drivers and see how they do, although 306.97 were great tonight for me (using 580 3GB SLI). I didn't see any NS2 changes in the documentation though.
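The claim that one slot has headroom for dual-GPU rendering can be sanity-checked with rough numbers; a back-of-the-envelope sketch (the per-frame traffic figure is an assumption for illustration, not a measurement):

```python
# Does dual-GPU rendering saturate a single PCIe 2.0 x16 slot?
PCIE2_X16_GB_S = 8.0        # ~8 GB/s per direction for PCIe 2.0 x16
frame_traffic_mb = 20.0     # assumed command/texture traffic per frame (illustrative)
target_fps = 120

required_gb_s = frame_traffic_mb * target_fps / 1024  # MB/s -> GB/s
print(f"~{required_gb_s:.2f} GB/s needed of {PCIE2_X16_GB_S} GB/s available")
# Even this generous estimate leaves most of the slot's bandwidth unused.
```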
  • Davil | Florida, USA | Join Date: 2012-08-14 | Member: 155602 | Members, Constellation
    QUOTE (Aix @ Nov 2 2012, 10:39 PM):
    > That is not at all how multi-GPU setups function; saying a dual-GPU card is only as good as a single-GPU card because it only uses 1 PCIe slot is patently incorrect. [...]

    That's not what I said. Obviously a 590 beats a single 580, but two 580s will beat a 590 hands down. And no, I didn't say 590s don't do SLI; that option is certainly available in the Nvidia settings. However, you won't notice as much of a performance gain because it's still one card on one slot using one set of bandwidth.
  • ASnogarD | Join Date: 2003-10-24 | Member: 21894 | Members
    Before you fiddle with SLI and GPU-related boosts and speedups, use r_stats in the console to check whether it is your GPU that's slowing you down.

    I have a single GTX 580, and r_stats shows my "waiting for GPU" time as 0 (occasionally flashing to 1ms). This means my system is not waiting for the card to catch up; in my case the 1090T 3.2GHz AMD CPU is the bottleneck.

    I very much doubt SLI configs will boost the Spark engine, as it seems to be a CPU-intensive game rather than a GPU-demanding one.
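ASnogarD's r_stats check boils down to a simple rule of thumb; a sketch (the threshold is a judgment call, not an official engine value):

```python
def likely_bottleneck(waiting_for_gpu_ms: float, threshold_ms: float = 2.0) -> str:
    """Interpret r_stats' "waiting for GPU" time: if the engine barely waits
    on the GPU, adding GPU power (e.g. SLI) won't help - the CPU is the limit."""
    return "GPU-bound" if waiting_for_gpu_ms > threshold_ms else "CPU-bound"

print(likely_bottleneck(0))   # ASnogarD's reading -> CPU-bound
print(likely_bottleneck(38))  # a heavy GPU wait  -> GPU-bound
```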
  • Dictator93 | Join Date: 2008-12-21 | Member: 65833 | Members, Reinforced - Shadow
    edited November 2012
    Thanks for the OP, but I found a better way to get SLI working a long time ago (during the beta).

    <b>The best way to get SLI working is to download Nvidia Inspector, select the profile for the game Dishonored, and swap its executable entry for NS2.exe. I get 99% scaling on my cards that way.</b>
  • Sling_Blade | Join Date: 2002-11-01 | Member: 3412 | Members, Reinforced - Shadow
    edited November 2012
    While they've made big improvements, I think the game will still be CPU-bound for most people. If you have an older SLI setup it might help, but it makes no difference with mine (two 680s). It's better to turn it off to prevent any possible microstutter. I usually get a constant 60 FPS unless it's the end of the game, where it might drop into the 40s. My CPU is an i7 920 @ 3.9GHz.
  • Sling_Blade | Join Date: 2002-11-01 | Member: 3412 | Members, Reinforced - Shadow
    edited November 2012
    Also Davil, Aix is right. PCIe bandwidth is not a limiting factor for today's cards, not by a long shot.

    Here's a 690 review showing performance against a 680:
    http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast/6

    Here's a 690 being tested on PCIe 2.0 and 3.0. No difference:
    http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/53901-nvidia-geforce-gtx-690-review-25.html

    The 690 is slightly slower than two 680s because it does not turbo its frequency as high as the 680s do.

    Now, if you were rendering at very high resolutions (like 3K or 4K), then you might start seeing a difference. So if we start seeing big "retina" desktop displays, this may change.
  • Davil | Florida, USA | Join Date: 2012-08-14 | Member: 155602 | Members, Constellation
    QUOTE (Sling_Blade @ Nov 3 2012, 02:04 PM):
    > Also Davil, Aix is right. PCIe bandwidth is not a limiting factor for today's cards, not by a long shot. [...]

    First, I'm not comparing a 690 to a 680, even though a pair of 680s still beats a single 690.

    Second, those PCIe x16 slots connect to your north bridge, which handles all the signals going to and coming from them. It can't do more than one signal per slot at once. It does do full duplex, so it can send and receive at the same time, but not two of either. This is why two separate cards are faster. It's like the difference between multiple threads and multiple cores.
  • Crazy Goat | Join Date: 2012-11-03 | Member: 166482 | Members
    THANK YOU!!!!

    I've got dual GTX 460 1GB cards and had been getting 25-40 FPS with the global settings. When I changed to AFR1 it went down to 15-20, but with AFR2 it went up to 70-80!

    This is a huge win, thank you!
  • Sling_Blade | Join Date: 2002-11-01 | Member: 3412 | Members, Reinforced - Shadow
    The 590 is clocked at 607/1215 while a 580 is clocked at 772/1544. That's why two 580s are considerably faster. The 690 is at 1059/915 while the 680 is clocked at 1110/1006 (both of those are the boost speeds). That's why the 690 is marginally slower than two 680s.

    Yes, one PCIe slot has less bandwidth than two PCIe slots, but one PCIe slot still has more than enough bandwidth to support two cards. The benchmarks I posted show this: despite PCIe 2.0 having half as much bandwidth as PCIe 3.0, there is almost no difference between them in benchmarks at normal resolutions, even for an Nvidia 690.
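The half-the-bandwidth point comes straight from the per-lane rates; a quick sketch (using the published per-lane figures for each generation):

```python
# Approximate per-direction bandwidth of an x16 slot by PCIe generation.
LANES = 16
pcie2_per_lane_mb_s = 500   # 5 GT/s with 8b/10b encoding
pcie3_per_lane_mb_s = 985   # 8 GT/s with 128b/130b encoding (~984.6 MB/s)

pcie2_x16 = pcie2_per_lane_mb_s * LANES  # 8000 MB/s  (~8 GB/s)
pcie3_x16 = pcie3_per_lane_mb_s * LANES  # 15760 MB/s (~15.75 GB/s)
print(pcie2_x16, pcie3_x16)
```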
  • Aix | Join Date: 2010-12-02 | Member: 75409 | Members
    Took some screenshots while using AFR1, AFR2, and single-GPU settings, with Ambient Occlusion both off and maxed out; all other settings maxed. I've only got my i5 2500K @ 4.2GHz, so after hearing all this CPU-bound talk I'm curious whether I should bump it to 4.5GHz and see if it runs smoother. I didn't really have any problems in the 16/16 game I played last night, but I also had AO off.

    SLI AFR1 (2x 580 GTX 3GB): 2560x1440, AA-ON, AF-ON, AO-OFF, All details-High | <b>15.5 FPS, 38ms waiting for GPU</b>
    <a href="http://imgur.com/Yl0L6" target="_blank"><img src="http://i.imgur.com/Yl0L6l.jpg" border="0" class="linked-image" /></a>

    SLI AFR1 (2x 580 GTX 3GB): 2560x1440, AA-ON, AF-ON, AO-MAX, All details-High | <b>12.6 FPS, 58ms waiting for GPU</b>
    <a href="http://imgur.com/Tka4L" target="_blank"><img src="http://i.imgur.com/Tka4Ll.jpg" border="0" class="linked-image" /></a>

    SLI AFR2 (2x 580 GTX 3GB): 2560x1440, AA-ON, AF-ON, AO-OFF, All details-High | <b>63.6 FPS, 2ms waiting for GPU</b>
    <a href="http://imgur.com/HAmzA" target="_blank"><img src="http://i.imgur.com/HAmzAl.jpg" border="0" class="linked-image" /></a>

    SLI AFR2 (2x 580 GTX 3GB): 2560x1440, AA-ON, AF-ON, AO-MAX, All details-High | <b>45.5 FPS, 7ms waiting for GPU</b>
    <a href="http://imgur.com/ha9s4" target="_blank"><img src="http://i.imgur.com/ha9s4l.jpg" border="0" class="linked-image" /></a>

    Single GPU (GTX 580 3GB): 2560x1440, AA-ON, AF-ON, AO-OFF, All details-High | <b>50.2 FPS, 7ms waiting for GPU</b>
    <a href="http://imgur.com/CRUKy" target="_blank"><img src="http://i.imgur.com/CRUKyl.jpg" border="0" class="linked-image" /></a>

    Single GPU (GTX 580 3GB): 2560x1440, AA-ON, AF-ON, AO-MAX, All details-High | <b>37.3 FPS, 26ms waiting for GPU</b>
    <a href="http://imgur.com/3dJFS" target="_blank"><img src="http://i.imgur.com/3dJFSl.jpg" border="0" class="linked-image" /></a>
  • NateN34 | Join Date: 2012-11-04 | Member: 166643 | Members
    edited November 2012
    This did absolutely nothing for me, on both settings...

    It still dips to 38-45 FPS in certain areas of the map, though others can run at 70+.

    So disappointed... my $25 is wasted until they implement SLI for this game!

    Specs:

    GTX 570 x2 SLI w/ Driver 310.33
    8 Gigs RAM
    2500K @ 4 GHz
    Game on Crucial M4 SSD
  • Aix | Join Date: 2010-12-02 | Member: 75409 | Members
    QUOTE (NateN34 @ Nov 4 2012, 01:13 AM):
    > This did absolutely nothing for me, on both settings... Still dips to 38-45 FPS in certain areas of the map. [...]

    Did you try it with 306.97? I haven't tried this with the 310.33 beta driver.
  • IronHorse | Developer, QA Manager, Technical Support & contributor | Join Date: 2010-05-08 | Member: 71669 | Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    edited November 2012
    QUOTE (Crazy Goat @ Nov 3 2012, 04:52 PM):
    > I've got dual GTX 460 1GB cards and had been getting 25-40 FPS with the global settings. When I changed to AFR1 it went down to 15-20, but with AFR2 it went up to 70-80!

    Careful... SLI already has 1 frame of lag, AFR2 adds 3 more to it, and when FPS drops it gets worse.
    That's the tradeoff of "AFR" and "pre-rendered frames": input delay.
    And if you're like me, raw mouse input and snappy mouse control without mouse acceleration or delay is way more important than a few more frames per second.

    Fun example of AFR:
    http://www.pcgameshardware.com/aid,675353/GPU-benchmark-review-CrossfireX-versus-Quad-SLI-and-3-Way-SLI/Reviews/?page=2

    Some of you have a beast of a rig, so it's really odd seeing 12 FPS. ^.^
    Maybe run in single GPU until Nvidia releases a profile/beta drivers? I really think you can feel the input delay difference between default and AFR2... I can.
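IronHorse's frames-of-lag warning converts directly into milliseconds; a sketch (the queued-frame counts are the ones claimed above, not independent measurements):

```python
def queue_latency_ms(queued_frames: int, fps: float) -> float:
    """Extra input latency from frames queued ahead of what's displayed."""
    return queued_frames / fps * 1000

# Claimed above: SLI queues 1 frame, AFR2 adds 3 more (4 total).
for fps in (120, 60, 30):
    print(f"{fps:3d} FPS: SLI +{queue_latency_ms(1, fps):.1f} ms, "
          f"AFR2 +{queue_latency_ms(4, fps):.1f} ms")
# The penalty grows as FPS drops, which is why low-FPS moments feel worst.
```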
  • Ketamine | Join Date: 2012-11-03 | Member: 166292 | Members
    edited November 2012
    QUOTE (IronHorse @ Nov 4 2012, 02:19 AM):
    > I really think you can feel the input delay difference between default and AFR2... I can.

    I've been using AFR2 since yesterday morning and I haven't noticed any input lag at all.

    AFR2 helped a bunch with FPS since my second card wasn't at 2% anymore, but my FPS still wasn't completely smooth, though my "waiting for GPU" was now always 0. So I overclocked my CPU from 3.3 to 4.4GHz, and now it's 100% playable and smooth during every type of scene (24-man game): minimum 70 FPS, average 80, maximum 110. Medium textures because of the 1GB cards, medium AO because the difference isn't that big, and Nvidia FXAA instead of in-game AA.

    Originally I was dropping to 35-40 FPS during big scenes on the lowest settings possible. That hurts your aim, and your eyes. So I'm pretty happy now.

    2600k 3.4 @ 4.4
    560ti sli
    p8p67 deluxe
    8gb 1600 ddr3
    win7
  • Dictator93 | Join Date: 2008-12-21 | Member: 65833 | Members, Reinforced - Shadow
    edited November 2012
    QUOTE (NateN34 @ Nov 4 2012, 06:13 AM):
    > This did absolutely nothing for me, on both settings... Still dips to 38-45 FPS in certain areas of the map. [...]

    Check out my post; it helps with SLI quite a bit.

    I already posted how to achieve great SLI scaling. You just need the proper profile.

    There's a lot more to SLI rendering than just AFR1 and AFR2. There are other settings which are invisible to the user and can only be unlocked by editing the profiles directly in Nvidia Inspector. The Dishonored profile offers the best performance by far, with no input lag.

    If anyone with SLI wants a tutorial, just ask me.
  • Aix | Join Date: 2010-12-02 | Member: 75409 | Members
    QUOTE (IronHorse @ Nov 4 2012, 03:19 AM):
    > Careful... SLI already has 1 frame of lag, AFR2 adds 3 more to it, and when FPS drops it gets worse. [...]

    That is a quad-SLI review from 2009 of cards that are four generations old; it's not really applicable here unless we have a bunch of people rocking 4x GTX 285s. Drivers are different, technology is different, games are different, and quad-SLI was always a mess compared to even tri-SLI, much less regular SLI.

    As for input lag, I had no trouble hitting skulks last night, and considering the interp on some of the weapons (the grenade launcher fires like the default TF2 launcher: 100ms delay), I'm not sure a "1 frame delay" is going to make or break anything, if there is one - especially not when the alternative is a garbage framerate.
  • Aix | Join Date: 2010-12-02 | Member: 75409 | Members
    QUOTE (Dictator93 @ Nov 4 2012, 07:53 AM):
    > I already posted how to achieve great SLI scaling. You just need the proper profile. [...]

    I downloaded Nvidia Inspector last night but didn't see anything about profiles... where do I find that? It basically looks like GPU-Z combined with an overclocking tool. Or did you mean update to the latest Nvidia beta driver and then use the Dishonored SLI profile in the Nvidia Control Panel?
  • Dictator93 | Join Date: 2008-12-21 | Member: 65833 | Members, Reinforced - Shadow
    QUOTE (Aix @ Nov 4 2012, 06:23 PM):
    > I downloaded Nvidia Inspector last night but didn't see anything about profiles... where do I find that? [...]

    Send me a PM, I will walk you through it.
  • IronHorse | Developer, QA Manager, Technical Support & contributor | Join Date: 2010-05-08 | Member: 71669 | Members, Super Administrators, Forum Admins, Forum Moderators, NS2 Developer, NS2 Playtester, Squad Five Blue, Subnautica Playtester, Subnautica PT Lead, Pistachionauts
    edited November 2012
    QUOTE (Aix @ Nov 4 2012, 09:20 AM):
    > That is a quad-SLI review from 2009 of cards that are four generations old; it's not really applicable here unless we have a bunch of people rocking 4x GTX 285s. [...]

    I ran into this issue with a 570 SLI setup on NS2 and BF3 myself, recently.
    That link was provided as an exaggerated example of how the setting works - no, you may not be using four cards, but it highlights the core issues with AFR, which are present even with two cards.

    The reason Nvidia defaults to that sort of pre-rendered-frames profile is, as some have accused, simply to get higher FPS in benchmarks.
    But you are delaying the frames, make no mistake about it.
    If it does not affect you, then lucky you - stick with it. :)

    But for people like me who detest any form of delay, it's no vsync, raw input, and as few pre-rendered frames as I can get away with - and after dealing with SLI microstutter and input delays for years I am now happily a single-GPU user.

    I just wanted to warn others out there about this - it's worth a mention and testing for yourself. I really wish the OP would at least warn of it in his first post... :-/

    <b>edit:</b>

    <b>Source = www.nhancer.com</b>
    <i>"Alternate Frame Rendering (AFR)
    When using Alternate Frame Rendering, every other frame is rendered by one of the two cards alternately. So one card renders a complete image, while the other card is already rendering the following image.

    This mode is very CPU-effective. In theory, <u>it can introduce a slight lag (i.e. mouse movements are not as immediate as normal),</u> but this effect is so small that practically nobody ever notices it."</i>

    <b>Source = SLI wiki</b>
    <i>While AFR may produce higher overall framerates than SFR,<u> it may result in increased input latency due to the next frame starting rendering in advance of the frame before it.</u> This is identical to the issue that was first discovered in the ATI Rage Fury MAXX board in 1999.[1] This makes SFR the preferred SLI method for fast-paced action games.</i>

    I recommend the lower-FPS but smoother split frame rendering (SFR).
  • Ketamine | Join Date: 2012-11-03 | Member: 166292 | Members
    edited November 2012
    Hey, try using split frame rendering (SFR in Nvidia Inspector) instead of AFR2 if you're getting microstuttering; to me it seems like there is none at all now. Also set pre-rendered frames to however many cards you have, according to Nvidia. It also showed vsync being set to "passive" instead of force-off; not sure if that means application default or if it's the adaptive stuff.

    http://isiforums.net/f/attachment.php?attachmentid=2081&d=1335031950
    ^ Only pay attention to the SLI settings; the rest are for a specific driver version, so I wouldn't suggest using them. And obviously set GPU_COUNT to however many cards you have.

    edit: f u IronHorse, you beat me to it :P
  • Zaggy | NullPointerException | The Netherlands | Join Date: 2003-12-10 | Member: 24214 | Forum Moderators, NS2 Playtester, Reinforced - Onos, Subnautica Playtester
    Moving to Tech Support and stickying!
  • Laggy | Join Date: 2012-11-19 | Member: 172521 | Members
    The latest NVIDIA beta drivers have an SLI profile for NS2 now.

    http://www.nvidia.com/object/win8-win7-winvista-64bit-310.54-beta-driver.html

    (Source: http://www.reddit.com/r/ns2/comments/1369ju/nvidia_sli_driver_update_for_ns2/)
  • waflz | Join Date: 2012-09-07 | Member: 158459 | Members
    QUOTE (ASnogarD @ Nov 3 2012, 01:44 AM):
    > I very much doubt SLI configs will boost the Spark engine, as it seems to be a CPU-intensive game rather than a GPU-demanding one. [...]

    You need to be in the 500 series of cards to get positive effects above medium graphics settings.
  • soccerguy243 | Join Date: 2012-12-22 | Member: 175920 | Members, WC 2013 - Supporter
    QUOTE (Laggy @ Nov 19 2012, 11:20 PM):
    > Latest NVIDIA beta drivers have an SLI profile for NS2 now. [...]

    How do I use this profile?