Jagged Mouse Movement

intrik - Join Date: 2003-10-05, Member: 21451
edited December 2013 in Technical Support
Hello Everyone,

I'm looking for your help with an issue that appears to have popped up quite recently. I'm getting "jagged" mouse movement. It's not input delay or lag; it is literally jagged, much like the experience you get with screen tearing.

On that note.. it might even be screen tearing.

Let me try and explain my setup and my experience.

I have a 144 Hz, 1 ms monitor hooked up to a GTX 670; the processor is an i7 3770K, watercooled and overclocked to 4.6 GHz. It has run at 4.6 GHz at 1.4 V for the entire time I've had it, and it stays around 60 degrees Celsius throughout NS2 gameplay.

I run 1280x720, and in most NSL maps I get 200 FPS running High Particle quality, High Texture quality, Anti-Aliasing ON, everything else OFF. Raw Input ON, Mouse Acceleration OFF.

My mouse is a SteelSeries Sensei Rugged Raw. I run it at 1000 Hz and 1360 DPI. In-game sensitivity is anywhere from 1.8 to 2 - I've tried entering it manually and using the "slider" in the options, which generally puts it at 1.79 or 1.88.

Recently I've noticed TEARING when running at 200 FPS, which gives the mouse movement a "jagged" feeling. So I thought I would drop to 144 FPS (using "max fps") - STILL tearing. Tried 143 FPS - STILL tearing.

So I started testing a plethora of different settings in the NVIDIA Control Panel: max pre-rendered frames, and every other option in there on/off. PhysX settings on/off, in game and in the NVIDIA panel; texture streaming on/off in game. I've tried almost every setting I can think of.

I've toggled Raw Input OFF/ON/OFF/ON and Mouse Accel OFF/ON/OFF/ON in game. I've set Disable Desktop Composition and Disable Themes on ns2.exe (BTW, I've always done this since a previous build where it was necessary). I've also tried turning those options off, then off/on/off/on again.

I've tried setting ns2.exe to high priority, realtime, low priority, etc.

I've tried turning Mumble off completely.

I've tried closing every running application entirely.

I've tried upgrading the NVIDIA driver.

What BAFFLES me is that I get tearing at 144 FPS on a 144 Hz, 1 ms monitor no matter what I change or do.

When I bring up r_stats, I'm at 0 ms waiting on the render thread and 0 ms waiting on the GPU. But what I HAVE noticed (though I'm not certain it was always this way) is that r_stats shows 200FPS (5ms); if i lock it to 144fps it shows "200FPS (6ms)".

Now here's the complete kicker: if I turn on VSYNC with double or triple buffering, it's PERFECT (like it used to be).

If I lock it to 143 FPS with VSYNC on, I get "143FPS (6ms)" in r_stats - NO TEARING, SMOOTH MOUSE.

And on top of that: ZERO input delay... that I can notice, anyway?

Is it possible that 143 FPS + VSYNC + a 1 ms monitor + a 1000 Hz mouse results in no input delay even with VSYNC? Or is it that the benefits of smooth aiming and no tearing outweigh the impact of any - if at all minimal - input delay?

Can anybody give me ANY possible solutions? I would love to just run 200 FPS with VSYNC OFF, without tearing or jagged mouse movement, like I used to... about two weeks ago...

Thanks

Comments

  • IronHorse (Developer, QA Manager, Technical Support) - Join Date: 2010-05-08, Member: 71669
    "if i lock it to 144fps it shows "200FPS (6ms)"
    Then it's not locking it, is it? ;-)

    You said you are using "max fps"... but there's no space in that command, so be sure to type maxfps 144 in the console.
    That's why vsync was properly displaying 143 fps: it was successfully locking the frame rate.

    Honestly, I'd use both that command and vsync with triple buffering, then compare the mouse input delay between vsync on and off (don't use double buffering). If it's liveable, use it.

    Also, I don't know how you're not GPU bound given that CPU overclock...
  • Ghosthree3 - Join Date: 2010-02-13, Member: 70557
    The maxfps command is unbelievably buggy. I wouldn't use it at all; just use vsync, which does the FPS lock you want but without the bugs. Unfortunately it does come with input lag... not much you can do about that (until G-Sync, woo!).

    Also, regarding the "Jagged Mouse Movement", the first thing that popped into my head is this:
    [image: zZ5bqaT.png]
    This setting causes the mouse to travel perfectly horizontally or vertically if you come close to either, so even though you're travelling at a 10-degree angle, it goes horizontal anyway.

    This may not be what you mean, though; I'm really not sure, as you haven't described it very well :p
  • intrik - Join Date: 2003-10-05, Member: 21451
    IronHorse wrote: »
    "if i lock it to 144fps it shows "200FPS (6ms)"
    Then it's not locking it, is it? ;-)

    You said you are using "max fps"... but there's no space in that command, so be sure to type maxfps 144 in the console.
    That's why vsync was properly displaying 143 fps: it was successfully locking the frame rate.

    Honestly, I'd use both that command and vsync with triple buffering, then compare the mouse input delay between vsync on and off (don't use double buffering). If it's liveable, use it.

    Also, I don't know how you're not GPU bound given that CPU overclock...

    Oh wow.. I made a big typo.

    Let me restate that:

    200 FPS, vsync off, shows in r_stats: "200FPS (5ms)"

    144 FPS, vsync off, shows in r_stats: "144FPS (6ms)"

    144 FPS, vsync on, shows in r_stats: "144FPS (6ms)"

    100 FPS, vsync on, shows in r_stats: "100FPS (10ms)"

    Does anybody know the correlation between FPS and the ms component? I don't really understand (technically) what that even means.

    What do you mean by "I don't know how you're not GPU bound given that CPU overclock" - what are you saying?

    Also, how would you guys measure input delay? I've tested it a dozen different ways and I can't really notice any difference... mind you, I should probably try it in an actual game.

    Also, Ghost... yeah, the "jagged" feeling might be exactly what you described: when I draw a circle with my mouse, it feels more like a square, if that makes sense.

  • intrik - Join Date: 2003-10-05, Member: 21451
    Hang on a minute... are you certain that VSYNC will cause input delay when my monitor is 144 Hz? I've noticed that if I change my FPS to 100 I get 10 ms, and if I lower it to 60 it's probably more like 15 ms of delay. I assume most people using VSYNC have (historically) had 60 Hz monitors, and that's probably where the ~15 ms input delay figure comes from.

    If my delay at 144 FPS without vsync is 6 ms and my delay at 144 FPS with vsync is 6 ms... I see absolutely no way that's any different.

    I ASSUME the same relationship holds for mouse polling rates: 1000 Hz is 1 ms of delay, 500 Hz is 2 ms, 250 Hz is 4 ms, 125 Hz is 8 ms? So one could argue that 144 Hz + vsync is no different?
  • intrik - Join Date: 2003-10-05, Member: 21451
    edited December 2013
    I'm getting smarter by the minute guys!! :D haha

    Okay here goes.
    Hertz is a measurement of frequency: it describes how many times something happens (known as a cycle) every second. Often used to describe the frequency of sound or electricity, hertz can describe any repetitive action. One millisecond is a standard measurement of time and represents 1/1000 of one second. Hertz (Hz) describes a rate and milliseconds (ms) describe a period, and converting Hz to ms can be accomplished with a basic formula.


    1. Use the formula 1/Hz × 1000, where Hz represents cycles per second. For this example, convert 500 Hz to milliseconds.
    2. Divide one by the Hz value: 1 ÷ 500 = 0.002. This value represents seconds.
    3. Multiply the seconds value by 1000 to derive the milliseconds value: 0.002 × 1000 = 2 milliseconds.


    Read more: http://www.ehow.com/how_6608111_convert-hertz-milliseconds.html


    60 Hz = 16.66 ms
    100 Hz = 10 ms
    120 Hz = 8.33 ms
    144 Hz = 6.94 ms
    200 Hz = 5 ms
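
    To put that conversion in code - a minimal Python sketch, just reproducing the table above:

        def hz_to_ms(hz):
            # Period in milliseconds of one cycle at the given frequency.
            return 1000.0 / hz

        for rate in (60, 100, 120, 144, 200):
            print(f"{rate} Hz = {hz_to_ms(rate):.2f} ms")
        # 60 Hz = 16.67 ms, 100 Hz = 10.00 ms, 120 Hz = 8.33 ms,
        # 144 Hz = 6.94 ms, 200 Hz = 5.00 ms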

    Assuming VSYNC off and running at the game's maximum rate of 200 FPS, you would have an input delay of 5 ms.

    Add your mouse's input delay to that: 200 FPS + 1000 Hz = 6 ms of input delay.

    If I have VSYNC on, locking to 144 FPS at 144 Hz, then my input delay is 6.94 ms + 1 ms for the mouse = 7.94 ms.

    Basically, on my setup I'm looking at the difference between:

    VSYNC ON = 7.94 ms
    - No frame tearing
    - An extra 1.94 ms of input delay

    VSYNC OFF = 6 ms
    - Frame tearing
    - 1.94 ms less input delay


    If we look at the "legacy" argument about VSYNC and input delay, let's use these numbers:

    60 Hz monitor + 250 Hz mouse = just under 21 ms of input delay. I can see how that would feel "laggy" with that older technology. With 144+ Hz monitors and 1000 Hz mice, I see absolutely no reason not to use vsync.
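
    The same back-of-the-envelope model in Python - a naive sketch that counts only one display period plus one mouse polling period and ignores everything else in the chain:

        def naive_delay_ms(display_hz, mouse_hz):
            # One display refresh period plus one mouse polling period.
            return 1000.0 / display_hz + 1000.0 / mouse_hz

        print(naive_delay_ms(200, 1000))  # vsync off at a 200 FPS cap: 6.0 ms
        print(naive_delay_ms(144, 1000))  # vsync on at 144 Hz: ~7.94 ms
        print(naive_delay_ms(60, 250))    # legacy 60 Hz + 250 Hz mouse: ~20.67 ms
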
  • Ghosthree3Ghosthree3 Join Date: 2010-02-13 Member: 70557Members, Reinforced - Supporter
    edited December 2013
    There are other things that cause input lag besides mouse polling rate and FPS. There's a lot of stuff in between; it doesn't add much, but it is there.

    As far as vsync and input lag go, it really doesn't matter how many FPS you are getting or what your monitor refreshes at (correct me if I'm wrong, someone more intelligent - paging @DC_Darkling). What causes the input lag with vsync is the way it buffers 1-2 extra frames that get held in place waiting for the perfect moment to be displayed.

    More on vsync here - a very good writeup, actually. Based on it, I don't think it would matter if you had FPS locked at 30/60/75/120/144; it's the same amount of buffering. It doesn't just buffer frames once per cycle (e.g. every 144th frame); it does it for EVERY frame. (I think - Darkling, help!)
    hardforum.com/showthread.php?t=928593

    EDIT: Actually, if anything this makes me think you would have MORE input lag with a higher refresh rate; not sure.

    EDIT2: It probably doesn't give more, actually, as you're still doing them all within a second. I still think it wouldn't give less, though.
  • IronHorse (Developer, QA Manager, Technical Support) - Join Date: 2010-05-08, Member: 71669
    edited December 2013
    The ms figure is the frame time listed in milliseconds... but the values are averaged, since the FPS is averaged.

    Regarding your PM and question on vsync:

    The bad news:

    Yes, almost all forms of frame syncing or frame limiting are going to add input delay, but really you should consider that you cannot get rid of input delay, only minimize it. That being said, AnandTech has this to say about triple buffering:

    "with triple buffering we get the same high actual performance and similar decreased input lag of a vsync disabled setup while achieving the visual quality and smoothness of leaving vsync enabled."
    Source.

    Which isn't the entire story...
    Triple buffering will lessen input delay significantly - but in the end, it's actually up to the engine and its implementation.
    There are different implementations: triple buffering implies three buffers, but this doesn't always mean the typical one extra frame (16 ms @ 60 FPS) that comes from syncing techniques.
    I think the Wikipedia article on triple buffering puts it in far more accurate words than I could:
    In computer graphics, triple buffering is similar to double buffering but provides a speed improvement. In double buffering the program must wait until the finished drawing is copied or swapped before starting the next drawing. This waiting period could be several milliseconds during which neither buffer can be touched.
    In triple buffering the program has two back buffers and can immediately start drawing in the one that is not involved in such copying. The third buffer, the front buffer, is read by the graphics card to display the image on the monitor. Once the monitor has been drawn, the front buffer is flipped with (or copied from) the back buffer holding the last complete screen. Since one of the back buffers is always complete, the graphics card never has to wait for the software to complete. Consequently, the software and the graphics card are completely independent, and can run at their own pace. Finally, the displayed image was started without waiting for synchronization and thus with minimum lag.[1]
    Due to the software algorithm not having to poll the graphics hardware for monitor refresh events, the algorithm is free to run as fast as possible. This can mean that several drawings that are never displayed are written to the back buffers. This is not the only method of triple buffering available, but is the most prevalent on the PC architecture where the speed of the target machine is highly variable.
    Another method of triple buffering involves synchronizing with the monitor frame rate. Drawing is not done if both back buffers contain finished images that have not been displayed yet. This avoids wasting CPU drawing undisplayed images and also results in a more constant frame rate (smoother movement of moving objects), but with increased latency.[1] This is the case when using triple buffering in DirectX, where a chain of 3 buffers are rendered and always displayed.
    Triple buffering implies three buffers, but the method can be extended to as many buffers as is practical for the application. Usually, there is no advantage to using more than three buffers.
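
    To make the first scheme described above concrete, here's a minimal toy sketch in Python - purely illustrative, not the Spark engine's actual swap chain (real ones live in the graphics driver):

        import itertools

        class TripleBuffer:
            """Toy model of render-as-fast-as-possible triple buffering."""

            def __init__(self):
                self.front = None      # frame the display is scanning out
                self.completed = None  # newest fully rendered back buffer
                self._ids = itertools.count()

            def render(self):
                # The renderer never waits: with two back buffers, one is
                # always free to draw into, so it just publishes a new frame.
                self.completed = next(self._ids)

            def vblank(self):
                # On each monitor refresh, flip to the newest completed
                # frame; older undisplayed frames are simply discarded.
                if self.completed is not None:
                    self.front = self.completed

        tb = TripleBuffer()
        for _ in range(3):
            tb.render()      # several frames rendered between refreshes...
        tb.vblank()
        print(tb.front)      # ...but only the newest (frame 2) is displayed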

    The good news:

    Triple buffering increases the delay by 1/(2*FPS) on average, due to the way it handles new and aging frames.
    So in the worst case, on a system with a 60 Hz LCD and hardware capable of an average 60 FPS, triple buffering adds on average 8.3 ms instead of the typical 16 ms that would come from one frame of lag @ 60 FPS.
    So at 144 FPS you are looking at about 3.4 ms of frame delay... pretty safe and negligible once you measure your own reaction speed in milliseconds.
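
    Plugging the numbers into that formula - my arithmetic, same 1/(2*FPS) model as above:

        # Average added delay of this triple-buffering model: 1 / (2 * FPS).
        for fps in (60, 144):
            print(f"{fps} FPS -> {1000 / (2 * fps):.2f} ms added")
        # 60 FPS -> 8.33 ms added
        # 144 FPS -> 3.47 ms added
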
    The bottom line, though, is that real-world tearing behaves differently from the theoretical model, and it varies between Intel/AMD/NVIDIA.

    SDL input is being used (TF2's input system) with raw input enabled by default, IIRC, and with NS2 using its own method of frame limiting, you basically have the best-case scenario for input delay with DX9 fullscreen, even if you are using vsync with triple buffering.
    The downside? More VRAM is used, and 60 FPS (what most people have) may be too limiting a ceiling for competitive gamers... but seeing as you are running 144...

    Worth considering:
    BTW, using DX9 fullscreen, NS2 is able to match Quake 3's input delay timing of 3 frames \:D/
    But... there's no hard evidence, just lots of feedback that NS2 has some perceivable interruptions in mouse input when the renderer drops below your monitor's refresh rate. No bug report has been filed for this, however, as not much measurement has been done. But again... something to keep in mind for a person with a 144 Hz monitor: your frame rate has a high bar to stay above. :)

    This is why I suggest trying to play with and without it. In the end, it really comes down to your perception of a combination of factors (from sensitivity, to "smoothness" in screen rendering, to frame times, to input delays, to FPS), all contributing to the end experience.

    Hope that answers some things for you.

    P.S. I could be wrong... but I recall reading somewhere that this game throws away any mouse input beyond 500 Hz polling, so there's no need for that 1000 Hz polling rate - especially if you are diagnosing issues. That being said, I also use 1000 Hz without issue.
  • Ghosthree3 - Join Date: 2010-02-13, Member: 70557
    I thought the upside of triple buffering was that it doesn't drop you to the next level of FPS (e.g. 59 FPS becoming 30), but that input delay was actually INCREASED because it buffers TWO frames instead of just one.

    I believe polling is actually handled by Windows, not the game itself... so a 1000 Hz polling rate is 1000 Hz in any app.
  • IronHorse (Developer, QA Manager, Technical Support) - Join Date: 2010-05-08, Member: 71669
    Also, don't confuse frame times with input delay... the speed of the renderer and how your input is handled are two separate things. :)
  • IronHorse (Developer, QA Manager, Technical Support) - Join Date: 2010-05-08, Member: 71669
    edited December 2013
    @ghosthree3
    No, the benefit and purpose of triple buffering is that it allows for vsync - smoothing FPS and reducing screen tearing - but without the horrendous window of varying input delay.
    Edit: and no, any app can decide what maximum polling rate to accept.
  • vartija - Join Date: 2007-03-02, Member: 60193
    Ghosthree3 wrote: »
    I thought the upside of triple buffering was that it doesn't drop you to the next level of FPS (e.g. 59 FPS becoming 30), but that input delay was actually INCREASED because it buffers TWO frames instead of just one.

    Please don't ask me to explain this in detail. Triple buffering starts filling the third frame buffer while it waits for v-sync to give the second and main buffers permission to switch places. I don't really see how this would increase input delay (from the triple-buffering part, anyway). The downside is increased memory consumption (one more buffer means +50%).

    I feel like there is a lot of misinformation in your posts, or I just don't get what you really mean.
  • Ghosthree3 - Join Date: 2010-02-13, Member: 70557
    There probably is misinformation. I'm not an expert on the subject, just attempting to relay what I've been told before. Thinking about it, it does make sense that triple buffering would not add any more input delay than double buffering, as you still only render one frame at a time.
  • intrik - Join Date: 2003-10-05, Member: 21451
    http://engine3d.wordpress.com/2012/10/28/doble-buffering-triple-buffering-y-vsync/

    I will test and play with VSYNC on and let you know how I get on... I'm pretty sure the difference on my setup is negligible; if there is one, it's not humanly noticeable.

    In fact, regardless of whether it's noticeable or not, the experience is better, which provides more positive feedback in relation to reaction time: I spend less time focusing on the tearing and "jagged" aspect and more time focused on shooting or biting (or, for those who watched our game on Jambi, on deciding whether to listen to the calls to run or stay as an Onos in Gravity... rather than focusing on frame tearing mid-game... hahaha).

  • Ghosthree3 - Join Date: 2010-02-13, Member: 70557
    Based on the discussion, I'd say triple buffering is worth using at 144 Hz. What did IH say - something like 3.4 ms of input delay?
  • IronHorse (Developer, QA Manager, Technical Support) - Join Date: 2010-05-08, Member: 71669
    Yeah - if you consider Quake 3's 3-frame delay = 50 ms @ 60 FPS ((1/60)*3), then 3.4 ms seems pretty freaking negligible, hehe.
  • DC_Darkling - Join Date: 2003-07-10, Member: 18068
    What they all said. :D
    If I remember correctly, Windows sets the standard USB polling rate, and apps can indeed adjust it on their own. I would definitely not ignore the quality (or lack thereof) of the USB chip and drivers being used.
    Note that the different USB versions (1.0, 1.1, 2.0, etc.) run at different rates by default. And only USB 3.0 ports and upwards are generally coloured (blue).

    Input lag by itself is hard to fully pin down, as it can be a product of many things (for me, it was my old keyboard).