Mouse Sensitivity
Underwhelmed
<div class="IPBDescription">A SERIOUS QUERY</div>This question isn't really NS related, it has more to do with the HL engine itself. But I thought I'd post here anyways, since our local robots (You know who you are) are pretty smart.
So Half-Life changes your view when you move your mouse around in degrees. However, your mouse measures movements in pixels - my question is how exactly does HL convert pixels to degrees? I know that Windows sensitivity, m_yaw, m_pitch, and sensitivity are all involved, all of them almost certainly direct multipliers of the pixel --> degree translation. What I don't know is what else is involved. I assume the actual equation is something like
[code]
Degrees = Pixels * Sensitivity * (m_yaw or m_pitch) * (Degree/Pixel ratio and whatever)
[/code]
So does anybody have more information? Or do I have to start doing extremely inaccurate experiments of my own to find out?
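For anyone who wants to play with the hypothesized conversion numerically, here is a minimal Python sketch of it. The constant k and the reading of m_yaw as degrees per mouse count are assumptions, not anything taken from the engine.
[code]
# Hypothetical pixel/count -> degree conversion (not actual engine code).
# k stands in for the unknown "Degree/Pixel ratio and whatever" factor;
# m_yaw is read as degrees per mouse count, which is itself an assumption.

def degrees_turned(counts, sensitivity, m_yaw=0.022, k=1.0):
    return counts * sensitivity * m_yaw * k

# Example: 700 counts of mouse movement at in-game sensitivity 4
print(degrees_turned(700, sensitivity=4))  # 61.6 degrees with k = 1.0
[/code]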
Comments
This suggests a proportionality constant of 1.04 degrees per (pixel * sensitivity * m_yaw).
I would be curious to see how this holds with other configurations. It is fairly probable, for one, that the screen resolution would affect this (I run 800x600 ingame, and am pretty sure I've had to readjust my sensitivity every time I changed resolutions).
Other notes:
- Sensitivity is also dependent on field of view, as described in this thread: http://www.unknownworlds.com/forums/index.php?showtopic=97248
<strike>- The relation is different for vertical sensitivity and m_pitch. I have not determined an exact relation, but it may depend additionally upon vertical view angle.</strike> [Edit: Sorry, ignore this; it is not reliable as it came from some very old undocumented research which I have largely forgotten.]
DPI -> pixels registered per inch of mouse movement
Windows Sensitivity -> is a multiplier. If it is not set at default, your movement may be rounded up or down to give a whole number of pixels on the screen. e.g. if you move your mouse over 11 pixels and the multiplier is 0.5, you'd get 5.5 pixels, which would round to 6 pixels. This is a source of error in accuracy. There should be a reference out there for this, but I have had trouble finding it.
Note: If you use the launch command -noforcemspd, the windows sensitivity has no effect on your in-game sensitivity.
- in-game sensitivity seems to be at least roughly linearly related to degrees/inch
- m_yaw and m_pitch both have a linear relationship with degrees/inch
- Edit: After further testing, it appears that the mouse movement does not depend on the vertical angle after all! The rate remains the same.
<b>The test (with a Logitech MX500 mouse):</b>
- 700 pixels/inch with default Windows sensitivity translates to 70 degrees/inch in-game at default m_yaw (0.022) and an in-game sensitivity of 4. The relationship works out to around 1.14 degrees per (pixel * in-game sensitivity * m_yaw).
- Screen resolution does not seem to affect this relationship; it is the same at 640x480, 800x600, and 1024x768. (Perhaps it changes if you don't have -noforcemspd in your launch options for the game.)
degrees per inch = (DPI * Windows Sensitivity) * (In-Game Sensitivity) * (m_yaw or m_pitch)
Again, if you have -noforcemspd, Windows Sensitivity would just be a multiplier of 1. The difference between 1.04 and 1.14 could be due to human error or the mouse used.
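As a sanity check on that relation, here is a small Python sketch. The 1.14 factor is just the value measured above with the MX500, so treat it as an empirical fudge factor rather than something known about the engine.
[code]
def degrees_per_inch(dpi, sensitivity, m_yaw=0.022, windows_mult=1.0, k=1.14):
    # k ~ 1.14 is the empirical constant from the MX500 test above;
    # windows_mult is taken as 1.0 when -noforcemspd is used.
    return dpi * windows_mult * sensitivity * m_yaw * k

def inches_per_360(dpi, sensitivity, **kwargs):
    return 360.0 / degrees_per_inch(dpi, sensitivity, **kwargs)

print(degrees_per_inch(700, 4))  # ~70.2 deg/inch, matching the 70 deg/inch measurement
print(inches_per_360(700, 4))    # ~5.1 inches of mouse travel for a full turn
[/code]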
Are you sure of that? I've seen many conflicting explanations, but this one (http://www.gotfrag.com/cs/story/29319/?spage=3) seems to be the most prevalent. I've never seen that one (unless I'm misunderstanding the meaning of "windows sensitivity").
Of course, this is a very simple matter to test.
[quote=Sarisel @ Jan 3 2007, 05:38 AM]
<b>The test (with a Logitech MX500 mouse):</b>
- 700 pixels/inch with default Windows sensitivity translates to 70 degrees/inch in-game at default m_yaw (0.022) and an in-game sensitivity of 4. The relationship works out to around 1.14 degrees per (pixel * in-game sensitivity * m_yaw).
- Screen resolution does not seem to affect this relationship; it is the same at 640x480, 800x600, and 1024x768. (Perhaps it changes if you don't have -noforcemspd in your launch options for the game.)
degrees per inch = (DPI * Windows Sensitivity) * (In-Game Sensitivity) * (m_yaw or m_pitch)
[/quote]
Perhaps the variation is with monitor refresh rate, as I have always changed that with screen resolution, but that seems even more improbable. Or perhaps it is some other factor. I will do some testing when I get a chance.
I use an MX518 set to 800 DPI and all three of -noforcemaccel, -noforcemparms, -noforcemspd (for none of which I have seen consistent explanations).
Anyways, I found an equation that might be of some interest to people that want pixel perfect accuracy.
[quote]
The complete formula for calculating the highest sensitivity at which you will get "pixel perfect" accuracy in Quake 3 is as follows, where:
Definitions
cg_fov: horizontal field of view, Q3 variable
m_yaw: horizontal mouse sensitivity factor, Q3 variable (if m_pitch is higher then use this instead - see below for explanation)
R: horizontal resolution (e.g. 1024 if the resolution is 1024x768)
sensitivity: mouse sensitivity, Q3 variable
Formula
sensitivity = ( arctan(2/R) * tan(cg_fov/2) ) / m_yaw
[/quote]
Obviously, this is for Q3, but seeing how HL was built on the Q2 engine, the same probably applies here. I'm not entirely sure how exactly this equation was derived, but it seems to produce a reasonably accurate result.
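Plugging some numbers into the quoted formula as a quick Python sketch (taking m_yaw as degrees per count and treating the arctan as returning degrees, since otherwise the result comes out far too small to be a usable sensitivity; both readings are assumptions):
[code]
import math

# Quoted Q3 "pixel perfect" formula, taken at face value:
# sensitivity = ( arctan(2/R) * tan(cg_fov/2) ) / m_yaw
def q3_pixel_perfect_sensitivity(fov_deg, horizontal_res, m_yaw=0.022):
    half_fov = math.radians(fov_deg) / 2.0
    return math.degrees(math.atan(2.0 / horizontal_res)) * math.tan(half_fov) / m_yaw

print(q3_pixel_perfect_sensitivity(90, 1024))   # ~5.09
print(q3_pixel_perfect_sensitivity(90, 1280))   # ~4.07
[/code]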
Suppose my screen is 1280x1024. In NS, the marine FOV is 75, meaning 1280 pixels = 75 degrees. Using this relationship, we can find that 1 pixel = 0.05859375 degrees (Of course, this would vary depending on your resolution). The highest sensitivity where we are able to target each pixel on the screen individually then would be where 1 pixel of movement detected by the mouse equals 1 pixel of movement on the monitor.
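In code, that line of reasoning looks like this; note that it treats every screen pixel as covering the same angle, an assumption that is challenged a couple of posts down:
[code]
# Naive "pixel perfect" estimate, assuming each screen pixel subtends the same angle
# and using the 75-degree FOV / 1280-pixel width figures from the paragraph above.
fov_deg = 75.0
screen_width = 1280
m_yaw = 0.022

deg_per_screen_pixel = fov_deg / screen_width     # 0.05859375 degrees
max_sensitivity = deg_per_screen_pixel / m_yaw    # one mouse count -> one screen pixel

print(deg_per_screen_pixel, max_sensitivity)      # 0.05859375, ~2.66
[/code]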
The Q3 equation I used ended up giving a different result from the experimental formula we've put forth, meaning at least one of them is wrong.
I took my screen width (1024 pixels) and saw how many inches my mouse moved to cover these pixels (from left to right). Then I found how many inches are required to do a 360 degree turn in-game. From there, I converted accordingly to get the ratio from my previous post.
[quote]
Of course, this is a very simple matter to test.
[/quote]
I changed the mouse sensitivity ("pointer speed") in the windows mouse settings and ran NS with my usual command line parameters (which include -noforcemspd); it <i>did</i> change my ingame sensitivity.
But I was not able to reproduce the change in sensitivity with change in resolution.
[quote=Underwhelmed @ Jan 3 2007, 07:39 PM]
Suppose my screen is 1280x1024. In NS, the marine FOV is 75, meaning 1280 pixels = 75 degrees. Using this relationship, we can find that 1 pixel = 0.05859375 degrees (Of course, this would vary depending on your resolution). The highest sensitivity where we are able to target each pixel on the screen individually then would be where 1 pixel of movement detected by the mouse equals 1 pixel of movement on the monitor.
[/quote]
The angle subtended by each pixel is not constant. A pixel at the center of the screen subtends a larger angle than does a pixel near the edge of the screen.
A simple way to see this is to turn at a constant rate and notice how things move faster near the edges of the screen than near the center. But to show this conclusively:
With a constant 100 fps framerate (so each "wait" lasts 1/100 of a second):
[code]
// ds = 10 frames = 0.1 s at 100 fps
alias ds "wait;wait;wait;wait;wait;wait;wait;wait;wait;wait;"
// s = 100 frames = 1 s at 100 fps
alias s "ds;ds;ds;ds;ds;ds;ds;ds;ds;ds;"
// turn left at 22.5 degrees/second for 1 second, i.e. a 22.5-degree turn
alias test "force_centerview;cl_yawspeed 22.5;+left;s;-left"
[/code]
Aim such that a vertical edge intersects the crosshair, and run test. The edge will move <i>under</i> half the distance to the edge of the screen (I measured it as roughly 2/5 the distance). But run test again, and the edge should move exactly to the edge of the screen.
(cl_yawspeed is measured in degrees/second; replace 22.5 with 360 in the above script for an easy demonstration of this.)
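For what it's worth, simple perspective projection predicts the same thing. A quick sketch, assuming a 90-degree horizontal FOV and measuring screen position as a fraction of the distance from the crosshair to the screen edge:
[code]
import math

half_fov = math.radians(90.0 / 2.0)  # assumed 90-degree horizontal FOV

# Fraction of the half-screen that a vertical edge has moved to after turning away from it
def screen_fraction(turn_deg):
    return math.tan(math.radians(turn_deg)) / math.tan(half_fov)

print(screen_fraction(22.5))  # ~0.414, a bit under half, matching the "roughly 2/5" observation
print(screen_fraction(45.0))  # 1.0, exactly at the screen edge after the second run of "test"
[/code]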
Also, the marine's FOV is 90 degrees. The same method can show this.
<b>Edit: cl_yawspeed, not cl_pitchspeed.</b> Don't know how I missed that...
Mm, I was afraid it might be something tricky like that. So how exactly would one go about calculating the maximum sensitivity at which no pixels on the monitor would be skipped over?
[quote]
I changed the mouse sensitivity ("pointer speed") in the windows mouse settings and ran NS with my usual command line parameters (which include -noforcemspd); it <i>did</i> change my ingame sensitivity.
But I was not able to reproduce the change in sensitivity with change in resolution.
[/quote]
My windows mouse sensitivity, which you call the pointer speed, did not change anything for me in-game. I'll do more tests on this later.
Anyways, I'll also test the launch options when I get back to my dorm. I'm under the impression that the HL engine does not use DirectInput though, meaning that all mouse movements are filtered through Windows. The launch settings should only be reaffirming the default settings.
Question: Why doesn't that Quake3 formula include mouse resolution? One would think it should be influenced by that, unless we are doing something wrong with that number. Edit: Or unless I'm doing something wrong with that number and you all know exactly what you are talking about.
According to this post (http://razerblueprints.net/index.php?option=com_smf&Itemid=99&topic=1311.msg35716#msg35716), it is the latter case, with the relation
[quote]
FoVv = 2 * arctan( Rv/Rh * tan(FoVh/2) )
where
FoVv = vertical field of view
FoVh = horizontal field of view (variable cg_fov in Q3)
Rv = vertical resolution (e.g. 768)
Rh = horizontal resolution (e.g. 1024)
[/quote]
That was for Quake 3, of course. But that gives a value of 73.74 degrees for the marine's vertical FOV, which is very close (and well within error range) to a measurement I made in NS a few years ago, 73.65 degrees.
Obviously that doesn't prove anything, but it does seem to indicate that the relation may be true in HL too.
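For reference, the same relation as a couple of lines of Python (the 90-degree horizontal FOV and the 1024x768 resolution are just the figures used above):
[code]
import math

# Vertical FOV from horizontal FOV and resolution, per the quoted relation.
def vertical_fov(horizontal_fov_deg, res_h, res_v):
    half_h = math.radians(horizontal_fov_deg) / 2.0
    return math.degrees(2.0 * math.atan((res_v / res_h) * math.tan(half_h)))

print(vertical_fov(90, 1024, 768))  # ~73.74 degrees, close to the 73.65 measured in NS
[/code]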
[quote=Cxwf @ Jan 8 2007, 09:04 PM]
Question: Why doesn't that Quake3 formula include mouse resolution? One would think it should be influenced by that, unless we are doing something wrong with that number. Edit: Or unless I'm doing something wrong with that number and you all know exactly what you are talking about.
[/quote]
The mouse resolution determines the number of counts reported per unit distance. But that formula is intended only to ensure that no <i>single</i> count will result in more than one pixel of movement (or so I gather). The number of counts is then irrelevant.
We don't all know exactly what we're talking about, however. No one has come forth with a derivation of that formula, so we do not know whether it is even true.
[quote]
My windows mouse sensitivity, which you call the pointer speed, did not change anything for me in-game. I'll do more tests on this later.
[/quote]
Check your command line for NS; one of those three switches people like to include makes the Windows pointer settings have no effect in HL.
And unfortunately, I am still not back on my own computer, thanks to a totally destroyed power supply and hopefully not-broken mobo.
If HL (or Q3, in this case) uses simple perspective projection, the origin of the tan(fov/2) is fairly obvious - it scales the size of the projection space with the FOV.
<img src="http://img172.imageshack.us/img172/3093/tanfovdh3.gif" border="0" alt="IPB Image" />
This projection method requires that it not be possible to set the FOV to 180 degrees or higher, for as the FOV approaches 180 degrees, the size of the projection space approaches infinity. But I tested with default_fov in an NS HLTV and the maximum FOV is 150 degrees, so that projection method is viable.
The arctan(2/R) factor is more elusive. My only guess is that he was thinking this:
<img src="http://img271.imageshack.us/img271/1531/arctan2ryf0.gif" border="0" alt="IPB Image" />
But that is only true if the two lengths which I showed as unity in the image are equal. That is not generally true; in fact, we know the ratio of their lengths from the previous image: the ratio of half the projection space (half the screen width) to the segment of the center-of-screen ray extending to the projection space is tan(fov/2). Thus, if this is correct, the relation is only true when the FOV is 90 degrees.
My attempt at determining the correct relation, if my previous assumptions were correct, would go as follows:
<img src="http://img443.imageshack.us/img443/9908/tanfov2yo6.gif" border="0" alt="IPB Image" />
To determine a sensitivity value, we need to make another assumption: that sensitivity*m_yaw is expressed in degrees per count reported by the mouse. Then the relation is:
sensitivity*m_yaw = arctan(2tan(fov/2)/R)
But a comparison of the results of this formula with the results of the other (sensitivity*m_yaw = arctan(2/R)*tan(fov/2)) would lead me to believe that he actually simply used a "small angle" approximation somewhere, contrary to what I concluded previously. Except at very low screen resolutions (on the order of 10 pixels wide) or fields of view very close to (within ~1 degree of) 180 degrees, the results are essentially identical.
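A quick numerical comparison bears that out. Both expressions below treat sensitivity*m_yaw as degrees per mouse count, which remains an assumption:
[code]
import math

# Quoted form: arctan(2/R) * tan(fov/2)
def quoted(fov_deg, R):
    return math.degrees(math.atan(2.0 / R)) * math.tan(math.radians(fov_deg) / 2.0)

# Re-derived form: arctan(2 * tan(fov/2) / R)
def rederived(fov_deg, R):
    return math.degrees(math.atan(2.0 * math.tan(math.radians(fov_deg) / 2.0) / R))

for fov, R in [(90, 1024), (90, 1280), (150, 1024), (179, 1024)]:
    print(fov, R, round(quoted(fov, R), 4), round(rederived(fov, R), 4))
# The two only drift apart at very low resolutions or FOVs within a degree or so of 180.
[/code]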
Yes, screen res plays a rather large role.
In-game sensitivity feels linear to me, that's for sure. I play at 8.8 in-game though, and my Razer is at 1600 DPI.
The good thing about the Razers is that you can individually adjust V_sens and H_Sense :)
And I never saw any point to the ability to change sensitivity "on the fly" via the mouse when it can be done just as easily through ingame commands. (I have a mouse with that capability, but I remapped the buttons almost immediately.)
Vertical and horizontal sensitivities can also be adjusted via ingame commands (m_pitch and m_yaw).
I like being able to just touch a button and change my sensitivity. Good for games like DOD when you're sniping or running around with an SMG :D