FPS mouse revisited

Learning from mistakes

When I first tried to translate raw mouse input into camera rotation in my FPS game, I ran into a problem: the rotation was either too coarse or too slow.

I then tried mouse acceleration, as I was told other games do that. I thought I had solved the mouse issues, but I was wrong. Mouse acceleration improved things a bit, but I wasn’t satisfied with the results (see Mouse Ballistics below).

I tested other games and found that a given physical distance on the mouse pad always translated into the same amount of on-screen movement, no matter how fast I moved the mouse. Yet the movement was still smooth and sufficiently fast.

Raw mouse input has exactly this property: the raw values are linearly proportional to the physical movement. But when I used the raw values linearly, it didn’t give good results. So what was the difference?

From watching other games, I think the answer is simply that the mouse values are converted linearly into screen pixel movement rather than directly into camera angles.
This makes sense, because on the desktop we move the mouse in pixel units rather than rotating a camera by angles. So translating raw input linearly into rotation angles was not good enough.

From theory to code

How do we translate mouse movement into pixel movement in an FPS camera?

A simple FPS camera usually has an Up direction, can rotate around the Y axis (if Y is the up axis), and can pitch around the X axis. What we will do is translate pixel movement into angular movement and then rotate the camera as usual.

Some code:

float fh = 0.25f;                                      // raw input normalization / sensitivity factor
float dy = y / fh;                                     // raw vertical mouse delta scaled to pixels
float4 p = mProj.Inverted().Mul(float4(0, dy, 0, 1));  // unproject the pixel offset onto the near plane
b += atan2(p.y / p.w, mNearZ);                         // convert the offset into a pitch angle

Let’s review the code.
This is the pitch angle update (rotation around the X axis).

fh is a raw input normalization factor. It depends on the range of the raw mouse values and can be thought of as a sensitivity factor. In my case I found 0.25 to be a good value.

y is the raw input y mouse value.

mProj is the FPS camera’s projection matrix, used to transform 3D coordinates into screen space.

mNearZ is the distance from the camera to the near clipping plane of the camera frustum.

As you can see, I multiply the inverse of the projection matrix with a screen space coordinate, which is the number of pixels to move translated into screen space.
Screen space here is the normalized box which DirectX’s rasterizer uses to draw pixels into the back buffer; in DirectX’s case it is the box (-1, 1)x(-1, 1)x(0, 1). This is why I give the coordinate a Z value of 0: I want it to lie on the near plane.

Multiplying the screen space coordinate by the inverse of the projection matrix translates it into a 3D (homogeneous) vector in view space.
We need to divide the vector by its 4th coordinate, because we set the screen space vector’s 4th coordinate to 1, which assumes the perspective divisor is 1 when it should actually be z.

Now we have, in view space, the distance of our vector from the camera’s look axis.
To calculate the angle we just take the arc tangent of the y coordinate divided by the z coordinate (which is the distance of the near plane from the camera).
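
The yaw update works the same way, just with the horizontal axis. A sketch, reusing the same conventions as above (x is the raw horizontal mouse value and a is the yaw angle used in the camera code further below):

float dx = x / fh;                                      // raw horizontal mouse delta scaled to pixels
float4 q = mProj.Inverted().Mul(float4(dx, 0, 0, 1));   // unproject onto the near plane
a += atan2(q.x / q.w, mNearZ);                          // convert the offset into a yaw angle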

For the sake of completeness, I also include the code that builds the camera’s orientation vectors (right, up and look) from the rotation angles:

 
float4x4 R1 = float4x4::RotateY (a, float3(0, 0, 0));    // yaw
float4x4 R2 = float4x4::RotateX (b, float3(0, 0, 0));    // pitch
float4x4 R3 = float4x4::RotateZ (Roll, float3(0, 0, 0)); // roll
float3 Look(0, 0, 1);

float3 U = R3.MulDir(float3(0, 1, 0));   // up vector, affected only by roll
float3 L = R1.MulDir(R2.MulDir(Look));   // look vector, rotated by pitch then yaw

float3 R = U.Cross(L);                   // right vector
U = L.Cross(R);                          // re-orthogonalize up against look and right

mRight = R;
mUp = U;
mLook = L;
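
If you also need the view matrix itself, it can be assembled from these vectors and the camera position. This is only a sketch: mPos is an assumed camera position member, and the layout below follows the D3D-style row-vector convention, so it may need transposing depending on your math library’s conventions.

float tx = -mPos.Dot(mRight);   // translation terms: move the world opposite to the camera
float ty = -mPos.Dot(mUp);
float tz = -mPos.Dot(mLook);

float4x4 mView(mRight.x, mUp.x, mLook.x, 0.0f,
               mRight.y, mUp.y, mLook.y, 0.0f,
               mRight.z, mUp.z, mLook.z, 0.0f,
               tx,       ty,    tz,      1.0f);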

Conclusion

This seems to give much nicer and smoother results than what I had before. I also know that other games keep a linear correlation between physical movement and screen movement, which means good mouse feel can be achieved with a linear mapping.
I hope I have found a good solution.

Motion Blur of Skinned Meshes, Particles and a Bug Fix.

In the previous post about motion blur I presented my results even though I was not completely happy with them.

For the motion blur I was writing screen space velocity vectors into an offscreen texture. These velocity values are then used by the compute shader post-process to decide how far to push pixels around, so that pixels smear in places with high velocity.

It turns out I forgot to normalize the screen space x and y coordinates by the w component, which stores the depth value used for the perspective divide. This resulted in far too strong a blur for distant objects and a barely visible blur for close ones.
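
In code, the fix is essentially a perspective divide before taking the difference. A minimal sketch of the idea, written here with the math library’s types and illustrative names (mViewProj, mPrevViewProj and the two world positions are stand-ins; the real code lives in the shader that writes the velocity texture):

// Project the current and previous world positions and divide by w so both
// are in normalized device coordinates before computing the screen space delta.
float4 curClip  = mViewProj.Mul(float4(worldPosCur, 1.0f));
float4 prevClip = mPrevViewProj.Mul(float4(worldPosPrev, 1.0f));

float2 curNdc(curClip.x / curClip.w,   curClip.y / curClip.w);
float2 prevNdc(prevClip.x / prevClip.w, prevClip.y / prevClip.w);

float2 velocity = curNdc - prevNdc;   // the value written to the velocity texture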

Another thing I wanted to add is motion blur for skinned meshes. To calculate the velocity of each pixel of each object, I compute each vertex’s position AND what its position would have been in the previous frame, taking both world movement and camera movement into account.
However, for skinned meshes I didn’t include the difference in the object’s deformation from the previous frame, so a skinned mesh had only extrinsic blur and not intrinsic blur.

Similar to calculating the extrinsic velocity, in order to calculate the bone-affected positions of the vertices, I had to calculate their positions in the previous frame. To do that I simply send a copy of the previous frame’s bone offsets as well. This might prove to be a performance issue, since I now send twice the amount of bone offsets, but for now it’s good enough.
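
On the CPU side this just means keeping last frame’s palette around before the bones are updated. A small sketch, with illustrative names (bonePaletteCur, bonePalettePrev and UpdateBonePalette are not the actual framework code):

// Before computing this frame's bone offsets, keep a copy of the old ones.
for (int i = 0; i < numBones; ++i)
    bonePalettePrev[i] = bonePaletteCur[i];

// Assumed routine that recomputes the offsets for the current animation time.
UpdateBonePalette(bonePaletteCur, animationTime);

// Both palettes are then uploaded to the vertex shader, which skins each vertex
// twice: once with the current offsets and once with the previous ones.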

Finally, I also added motion blur for particles. I simply render the velocity vectors of the particles, with blending, into the velocity offscreen texture. It works well enough, but I didn’t put more effort into it than that; I might do something else later.

I think the next stage would be to combine motion blur with Depth Of Field. Hopefully it’s just a matter of pipelining the two filters.

Skinned motion blur.

Motion Blur

For motion blur I wanted a global solution. I didn’t want blur that only covers the entire screen while ignoring objects that move independently of the camera.
I also didn’t want to redraw the scene many times to produce the blur, or to downscale the scene and use that.

My first attempt wasn’t very successful. My intuition told me that all I needed to do was render the velocity of each pixel into an offscreen buffer and use that to blend the resulting image.
There are a few problems with that. First, it’s more complex to calculate the velocity of a skinned mesh. Secondly, it’s more complex to blend pixels between different objects.

If I have a moving object in front of a static background, I need to blend the object over the background, but the background pixels have velocity 0, so the pixels I want to blend with have no velocity to work with.
Then I had the idea to do the inverse operation: instead of looking at a pixel and its velocity to sample other pixels, I would sample a lot of pixels, move them around and add them to a buffer (if they land inside it).
That didn’t go well either and gave very limited results.

Afterwards I thought: what if I blend the previous frames? Blending previous frames is easy because I only need to store the running sum. But wouldn’t that make certain pixels trace back to very old frames? No! The reason is that the color buffer has 8 bits per color channel (red, green and blue), so it has only 256 levels per channel. If I halve the previous sum every frame and add the current frame multiplied by a factor of 0.5, I get a trace of 8 frames at most, because dividing repeatedly by 2 drives the older values down to 0.
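
As a tiny illustration of that bound, the accumulation is just an exponential moving average; a full-intensity 8-bit value survives only about eight halvings before it rounds down to 0:

// accum and current are 8-bit channel values in [0, 255].
// A frame's contribution is scaled by 0.5 each subsequent frame, so a value of
// 255 disappears after roughly 8 frames (255, 127, 63, 31, 15, 7, 3, 1, 0).
unsigned char Accumulate(unsigned char accum, unsigned char current)
{
    return (unsigned char)(accum * 0.5f + current * 0.5f);
}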

However, the results of this method alone were not good enough. The blur is too coarse: you can see the ghosting of previous frames. It didn’t help even when I tried blurring the previous frames.

Finally, I decided to combine the two previous methods. I render the velocity, but I also sum the velocity into an offscreen texture in a similar way to what I did for the color. This way I get both the coarse and the fine resolution.

The fine resolution comes from sampling the color map and pushing pixels, and the coarse resolution comes from the sum of the previous velocity maps that is used for pushing the pixels.
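
For intuition, here is the gather formulation of that idea: each output pixel walks along its accumulated velocity and averages the color samples along the way. My actual version pushes (scatters) pixels in the compute shader rather than gathering, and SampleColor/SampleVelocity below are just stand-ins:

// numSamples should be at least 2.
float3 MotionBlurPixel(int x, int y, int numSamples)
{
    float2 vel = SampleVelocity(x, y);        // accumulated screen space velocity at the pixel
    float3 sum(0, 0, 0);
    for (int i = 0; i < numSamples; ++i)
    {
        float t = (float)i / (float)(numSamples - 1) - 0.5f;   // spread samples over [-0.5, 0.5]
        sum += SampleColor(x + vel.x * t, y + vel.y * t);      // sample the color map along the velocity
    }
    return sum / (float)numSamples;
}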

Skinning and Animations.

Many objects in our daily life are rigid, but some are non-rigid and deformable. Humans, for instance, are deformable: we change our extrinsic shape by moving our limbs.
Skinning is a technique for deforming 3D objects using bones.
When we want to move the hand of a human mesh, it would be tedious to move each and every vertex of the hand. Instead, we move a bone that affects all the vertices close to it.
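
This is usually written as linear blend skinning: every vertex is transformed by a weighted sum of its bones’ matrices. A minimal sketch in the math library’s types (the names are illustrative):

// Blend the vertex position through the matrices of the bones that influence it.
// The weights are expected to sum to 1.
float3 SkinVertex(const float3 &pos, const float4x4 *boneMatrices,
                  const int *boneIndices, const float *weights, int numInfluences)
{
    float3 result(0, 0, 0);
    for (int i = 0; i < numInfluences; ++i)
        result += boneMatrices[boneIndices[i]].MulPos(pos) * weights[i];
    return result;
}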

Implementing skinning took me well over a week, and it wasn’t a very smooth ride. The first issue was that the different libraries and APIs I use each have their own conventions.

I had one convention in the math library (http://clb.demon.fi/MathGeoLib/), another in the 3D file importer (http://assimp.sourceforge.net/) and a third in the DX shaders. They didn’t necessarily coincide.

I admit I didn’t sit down and work this out analytically. Instead I just debugged the results and kept fixing them until things looked right.

The issue was that the meshes were loaded mirrored over the z axis. There was another issue, that the math lib generates row-major matrices instead of column-major ones, but I already knew about that.

It can be overwhelming to figure out how to fix this on a complex mesh that looks completely broken. Instead, you can use a simpler mesh with only two bones to test things out; it makes debugging a lot easier.


 

Every bone in a skinned mesh has its own resting position, a position in which that specific bone causes no deformation. If you move a bone away from its resting position, it deforms the mesh.
Mirroring the z value of each vertex did not solve the issue; I also had to mirror the offset matrix (the translation of the bone from its resting position), and that is what took me quite some time and debugging.

Now my bones and mesh properly matched the DirectX shader convention, but I still needed weights for the vertices. Each vertex in a skinned mesh has up to 4 weights, which describe how strongly each of up to 4 bones affects it. The weights from the modeling program were not very good, and setting the weights for each bone manually in the modeling program is quite tedious. What I did instead is set the weights automatically according to the placement of the bones inside the mesh.

The automatic weight calculation is done by turning the mesh into voxels, i.e. voxelization. This gives us the volume of the mesh, up to a certain accuracy, on which we can make calculations. I used marching cubes to calculate the geodesic distance of each vertex in the mesh from each bone.
This makes sense because we don’t want to weight the vertices by the Euclidean distance (as the crow flies) from the bone, but rather by the shortest distance within the bounds of the mesh.
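
For illustration, one simple scheme along these lines is inverse-distance weighting over the closest few bones; this is a sketch of the idea, not necessarily the exact scheme I ended up with:

#include <vector>
#include <algorithm>

struct BoneWeight { int bone; float weight; };

// Given the geodesic distance from one vertex to every bone, keep the closest
// maxInfluences bones, weight them by inverse distance and normalize to 1.
std::vector<BoneWeight> WeightsFromDistances(const std::vector<float> &geodesicDist,
                                             int maxInfluences = 4)
{
    std::vector<BoneWeight> w;
    for (int b = 0; b < (int)geodesicDist.size(); ++b)
        w.push_back({ b, 1.0f / (geodesicDist[b] + 1e-4f) });   // closer bone -> larger weight

    std::sort(w.begin(), w.end(),
              [](const BoneWeight &x, const BoneWeight &y) { return x.weight > y.weight; });
    if ((int)w.size() > maxInfluences)
        w.resize(maxInfluences);

    float sum = 0.0f;
    for (const BoneWeight &bw : w) sum += bw.weight;
    for (BoneWeight &bw : w) bw.weight /= sum;                  // weights now sum to 1
    return w;
}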

One issue that arose was that the voxelization was really slow: it took ten minutes to voxelize a single mesh. The reason was my naive implementation: for each voxel I checked whether it is inside or outside the mesh, which meant testing every voxel against the entire mesh.

The optimization I did is a variant of an octree: a simple coarse 3D grid. For each cube in the grid I check which triangles of the mesh intersect it. Now, to test whether a voxel is inside the mesh, I only have to test the ray cast from the voxel against the triangles stored in the cubes that the ray passes through.

This improved the times from ten minutes to a few seconds.
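
A rough sketch of what that query looks like; Grid, its helpers and RayIntersectsTriangle are assumed types here, not the actual framework code:

// Coarse grid: each cell stores the indices of the triangles that intersect it.
// To test a voxel, cast a ray along +X and count how many triangles it crosses;
// an odd count means the voxel center is inside the mesh.
bool IsVoxelInside(const float3 &voxelCenter, const Grid &grid)
{
    int crossings = 0;
    int cx, cy, cz;
    grid.CellOf(voxelCenter, cx, cy, cz);            // which coarse cell holds the voxel

    // Only visit the cells the +X ray passes through.
    for (int ix = cx; ix < grid.CellsX(); ++ix)
    {
        for (int triIndex : grid.Cell(ix, cy, cz).triangles)
        {
            // Note: a triangle spanning several cells must only be counted once.
            if (RayIntersectsTriangle(voxelCenter, float3(1, 0, 0), grid.TriangleAt(triIndex)))
                ++crossings;
        }
    }
    return (crossings % 2) == 1;
}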

If you would like a more detailed explanation of skinning and voxelization, please say so in the comments. In the meantime, here are the results.

Mouse Ballistics

I said the mouse felt a bit jerky in my demo, so I bought a higher DPI mouse to replace the ancient one I was using. Well, apparently the issue was not with the mouse itself, but with the way I was handling input. Still, buying a new mouse is also a way to debug, albeit an expensive one.
On Windows I read mouse input using the raw input message (WM_INPUT).

I will include a bit of code in case it’s useful for someone, but the point here is not to show how to read raw input.
First we need to register the input devices:

RAWINPUTDEVICE Rid[3];

Rid[0].usUsagePage = 0x01;
Rid[0].usUsage = 0x05;
Rid[0].dwFlags = 0; // adds game pad
Rid[0].hwndTarget = 0;

Rid[1].usUsagePage = 0x01;
Rid[1].usUsage = 0x04;
Rid[1].dwFlags = 0; // adds joystick
Rid[1].hwndTarget = 0;

// Mouse registration
Rid[2].usUsagePage = 0x01;
Rid[2].usUsage = 0x02;
Rid[2].dwFlags = RIDEV_NOLEGACY; // adds HID mouse and also ignores legacy mouse messages
Rid[2].hwndTarget = 0;

if (RegisterRawInputDevices(Rid, 3, sizeof(Rid[0])) == FALSE) {
//registration failed. Call GetLastError for the cause of the error.
}

Then we need to handle WM_INPUT to get the raw data:

case WM_INPUT:
{
    UINT dwSize;

    // First call queries the size of the raw input data.
    GetRawInputData((HRAWINPUT)lParam, RID_INPUT, NULL, &dwSize,
                    sizeof(RAWINPUTHEADER));
    LPBYTE lpb = new BYTE[dwSize];
    if (lpb == NULL)
        break;

    // Second call fetches the actual data.
    if (GetRawInputData((HRAWINPUT)lParam, RID_INPUT, lpb, &dwSize,
                        sizeof(RAWINPUTHEADER)) != dwSize)
        OutputDebugString(TEXT("GetRawInputData does not return correct size!\n"));

    RAWINPUT* raw = (RAWINPUT*)lpb;

    if (raw->header.dwType == RIM_TYPEMOUSE)
    {
        // Relative mouse movement since the last message.
        Mouse->SetRaw((double)raw->data.mouse.lLastX, (double)raw->data.mouse.lLastY);
        Mouse->SetMove();
    }
    delete[] lpb;
    break;
}

The code reads raw input from the mouse, but the issue is that the raw input is linear.
That means we get movement values that are linearly proportional to the actual mouse movement on the surface.
Most FPS games have mouse ballistics: the movement accelerates as you move the mouse faster, so it is not a 1:1 linear mapping of the physical mouse movement.

I searched for resources about mouse ballistics on Windows but didn’t find a definitive answer. There is something about thresholds you can get with SystemParametersInfo and SPI_GETMOUSE.
However, it seems it might be deprecated and does not return useful values.

Another thing I found is how Windows XP models mouse ballistics. It is an old article, but it gives an idea of how to model mouse ballistics from the raw input on my own.
http://msdn.microsoft.com/en-us/windows/hardware/gg463319.aspx

The way I model mouse movement is with a function f(x), where x is the raw input (the horizontal or vertical movement) and the result is the desired movement.
Notice that this function has no time component, so although we talk about acceleration, we actually compute the movement per sample.

I had to normalize the mouse data to values of roughly [0..1], because I wanted to use the square function to damp the small values, and that only happens for values below 1 (unless you normalize the values afterwards).

The article about mouse ballistics in Windows XP does a lot more to make the mouse response hardware independent, but I didn’t go to those lengths. I simply used the game’s resolution to normalize the mouse values.

After that, all that was left to do was tweak the mouse to my liking. The final function I came up with looks like this:

MoveX = dynamic_cast(a.operator->())->GetRawX();

MoveX = min(MoveX, 1);

double a1 = 0.075;

MoveX=MoveX

Mouse ballistics model, f(x)
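
For reference, a minimal sketch of a response curve of that general shape. The squaring and the a1 constant follow the description above, but this is an illustration, not necessarily the exact function from the demo:

#include <cmath>

// x is the normalized raw mouse delta, roughly in [-1, 1].
// Squaring damps the small movements; a1 scales the overall sensitivity.
double MouseResponse(double x, double a1 = 0.075)
{
    double sign = (x < 0.0) ? -1.0 : 1.0;
    double m = std::fabs(x);
    if (m > 1.0) m = 1.0;          // clamp, like the min() above
    return sign * a1 * m * m;      // f(x) = a1 * x^2
}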

I feel there is a lot more to do in this area, and I need to learn more about how other games model mouse movement. It is easy to neglect this part, because it’s usually easy to “make it work” and call it “good enough”.
I have a feeling I will get back to this subject later.

Beautiful Fail 2 (broken weights)

I didn’t update this week, not because I didn’t work on the Fatal framework, but because the task at hand is a longer one.
I am working on skinning and animations.
I am also getting a new mouse tomorrow, which will hopefully fix the jerkiness when I move the mouse.
In the meantime, enjoy this early fail of the weight maps and skinning.