Mouse Position Formula

Hey all,

In layman's terms, what is happening with the Debug.Log code in this lecture, from a formula or mathematical point of view? I can get the code working, but I'm struggling with the thought process of how we arrive at the formula that calculates the mouse position relative to the screen.

I want to get a solid grasp of the thought process when faced with a similar challenge in the future, so I can resolve this kind of problem more instinctively.


Hi Dylan,

Welcome to our community! :slight_smile:

What we do is normalise the screen. The left x-coordinate of our screen is 0; the right x-coordinate depends on the resolution. For the sake of simplicity, let's say we have a resolution of 800 x 600. This means that the x-coordinate at the right edge of the screen is 800.

The mouse exists in the same “world”, the screen world, so to say. The left x-coordinate is 0, the right x-coordinate 800.

Now we normalise the screen, meaning we make our maximum value 1 (100 %) and our minimum value 0 (0 %): (mousePos.x / Screen.width).
0 / 800 = 0
400 / 800 = 0.5
800 / 800 = 1
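
In code, that normalisation step might look like this (a minimal sketch using Unity's `Input.mousePosition` and `Screen.width`; the variable name is just for illustration):

```csharp
// Normalise the mouse's pixel x-coordinate to the 0..1 range.
// With an 800-pixel-wide screen: 0 -> 0, 400 -> 0.5, 800 -> 1.
float normalizedX = Input.mousePosition.x / Screen.width;
```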

Does that make sense so far?

The 3D world in Unity does not have any pixel coordinates but World Units (WU). This means that we will have to convert our mouse position to world space coordinates.

We know that our camera covers 16 tiles of the scene grid. The left edge is at x = 0, and the right edge at x = 16.

100 % of 16 is 16.
50 % of 16 is 8 (the centre of our game screen).
0 % of 16 is 0.

And where do the 0 % to 100 % (0 to 1) come from? From the normalised mouse coordinates.
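Putting both steps together, a sketch of the whole conversion could look like this (assuming, as in the lecture, an orthographic camera that spans 16 world units horizontally; class and variable names here are just for illustration):

```csharp
using UnityEngine;

public class MouseWorldPosition : MonoBehaviour
{
    // Assumption for this sketch: the camera covers 16 world units horizontally.
    const float worldUnitsWide = 16f;

    void Update()
    {
        // Step 1: normalise the pixel coordinate to 0..1 (0 % to 100 %).
        float normalizedX = Input.mousePosition.x / Screen.width;

        // Step 2: scale that percentage up to world units.
        // 0 -> 0 WU, 0.5 -> 8 WU (the centre), 1 -> 16 WU.
        float worldX = normalizedX * worldUnitsWide;

        Debug.Log(worldX);
    }
}
```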

Did this clear it up for you?


Thanks Nina for the prompt response! The normalising was tripping me up a bit. This was a great help, and I definitely have a stronger sense of what's happening in this case.

