Hi Dylan,
Welcome to our community!
What we do is normalise the screen. The left x-coordinate of our screen is 0; the right x-coordinate depends on the resolution. For the sake of simplicity, let's say we have a resolution of 800 x 600. This means that the x-coordinate at the right edge of the screen is 800.
The mouse exists in the same “world”, the screen world, so to speak. Its leftmost x-coordinate is 0, its rightmost 800.
Now we normalise the screen, meaning we map the maximum value to 1 (100 %) and the minimum value to 0 (0 %): (mousePos.x / Screen.width).
0 / 800 = 0
400 / 800 = 0.5
800 / 800 = 1
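If it helps, here is a minimal sketch of that normalisation in Unity C#. The class name and the Debug.Log are just for illustration, and it assumes the old Input Manager:

```csharp
using UnityEngine;

// Hypothetical example class; the names are only illustrative.
public class MouseNormaliseExample : MonoBehaviour
{
    void Update()
    {
        Vector3 mousePos = Input.mousePosition;         // pixel coordinates: 0..Screen.width on x
        float normalisedX = mousePos.x / Screen.width;  // 0 at the left edge, 1 at the right edge
        Debug.Log(normalisedX);                         // 0.5 when the mouse is at x = 400 on an 800 px wide screen
    }
}
```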
Does that make sense so far?
The 3D world in Unity does not use pixel coordinates but World Units (WU). This means that we have to convert our mouse position to world-space coordinates.
We know that our camera covers 16 tiles of the scene grid. The left edge is at x = 0, and the right edge at x = 16.
100 % of 16 is 16.
50 % of 16 is 8 (the centre of our game screen).
0 % of 16 is 0.
And where do the 0 % to 100 % (0 to 1) come from? From the normalised mouse coordinates.
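Putting the two steps together in Unity C#, again just a sketch: it assumes the camera's left edge sits at world x = 0 and that exactly 16 World Units are visible horizontally.

```csharp
using UnityEngine;

// Hypothetical example class; assumes the camera's left edge is at world x = 0
// and that exactly 16 World Units are visible across the screen.
public class MouseToWorldXExample : MonoBehaviour
{
    const float visibleWorldWidth = 16f;   // 16 tiles / World Units across the screen

    void Update()
    {
        float normalisedX = Input.mousePosition.x / Screen.width;  // step 1: 0..1 (0 % to 100 %)
        float worldX = normalisedX * visibleWorldWidth;            // step 2: 0..16 WU
        Debug.Log(worldX);                                         // 8 when the mouse is at the centre of the screen
    }
}
```

In a real project you could also let Camera.ScreenToWorldPoint do this conversion for you, but the manual version shows exactly where the 0 % to 100 % come from.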
Did this clear it up for you?
See also: