Struggling with this projection problem

Hi,
this isn't tied to any one specific course, so I hope that's OK. I've gone through the math course, which was an excellent refresher for someone like me who hasn't done this stuff in years, but I'm still struggling to apply it to my problem and come up with a solution.

My game is 3D first-person. You aim an object out in front of yourself, a boxcast is sent out a certain distance to query for contacts, and the object should stick/project onto the surface it hits. The object can be wide enough to extend out to both sides of you, so the contact point the boxcast returns can be off to either side of the look direction. I want the object to lie flat on the surface it contacts, but at the point on the surface along the look direction, not at the actual contact point the boxcast returns.

I'm struggling to figure out how far to move the object along the look direction for it to lie on the surface. The information I have is: my look origin, the look vector, the contact point, the contact normal, the vector from my look origin to the contact point, and the angle between the look vector and that contact vector. It seems like I should be able to solve this with some triangle math, but I'm failing to do so.

For reference, this was my closest attempt at a solution. I take the point at the max look distance along the look direction and move it back towards the character along the surface normal. That isn't what I want, though, because it should really move back along the look direction (for example, if you are looking upwards at an angle, the object ends up too high on the contact surface when you use the normal direction), but I can't figure out that distance. I hope the question is clear; here is the code that produces the incorrect normal-direction result described above (Unity code):

// Point at the maximum aim distance along the look direction.
Vector3 maxLookPoint = lookOrigin + lookDirection * maxLookDistance;
// Angle between the contact-point-to-maxLookPoint vector and the surface normal.
float angle = Vector3.Angle((maxLookPoint - hit.point).normalized, hit.normal);
float distance = (maxLookPoint - hit.point).magnitude;
// Distance from maxLookPoint to the contact plane, measured along the normal.
float move = distance * Mathf.Sin((90f - angle) * Mathf.Deg2Rad);
// Projects maxLookPoint onto the plane along the normal, not along the look direction.
Vector3 surfacePosition = maxLookPoint - (hit.normal * move);
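
If I reason through the triangle, I suspect the move distance along the look direction is that same plane distance divided by the cosine of the angle between lookDirection and hit.normal, which would make this a ray-plane intersection with the contact plane. Here is a rough, untested sketch of that guess, using the same variable names as above and treating the surface as the infinite plane through hit.point:

// Untested guess: intersect the look ray with the plane through hit.point.
// t would be the distance along lookDirection from lookOrigin to the plane.
float denom = Vector3.Dot(lookDirection, hit.normal);
if (Mathf.Abs(denom) > 1e-5f) // skip when the look direction is nearly parallel to the surface
{
    float t = Vector3.Dot(hit.point - lookOrigin, hit.normal) / denom;
    Vector3 aimedPosition = lookOrigin + lookDirection * t;
}

No idea if that's right though, which is why I'm asking.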

Appreciate any help/suggestions.
Thanks

@garypettie

Hi @mikem, did you manage to solve this problem?
It sounds like you want Vector3.ProjectOnPlane() but I may be misunderstanding the question.
If you’re still struggling with it then it may be helpful to share a diagram and some pictures so I can understand exactly what you’re trying to achieve.
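
For reference, ProjectOnPlane takes a vector and a plane normal and returns that vector with its component along the normal removed, i.e. the part of it that lies in the plane. A minimal sketch, using the lookDirection and hit names from your snippet:

// ProjectOnPlane removes the component of lookDirection along hit.normal,
// leaving the part of the vector that lies in the contact surface's plane.
Vector3 inPlane = Vector3.ProjectOnPlane(lookDirection, hit.normal);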


