What I'm trying to achieve is to get the angle between an object and the direction the camera is facing. From the image I've provided: Point A is the camera position, Point B is a reference point that is always directly in front of the camera regardless of its rotation, and Point C is the object in question.
Getting the angle in degrees between points B and C is already done, but the problem is that my camera's yaw (in degrees) is measured in world space, not in view (camera) space. This becomes problematic depending on which Cartesian quadrant the camera currently resides in.
How do I measure angles around the camera relative to the camera's facing direction, independent of world space?
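One common way to phrase this is: take the world-space yaw from the camera to the object, subtract the camera's own yaw, and wrap the result back into the (-180, 180] range. A minimal sketch, assuming a convention where yaw 0 points along +z and increases clockwise toward +x (the function name and the tuple-based positions are illustrative, not from the question):

```python
import math

def relative_yaw(cam_pos, cam_yaw_deg, obj_pos):
    """Yaw of obj_pos relative to the camera's facing direction, in degrees.

    Positions are (x, z) tuples; yaw 0 is assumed to point along +z and
    increase clockwise toward +x. Adjust to your engine's convention.
    """
    dx = obj_pos[0] - cam_pos[0]
    dz = obj_pos[1] - cam_pos[1]
    # World-space yaw from the camera to the object.
    yaw_to_obj = math.degrees(math.atan2(dx, dz))
    # Subtract the camera's yaw, then wrap into (-180, 180].
    return (yaw_to_obj - cam_yaw_deg + 180.0) % 360.0 - 180.0
```

The modulo wrap is what removes the quadrant dependence: no matter where in world space the camera sits or which way it faces, the result is a signed angle in camera space (negative to the left, positive to the right under this convention).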
Upvotes: 0
Views: 2342
Reputation: 33865
Your picture doesn't show it well, because your camera is positioned at the origin of the world. But if you were to imagine the camera somewhere else, say in the top-left, it's easy to see that the angle between the lines to B and C from the origin is different from the angle between the lines to B and C from the camera.
By first subtracting the camera position from the vectors to B and C, you get 2 new vectors which represent the lines to B and C from the camera. As if the camera were the origin.
Computing the angle between those will give you the angle in camera space.
In pseudocode:
Vector b // B's position
Vector c // C's position
Vector camPos // camera's position
Vector bFromCam = (b - camPos)
Vector cFromCam = (c - camPos)
float angle = AngleBetween(bFromCam, cFromCam) // Compute the angle
I've left the implementation of computing the angle out, since you know how to do that already.
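For completeness, here is one way to flesh out the pseudocode above in Python, computing the unsigned angle via the dot product (the function names `angle_between` and `angle_at_camera` are my own, not from the answer):

```python
import math

def angle_between(u, v):
    """Unsigned angle in degrees between 2D vectors u and v."""
    dot = u[0] * v[0] + u[1] * v[1]
    nu = math.hypot(u[0], u[1])
    nv = math.hypot(v[0], v[1])
    # Clamp to guard against floating-point values just outside [-1, 1].
    cos_t = max(-1.0, min(1.0, dot / (nu * nv)))
    return math.degrees(math.acos(cos_t))

def angle_at_camera(b, c, cam_pos):
    """Angle B-camera-C: translate B and C so the camera is the origin."""
    b_from_cam = (b[0] - cam_pos[0], b[1] - cam_pos[1])
    c_from_cam = (c[0] - cam_pos[0], c[1] - cam_pos[1])
    return angle_between(b_from_cam, c_from_cam)
```

Note that `acos` of the normalized dot product gives only the magnitude of the angle in [0, 180]; if you also need to know whether C is to the left or right of B, use the sign of the 2D cross product `u[0]*v[1] - u[1]*v[0]` (or `atan2` of cross and dot).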
Upvotes: 1