Marc

Reputation: 385

Compute yaw of a 2D Vector

I have two 3D points (however the z coordinate is always zero) and want to compute the yaw of the direction vector created from these points. I already found this post, and tried the following code based on it:

double p1_x, p1_y, p2_x, p2_y;
//initialize vars...

double dx = p2_x - p1_x;
double dy = p2_y - p1_y;

double yaw = atan(dx/-dy);

However, I get strange results when I test this approach. Moreover, it does not seem to handle cases where dy is zero. My problem is that I do not entirely understand the underlying math, so I have trouble adapting the code.

My question is: How does this approach need to be adapted to return an appropriate yaw? And why isn't it working in its current state?

Thank you for your help & regards, scr

Upvotes: 1

Views: 1672

Answers (1)

emartel

Reputation: 7773

Usually, people will use atan2 (documented here and here):

double dx = p2_x - p1_x;
double dy = p2_y - p1_y;

double yaw = atan2(dy, dx);

This version of your code should give you what you're looking for.

You can then multiply by 180 / PI if you want a value in degrees.

Upvotes: 8
