15 Jun, 2009, Runter wrote in the 1st comment:
Given that point a is at (0, 0) and point b is at (10, 10), how do you find the azimuth?

So I had this problem: I wanted to find the best way to calculate the azimuth from point of reference a to point of reference b on a 2D plane.
(0 degrees being north, 90 degrees being east, 180 degrees being south, etc.)

I had a lot of help on IMC and someone suggested I post my question/results on a forum. So here it is.

This solution is mostly thanks to David Haley. If this helps anyone else, you can give him your thanks. :P


In David's own words:
cos A = adj / hyp, so you want cos^-1(adj/hyp)
adj = abs(y_2 - y_1), hyp = dist(p1, p2)

(Keep in mind that if the difference y_2 - y_1 is negative before we take the absolute value, our degree answer will have 180 added to it at the end.)
That gives the angle in radians, and to get from radians to degrees:

Degrees = Radians * 180 / PI
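
Here's a rough sketch in Ruby of the approach David describes above. Note that the full quadrant handling (deciding east vs. west from the sign of the x difference) is my own guess at how the sign cases should resolve, so treat it as a sketch of the idea rather than the exact code I link to below.

def azimuth(x1, y1, x2, y2)
  dx = x2 - x1
  dy = y2 - y1
  hyp = Math.hypot(dx, dy)
  return 0.0 if hyp.zero?  # points coincide; direction is undefined

  # Angle measured from the north/south axis, always 0..90 degrees.
  base = Math.acos(dy.abs / hyp) * 180 / Math::PI

  if dx >= 0
    dy >= 0 ? base : 180 - base        # NE or SE quadrant
  else
    dy >= 0 ? 360 - base : 180 + base  # NW or SW quadrant
  end
end

puts azimuth(0, 0, 10, 10)  # ~45.0
puts azimuth(0, 0, 10, 0)   # 90.0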


Here's a link to the code I wrote in Ruby to test it.
And here's the output:
Should be 0 degrees: 0.0
90 degrees: 270.0
180 degrees: 180.0
270 degrees: 270.0
—— —— ——
45 degrees: 45.0
225 degrees: 225.0


Again, thanks to David and Craty and everyone else who helped me.
If anyone else has any other solutions they'd like to share please post. :)
21 Jun, 2009, Runter wrote in the 2nd comment:
The previous code wasn't working for all cases, as even the output I quoted showed.

Here is the fixed code.
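
Since the fix is also behind a link, here's a sketch of the idea: using Math.atan2 instead of acos handles all four quadrants without any sign juggling. (This is the general technique, not necessarily line for line what the linked script does.)

def azimuth(x1, y1, x2, y2)
  dx = x2 - x1
  dy = y2 - y1
  # Passing dx before dy makes 0 degrees point north and angles grow
  # clockwise toward east; the modulo wraps negatives into 0...360.
  (Math.atan2(dx, dy) * 180 / Math::PI) % 360
end

puts azimuth(0, 0, 0, 10)   # 0.0   (north)
puts azimuth(0, 0, 10, 0)   # 90.0  (east)
puts azimuth(0, 0, 0, -10)  # 180.0 (south)
puts azimuth(0, 0, -10, 0)  # 270.0 (west)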