Get point between two points crossing the 180 degree meridian at 180 degrees - coordinates

I am plotting a line on a MapboxGL (js) world map that follows the path of orbital objects. I do this by adding a new set of decimal longitude/latitude coordinates to a line geometry array as the orbit of the object updates.
There is a known issue with Mapbox (and others) that when drawing a line that crosses the 180° meridian (longitude), we do not get a nice straight line from a to b; instead we get a very long line that wraps around the whole globe:
instead of: we get:
/ /
/ /___
.../..... ......... 180° meridian
/ ___
/ /
/ /
"Accepted" answers here and at Mapbox suggest shifting to a 0°/360° longitude range, but this just moves the problem from the equator to the pole. This is fine for most general proposes, but is still an issue for orbital tracking where we may be crossing the 0°/360° meridian.
My solution is to use MultiLine geometry and break up my coords into new arrays when I cross this meridian. However, this always leaves a wee gap, or, if I insert a "180, lat" point on either side, a "kink" at the meridian:
gap: or kink:
/ /
/ /
........ .....|... 180° meridian
/
/ /
/ /
So I need to figure out what the exact latitude would be if the longitude is on the meridian, knowing the start and end points either side:
+170 | p2 /:
| / :
| / :
180 -|-----/ pX? -- 180° meridian
| /: :
(lng) | / : :
| / : :
-170 |_/___:___:___
p1 x?
(lat)
I need to solve for latitude x so I can generate pX (knowing p1 and p2, if longitude were 180). Once I have pX, I can add it to the end of the last line and to the beginning of the next, thus closing the gap (or smoothing the "kink").
I know this is basic Trig, but my old-man-brain has failed me .. again ....

The simple way to split a line in this way would be to use Turf's lineSplit function. Something like:
const meridian = turf.lineString([[180, -90], [180, 90]]);
const linePieces = turf.lineSplit(myline, meridian);
I haven't tried this, so not sure if Turf itself has any weirdness at the meridian. If it does, you might have to temporarily translate the coordinates elsewhere or something.
Better than doing your own trigonometry in any case, especially since it may introduce errors with the world not being flat.

SOLVED! With basic Trig (while writing the question - so I am posting it anyway, just in case it helps someone else):
We are basically playing with two right triangles: p1 to p2, and the smaller right triangle whose opposite side stops at the meridian, both sharing the same hypotenuse angle. So, we have:
+170 | p2 /|
| / |
| / |
180 -|-----/ pX? -- 180° meridian
| /: |
(lng) | / : A |
| / B: |
-170 |_/___:___|___
p1 x?
(lat)
Where A is our p1 to p2 right angle triangle and B is the triangle from p1 longitude to the meridian, whose adjacent side we need to work out.
Pythagoras (well, basic trig) teaches us that all we need is two pieces of data (other than the right angle) about a right triangle to solve for all the rest.
We already have the opposite and adjacent lengths of A:
+170 | p2 /|
| /α|
| / |
180 -|-- / | -- 180° meridian
| / |
(lng) | / A | oppositeA
| / |
-170 |_/β______|___
p1 adjacentA
(lat)
So from here we need to calculate the hypotenuse of A to get the angle of the hypotenuse at p1 (β) so we can use it later:
// add 360 to a negative longitude to shift lng from -180/+180, to 0/360
p1 = { lng: p1.lng < 0 ? p1.lng + 360 : p1.lng, lat: p1.lat }
p2 = { lng: p2.lng < 0 ? p2.lng + 360 : p2.lng, lat: p2.lat }
let oppositeA = Math.abs(p2.lng - p1.lng) // get A opposite length
let adjacentA = Math.abs(p2.lat - p1.lat) // get A adjacent length
let hypotenuseA = Math.sqrt(Math.pow(oppositeA,2) + Math.pow(adjacentA,2)) // calc A hypotenuse
let angleA = Math.asin(oppositeA / hypotenuseA) // calc hypotenuse angle at p1 (β)
Now we need the new opposite of B (p1.lng to 180) and our calculated angle of A to work out the new hypotenuse of B so we can get the new adjacent of B:
+170 | p2 /
| /
| /
180 -|-- / -- 180° meridian
| /: B
(lng) | /β:
| / : oppositeB
-170 |_/___:___ ___
p1 adjacentB
(lat)
let oppositeB = Math.abs(180 - p1.lng) // get B opposite
let hypotenuseB = oppositeB / Math.sin(angleA) // calc B hypotenuse using A's angle (sin, since angleA is opposite/hypotenuse)
let adjacentB = Math.sqrt(Math.pow(hypotenuseB,2) - Math.pow(oppositeB,2)) // calc B adjacent (hypotenuse² minus opposite²)
Now we add the new adjacent to p1's latitude (or subtract it, if the track heads the other way), and we have x! So:
let pX = { lng: 180, lat: p1.lat + adjacentB * Math.sign(p2.lat - p1.lat) }
End the last line array and start the next with pX, and the gap is perfectly closed!
Highschool math (well, the genius of Pythagoras) to the rescue! I knew it was rattling around in that old-man-brain somewhere .....
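Footnote for anyone landing here: because this treats lng/lat as a flat plane anyway, the same pX falls out of a single linear interpolation, with no trig and no sign juggling. A sketch (the function name crossMeridian is mine), assuming the longitudes have already been shifted into the 0/360 range as above:

```javascript
// Latitude at which the segment p1 -> p2 crosses the 180° meridian.
// Longitudes must already be in the 0..360 range (no wrap between p1 and p2).
function crossMeridian(p1, p2) {
  const t = (180 - p1.lng) / (p2.lng - p1.lng); // fraction of the way from p1 to p2
  return { lng: 180, lat: p1.lat + t * (p2.lat - p1.lat) };
}
```

This is the similar-triangles relationship written directly, and it works whether the track heads north or south.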

Related

Compute coordinates position with projection

Given 2 coordinates (point 1 and 2 in red) in WGS84 I need to find the coordinates of the point perpendicular (point 3) to the line at a given distance.
I managed to do the math to compute this perpendicular point, but when displayed on the map, the point seems to be in the wrong place, probably because of the projection.
What I want on a map:
And what I have instead on the map:
How can I take into account the projection so that the point on the map appears perpendicular to the line? The algorithm below to compute the point comes from here: https://math.stackexchange.com/questions/93424/calculate-rectangle-coordinates-from-line-and-height
public static Coords ComputePerpendicularPoint(Coords first, Coords last, double distance)
{
double slope = -(last.Lon.Value - first.Lon.Value) / (last.Lat.Value - first.Lat.Value);
// number of km per degree = ~111km (111.32 in google maps, but range varies between 110.567km at the equator and 111.699km at the poles)
// 1km in degree = 1 / 111.32km = 0.0089
// 1m in degree = 0.0089 / 1000 = 0.0000089
distance = distance * 0.0000089 / 100; //0.0000089 => represents around 1m in wgs84. /100 because distance is in cm
double t = distance / Math.Sqrt(1 + (slope * slope));
Coords perp_coord = new Coords();
perp_coord.Lon = first.Lon + t;
perp_coord.Lat = first.Lat + (t * slope);
return perp_coord;
}
Thank you in advance!
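No answer was recorded for this one, but the usual way to handle the distortion is to do the normal-vector math in locally metric coordinates, scaling the longitude axis by cos(latitude), and only converting back to degrees at the end. A sketch (in JavaScript rather than the C# above; perpendicularPoint and the 111320 metres-per-degree constant are my own choices):

```javascript
// Point perpendicular to the line first -> last, at `distanceMeters` from `first`.
// Works in a local flat approximation: 1 degree of latitude ~ 111320 m,
// 1 degree of longitude ~ 111320 * cos(latitude) m.
function perpendicularPoint(first, last, distanceMeters) {
  const mPerDegLat = 111320;
  const cosLat = Math.cos(first.lat * Math.PI / 180);
  // direction of the line in locally metric x (east) / y (north) coordinates
  const dx = (last.lon - first.lon) * mPerDegLat * cosLat;
  const dy = (last.lat - first.lat) * mPerDegLat;
  const len = Math.hypot(dx, dy);
  // unit normal: the direction rotated by 90 degrees
  const nx = -dy / len;
  const ny = dx / len;
  // convert the metric offset back to degrees
  return {
    lon: first.lon + (nx * distanceMeters) / (mPerDegLat * cosLat),
    lat: first.lat + (ny * distanceMeters) / mPerDegLat,
  };
}
```

The slope-based approach in the question breaks down away from the equator because a degree of longitude is shorter than a degree of latitude; doing the rotation in metric space sidesteps that.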

Get the exact satellite image for a given Lat/Long bbox rectangle?

For a visualization I need an optical satellite image for a specific rectangular AOI, defined by two lat/long coordinates. I tried the Mapbox Static Images API, which takes a lat/long bounding box and a resolution in width/height pixels for the output. The problem is that, as far as I can tell, if the ratio of the lat/long box is not the same as the w/h in pixels, it pads the lat/long bounding box to fill the w/h of the pixel image.
And this would prevent me from combining the optical image with the other data, because I would not know which image pixel would (roughly) correspond to which lat/long coordinate.
I see three "solutions", but I don't know how to achieve any of them.
"Make" Mapbox return the images with out padding.
Compute the ratio for the correct w/h pixel ratio using the lat/long coordinate, so there would be no padding. Maybe with https://en.wikipedia.org/wiki/Equirectangular_projection like discussed here: https://stackoverflow.com/a/16271669/380038?
Find a way to determine the lat/long coordinates of the optical satellite image so I can cut off the possible padding.
I checked How can I extract a satellite image from google maps given a Lat Long Rectangle?, but I would prefer to use my existing paid Mapbox account and I got the impression that I still wouldn't get the exact optical image or the exact corner coordinates of the optical image.
The Mapbox Static Images API serves maps.
You have an optical image from another source.
You want to overlay these data.
Right?
Note the Red and Green pins: the waypoints are at opposite corners on Mapbox.
After the equirectangular correction, Mapbox matches OpenStreetMap (little wonder), but the Google coordinates are quite close too.
curl -g "https://api.mapbox.com/styles/v1/mapbox/streets-v11/static/[17.55490,47.10434,17.55718,47.10543]/600x419?access_token=YOUR_TOKEN_HERE" --output example-walk-600x419-nopad.png
What is your scale? 1 km - 100 km?
What is your source of optical image?
What is the required accuracy?
Just to mention, optical images have their own sources of distortions.
In practice:
You must have the extent of your non-optical satellite data (let's preserve the mist around it...). I'll call it ((x1, y1), (x2, y2)). We are coders, not cartographers - right!?
If you feed your extent to https://docs.mapbox.com/playground/static/ as
min longitude = x1, min latitude = y1, max longitude = x2, max latitude = y2
Select the "Bounding box" entry! Do you see the map around your data!? Don't mind the exact dimensions, just check that the map is related to your data! Maybe you have to swap some values to get to the right corner of the globe.
If you have the right ((x1, y1), (x2, y2)) coordinates, do the equirectangular transformation to get the right pixel size.
You've called it Solution #2.
Let's say the width of your non-optical satellite data is Wd and the height is Hd.
The Mapbox image will fit your data if you ask for Wm width and Hm height of Mapbox data, where (the cosine takes the latitude, in radians)
Wm = Wd
Hm = Wd * (y2 - y1) / ((x2 - x1) * cos(y1))
Now you can pull the Mapbox image by
curl -g "https://api.mapbox.com/styles/v1/mapbox/streets-v11/static/[<x1>,<y1>,<x2>,<y2>]/<Wm>x<Hm>?access_token=<YOUR_TOKEN>" --output overlay.png
If (Hd == Hm)
then { you are lucky :) the two images just fit each other }
else { the two images cover the same area, but you have to scale the height of one of them to make them match }
Well... almost. You have not revealed what size of area you want to cover. The equation above is just an approximation which works up to the size of a smaller country (~100 km or so). For continent scale you probably have to apply more accurate formulas.
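As code, the aspect-ratio correction might look like this (a sketch; matchingHeight is my own name, Math.cos wants radians, and the cosine of the latitude is what shrinks the longitude span):

```javascript
// Pixel height that matches a given pixel width for a lat/lon bbox,
// using the equirectangular approximation (fine up to ~100 km areas).
function matchingHeight(x1, y1, x2, y2, widthPx) {
  const midLatRad = ((y1 + y2) / 2) * Math.PI / 180;
  const groundWidth = (x2 - x1) * Math.cos(midLatRad); // lon degrees, shrunk by latitude
  const groundHeight = y2 - y1;                        // lat degrees
  return Math.round(widthPx * groundHeight / groundWidth);
}
```

At the equator a square bbox gives a square image; at 60° north the same bbox needs an image twice as tall as it is wide.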
In my opinion, your #2 idea is the way to go. You do have the LLng bbox, so all that remains is calculate its "real" size in pixels.
Let us say that you want (or can allow, or can afford) a resolution of 50m per pixel, and the area is small enough not to have distortions (i.e., a rectangle of say 1 arcsecond of latitude and 1 arcsecond of longitude has top and bottom sides of the same length, with an error less than your chosen resolution). These are, I believe, very loose requisites and easy to fulfill.
Then, you just need to calculate the distance between the (Lat1, Lon1) and (Lat1, Lon2) points, and between (Lat1, Lon1) and (Lat2, Lon1). Divide each distance in meters by 50, and you'll get the exact number of pixels:
Lon1 Lon2
Lat1 +---------------+
| |
| |
Lat2 +---------------+
And you have a formula for that - the haversine formula.
If you need higher precision, you could resort to Vincenty's formula for an oblate spheroid (there is a JavaScript library for it). On the MT site (first link) there is a live calculator that you can use to plug in data from your calls and verify whether the approach is indeed working. I.e. you plug in your bounding box, get the distance in meters, divide, and get the pixel size of the image. (If the image is good, chances are you can go with the simpler haversine. If it isn't, then there has to be some further quirk in the maps API - its projection, perhaps - that doesn't return the expected bounding box. But that seems unlikely.)
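A sketch of that calculation (the haversine formula on a spherical Earth, then dividing by the chosen metres-per-pixel; the example coordinates are made up):

```javascript
// Haversine distance in metres between two lat/lon points (spherical model).
function haversine(lat1, lon1, lat2, lon2) {
  const R = 6371000; // mean Earth radius in metres
  const toRad = d => d * Math.PI / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Pixel size of the bbox at 50 m per pixel
const lat1 = 47.10, lon1 = 17.55, lat2 = 47.11, lon2 = 17.56;
const widthPx = Math.round(haversine(lat1, lon1, lat1, lon2) / 50);
const heightPx = Math.round(haversine(lat1, lon1, lat2, lon1) / 50);
```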
I've had this exact problem when using a satellite image on an Apple Watch. I overlay some markers and a path, and convert everything from coordinates to pixels. Below is my code to determine the exact bbox result:
var maxHoleLat = 52.5738902
var maxHoleLon = 4.9577606
var minHoleLat = 52.563994
var minHoleLon = 4.922364
var mapMaxLat = 0.0
var mapMaxLon = 0.0
var mapMinLat = 0.0
var mapMinLon = 0.0
let token = "your token"
var resX = 1000.0
var resY = 1000.0
let screenX = 184.0
let screenY = 224.0 // 448/2 = 224 - navbarHeight
let navbarHeight = 0.0
var latDist = 111000.0
var lonDist = 111000.0
var dx = 0.0
var dy = 0.0
func latLonDist() {
    // calgary.rasc.ca/latlong.htm
    let latRad = maxHoleLat * .pi / 180
    // distance covered by 1 degree of longitude at the given latitude
    self.lonDist = 111412.88 * cos(latRad) - 0.09350 * cos(3 * latRad) + 0.00012 * cos(5 * latRad)
    print("lonDist = \(self.lonDist)")
    // distance covered by 1 degree of latitude at the given latitude
    self.latDist = 111132.95 - 0.55982 * cos(2 * latRad) + 0.00117 * cos(4 * latRad)
    print("latDist = \(self.latDist)")
}
func getMapUrl() {
    self.dx = (maxHoleLon - minHoleLon) * lonDist
    self.dy = (maxHoleLat - minHoleLat) * latDist
    // the map is square, but the hole is not:
    // check whether the hole spans less x than y
    if dx < dy {
        mapMaxLat = maxHoleLat
        mapMinLat = minHoleLat
        let midLon = (maxHoleLon + minHoleLon) / 2
        mapMaxLon = midLon + dy / 2 / lonDist
        mapMinLon = midLon - dy / 2 / lonDist
    } else {
        mapMaxLon = maxHoleLon
        mapMinLon = minHoleLon
        let midLat = (maxHoleLat + minHoleLat) / 2
        mapMaxLat = midLat + dx / 2 / latDist
        mapMinLat = midLat - dx / 2 / latDist
    }
    self.imageUrl = URL(string: "https://api.mapbox.com/styles/v1/mapbox/satellite-v9/static/[\(mapMinLon),\(mapMinLat),\(mapMaxLon),\(mapMaxLat)]/1000x1000?logo=false&access_token=\(token)")
    print("\(imageUrl)")
}

Raycast get local coordinates of hit of Plane

I have a plane with scale (64,1,36) and rotation (90,-180,0), and I need the local coordinates of a raycast hit in this 2D coordinate format:
(0,0)-------(64,0)
| |
| |
| |
(0,36)------(64,36)
with my current code:
RaycastHit hit;
Vector3 coords = new Vector3();
if (Physics.Raycast(GazeOriginCombinedLocal, GazeDirectionCombined, out hit, Mathf.Infinity))
{
if (!hit.transform.Equals("Cube"))
{
Pointer.transform.position = hit.point; //Green cube for visualization of hit in worldspace
// coords = RectTransformUtility.ScreenPointToLocalPointInRectangle(Plane, hit.point, Camera.main, out coords);// no result at all
}
}
Trying this:
hit.transform.InverseTransformPoint(hit.point)
gives me this
(5,-5)---(-5,-5)
| |
| (0,0) |
| |
(5,5)----(-5,5)
Does someone have an idea how to get the needed format?
This is how my plane (which is a child of the main camera) and my hierarchy look:
Thanks in advance
I think you could use Transform.InverseTransformPoint, which
Transforms position from world space to local space.
And then, since this is also affected by the scale, multiply it again by the scale of the plane using Vector3.Scale.
So your coords should probably be something like
coords = hit.transform.localScale / 2f + Vector3.Scale(hit.transform.InverseTransformPoint(hit.point), hit.transform.localScale);
I can't test it right now though, since I'm typing on a smartphone. You might e.g. need to invert the y/z component according to your needs and depending on how the plane is rotated etc. But I hope this gives you an idea.
In order to debug what's wrong you should probably print out the values step by step
var scale = hit.transform.localScale; // 64, 1, 36
var halfScale = scale / 2f; // 32, 0.5, 18
var localHitPoint = hit.transform.InverseTransformPoint(hit.point);
Debug.Log($"{nameof(localHitPoint)}:{localHitPoint:0.000}");
So what I had expected originally here would be values like
(-0.5, 0.5, 0)----(0.5, 0.5, 0)
| |
| (0, 0, 0) |
| |
(-0.5, -0.5, 0)---(0.5, -0.5, 0)
BUT as you now added: Your plane is rotated!
The 90° on X actually makes Y and Z switch places. So in order to get the desired Y coordinate, you would rather read localHitPoint.z.
Then the 180° on Y basically inverts both X and Z.
So I would now expect the values to look like
(0.5, 0, -0.5)----(-0.5, 0, -0.5)
| |
| (0, 0, 0) |
| |
(0.5, 0, 0.5)---(-0.5, 0, 0.5)
Which looks pretty much like the values you describe you are getting. Not sure though why you have a factor of 10 and why you didn't need to switch Y and Z.
However since you actually want the 0,0 to be in the top-left corner you only need to flip the X axis and use Z instead of Y so
fixedLocalHitPoint = new Vector2(-localHitPoint.x, localHitPoint.z);
Debug.Log($"{nameof(fixedLocalHitPoint)}:{fixedLocalHitPoint:0.000}");
Which should now give you values like
(-0.5, -0.5)----(0.5, -0.5)
| |
| (0, 0) |
| |
(-0.5, 0.5)----(0.5, 0.5)
And still you need to scale it up again
var scaledHitPoint = Vector2.Scale(fixedLocalHitPoint, new Vector2 (scale.x, scale.z));
Debug.Log($"{nameof(scaledHitPoint)}:{scaledHitPoint:0.000}");
Which should now give values like
(-32, -18)----(32, -18)
| |
| (0, 0) |
| |
(-32, 18)-----(32, 18)
That's why you need to add the center point as a reference
coords = new Vector2(halfScale.x, halfScale.z) + scaledHitPoint;
Debug.Log($"{nameof(coords)}:{coords:0.000}");
Which now should be
(0, 0)------(64, 0)
| |
| (32, 18) |
| |
(0, 36)-----(64, 36)
I hope this brings a bit more light into where these "strange" values come from.
Since your camera is scaled 1,1,1 and there is nothing else involved I have a hard time finding where the factor of 10 would have sneaked its way into the calculation to be honest.
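Stripped of the Unity types, the whole mapping is a couple of lines of arithmetic. A sketch (plain JavaScript objects standing in for Vector3, so the numbers are easy to check against the diagrams above):

```javascript
// Map a local hit point (components in -0.5..0.5, plane rotated 90 on X and
// 180 on Y) to 2D plane coordinates with (0,0) at the top-left corner.
function toPlaneCoords(local, width, height) {
  const x = -local.x; // the 180° on Y flips X
  const y = local.z;  // the 90° on X means Z is the vertical axis
  return { u: width / 2 + x * width, v: height / 2 + y * height };
}
```

With width 64 and height 36, the corner (0.5, 0, -0.5) lands on (0, 0) and the centre on (32, 18), matching the walkthrough.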
If you want to convert this:
hit.transform.InverseTransformPoint(hit.point)
which gives this:
(5,-5)---(-5,-5)
| |
| (0,0) |
| |
(5,5)----(-5,5)
to this:
(0,0)-------(64,0)
| |
| |
| |
(0,36)------(64,36)
Why not do this:
Vector2.Scale(
    hit.transform.InverseTransformPoint(hit.point) - new Vector2(5, -5),
    new Vector2(-6.4f, 3.6f) // f suffixes needed: -6.4 alone is a double and won't compile
);
This answer hardcodes the (5,-5) and (-6.4, 3.6) terms because the question doesn't include enough information to use variables instead.
Assuming the scale of the parent of the plane (Main Camera) is (10,10,10), then this should suffice:
Vector3 planeScale = hit.transform.localScale;
Vector3 cameraScale = hit.transform.parent.localScale;
result = Vector2.Scale(
    hit.transform.InverseTransformPoint(hit.point)
        - new Vector2(cameraScale.x * 0.5f, -cameraScale.y * 0.5f), // cameraScale is a Vector3, so take components
    new Vector2(-planeScale.x * 0.5f / cameraScale.x, planeScale.y * 0.5f / cameraScale.y)
);

Angle between looking direction and latitude/longitude

I am experimenting a little with AR. I have got the angle of the direction I am looking to from a compass in degrees. I know my own position and the position of another object (POI), the position is giving in form of latitude and longitude.
Now I would like to know how I can calculate the angle between the direction I am looking to and the POI.
Dot Product:
a . b = ||a|| ||b|| cos(t)
t = acos( (a.b)/(||a|| ||b||) )
||vector|| = length of vector (magnitude)
t = angle between the two vectors
You're probably going to need to do this a couple of times, once for each plane you have (1x for 2D, 2x for 3D).
Or:
/|
/ |
h / | y
/ |
/____|
x
t is the lower-left corner, which we'll assume is your object; the upper-right corner is going to be your other object:
x = obj2.x - obj1.x
y = obj2.y - obj1.y
h = sqrt( (x*x) + (y*y) )
sin(t) = y/h
cos(t) = x/h
tan(t) = y/x
t = asin(y/h)
t = acos(x/h)
t = atan(y/x)
What makes the first method better is that it accounts for your current rotation. The second method (using atan, asin, and acos) doesn't.
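For the original lat/long + compass setup, a concrete sketch (the names and the flat-earth approximation are mine; fine for nearby POIs, not for long distances):

```javascript
// Signed angle between the compass heading and the direction to a POI.
// Positive means the POI is to the right of where you are looking.
function angleToPoi(myLat, myLon, poiLat, poiLon, headingDeg) {
  const toRad = d => d * Math.PI / 180;
  // east/north components; longitude is compressed by cos(latitude)
  const east = (poiLon - myLon) * Math.cos(toRad(myLat));
  const north = poiLat - myLat;
  const bearing = (Math.atan2(east, north) * 180 / Math.PI + 360) % 360; // 0 = north
  return (bearing - headingDeg + 540) % 360 - 180; // normalised to -180..180
}
```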

Point tangent to circle

Using this as a reference: Find a tangent point on circle?
cx = 0;
cy = 0;
px = -3;
py = -8;
dx = cx - px;
dy = cy - py;
a = asin(5 / ((dx*dx + dy*dy)^0.5));
b = atan2(dy, dx);
t_1 = deg2rad(180) + b - a;
t_2 = deg2rad(180) + b + a;
For the point (7,6) the angles are 7.9572/73.4434, and for (-3,-8) they are 213.6264/285.2615. So for the first quadrant the angles do not make sense, but for the third quadrant they do. What am I doing wrong?
Your formula for a is wrong. You should use
a = acos(5 / ((dx*dx + dy*dy)^0.5))
instead of
a = asin(5 / ((dx*dx + dy*dy)^0.5))
i.e. use acos(...) instead of asin(...). The reason is shown in the image below. The formula for angle a is a = acos(r/H), where r is the radius of the circle and H is the length of the hypotenuse of the right-angle triangle. So this has nothing to do with asin(...) being unable to tell which of the two possible quadrants the passed-in value lies in: the argument of the asin is always positive, and you always want the answer in the range 0 to 90 degrees.
So the two angles that you want are b+a and b-a. Using acos instead of asin in your two cases produces 97.7592 & -16.5566 (or equivalently 343.4434) for your first-quadrant example, and -164.7385 & -56.3736 (or equivalently 195.2615 and 303.6264) for your third-quadrant example. (NB: instead of adding 180 degrees in the formulas for t_1 and t_2, you could just switch the signs of dx and dy.)
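For reference, the corrected computation as runnable code (a sketch; angles in degrees, with the +180 kept so the results match the original t_1/t_2 convention):

```javascript
// Angles (in degrees) of the two tangent points on a circle of radius r
// centred at (cx, cy), for tangent lines through the point (px, py).
function tangentAngles(cx, cy, px, py, r) {
  const dx = cx - px;
  const dy = cy - py;
  const H = Math.hypot(dx, dy); // distance from the point to the centre
  const a = Math.acos(r / H);   // acos, not asin
  const b = Math.atan2(dy, dx);
  const deg = rad => rad * 180 / Math.PI;
  return [deg(b - a) + 180, deg(b + a) + 180];
}
```

For the point (7, 6) with radius 5 this gives roughly -16.56 and 97.76 degrees, the values quoted above.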
First -- I spent like 10 minutes figuring out what the heck you're trying to do (which, in the end, I got from a comment on one of the answers), while solving your problem took 2 minutes. So, for future reference, please describe your problem as clearly as you can.
Now, I think you just have your signs messed up. Try the following:
%// difference vector
%// NOTE: these go the other way around for the atan2 to come out right
dx = px - cx;
dy = py - cy;
%// tip angle of the right triangle
a = asin( 5 / sqrt(dx*dx + dy*dy) );
%// angle between the (local) X-axis and the line of interest
b = atan2(dy, dx);
%// the third angle in the right triangle
%// NOTE: minus a here instead of plus b
g = pi/2 - a;
%// Angles of interest
%// NOTE1: signs are flipped; this automatically takes care of overshoots
%// NOTE2: don't forget to mod 360
t_1 = mod( rad2deg(b - g), 360)
t_2 = mod( rad2deg(b + g), 360)
Alternatively, you could skip computing the intermediate angle a by using acos instead of asin:
%// difference vector
dx = px - cx;
dy = py - cy;
%// Directly compute the third angle of the right triangle
%// (that is, the angle "at the origin")
g = acos( 5 / sqrt(dx*dx + dy*dy) );
%// angle between the (local) X-axis and the line of interest
b = atan2(dy, dx);
%// Angles of interest
t_1 = mod( rad2deg(b - g), 360)
t_2 = mod( rad2deg(b + g), 360)
Just another way to rediscover the trigonometric identity acos(x) = pi/2 - asin(x) :)
This MathWorld entry is what you want: http://mathworld.wolfram.com/CircleTangentLine.html.
Alright, it looks like you are not accounting for the fact that asin, atan (any inverse trig function) has no way to know which of the two possible quadrants the value you passed in lies in. To make up for that, the inverse trig function will assume your point is in the first or fourth quadrant (northeast/southeast). Therefore, if you call the atan function and your original point was in the second or third quadrant, you need to add 180 degrees / pi radians to whatever value it returns.
See the documentation here stating that asin returns a value from [-pi/2, pi/2] :
http://www.mathworks.com/help/matlab/ref/asin.html
Hope that helps :)
EDIT
I misunderstood the situation originally
Here is what I think you have calculated :
t_1 and t_2 represent the angles you would travel at if you started on the circle from the tangent point and wanted to travel to your original starting point.
Viewed with this perspective your angles are correct.
For the point (7,6)
If you started on the circle at approx. (0,5) and traveled at 7 degrees, you would hit the point.
If you started on the circle at approx. (5,0) and traveled at 70 degrees, you would hit the point.
Now, what is going to be more useful and less confusing than angles is the slope of the line. To get this from the angle, do the following, with the angle in degrees:
angle = (angle + 90 + 360) % 180 - 90 // this gives us the angle as it would be in quad 1 or 4
slope = tan( deg2rad( angle ) )