MKMapView - I have a road, consisting of many location points.
A line is drawn from each point to the next, making it a visible road line.
I have a user location, and I would like to draw a line to the closest point of that road.
That means I need to iterate over each pair of consecutive points and determine the point on that segment closest to the user location.
It's all working well, but the problem is that the line to the calculated closest point is sometimes not at 90 degrees to the road line (in some situations the angle is almost 45 degrees).
It seems to depend on the angle of the line.
Please see the example video:
https://imgur.com/a/27QFmHx
(Here is a screenshot from the video:)
In this example, there are three static black lines drawn; the two on the right are perpendicular to each other.
The red lines are calculated on the fly and drawn from the user location (the center of the map) to the closest point on each of the black lines.
You can see that the top line is straight (no angle), and the red line drawn to the closest point found on it is perpendicular.
But there is an issue with the black lines on the right: the point found (and the red line to it) is clearly not at 90 degrees.
This is the code I am using, to determine the closest point on the line:
func distanceBetweenTwoPointsFrom(origin: CLLocationCoordinate2D, pointOne: CLLocationCoordinate2D, pointTwo: CLLocationCoordinate2D) -> CLLocationCoordinate2D {
    // Vector from pointOne to the origin (the user location),
    // treating latitude/longitude directly as planar x/y.
    let A: Double = origin.latitude - pointOne.latitude
    let B: Double = origin.longitude - pointOne.longitude
    // Vector from pointOne to pointTwo (the segment).
    let C: Double = pointTwo.latitude - pointOne.latitude
    let D: Double = pointTwo.longitude - pointOne.longitude

    let dot: Double = A * C + B * D
    let len_sq: Double = C * C + D * D

    // Parameter of the projection of the origin onto the segment.
    var param: Double = -1
    if len_sq != 0 {
        param = dot / len_sq
    }

    var xx: Double = 0
    var yy: Double = 0
    if param < 0 || (pointOne.latitude == pointTwo.latitude && pointOne.longitude == pointTwo.longitude) {
        // Before the start of the segment (or a degenerate segment): clamp to pointOne.
        xx = pointOne.latitude
        yy = pointOne.longitude
    } else if param > 1 {
        // Past the end of the segment: clamp to pointTwo.
        xx = pointTwo.latitude
        yy = pointTwo.longitude
    } else {
        // Interior of the segment: the projected point.
        xx = pointOne.latitude + param * C
        yy = pointOne.longitude + param * D
    }
    return CLLocationCoordinate2D(latitude: xx, longitude: yy)
}
Question: How can I fix this so that the found point on the black line is at exactly 90 degrees from the user (visually the straightest line from the user location to that line)?
You should calculate the distance between two points as follows:
import MapKit
func distance(
    from: CLLocationCoordinate2D,
    to: CLLocationCoordinate2D
) -> CLLocationDistance {
    MKMapPoint(from).distance(to: MKMapPoint(to))
}
Note, though, that this will only find the shortest distance to one of the given points, which may not be the actual shortest distance from the user's location to any point on the path.
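For illustration, a minimal sketch of this first step, using the distance helper above to pick the nearest of the given road points (the name nearestVertex and the inputs are assumptions of mine, not from the original post):

import MapKit

// Hedged sketch: find the given road point nearest to the user,
// using the MKMapPoint-based distance helper defined above.
func nearestVertex(
    to userLocation: CLLocationCoordinate2D,
    among roadPoints: [CLLocationCoordinate2D]
) -> CLLocationCoordinate2D? {
    roadPoints.min(by: {
        distance(from: userLocation, to: $0) < distance(from: userLocation, to: $1)
    })
}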
Unless you have many points which approximate your path very closely, you may add this step: to increase accuracy, create a line through the two points which are closest to the user's location, using this formula:
https://en.wikipedia.org/wiki/Linear_equation
This basically approximates the actual path with a line, one that has infinitely many points. Then find the point on this line which is at the shortest distance from the user's location, using this formula:
https://en.wikipedia.org/wiki/Distance_from_a_point_to_a_line
This still does not guarantee that you get the actual point with the shortest distance, but it is certainly a better approximation.
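As a hedged sketch of that projection step (and of why the red line in the question looked skewed): do the math in MKMapPoint space rather than in raw latitude/longitude. MKMapPoint is a planar, Mercator-based projection, so a perpendicular computed there also looks perpendicular on the map. The function name is my own:

import MapKit

// Hedged sketch: project the user's location onto the segment between
// two road points, working in MKMapPoint (planar) space so the result
// meets the segment at a right angle on the map.
func closestPointOnSegment(
    from origin: CLLocationCoordinate2D,
    between pointOne: CLLocationCoordinate2D,
    and pointTwo: CLLocationCoordinate2D
) -> CLLocationCoordinate2D {
    let p = MKMapPoint(origin)
    let a = MKMapPoint(pointOne)
    let b = MKMapPoint(pointTwo)

    let dx = b.x - a.x
    let dy = b.y - a.y
    let lengthSquared = dx * dx + dy * dy

    // Degenerate segment: both endpoints coincide.
    guard lengthSquared > 0 else { return pointOne }

    // Projection parameter of p onto the line a-b, clamped to [0, 1]
    // so the result stays on the segment.
    let t = max(0, min(1, ((p.x - a.x) * dx + (p.y - a.y) * dy) / lengthSquared))

    return MKMapPoint(x: a.x + t * dx, y: a.y + t * dy).coordinate
}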
Related
I am trying to find the area of a polygon generated by a user's path. The path gives back an array of lat/lon points and is always self-closing. I have tried multiple methods that I found online. Below is what I currently have; I can't make sense of the output. The area * 6378137.0 * 6378137.0 is meant to give back an area in m^2, but the results are massive numbers.
My ideal solution would be to find a way to (with an assumed % of error) map my lat/lon to x/y coordinates and apply the shoelace theorem. I understand that this is pretty hard to do accurately because of the curvature of the earth.
Building on the last paragraph, maybe the best way to map to x/y coordinates would be some sort of projection method. I have not gone too far down that path yet.
What would be the best method to try in order to solve this problem? If someone could set me on the right path or decipher the code I have already tried, I would greatly appreciate it (the code is in a Swift playground). Thanks!
func deg2rad(_ number: Double) -> Double {
    return number * .pi / 180
}

func areaCalc(lat: [Double]?, lon: [Double]?) {
    guard let lat = lat,
          let lon = lon
    else { return }

    var area: Double = 0.0
    if lat.count > 2 {
        for i in stride(from: 0, to: lat.count - 1, by: 1) {
            let p1lon = lon[i]
            let p1lat = lat[i]
            let p2lon = lon[i+1]
            let p2lat = lat[i+1]
            area = area + (deg2rad(p2lon - p1lon)) * (2 + sin(deg2rad(p1lat))) + (sin(deg2rad(p2lat)))
        }
        // Earth's radius in meters, squared, to get m^2.
        area = area * 6378137.0 * 6378137.0
        area = abs(area / 2)
    }
}
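For illustration, here is a hedged sketch of the "project to x/y, then apply the shoelace theorem" idea from the question, using MapKit's MKMapPoint as the projection. MKMapPoint is Mercator-based, which distorts area away from the equator, so this is only a rough approximation for small polygons; the name approximateArea is my own:

import MapKit

// Hedged sketch: shoelace formula on projected (planar) coordinates.
func approximateArea(of coordinates: [CLLocationCoordinate2D]) -> Double {
    guard coordinates.count > 2 else { return 0 }
    let points = coordinates.map { MKMapPoint($0) }

    var area = 0.0
    for i in 0..<points.count {
        let p1 = points[i]
        let p2 = points[(i + 1) % points.count] // wrap around to close the ring
        area += p1.x * p2.y - p2.x * p1.y       // shoelace cross terms
    }

    // MKMapPoint units are not meters; convert using the local scale.
    let metersPerPoint = MKMetersPerMapPointAtLatitude(coordinates[0].latitude)
    return abs(area / 2) * metersPerPoint * metersPerPoint
}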
Given two coordinates in WGS84 (points 1 and 2, in red), I need to find the coordinates of the point perpendicular to the line (point 3) at a given distance.
I managed to work out the math to compute this perpendicular point, but when it is displayed on the map, the point seems to be in the wrong place, probably because of the projection.
What I want on a map:
And what I have instead on the map:
How can I take the projection into account so that the point on the map appears perpendicular to the line? The algorithm below to compute the point comes from here: https://math.stackexchange.com/questions/93424/calculate-rectangle-coordinates-from-line-and-height
public static Coords ComputePerpendicularPoint(Coords first, Coords last, double distance)
{
    double slope = -(last.Lon.Value - first.Lon.Value) / (last.Lat.Value - first.Lat.Value);

    // number of km per degree = ~111km (111.32 in google maps, but the range varies
    // between 110.567km at the equator and 111.699km at the poles)
    // 1km in degrees = 1 / 111.32km = 0.0089
    // 1m in degrees = 0.0089 / 1000 = 0.0000089
    distance = distance * 0.0000089 / 100; // 0.0000089 => around 1m in WGS84; /100 because distance is in cm

    double t = distance / Math.Sqrt(1 + (slope * slope));

    Coords perp_coord = new Coords();
    perp_coord.Lon = first.Lon + t;
    perp_coord.Lat = first.Lat + (t * slope);
    return perp_coord;
}
Thank you in advance!
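For illustration (a hedged sketch in Swift with MapKit, rather than the C# above): one way to take the projection into account is to do the perpendicular construction in a projected planar space where x and y share the same unit, and only convert back to latitude/longitude at the end. The names here are my own:

import MapKit

// Hedged sketch: build the perpendicular in MKMapPoint (planar) space,
// then convert back to a WGS84 coordinate.
func perpendicularPoint(
    from first: CLLocationCoordinate2D,
    to last: CLLocationCoordinate2D,
    distanceMeters: CLLocationDistance
) -> CLLocationCoordinate2D? {
    let a = MKMapPoint(first)
    let b = MKMapPoint(last)

    // Direction vector of the line a -> b.
    let dx = b.x - a.x
    let dy = b.y - a.y
    let length = (dx * dx + dy * dy).squareRoot()
    guard length > 0 else { return nil }

    // Rotate the unit direction by 90 degrees to get the perpendicular.
    let px = -dy / length
    let py = dx / length

    // Convert the metric distance into map-point units at this latitude.
    let offset = distanceMeters / MKMetersPerMapPointAtLatitude(first.latitude)

    return MKMapPoint(x: a.x + px * offset, y: a.y + py * offset).coordinate
}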
Hey there, I am trying to rotate a line around its own center within the lat/lng system.
I have the angle and the two points, so I tried to apply the rotation matrix, like this (the following method takes the latitude and longitude of a point, plus the angle):
LatLng rotate(double lat, double long, double angle) {
  double rad = angle * pi / 180;
  double newLong = long * cos(rad) - lat * sin(rad);
  double newLat = long * sin(rad) + lat * cos(rad);
  return LatLng(newLat, newLong);
}
For example, I have point A (latitude: x, longitude: y) and point B (latitude: x, longitude: y). Connecting these two points gives a line. Now I want to rotate the line around its own center with the above method, by calling:
LatLng newA = rotate(A.latitude, A.longitude, angle);
LatLng newB = rotate(B.latitude, B.longitude, angle);
But when I connect the two points newA and newB, I do not get the desired effect.
As @Abion47 clarified in his answer, I need a rotation in three dimensions, but how do I do that? And is it possible in two dimensions if the line is very small?
So here's the rub. The problem I mentioned before is that a latitude-longitude pair is a pair of angles, not a 2D vector of a point on a graph, so trying to use it to rotate a point in 3D space on the surface of a sphere is going to run into its own problems. As it turns out, however, as long as you don't pick points that cross either the international date line or the poles, you can still use this trick by just pretending the angle pair is a 2D vector.
The real problem is that you want to rotate the points around the midpoint, but your math is merely performing a straight rotation, which rotates them around the origin instead (i.e. 0,0). You need to offset your "points" by the point you are using as a reference.
import 'dart:math';

LatLng rotate(LatLng coord, LatLng midpoint, double angle) {
  // Make this constant so it doesn't have to be repeatedly recalculated
  const piDiv180 = pi / 180;

  // Convert the input angle to radians
  final r = angle * piDiv180;

  // Create local variables using appropriate nomenclature
  final x = coord.longitude;
  final y = coord.latitude;
  final mx = midpoint.longitude;
  final my = midpoint.latitude;

  // Offset input point by the midpoint so the midpoint becomes the origin
  final ox = x - mx;
  final oy = y - my;

  // Cache trig results because trig is expensive
  final cosr = cos(r);
  final sinr = sin(r);

  // Perform rotation
  final dx = ox * cosr - oy * sinr;
  final dy = ox * sinr + oy * cosr;

  // Undo the offset
  return LatLng(dy + my, dx + mx);
}
Using this approach, I ended up with the following results:
The blue points are the input, the green point is the calculated midpoint, and the red points are each of the blue points passed through a 90 degree rotation.
(Note that the distance between the blue points appears to be farther than the distance between the red points. This is because I visualized the results in Google Maps which uses the Mercator projection, and that had the result of screwing with where the points appear to be relative to each other. If you were to visualize this on a globe, the points should appear the correct distance from each other.)
According to the documentation for Path:
Closed sub-paths enclose a (possibly discontiguous) region of the plane based on the current fillType.
As far as I understand, this implies that when a Path object is closed, it surrounds a two-dimensional area.
When the user clicks on a point of the screen, I want to calculate the distance between the point the user clicked and the area surrounded by the path. I get the clicked point via GestureDetector/onPanDown, but I have trouble figuring out how to calculate the distance to the path (or to the area surrounded by the path). All the functions that Path offers seem to return void or bool, but no distances.
Imagine for illustration: (red is the Path object as I draw it to the screen, and the X is supposed to be where my user clicks; the distance between the two, represented by the green line, is what I'm interested in)
How do I calculate the distance?
First of all, traverse all the points of the path, and for each point find the distance to the clicked position, keeping the shortest one.
To get the points of the path, use PathMetrics.
import 'dart:math';
import 'dart:ui';

double getShortestDistance(Path path, Offset clickedPoint) {
  final pathMetrics = path.computeMetrics();
  var minDistance = double.infinity;
  for (final metric in pathMetrics) {
    // Walk the path in one-pixel steps along its arc length.
    for (var i = 0; i < metric.length; i++) {
      final tangent = metric.getTangentForOffset(i.toDouble());
      if (tangent == null) continue;
      final distance = getDistance(tangent.position, clickedPoint);
      if (distance < minDistance) {
        minDistance = distance;
      }
    }
  }
  return minDistance;
}

double getDistance(Offset pos, Offset clickedPoint) {
  final dx = pos.dx - clickedPoint.dx;
  final dy = pos.dy - clickedPoint.dy;
  return sqrt(dx * dx + dy * dy);
}
I got the reference from here.
I am using the Google Maps SDK to allow a user to draw a polygon on the map by tapping. Everything works perfectly if the user draws the polygon by following a path and continuing along it without crossing over lines. In that case, this result is produced:
However, if the user makes an error and crosses over or changes the direction of their "tapping" path, this happens:
I need to either:
A) alert the user that they have created an invalid polygon, and must undo that action, or
B) correct the polygon shape to form a complete polygon.
With the research I have done, option A seems much more feasible and simple, since option B would require rearranging the path of the polygon points.
I have done research and found algorithms and formulas to detect line intersection, but I have yet to find any piece of a solution in Swift that recognizes whether a polygon self-intersects based on its points (in this case, latitude and longitude). I don't need to know the point, just TRUE or FALSE to the question, "Does this polygon self-intersect?" The polygon will typically have fewer than 20 sides.
Perhaps there is a solution built into the Google Maps SDK, but I have yet to find it. Also, I understand that there are already algorithms for problems such as these; I am just having trouble implementing them in Swift 2 or 3. Any help is appreciated, thanks!
I'm guessing that you're trying to plot out the quickest way to get from point to point as the crow flies. You'll probably want to consider road direction too, which I won't here.
Both your options are possible. It's easy enough to iterate over every existing line when a new line is added and determine whether they've crossed. But your user would definitely rather not be told that they've screwed up; your app should just fix it for them. This is where it gets fun.
I am certain algorithms exist for finding the minimal polygon containing all points, but I didn't look them up, because where's the fun in that.
Here's how I would do it. In pseudocode:
if (line has intersected existing line)
    find mean point (sum x, sum y / n)
    find nearest point to centre by:
        taking min of: points.map(sqrt((x - centrex)^2 + (y - centrey)^2))
    from the line between centre and nearest point, determine angle to every other line.
    points.remove(nearest)
    angles = points.map(cosine law(nearest to centre, centre, this point))
        <- make sure to check if it crossed pi, at which point you must add pi.
    sort angles so minimum is first.
    starting at nearest point, add line to next point in the array of minimal angle points
I'm sorry I haven't put this into Swift. I will update tomorrow with proper Swift 3.
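As a hedged illustration of the pseudocode above (not the promised Swift update), here is one way the angle-sort idea could look in Swift: order the tapped points by their angle around the mean point, which yields a star-shaped, and therefore non-self-intersecting, polygon. The names are my own, and atan2 sidesteps the "crossed pi" bookkeeping from the pseudocode:

import CoreLocation
import Foundation

// Hedged sketch: sort points by angle around their mean point so that
// connecting them in order produces a simple (non-crossing) polygon.
func sortedIntoSimplePolygon(_ points: [CLLocationCoordinate2D]) -> [CLLocationCoordinate2D] {
    guard points.count > 2 else { return points }

    // Mean point: (sum x / n, sum y / n), as in the pseudocode.
    let n = Double(points.count)
    let cx = points.reduce(0) { $0 + $1.longitude } / n
    let cy = points.reduce(0) { $0 + $1.latitude } / n

    // Sort by angle around the mean point.
    return points.sorted {
        atan2($0.latitude - cy, $0.longitude - cx) <
        atan2($1.latitude - cy, $1.longitude - cx)
    }
}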
This seems to be working pretty well for what I need. Adapted from Rob's answer here.
func intersectionBetweenSegmentsCL(_ p0: CLLocationCoordinate2D, _ p1: CLLocationCoordinate2D, _ p2: CLLocationCoordinate2D, _ p3: CLLocationCoordinate2D) -> CLLocationCoordinate2D? {
    // Standard 2D segment-segment intersection, treating longitude as x
    // and latitude as y.
    var denominator = (p3.longitude - p2.longitude) * (p1.latitude - p0.latitude) - (p3.latitude - p2.latitude) * (p1.longitude - p0.longitude)
    var ua = (p3.latitude - p2.latitude) * (p0.longitude - p2.longitude) - (p3.longitude - p2.longitude) * (p0.latitude - p2.latitude)
    var ub = (p1.latitude - p0.latitude) * (p0.longitude - p2.longitude) - (p1.longitude - p0.longitude) * (p0.latitude - p2.latitude)
    if denominator < 0 {
        ua = -ua; ub = -ub; denominator = -denominator
    }
    // The segments intersect when both parameters fall within [0, denominator].
    if ua >= 0.0 && ua <= denominator && ub >= 0.0 && ub <= denominator && denominator != 0 {
        print("INTERSECT")
        return CLLocationCoordinate2D(latitude: p0.latitude + ua / denominator * (p1.latitude - p0.latitude), longitude: p0.longitude + ua / denominator * (p1.longitude - p0.longitude))
    }
    return nil
}
I then implemented it like this:
if coordArray.count > 2 {
    let n = coordArray.count - 1
    for i in 1 ..< n {
        // Skip j == i - 1: adjacent segments always share an endpoint.
        for j in 0 ..< i - 1 {
            if let intersection = intersectionBetweenSegmentsCL(coordArray[i], coordArray[i+1], coordArray[j], coordArray[j+1]) {
                // do whatever you want with `intersection`
                print("Error: Intersection # \(intersection)")
            }
        }
    }
}