I am using the Google Maps SDK to let a user draw a polygon on the map by tapping. Everything works perfectly if the user draws the polygon along a path and continues along that path without crossing over any lines. When that happens, this result is produced:
However, if the user makes an error and crosses over or changes the direction of their "tapping" path, this happens:
I need to either:
A) alert the user that they have created an invalid polygon, and must undo that action, or
B) correct the polygon shape to form a complete polygon.
With the research I have done, option A seems much more feasible and simpler, since option B would require rearranging the path of the polygon's points.
I have done research and found algorithms and formulas to detect line intersection, but I have yet to find any piece of a solution in Swift that recognizes whether a polygon self-intersects based on its points (in this case, latitudes and longitudes). I don't need to know the point, just TRUE or FALSE to the question, "Does this polygon self-intersect?" The polygon will typically have fewer than 20 sides.
Perhaps there is a solution built into the Google Maps SDK, but I have yet to find it. Also, I understand that there are already algorithms for problems like these; I am just having trouble implementing them in Swift 2 or 3. Any help is appreciated, thanks!
I'm guessing that you're trying to plot out the quickest way to get from point to point as the crow flies. You'll probably want to consider road direction too, which I won't here.
Both your options are possible. It's easy enough to iterate over every existing line when a new line is added and determine whether they've crossed. But your user would definitely rather not be told that they've screwed up; your app should just fix it for them. This is where it gets fun.
I am certain algorithms exist for finding the minimal polygon containing all points, but I didn't look them up, because where's the fun in that.
Here's how I would do it. In pseudocode:
if (new line intersects an existing line)
    find the mean point: centre = (sum(x) / n, sum(y) / n)
    find the point nearest the centre:
        nearest = point minimising sqrt((x - centre.x)^2 + (y - centre.y)^2)
    from the line between centre and nearest, determine the angle to every other point:
        points.remove(nearest)
        angles = points.map(cosine law(nearest-to-centre, centre, this point))
        (make sure to check whether the angle crossed pi; if so, add pi)
    sort the points so the minimum angle comes first
    starting at nearest, add a line to the next point in the angle-sorted array
I'm sorry I haven't put this into Swift. I will update tomorrow with proper Swift 3.
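In the meantime, here's a rough Swift sketch of the angular-sort idea (my own untested translation: it uses atan2 rather than the cosine law, treats latitude/longitude as planar, and only guarantees a simple polygon when the points are star-shaped around their centroid):
import CoreLocation

// Reorder the tapped points by angle around their centroid so that the
// resulting ring cannot self-intersect (for star-shaped point sets).
func sortedAroundCentroid(_ points: [CLLocationCoordinate2D]) -> [CLLocationCoordinate2D] {
    guard points.count > 2 else { return points }
    let n = Double(points.count)
    let centerLat = points.reduce(0.0) { $0 + $1.latitude } / n
    let centerLon = points.reduce(0.0) { $0 + $1.longitude } / n
    // atan2 gives each point's angle around the centroid in (-pi, pi],
    // so sorting by it walks the ring in one consistent direction.
    return points.sorted {
        atan2($0.latitude - centerLat, $0.longitude - centerLon) <
        atan2($1.latitude - centerLat, $1.longitude - centerLon)
    }
}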
This seems to be working pretty well for what I need. Adapted from Rob's answer here:
import CoreLocation

// Returns the point where segments p0-p1 and p2-p3 intersect, or nil if
// they don't. Latitude/longitude are treated as planar coordinates, which
// is fine at the scale of a hand-drawn polygon.
func intersectionBetweenSegmentsCL(_ p0: CLLocationCoordinate2D, _ p1: CLLocationCoordinate2D, _ p2: CLLocationCoordinate2D, _ p3: CLLocationCoordinate2D) -> CLLocationCoordinate2D? {
    var denominator = (p3.longitude - p2.longitude) * (p1.latitude - p0.latitude) - (p3.latitude - p2.latitude) * (p1.longitude - p0.longitude)
    var ua = (p3.latitude - p2.latitude) * (p0.longitude - p2.longitude) - (p3.longitude - p2.longitude) * (p0.latitude - p2.latitude)
    var ub = (p1.latitude - p0.latitude) * (p0.longitude - p2.longitude) - (p1.longitude - p0.longitude) * (p0.latitude - p2.latitude)
    if denominator < 0 {
        ua = -ua; ub = -ub; denominator = -denominator
    }
    // Both parameters must fall within [0, denominator], i.e. the scaled
    // [0, 1] range, for the segments themselves (not just their lines) to cross.
    if ua >= 0.0 && ua <= denominator && ub >= 0.0 && ub <= denominator && denominator != 0 {
        print("INTERSECT")
        return CLLocationCoordinate2D(latitude: p0.latitude + ua / denominator * (p1.latitude - p0.latitude), longitude: p0.longitude + ua / denominator * (p1.longitude - p0.longitude))
    }
    return nil
}
I then implemented it like this:
if coordArray.count > 2 {
    let n = coordArray.count - 1
    for i in 1 ..< n {
        // Compare segment (i, i+1) with every earlier segment, skipping the
        // adjacent one (j == i - 1), which always shares an endpoint.
        for j in 0 ..< i - 1 {
            if let intersection = intersectionBetweenSegmentsCL(coordArray[i], coordArray[i+1], coordArray[j], coordArray[j+1]) {
                // do whatever you want with `intersection`
                print("Error: Intersection # \(intersection)")
            }
        }
    }
}
MKMapView - I have a road consisting of many location points.
A line is drawn from each point to the next, making a visible road line.
I have a user location, and I would like to draw a line to the closest point on that road.
That means I need to iterate over each pair of points and determine the point on that segment closest to the user location.
It's all working well, but the problem is that the calculated closest point on a line is sometimes not at 90 degrees to that line. (In some situations the angle is almost 45 degrees.)
It seems to depend on the angle of the line.
Please see the example video:
https://imgur.com/a/27QFmHx
(Or screenshot from the video:)
In this example, there are 3 static black lines drawn, two of them perpendicular to each other (the ones on the right).
The red lines are calculated on the fly and drawn between the user location (the center of the map) and the closest point found on each of the lines.
You can see that the top line is straight (no angle), and the line drawn to its closest point is perpendicular.
But there is an issue with the black lines on the right. The point found (and the red line) is clearly not at 90 degrees.
This is the code I am using, to determine the closest point on the line:
func distanceBetweenTwoPointsFrom(origin: CLLocationCoordinate2D, pointOne: CLLocationCoordinate2D, pointTwo: CLLocationCoordinate2D) -> CLLocationCoordinate2D {
    // Project `origin` onto the segment pointOne-pointTwo, treating
    // latitude/longitude as planar x/y coordinates.
    let A: Double = origin.latitude - pointOne.latitude
    let B: Double = origin.longitude - pointOne.longitude
    let C: Double = pointTwo.latitude - pointOne.latitude
    let D: Double = pointTwo.longitude - pointOne.longitude

    let dot: Double = A * C + B * D
    let len_sq: Double = C * C + D * D
    var param: Double = -1
    if len_sq != 0 {
        param = dot / len_sq
    }

    var xx: Double = 0
    var yy: Double = 0
    if param < 0 || (pointOne.latitude == pointTwo.latitude && pointOne.longitude == pointTwo.longitude) {
        // Before the start of the segment (or a degenerate segment): clamp to pointOne.
        xx = pointOne.latitude
        yy = pointOne.longitude
    } else if param > 1 {
        // Past the end of the segment: clamp to pointTwo.
        xx = pointTwo.latitude
        yy = pointTwo.longitude
    } else {
        // The perpendicular foot lies within the segment.
        xx = pointOne.latitude + param * C
        yy = pointOne.longitude + param * D
    }
    return CLLocationCoordinate2D(latitude: xx, longitude: yy)
}
Question: How can I fix it so that the point found on the black line is directly at 90 degrees from the user? (Visually, the straightest possible line from the user's location to the road line.)
You should calculate the distance between two points as follows:
import MapKit
func distance(
    from: CLLocationCoordinate2D,
    to: CLLocationCoordinate2D
) -> CLLocationDistance {
    MKMapPoint(from).distance(to: MKMapPoint(to))
}
Note, though, that you will only find the shortest distance from the given points, which may not be the actual shortest distance from the user's location to any point on the path.
Unless you have many points which approximate your path very closely, you may add this step:
In order to increase accuracy, you may create a line through the two points which are closest to the user's location, using the linear equation formula:
https://en.wikipedia.org/wiki/Linear_equation
This basically creates an approximation of the actual path using a line - but one with infinitely many points.
Then search for the point at the shortest distance from the user's location to this line, using the distance-from-a-point-to-a-line formula:
https://en.wikipedia.org/wiki/Distance_from_a_point_to_a_line
This also does not guarantee that you get the actual point with the shortest distance, but it is certainly a better approximation.
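The perpendicularity problem itself comes from the fact that a degree of latitude and a degree of longitude do not have the same length, so a projection computed on raw lat/lon values looks skewed on screen. One way around it is to do the projection in MKMapPoint space (a planar map projection) and convert back; here is an untested sketch with my own naming:
import MapKit

// Closest point on segment a-b to `origin`, computed in MKMapPoint space
// so that "perpendicular" matches what you see on the map.
func closestCoordinate(to origin: CLLocationCoordinate2D,
                       onSegmentFrom a: CLLocationCoordinate2D,
                       to b: CLLocationCoordinate2D) -> CLLocationCoordinate2D {
    let p = MKMapPoint(origin)
    let p1 = MKMapPoint(a)
    let p2 = MKMapPoint(b)
    let dx = p2.x - p1.x
    let dy = p2.y - p1.y
    let lengthSquared = dx * dx + dy * dy
    guard lengthSquared > 0 else { return a } // degenerate segment
    // Parameter of the perpendicular foot, clamped so it stays on the segment.
    let t = max(0, min(1, ((p.x - p1.x) * dx + (p.y - p1.y) * dy) / lengthSquared))
    return MKMapPoint(x: p1.x + t * dx, y: p1.y + t * dy).coordinate
}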
I'm working on a program in which I want to store the distance the user has walked since pressing a button. I retrieve the distance via the geolocator package and display it on screen, which works just fine.
I know there are distanceBetween functions for locations, but as far as I can tell they just calculate the straight-line distance between 2 points and not the actual distance the user walked. For example, if a user starts at point X, walks over to point Y and back to X, that would end in comparing start and end point (X to X), which results in distance 0, but I want the distance X -> Y -> X.
I added the following function, which calculates the distance based on longitude/latitude:
double distance(Position start, Position current) {
  return double.parse((acos(sin(start.latitude) * sin(current.latitude) +
              cos(start.latitude) * cos(current.latitude) *
              cos(current.longitude - start.longitude)) * 6371)
      .toStringAsFixed(2));
}
I call it every frame and store the distance between the current and last GPS position.
It works slowly but fine, except for one problem:
At some point, the double suddenly turns into NaN, and I can't figure out why.
It's completely random when this occurs - at the beginning it was always around 0.6, but it has also occurred around 4.5 and 0.2, so I think the problem may be somewhere else.
Can anybody help?
Or does anybody know a built-in function that can solve the same problem?
I tried parsing the double to only have 2 decimal places (I didn't round it before) because I thought the number might just have too many decimal places to be displayed, but the error still occurred.
I have a second task that runs at the same time each time stamp, so I thought it was hindering retrieving the GPS position, but disabling it didn't change anything.
It's possible that you are getting numerical stability issues with the spherical law of cosines, since you're calculating the distance on every frame. The formula is known to have conditioning issues for very small distances (less than one meter).
Note that the domain of arccosine(x) is -1 <= x <= 1. If in your case you were to supply a value greater than 1 (or smaller than -1), you would get a NaN result.
If you are still debugging this you can add a simple print statement:
double distance(Position start, Position current) {
  double x = sin(start.latitude) * sin(current.latitude) +
      cos(start.latitude) * cos(current.latitude) *
          cos(current.longitude - start.longitude);
  if (x > 1 || x < -1) {
    print("error");
  }
  return acos(x) * 6371;
}
If this is indeed the case, then you have a few options: use the Haversine formula, because it is better conditioned for small distances, or simply set x to 1 if it's above 1 - that just means the distance is zero anyway.
For more information (and the Haversine formula) see also: Great circle distance
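For reference, here is a minimal sketch of the Haversine formula (written in Swift here rather than Dart, with a function name of my choosing; note that it converts degrees to radians first, which the trigonometric functions expect):
import Foundation

// Haversine great-circle distance in kilometres. Better conditioned than
// the spherical law of cosines for very small distances. Inputs in degrees.
func haversineKm(lat1: Double, lon1: Double, lat2: Double, lon2: Double) -> Double {
    let phi1 = lat1 * .pi / 180
    let phi2 = lat2 * .pi / 180
    let dPhi = (lat2 - lat1) * .pi / 180
    let dLambda = (lon2 - lon1) * .pi / 180
    let a = sin(dPhi / 2) * sin(dPhi / 2) +
        cos(phi1) * cos(phi2) * sin(dLambda / 2) * sin(dLambda / 2)
    return 6371 * 2 * atan2(sqrt(a), sqrt(1 - a))
}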
I really didn't think about the arccosine's domain...
So I updated my code with your suggestion to:
double distance(Position start, Position current) {
  double x = sin(start.latitude) * sin(current.latitude) +
      cos(start.latitude) * cos(current.latitude) *
          cos(current.longitude - start.longitude);
  if (x > 1 || x < -1) {
    if (kDebugMode) {
      print("error");
    }
    return 0;
  }
  return double.parse((acos(x) * 6371).toStringAsFixed(2));
}
It works fine, thank you for your help!
I am having problems understanding 2D motion vectors when moving certain objects over a given time. My knowledge of linear algebra is limited and I don't really know the exact search terms to look for, so I wanted to ask whether anybody could help me or at least point me in the right direction.
My problem looks like this:
I have two points, a startPoint and an endPoint, in space. Each has a specific location, denoted (x_1, x_2) and (y_1, y_2) respectively. Both of these points have a time attached, named t_startPoint and t_endPoint respectively. I now want to find out, for a given currentTime (basically any point in time between t_startPoint and t_endPoint), where exactly a new point N would be positioned on the line connecting those two points. I know the description is not trivial, which is why I also added an image describing what I would like to do:
So far, this is what I have as my algorithm:
func update(_ time: Int64) {
    let t_startPoint: Int64 = 1
    let position_startPoint = (x: 1.0, y: 1.0)
    let t_endPoint: Int64 = 5
    let position_endPoint = (x: 4.0, y: 5.0)
    let currentTime: Int64 = 3

    let duration = t_endPoint - t_startPoint
    let x = position_startPoint.x + ((position_endPoint.x - position_startPoint.x) / Double(duration)) * Double(currentTime - t_startPoint)
    let y = position_startPoint.y + ((position_endPoint.y - position_startPoint.y) / Double(duration)) * Double(currentTime - t_startPoint)
    // ...
}
However, no matter what I do, my objects keep overshooting or erratically moving back and forth, and I don't know where to start. Any help would be greatly appreciated!
For motion at constant velocity there is the relation:
(t - t1) / (t2 - t1) = (x - x1) / (x2 - x1)
x = x1 + (x2 - x1) * (t - t1) / (t2 - t1)
so your expression looks right. Check:
1 + (4 - 1) * (3 - 1) / (5 - 1) = 1 + 3 * 2 / 4 = 2.5 - the exact middle, OK
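Here is a minimal Swift sketch of that relation (names are mine). One common cause of the overshooting you describe is not clamping currentTime: once it passes t_endPoint the factor exceeds 1 and the object keeps moving past the end point, so clamp the time first:
func interpolate(start: (x: Double, y: Double), end: (x: Double, y: Double),
                 tStart: Int64, tEnd: Int64, currentTime: Int64) -> (x: Double, y: Double) {
    // Clamp the time into [tStart, tEnd] so the factor stays in [0, 1].
    let clamped = min(max(currentTime, tStart), tEnd)
    let t = Double(clamped - tStart) / Double(tEnd - tStart)
    return (x: start.x + (end.x - start.x) * t,
            y: start.y + (end.y - start.y) * t)
}
// interpolate(start: (1, 1), end: (4, 5), tStart: 1, tEnd: 5, currentTime: 3)
// returns (2.5, 3.0) - the exact middle, matching the check above.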
I'm working on a game in Unity where you can walk around in a city that also exists in real life.
In the game you should be able to enter real-world coordinates, or use your phone's GPS, and you'll be transported to the in-game position of those coordinates.
For this, I'd need to somehow convert the game coordinates to latitude and longitude coordinates. I have some coordinates of specific buildings, and I figured I might be able to write a script to determine the game coordinates from those reference points.
I've been searching Google for a bit, and though I have probably come across the right solutions occasionally, I've been unable to understand them well enough to use them in my code.
If someone has experience with this, or knows how I could do this, I'd appreciate it if you could help me understand it :)
Edit: I forgot to mention that previous programmers have already placed the world at some position and rotation they felt like using, which unfortunately I can't simply change without breaking things.
Tim Falken
This is simple linear math. The main issue you'll come across is the fact that your game coordinate system will probably be reversed along one or more axes; you'll probably need to reverse the direction along the latitude (Y) axis of your app. Aside from that, it is just a simple conversion of scales. Since you say this is the map of a real place, you should be able to easily figure out the min/max lon/lat which your map covers. Take the absolute value of the difference between these two values and divide it by the width/height of your map in each direction. This gives you the change in latitude (or longitude) per map unit. Store this value and it should be easy to convert both ways between the two units. Make functions that abstract away the details and you should have no problems calculating this either way.
I assume that you have been able to retrieve the GPS coordinates OK.
EDIT:
By simple linear math I mean something like this (this is C++-style pseudocode, completely untested; in a real-world example the constants would all be member variables instead):
define('MAP_WIDTH', 1000);
define('MAP_HEIGHT', 1000);
define('MIN_LON', 25.333);
define('MIN_LAT', 20.333);
define('MAX_LON', 27.25);
define('MAX_LAT', 20.50);

class CoordConversion {
    float XScale = abs(MAX_LON - MIN_LON) / MAP_WIDTH;   // degrees per map unit
    float YScale = abs(MAX_LAT - MIN_LAT) / MAP_HEIGHT;
    int LonDir = MIN_LON < MAX_LON ? 1 : -1;
    int LatDir = MIN_LAT < MAX_LAT ? 1 : -1;

    float GetXFromLon(float lon) {
        // divide by the scale: degrees -> map units
        return (this.LonDir > 0 ? (lon - MIN_LON) : (lon - MAX_LON)) / this.XScale;
    }
    float GetYFromLat(float lat) {
        return (this.LatDir > 0 ? (lat - MIN_LAT) : (lat - MAX_LAT)) / this.YScale;
    }
    float GetLonFromX(float x) {
        // multiply by the scale: map units -> degrees
        return (this.LonDir > 0 ? MIN_LON : MAX_LON) + (x * this.XScale);
    }
    float GetLatFromY(float y) {
        return (this.LatDir > 0 ? MIN_LAT : MAX_LAT) + (y * this.YScale);
    }
}
EDIT2: In case the map is rotated, you'll want to use the minimum and maximum lon/lat actually shown on the map. You'll also need to rotate each point after the conversion. I'm not even going to attempt to get this right off the top of my head, but I can give you the code you'll need:
POINT rotate_point(float cx, float cy, float angle, POINT p)
{
    float s = sin(angle);
    float c = cos(angle);

    // translate point back to origin:
    p.x -= cx;
    p.y -= cy;

    // rotate point
    float xnew = p.x * c - p.y * s;
    float ynew = p.x * s + p.y * c;

    // translate point back:
    p.x = xnew + cx;
    p.y = ynew + cy;
    return p;
}
This will need to be done when returning a game point, and it also needs to be done in reverse before converting a game point to a lat/lon point.
EDIT3: More help on getting the coordinates of your map. First find the city (or whatever it is) on Google Maps. Then you can right-click the highest point (furthest north) on your map and read off the maximum latitude. Repeat this for all four cardinal directions and you should be set.
I have 2 coordinates and would like to do something seemingly straightforward. I want to figure out, given:
1) Coordinate A
2) Course provided by Core Location
3) Coordinate B
the following:
1) Distance between A and B (this can currently be done using distanceFromLocation), so OK on that one.
2) The course that should be taken to get from A to B (different from the course currently being traveled).
Is there a simple way to accomplish this - any third-party or built-in API?
Apple doesn't seem to provide this, but I could be wrong.
Thanks,
~Arash
EDIT:
Thanks for the fast responses. I believe there may have been some confusion: I am looking to get the course (the bearing from point A to point B in degrees, so that 0 degrees = north and 90 degrees = east), similar to the course value returned by CLLocation. I am not trying to compute actual turn-by-turn directions.
I have some code on GitHub that does that. Take a look at headingInRadians here. It is based on the initial bearing (forward azimuth) formula; I derived the code from the algorithm on this page.
/*-------------------------------------------------------------------------
* Given two lat/lon points on earth, calculates the heading
* from lat1/lon1 to lat2/lon2.
*
* lat/lon params in radians
* result in radians
*-------------------------------------------------------------------------*/
double headingInRadians(double lat1, double lon1, double lat2, double lon2)
{
//-------------------------------------------------------------------------
// Algorithm found at http://www.movable-type.co.uk/scripts/latlong.html
//
// Initial bearing (forward azimuth) formula
//
// Formula: θ = atan2( sin(Δlong) * cos(lat2),
// cos(lat1) * sin(lat2) − sin(lat1) * cos(lat2) * cos(Δlong) )
// JavaScript:
//
// var y = Math.sin(dLon) * Math.cos(lat2);
// var x = Math.cos(lat1) * Math.sin(lat2) - Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLon);
// var brng = Math.atan2(y, x).toDeg();
//-------------------------------------------------------------------------
double dLon = lon2 - lon1;
double y = sin(dLon) * cos(lat2);
double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon);
return atan2(y, x);
}
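Since you want a course in degrees where 0 = north and 90 = east, here is the same formula as an untested Swift sketch, normalised to [0, 360) (the function name is mine):
import CoreLocation

// Initial bearing from a to b in degrees: 0 = north, 90 = east.
func course(from a: CLLocationCoordinate2D, to b: CLLocationCoordinate2D) -> CLLocationDirection {
    let lat1 = a.latitude * .pi / 180
    let lat2 = b.latitude * .pi / 180
    let dLon = (b.longitude - a.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let degrees = atan2(y, x) * 180 / .pi
    // atan2 lies in (-180, 180]; shift into [0, 360) for a compass course.
    return (degrees + 360).truncatingRemainder(dividingBy: 360)
}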
See How to get angle between two POI?
Depending on how much work you want to put into this one, I would suggest looking at graph traversal algorithms, things like A* (A-star), which you can use to find your way from one point to another, even if there are obstacles in between.
If I understand you correctly, you have the current location and you have some other location. You want to find the distance (as the crow flies) between the two points, and to find a walking path between the points.
To answer your first question: distanceFromLocation will find the distance across the earth's surface between 2 points - that is, it follows the curvature of the earth and gives you the distance as the crow flies. So I think you're right about that.
The second question is much harder. What you want to do is called path-finding. Path-finding requires not only a search algorithm to decide on the path, but also data about the possible paths. That is to say, if you want to find a path through the streets, the computer has to know how the streets are connected to each other. Furthermore, if you're trying to make a pathfinder that accounts for traffic and the time differences between two possible paths, you will need a whole lot more data. It is for this reason that we usually leave these kinds of tasks to big companies with lots of resources, like Google and Yahoo.
However, If you're still interested in doing it, check this out
http://www.youtube.com/watch?v=DoamZwkEDK0