Distance between two points with MapKit WITHOUT geodesic distance calculation - Swift

I have a game map that has been tiled over the MapKit world map. I generate a path for the player to take: I find the 3 nearest nodes (in-game cities), select one at random, then recurse to find a 3rd node. I have some logic that ensures the nodes chosen at each stage aren't in any of the previous arrays, so the path stays clean with no "coming back on yourself".
However, the issue I'm facing is that I'm using CLLocation.distance(from:), which unfortunately performs a geodesic calculation that accounts for the curvature of the earth. Is there any way to offset the curve? My current logic ends up with every path slowly leaning towards the poles, since the game map is just a flat image.
I've thought about converting each CLLocation to a point in a UIView and measuring between the first node and all possible second nodes; however, this becomes massively intensive.
Any ideas on how to either offset the curve calculation or remove it altogether?
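
One way to get a flat-map distance: MapKit already exposes its flat (Web Mercator) projection through MKMapPoint, so you can measure straight-line distance on the projected map instead of along the globe. A minimal sketch (the function name is mine); the result is in map points rather than meters, which is fine when you only need to rank which nodes are nearest:

    import MapKit

    // Flat "as drawn on the tile image" distance between two coordinates.
    // MKMapPoint projects onto MapKit's flat world map, so plain Euclidean
    // distance in that space ignores the curvature of the earth entirely.
    func flatMapDistance(from a: CLLocationCoordinate2D,
                         to b: CLLocationCoordinate2D) -> Double {
        let pa = MKMapPoint(a)   // projected x/y in map points
        let pb = MKMapPoint(b)
        return hypot(pa.x - pb.x, pa.y - pb.y)
    }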

Related

Anything in Leaflet that is similar to isLocationOnEdge() from Google Maps?

Google Maps has the function isLocationOnEdge(point, polyline, tolerance) that takes a tolerance value in degrees and uses it to determine whether a point falls near a polyline.
Is there anything similar in Leaflet (or some plug-in) that does the same thing?
A handy library for this kind of operation is Turf.
For your case, a simple approach would be to:
Create a polygon out of your polyline using turf.buffer with an appropriate "tolerance" (Turf takes a distance along the Earth's surface, or degrees).
Check whether your point is within that polygon or not using turf.inside.
Unfortunately, turf.buffer is only an approximation; it does not take geodesy into account, so for large tolerances you will get a deformed shape.
An exact method could be to:
Instead, use turf.pointOnLine to find the nearest point on the polyline.
Then use turf.distance to measure the distance between those two points and compare it with your tolerance (or even just Leaflet's latLng.distanceTo, though you would have to convert the GeoJSON points back to Leaflet LatLngs).
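
If you want the "exact method" without Turf, it boils down to point-to-segment distance. A minimal sketch (in Swift to match the rest of this page; Turf itself is JavaScript, and all names here are mine), treating coordinates as planar, which is reasonable for small tolerances:

    import Foundation

    struct Point { var x: Double; var y: Double }

    // Distance from p to the segment a-b: project p onto the segment,
    // clamp to the endpoints, and measure to the clamped point.
    func distanceToSegment(_ p: Point, _ a: Point, _ b: Point) -> Double {
        let dx = b.x - a.x, dy = b.y - a.y
        let len2 = dx * dx + dy * dy
        guard len2 > 0 else { return hypot(p.x - a.x, p.y - a.y) }
        let t = max(0, min(1, ((p.x - a.x) * dx + (p.y - a.y) * dy) / len2))
        return hypot(p.x - (a.x + t * dx), p.y - (a.y + t * dy))
    }

    // isLocationOnEdge-style check: is p within tolerance of the polyline?
    func isNearPolyline(_ p: Point, polyline: [Point], tolerance: Double) -> Bool {
        guard polyline.count > 1 else { return false }
        for i in 0..<(polyline.count - 1) {
            if distanceToSegment(p, polyline[i], polyline[i + 1]) <= tolerance {
                return true
            }
        }
        return false
    }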

Calculate fall time in SpriteKit

I am using SpriteKit and I need to calculate the fall time of an object (since it changes depending on the screen size). The problem is that the gravity of the scene is given in m/s^2, but all distances are measured in points.
I have tried to find the conversion between points and meters, but without much success.
Any suggestions on how to deal with this?
You may be able to use something like this (a distance calculator written by another user on Stack Overflow) to calculate the distance between your node and the point directly below it near the ground (or another empty node), then plug that distance into a kinematics equation to calculate the time.
Unfortunately, I can't give you any code because I've never tried this myself.
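
For the kinematics step, constant-acceleration free fall gives d = ½·g·t², so t = √(2d/g) once distance and gravity are in the same units. A minimal sketch; the 150 points-per-meter scale is an assumption (the figure commonly cited for SpriteKit's physics world), and any consistent scale works as long as distance and gravity agree:

    import SpriteKit

    // Time to free-fall a given distance, from d = 0.5 * g * t^2.
    // pointsPerMeter = 150 is an assumed SpriteKit scale; adjust to taste.
    func fallTime(distanceInPoints: CGFloat,
                  gravity: CGFloat = 9.8,
                  pointsPerMeter: CGFloat = 150) -> TimeInterval {
        let meters = distanceInPoints / pointsPerMeter
        return TimeInterval(sqrt(2 * meters / gravity))
    }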

iPhone iOS: is it possible to create a rangefinder with 2 laser pointers and an iPhone?

I'm working on an iPhone robot that would be moving around. One of the challenges is estimating distance to objects: I don't want the robot to run into things. I saw some very expensive (~$1000) laser rangefinders, and would like to emulate one using the iPhone.
I have one or two camera feeds and two laser pointers. The laser pointers are mounted about 6 inches apart, at an angle. The angle of the lasers in relation to the cameras is known, as is the angle of the cameras to each other.
The lasers point ahead of the cameras, creating 2 dots in the camera feed. Is it possible to estimate the distance to the dots by looking at the distance between the dots in a camera image?
The lasers and the wall form a trapezoid:

      / wall \
     /        \
    /laser mount\

As the laser mount gets closer to the wall, the dots should move further apart.
Is what I'm talking about feasible? Has anyone done something like that?
Would I need one or two cameras for such calculation?
If you just don't want to run into things, rather than needing an accurate idea of the distance to them, then you could go "Dambusters" on it and simply detect when the two dots become one - that happens at a known distance from the object.
For calculation, it is probably cheaper to have four lasers instead, in two pairs, each pair at a different angle, one pair above the other. A comparison between the relative differences of the dots would then probably let you work out a reasonably accurate distance. One for Math Overflow, though.
In theory, yes, something like this can work. Google "light striping" or "structured light depth measurement" for some good discussions of using this sort of idea on a larger scale.
In practice, your measurements are likely to be crude. There are a number of factors to consider: the camera's intrinsic parameters (focal length, etc.) and extrinsic parameters will affect how the dots appear in the image frame.
With only two sample points (note that structured-light methods use lines, etc.), the environment will present difficulties for distance measurement. Surfaces that are directly perpendicular to the floor (and the direction of travel) can be handled reasonably well. Slopes and off-angle walls may be detectable, but you will find many situations that give ambiguous or incorrect distance measures.
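
For the single-camera case, laser triangulation reduces to similar triangles. A minimal sketch under simplifying assumptions (pinhole camera at the origin looking along +z; one laser at a known horizontal offset, tilted inward by a known angle; all names are mine):

    import Foundation

    // The laser ray is x(z) = offset - z * tan(laserAngle); the pinhole
    // projection is pixelX / focalLength = x / z. Solving for z gives the
    // distance to the dot. pixelX is measured from the image center and
    // focalLength is in pixels.
    func distanceToDot(pixelX: Double,
                       focalLength: Double,
                       offset: Double,
                       laserAngle: Double) -> Double {
        return offset / (pixelX / focalLength + tan(laserAngle))
    }

As the caveats above suggest, real lenses, sub-pixel dot detection, and off-angle surfaces will all erode this idealized accuracy.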

Calculate nearest point of KML polygon for iPhone app

I have a series of nature reserves that need to be plotted, as polygon overlays, on a map using the coordinates contained within KML data. I’ve found a tutorial on the Apple website for displaying KML overlays on map instances.
The problem is that the reserves vary greatly in size - from a small pond right up to several hundred kilometers across. As a result I can't use the coordinates of the center point to find the nearest reserves. Instead I need to calculate the nearest point of each reserve's polygon to find the nearest one. With the data in KML, how would I go about achieving this?
I've only managed to find one other person asking this, and no one had replied :(
Well, there are a couple of different solutions depending on your needs; the higher the accuracy required, the more work required. I like Phil's meanRadius parameter idea. That would give you a rough idea of which polygon is closest and would be pretty easy to calculate. This idea works best if the polygons are "circlish". If the polygons are very irregular in shape, this idea loses its accuracy.
From a math standpoint, here is what you want to do. Loop through all points of all polygons, calculate the distance from each point to your current coordinate, and keep track of which one is closest. There is one final wrinkle. Imagine two points forming a very long line segment, with you located one meter away from the segment's midpoint. The distance to both endpoints is large, while in fact you are very close to the polygon. So you will also need to calculate the distance from your coordinate to every line segment, which you can do in a variety of ways, outlined here:
http://www.worsleyschool.net/science/files/linepoint/distance.html
Finally, you need to ask yourself: am I in any of the polygons? If you're 10 meters away from a point on a polygon but are, in fact, inside it, you obviously need to account for that. The best way to do that is to use a ray-casting algorithm:
http://en.wikipedia.org/wiki/Point_in_polygon#Ray_casting_algorithm
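
A minimal sketch of that ray-casting test (names are mine), treating latitude/longitude as planar, which is acceptable for reserve-sized polygons away from the poles and the antimeridian:

    import CoreLocation

    // Ray casting: cast a horizontal ray from the point and count how many
    // polygon edges it crosses; an odd count means the point is inside.
    func polygonContains(_ point: CLLocationCoordinate2D,
                         polygon: [CLLocationCoordinate2D]) -> Bool {
        var inside = false
        var j = polygon.count - 1
        for i in 0..<polygon.count {
            let a = polygon[i], b = polygon[j]
            if (a.latitude > point.latitude) != (b.latitude > point.latitude) {
                let crossingLongitude = (b.longitude - a.longitude)
                    * (point.latitude - a.latitude)
                    / (b.latitude - a.latitude) + a.longitude
                if point.longitude < crossingLongitude { inside.toggle() }
            }
            j = i
        }
        return inside
    }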

Kink detection in drawn polylines

Users can sketch in my app using a very simple tool (move mouse while holding LMB). This results in a series of mousemove events and I record the cursor location at each event. The resulting polyline curve tends to be rather dense, with recorded points almost every other pixel. I'd like to smooth this pixelated polyline, but I don't want to smooth intended kinks. So how do I figure out where the kinks are?
The image shows the recorded trail (red pixels) and the 'implied' shape as a human would understand it. People tend to slow down near corners, so there is usually even more noise here than on the straight bits.
Polyline tracker http://www.freeimagehosting.net/uploads/c83c6b462a.png
What you're describing may be related to gesture recognition techniques, so you could search on them for ideas.
The obvious approach is to apply a curve fit, but that will smooth away all the interesting details and kinks. Another suggested approach is to look at speeds and accelerations, but that can get hairy (direction changes can be very fast, or very slow and deliberate).
A fairly basic but effective approach is to simplify the samples directly into a polyline.
For example, work your way through the samples from sample 1 to sample 4, and check whether all 4 samples lie within a reasonable error of the straight line between samples 1 and 4. If they do, extend this to points 1..5 and repeat until the straight line from the start point to the end point no longer provides a reasonable approximation of the curve defined by those samples. Then create a line segment up to the previous sample point and start accumulating a new line segment (see the sketch below).
You have to be careful about your thresholds when the samples are very close to each other, so you might want to adjust the sensitivity when samples are fewer than 4-5 pixels apart.
This will give you a set of straight lines that will follow the original path fairly accurately.
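
A minimal sketch of that grow-the-chord simplification (names and the default tolerance are mine):

    import Foundation

    struct Sample { var x: Double; var y: Double }

    // Deviation of p from the chord a-b (distance to the clamped projection).
    func deviation(_ p: Sample, from a: Sample, to b: Sample) -> Double {
        let dx = b.x - a.x, dy = b.y - a.y
        let len2 = dx * dx + dy * dy
        guard len2 > 0 else { return hypot(p.x - a.x, p.y - a.y) }
        let t = max(0, min(1, ((p.x - a.x) * dx + (p.y - a.y) * dy) / len2))
        return hypot(p.x - (a.x + t * dx), p.y - (a.y + t * dy))
    }

    // Grow a chord from `start`; when an interior sample deviates by more
    // than `tolerance` pixels, commit a vertex at the previous sample and
    // start a new chord there.
    func simplify(_ samples: [Sample], tolerance: Double = 2.0) -> [Sample] {
        guard samples.count > 2 else { return samples }
        var result = [samples[0]]
        var start = 0
        var end = 2
        while end < samples.count {
            let fits = (start + 1..<end).allSatisfy {
                deviation(samples[$0], from: samples[start], to: samples[end]) <= tolerance
            }
            if fits {
                end += 1                        // chord still fits; extend it
            } else {
                result.append(samples[end - 1]) // commit the previous sample
                start = end - 1
                end = start + 2
            }
        }
        result.append(samples[samples.count - 1])
        return result
    }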
If you require additional smoothing, or want to create a scalable vector graphic, you can then curve-fit from the polyline. First, identify the kinks: the places in your polyline where the angle between one line and the next is sharp (e.g. anything over 140 degrees is considered a smooth curve; anything less is considered a kink), and break the polyline at those discontinuities. Then curve-fit each of these sub-sections of the original gesture to smooth them. This will have the effect of smoothing the smooth stuff and sharpening the kinks. (You could go further and insert small smooth corner fillets instead of these sharp joints to reduce the sharpness of the joins.)
Brute force, but it may just achieve what you want.
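
To make that angle test concrete, here is a minimal sketch (names and the default threshold are mine, echoing the 140-degree example above; 180 degrees means dead straight):

    import Foundation

    struct Vertex { var x: Double; var y: Double }

    // Flag a vertex as a kink when the angle between the incoming and
    // outgoing segments drops below the threshold.
    func kinkIndices(in polyline: [Vertex],
                     smoothThreshold: Double = 140) -> [Int] {
        guard polyline.count >= 3 else { return [] }
        var kinks: [Int] = []
        for i in 1..<(polyline.count - 1) {
            let p = polyline[i - 1], q = polyline[i], r = polyline[i + 1]
            let v1x = p.x - q.x, v1y = p.y - q.y   // q -> previous vertex
            let v2x = r.x - q.x, v2y = r.y - q.y   // q -> next vertex
            let mags = hypot(v1x, v1y) * hypot(v2x, v2y)
            guard mags > 0 else { continue }
            let cosine = max(-1, min(1, (v1x * v2x + v1y * v2y) / mags))
            if acos(cosine) * 180 / .pi < smoothThreshold { kinks.append(i) }
        }
        return kinks
    }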
Rather than trying to do this from the resulting data, have you considered looking at the timing of the data as it comes in? If the mouse stops or slows noticeably, use the trend since the last 'kink' (the last time the mouse slowed) to establish the direction of travel. If the user heads off in a new direction, call it a kink; otherwise, ignore the current slowing trend and wait for the next one.
Well, one way would be to use a true curve-fitting algorithm. Generate a Bézier curve (with exact endpoints, using Catmull-Rom or something similar), then optimize and recursively subdivide (using distance from the actual line points as a cost metric). This may be too complicated for your use case, though.
Record the order in which the pixels are drawn. Then compute the slope between pixels that are "near" but not "close". I'm guessing a graph of the slope between pixel(i) and pixel(i+7) might exhibit easily identifiable "jumps" around kinks in the curve.