Create a circle around a user-inputted point and plot it on a map with custom LatLon points - tableau-api

I have a question about whether something is possible using Tableau.
I already have a coastline plotted on one map using custom LatLon coordinates, and I would like to take a user-inputted Lat and Lon and plot a circle around it with, say, a radius of 10, and display it on the same map.
I was using this tutorial before to plot a circle:
https://www.crowdanalytix.com/communityBlog/customers-within-n-miles-radius-analysis-using-tableau
But I don't think the same approach can work with user-inputted fields, because it would require restructuring the data.

Okay, a (much smarter, LOL) coworker helped me figure this out.
So my goal was to graph a distance band (e.g., everything within 5 miles of a coast). Since consecutive coastline points are connected by straight lines, not curves, we can use the segment between two coastline points directly: find the point a fixed distance away along the perpendicular to each segment, then connect those offset points. Much easier than my circle idea.
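A minimal sketch of that offset calculation, assuming the coastline segments are short enough to treat lat/lon as a locally flat grid (the function names and the km-per-degree constant are illustrative, not part of any Tableau feature):

```typescript
// Sketch: offset each coastline point by bandKm along the perpendicular to
// its segment, treating lat/lon as a locally flat grid (reasonable only for
// short segments).
interface LatLon { lat: number; lon: number; }

const KM_PER_DEG_LAT = 111.32; // approximate km per degree of latitude

function offsetBand(coast: LatLon[], bandKm: number): LatLon[] {
  const band: LatLon[] = [];
  for (let i = 0; i < coast.length - 1; i++) {
    const a = coast[i];
    const b = coast[i + 1];
    const latScale = Math.cos((a.lat * Math.PI) / 180); // shrink of a longitude degree
    // Segment direction in approximate km.
    const dx = (b.lon - a.lon) * KM_PER_DEG_LAT * latScale;
    const dy = (b.lat - a.lat) * KM_PER_DEG_LAT;
    const len = Math.hypot(dx, dy);
    if (len === 0) continue;
    // Unit normal to the segment; flip the signs to put the band on the other side.
    const nx = -dy / len;
    const ny = dx / len;
    band.push({
      lat: a.lat + (ny * bandKm) / KM_PER_DEG_LAT,
      lon: a.lon + (nx * bandKm) / (KM_PER_DEG_LAT * latScale),
    });
  }
  return band;
}
```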

Related

Get route from a coordinate to a random distance

I have this specific use case I can't figure out how to implement.
I need to trace a route from a starting point out to a random distance from that point.
The goal is, for instance, to suggest a route users can take if they want to walk 10 km from their starting point, or from any point on a map.
Any ideas? Or is there a free/paid service for such a use case?
Thank you for your help
One approach you could try is creating an isodistance polygon, then randomly choosing a direction from the center and finding the intersection with the shape.
This blog post generates an isodistance shape by calculating the distance to a grid of nearby points, then producing a concave hull around the perimeter.
https://blog.mapbox.com/dive-deeper-into-isodistances-c90bd5df9215
You could then randomly select a direction and draw a line to intersect that shape. One approach is to draw a line from the center pointing north, and then use http://turfjs.org/docs/#transformRotate to rotate it by a random angle.
Then use https://turfjs.org/docs/#lineIntersect to find the intersection point.
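A rough sketch of those three steps with turf.js; the start point, the ray length, and the hard-coded polygon are placeholders, and in practice the polygon would come from the isodistance generation described in the blog post:

```typescript
import * as turf from "@turf/turf";

// Placeholder start point and isodistance polygon.
const start = turf.point([-0.1278, 51.5074]);
const isodistance = turf.polygon([[
  [-0.20, 51.45], [-0.05, 51.45], [-0.05, 51.56], [-0.20, 51.56], [-0.20, 51.45],
]]);

// 1. Draw a line from the center due north, longer than the shape's radius.
const north = turf.destination(start, 50, 0, { units: "kilometers" });
const ray = turf.lineString([turf.getCoord(start), turf.getCoord(north)]);

// 2. Rotate it around the start point by a random bearing (transformRotate).
const bearing = Math.random() * 360;
const rotated = turf.transformRotate(ray, bearing, { pivot: turf.getCoord(start) });

// 3. Find where the rotated ray crosses the isodistance shape (lineIntersect).
const hits = turf.lineIntersect(rotated, isodistance);
console.log(hits.features.map((f) => turf.getCoord(f)));
```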

SQL Server 2008 R2 large polygons along latitude

Working on the Pacific Ocean, I am dealing with huge polygons covering the whole area. Some of them are quite simple and are defined by 4 points in my shapefile.
However, when I import them into SQL Server 2008 R2 as new geographies, due to the shape of the Earth I end up with curved lines, while I would like the north and south boundaries to stick to specific latitudes: for example, the north boundary should follow the 30N latitude from 120E to 120W.
How can I force my polygons to follow the latitudes? Converting them to geometry could have been an option, but since I will need to do length and area calculations, I need to keep them as geography.
Do I need to add additional vertices along my boundaries to force the polygon to stay on a specific latitude? What should be the interval between each vertex?
Thanks for your help
Sylvain
You have already answered this yourself. Long edges between coordinates at the same latitude will appear curved, because geography edges follow the Earth's curvature (great circles) rather than the parallel. Therefore, if you need to "anchor" them along a specific latitude, you will need to manually insert points. As for the interval, there's no right or wrong; a little experimentation here (and deciding how strictly you want it hugging the line) will give you the result you desire. One vertex per degree should do it, and might even be a little overkill.
That said, I do question why you would want to anchor them to create a projected "straight" line, as this will skew the results of length and area calculations; the bigger the polygon, the bigger the skew.
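As a rough illustration of the densification step (the function name and the one-degree default are illustrative, not SQL Server functionality), something like this could generate the extra vertices before the boundary is written into the polygon's WKT:

```typescript
// Emit a vertex every stepDeg degrees of longitude along a fixed latitude,
// so the geography edge hugs the parallel instead of the great circle.
function densifyParallel(
  latDeg: number,
  lonStart: number,
  lonEnd: number,
  stepDeg = 1
): [number, number][] {
  const pts: [number, number][] = [];
  const dir = lonEnd >= lonStart ? 1 : -1;
  for (let lon = lonStart; dir > 0 ? lon < lonEnd : lon > lonEnd; lon += dir * stepDeg) {
    pts.push([lon, latDeg]);
  }
  pts.push([lonEnd, latDeg]);
  return pts;
}

// e.g. the 30N boundary from 120E to 120W, stepping eastwards across the date
// line (120 -> 240, then wrapping longitudes above 180 back into -180..180):
const north30 = densifyParallel(30, 120, 240)
  .map(([lon, lat]) => [lon > 180 ? lon - 360 : lon, lat]);
```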

Get intermediate points over a MKPolyline

I have an MKPolyline with two points (a start and an ending point) on an MKMapView. Is there any way to get some intermediate points (or coordinates) along the line, or to split the line into several segments?
I want something like this: http://i.imgur.com/qcbS9.png, where the black endpoints are the start and end points of the line and the red points are the ones I want to get. Sorry for the bad drawing, but I made it in an online drawing tool.
Thank you
Are the lines you're interpolating quite short, geographically? If so, you can just scale linearly along the line. If you want 10 segments, work out the difference between the start and end points' latitude values, and the same for the longitude. After your existing start point, the next point will be (lat + 0.1*latDif, lng + 0.1*lngDif), then (lat + 0.2*latDif, lng + 0.2*lngDif), and so on. All pretty simple, so long as you're prepared to assume the coordinates lie on a uniform grid, which they don't really, but it might be fine on a city-scale map.
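A minimal sketch of that interpolation, using a plain {latitude, longitude} pair rather than any particular MapKit type:

```typescript
// Linear interpolation between two coordinates, assuming a locally flat grid
// as described above.
interface Coordinate { latitude: number; longitude: number; }

function intermediatePoints(start: Coordinate, end: Coordinate, segments: number): Coordinate[] {
  const latDif = end.latitude - start.latitude;
  const lngDif = end.longitude - start.longitude;
  const points: Coordinate[] = [];
  for (let i = 1; i < segments; i++) {
    const t = i / segments; // 0.1, 0.2, ... for 10 segments
    points.push({
      latitude: start.latitude + t * latDif,
      longitude: start.longitude + t * lngDif,
    });
  }
  return points;
}
```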

How to determine if a latitude & longitude is within an ellipse

I have data describing a rotated ellipse (the center of the ellipse in latitude longitude coordinates, the lengths of the major and minor axes in kilometers, and the angle that the ellipse is oriented). I do not know the location of the foci, but assume there is a way to figure them out somehow. I would like to determine if a specific latitude longitude point is within this ellipse. I have found a good way to determine if a point is within an ellipse on a Cartesian grid, but don't know how to deal with latitude longitude points.
Any help would be appreciated.
-Cody O.
The standard way of doing this on a Cartesian plane would be with a ray-casting algorithm. Since you're on a sphere, you will need to use great circle distances to accurately represent the ellipse.
EDIT: The standard ray-casting algorithm will work on your ellipse, but its accuracy depends on a) how small your ellipse is, and b) how close to the equator it is. Keep in mind, you'd have to be aware of special cases like the date line, where it goes from 179 -> 180/-180 -> -179.
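For reference, a standard even-odd ray-casting test looks roughly like this; the ellipse would first be approximated by a ring of vertices, and the coordinates are treated as planar, so the caveats above about size, latitude, and the date line apply:

```typescript
// Even-odd ray-casting test for a point inside a polygon ring (planar).
function pointInRing(px: number, py: number, ring: [number, number][]): boolean {
  let inside = false;
  for (let i = 0, j = ring.length - 1; i < ring.length; j = i++) {
    const [xi, yi] = ring[i];
    const [xj, yj] = ring[j];
    // Does the horizontal ray from (px, py) cross the edge (j, i)?
    const crosses =
      (yi > py) !== (yj > py) &&
      px < ((xj - xi) * (py - yi)) / (yj - yi) + xi;
    if (crosses) inside = !inside;
  }
  return inside;
}
```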
Since you already have a way to solve the problem on a Cartesian grid, I would just convert your points to UTM coordinates. The points and lengths will then all be in meters, and the check should be easy. Lots of MATLAB code is available to do this conversion from LL to UTM. Like this.
You don't mention how long the axes of the ellipse are in the description. If they are very long (say hundreds of km), this approach may not work for you and you will have to resort to thinking about great circles and so on. You will have to make sure to specify the UTM zone to which you are converting. You want all your points to end up in the same UTM zone or you won't be able to relate the points.
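A sketch of the projection step using proj4 in place of the MATLAB utilities mentioned above (the function names and placeholder coordinates are illustrative):

```typescript
import proj4 from "proj4";

// Pick one UTM zone (from the ellipse center) and use it for every point, so
// they all land in the same planar coordinate system, in meters.
function utmZone(lonDeg: number): number {
  return Math.floor((lonDeg + 180) / 6) + 1; // standard UTM zone formula
}

function toUTM(latDeg: number, lonDeg: number, zone: number): [number, number] {
  const utm = `+proj=utm +zone=${zone} +datum=WGS84 +units=m +no_defs`;
  return proj4("WGS84", utm, [lonDeg, latDeg]) as [number, number];
}

// e.g. project the ellipse center and the query point with the same zone:
const zone = utmZone(-105.3);               // placeholder center longitude
const [cx, cy] = toUTM(40.0, -105.3, zone); // placeholder ellipse center
const [px, py] = toUTM(40.1, -105.2, zone); // placeholder query point
```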
After some more research into my problem and posting in another forum, I was able to figure out a solution. My ellipse is relatively small, so I assumed it was a true (flat) ellipse. I was able to locate the lat/lon of the foci of the ellipse; then, if the sum of the distances from the point of interest to each focus is less than 2a (twice the semi-major axis), the point is within the ellipse. Thanks for the suggestions though.
-Cody
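A sketch of that focal-distance check on the flat-ellipse assumption, taking inputs that have already been projected to a planar system such as UTM (the names here are illustrative):

```typescript
// Focal-distance test: a point is inside the ellipse when the sum of its
// distances to the two foci is at most 2a. Inputs are planar (e.g. meters);
// angleRad is the orientation of the major axis, measured from the x-axis.
interface XY { x: number; y: number; }

function isInsideEllipse(p: XY, center: XY, semiMajor: number, semiMinor: number, angleRad: number): boolean {
  // Distance from the center to each focus: c = sqrt(a^2 - b^2).
  const c = Math.sqrt(semiMajor * semiMajor - semiMinor * semiMinor);
  const f1: XY = { x: center.x + c * Math.cos(angleRad), y: center.y + c * Math.sin(angleRad) };
  const f2: XY = { x: center.x - c * Math.cos(angleRad), y: center.y - c * Math.sin(angleRad) };
  const dist = (u: XY, v: XY) => Math.hypot(u.x - v.x, u.y - v.y);
  return dist(p, f1) + dist(p, f2) <= 2 * semiMajor;
}
```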

Fast way to convert array of points into triangle strip?

I have an array of CGPoints (basic struct with two floats: x and y). I want to use OpenGL ES to draw a textured curve using these points. I can do this fine with just two points, but it gets harder when I need to make a line from several points.
Currently I draw a line horizontally, calculate its angle from the points given, and then rotate it. I don't think doing this for all lines in a curve is a good idea. There's probably a faster way.
I'm thinking that I can "enlarge" or "constrict" all the points at once to make a curve with some sort of width.
I'm not positive what you want to accomplish, but consider this:
Based on an ordered list of points, you can draw a polyline using those points. If you want a polyline with a 2D texture on it, you can draw a series of quadrilaterals (using two triangles each, of course). You can generate these quadrilaterals using an idea similar to Catmull-Rom spline generation.
Consider a series of points p[i-1], p[i], p[i+1]. Now, for each i, you can find two points, each an epsilon distance away from p[i], along the line perpendicular to the line connecting p[i-1] and p[i+1]. You can determine the two points for the endpoints in various ways, such as using the perpendicular to the line from p[0] to p[1].
I'm not sure if this will be faster than your method, but you should be caching the results. If you are planning on doing this every frame, another type of solution to your problem may be needed.
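A sketch of that perpendicular-offset construction as plain 2D math (no OpenGL calls; the names are illustrative), emitting two vertices per input point in the alternating order a triangle strip expects:

```typescript
// For each point, offset by +/- halfWidth along the normal of the line
// through its neighbours; endpoints fall back to their single neighbour.
interface Pt { x: number; y: number; }

function stripVertices(points: Pt[], halfWidth: number): Pt[] {
  const out: Pt[] = [];
  for (let i = 0; i < points.length; i++) {
    const prev = points[Math.max(i - 1, 0)];
    const next = points[Math.min(i + 1, points.length - 1)];
    const dx = next.x - prev.x;
    const dy = next.y - prev.y;
    const len = Math.hypot(dx, dy) || 1; // guard against degenerate segments
    const nx = -dy / len;
    const ny = dx / len;
    // Two vertices per point, alternating sides: ready to draw as a strip.
    out.push({ x: points[i].x + nx * halfWidth, y: points[i].y + ny * halfWidth });
    out.push({ x: points[i].x - nx * halfWidth, y: points[i].y - ny * halfWidth });
  }
  return out;
}
```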