Get XY Tile Coordinate at Z Zoom Level with Leaflet

I have figured out how to get XYZ coordinates by extending Leaflet with a createTile function.
But what I want to know is: how do I get the XY tile name/coordinate at a fixed Z zoom level around my GPS coordinates, even if I'm not zoomed in to that level?
Why? I'm working on a P2P/decentralized version of Uber, and the XY tile coordinates make a good common/shared location index for users to look up, subscribe, and query against. That is, everybody within that X mile radius will derive the same XY coordinate name and can use it as a deterministic key to find each other.

This project will "Convert lon, lat to screen pixel x, y from 0, 0 origin, at a certain zoom level." https://github.com/mapbox/sphericalmercator
UPDATED:
function lng2tile(lon, z) { return Math.floor((lon + 180) / 360 * Math.pow(2, z)); } // tile column (X)
function lat2tile(lat, z) { return Math.floor((1 - Math.log(Math.tan(lat * Math.PI / 180) + 1 / Math.cos(lat * Math.PI / 180)) / Math.PI) / 2 * Math.pow(2, z)); } // tile row (Y), Web Mercator
Or try this (note: these formulas split the world on an equirectangular grid with 2^z columns and 2^(z-1) rows, not the Web Mercator grid that OSM/Leaflet tiles use; also, longitude determines the column and latitude the row):
var col = Math.floor((location.lng + 180) / (360 / Math.pow(2, zoomLevel)));
var row = Math.floor((90 - location.lat) / (180 / Math.pow(2, zoomLevel - 1)));
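To make the shared-key use case concrete, here is a minimal sketch using lng2tile/lat2tile from the UPDATED block (the key format and the zoom level are illustrative choices, not part of Leaflet):

// Build a deterministic key from the slippy-map tile indices at a fixed zoom.
// z = 14 is arbitrary; pick a zoom whose tile size matches the radius users
// should share (a z = 14 tile is ~2.4 km wide near the equator).
function tileKey(lat, lon, z) {
  var x = lng2tile(lon, z); // column index, 0 .. 2^z - 1
  var y = lat2tile(lat, z); // row index, 0 .. 2^z - 1
  return z + "/" + x + "/" + y;
}

// Every peer inside the same tile derives the same key:
var key = tileKey(40.7484, -73.9857, 14); // "14/4824/6157"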

Related

How to correctly find UV on sphere

I have a sphere and a texture for it.
The texture consists of 16 tiles of zoom = 2 from OSM. Tile size is 256x256.
At the top and bottom I added space to cover the latitude ranges [90, 85.0511] and [-85.0511, -90], proportionally, so the texture size was 1024x1083.
I also tried the texture without these two strips; its size was 1024x1024 (map tiles only).
The problem is that after UV mapping, objects are distorted along the Y-axis: smaller at the equator and bigger at the poles.
There are two kinds of formulas I tried:
u = (lon + 180) / 360; // lon = [-180, 180]
v = (lat + 90) / 180; // lat = [-85.0511, 85.0511]
----
u = Math.atan2(z, x) / (2 * Math.PI) + 0.5; // x, y, z are vertex coordinates
v = Math.asin(y) / Math.PI + 0.5;
I tried all 8 variations: two textures, two u-formulas and two v-formulas.
The result is like the image above, or worse.
What am I doing wrong? Is it the texture, the UV formulas, or something else?
P.S.: for the poles (vertices in the lat ranges [-90, -85.0511] and [85.0511, 90]) the fragment shader uses a solid color instead of sampling the texture.
OSM uses the Web Mercator projection; see also the OSM wiki.
The conversion from world (x,y,z) to texture (u,v) coordinates would be:
lon = atan2(y, x)
lat = atan2(z, sqrt(x*x+y*y))
u = (lon + pi)/(2*pi)
v = (log(tan(lat/2 + pi/4)) + pi)/(2*pi)
(I assume that z points north like in WGS-84 and all coordinates are right-handed.)
This projection doesn't cover the entire sphere: as the latitude approaches the poles, the v coordinate blows up to infinity. Therefore extending the map to the north or south direction is not going to be helpful.
Instead, keep the original square 1024x1024 texture and render a texture-mapped sphere capped at ±85.051129° latitude (that's where v = 0 and v = 1), using the above coordinate mapping.
Alternatively (and this is more in-line with Web Mercator spirit), render each tile regular in the UV coordinates, and calculate the XYZ coordinates by reversing the above transformation.
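For reference, a small sketch of that forward mapping in JavaScript (assuming a unit sphere with z pointing north, per the note above):

// Convert a unit-sphere vertex (x, y, z) to Web Mercator UV coordinates.
// With the sphere capped at ±85.051129° latitude, v stays within [0, 1].
function xyzToUV(x, y, z) {
  var lon = Math.atan2(y, x);                        // [-pi, pi]
  var lat = Math.atan2(z, Math.sqrt(x * x + y * y)); // [-pi/2, pi/2]
  var u = (lon + Math.PI) / (2 * Math.PI);
  var v = (Math.log(Math.tan(lat / 2 + Math.PI / 4)) + Math.PI) / (2 * Math.PI);
  return { u: u, v: v };
}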

Compute coordinate positions with projection

Given 2 coordinates (point 1 and 2 in red) in WGS84 I need to find the coordinates of the point perpendicular (point 3) to the line at a given distance.
I managed to work out the math to compute this perpendicular point, but when displayed on the map the point appears in the wrong place, probably because of the projection.
What I want on a map:
And what I have instead on the map:
How can I take into account the projection so that the point on the map appears perpendicular to the line? The algorithm below to compute the point comes from here: https://math.stackexchange.com/questions/93424/calculate-rectangle-coordinates-from-line-and-height
public static Coords ComputePerpendicularPoint(Coords first, Coords last, double distance)
{
    // Slope of the perpendicular in degree space (negative reciprocal of the segment slope).
    double slope = -(last.Lon.Value - first.Lon.Value) / (last.Lat.Value - first.Lat.Value);

    // Number of km per degree = ~111 km (111.32 in Google Maps, but the range
    // varies between 110.567 km at the equator and 111.699 km at the poles).
    // 1 km in degrees = 1 / 111.32 = 0.0089
    // 1 m in degrees = 0.0089 / 1000 = 0.0000089
    distance = distance * 0.0000089 / 100; // 0.0000089 ~ 1 m in WGS84; /100 because distance is in cm

    double t = distance / Math.Sqrt(1 + (slope * slope));

    Coords perp_coord = new Coords();
    perp_coord.Lon = first.Lon + t;
    perp_coord.Lat = first.Lat + (t * slope);
    return perp_coord;
}
Thank you in advance!
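For reference, a minimal sketch of one way to account for the distortion (a small-distance approximation; the function and names are illustrative, not from this question): scale longitude offsets by cos(latitude) so the perpendicular is computed in a locally metric frame, then convert back to degrees.

// Compute a point perpendicular to the segment p1 -> p2, at distanceM meters
// from p1. Local tangent-plane approximation: 1 deg latitude ~ 111320 m,
// 1 deg longitude ~ 111320 * cos(latitude) m. Valid for short distances only.
function perpendicularPoint(p1, p2, distanceM) {
  var mPerDegLat = 111320;
  var mPerDegLon = 111320 * Math.cos(p1.lat * Math.PI / 180);

  // Segment direction expressed in meters.
  var dx = (p2.lon - p1.lon) * mPerDegLon;
  var dy = (p2.lat - p1.lat) * mPerDegLat;
  var len = Math.sqrt(dx * dx + dy * dy);

  // Unit normal: the direction rotated by 90 degrees.
  var nx = -dy / len;
  var ny = dx / len;

  // Step distanceM along the normal and convert back to degrees.
  return {
    lon: p1.lon + (nx * distanceM) / mPerDegLon,
    lat: p1.lat + (ny * distanceM) / mPerDegLat
  };
}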

Get the exact satellite image for a given Lat/Long bbox rectangle?

For a visualization I need an optical satellite image for a specific rectangular AOI, defined by two lat/long coordinates. I tried the Mapbox Static Images API, which takes a lat/long bounding box and an output resolution in width/height pixels. The problem is that, as far as I can tell, if the aspect ratio of the lat/long box does not match the w/h pixel ratio, the API pads the lat/long bounding box to fill the requested pixel dimensions.
And this would prevent me from combining the optical image with the other data, because I would not know which image pixel would (roughly) correspond to which lat/long coordinate.
I see three "solutions", but I don't know how to achieve any of them:
1. "Make" Mapbox return the images without padding.
2. Compute the correct w/h pixel ratio from the lat/long coordinates, so there would be no padding. Maybe with the https://en.wikipedia.org/wiki/Equirectangular_projection as discussed here: https://stackoverflow.com/a/16271669/380038?
3. Find a way to determine the lat/long coordinates of the returned satellite image, so I can cut off any padding.
I checked How can I extract a satellite image from google maps given a Lat Long Rectangle?, but I would prefer to use my existing paid Mapbox account and I got the impression that I still wouldn't get the exact optical image or the exact corner coordinates of the optical image.
The Mapbox Static Images API serves maps.
You have an optical image from another source.
You want to overlay these data.
Right?
Note the Red and Green pins: the waypoints are at opposite corners on Mapbox.
After the equirectangular correction, Mapbox matches OpenStreetMap (little wonder), but the Google coordinates are quite close too.
curl -g "https://api.mapbox.com/styles/v1/mapbox/streets-v11/static/[17.55490,47.10434,17.55718,47.10543]/600x419?access_token=YOUR_TOKEN_HERE" --output example-walk-600x419-nopad.png
What is your scale? 1 km - 100 km?
What is your source for the optical image?
What is the required accuracy?
Just to mention: optical images have their own sources of distortion.
In practice:
You must have the extent of your non-optical satellite data (let's preserve the mist around it...). I'll call it ((x1, y1), (x2, y2)). We are coders, not cartographers, right!?
If you feed your extent to https://docs.mapbox.com/playground/static/ as
min longitude = x1, min latitude = y1, max longitude = x2, max latitude = y2
and select the "Bounding box" entry: do you see the map around your data? Don't mind the exact dimensions, just check whether the map relates to your data. Maybe you have to swap some values to get to the right corner of the globe.
If you have the right ((x1, y1), (x2, y2)) coordinates, do the equirectangular transformation to get the right pixel size.
You've called it Solution #2.
Let's say the width of your non-optical satellite data is Wd and the height is Hd.
The Mapbox image will fit your data if you ask for a width of Wm and a height of Hm, where
Wm = Wd
Hm = Wd * (y2 - y1) / ((x2 - x1) * cos(y1))    // y1, the latitude, in radians
(Note the cos(latitude) term: degrees of longitude shrink with latitude, so it belongs with the (x2 - x1) span.)
Now you can pull the mapbox by
curl -g "https://api.mapbox.com/styles/v1/mapbox/streets-v11/static/[<x1>,<y1>,<x2>,<y2>]/<Wm>x<Hm>?access_token=<YOUR_TOKEN>" --output overlay.png
If (Hd == Hm)
then { you are lucky :) the two images just fit each other }
else { the two images cover the same area, but you have to scale the height of one of them to make them match }
Well... almost. You have not revealed what size of area you want to cover. The equation above is just an approximation that works up to the size of a smaller country (~100 km or so). At continent scale you would probably have to apply more accurate formulas.
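A sketch of that sizing in JavaScript (the bbox reuses the example above; note that the latitude has to be in radians for the cosine):

// Given a bbox ((x1, y1), (x2, y2)) in (lon, lat) and the pixel width Wd of
// the non-optical data, compute the Mapbox image size with the
// equirectangular aspect correction (longitude degrees shrink by cos(lat)).
function mapboxSize(x1, y1, x2, y2, Wd) {
  var latRad = y1 * Math.PI / 180;
  var Wm = Wd;
  var Hm = Math.round(Wd * (y2 - y1) / ((x2 - x1) * Math.cos(latRad)));
  return { width: Wm, height: Hm };
}

var size = mapboxSize(17.55490, 47.10434, 17.55718, 47.10543, 600);
// size ~ { width: 600, height: 421 }, close to the 600x419 example above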
In my opinion, your #2 idea is the way to go. You do have the LLng bbox, so all that remains is calculate its "real" size in pixels.
Let us say that you want (or can allow, or can afford) a resolution of 50m per pixel, and the area is small enough not to have distortions (i.e., a rectangle of say 1 arcsecond of latitude and 1 arcsecond of longitude has top and bottom sides of the same length, with an error less than your chosen resolution). These are, I believe, very loose requisites and easy to fulfill.
Then, you just need to calculate the distance between the (Lat1, Lon1) and (Lat1, Lon2) points, and between (Lat1, Lon1) and (Lat2, Lon1). Divide those distances in meters by 50, and you'll get the exact number of pixels:
Lon1 Lon2
Lat1 +---------------+
| |
| |
Lat2 +---------------+
And you have a formula for that - the haversine formula.
If you need higher precision, you could resort to Vincenty's formula for an oblate spheroid (there is a Javascript library for it). On the MT site (first link) there is a live calculator that you can use to plug in data from your calls and verify whether the approach is indeed working. I.e., you plug in your bounding box, get the distance in meters, divide, and get the pixel size of the image. (If the image is good, chances are you can go with the simpler haversine. If it isn't, then there has to be some further quirk in the maps API, perhaps its projection, that doesn't return the expected bounding box. But that seems unlikely.)
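As an illustration, a sketch of the haversine-based pixel calculation (spherical-Earth approximation; the 50 m/px resolution is the example figure above, and the corner values are just the earlier bbox):

// Haversine distance in meters between two lat/lon points (spherical Earth).
function haversineM(lat1, lon1, lat2, lon2) {
  var R = 6371000; // mean Earth radius, meters
  var toRad = Math.PI / 180;
  var dLat = (lat2 - lat1) * toRad;
  var dLon = (lon2 - lon1) * toRad;
  var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
          Math.cos(lat1 * toRad) * Math.cos(lat2 * toRad) *
          Math.sin(dLon / 2) * Math.sin(dLon / 2);
  return 2 * R * Math.asin(Math.sqrt(a));
}

var lat1 = 47.10543, lon1 = 17.55490; // top-left corner
var lat2 = 47.10434, lon2 = 17.55718; // bottom-right corner

// Pixel dimensions of the bbox at 50 m per pixel.
var widthPx  = haversineM(lat1, lon1, lat1, lon2) / 50;
var heightPx = haversineM(lat1, lon1, lat2, lon1) / 50;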
I've had this exact problem when using a satellite image on an Apple Watch. I overlay some markers and a path, and I convert everything from coordinates to pixels. Below is my code to determine the exact bbox result:
import Foundation

var maxHoleLat = 52.5738902
var maxHoleLon = 4.9577606
var minHoleLat = 52.563994
var minHoleLon = 4.922364
var mapMaxLat = 0.0
var mapMaxLon = 0.0
var mapMinLat = 0.0
var mapMinLon = 0.0
let token = "your token"
var resX = 1000.0
var resY = 1000.0
let screenX = 184.0
let screenY = 224.0 // 448/2 = 224 - navbarHeight
let navbarHeight = 0.0
var latDist = 111000.0
var lonDist = 111000.0
var dx = 0.0
var dy = 0.0
var imageUrl: URL? // declared here so the snippet compiles on its own

func latLonDist() {
    // calgary.rasc.ca/latlong.htm
    let latRad = maxHoleLat * .pi / 180
    // distance covered by 1 degree of longitude at the given latitude
    lonDist = 111412.88 * cos(latRad) - 0.09350 * cos(3 * latRad) + 0.00012 * cos(5 * latRad)
    print("lonDist = \(lonDist)")
    // distance covered by 1 degree of latitude at the given latitude
    latDist = 111132.95 - 0.55982 * cos(2 * latRad) + 0.00117 * cos(4 * latRad)
    print("latDist = \(latDist)")
}

func getMapUrl() {
    dx = (maxHoleLon - minHoleLon) * lonDist
    dy = (maxHoleLat - minHoleLat) * latDist
    // The map is square, but the hole is not:
    // if the hole spans less distance in x than in y, widen the lon range...
    if dx < dy {
        mapMaxLat = maxHoleLat
        mapMinLat = minHoleLat
        let midLon = (maxHoleLon + minHoleLon) / 2
        mapMaxLon = midLon + dy / 2 / lonDist
        mapMinLon = midLon - dy / 2 / lonDist
    } else {
        // ...otherwise widen the lat range.
        mapMaxLon = maxHoleLon
        mapMinLon = minHoleLon
        let midLat = (maxHoleLat + minHoleLat) / 2
        mapMaxLat = midLat + dx / 2 / latDist
        mapMinLat = midLat - dx / 2 / latDist
    }
    imageUrl = URL(string: "https://api.mapbox.com/styles/v1/mapbox/satellite-v9/static/[\(mapMinLon),\(mapMinLat),\(mapMaxLon),\(mapMaxLat)]/1000x1000?logo=false&access_token=\(token)")
    print("\(String(describing: imageUrl))")
}
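The coordinates-to-pixels conversion mentioned above is not shown; it could look roughly like this sketch (linear interpolation in degrees, an approximation that holds well at this ~1 km scale):

// Approximate pixel position of a coordinate inside the returned 1000x1000
// image; bbox = { minLat, minLon, maxLat, maxLon } as computed in getMapUrl().
function toPixel(lat, lon, bbox) {
  var x = (lon - bbox.minLon) / (bbox.maxLon - bbox.minLon) * 1000;
  var y = (bbox.maxLat - lat) / (bbox.maxLat - bbox.minLat) * 1000; // image y grows downward
  return { x: x, y: y };
}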

Map distance to zoom in Google Static Maps

I am using Google Static Maps to display maps in my AppleTV app. What I need is to somehow map a distance of e.g. 1km to the zoom parameter of the Static Maps API.
In other words, I have an imageView in which I wish to load the map image. If I know the height of my imageView is 400px and I wish the map to show a real Earth surface of 1000m from North to South, how would I tell the API to return a map with exactly this zoom?
I found a very similar question here, however no suitable answer is provided.
As stated in the Google Maps documentation, the basic Mercator Google Maps tile is 256 x 256 pixels, and at zoom level n the map is 2^n tiles across.
Meaning that at zoom level 2, the map measures 256 * 2² = 1024px in each direction.
Taking into account that the Earth has a circumference of ~40,000 km, at zoom 0 every pixel ~= 40,000 km / 256 = 156.25 km.
At zoom 9 the map is 256 * 2⁹ = 131072px across: 1px = 40,000 km / 131072 = 0.305 km... and so on.
If we want 400px = 1km, we have to choose the closest approximation possible, so: 1px = 1km/400 = 0.0025km
I tried zoom = 15 and obtained 1px = 0.00478 and zoom = 16 that gave me 1px = 0.00238km
Meaning that you should use zoom = 16, which gives you 0.955 km per 400px at the Equator, and only along the x axis.
As you go north or south in latitude, the circumference of the parallel shrinks, so the distance per pixel changes. The correlation along the y axis changes as well, since projecting a sphere is tricky.
If you want to calculate the exact conversion with a function, Google's documentation provides this example (note that it describes a custom Gall-Peters tile set, not the Web Mercator projection the standard tiles use; GALL_PETERS_RANGE_X/Y are constants defined elsewhere in that example):
// Describe the Gall-Peters projection used by these tiles.
gallPetersMapType.projection = {
fromLatLngToPoint: function(latLng) {
var latRadians = latLng.lat() * Math.PI / 180;
return new google.maps.Point(
GALL_PETERS_RANGE_X * (0.5 + latLng.lng() / 360),
GALL_PETERS_RANGE_Y * (0.5 - 0.5 * Math.sin(latRadians)));
},
fromPointToLatLng: function(point, noWrap) {
var x = point.x / GALL_PETERS_RANGE_X;
var y = Math.max(0, Math.min(1, point.y / GALL_PETERS_RANGE_Y));
return new google.maps.LatLng(
Math.asin(1 - 2 * y) * 180 / Math.PI,
-180 + 360 * x,
noWrap);
}
};
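Since the standard tiles are Web Mercator, a simpler route is to derive the zoom directly from the target ground resolution. A sketch (the constant 156543.03392 m/px is the zoom-0 resolution of a 256px tile at the Equator, i.e. ~40,075 km / 256):

// Pick the zoom whose resolution best matches `meters` of ground distance
// shown across `pixels` of image, at latitude `latDeg`.
// Web Mercator: metersPerPixel = 156543.03392 * cos(lat) / 2^zoom
function zoomForDistance(meters, pixels, latDeg) {
  var target = meters / pixels; // e.g. 1000 m over 400 px = 2.5 m/px
  var latRad = latDeg * Math.PI / 180;
  var zoom = Math.log2(156543.03392 * Math.cos(latRad) / target);
  return Math.round(zoom); // the Static Maps API takes integer zoom levels
}

zoomForDistance(1000, 400, 0); // 16 at the Equator, matching the estimate above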

How to draw real world coordinates rotated relative to the device around a center coordinate?

I'm working on a simple location-aware game where the current location of the user is shown on a game map, as well as the locations of other players around him. It's not using MKMapView but a custom game map with no streets.
How can I translate the lat/long coordinates of the other players into CGPoint values to place them on the world-scale game map, with a fixed scale like 50 meters = 50 points on screen, and orient all the points so the user can see in which direction he would have to go to reach another player?
The key goal is to generate CGPoint values from lat/long coordinates for a flat top-down view, but orient the points around the user's current location, similar to the orient-map feature (the arrow) of Google Maps, so you know where everything is.
Are there frameworks which do the calculations?
First you have to transform lon/lat to Cartesian x, y in meters.
Next is the direction to the other players. The direction is (dx, dy), where dy = player2.y - me.y, and the same for dx. Normalize dx and dy by dividing by the distance between player2 and me.
You receive
ny = dy / sqrt(dx*dx + dy*dy)
nx = dx / sqrt(dx*dx + dy*dy)
Multiply by 50. Now you have a point 50 m in the direction of player2:
comp2x = 50 * nx;
comp2y = 50 * ny;
Now center the map on me.x/me.y and apply the screen-to-meters scale.
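Putting those steps together, a sketch in JavaScript (equirectangular approximation around the user, fine for a few kilometers; the heading rotation and the 1 m = 1 point scale are assumptions matching the question):

// Project another player's coordinate into a local x/y frame in meters,
// centered on `me`, then rotate by the device heading so that "up" on
// screen is the direction the user is facing.
function toScreenPoint(me, other, headingDeg) {
  var R = 6371000; // mean Earth radius, meters
  var toRad = Math.PI / 180;

  // Equirectangular approximation of the east/north offsets in meters.
  var x = (other.lon - me.lon) * toRad * R * Math.cos(me.lat * toRad);
  var y = (other.lat - me.lat) * toRad * R;

  // Rotate so the heading points up the screen.
  var h = -headingDeg * toRad;
  var rx = x * Math.cos(h) - y * Math.sin(h);
  var ry = x * Math.sin(h) + y * Math.cos(h);

  // Screen y grows downward, so flip north.
  return { x: rx, y: -ry };
}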
You want MKMapPointForCoordinate from MapKit. This converts from latitude-longitude pairs to a flat surface defined by an x and y. Take a look at the documentation for MKMapPoint which describes the projection. You can then scale and rotate those x,y pairs into CGPoints as needed for your display. (You'll have to experiment to see what scaling factors work for your game.)
To center the points around your user, just subtract the value of their x and y position (in MKMapPoints) from the points of all other objects. Something like:
MKMapPoint userPoint = MKMapPointForCoordinate(userCoordinate);
MKMapPoint otherObjectPoint = MKMapPointForCoordinate(otherCoordinate);
otherObjectPoint.x -= userPoint.x; // center around your user
otherObjectPoint.y -= userPoint.y;
CGPoint otherObjectCenter = CGPointMake(otherObjectPoint.x * 0.001, otherObjectPoint.y * 0.001); // 0.001 is an example scale factor
// Using (50, 50) as an example for where your user view is placed.
userView.center = CGPointMake(50, 50);
otherView.center = CGPointMake(50 + otherObjectCenter.x, 50 + otherObjectCenter.y);