Convert coordinates between two rotated systems

I have a map with coordinates in meters and an overlaid building plan with pixel coordinates.
I already know the scale factor, and I can convert coordinates between the two systems as long as they are aligned (i.e. the overlay image is exactly horizontal, with no rotation), using conversionfactor (= the number of overlay pixels in one meter on the map):
MAPx(ImageX) = centerpointX + ImageX * conversionfactor
MAPy(ImageY) = centerpointY + ImageY * conversionfactor
How can I convert between the coordinates if the overlay is rotated, given the formulas above, and how do I include a rotation angle?
EDIT (@tsauerwein):
Here is the marker style that you have requested:
planStyle = function(feature, resolution) {
    var style = new ol.style.Style({
        image: new ol.style.Icon({
            src: feature.dataURL,
            scale: feature.resolution / resolution,
            rotateWithView: true,
            rotation: feature.rotation * (Math.PI / 180),
            anchor: [.5, .5],
            anchorXUnits: 'fraction',
            anchorYUnits: 'fraction',
            opacity: feature.opacity
        })
    });
    return [style];
};

Assuming that you are using ol.source.ImageStatic: when you configure your layer, you know the size of the image in pixels (e.g. width=500, height=250) and also the extent that this image covers in map coordinates.
Now, if you have a coordinate, you can easily check whether it is inside the image extent (ol.extent.containsXY(extent, x, y)). Then you can also translate the real-world coordinate to a pixel coordinate:
// image size
var width = 500;
var height = 250;

// image extent
var extent = [2000, 0, 4000, 1000];

// coordinates
var x = 3000;
var y = 500;

if (ol.extent.containsXY(extent, x, y)) {
    var pixelX = width * (x - extent[0]) / ol.extent.getWidth(extent);
    var pixelY = height * (y - extent[1]) / ol.extent.getHeight(extent);
}
Doing it like this, it does not matter whether the map is rotated or not.
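If you do want to fold the rotation into the original formulas instead (for example when you are not using an image layer at all), a standard 2D rotation about the centre point works. Below is a minimal sketch; the function and parameter names, including angleRad, are my own assumptions for illustration, not from the answer above:
// Rotate the pixel offset about the overlay centre, then scale it into map units.
// angleRad is the overlay rotation in radians, counter-clockwise.
function imageToMap(imageX, imageY, centerX, centerY, conversionFactor, angleRad) {
    var cos = Math.cos(angleRad);
    var sin = Math.sin(angleRad);
    var rotatedX = imageX * cos - imageY * sin;
    var rotatedY = imageX * sin + imageY * cos;
    return {
        x: centerX + rotatedX * conversionFactor,
        y: centerY + rotatedY * conversionFactor
    };
}
// Inverse direction: subtract the centre, unscale, then rotate by -angleRad.
function mapToImage(mapX, mapY, centerX, centerY, conversionFactor, angleRad) {
    var dx = (mapX - centerX) / conversionFactor;
    var dy = (mapY - centerY) / conversionFactor;
    var cos = Math.cos(-angleRad);
    var sin = Math.sin(-angleRad);
    return {
        x: dx * cos - dy * sin,
        y: dx * sin + dy * cos
    };
}
With angleRad = 0 this reduces to the two formulas in the question.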

Related

Dividing long and lat coordinates into sub-coordinates (smaller squares)?

I have two long/lat points of a rectangle (bottom left and top right) and I want to divide this rectangle into smaller ones based on a base area (in long/lat) I already have. I know that I can't treat long and lat as distances measured in meters or kilometres; they are degrees on an approximation of the Earth's surface.
The points are extracted by Leaflet with a 4326 SRID, and so are the original points. I need the centres of the "smaller squares", or their long and lat coordinates.
For example, this is my base rectangle: 24.639567,46.782406 24.641452,46.785413, and the rectangle I want to divide is: 24.584749,46.612782 24.603323,46.653809.
First, let's turn your two points into a Leaflet bounds object:
const bounds = L.latLngBounds(point1, point2);
Now let's pick a sampling interval, meaning how many sub-rectangles across the width and height of your bounds. For example, a sampling interval of 10 gives 100 sub-rectangles (10 x 10), though if your sub-rectangles don't need the same aspect ratio as your main bounds, you could choose two separate sampling intervals (one for x and one for y):
const samplingInterval = 10 // easy to change
To properly interpolate through your main bounds, we'll grab its corners, as well as its width in longitude degrees and its height in latitude degrees, called dLng and dLat (d for delta):
const sw = bounds.getSouthWest();
const nw = bounds.getNorthWest();
const ne = bounds.getNorthEast();
const dLat = ne.lat - sw.lat;
const dLng = ne.lng - nw.lng;
Now we can build an array of new bounds extrapolated from the original:
let subBounds = [];
for (let i = 0; i < samplingInterval; i++) {
    for (let j = 0; j < samplingInterval; j++) {
        const corner1 = [
            sw.lat + (dLat * i) / samplingInterval,
            sw.lng + (dLng * j) / samplingInterval
        ];
        const corner2 = [
            sw.lat + (dLat * (i + 1)) / samplingInterval,
            sw.lng + (dLng * (j + 1)) / samplingInterval
        ];
        subBounds.push(L.latLngBounds(corner1, corner2));
    }
}
Now to get the centers of these bounds, you can call .getCenter() on them:
const centerPoints = subBounds.map(bounds => bounds.getCenter());
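To visualise the grid, a quick usage sketch (assuming an existing Leaflet map instance named map):
// draw every sub-rectangle and drop a marker on each centre point
subBounds.forEach(b => L.rectangle(b, { weight: 1 }).addTo(map));
centerPoints.forEach(c => L.marker(c).addTo(map));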
Working codesandbox

Calculate lat long on static mapbox img for 256px tiles

I have a static image with only a center point lat/long (for example https://api.mapbox.com/styles/v1/mapbox/light-v9/static/-78.4649,42.5128,5,0,0/300x200) and I want to put some markers (lat/long) on this map with the help of canvas.
But I need to calculate the x/y coordinates for those markers somehow.
So I know the center of the map (lat/long) and the lat/long coordinates of the markers. Is there any way to convert lat/long to x/y knowing only the zoom level and the center?
Or, if I know the x/y of the center lat/long (it will always be the same, 150px, 100px) and the zoom level, could I calculate the x/y for the other markers?
I have a lot of markers (>200, all custom-generated SVGs and so on) to place on this map. I can't use the Mapbox Static Maps marker overlays because of their limitations and so on.
UPD: Based on the comments I updated the question.
How to calculate it for 256px square tiles?
Based on the OP's comment I'm assuming that the requested image is square, for the sake of simplicity (TILE_SIZE could be decomposed into TILE_SIZE_X and TILE_SIZE_Y components). I'm also assuming that the image is 256 pixels wide: TILE_SIZE = 256.
I'm giving the pixel coordinates both relative to the center of the image (the distanceInPixels function) and relative to the lower-left corner (the imageCoordinates function). Changing to the upper-left corner, in case that's necessary, should be trivial (X stays the same and Y = TILE_SIZE - Y).
<!DOCTYPE html>
<html>
<body>
<p id="demo"></p>
<script>
var latLngMarker = {};
var latLngCenter = {};

// Image dimensions in pixels
var TILE_SIZE = 256;
var zoom = 5;

// Coordinates of the marker to be projected on the image
latLngMarker.lat = 41.850;
latLngMarker.lng = -87.650;

// Coordinates of the image center
latLngCenter.lat = 41.850;
latLngCenter.lng = -87.650;

// Coordinates projected on the cartographic plane (Mercator)
var centerProjected = project(latLngCenter);
var markerProjected = project(latLngMarker);

// The result should be X=Y=0, because the marker lat/lng equals the center lat/lng
var distanceFromCenter = distanceInPixels(centerProjected, markerProjected);
alert("X: " + distanceFromCenter.x + " Y: " + distanceFromCenter.y);

// The result should be X=Y=256/2=128 for the same reason
var coords = imageCoordinates(centerProjected, markerProjected);
alert("X: " + coords.x + " Y: " + coords.y);

// The horizontal distance represented by one pixel for a given latitude and zoom level
function pixelResolution(latLng, zoom) {
    var radius = 6378137.0; // semi-major axis of the WGS84 ellipsoid
    var circumference = 2 * Math.PI * radius;
    var distancePerImage = circumference * Math.cos(latLng.lat * Math.PI / 180.0) / Math.pow(2, zoom);
    var distancePerPixel = distancePerImage / TILE_SIZE;
    return distancePerPixel;
}

// Web Mercator projection ("world coordinates" in the 0..TILE_SIZE range)
function project(latLng) {
    var siny = Math.sin(latLng.lat * Math.PI / 180);
    siny = Math.min(Math.max(siny, -0.9999), 0.9999);
    var xy = {};
    xy.x = TILE_SIZE * (0.5 + latLng.lng / 360);
    xy.y = TILE_SIZE * (0.5 - Math.log((1 + siny) / (1 - siny)) / (4 * Math.PI));
    return xy;
}

// Marker pixel coordinates relative to the image center
function distanceInPixels(centerProjected, markerProjected) {
    var delta = {};
    var spacing = pixelResolution(latLngCenter, zoom);
    // project() returns world coordinates, so convert the offset to ground
    // metres before dividing by the metres-per-pixel resolution
    var metersPerWorldUnit = 2 * Math.PI * 6378137.0 *
        Math.cos(latLngCenter.lat * Math.PI / 180.0) / TILE_SIZE;
    delta.x = Math.round((centerProjected.x - markerProjected.x) * metersPerWorldUnit / spacing);
    delta.y = Math.round((centerProjected.y - markerProjected.y) * metersPerWorldUnit / spacing);
    return delta;
}

// Marker pixel coordinates relative to the lower-left corner
function imageCoordinates(centerProjected, markerProjected) {
    var pixelCoordinates = {};
    var deltaPixels = distanceInPixels(centerProjected, markerProjected);
    pixelCoordinates.x = TILE_SIZE / 2 - deltaPixels.x;
    pixelCoordinates.y = TILE_SIZE / 2 - deltaPixels.y;
    return pixelCoordinates;
}
</script>
</body>
</html>
Note: I can confirm that the pixelResolution function only works with square image tiles with dimensions of powers of 2. The Math.pow(2,zoom); snippet gives the game away!
Web Mercator function based on:
https://developers-dot-devsite-v2-prod.appspot.com/maps/documentation/javascript/examples/map-coordinates
Horizontal distance represented by one pixel from:
https://wiki.openstreetmap.org/wiki/Zoom_levels
See also:
https://wiki.openstreetmap.org/wiki/Slippy_map_tilenames#Resolution_and_Scale
If you're going to linearly interpolate, you'd need to know the lat/long and x/y of two points. It isn't possible with only the center point unless you also have a conversion metric for pixels, e.g. 50 pixels per 0.1 degrees of lat/long.
If you have the lat/long and x/y of two points, you can compute the ratio as (y1 - y2) / (lat1 - lat2) or (x1 - x2) / (long1 - long2), each of which should give the same value.
Then it's relatively easy: assume the ratio is 5, meaning 5 px per degree. If you had a point that was (3, -4) degrees away from the center point, you'd simply multiply to find the pixel offset (15, -20) and add that to the center = (165, 80).
Since all of your images are zoomed by the same amount, you could manually calculate the ratio once and store it as a constant.
Pseudo/untested Python:
def getRatio(latlongs=[(1, 1), (0, 0)], xys=[(5, 5), (0, 0)]):
    return (xys[0][1] - xys[1][1]) / (latlongs[0][0] - latlongs[1][0])

centerLatLong = (5, 5)
centerXY = (150, 100)

def getCoord(lat, long, ratio):
    y = (lat - centerLatLong[0]) * ratio + centerXY[1]
    x = (long - centerLatLong[1]) * ratio + centerXY[0]
    return x, y

Map distance to zoom in Google Static Maps

I am using Google Static Maps to display maps in my AppleTV app. What I need is to somehow map a distance of e.g. 1km to the zoom parameter of the Static Maps API.
In other words I have an imageView in which I wish to load the map image and if I know that the height of my imageView is 400px, and I wish for this map to show a real Earth surface of 1000m North to South, how would I tell the API to return me the map with this exact zoom?
I found a very similar question here; however, no suitable answer is provided.
As stated in the Google Maps documentation:
Because the basic Mercator Google Maps tile is 256 x 256 pixels.
Also note that at every zoom level n, the map is 2^n tiles wide and 2^n tiles high.
Meaning that at zoom level 2, the pixels in any direction of the map are 256 * 2² = 1024 px.
Taking into account that the Earth has a perimeter of ~40,000 kilometers, at zoom 0 every pixel ≈ 40,000 km / 256 = 156.25 km.
At zoom 9 the map is 131072 px across: 1 px = 40,000 km / 131072 = 0.305 km, and so on.
If we want 400 px = 1 km, we have to choose the closest approximation possible, so: 1 px = 1 km / 400 = 0.0025 km.
I tried zoom = 15 and obtained 1 px = 0.00478 km, and zoom = 16, which gave me 1 px = 0.00238 km.
Meaning that you should use zoom = 16, and you will get roughly 0.95 km per 400 px at the Equator, and only along the x axis.
As you go north or south in latitude, the perimeter gets smaller, which changes the distance per pixel. And of course it also changes the correlation along the y axis, since the projection of a sphere is tricky.
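To turn that arithmetic into code, here is a minimal sketch (my own helper, not part of the Static Maps API) that picks the integer zoom whose ground resolution best matches a desired span:
var EARTH_CIRCUMFERENCE = 40075016.686; // metres at the Equator
var TILE_SIZE = 256;
// metres per pixel at a given zoom = circumference * cos(lat) / (256 * 2^zoom),
// so solve that for zoom and round to the nearest integer
function zoomForSpan(spanMeters, spanPixels, latitude) {
    var metersPerPixelWanted = spanMeters / spanPixels;
    var zoom = Math.log2(
        EARTH_CIRCUMFERENCE * Math.cos(latitude * Math.PI / 180) /
        (metersPerPixelWanted * TILE_SIZE));
    return Math.round(zoom);
}
zoomForSpan(1000, 400, 0); // -> 16, matching the reasoning above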
If you want to calculate with a function the exact distance, you should use the one provided by Google at their documentation:
// Describe the Gall-Peters projection used by these tiles.
gallPetersMapType.projection = {
    fromLatLngToPoint: function(latLng) {
        var latRadians = latLng.lat() * Math.PI / 180;
        return new google.maps.Point(
            GALL_PETERS_RANGE_X * (0.5 + latLng.lng() / 360),
            GALL_PETERS_RANGE_Y * (0.5 - 0.5 * Math.sin(latRadians)));
    },
    fromPointToLatLng: function(point, noWrap) {
        var x = point.x / GALL_PETERS_RANGE_X;
        var y = Math.max(0, Math.min(1, point.y / GALL_PETERS_RANGE_Y));
        return new google.maps.LatLng(
            Math.asin(1 - 2 * y) * 180 / Math.PI,
            -180 + 360 * x,
            noWrap);
    }
};

Generate random GeoJSON polygons

Is there a way or tool to generate random GeoJSON polygons of a specific size within a bounding box? Specifically, I want to populate MongoDB with a lot of random polygons and test specific functionality.
You could do it programmatically, using the bounding box coordinates to generate random corner coordinates for the rectangles.
For example, if your bounding box is [[100,100],[200,200]] you could do the following:
// generate a random width and height
// (e.g. with random numbers between 1 and 50)
var width = Math.floor(Math.random() * 50) + 1;
var height = Math.floor(Math.random() * 50) + 1;

// generate a random position that lets the rectangle fit within the bounding box walls;
// 100 is the width and height of the example bounding box, whose lower corner is [100, 100]
var upperX = 100 + Math.floor(Math.random() * (100 - width)) + 1;
var upperY = 100 + Math.floor(Math.random() * (100 - height)) + 1;
var lowerX = upperX + width;
var lowerY = upperY + height;
var bounds = [[upperX, upperY], [lowerX, lowerY]];

// create the rectangle
L.rectangle(bounds, {color: "#ff7800", weight: 1}).addTo(map);

// loop through the above code some chosen number of times
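Since the goal is to store GeoJSON in MongoDB rather than draw Leaflet rectangles, here is a minimal sketch of the same idea that emits GeoJSON Polygon geometries instead (the helper name and parameters are my own, for illustration):
// Build one random GeoJSON Polygon inside bbox = [minX, minY, maxX, maxY],
// with a width and height of at most maxSize units.
function randomPolygon(bbox, maxSize) {
    var minX = bbox[0], minY = bbox[1], maxX = bbox[2], maxY = bbox[3];
    var width = Math.random() * maxSize;
    var height = Math.random() * maxSize;
    var x = minX + Math.random() * (maxX - minX - width);
    var y = minY + Math.random() * (maxY - minY - height);
    return {
        type: "Polygon",
        // a single closed ring: the first position is repeated at the end
        coordinates: [[[x, y], [x + width, y], [x + width, y + height], [x, y + height], [x, y]]]
    };
}
// e.g. 1000 random polygons inside [100, 100, 200, 200], each at most 50 units across
var polygons = [];
for (var i = 0; i < 1000; i++) {
    polygons.push(randomPolygon([100, 100, 200, 200], 50));
}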

Three.js camera rotation issue

I want to rotate the camera around the x-axis in the y-z plane while looking at the (0, 0, 0) point. It turns out the lookAt function behaves oddly: after rotating 180°, the geometry jumps to the other side unexpectedly. Could you please explain why this happens, and how to avoid it?
You can see the live demo on jsFiddle: http://jsfiddle.net/ysmood/dryEa/
class Stage
    constructor: ->
        window.requestAnimationFrame =
            window.requestAnimationFrame or
            window.webkitRequestAnimationFrame or
            window.mozRequestAnimationFrame

        @init_scene()
        @make_meshes()

    init_scene: ->
        @scene = new THREE.Scene

        # Renderer
        width = window.innerWidth
        height = window.innerHeight
        @renderer = new THREE.WebGLRenderer({
            canvas: document.querySelector('.scene')
        })
        @renderer.setSize(width, height)

        # Camera
        @camera = new THREE.PerspectiveCamera(
            45,             # fov
            width / height, # aspect
            1,              # near
            1000            # far
        )
        @scene.add(@camera)

    make_meshes: ->
        size = 20
        num = 1
        geo = new THREE.CylinderGeometry(0, size, size)
        material = new THREE.MeshNormalMaterial()
        mesh = new THREE.Mesh(geo, material)
        mesh.rotation.z = Math.PI / 2
        @scene.add(mesh)

    draw: =>
        angle = Date.now() * 0.001
        radius = 100
        @camera.position.set(
            0,
            radius * Math.cos(angle),
            radius * Math.sin(angle)
        )
        @camera.lookAt(new THREE.Vector3())
        @renderer.render(@scene, @camera)
        requestAnimationFrame(@draw)

stage = new Stage
stage.draw()
You are rotating the camera around the X-axis in the Y-Z plane. When the camera passes over the "north" and "south" poles, it flips so as to stay right-side-up. The camera's up-vector is (0, 1, 0) by default.
Set the camera x-position to 100 or so, and its behavior will appear correct to you. Add some axes to your demo for a frame of reference.
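For example, a minimal sketch of that workaround in plain JavaScript (the OP's code is CoffeeScript), together with the up-vector alternative implied above; camera, radius and angle are the variables from the OP's draw loop:
// Option 1 (the suggestion above): move the camera off the Y-Z plane so it
// never passes directly over the poles.
camera.position.set(100, radius * Math.cos(angle), radius * Math.sin(angle));
camera.lookAt(new THREE.Vector3(0, 0, 0));
// Option 2 (an alternative, if a pure Y-Z orbit is required): make +X the
// camera's up direction, so lookAt() has no singularity at the poles.
camera.up.set(1, 0, 0);
camera.position.set(0, radius * Math.cos(angle), radius * Math.sin(angle));
camera.lookAt(new THREE.Vector3(0, 0, 0));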
This is not a fault of the library. Have a look at the Camera.lookAt() source code.
If you want to set the camera orientation via its quaternion instead, you can do that.
three.js r.59