Calculating physical distance scrolled from scrollWheel NSEvent - swift

Making my first Swift app for macOS. Learning as I go...
I'm trying to make an app that calculates the total physical distance scrolled from a scrollwheel NSEvent handler. I'm using this code to attach the event handler:
NSEvent.addGlobalMonitorForEvents(matching: NSEvent.EventTypeMask.scrollWheel, handler: self.scrollWheel);
I'm getting the absolute deltaY value in the event handler by using let deltaY = abs(event.scrollingDeltaY)
I'm guessing the deltaY is reported in points, but how do I translate it to a physical distance (in inches)?
I'm manually calculating the ppi for the user's device using this code:
let width = NSScreen.main?.frame.width ?? 0
let height = NSScreen.main?.frame.height ?? 0
let diagonal = sqrt(pow(width, 2) + pow(height, 2))
let pixelsPerInch = ((width / CGDisplayScreenSize(CGMainDisplayID()).width) * 25.4) * CGFloat(NSScreen.main?.backingScaleFactor ?? 0)
I'm using this formula to calculate the physical distance scrolled (in inches).
let distance = deltaY * 1 / pixelsPerInch * 40
The formula is hardcoded and obviously wrong. The multiplication by 40 is an approximation, as I have no idea if I'm on the right track. I'm looking for the right formula.
Any ideas?
Thanks in advance!

I believe you're close.
CGDisplayScreenSize(CGMainDisplayID()).width gives you the physical width in mm.
CGDisplayBounds(CGMainDisplayID()).width gives you the width in pixels.
And 1 inch = 25.4 mm.
So:
let pixelsPerInch = CGDisplayBounds(CGMainDisplayID()).width / (CGDisplayScreenSize(CGMainDisplayID()).width / 25.4)
...and...
let distance = deltaY / pixelsPerInch
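Putting the pieces together, a minimal sketch of that approach (my own consolidation; it assumes the scroll happens on the main display, and since scrollingDeltaY is subject to the system's scrolling acceleration, the result only approximates physical finger travel):
import AppKit

var totalInches: CGFloat = 0

let monitor = NSEvent.addGlobalMonitorForEvents(matching: .scrollWheel) { event in
    let displayID = CGMainDisplayID()
    let widthInPixels = CGDisplayBounds(displayID).width
    let widthInMM = CGDisplayScreenSize(displayID).width
    guard widthInMM > 0 else { return }
    // pixels per inch = horizontal pixels / horizontal inches (1 inch = 25.4 mm)
    let pixelsPerInch = widthInPixels / (widthInMM / 25.4)
    // scrollingDeltaY is on-screen travel in points; convert to inches
    totalInches += abs(event.scrollingDeltaY) / pixelsPerInch
}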

Related

Get the exact satellite image for a given Lat/Long bbox rectangle?

For a visualization I need an optical satellite image for a specific rectangular AOI, defined by two lat/long coordinates. I tried the Mapbox Static Images API, which takes a lat/long bounding box and a width/height resolution in pixels for the output. The problem is that, as far as I can tell, if the ratio of the lat/long box is not the same as the ratio of the w/h pixels, it adds padding to the lat/long bounding box to fill the w/h of the pixel image.
This would prevent me from combining the optical image with the other data, because I would not know which image pixel (roughly) corresponds to which lat/long coordinate.
I see three "solutions", but I don't know how to achieve any of them:
1. "Make" Mapbox return the images without padding.
2. Compute the correct w/h pixel ratio from the lat/long coordinates, so there would be no padding. Maybe with the equirectangular projection (https://en.wikipedia.org/wiki/Equirectangular_projection), as discussed in https://stackoverflow.com/a/16271669/380038?
3. Find a way to determine the lat/long coordinates of the optical satellite image so I can cut off the possible padding.
I checked "How can I extract a satellite image from google maps given a Lat Long Rectangle?", but I would prefer to use my existing paid Mapbox account, and I got the impression that I still wouldn't get the exact optical image or the exact corner coordinates of the optical image.
The Mapbox Static Images API serves maps.
You have an optical image from another source.
You want to overlay these data.
Right?
Note the red and green pins: the waypoints are at opposite corners on Mapbox.
After the equirectangular correction, Mapbox matches OpenStreetMap (little wonder), but the Google coordinates are quite close too.
curl -g "https://api.mapbox.com/styles/v1/mapbox/streets-v11/static/[17.55490,47.10434,17.55718,47.10543]/600x419?access_token=YOUR_TOKEN_HERE" --output example-walk-600x419-nopad.png
What is your scale? 1 km - 100 km?
What is your source of optical image?
What is the required accuracy?
Just to mention, optical images have their own sources of distortions.
In practice:
You must have the extent of your non-optical satellite data (let's preserve the mist around it...). I'll call it ((x1, y1), (x2, y2)). We are coders, not cartographers, right!?
If you feed your extent to https://docs.mapbox.com/playground/static/ as
min longitude = x1, min latitude = y1, max longitude = x2, max latitude = y2
and select the "Bounding box" entry: do you see the map around your data? Don't mind the exact dimensions, just check that the map is related to your data! Maybe you have to swap some values to get to the right corner of the globe.
If you have the right ((x1, y1), (x2, y2)) coordinates, do the equirectangular transformation to get the right pixel size.
You've called it Solution #2.
Let's say the width of your non-optical satellite data is Wd and the height is Hd.
The Mapbox image will fit your data if you ask for a width of Wm and a height of Hm of Mapbox data, where
Wm = Wd
Hm = Wd * (y2 - y1) / ((x2 - x1) * cos(y1))
(the latitude y1 goes into the cosine in radians).
Now you can pull the mapbox by
curl -g "https://api.mapbox.com/styles/v1/mapbox/streets-v11/static/[<x1>,<y1>,<x2>,<y2>]/<Wm>x<Hm>?access_token=<YOUR_TOKEN>" --output overlay.png
If (Hd == Hm)
then { you are lucky :) the two images just fit each other }
else { the two images cover the same area, but you have to scale the height of one of them to make them match }
Well... almost. You have not revealed what size of area you want to cover. The equation above is just an approximation which works up to the size of a smaller country (~100 km or so). For continent scale you probably have to apply more accurate formulas.
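To make that concrete, here is the aspect-ratio step as a small sketch (my own, in Swift; x values are longitudes and y values latitudes, in degrees):
import Foundation

// Pixel height that keeps an equirectangular bbox undistorted.
func mapboxHeight(x1: Double, y1: Double, x2: Double, y2: Double, widthPx: Double) -> Double {
    let latRad = y1 * .pi / 180 // a representative latitude for the cosine
    // ground width ~ (x2 - x1) * cos(lat), ground height ~ (y2 - y1)
    return widthPx * (y2 - y1) / ((x2 - x1) * cos(latRad))
}

// Check against the curl example above:
let hm = mapboxHeight(x1: 17.55490, y1: 47.10434, x2: 17.55718, y2: 47.10543, widthPx: 600)
// hm is about 421, close to the 600x419 request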
In my opinion, your #2 idea is the way to go. You do have the lat/lng bbox, so all that remains is to calculate its "real" size in pixels.
Let us say that you want (or can allow, or can afford) a resolution of 50m per pixel, and the area is small enough not to have distortions (i.e., a rectangle of say 1 arcsecond of latitude and 1 arcsecond of longitude has top and bottom sides of the same length, with an error less than your chosen resolution). These are, I believe, very loose requisites and easy to fulfill.
Then, you just need to calculate the distance between the (Lat1, Lon1) and (Lat1, Lon2) points, and between (Lat1, Lon1) and (Lat2, Lon1). Divide each distance in meters by 50, and you'll get the exact number of pixels:
Lon1 Lon2
Lat1 +---------------+
| |
| |
Lat2 +---------------+
And you have a formula for that: the haversine formula.
If you need higher precision, you could resort to Vincenty's formulae for an oblate spheroid (there is a JavaScript library for it). On the MT site (first link) there is a live calculator you can use to plug in data from your calls and verify whether the approach is working: plug in your bounding box, get the distance in meters, divide, and you get the pixel size of the image. If the image is good, chances are you can go with the simpler haversine; if it isn't, then there has to be some further quirk in the maps API (its projection, perhaps) that makes it not return the expected bounding box. But that seems unlikely.
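As an illustration, a small haversine sketch (my own, in Swift; the sample coordinates are the bbox from the curl example above, and 50 m/px is the resolution chosen above):
import Foundation

// Great-circle distance in meters between two lat/lon points (degrees).
func haversineMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double) -> Double {
    let r = 6_371_000.0 // mean Earth radius
    let dLat = (lat2 - lat1) * .pi / 180
    let dLon = (lon2 - lon1) * .pi / 180
    let a = sin(dLat / 2) * sin(dLat / 2) +
            cos(lat1 * .pi / 180) * cos(lat2 * .pi / 180) * sin(dLon / 2) * sin(dLon / 2)
    return 2 * r * asin(sqrt(a))
}

let metersPerPixel = 50.0
let widthPx = haversineMeters(lat1: 47.10434, lon1: 17.55490,
                              lat2: 47.10434, lon2: 17.55718) / metersPerPixel
let heightPx = haversineMeters(lat1: 47.10434, lon1: 17.55490,
                               lat2: 47.10543, lon2: 17.55490) / metersPerPixel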
I've had this exact problem when using a satellite image on an Apple Watch. I overlay some markers and a path, converting everything from coordinates to pixels. Below is my code to determine the exact bbox:
var maxHoleLat = 52.5738902
var maxHoleLon = 4.9577606
var minHoleLat = 52.563994
var minHoleLon = 4.922364
var mapMaxLat = 0.0
var mapMaxLon = 0.0
var mapMinLat = 0.0
var mapMinLon = 0.0
let token = "your token"
var resX = 1000.0
var resY = 1000.0
let screenX = 184.0
let screenY = 224.0 // 448/2 = 224 - navbarHeight
let navbarHeight = 0.0
var latDist = 111000.0
var lonDist = 111000.0
var dx = 0.0
var dy = 0.0
var imageUrl: URL?

func latLonDist() {
    // calgary.rasc.ca/latlong.htm (all terms in meters)
    let latRad = maxHoleLat * .pi / 180
    // distance covered by 1 degree of longitude at the given latitude
    lonDist = 111412.88 * cos(latRad) - 93.50 * cos(3 * latRad) + 0.12 * cos(5 * latRad)
    print("lonDist = \(lonDist)")
    // distance covered by 1 degree of latitude at the given latitude
    latDist = 111132.95 - 559.82 * cos(2 * latRad) + 1.17 * cos(4 * latRad)
    print("latDist = \(latDist)")
}

func getMapUrl() {
    dx = (maxHoleLon - minHoleLon) * lonDist
    dy = (maxHoleLat - minHoleLat) * latDist
    // the map is square, but the hole is not:
    // check whether the hole spans less x than y, and pad the shorter side
    if dx < dy {
        mapMaxLat = maxHoleLat
        mapMinLat = minHoleLat
        let midLon = (maxHoleLon + minHoleLon) / 2
        mapMaxLon = midLon + dy / 2 / lonDist
        mapMinLon = midLon - dy / 2 / lonDist
    } else {
        mapMaxLon = maxHoleLon
        mapMinLon = minHoleLon
        let midLat = (maxHoleLat + minHoleLat) / 2
        mapMaxLat = midLat + dx / 2 / latDist
        mapMinLat = midLat - dx / 2 / latDist
    }
    imageUrl = URL(string: "https://api.mapbox.com/styles/v1/mapbox/satellite-v9/static/[\(mapMinLon),\(mapMinLat),\(mapMaxLon),\(mapMaxLat)]/1000x1000?logo=false&access_token=\(token)")
    print(String(describing: imageUrl))
}

Manually write world file (jgw) from Leaflet.js map

I have the need to export georeferenced images from Leaflet.js on the client side. Exporting an image from Leaflet is not a problem, as there are plenty of existing plugins for this, but I'd like to include a world file with the export so the resulting image can be read into GIS software. I have a working script for this, but I can't seem to nail down the correct parameters for my world file such that the resulting georeferenced image is located exactly correctly.
Here's my current script:
// map is a Leaflet map object
let bounds = map.getBounds(); // Leaflet LatLngBounds
let topLeft = bounds.getNorthWest();
let bottomRight = bounds.getSouthEast();
let width_deg = bottomRight.lng - topLeft.lng;
let height_deg = topLeft.lat - bottomRight.lat;
let width_px = $(map._container).width() // Width of the map in px
let height_px = $(map._container).height() // Height of the map in px
let scaleX = width_deg / width_px;
let scaleY = height_deg / height_px;
let jgwText = `${scaleX}
0
0
-${scaleY}
${topLeft.lng}
${topLeft.lat}`
This seems to work well at large scales (i.e. zoomed in to city level or so), but at smaller scales there is some distortion along the y-axis. One thing I noticed is that all the examples of world files I can find (including those produced by QGIS or ArcMap) have x-scale and y-scale parameters that are exactly equal (oppositely signed). In my calculations, these terms are different unless you are sitting right on the equator.
Example world file produced from QGIS
0.08984380916303301 // x-scale (size of px in x direction)
0 // rotation parameter 1
0 // rotation parameter 2
-0.08984380916303301 // y-scale (size of px in y direction)
-130.8723208723141056 // x-coord of top left px
51.73651369984968085 // y-coord of top left px
Example world file produced from my calcs
0.021972656250000017
0
0
-0.015362443783773333
-130.91308593750003
51.781435604431195
Example of produced image using my calcs with correct state boundaries overlaid:
Does anyone have any idea what I'm doing wrong here?
The problem was solved by using EPSG:3857 for the world file, and ensuring the width and height of the map bounds were also measured in this coordinate system. I had tried using EPSG:3857 for the world file before, but measured the width and height of the map bounds using Leaflet's L.map.distance() function. To solve the problem, I instead projected the corner points of the map bounds to EPSG:3857 using L.CRS.EPSG3857.project(), then simply subtracted the X,Y values.
Corrected code is shown below, where map is a Leaflet map object (L.map)
// Get map bounds and corner points in 4326
let bounds = map.getBounds();
let topLeft = bounds.getNorthWest();
let bottomRight = bounds.getSouthEast();
let topRight = bounds.getNorthEast();
// get width and height in px of the map container
let width_px = $(map._container).width()
let height_px = $(map._container).height()
// project corner points to 3857
let topLeft_3857 = L.CRS.EPSG3857.project(topLeft)
let topRight_3857 = L.CRS.EPSG3857.project(topRight)
let bottomRight_3857 = L.CRS.EPSG3857.project(bottomRight)
// calculate width and height in meters using epsg:3857
let width_m = topRight_3857.x - topLeft_3857.x
let height_m = topRight_3857.y - bottomRight_3857.y
// calculate the scale in x and y directions in meters (this is the width and height of a single pixel in the output image)
let scaleX_m = width_m / width_px
let scaleY_m = height_m / height_px
// world files need the CENTRE of the top-left px; what we currently have is the TOP-LEFT corner of the px.
// Adjust by moving half a pixel right (+x) and half a pixel down (-y)
let topLeftCenterPxX = topLeft_3857.x + (scaleX_m / 2)
let topLeftCenterPxY = topLeft_3857.y - (scaleY_m / 2)
// format the text of the worldfile
let jgwText = `${scaleX_m}
0
0
-${scaleY_m}
${topLeftCenterPxX}
${topLeftCenterPxY}`
For anyone else with this problem, you'll know things are correct when your scale-x and scale-y values are exactly equal (but oppositely signed)!
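For reference, the same world-file arithmetic condensed into a sketch (my own; Swift rather than JavaScript, but the math is language-agnostic and the names are illustrative):
import Foundation

// Build a world-file string from an EPSG:3857 bbox (meters) and the image size (px).
func worldFile(minX: Double, minY: Double, maxX: Double, maxY: Double,
               widthPx: Double, heightPx: Double) -> String {
    let scaleX = (maxX - minX) / widthPx   // pixel width in meters
    let scaleY = (maxY - minY) / heightPx  // pixel height in meters
    // world files reference the CENTRE of the top-left pixel
    let originX = minX + scaleX / 2
    let originY = maxY - scaleY / 2
    return "\(scaleX)\n0\n0\n\(-scaleY)\n\(originX)\n\(originY)"
}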
Thanks @IvanSanchez for pointing me in the right direction :)

2D Motion vectors - calculate location of object at a given time

I am having problems understanding 2D motion vectors when moving certain objects at a given time. My knowledge of linear algebra is limited and I don't really know the exact search terms to look for, so I wanted to ask whether anybody could help me or at least point me in the right direction.
My problem looks like this:
I have two points, a startPoint and an endPoint, in space, with specific locations denoted as (x_1, y_1) and (x_2, y_2) respectively. Both points have a time attached, named t_startPoint and t_endPoint respectively. I now want to find out, for a given currentTime (basically any point in time between t_startPoint and t_endPoint), where exactly a new point N would be positioned on the line connecting those two points. I know the description is not trivial, and that's why I also added an image describing what I would like to do:
So far, this is what I have as my algorithm:
func update(_ time: Int64) {
    let t_startPoint: Int64 = 1
    let position_startPoint: (x: Float, y: Float) = (1.0, 1.0)
    let t_endPoint: Int64 = 5
    let position_endPoint: (x: Float, y: Float) = (4.0, 5.0)
    let currentTime: Int64 = 3
    let duration = t_endPoint - t_startPoint
    let x = position_startPoint.x + ((position_endPoint.x - position_startPoint.x) / Float(duration)) * Float(currentTime - t_startPoint)
    let y = position_startPoint.y + ((position_endPoint.y - position_startPoint.y) / Float(duration)) * Float(currentTime - t_startPoint)
    // ...
}
However, no matter what I do, my objects keep overshooting, erratically moving back and forth, and I don't know where to start. Any help would be greatly appreciated!
For movement at constant velocity there is the relation:
(t - t1) / (t2 - t1) = (x - x1) / (x2 - x1)
x = x1 + (x2 - x1) * (t - t1) / (t2 - t1)
so your expression looks right. Check:
1 + (4 - 1) * (3 - 1) / (5 - 1) = 1 + 3 * 2 / 4 = 2.5 - the exact middle, OK
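If the object still overshoots, the usual culprit is an unclamped interpolation factor: once currentTime runs past t_endPoint, the point keeps travelling beyond the end. A minimal sketch (my own, using plain tuples for positions):
import Foundation

// Linear interpolation between two timed points, clamped so the object never overshoots.
func position(at currentTime: Int64,
              t1: Int64, p1: (x: Float, y: Float),
              t2: Int64, p2: (x: Float, y: Float)) -> (x: Float, y: Float) {
    let duration = Float(t2 - t1)
    guard duration > 0 else { return p2 }
    let t = min(max(Float(currentTime - t1) / duration, 0), 1) // progress in [0, 1]
    return (x: p1.x + (p2.x - p1.x) * t,
            y: p1.y + (p2.y - p1.y) * t)
}

let p = position(at: 3, t1: 1, p1: (x: 1.0, y: 1.0), t2: 5, p2: (x: 4.0, y: 5.0))
// p == (2.5, 3.0) - the exact middle of the example above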

Spritekit rotating multiple nodes around center

I would like to apply a rotation to multiple nodes (the selected nodes in my game) using UIRotationGesture, based on the center point of all those nodes. I can already rotate a single node simply by changing its zRotation.
The problem with multiple nodes is that each one's position and zRotation have to change based on a shared center point, and I cannot seem to understand how to manage that.
I would like something like this:
What I have to rotate a single node is this:
During the rotation gesture
theRotation = CGFloat(sender.rotation) + self.offset
theRotation = theRotation * -1
node.zRotation = theRotation
After the rotation gesture
self.offset = theRotation * -1
Would you have an idea of how to set the correct position and angle for my nodes during the rotation?
What I tried:
I tried to add a node at the center (where the white dot is in my pictures, which represents the center), change the parent of my nodes to this new node, apply the zRotation to it, and then restore the original parents. This did not work, as I cannot seem to change a parent (my nodes disappear); that is another one of my Stack Overflow questions.
I tried to change the anchor point of my nodes to fit the center point and then rotate them using theRotation. It did not work, as I cannot seem to set the anchor point at the center position (that I have). I tried converting the center's position into the node's coordinate system, but this is still not working: node.convertPoint(center, fromNode: self) gives me coordinates like -58;-74 when it should be about -1;-.5 (or something like that). I do not understand this.
So now I am thinking of calculating the position and rotation myself, as those attempts did not work, but I would need an idea of how to calculate them, as I am sadly not very good with trigonometry/linear algebra.
Thank you for you help!
How I calculate my center:
var maxX = nodesSelected[0].position.x
var minX = nodesSelected[0].position.x
var maxY = nodesSelected[0].position.y
var minY = nodesSelected[0].position.y
for node in nodesSelected {
    if node.position.x > maxX {
        maxX = node.position.x
    }
    if node.position.x < minX {
        minX = node.position.x
    }
    if node.position.y > maxY {
        maxY = node.position.y
    }
    if node.position.y < minY {
        minY = node.position.y
    }
}
return CGPoint(x: (maxX - minX) / 2 + minX, y: (maxY - minY) / 2 + minY)
How I calculate the radius of the rotation (distance between a node and the center):
extension CGPoint {
    func distance(point: CGPoint) -> CGFloat {
        return abs(CGFloat(hypotf(Float(point.x - x), Float(point.y - y))))
    }
}
How I get my rotation:
sender.rotation
Given a rotationAngle, you can calculate the new position of each node with the code below; you need to know a bit of trigonometry to understand it.
Here I have an array of SKShapeNode that I called dots (the equivalent of your green nodes in the image), and centralDot would be your central SKSpriteNode.
for dot in dots {
    let dx = dot.position.x - centralDot!.position.x // X distance from center
    let dy = dot.position.y - centralDot!.position.y // Y distance from center
    let current_angle = atan(dy / dx) // current angle is the arctan of dy / dx
    let next_angle = current_angle - rotationAngle // subtract how much you want to rotate, in radians
    // the new x is: center + radius * cos(angle)
    // the new y is: center + radius * sin(angle)
    // if dx < 0 you need to take the opposite value of the position
    let new_x = dx >= 0 ? centralDot!.position.x + rotationRadius * cos(next_angle) : centralDot!.position.x - rotationRadius * cos(next_angle)
    let new_y = dx >= 0 ? centralDot!.position.y + rotationRadius * sin(next_angle) : centralDot!.position.y - rotationRadius * sin(next_angle)
    let new_point = CGPoint(x: new_x, y: new_y)
    let action = SKAction.moveTo(new_point, duration: 0.2)
    dot.runAction(action)
}
Hope this helps.
Update:
The first code didn't help, so I tried another approach. This one worked better in my tests.
for i in 0..<dots.count {
    let dot = dots[i]
    let angle = rotationAngle + CGFloat(M_PI_2 * Double(i))
    let new_x = rotationRadius * cos(angle) + centralDot!.position.x
    let new_y = rotationRadius * sin(angle) + centralDot!.position.y
    let new_point = CGPoint(x: new_x, y: new_y)
    let action = SKAction.moveTo(new_point, duration: 1/60)
    dot.runAction(action)
}
rotationRadius is a constant, the distance you want between the center and the green node.
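An alternative sketch (my own, not tested in your scene): using atan2 instead of atan avoids the dx < 0 special case, and computing each node's own radius lets the nodes sit at different distances from the center:
import SpriteKit

// Rotate every selected node around a shared center, preserving each
// node's own distance and starting angle relative to that center.
func rotate(nodes: [SKNode], around center: CGPoint, by rotationAngle: CGFloat) {
    for node in nodes {
        let dx = node.position.x - center.x
        let dy = node.position.y - center.y
        let radius = hypot(dx, dy)
        let angle = atan2(dy, dx) + rotationAngle // atan2 handles all quadrants
        node.position = CGPoint(x: center.x + radius * cos(angle),
                                y: center.y + radius * sin(angle))
        node.zRotation += rotationAngle // keep each node's facing in sync
    }
}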

How to make a physics impulse move the same percentage of the screen on all devices

What I am trying to do is make a physics impulse seem to have the same effect on all devices. Basically, if I can figure out a way to do the following, I will be able to accomplish my goal.
First, let's simplify things by taking out all gravity.
Basically I need to calculate the impulse it will take to get a physics object on the far left of the screen to get to the far right of the screen in the same amount of time no matter how big the screen size is.
The reason I ask is that I am making a movement system based on the magnitude and angle of a swipe, and I want it to play the same way on every device. I am calculating the magnitude by
(distance (in virtual points)) / (time spent making the gesture)
and then applying it as a physics impulse.
This is the code I am working with:
func Jump(angle: CGFloat, strength: CGFloat)
{
    if (Ready == true)
    {
        let rangle: CGFloat = angle * CGFloat(M_PI / 180) // degrees to radians
        let translate: CGPoint = CGPoint(x: 1, y: 0)
        // rotate the unit vector (1, 0) by rangle; both terms must use the radian value
        var vx: CGFloat = ((translate.x * cos(rangle)) - (translate.y * sin(rangle)))
        var vy: CGFloat = ((translate.y * cos(rangle)) + (translate.x * sin(rangle)))
        vx *= width
        vy *= height
        vx *= strength
        vy *= strength
        vx /= 4000
        vy /= 4000
        print("Applying Impulse VX: ")
        print(vx)
        print(" , VY: ")
        print(vy)
        println(" )")
        let velx = Cavity.physicsBody?.velocity.dx
        let vely = Cavity.physicsBody?.velocity.dy
        Cavity.physicsBody?.velocity = CGVector(dx: CGFloat(velx!) / 2, dy: CGFloat(vely!) / 2)
        Cavity.physicsBody?.applyImpulse(CGVectorMake(vx, vy))
        //Cavity.physicsBody?.applyImpulse(CGVectorMake(1000 / width, 1000 / height))
    }
}
So basically I want it to be that if a strength of 1 or 2 is passed, it will produce the same-looking result on all devices.
What you can do is make the strength relative to the screen size:
let strengthAdjustment = (1 / 375) * UIScreen.mainScreen().bounds.width
This uses the iPhone 6 (4.7") screen width (375 pts) to make the strength = 1.
On an iPhone 5s the screen is only 320 pts wide, so it will only take 0.8533 of the impulse strength to move the width of the screen in the same amount of time.
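A sketch of how that adjustment might be wired in (my own; Cavity is the node from the question, the other names are illustrative):
let referenceWidth: CGFloat = 375 // iPhone 6 width in points
let strengthAdjustment = UIScreen.mainScreen().bounds.width / referenceWidth

func adjustedImpulse(vx: CGFloat, vy: CGFloat) -> CGVector {
    return CGVector(dx: vx * strengthAdjustment, dy: vy * strengthAdjustment)
}

// e.g. Cavity.physicsBody?.applyImpulse(adjustedImpulse(vx, vy: vy))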
Hopefully this helps you out.