Have Leaflet panTo not center

I have a trail on a map that the user can "follow" by mousing over a graph (time and speed). If the user zooms in a lot, part of the trail may not be visible. When the user wants to see a part of the trail that is not showing, I use the panTo method...
Leaflet's panTo method also centers the map on the point. I don't want to center; I want the map to move just enough to show the point. (The problem with panTo is that it causes excessive map scrolling and a harsh user experience.)
I have tried changing the bounds, but that has an (unwanted) side effect of sometimes zooming out.
Is there any way I can do a "minimal" panTo?
This is a (working but unpolished) solution; map is our own map wrapper utility class, lmap is a Leaflet map object in TypeScript, and toxy() is a method that converts lat/lngs to x/y values.
if (!this.lmap.getBounds().contains(latlng)) {
    let target = this.map.toxy(latlng);
    let nw = this.map.toxy(this.lmap.getBounds().getNorthWest());
    let se = this.map.toxy(this.lmap.getBounds().getSouthEast());
    let x = 0, y = 0;
    let margin = 75;
    // vertical offset: negative pans up, positive pans down
    if (target.y < nw.y)
        y = (-1 * (nw.y - target.y)) - margin;
    else if (target.y > se.y)
        y = (target.y - se.y) + margin;
    // horizontal offset: negative pans left, positive pans right
    if (target.x < nw.x)
        x = (-1 * (nw.x - target.x)) - margin;
    else if (target.x > se.x)
        x = (target.x - se.x) + margin;
    this.lmap.panBy(new L.Point(x, y));
}

First, fetch the bounds of the map (measured in pixels from the CRS origin) with map.getPixelBounds(). Then use map.project(latlng, map.getZoom()) to get the coordinates (in pixels from the CRS origin) of the point you're interested in.
If you're confused about this "pixels from the CRS origin" thing, read the "Pixel Origin" section at http://leafletjs.com/examples/extending/extending-2-layers.html.
Once you have these pixel coordinates, it should be a simple matter of checking whether the point is inside the viewport and, if not, how far away in each direction it is.
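A minimal sketch of that approach (plain Leaflet, no wrapper class; the function name and the 75 px margin are illustrative choices, not part of the Leaflet API):

function panIntoView(map, latlng, margin) {
    margin = margin || 75;
    var bounds = map.getPixelBounds();                 // viewport, in pixels from the CRS origin
    var target = map.project(latlng, map.getZoom());   // the point, in the same pixel space
    var dx = 0, dy = 0;
    if (target.x < bounds.min.x + margin) dx = target.x - (bounds.min.x + margin);
    else if (target.x > bounds.max.x - margin) dx = target.x - (bounds.max.x - margin);
    if (target.y < bounds.min.y + margin) dy = target.y - (bounds.min.y + margin);
    else if (target.y > bounds.max.y - margin) dy = target.y - (bounds.max.y - margin);
    if (dx !== 0 || dy !== 0) map.panBy([dx, dy]);     // pan only as far as needed
}

Called on every mouseover of the graph, this pans only when the point leaves the padded viewport, and only by the missing distance.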

This fiddle should give you what you want: http://jsfiddle.net/jcollin6/b131tobj/2/
function clamp(n, lower, upper) {
    return Math.max(lower, Math.min(upper, n));
}

// Shamelessly stolen from https://gist.github.com/dwtkns/d5b9b60285b8b0067c53
function getNearestPointInPerimeter(l, t, w, h, x, y) {
    var r = l + w,
        b = t + h;

    var x = clamp(x, l, r),
        y = clamp(y, t, b);

    var dl = Math.abs(x - l),
        dr = Math.abs(x - r),
        dt = Math.abs(y - t),
        db = Math.abs(y - b);

    var m = Math.min(dl, dr, dt, db);

    return (m === dt) ? {x: x, y: t} :
           (m === db) ? {x: x, y: b} :
           (m === dl) ? {x: l, y: y} : {x: r, y: y};
}
L.ExtendedMap = L.Map.extend({
    panInside: function (latlng, pad, options) {
        var padding = pad ? pad : 0;
        var pixelBounds = this.getPixelBounds();
        var center = this.getCenter();
        var pixelCenter = this.project(center);
        var pixelPoint = this.project(latlng);
        var sw = pixelBounds.getBottomLeft();
        var ne = pixelBounds.getTopRight();
        var topLeftPoint = L.point(sw.x + padding, ne.y + padding);
        var bottomRightPoint = L.point(ne.x - padding, sw.y - padding);
        var paddedBounds = L.bounds(topLeftPoint, bottomRightPoint);
        if (!paddedBounds.contains(pixelPoint)) {
            this._enforcingBounds = true;
            var nearestPoint = getNearestPointInPerimeter(
                sw.x + padding,
                ne.y + padding,
                ne.x - sw.x - padding * 2,
                sw.y - ne.y - padding * 2,
                pixelPoint.x,
                pixelPoint.y
            );
            var nearestPixelPoint = L.point(nearestPoint.x, nearestPoint.y);
            var diffPixelPoint = nearestPixelPoint.subtract(pixelPoint);
            var newPixelCenter = pixelCenter.subtract(diffPixelPoint);
            var newCenter = this.unproject(newPixelCenter);
            if (!center.equals(newCenter)) {
                this.panTo(newCenter, options);
            }
            this._enforcingBounds = false;
        }
        return this;
    }
});
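A hypothetical usage sketch (the container id and coordinates are illustrative):

var map = new L.ExtendedMap('map', { center: [51.505, -0.09], zoom: 13 });
// pan just far enough to bring the point into view, keeping a 50 px margin
map.panInside(L.latLng(51.6, -0.2), 50);

(Recent Leaflet releases also ship a built-in map.panInside(latlng, options) with similar behaviour.)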

Use it this way:
map.panTo([lat, lng]);
map.setZoom(Zoom);


Why did Xcode warn me about making this a constant, and why does it still change?

Within my updateBlob function, Xcode warned me that pos is never mutated and should be changed to a let constant, even though I can see that it's being changed, and running the program does indeed change the position values. This all seemed to happen when I updated the BlobPos class with a defer keyword to set the x/y coordinates after it is passed the radius value. Although I could avoid using defer, why does the compiler suggest making pos a constant, and why is the program still able to change what should presumably be a constant?
class BlobPos
{
    var x: CGFloat = 0
    var y: CGFloat = 0

    public init(radius: CGFloat) {
        defer {
            x = radius + 5
            y = radius + 5
        }
    }
}

class Blob
{
    var radius: CGFloat
    var pos: BlobPos

    init
    (
        radius: CGFloat,
        pos: BlobPos
    )
    {
        self.radius = radius
        self.pos = pos
    }
}

func makeBlob() -> Blob
{
    let radius: CGFloat = 8
    let pos = BlobPos(radius: radius)
    return Blob(radius: radius, pos: pos)
}

func updateBlob(blob: Blob)
{
    let radius = blob.radius
    let pos = blob.pos // compiler warning wanting me to turn this into a let constant instead of var
    pos.x += 6
    pos.y += 2
    blob.pos = pos // strangely, new position is set
}
That is because BlobPos is a class, and class instances are passed around by reference (a pointer to their location in memory). Changing a class instance's properties doesn't change the reference itself, so the pos binding is never mutated and let is appropriate. If BlobPos were a struct, you would have to declare pos a var, because structs are passed around by value.
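A minimal sketch of the difference (hypothetical types, not from the question):

class RefPoint { var x = 0 }     // class: reference semantics
struct ValPoint { var x = 0 }    // struct: value semantics

let ref = RefPoint()
ref.x = 10        // fine: the let binding still refers to the same instance

var val = ValPoint()
val.x = 10        // fine, but only because val is declared with var

let frozen = ValPoint()
// frozen.x = 10  // error: cannot assign to property: 'frozen' is a 'let' constant

In updateBlob, pos only ever refers to the same BlobPos instance, so the binding itself never changes, which is exactly what the warning is about.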

attempt to index local 'def' (a nil value)

I'm working on a game. At a certain point I would like to create a special GameObject called Projectile, which is the same as GameObject but has a dx and dy as well as some functions that only a projectile will have. I am trying to make Projectile extend the GameObject class, but I run into issues when I try to create an instance of Projectile. I've tried different ways of declaring Projectile and shuffling the declaration order, but I can't figure out why I'm getting the error mentioned in the title. Thank you!
The following works just fine:
table.insert(self.dungeon.currentRoom.objects, GameObject(
    GAME_OBJECT_DEFS['pot'],
    self.player.x,
    self.player.y
))
But when I change "GameObject" to "Projectile" it does not.
table.insert(self.dungeon.currentRoom.objects, Projectile(
    GAME_OBJECT_DEFS['pot'],
    self.player.x,
    self.player.y
))
The rest is supporting code; I'm using the Class code from Matthias Richter.
require 'src/Projectile'
require 'src/GameObject'
require 'src/game_objects'

GameObject = Class{}

function GameObject:init(def, x, y)
    -- string identifying this object type
    self.type = def.type
    self.texture = def.texture
    self.frame = def.frame or 1
    -- whether it acts as an obstacle or not
    self.solid = def.solid
    self.defaultState = def.defaultState
    self.state = self.defaultState
    self.states = def.states
    -- dimensions
    self.x = x
    self.y = y
    self.width = def.width
    self.height = def.height
    -- default empty collision callback
    self.onCollide = def.onCollide
end

Projectile = Class{__includes = GameObject}

function Projectile:init()
    GameObject.init(self, def)
    self.dx = 0
    self.dy = 0
end
GAME_OBJECT_DEFS = {
    ['pot'] = {
        type = 'pot',
        texture = 'tiles',
        frame = 14,
        width = 16,
        height = 16,
        solid = true,
        defaultState = 'idle',
        states = {
            ['idle'] = {
                frame = 14,
            }
        },
        onCollide = function()
        end
    }
}
In

function Projectile:init()
    GameObject.init(self, def)
    self.dx = 0
    self.dy = 0
end

def is a nil value: Projectile:init takes no parameters, and there is no global def in scope. So in

function GameObject:init(def, x, y)
    -- string identifying this object type
    self.type = def.type

you index a nil value. The same will happen for x and y.
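A sketch of the fix, assuming the hump-style Class setup shown above: declare the parameters on Projectile:init and forward them to the base class.

function Projectile:init(def, x, y)
    -- forward def, x, y so GameObject:init has real values to index
    GameObject.init(self, def, x, y)
    self.dx = 0
    self.dy = 0
end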

TERMINATION Stack is not empty at destruction. Error Swift

I am using Google Mobile Vision to detect face landmarks and the distances between them. The weird thing about this error is that it is not consistent: it does not show up every time I run the same function.
Here is the error message...
TERMINATION
ert_Error ebs_ObjectStack::~ebs_ObjectStack():
Stack is not empty at destruction.
This can be an indiaction for a stack leak.
Please check the code where this instance was used.
libc++abi.dylib: terminating with uncaught exception of type ert_Error
(lldb)
Usually the first time I run the app in 24 hours everything goes smoothly, and occasionally afterwards, but for the most part the error appears. As I said earlier, I am using Google Mobile Vision to detect and store certain values.
Here is the function that is causing the error...
func train2(image2: UIImage) {
    print("TRAIN2 has been called")
    // let options1 = [GMVDetectorFaceLandmarkType: GMVDetectorFaceLandmark.all.rawValue, GMVDetectorFaceClassificationType: GMVDetectorFaceClassification.all.rawValue, GMVDetectorFaceTrackingEnabled: true, GMVDetectorFaceMinSize: 0.09] as [String : Any]
    var deviceOrientation = UIDevice.current.orientation
    var devicePosition = AVCaptureDevice.Position.back //.Position(rawValue: (currentCameraPosition?.hashValue)!)
    if CameraController().currentCameraPosition == .rear {
        devicePosition = AVCaptureDevice.Position.back
    } else if CameraController().currentCameraPosition == .front {
        devicePosition = AVCaptureDevice.Position.front
    }
    var lastKnownDeviceOrientation = UIDeviceOrientation(rawValue: 0)
    var orientation = GMVUtility.imageOrientation(from: deviceOrientation, with: devicePosition, defaultDeviceOrientation: lastKnownDeviceOrientation!)
    var options3 = [GMVDetectorImageOrientation: orientation.rawValue]
    // var sampbuff = cvPixelBufferRef(from: image2)
    // var AImage = GMVUtility.sampleBufferTo32RGBA(sampbuff as! CMSampleBuffer)
    var happyFaces = GfaceDetector.features(in: image2, options: options3) as! [GMVFaceFeature] // THE ERROR SEEMS TO ORIGINATE FROM THIS LINE
    print("The Amount Of Happy Faces is: \(happyFaces.count)")
    for face: GMVFaceFeature in happyFaces {
        print(face.smilingProbability)
        print("Go Google")
        var YAngle = Float(face.headEulerAngleY)
        var ZAngle = Float(face.headEulerAngleZ)
        print("Y Angle \(YAngle), Z Angle \(ZAngle)")
        if YAngle > -2.0 && YAngle < 4.0 && ZAngle > -2.0 && ZAngle < 2.0 {
            var proportion = ((face.bounds.width * face.bounds.width) + (face.bounds.height * face.bounds.height))
            var ratio = sqrtf(Float(proportion))
            DBetweenEyes = (Float(distanceBetween(face.leftEyePosition, face.rightEyePosition)) * 1000) / ratio
            DBetweenLEyeAndLEar = (Float(distanceBetween(face.leftEyePosition, face.leftEarPosition)) * 1000) / ratio
            DBetweenREyeAndREar = (Float(distanceBetween(face.rightEyePosition, face.rightEarPosition)) * 1000) / ratio
            DBetweenLEarAndREar = (Float(distanceBetween(face.leftEarPosition, face.rightEarPosition)) * 1000) / ratio
            DBetweenLEyeAndNose = (Float(distanceBetween(face.leftEyePosition, face.noseBasePosition)) * 1000) / ratio
            DBetweenREyeAndNose = (Float(distanceBetween(face.rightEyePosition, face.noseBasePosition)) * 1000) / ratio
            DBetweenLEyeAndMouth = (Float(distanceBetween(face.leftEyePosition, face.bottomMouthPosition)) * 1000) / ratio
            DBetweenREyeAndMouth = (Float(distanceBetween(face.rightEyePosition, face.bottomMouthPosition)) * 1000) / ratio
            DBetweenLEarAndMouth = (Float(distanceBetween(face.leftEarPosition, face.bottomMouthPosition)) * 1000) / ratio
            DBetweenREarAndMouth = (Float(distanceBetween(face.rightEarPosition, face.bottomMouthPosition)) * 1000) / ratio
            print("Distances Established")
        }
    }
}
I have just discovered that the error is very rare when the function is called from a button press after the view loads, but occurs far more frequently when it is called in the viewDidLoad method. Could this indicate some sort of pattern? How would I fix this error?
Any help or suggestions are welcome.
Thanks in advance!
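Given that observation, one thing worth trying (a sketch under the assumption that the crash is tied to running detection before the view hierarchy is fully up, not a confirmed fix; capturedImage is a hypothetical property):

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Defer the first detection until the view is actually on screen,
    // mirroring the button-press case that rarely crashes.
    if let image = self.capturedImage {   // hypothetical source of the image
        self.train2(image2: image)
    }
}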

Coreplot CPTXYGraph plot not displaying correctly

I am using the latest version of Core Plot within a Swift program. The graph plots without a problem, but the lower portion of the gradient fill for the plot does not extend down to the x-axis, which is the behaviour I expect; instead it starts part way up the y-axis.
I have not experienced this behaviour before and am not sure where to hunt for the bug in my code. I haven't been able to find a relevant entry on Stack Overflow or elsewhere on the net.
I have tested the plot space variables and they are on the money: the min/max for the y-axis are as expected, and the X/Y coords returned through the delegate functions shown below [number() and numberOfRecords()] are as expected. I am not sure what within Core Plot is driving this behaviour.
If anyone is able to suggest other areas to examine for the error, that would be greatly appreciated.
Rather than post all of the code initially, I have included only the delegate functions and the section of the app that defines the scatter line plot [configureLineGraph()]; if someone can suggest an area to explore, I will add that element of the code to the question to help with the diagnosis.
// Delegate functions for Core Plot
func numberOfRecords(for plot: CPTPlot) -> UInt {
    numberOfRecordsToBeDisplayedOnGraph = UInt(arrayOfCommanderProgress.count)
    return numberOfRecordsToBeDisplayedOnGraph // this needs to represent the number of records to be displayed on the graph
}

func number(for plot: CPTPlot, field fieldEnum: UInt, record idx: UInt) -> Any? {
    let index = Int(numberOfRecordsToBeDisplayedOnGraph - idx) - 1
    switch CPTScatterPlotField(rawValue: Int(fieldEnum))! {
    case .X:
        let xCoord = (arrayOfCommanderProgress[index].timeStamp.convertDateStringToDateObject()).timeIntervalSinceReferenceDate
        return xCoord
    case .Y:
        var yCoord = Double()
        if fieldEnum == 1 {
            switch graphToDisplay {
            case graphCombat:
                yCoord = (Double(self.arrayOfCommanderProgress[index].combat)) + (100.0 * (Double(arrayOfRankProgress[index].combat)))
            case graphTrade:
                yCoord = (Double(self.arrayOfCommanderProgress[index].trade)) + (100.0 * (Double(arrayOfRankProgress[index].trade)))
            case graphExploration:
                yCoord = (Double(self.arrayOfCommanderProgress[index].explore)) + (100.0 * (Double(arrayOfRankProgress[index].explore)))
            case graphFederation:
                yCoord = (Double(self.arrayOfCommanderProgress[index].federation)) + (100.0 * (Double(arrayOfRankProgress[index].federation)))
            case graphEmpire:
                yCoord = (Double(self.arrayOfCommanderProgress[index].empire)) + (100.0 * (Double(arrayOfRankProgress[index].empire)))
            case graphCQC:
                yCoord = (Double(self.arrayOfCommanderProgress[index].cQC)) + (100.0 * (Double(arrayOfRankProgress[index].cQC)))
            default:
                break
            }
        }
        return yCoord
    }
}

// Functions to set up the graphView and plots
func configureLineGraph() {
    // Plotline variables for scatterplot
    let plotLineTotal = CPTScatterPlot()
    let areaGradient: CPTGradient = CPTGradient(beginning: CPTColor.clear(), ending: CPTColor(componentRed: 150/255.0, green: 180/255.0, blue: 200/255.0, alpha: 0.9))
    areaGradient.angle = 90.0
    let areaGradientFill: CPTFill = CPTFill(gradient: areaGradient)
    // Set up plot lines for each type of total to be graphed
    plotLineTotal.areaFill = areaGradientFill
    // set up plotline characteristics
    let plotLineType = CPTMutableLineStyle()
    plotLineType.lineColor = CPTColor.darkGray()
    // establish a reference to the graph custom view
    guard let graph = graphView.hostedGraph else { return }
    // Add additional plotlines to the plots array if needed
    let plots = [plotLineTotal]
    for plot in plots {
        plot.dataSource = self
        plot.delegate = self
        plot.areaBaseValue = CPTDecimalFromInteger(0) as NSNumber
        plot.dataLineStyle = plotLineType
        graph.add(plot, to: graph.defaultPlotSpace)
    }
    graph.reloadData()
}
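One area that may be worth examining (a guess based on the code above, not a confirmed diagnosis): areaBaseValue is hard-coded to 0, and the gradient fill is drawn down to that value rather than to the x-axis line, so if the visible y-range does not start at 0 the bottom edge of the fill will not coincide with the axis. A sketch of pinning the fill to the bottom of the visible range instead, assuming the default plot space is a CPTXYPlotSpace:

if let plotSpace = graph.defaultPlotSpace as? CPTXYPlotSpace {
    // fill down to the bottom of the visible y-range instead of to y = 0
    plotLineTotal.areaBaseValue = plotSpace.yRange.location
}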

Swift for-loop won't accept index as a variable multiplied with a size.width value

The issue: I'm not allowed to multiply the index with size.width. I know it's a CGFloat value, and I am allowed to operate on it with literals; however, when I try to use a declared Int instead, it doesn't allow that either.
size.width/5 * (index + 1) says "Cannot invoke '+' with an argument list of type '($T6, ($T10))'".
Code:
func addBricks(size: CGSize) {
    for var index = 0; index < 4; index++ {
        var brick = SKSpriteNode(imageNamed: "brick")
        brick.physicsBody = SKPhysicsBody(rectangleOfSize: brick.frame.size)
        var xPos = size.width/5 * (index + 1) // this is the line that fails
        var yPos = size.height - 50
        brick.position = CGPointMake(xPos, yPos)
        self.addChild(brick)
    }
}
What could possibly be wrong?
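For reference, a sketch of the usual fix: Swift never implicitly converts Int to CGFloat, so the loop index has to be converted explicitly before being mixed into CGFloat arithmetic (shown here in current Swift syntax):

func addBricks(size: CGSize) {
    for index in 0..<4 {
        let brick = SKSpriteNode(imageNamed: "brick")
        brick.physicsBody = SKPhysicsBody(rectangleOf: brick.frame.size)
        // convert the Int index to CGFloat before multiplying
        let xPos = size.width / 5 * CGFloat(index + 1)
        let yPos = size.height - 50
        brick.position = CGPoint(x: xPos, y: yPos)
        self.addChild(brick)
    }
}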