I followed the Bing Maps docs for Zoom into Clusters. Since Microsoft only embeds a screenshot, here's a working example. When you click on a cluster, notice that the cluster itself is not updated. Just pan the map a little bit, and the map with its clusters is refreshed.
So either the Bing Maps documentation is wrong, or this is a Bing Maps bug.
Any idea for a workaround, e.g. how to force a map refresh after the map has zoomed in?
The relevant code (which does not update the map/clusters) is this, especially the last line:
function clusterClicked(e) {
    if (e.target.containedPushpins) {
        var locs = [];
        for (var i = 0, len = e.target.containedPushpins.length; i < len; i++) {
            // Get the location of each pushpin.
            locs.push(e.target.containedPushpins[i].getLocation());
        }

        // Create a bounding box for the pushpins.
        var bounds = Microsoft.Maps.LocationRect.fromLocations(locs);

        // Zoom into the bounding box of the cluster.
        // Add a padding to compensate for the pixel area of the pushpins.
        map.setView({ bounds: bounds, padding: 100 });
    }
}
This is a known issue that has been fixed in the experimental branch. You can try this out by adding "&branch=experimental" to the map script URL. All fixes and features in the experimental branch will be rolled into the main release branch at the end of July.
Also, just so you are aware, there are a bunch of interactive code samples for V8 available here: http://www.bing.com/api/maps/sdk/mapcontrol/isdk#overview
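For reference, if the control is loaded with the standard V8 script tag, the experimental branch can be appended to its URL like this (the callback name is just whatever your page already uses):

<script type='text/javascript' src='https://www.bing.com/api/maps/mapcontrol?callback=GetMap&branch=experimental' async defer></script>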
I'm trying to create a route that follows a GPS trace I provide.
The GPS trace is cleaned up, has no loops, and is in the correct order.
I checked it with other services.
It has 1920 points.
You can find the trace here: GPX Files
Sadly, if I create a route based on the provided SDK example (GitHub), I get loops in my path.
I was hoping you could help me solve the following problems:
how do I avoid loops while creating a route with the HERE iOS Swift SDK
how do I set the route options so that the route follows the provided point array instead of calculating a fastest or balanced route
Since I could not find those functions in the iOS SDK, I used the additional REST API to filter the route a bit before drawing it, removing all points that were not matched correctly according to HERE Maps, i.e. everything with low probability, warnings, or a large distance to the road. Yet the result is still not good. Here is a cleaned-up file; it was created after the original was mapped/run once through HERE Maps. In this file, all points that have low confidence, produce warnings, or have a large distance to the original points are removed. This is the one I use to create a route, and it still has the same issues, like loops and weird turns.
Thank you very much in advance!
BR.
So far I have this code:
private lazy var router = NMACoreRouter()

@objc func do_routing_stuff(gps_trace: [NMAWaypoint]) {
    let stops: [Any] = gps_trace
    let routingMode = NMARoutingMode(routingType: .fastest,
                                     transportMode: .car,
                                     routingOptions: .avoidHighway)

    // Trigger the route calculation
    router.calculateRoute(withStops: stops,
                          routingMode: routingMode)
    { [weak self] routeResult, error in
        guard error == .none else {
            self?.showMessage("Error: route calculation returned error code \(error.rawValue)")
            return
        }

        guard let result = routeResult, let routes = result.routes, routes.count > 0 else {
            self?.showMessage("Error: route result returned is not valid")
            return
        }

        // Let's add the 1st result onto the map
        self?.route = routes[0]
        self?.updateMapRoute(with: self?.route)
        // self?.startNavigation()
    }
}
private func updateMapRoute(with route: NMARoute?) {
    // Remove the previously created map route from the map
    if let previousMapRoute = mapRoute {
        mapView.remove(mapObject: previousMapRoute)
    }

    guard let unwrappedRoute = route else {
        return
    }

    mapRoute = NMAMapRoute(unwrappedRoute)
    mapRoute?.traveledColor = .clear
    _ = mapRoute.map { mapView?.add(mapObject: $0) }

    // In order to see the entire route, we orientate the
    // map view accordingly
    if let boundingBox = unwrappedRoute.boundingBox {
        geoBoundingBox = boundingBox
        mapView.set(boundingBox: boundingBox, animation: .linear)
    }
}
For comparison, here is the same route presented with Leaflet maps.
I believe the problem you have is that you are feeding the Routing API a large number of waypoints, all of which are in close proximity to each other.
You have almost 2000 waypoints in your GPX file (and ~1300 in your cleaned one). Each of these waypoints is less than 10 meters from its closest neighbors. This is not the type of data that the Routing API is really designed to work with.
I've experimented with your GPX Trace and I have come up with the following solution: simply skip a bunch of coordinates in your trace.
First, clean up your trace using the Route Matching API (which I believe you have been doing).
Second, pick the first trkpt in the GPX file as your first waypoint for the Routing call. Then skip the next 20 points. Pick the following trkpt as the second waypoint. Repeat this until you reach the end of the file. Then add the last trkpt in the trace as the final waypoint.
Then call the Routing API and you should get a good match between your trace and your route, without any loops or other weird routing artefacts.
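As a rough sketch of that sampling (just array slicing over the [NMAWaypoint] trace from the question, not HERE-specific logic; the skip count of 20 is explained in the notes below):

// Keep the first point, every 21st point after it, and the last point,
// so consecutive waypoints end up roughly 200 m apart.
func sampledWaypoints(from trace: [NMAWaypoint], skipping skipCount: Int = 20) -> [NMAWaypoint] {
    guard trace.count > 2 else { return trace }

    var waypoints = [NMAWaypoint]()
    for index in stride(from: 0, to: trace.count, by: skipCount + 1) {
        waypoints.append(trace[index])
    }

    // Always finish on the last point of the trace.
    if let last = trace.last, waypoints.last !== last {
        waypoints.append(last)
    }
    return waypoints
}

You would then hand sampledWaypoints(from: gps_trace) to calculateRoute(withStops:routingMode:) instead of the full array.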
Some notes:
I have picked 20 as the number of trace points to skip, because this puts about 200 m between each waypoint. That should be close enough to ensure that the Routing API does not deviate too much from the traced route. For larger traces you may wish to increase that number. For traces in urban environments with lots of alternate routes, you may want to use a smaller number.
It's important to clean the data with the Route Matching API first, to avoid picking outliers as waypoints.
Also, you may not wish to use the "avoidHighways" option. Given your use case, there doesn't seem to be a benefit and I could see it causing additional problems.
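For example, building on the initializer already used in the question, dropping that option would look like this:

// Same routing mode as in the question, but without .avoidHighway.
let routingMode = NMARoutingMode(routingType: .fastest,
                                 transportMode: .car,
                                 routingOptions: [])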
By now you probably worked it out, but your waypoints are likely landing on bridges or tunnels that are along your route but not on the road you want. I.e. the waypoint is intended to be on the road under the bridge but the routing engine perceives that you want to drive on the bridge.
The routing engine then loops around nearby roads to drive you over that waypoint on the bridge or through the tunnel.
There is no simple solution to this that I have found.
I am using the Mapbox geocoder to search for a location in one of my Mapbox projects. I was able to implement the search functionality successfully, but I am having a hard time extracting the address the user selected from the list of searched/suggested addresses.
Is there any way to get the coordinates of the searched place?
I went through the Mapbox docs but only found a 'moveend' event, which is fired when we fly to some location. So I am listening for that event and, once it fires, calling getBounds() on the map. It somehow works, but it doesn't give me the exact coordinates, since getBounds() returns the south-west and north-east corners of the bounding box. Can we get the exact coordinates of the selected place?
Below is the code I am using to search for a place.
initMapSearch() {
    var _this = this;
    var geocoder = new MapboxGeocoder({ accessToken: environment.mapbox.accessToken, mapboxgl: mapboxgl });
    this.locationObj = geocoder.onAdd(_this.map);
    document.getElementById('geocoder').appendChild(this.locationObj);
}
And this is my moveend event.
this.map.on('moveend', function() {
    _this.lat = _this.map.getBounds()["_ne"]["lat"];
    _this.long = _this.map.getBounds()["_ne"]["lng"];
});
Thanks in advance to the community.
You should listen to the geocoder's 'result' event; its handler is passed the selected feature (and its location).
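For example, a minimal sketch that could sit right after the geocoder is created in initMapSearch() (the _this.lat / _this.long fields are the ones from the question's moveend handler):

geocoder.on('result', function(e) {
    // e.result is the selected feature; for a place its
    // coordinates are [lng, lat].
    var coords = e.result.geometry.coordinates;
    _this.long = coords[0];
    _this.lat = coords[1];
});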
I am using this plugin : https://github.com/perliedman/leaflet-routing-machine.
My code looks like this:
for (let i = 0; i < markers.length; i++) {
    points.push(L.latLng(markers[i].latitude, markers[i].longitude));
}

this.routingControl = L.Routing.control({
    waypoints: points
}).addTo(this.map);
When I pass points filled with different latitudes/longitudes, it draws the route fine. But let's imagine the following scenario: the points array contains 3 items, each with a latitude/longitude, and those latitudes/longitudes are all the same. So something like this:
34.72581233927868 -80.71105957031251
34.72581233927868 -80.71105957031251
34.72581233927868 -80.71105957031251
Now, since the Routing control can't draw the route, it automatically zooms in to the maximum level, and the console shows errors: {"message":"Zoom level must be between 0-20."}
Workaround 1: after drawing the routes, I decided to use setTimeout for 1 second and then zoom to 11 myself. This way it zooms out, but the errors still stay in the console. How do I fix this?
If you know your input can be invalid, then the cleanest way would be to filter it on the way in to remove duplicates (see the sketch after the error-handling example below). If that's not possible for some reason and you want to let LRM determine if there's an error, then you can catch the error event with:
L.Routing.control({
    ...
})
.addTo(this.map)
.on('routingerror', function(e) {
    // do your error action here - e.g. zoom out
});
For more information on handling errors in LRM, see https://www.liedman.net/leaflet-routing-machine/api/#eventobjects
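As for the first suggestion, here is a rough sketch of filtering out consecutive duplicate waypoints before they reach L.Routing.control, reusing the markers loop from the question:

var points = [];
for (let i = 0; i < markers.length; i++) {
    var latLng = L.latLng(markers[i].latitude, markers[i].longitude);
    // Only keep the point if it differs from the last one we kept.
    if (points.length === 0 || !points[points.length - 1].equals(latLng)) {
        points.push(latLng);
    }
}

// With fewer than two distinct points there is nothing to route.
if (points.length >= 2) {
    this.routingControl = L.Routing.control({
        waypoints: points
    }).addTo(this.map);
}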
I have a Mapbox GL map with a single layer and multiple markers on that layer. I am trying to draw a route and show route information (distance / time / turns to take from origin to destination) in my app using the Directions GL plugin. Unfortunately, I can't find any information beyond setting the origin / destination (as shown below) for displaying the route and route data on my map. The only information I could find is the Mapbox GL driving directions example, yet this is not what I really want, as I don't want to show the origin / destination as A and B points, nor the A/B search boxes as in the mapbox.com example above.
Can someone please help me by telling me what I'm missing here, and how I can draw the route between origin and destination and display the route info using the Mapbox GL plugin? Thanks
var map = new mapboxgl.Map({
    container: 'map',
    style: 'mapbox://styles/mapbox/streets-v8',
    center: [userCoordinates.coords.longitude, userCoordinates.coords.latitude],
    zoom: 15
});

var directions = new mapboxgl.Directions({
    unit: 'metric',
    profile: 'driving'
});

directions.setOrigin([userCoordinates.coords.longitude, userCoordinates.coords.latitude]);

map.on('click', function(e) {
    var features = map.queryRenderedFeatures(e.point, { layers: ['gsLayer'] });
    if (!features.length) {
        return;
    }

    var feature = features[0];
    directions.setDestination([feature.geometry.coordinates[0], feature.geometry.coordinates[1]]);
});
It sounds like you don't want to use a plugin at all and instead make a request directly to the Directions API.
I'd recommend taking a look at mapbox-sdk-js - a helpful js lib for making client requests. The API docs for directions can be found here.
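If you go that way, here is a rough sketch (not a definitive implementation) of calling the Directions API directly with fetch and drawing the result on the existing map; the drawRoute helper and the 'route' source/layer id are names made up for this example:

// origin and destination are [lng, lat] pairs, e.g. from the click handler above.
function drawRoute(origin, destination) {
    var url = 'https://api.mapbox.com/directions/v5/mapbox/driving/' +
              origin.join(',') + ';' + destination.join(',') +
              '?geometries=geojson&overview=full&steps=true&access_token=' + mapboxgl.accessToken;

    fetch(url)
        .then(function(res) { return res.json(); })
        .then(function(data) {
            var route = data.routes[0];
            // Route information: distance in meters, duration in seconds,
            // and the turn-by-turn steps in route.legs[0].steps.
            console.log(route.distance, route.duration, route.legs[0].steps);

            var geojson = { type: 'Feature', properties: {}, geometry: route.geometry };
            if (map.getSource('route')) {
                map.getSource('route').setData(geojson);
            } else {
                map.addLayer({
                    id: 'route',
                    type: 'line',
                    source: { type: 'geojson', data: geojson },
                    paint: { 'line-color': '#3b9ddd', 'line-width': 5 }
                });
            }
        });
}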
I am trying to persist markers in an augmented reality game. Here is the gist of what I am doing:
I have my users record and save an area to an ADF. Then they drop markers into the scene, and their position data in Unity world coordinates is saved out to a text file. I then restart the app, load and localize to the ADF, and load the markers.
In order to get this working, I've modified the ARPoseController.cs file in the Unity demo package to use the Area Description as its base frame. In the _UpdateTransformation method I've swapped out the frame pair
pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_START_OF_SERVICE;
pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_DEVICE;
for
pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_AREA_DESCRIPTION;
pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_DEVICE;
I've also added some code confirming that I'm successfully localizing to the ADF, but I'm noticing that my markers' positions in Unity world space are not correct relative to the real environment.
I can confirm that my markers save and load properly based on the START_OF_SERVICE origin, so I assume they are serializing and deserializing correctly. What could be causing this? Am I wrong in assuming this should just work by switching the base frame pair to AREA_DESCRIPTION instead of START_OF_SERVICE?
I had a similar problem getting the AR and ADF integrated; I had to modify the TangoPointCloud to check whether you're using an Area Description in OnTangoDepthAvailable() and adjust the baseFrame target as required.
i.e.:
if (m_tangoDeltaPoseController.m_useAreaDescriptionPose)
{
    pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_AREA_DESCRIPTION;
    pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_DEVICE;
}
else
{
    pair.baseFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_START_OF_SERVICE;
    pair.targetFrame = TangoEnums.TangoCoordinateFrameType.TANGO_COORDINATE_FRAME_DEVICE;
}
That way, the geometry of the point cloud adjusts itself based on the ADF offset instead of from device start.
After that change, when I use the sample code for AR to drop markers, it registers the surface properly, so I'm placing the markers in the correct spots and orientation. I'm still encountering some flakiness with the markers not adjusting when relocalized, though; I have to look into the AreaLearningInGameController for loop-closure events.
Hope that helps!