Clustering behaviour buggy on mobile devices

I'm having trouble getting the clustering feature to work equivalently on mobile and desktop devices.
The problem can be seen clearly in the HERE Maps example at https://developer.here.com/api-explorer/maps-js/clustering/marker-clustering
Resolving all clusters at max zoom works well in desktop browsers, while on mobile devices (e.g. Safari on iOS 12) not all clusters are resolved when max zoom is reached.
Furthermore, if I attach a tap event listener to the clustering provider like this (or in the traditional way):
fromEvent(this.chargingstationCluster.provider, 'tap').subscribe((event: H.util.Event) => {
  try {
    // when the tapped target is a cluster, zoom the map to its bounds
    if (event.target.getData().isCluster()) {
      this.map.setViewBounds(event.target.getData().getBounds(), true);
    }
  } catch (e) {
    // tapped target was not a cluster
  }
});
On mobile devices the cluster does not get resolved; instead, the map is only re-centred on it.
Any hints or solutions on this?
Edit:
I found out that mobile devices do not zoom in to the configured max zoom level.
E.g. with the max zoom level configured to 19, iOS devices only zoom to a maximum of 17.415037499278842.

So mobile devices effectively lose 1.584962500721158 zoom levels (exactly log2(3)) when calculating max zoom.
Adding 1.584962500721158 to the max zoom finally results in an effective max zoom of 19.
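Based on that, a hedged sketch of the workaround: bumping the max zoom by that offset on touch devices. The question doesn't say exactly where the offset was applied, so the provider options, dataPoints and zoom values below are assumptions rather than the asker's actual configuration.

var MOBILE_MAX_ZOOM_OFFSET = 1.584962500721158; // = Math.log2(3), the loss observed on iOS
var isTouchDevice = 'ontouchstart' in window;   // crude mobile heuristic (assumption)

var provider = new H.clustering.Provider(dataPoints, {
  // raise the provider's max zoom on mobile so clusters still resolve at level 19
  max: isTouchDevice ? 19 + MOBILE_MAX_ZOOM_OFFSET : 19,
  clusteringOptions: {
    eps: 32,      // cluster radius in pixels
    minWeight: 2  // minimum number of points to form a cluster
  }
});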

Related

Styling layout in Codenameone for phone and tablet

I completed the development of my app using a layout of form components that perfectly suited the iPhone 6 in the simulator. I used a combination of Theme styles (always the Base Resource) and some coded tweaks to the look and feel.
When I ran this against the iPhone 5, I had naively expected the display to shrink to fit, as you might expect with browser applications, but instead my components (largely labels) went off the edge of their containers. Panic.
I ended up having to measure the display height, judge the device from there, and code up differently sized components to get the right fit. This took some time.
Next, to TestFlight in the App Store. As part of wider testing I decided to install on my iPad 3, only to find the layout components all rendered very small. Panic.
I have now spent a couple of days just about resolving this. I basically use this method to determine the device 'category' and then apply the appropriate font or FontImage size, etc.
public static ScreenSizeEnum getScreenSize() {
    if (Display.getInstance().isTablet() && Display.getInstance().getDisplayHeight() > 2000) {
        return ScreenSizeEnum.TABLET;
    } else if (Display.getInstance().getDisplayHeight() < 950) {
        return ScreenSizeEnum.SMALL;
    } else if (Display.getInstance().getDisplayHeight() > 949 && Display.getInstance().getDisplayHeight() < 1200) {
        return ScreenSizeEnum.MEDIUM;
    } else {
        return ScreenSizeEnum.LARGE;
    }
}
This is unsurprisingly not foolproof. The iPhone 6 Plus is not recognised as a tablet but has a large display height, while one of the Nexus devices is a tablet but has a small display height.
My question is, how on earth do you get around this problem?
Tablets and phones come in different sizes, but it's important that you still get a quality component render regardless of form factor.
The CN1 KitchenSink demo didn't really address it. Many thanks in advance.
From the description of the issue it seems you are thinking about a tablet as a "large phone", which is the wrong way to look at it. A tablet has a similar density to a phone but more real estate in inches, which means you need to design your app so it uses the additional space more effectively.
Here are two screenshots of the same Kitchen Sink demo, one running on an iPad and the other on an iPhone. Notice the UIs look very different, as we adapted the UI to use up the additional space. Images (e.g. multi-images) and fonts are determined by density, not by the number of pixels, as the goal is to use the extra space, not to fill up the screen with larger images/text.

skmaps on mobile is slow

We use the skmaps (Skobbler) SDK on Android and iOS.
Our feature is to show one pin in skmaps; the other settings are defaults.
We can show the map on both Android and iOS,
but the loading time is too long; it takes about 5-15 seconds.
I've tried different Wi-Fi networks and different devices (iPhone 6s Plus and HTC M9), but it's still slow.
Can anyone help me?
The SDK doesn't seem to be slow at rendering, so check the downloading process for the requested vector tiles.
How to improve it:
reduce the requested area - increase the zoom level so that as soon as some vector data becomes available something will get rendered (i.e. set the zoom level to 18 or 17)
reduce the size of the requested vector tiles - switch to using LightMaps (initMapSettings.setMapDetailLevel(SKMapsInitSettings.SK_MAP_DETAIL_LIGHT)) - the light maps contain fewer elements and thus are smaller and will be rendered faster
don't use pannable maps but use static maps when displaying the location of a certain POI - for this exact use case, showing a mini map associated with a POI most products use a static map (fixed png/jpg) as it renders instantly. This is especially relevant in scenarios when the POIs are in different parts of the world

Measuring distance between two iOS Devices

Yeah, I'm currently wondering about this.
In my use case the devices will be 50cm to 10m apart and I'd like it to be accurate to at least 10 cm. (Therefore GPS is not an option)
2 Ways spring to mind:
Sound: I asked about this in the dev forums and I'm in contact with laanlabs, about the code of their sonar ruler.
Picture on one device + camera on the other: seems easier to set up, since my use case involves the user facing one device at 90 degrees anyway. But it would be more work for the user to point the camera in the right direction, and it would not react to a change in distance.
Now the question: is anyone aware of any code that does something like this already? Possibly a general non-iPhone C project?
Method with the camera: we already know the size of each device. You take a picture of the device, calculate its height/width to determine the type of device (iPhone/iPod or iPad), then calculate the distance.
For example, if the device is an iPhone you know that its size is 115x58 mm. In the picture it is NxM pixels. Now you can calculate the distance. (The smaller N and M are, the larger the distance.)
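A minimal JavaScript sketch of that calculation, using the pinhole-camera/similar-triangles relation; the focal length in pixels is a hypothetical calibration value you would have to determine for the capturing device:

var FOCAL_LENGTH_PX = 3000;   // assumed calibration value, not a real device spec
var IPHONE_HEIGHT_MM = 115;   // physical height quoted above

function distanceMm(measuredHeightPx) {
  // similar triangles: distance = focalLength * realSize / imageSize
  return FOCAL_LENGTH_PX * IPHONE_HEIGHT_MM / measuredHeightPx;
}

// if the phone spans 600 px in the photo, it is roughly 575 mm away
console.log(distanceMm(600));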
If you were to use the sound method, one approach would be to have device A emit a sound; device B would then be listening for this at all times and, on detection, echo back a secondary sound. This would give you a round-trip time from which you could calculate distance - don't forget to compensate for the latency between detection and re-emission as well.
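A small sketch of that round-trip arithmetic (the latency value is a hypothetical, device-specific constant you would have to measure):

var SPEED_OF_SOUND_M_PER_S = 343; // at roughly 20 °C; varies with temperature

function distanceFromRoundTrip(roundTripSeconds, processingLatencySeconds) {
  // subtract device B's detect-and-re-emit latency, then halve the remaining
  // time of flight to get the one-way distance in metres
  return SPEED_OF_SOUND_M_PER_S * (roundTripSeconds - processingLatencySeconds) / 2;
}

// e.g. a 70 ms round trip with 40 ms of known latency is roughly 5.1 m
console.log(distanceFromRoundTrip(0.070, 0.040));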
I am not sure about this, but it is what I found in one of the answers to a previous SO question, How to measure distance between two iphone devices using bluetooth?
Using bluetooth for localization is a very well-known research field. The short answer is: you can't. Signal strength isn't a good indicator of the distance between two connected bluetooth devices, because it is too subject to environmental conditions (is there a person between the devices? How is the owner holding his/her device? Is there a wall? Are there any RF-reflecting surfaces?). Using bluetooth you can at best obtain a distance resolution of a few meters, but you can't calculate the direction, not even roughly.
You may obtain better results by using multiple bluetooth devices and triangulating the various signal strengths, but even in this case it's hard to be more accurate than a few meters in your estimates.
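For reference, the usual way a signal-strength reading is turned into a rough distance estimate is the log-distance path-loss model; the calibration values below are hypothetical, and the environment-dependent exponent n is exactly why the estimate is so coarse:

function estimateDistanceMetres(rssiDbm, txPowerDbm, n) {
  // txPowerDbm is the expected RSSI at 1 m; n is the path-loss exponent
  // (about 2 in free space, higher with walls, bodies and reflections)
  return Math.pow(10, (txPowerDbm - rssiDbm) / (10 * n));
}

// e.g. an RSSI of -70 dBm with -59 dBm at 1 m and n = 2 gives roughly 3.5 m
console.log(estimateDistanceMetres(-70, -59, 2));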

Google Maps on iPhone won't zoom

I have an iPhone app that loads the following URL:
http://maps.google.com?q=Apple Stores&ll=37.331689,-122.030731&z=3
As expected, the google maps app loads on the phone, but zooms to the entire US. It does not appear to be respecting the z= parameter, or at least, not allowing a zoom level that shows stores in the area.
z=3 zooms to the country level. z=1 zooms to the "world" level. Larger numbers increase the zoom. Try something like z=8 or 9
http://maps.google.com?q=Apple+Stores&ll=37.331689,-122.030731&z=9
Edit: On second thought, 10 or 11 might be more appropriate. I guess it depends on how close you want the zoom. The maximum is 19
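A quick JavaScript sketch of assembling the corrected link; encodeURIComponent escapes the space in "Apple Stores", which is what the + in the URL above achieves:

var url = 'http://maps.google.com?q=' + encodeURIComponent('Apple Stores') +
          '&ll=37.331689,-122.030731&z=9';
// hand `url` to whatever currently opens the maps link in the app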

Safari iPhone - How to detect zoom level and offset?

I'm looking for options on how to track user zooming and panning on a page when viewed in Safari on an iPhone. Safari exposes move and gesture events, so theoretically I could keep a running tally of pan and zoom operations, but that seems like overkill since the browser must track that internally.
Is this information exposed through the Document Object Model?
When you zoom in, window.innerWidth is adjusted, but document.documentElement.clientWidth is not, therefore:
var zoom = document.documentElement.clientWidth / window.innerWidth;
(I've tested on iOS 4, without a viewport <meta> tag.)
However, I wouldn't rely on it for anything important. DOM viewport sizes/pixel sizes in mobile browsers are a complete mess.
On Mobile Safari and Android, here is an accurate way to measure how much the page has been zoomed.
Try it here: http://jsbin.com/cobucu/3 - change zoom then click measure.
The technique is to add a top-level div:
<body>
<div id=measurer style="position:absolute;width:100%"></div>
and use the calculation:
function getZoom() {
  return document.getElementById('measurer').offsetWidth / window.innerWidth;
}
The only problem is finding a tidy way to detect that the user has changed zoom (pinch, double tap, etc). Options:
webkitRequestAnimationFrame: very reliable, but likely to cause jankiness if using animations (due to performance hit)
setInterval: reliable but very ugly
touch events: look for two fingers or double tap - ugly and maybe difficult to make 100% reliable
window.onresize + window.onorientationchange + window.onscroll: simple but totally unreliable (Edit: and onscroll can cause performance problems in WKWebView or Mobile Safari 8 or greater).
PS: Windows Phone needs a different solution (pinch-zoom doesn't change the viewport - pinch-zoom on Windows has its own separate viewport that is not visible to javascript).
Edit: Android Visual Viewport resize and scroll events may help? See https://developer.mozilla.org/en-US/docs/Web/API/VisualViewport#Events
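Following up on that edit, a small sketch of reading the pinch-zoom level from the Visual Viewport API where it is available (recent iOS Safari and Chrome for Android); older browsers still need one of the fallbacks listed above:

if (window.visualViewport) {
  window.visualViewport.addEventListener('resize', function () {
    // scale is 1 when unzoomed and greater than 1 when pinch-zoomed in
    console.log('pinch-zoom level:', window.visualViewport.scale);
  });
}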
According to the Safari Web Content Guide, zoom events (double tap) are not exposed, so I'm not sure how you can track this.
I do not believe this information is exposed through the DOM.
I actually think things might have moved on a little since Steve's answer, as having a look at the content guide link he provided I can see a section on Handling Multi-Touch Events and also Handling Gesture Events.
Haven't tried them yet but they look pretty promising. I'll provide an update once I've checked them out and have a demo link available...
I measure zoom this way (works on iOS only):
function getZoom() {
  // use the screen dimension that matches the current orientation
  var screenOrientedWidth = screen.width;
  if (window.orientation == 90 || window.orientation == -90) {
    screenOrientedWidth = screen.height;
  }
  return screenOrientedWidth / window.innerWidth;
}
It doesn't depend on how wide the content is.
However, in iOS Safari window.innerWidth isn't correct inside a gestureend handler. You should defer such a calculation for later execution. In GWT I use scheduleDeferred, but I can't say how to implement this in pure JavaScript.
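In plain JavaScript, deferring the read to the next tick after the gesture ends is usually sufficient; a minimal sketch (getZoom refers to the function above):

document.addEventListener('gestureend', function () {
  // window.innerWidth is stale inside the handler itself, so measure on the next tick
  setTimeout(function () {
    console.log('zoom after gesture:', getZoom());
  }, 0);
});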
If you are using any elements with position:fixed this can get complicated, as the position:fixed coordinates are relative to the unzoomed window, whereas window coordinates are relative to the zoomed viewport. More info: How to position a fixed-location element on IOS browser when zoomed?