OSMdroid long click listener/double tap - multi-touch

I am currently using OSMdroid.
I want to be able to mark (with the touch screen) certain points on my map.
The first option is to use long clicks. The problem is that my system recognizes scrolls as long clicks.
The second option is to use a double tap, but that changes the zoom level.
I want to be able to distinguish between long clicks and scrolls. Is there a simple way to do this?
There are some suggestions in the following link:
However, I am not sure that these suggestions can be implemented with OSMdroid.
Thanks in advance!
Ariel

Note that this method only handles single-touch events.
Your activity needs to implement MapEventsReceiver.
You then need to add a MapEventsOverlay to the map with:
//Handling Map events
MapEventsOverlay mapEventsOverlay = new MapEventsOverlay(this, this);
map.getOverlays().add(0, mapEventsOverlay); //inserted at the "bottom" of all overlays
This is documented in the OSMBonusPack tutorial.
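A minimal sketch of the two MapEventsReceiver callbacks this approach requires. In a real app, GeoPoint comes from org.osmdroid.util and MapEventsReceiver from org.osmdroid.events; they are stubbed here only so the sketch stands alone, and the class name is illustrative:

```java
import java.util.ArrayList;
import java.util.List;

// Stub for org.osmdroid.util.GeoPoint (real app: import it from osmdroid).
class GeoPoint {
    final double lat, lon;
    GeoPoint(double lat, double lon) { this.lat = lat; this.lon = lon; }
}

// Stub for org.osmdroid.events.MapEventsReceiver, matching its two callbacks.
interface MapEventsReceiver {
    boolean singleTapConfirmedHelper(GeoPoint p);
    boolean longPressHelper(GeoPoint p);
}

// Hypothetical activity: marks a point on a long press, ignores single taps.
class MarkerPlacingActivity implements MapEventsReceiver {
    final List<GeoPoint> markers = new ArrayList<>();

    @Override
    public boolean singleTapConfirmedHelper(GeoPoint p) {
        return false; // let single taps fall through to the map
    }

    @Override
    public boolean longPressHelper(GeoPoint p) {
        markers.add(p); // place a marker at the long-pressed location
        return true;    // consume the event
    }
}
```

Because MapEventsOverlay relies on Android's standard gesture detection, a scroll should not be reported to longPressHelper, which addresses the scroll-vs-long-click confusion in the question.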

When I wanted to add some touch functionality to OsmMap, I also found that tap and scroll and such were already used.
So I made all my gestures require a long press with three fingers to start. The gestures were: long press with three fingers, then slide left or right; or long press with three fingers, then fling up. That way my gestures did not interfere with any normal map gestures.
My app also had to consume touch events after it detected the three-finger long press. Otherwise, the OSM gestures and my gestures would compete, causing weird scrolling and zooming.
This requires adding a special Overlay and code like this:
@Override
public boolean onTouchEvent(MotionEvent event, MapView mapView) {
    boolean detected = false;
    if (myGestureDetector.onTouchEvent(event)) {
        detected = true;
    }
    // Keep consuming ("draining") events while a gesture is in progress, so the
    // map's own scroll/zoom handling doesn't compete with the custom gestures.
    if (myGestureDetector.isTouchEventActive()) {
        Log.d(D_LOG, "TOUCH EVENT ACTIVE DRAIN IT");
        detected = true;
    }
    return detected;
}
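The custom detector itself is not shown above; one way to sketch its three-finger long-press state in plain Java is below. Timestamps are passed in explicitly so the logic does not depend on Android's MotionEvent, and all names are illustrative, not the poster's actual code:

```java
// Illustrative three-finger long-press tracker (a sketch, not the poster's
// detector). It reports "active" once three pointers have been held down for
// holdMillis, and resets as soon as a finger lifts.
class ThreeFingerLongPressTracker {
    private final long holdMillis;    // how long three fingers must stay down
    private long threeDownSince = -1; // when the third finger went down, -1 if fewer
    private boolean active = false;   // true once the long press has triggered

    ThreeFingerLongPressTracker(long holdMillis) {
        this.holdMillis = holdMillis;
    }

    // Call on every pointer-count change or move event; returns true while the
    // gesture is active, i.e. while touch events should be consumed.
    boolean onPointers(int pointerCount, long nowMillis) {
        if (pointerCount >= 3) {
            if (threeDownSince < 0) threeDownSince = nowMillis;
            if (nowMillis - threeDownSince >= holdMillis) active = true;
        } else {
            threeDownSince = -1;
            active = false; // gesture ends when a finger lifts
        }
        return active;
    }

    boolean isTouchEventActive() {
        return active;
    }
}
```

In a real overlay, onPointers would be fed from MotionEvent.getPointerCount() and event.getEventTime(); the isTouchEventActive() check mirrors the "drain it" branch in the onTouchEvent code above.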

Related

How to prevent Scroll Box from consuming touch event in Unreal?

I am working on a UI system for a mobile game in Unreal. I already have functionality in place that reads the player's initial touch location, checks whether the player moves their finger a certain pixel distance, and if so changes between pages.
I am trying to add a scroll box to the page as I find that not all of my content will fit on the screen. However, now that I have added it to my Widget, my swiping function no longer works. I am assuming this is because the scroll box is consuming the touch event. Is there a way to make it so that the scrolling does not consume the input event?
I had the same problem but unfortunately couldn't find a way to keep receiving touch events.
As a workaround, I created my own scroll box, so that I could read the finger location while scrolling.

Using pen to control scrollviewer and disable touch

For a page with a ScrollViewer, hand gestures let the user scroll up, down, and across, while the pen does nothing.
The requirement is to have a button that reverses the functionality of pen and touch.
When this button is pushed, touch can draw ink strokes on a canvas and the pen cannot. This works.
I'm not entirely sure how to proceed with these situations:
pen acting as the "gesture" input to scroll the page instead of touch.
hand drawing ink strokes on a canvas that is within a ScrollViewer.
Is there a method or attribute I can set that makes the pen/hand accept gestures?
Thanks.
So I found a suggestion (can't remember where) that I should disable zooming and scrolling. Once that was set, I called the method to allow inking. I'm not entirely sure that's the correct way of doing it, but through various testing it worked.

controlling iPhone zoom programmatically with javascript

I would like to follow my finger movement on an iPhone screen. However, this results in rubber-banding and scrolling, so I have to turn off the default behaviours.
As explained on this website
https://developer.apple.com/library/archive/documentation/AppleApplications/Reference/SafariWebContent/HandlingEvents/HandlingEvents.html
I've added event listeners, like so:
document.addEventListener('touchmove', touchMove, false);
document.addEventListener('gesturechange', gestureChange, false);
and disabled the default behaviour, like so:
function touchMove(event) { event.preventDefault(); /* other code here */ }
function gestureChange(event) { event.preventDefault(); /* other code here */ }
Now I can do what I intended, but I can no longer scale the page. I'm still able to retrieve the touchstart coordinates and a zoom factor from gesturechange. Logically, I would like to use those to change the page zoom programmatically. How can I do that with JavaScript?
So far I have had some success with applying the event listener to a div (instead of the document) and turning off the touchmove call using a boolean once gesturestart is detected. Actually this works pretty well. I can zoom, pan and double tap on the whole document, and zoom and double tap on the div. But a pan on the div executes a function to pass the coordinates (and does not pan).

Handling touch events on UIImageViews

I have a UIImageView, above which I have multiple markers. All the markers are movable, i.e. when I touch a marker, I can move it all around the screen.
Problem: if I touch a marker and begin to move it somewhere else, and I cross another marker, the old marker is left in place of the new one and I continue to move the new one. I want to avoid this somehow.
Thanks for the answer.
What are your markers? If they are UIView instances then I would suggest watching for touch events on them instead of the image view and then deal with the Z order while dragging.
I would also pay attention to touch up vs. touch moved to help with the issue of your stops and starts.
If they are not UIView instances then the issue sounds like it is with your touch down vs moved. i.e. you should keep track of what marker was touched on the down event and then ignore any events that hit another marker that are not down events until you get an up.
If these suggestions are not helpful then describing how you built your view structure would help.
Z-Order
When you receive a touch-down event on a marker, I would move it to the top of the z-order of views within the parent. That will allow it to visually slide over the other markers.

Position both the safari "touches" from a gesture event?

I want to get the position (relative or otherwise) of the two fingers/touches inside a gesture event (gesturestart, gesturechange, gestureend) on mobile Safari (iOS: iPad/iPhone). I know that the gesture* events don't actually provide this in the event args, but I thought there might be a 'trick' to getting this info. It would be great to get the coords of the fingers while tracking a gesture (e.g. scaling and moving an object in the same gesture).
Can this be done?
It turns out that this information is not directly available via the 'gesture' events. The touch events are the key: I was able to get the touches collection and use the first two touches to derive a delta/midpoint of the two sets of coords. This seems to work.
There are three arrays in the returned event object for touch events:
touches
targetTouches
changedTouches
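The delta/midpoint arithmetic mentioned above is just averaging and differencing the two coordinate pairs. Sketched here in Java to match the code style of the main thread; in mobile Safari the inputs would come from event.touches[0] and event.touches[1] (clientX/clientY):

```java
// Midpoint and separation distance of two touch points. Inputs are plain
// doubles so the arithmetic stands alone; in Safari they would be read from
// the first two entries of the event's touches array.
class TwoTouchMath {
    // Midpoint of the two fingers, e.g. the anchor for panning an object.
    static double[] midpoint(double x1, double y1, double x2, double y2) {
        return new double[] { (x1 + x2) / 2.0, (y1 + y2) / 2.0 };
    }

    // Distance between the fingers; dividing the current distance by the
    // initial distance gives a scale factor for pinch-zoom.
    static double distance(double x1, double y1, double x2, double y2) {
        double dx = x2 - x1, dy = y2 - y1;
        return Math.sqrt(dx * dx + dy * dy);
    }
}
```

Tracking both values per touchmove event lets you scale and move an object in the same gesture, as the question asks.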
I can't remember where I originally found this info, but a quick Google search brings up http://www.sitepen.com/blog/2008/07/10/touching-and-gesturing-on-the-iphone/
Got it! https://developer.apple.com/library/archive/documentation/AppleApplications/Reference/SafariWebContent/HandlingEvents/HandlingEvents.html Down at "Handling Multi-Touch Events"