Getting the positions of both Safari "touches" from a gesture event? - iPhone

I want to get the position (relative or otherwise) of the two fingers/touches inside a gesture event (gesturestart, gesturechange, gestureend) on mobile Safari (iOS: iPad/iPhone). I know that the gesture* events don't actually provide this in the event args, but I thought there might be a 'trick' to getting this info. It would be great to get the coords of the fingers while tracking a gesture (e.g. scaling and moving an object in the same gesture).
Can this be done?

It turns out that this information is not directly available via the 'gesture' events. The touch events are the key: I was able to get the touches collection and use the first two touches to derive a delta/midpoint of the two sets of coords. This seems to work.

There are three touch lists on the event object for touch events:
touches
targetTouches
changedTouches
I can't remember where I originally found this info, but a quick Google search brings up http://www.sitepen.com/blog/2008/07/10/touching-and-gesturing-on-the-iphone/
Got it! https://developer.apple.com/library/archive/documentation/AppleApplications/Reference/SafariWebContent/HandlingEvents/HandlingEvents.html Down at "Handling Multi-Touch Events"
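The derivation described in the accepted answer can be sketched as follows. This is a minimal sketch, not the original poster's code; the math helper is pulled out so it stands on its own:

```javascript
// Derive the midpoint and finger distance from the first two
// touches. t0/t1 are Touch objects (anything with pageX/pageY).
function pinchInfo(t0, t1) {
  const dx = t1.pageX - t0.pageX;
  const dy = t1.pageY - t0.pageY;
  return {
    midX: (t0.pageX + t1.pageX) / 2,
    midY: (t0.pageY + t1.pageY) / 2,
    distance: Math.sqrt(dx * dx + dy * dy),
  };
}

// touchmove fires alongside gesturechange, and event.touches
// holds every finger currently on the screen.
function onTouchMove(event) {
  if (event.touches.length < 2) return;
  event.preventDefault(); // stop Safari's default pan/zoom
  const info = pinchInfo(event.touches[0], event.touches[1]);
  // Use info.midX / info.midY to move the object, and
  // info.distance (relative to its starting value) to scale it.
}
```

Register onTouchMove for the touchmove event on your element; comparing each new distance to the distance captured on touchstart gives you the scale factor, while the midpoint gives you the translation, so one gesture can scale and move at once.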

Related

How to prevent Scroll Box from consuming touch event in Unreal?

I am working on a UI system for a mobile game in Unreal. I already have swipe detection in place: it records the player's initial touch location, checks whether the finger has moved a certain pixel distance, and then changes between pages.
I am trying to add a scroll box to the page as I find that not all of my content will fit on the screen. However, now that I have added it to my Widget, my swiping function no longer works. I am assuming this is because the scroll box is consuming the touch event. Is there a way to make it so that the scrolling does not consume the input event?
I had the same problem, but unfortunately I couldn't find a way to keep receiving the touch events.
As a workaround, I created my own scroll box, so that I could read the finger location while scrolling.

Handling touch events on UIImageViews

I have a UIImageView, above which I have multiple markers. All the markers are movable, i.e. when I touch a marker, I can move it all around the screen.
Problem: if I touch a marker and begin moving it, and my finger crosses another marker, the drag switches to that new marker and the old one is dropped in place. I want to avoid this somehow.
Thanks for the answer.
What are your markers? If they are UIView instances, then I would suggest watching for touch events on them instead of on the image view, and then dealing with the Z order while dragging.
I would also pay attention to touch up vs. touch moved to help with the issue of your stops and starts.
If they are not UIView instances, then the issue sounds like it is with your touch down vs. touch moved handling, i.e. you should keep track of which marker was touched on the down event and then ignore any events that hit another marker until you get an up.
If these suggestions are not helpful then describing how you built your view structure would help.
Z-Order
When you receive a touch-down event on a marker, I would move it to the top of the z order of views within the parent. This will allow it to visually slide over the other markers.
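The "capture the marker on the down event and ignore everything else until the up" idea above is platform-agnostic; here is a minimal sketch of that state machine in JavaScript (the marker records and field names are illustrative, not UIKit API):

```javascript
// Minimal drag state machine: only the marker hit on the down
// event receives moves, no matter which markers the finger
// crosses later. Markers are plain {x, y, zIndex} records.
function createDragTracker() {
  let active = null; // marker captured by the current touch
  return {
    down(marker) {
      active = marker;      // capture on touch-down only
      marker.zIndex = 1000; // raise it so it slides over others
    },
    move(x, y) {
      if (!active) return;  // ignore moves with no capture
      active.x = x;         // moves never re-target another marker
      active.y = y;
    },
    up() {
      if (active) active.zIndex = 0; // restore the z order
      active = null;
    },
  };
}
```

Because move() only ever writes to the marker captured in down(), crossing a second marker mid-drag cannot hand the drag over to it.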

Possible to count the number of touches on iPhone?

I want to count the number of touches in one second. So, if three touches were detected in one second, do this; if five were detected, do that. I wanted to know if this can be done in touchesBegan.
All I want to do is execute different methods according to how fast the user touches the screen.
Thanks for your help.
I would suggest you use a UITapGestureRecognizer and then handle the taps accordingly depending on how many were detected. It is capable of detecting individual taps as well as how many fingers were set down.
Here's the documentation on it:
http://developer.apple.com/library/ios/#documentation/uikit/reference/UITapGestureRecognizer_Class/Reference/Reference.html
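If you do want to count the taps yourself from the raw touch-down events, the timing logic is just a sliding one-second window of tap timestamps. A minimal, platform-agnostic sketch in JavaScript (the function names are illustrative; on iOS you would call the equivalent from touchesBegan):

```javascript
// Count how many taps landed within the last second and let the
// caller branch on that count. Timestamps are in milliseconds.
function createTapCounter(windowMs = 1000) {
  const taps = [];
  return function recordTap(now) {
    taps.push(now);
    // Drop taps that have fallen out of the one-second window.
    while (taps.length && now - taps[0] > windowMs) taps.shift();
    return taps.length; // taps seen in the last second
  };
}
```

Call recordTap(Date.now()) from your tap handler and branch on the returned count (three taps in the window runs one method, five runs another).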

Is there a class / method to handle dragging views?

I found a useful tutorial to get started in understanding how Cocoa handles touch events, and I've used it as a base to create a custom script. I'm trying to make a UIView draggable, very similar to the native Maps application.
I've written a custom script: using the touchesBegan method it captures where the input began and compares it to the centre point of the UIView using some conditional statements. The touchesMoved method does some further conditional statements to determine whether the touch start point and the center of the view will move positively or negatively. I've also captured the view's boundaries so it doesn't go too far out.
It's lacking the polished finish found in other applications such as Maps, or scrolling a UITable: there is no ease-out effect after the user has released their finger, and the snapping when it reaches the boundaries is horrible.
Is there a method that takes a view and makes it draggable like this? If not I'll continue to refine my script.
Many thanks!
Maybe you are looking for UIScrollView
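If you do keep refining the hand-rolled approach rather than switching to UIScrollView, the boundary handling described above reduces to clamping the proposed origin against the parent's bounds. A minimal sketch (the plain {width, height} records are illustrative, not UIKit types):

```javascript
// Clamp a dragged view's proposed origin so the view stays fully
// inside its parent. view and parent are {width, height} records.
function clampOrigin(x, y, view, parent) {
  const maxX = parent.width - view.width;
  const maxY = parent.height - view.height;
  return {
    x: Math.min(Math.max(x, 0), maxX),
    y: Math.min(Math.max(y, 0), maxY),
  };
}
```

Applying this on every move event keeps the view in bounds without a visible snap; easing the position toward the clamped value over a few frames (rather than jumping) is what gives the softer feel seen in Maps.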

iPhone -- possible to tell the difference between a fingertip and a whole fingerpad?

Is it possible to detect exactly how much of the finger is in contact with the screen? Say I wanted to make a fingerprinting app: how would I detect the outline of a person's finger?
No. The UITouch system does a lot of processing to reduce the larger touched area to a single point location for each touch. This is meant to aid the user, since there can be some difference between where one thinks one is touching and where the screen is actually touched.