Is there a way to detect a finger's non-movement by using a combination of UITouch events?
The event methods touchesEnded and touchesCancelled are only fired when the touch is cancelled or the finger is lifted. I would like to know when a touch has stopped moving while the finger is still on the screen.
Simply use the following UITouch property:
UITouchPhase phase;
If its value is UITouchPhaseStationary, the finger has not moved on the screen since the last event was received. This implies that you got the related touch in
touchesBegan:withEvent:
and then the user simply does not move his/her finger.
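For instance, a minimal sketch of checking the phase, assuming a UIView subclass that overrides the touch callbacks (the logging is only for illustration):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Inspect every touch the event knows about, not just the ones that moved.
    for (UITouch *touch in [event allTouches]) {
        if (touch.phase == UITouchPhaseStationary) {
            // This finger is still on the screen but has not moved since the last event.
            NSLog(@"Touch at %@ is stationary", NSStringFromCGPoint([touch locationInView:self]));
        }
    }
}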
Related
In my iPad app I am dragging an object. For the dragging I need two methods:
touchesBegan: some stuff1
touchesMoved: some stuff2
Sometimes the user does not move the object after touching it, so whatever stuff1 did needs to be reversed. How would I do that? Is there any event or notification I can use when the user triggers touchesBegan but not touchesEnded?
You will always receive a touchesEnded:withEvent: message or a touchesCancelled:withEvent: message after you have received a touchesBegan:withEvent: message. You need to override both methods if you want to know when the user has lifted his finger.
If you want to track whether the user moved the touch before lifting his finger, you have to do that yourself. You can set a flag in your touchesMoved:withEvent: method, or you can save the original position of the touch in touchesBegan:withEvent: and then compare it with the final position of the touch in touchesEnded:withEvent: and touchesCancelled:withEvent:.
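A minimal sketch of the flag approach, assuming a UIView subclass with assumed startPoint and touchDidMove properties and a hypothetical undoStuff1 method that reverses whatever stuff1 did:
// Assumed declarations in a class extension:
// @property (nonatomic) CGPoint startPoint;
// @property (nonatomic) BOOL touchDidMove;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    self.startPoint = [touch locationInView:self];
    self.touchDidMove = NO;
    // ... stuff1 ...
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchDidMove = YES;
    // ... stuff2 ...
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.touchDidMove) {
        [self undoStuff1]; // hypothetical: reverse whatever stuff1 did
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Treat a cancelled touch the same as an ended one for this purpose.
    [self touchesEnded:touches withEvent:event];
}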
You also have a - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event method; use it to check whether the user made any moves.
In the aurioTouch example project, the application delegate's code indicates (and I've read elsewhere) that the touch event object passed to touchesBegan, touchesMoved, and touchesEnded will be the same object throughout a single set of user actions, such as touching and moving a finger. When I subclass UIScrollView and implement these methods, the events I get back are different objects. What am I missing here?
The UITouch objects will be the same, but they are packaged in a new event.
You are right that the UIEvent is reused when delivering touch events for one gesture. From the docs:
A UIEvent object representing a touch event is persistent throughout a multi-touch sequence; UIKit reuses the same UIEvent instance for every event delivered to the application. You should never retain an event object or any object returned from an event object. If you need to keep information from an event around from one phase to another, you should copy that information from the UITouch or UIEvent object.
I presume the difference in behavior in your case results from the special event handling done by UIScrollView. Scroll views delay event delivery because they need to detect the user's intent to scroll (swipe gestures), so they must keep UIEvents around, probably copying them to preserve their original state. This might be the reason you see different objects.
Note that all of the above is only guessing.
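In line with the documentation quoted above, a safe pattern is to copy the values you need out of the UITouch rather than holding on to the UIEvent itself. A minimal sketch, assuming a UIView subclass with an assumed initialTouchPoint CGPoint property:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Copy the value we need now; do not retain the UIEvent or UITouch for later phases.
    self.initialTouchPoint = [touch locationInView:self]; // assumed CGPoint property
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint endPoint = [touch locationInView:self];
    CGFloat dx = endPoint.x - self.initialTouchPoint.x;
    CGFloat dy = endPoint.y - self.initialTouchPoint.y;
    NSLog(@"Finger travelled (%g, %g) over the gesture", dx, dy);
}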
I was trying to implement palm-rejection functionality for a drawing app I developed for iPhone and noticed some weird behavior in touch events. When I place my palm on the screen and continuously lift some region of my hand up and then lower it down again, I get lots of touchesBegan events but only a few touchesEnded events. Is there something I don't know about the touch-handling mechanism of iOS? Shouldn't the number of touchesEnded and touchesBegan events belonging to each UITouch object be equal?
There is only one view on my window and it occupies the entire screen. Both the view and the window have multitouch enabled. I'm counting the events by printing the number of touches with NSLog at the beginning of the touchesBegan and touchesEnded methods, so I'm taking into account the fact that a single event may contain info about multiple touches.
Don't forget to provide a handler for touchesCancelled events. You can get a touchesCancelled call after a touchesBegan and without a matching touchesEnded event.
Rather than looking at the number of touchesBegan:withEvent: and touchesEnded:withEvent: calls, you should look at the NSSet of UITouch objects passed to those methods. So, for example, if you placed one finger then a second finger on the screen, you'd get two touchesBegan:withEvent: calls. If you lifted both fingers from the screen simultaneously, you'd get a single touchesEnded:withEvent: call; the NSSet of UITouch objects passed in would indicate that two fingers were lifted.
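A minimal sketch combining the two suggestions above, counting lifted and cancelled touches via the NSSet rather than via the number of calls (logging only, assuming a plain UIView subclass):
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Several fingers lifted at once arrive as a single call carrying several UITouch objects.
    NSLog(@"touchesEnded: %lu touch(es) lifted", (unsigned long)[touches count]);
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Cancelled touches never produce touchesEnded, so count them here as well.
    NSLog(@"touchesCancelled: %lu touch(es) cancelled", (unsigned long)[touches count]);
}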
I'm writing an iPhone app and I want to handle multitouch. I'm using the cocos2d libs, so I've made a CCLayer subclass and set it to be a CCStandardTouchDelegate. For various reasons I don't want to use UIGestureRecognizer, and to build the logic correctly I need answers to these questions:
If I tap the screen with one finger, and then with another, how many touches will be caught in ccTouchesBegan?
If I tap the screen with two fingers and then move only one of them, how many touches will be caught in ccTouchesMoved?
The best thing to do when you have a question like this is just to implement the callbacks, and in the implementation, log the parameters. For example:
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Log everything (there will be repetition because the event contains the set of touches):
    NSLog(@"ccTouchesBegan: touches = %@; event = %@", touches, event);
    // Or, just log the number of touches to simplify the output:
    NSLog(@"ccTouchesBegan: %lu touches", (unsigned long)[touches count]);
    return kEventHandled;
}
Then just run your app and experiment, watching the log. You'll learn more this way (and faster) than you will by asking here.
But to answer your specific questions:
You should get one call to ccTouchesBegan for each tap (even if the first finger is still down when the second tap occurs). If the two fingers hit simultaneously, you'll get one call with two touches.
You'll get repeated calls to ccTouchesMoved each time one or more of the fingers moves. If only one finger is moving, each call will be passed a single touch. Stationary fingers will not generate events until they are moved or lifted.
Of course, remember to set isTouchEnabled = YES for your CCLayer or you won't get any callbacks at all.
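For instance, a minimal init for the CCLayer subclass might look like this (written against cocos2d 1.x; newer versions spell the property differently):
- (id)init
{
    if ((self = [super init])) {
        // Without this, the ccTouches* callbacks are never delivered to this layer.
        self.isTouchEnabled = YES; // cocos2d 1.x; some versions use self.touchEnabled instead
    }
    return self;
}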
I'm looking for a way to get all the current touches even if no event has occurred. I know about [UIEvent allTouches] but I need to be able to see "these are all the touches on the screen" even if none of them has changed. It seems like it should be possible because allTouches can access touches which haven't been updated, so the phone is tracking them.
Override touchesBegan and touchesEnded, and compare the current point against the cached point from the event.
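A minimal sketch of that caching idea, assuming the view keeps a hypothetical activeTouchPoints property (an NSMutableDictionary created elsewhere) keyed by non-retained references to the UITouch objects:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        NSValue *key = [NSValue valueWithNonretainedObject:touch];
        self.activeTouchPoints[key] = [NSValue valueWithCGPoint:[touch locationInView:self]];
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Same update as touchesBegan, so the cache always holds the latest point per finger.
    for (UITouch *touch in touches) {
        NSValue *key = [NSValue valueWithNonretainedObject:touch];
        self.activeTouchPoints[key] = [NSValue valueWithCGPoint:[touch locationInView:self]];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        [self.activeTouchPoints removeObjectForKey:[NSValue valueWithNonretainedObject:touch]];
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesEnded:touches withEvent:event];
}
At any moment, activeTouchPoints then holds the last known location of every finger currently on the screen.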
Try hitTest:withEvent: and filter with pointInside:withEvent:.