I have two overlapping custom views that both need to receive touch events (e.g. touchesBegan and touchesMoved). However, I can only get one of the views (the top one) to receive the events. I have tried forwarding the events from one view to the other using:
[otherView touchesEnded:touches withEvent:event];
but this does not always work.
I need the touch events to be sent to the two views simultaneously. Can anyone help?
If you intercept a touch, you should usually call [super methodYouAreIntercepting] at the end of the method if you still want the touch to go through to the next layer. If you do this, and the two views are directly on top of each other, then you don't need to manually forward the touches the way you have been doing. Because your comment above suggests that you haven't been calling super in the method, I'd bet this will solve your issue.
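For example, a minimal sketch of that pattern (TopView is a hypothetical UIView subclass name used only for illustration):

#import <UIKit/UIKit.h>

@interface TopView : UIView
@end

@implementation TopView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // ... handle the touch in this view first ...
    [super touchesBegan:touches withEvent:event]; // then let it continue to the next responder
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // ... handle the movement ...
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // ... handle the end of the touch ...
    [super touchesEnded:touches withEvent:event];
}

@end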
I have successfully been monitoring raw touches using GSEvent by hooking sendEvent. How do I extract touch information when multiple fingers are involved?
Update 1: iOS 5.01
Update 2: I managed to do this by iterating over the allTouches set contained in the event that is passed in. It works fine, but bogs down when gesture recognizers kick in for a 4- or 5-finger event.
You are kind of right. By overriding the sendEvent: method and then getting the GSEvent from the UIEvent, you can reach the underlying system information you are looking for. You could watch the "infoSize" field in the GSEvent record, which should tell you how many touches are involved in the event... But why use GSEvent at all? You could just put one big UIView in your application, set its multipleTouchEnabled property to YES, override sendEvent: (which is declared on UIWindow and UIApplication), and you should get every touch there, even the 4- and 5-finger gestures. You can forward the touches that don't matter to you and handle the ones that do.
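A minimal sketch of that idea, assuming the interception is done in a UIWindow subclass (TouchLoggingWindow is a hypothetical name), since sendEvent: is declared on UIWindow and UIApplication:

#import <UIKit/UIKit.h>

@interface TouchLoggingWindow : UIWindow
@end

@implementation TouchLoggingWindow

- (void)sendEvent:(UIEvent *)event
{
    if (event.type == UIEventTypeTouches) {
        // allTouches contains every finger currently involved in the event,
        // including 4- and 5-finger gestures.
        NSSet *touches = [event allTouches];
        NSLog(@"%lu touches in this event", (unsigned long)[touches count]);
    }
    // Always call super so UIKit continues normal delivery.
    [super sendEvent:event];
}

@end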
Hope this helps.
In the aurioTouch example project's application delegate, the code indicates (and I've read elsewhere) that the touch event object passed to touchesBegan, touchesMoved, and touchesEnded will be the same object for as long as it belongs to a single set of user actions, such as touching and moving a finger. When I subclass UIScrollView and implement these methods, the events I get back are different objects. What am I missing here?
The UITouch objects will be the same, but they are packaged in a new event.
You are right that the UIEvent is reused when delivering touch events for one gesture. From the docs:
A UIEvent object representing a touch event is persistent throughout a multi-touch sequence; UIKit reuses the same UIEvent instance for every event delivered to the application. You should never retain an event object or any object returned from an event object. If you need to keep information from an event around from one phase to another, you should copy that information from the UITouch or UIEvent object.
I presume the difference in behavior in your case comes from the special event handling done by UIScrollView. Scroll views delay event delivery because they need to detect the user's intent to scroll (swipe gestures), so they need a way of keeping UIEvents around, probably by copying them to make sure they retain their original state. This might be why you see different objects.
Note that all of the above is only guessing.
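As an illustration only (TrackingView is a hypothetical class, not anything from aurioTouch), the documented guidance looks roughly like this: copy the values you need rather than holding on to the event or touch objects.

#import <UIKit/UIKit.h>

@interface TrackingView : UIView {
    CGPoint _startPoint; // a copied value, safe to keep across touch phases
}
@end

@implementation TrackingView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Copy the data you need; never retain the UIEvent or UITouch themselves.
    _startPoint = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self];
    NSLog(@"Moved %.1f points horizontally since the touch began", current.x - _startPoint.x);
}

@end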
I was trying to implement palm-rejection functionality for a drawing app I developed for iPhone and noticed some weird behavior in touch events. When I place my palm on the screen and repeatedly lift part of my hand up and then lower it again, I get lots of touchesBegan events but only a few touchesEnded events. Is there something I don't know about the touch-handling mechanism of iOS? Shouldn't the number of touchesEnded and touchesBegan events belonging to each UITouch object be equal?
There is only one view on my window and it occupies the entire screen. Both the view and the window have multitouch enabled. I'm counting the events by printing the number of touches with NSLog calls at the beginning of the touchesBegan and touchesEnded methods, so I am taking into account the fact that a single event may contain information about multiple touches.
Don't forget to provide a handler for touchesCancelled events. You can get a touchesCancelled call after a touchesBegan and without a matching touchesEnded event.
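For example, a minimal sketch inside the drawing view (assuming you want cancelled touches cleaned up the same way as ended ones):

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // UIKit cancels touches when something like a system gesture or an
    // incoming call interrupts the sequence; no touchesEnded will follow,
    // so reuse the same cleanup path.
    [self touchesEnded:touches withEvent:event];
}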
Rather than looking at the number of touchesBegan:withEvent: and touchesEnded:withEvent: calls, you should look at the NSSet of UITouch objects passed to those methods. So, for example, if you placed one finger then a second finger on the screen, you'd get two touchesBegan:withEvent: calls. If you lifted both fingers from the screen simultaneously, you'd get a single touchesEnded:withEvent: call; the NSSet of UITouch objects passed in would indicate that two fingers were lifted.
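A sketch of that bookkeeping, assuming a hypothetical _activeTouches ivar (an NSUInteger) in your view: count the UITouch objects in each set rather than the number of calls.

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    _activeTouches += [touches count];
    NSLog(@"began: +%lu (now %lu down)", (unsigned long)[touches count], (unsigned long)_activeTouches);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    _activeTouches -= [touches count];
    NSLog(@"ended: -%lu (now %lu down)", (unsigned long)[touches count], (unsigned long)_activeTouches);
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Cancelled touches also leave the screen; account for them here too.
    _activeTouches -= [touches count];
}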
I'm writing an iPhone app and I want to handle multitouch. I'm using the cocos2d libs, so I've made a CCLayer subclass and set it up as a CCStandardTouchDelegate. For various reasons I don't want to use UIGestureRecognizer, and to build the correct logic I need to know the answers to these questions:
If I tap the screen with one finger and then with another one, how many touches will be reported in ccTouchesBegan?
If I tap the screen with two fingers and then move only one of them, how many touches will be reported in ccTouchesMoved?
The best thing to do when you have a question like this is just to implement the callbacks, and in the implementation, log the parameters. For example:
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Log everything (there will be repetition because the event contains the set of touches):
    NSLog(@"ccTouchesBegan: touches = %@; event = %@", touches, event);

    // Or, just log the number of touches to simplify the output:
    NSLog(@"ccTouchesBegan: %lu touches", (unsigned long)[touches count]);

    return kEventHandled;
}
Then just run your app and experiment, watching the log. You'll learn more this way (and faster) than you will by asking here.
But to answer your specific questions:
You should get one call to ccTouchesBegan for each tap (even if the first finger is still down when the second tap occurs). If the two fingers hit simultaneously, you'll get one call with two touches.
You'll get repeated calls to ccTouchesMoved each time one or more of the fingers moves. If only one finger is moving, each call will be passed a single touch. Stationary fingers will not generate events until they are moved or lifted.
Of course, remember to set isTouchEnabled = YES for your CCLayer or you won't get any callbacks at all.
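For example, in your layer's init (the usual cocos2d-iphone pattern):

- (id)init
{
    if ((self = [super init])) {
        self.isTouchEnabled = YES; // required, or the ccTouches... callbacks are never invoked
    }
    return self;
}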
So when I see ccTouchesBegan (or touchesBegan, for that matter) I usually see something like this:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
}
But what I am not getting is how you detect whether one particular object has been touched. For example, how would I check whether a specific CCSprite I declared has been touched?
Sorry if this is a newbish question, but I just don't understand; if you need more clarification, just ask and I'll try to explain myself better.
I'm not familiar with cocos2d, but in the standard API the touches are sent first to the view that was touched and then up the responder chain to a view that has a controller. If that controller does not handle the touch, it goes up to the next view until it ends up at the window object.
See Responder Objects in the Responder Chain
The best place to trap touches for specific objects is in the objects themselves. In the case of a sprite-like view, the sprite itself most likely needs to respond to the touch, e.g. by moving itself. If you need the touch to be communicated to another object, you should use the delegate pattern so that the sprite can tell its delegate how it's been touched.
That last sentence sounded weird.
I don't have the samples in front of me, but there should be an example in the cocos2d download package that demonstrates a touch event and how it propagates down to sprites.
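In the meantime, here is a hedged sketch of one common cocos2d-iphone approach: convert the touch location into cocos2d (OpenGL) coordinates and test it against the sprite's bounding box. mySprite is assumed to be a CCSprite ivar you created elsewhere, and depending on your cocos2d version the callback may return BOOL instead of void.

- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];

    if (CGRectContainsPoint([mySprite boundingBox], location)) {
        NSLog(@"mySprite was touched");
    }
}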