Why is the UIEvent a different object in touchesBegan and touchesMoved?

In the aurioTouch example project's application delegate, the code indicates (and I've read elsewhere) that the touch event object passed to touchesBegan, touchesMoved, and touchesEnded will be the same object throughout a single set of user actions, such as touching and moving a finger. When I subclass UIScrollView and implement these methods, the events I get back are different objects. What am I missing here?

The UITouch objects will be the same, but they are packaged in a new event.

You are right that the UIEvent is reused when delivering touch events for one gesture. From the docs:
A UIEvent object representing a touch event is persistent throughout a multi-touch sequence; UIKit reuses the same UIEvent instance for every event delivered to the application. You should never retain an event object or any object returned from an event object. If you need to keep information from an event around from one phase to another, you should copy that information from the UITouch or UIEvent object.
I presume the difference in behavior in your case results from the special event handling done by UIScrollView. Scroll views delay event delivery because they need to detect the user's scrolling intent (swipe gestures), so they have to keep UIEvents around somehow, probably by copying them to make sure they retain their original state. This might be why you see different objects.
Note that all of the above is only a guess.
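
Following the documentation quoted above, here is a minimal sketch of copying what you need out of the event instead of retaining it (the ivars startPoint and startTime are illustrative, not from the question):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Copy the values you care about; never retain the UIEvent or UITouch.
    startPoint = [touch locationInView:self];  // CGPoint ivar (illustrative)
    startTime  = event.timestamp;              // NSTimeInterval ivar (illustrative)
}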

Related

GSEvent and multiple fingers

I have successfully monitored raw touches using GSEvent by hooking sendEvent. How do I extract touch information when multiple fingers are involved?
Update 1: iOS 5.0.1
Update 2: I managed to do this by going over the allTouches set contained in the event passed. It works fine, but bogs down when gesture recognizers kick in for a 4- or 5-finger event.
You are kind of right. By overriding the sendEvent: method and then getting the GSEvent from the UIEvent, you can get the underlying system information you are looking for. You could watch the "infoSize" field in the GSEvent record, which should tell you how many touches are involved in the event... But why use GSEvent? You could just put one big UIView in your application, set its multiple-touch interaction property to YES, override sendEvent:, and you should get every touch in there, even the 4- and 5-finger gestures. You can forward the touches that are not important to you and keep the ones that are.
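
As a rough sketch of the non-GSEvent route: note that sendEvent: is documented on UIWindow (not UIView), so a window subclass is a safe place to observe every UITouch. The class name here is illustrative:

@interface TouchMonitorWindow : UIWindow
@end

@implementation TouchMonitorWindow

- (void)sendEvent:(UIEvent *)event
{
    // allTouches contains every UITouch in the event,
    // including 4- and 5-finger gestures.
    for (UITouch *touch in [event allTouches]) {
        NSLog(@"phase %ld at %@", (long)touch.phase,
              NSStringFromCGPoint([touch locationInView:self]));
    }
    [super sendEvent:event];  // forward so normal delivery continues
}

@end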
Hope this helps.

Forcing Touches to Multiple Views

I have two overlapping custom views that both need to receive touch events (e.g. touchesBegan and touchesMoved). However, I can only get one of the views (the top one) to receive the events. I have tried forwarding the events from one view to the other using:
[otherView touchesEnded:touches withEvent:event];
but this does not always work.
I need the touch events to be sent to the two views simultaneously. Can anyone help?
If you intercept a touch, you should usually call the super implementation (e.g. [super touchesBegan:touches withEvent:event]) at the end of the method if you still want the touch to go through to the next layer. If you do this, and the two views are directly on top of each other, then you don't need to manually forward the touches the way you have been doing. Since your comment above suggests that you haven't been calling super in these methods, I bet this will solve your issue.
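
A minimal sketch of that pattern, assuming a custom UIView subclass (the class name is illustrative):

@interface PassthroughView : UIView
@end

@implementation PassthroughView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // ... react to the touch here ...
    // Then let it continue up the responder chain instead of swallowing it.
    [super touchesBegan:touches withEvent:event];
}

@end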

Missing touchEnd events on iPhone

I was trying to implement palm-rejection functionality for a drawing app I developed for iPhone and noticed some weird behavior in touch events. When I place my palm on the screen and continuously lift some region of my hand up and then lower it down again, I get lots of touchesBegan events but only a few touchesEnded events. Is there something I don't know about the touch-handling mechanism of iOS? Shouldn't the number of touchesEnded and touchesBegan events belonging to each UITouch object be equal?
There is only one view on my window, and it occupies the entire screen. Both the view and the window have multitouch enabled. I'm counting the events by printing the number of touches with NSLog calls at the beginning of the touchesBegan and touchesEnded methods, so I'm taking into account the fact that a single event may contain info about multiple touches.
Don't forget to provide a handler for touchesCancelled events. You can get a touchesCancelled call after a touchesBegan and without a matching touchesEnded event.
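For example, a sketch that routes cancelled touches through the same cleanup as ended ones, so the per-touch bookkeeping stays balanced:

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // The system can cancel touches (e.g. for an incoming call or a system
    // gesture), so treat them like ended touches for counting purposes.
    [self touchesEnded:touches withEvent:event];
}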
Rather than looking at the number of touchesBegan:withEvent: and touchesEnded:withEvent: calls, you should look at the NSSet of UITouch objects passed to those methods. So, for example, if you placed one finger then a second finger on the screen, you'd get two touchesBegan:withEvent: calls. If you lifted both fingers from the screen simultaneously, you'd get a single touchesEnded:withEvent: call; the NSSet of UITouch objects passed in would indicate that two fingers were lifted.
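Counting the UITouch objects rather than the method calls might look like this sketch (logging only):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // One call can report several new touches; count the set, not the call.
    NSLog(@"%lu touch(es) began", (unsigned long)[touches count]);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"%lu touch(es) ended", (unsigned long)[touches count]);
}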

Not sure how to use ccTouchesBegan to do what I want it to do

So when I see ccTouchesBegan (or touchesBegan, for that matter) I usually see something like this:
- (void)ccTouchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{
UITouch* touch = [touches anyObject];
}
But what I am not getting is: how do you detect whether one particular object has been touched? For example, how would I check if a specific CCSprite I declared has been touched?
Sorry if this is a newbie question, but I just don't understand; if you need more clarification, just ask.
I'm not familiar with cocos2d, but in the standard API the touches are sent first to the view touched and then up the responder chain to a view that has a controller. If that controller does not handle the touch, it goes up to the next view until it ends up at the window object.
See Responder Objects in the Responder Chain
The best place to trap touches for specific objects is in the objects themselves. In the case of a sprite-like view, the sprite itself most likely needs to respond to the touch, e.g. by moving itself. If you need the touch to be communicated to another object, you should use the delegate pattern so that the sprite can tell its delegate how it's been touched.
That last sentence sounded weird.
I don't have the samples in front of me, but there should be an example in the Cocos2D download package that demonstrates a touch event and how it propagates down to sprites.
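
For reference, the common hit-test idiom in cocos2d 1.x looks roughly like this sketch, where mySprite is an illustrative CCSprite ivar:

- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Convert from UIKit view coordinates to cocos2d's GL coordinate space.
    CGPoint location = [[CCDirector sharedDirector]
                           convertToGL:[touch locationInView:[touch view]]];
    if (CGRectContainsPoint([mySprite boundingBox], location)) {
        // mySprite was touched
    }
}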

Function that processes touches

I have a UITableView and I want to detect double touches on a UITableViewCell.
I wanted to create a method that would be called from the main app cycle and process touches (i.e. keep in memory the times of the last two touches; if the interval between them is less than 0.5 seconds, it is a double touch).
Is there a better way to achieve this?
UITouch already handles timing double-touches for you; it's in the -tapCount property of the UITouch object. Look at Event Delivery for some generic code for managing it.
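A minimal sketch using tapCount, e.g. in a UITableViewCell subclass:

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch.tapCount == 2) {
        // Double tap detected; UIKit handled the timing for us.
    }
    [super touchesEnded:touches withEvent:event];  // keep normal cell behavior
}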