iPhone multitouch handling

I'm writing an iPhone app and I want to handle multitouch. I'm using the cocos2d libs, so I've made a CCLayer subclass and set it to be a CCStandardTouchDelegate. For reasons of my own I don't want to use UIGestureRecognizer, and to build the logic correctly I need to know the answers to these questions:
If I tap the screen with one finger and then with another one, how many touches will be reported in ccTouchesBegan?
If I tap the screen with two fingers and then move only one of them, how many touches will be reported in ccTouchesMoved?

The best thing to do when you have a question like this is just to implement the callbacks, and in the implementation, log the parameters. For example:
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Log everything (there will be repetition because the event contains the set of touches):
    NSLog(@"ccTouchesBegan: touches = %@; event = %@", touches, event);
    // Or, just log the number of touches to simplify the output:
    NSLog(@"ccTouchesBegan: %d touches", (int)[touches count]);
    return kEventHandled;
}
Then just run your app and experiment, watching the log. You'll learn more this way (and faster) than you will by asking here.
But to answer your specific questions:
You should get one call to ccTouchesBegan for each tap (even if the first finger is still down when the second tap occurs). If the two fingers hit simultaneously, you'll get one call with two touches.
You'll get repeated calls to ccTouchesMoved each time one or more of the fingers moves. If only one finger is moving, each call will be passed a single touch. Stationary fingers will not generate events until they are moved or lifted.
Of course, remember to set isTouchEnabled = YES for your CCLayer or you won't get any callbacks at all.
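For example, a minimal sketch of enabling touches in the layer's init (this assumes a cocos2d 1.x-style CCLayer, where the property is called isTouchEnabled; the name may differ in other versions):

- (id)init
{
    if ((self = [super init])) {
        // Without this, none of the ccTouches* callbacks will fire.
        self.isTouchEnabled = YES;
    }
    return self;
}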

Related

GSEvent and multiple fingers

I am successful in monitoring raw touches using GSEvent by hooking sendEvent. How do I extract touch information when multiple fingers are involved?
Update 1: iOS 5.01
Update 2: I managed to do this by going over the allTouches set contained in the event passed. It works fine, but bogs down when gesture recognizers kick in for a 4- or 5-finger event.
You are kind of right. By overriding the sendEvent: method and then getting the GSEvent from the UIEvent, you can get the underlying system information you are looking for. You could watch the "infoSize" field in the GSEvent record, which should tell you how many touches are involved in the event. But why use GSEvent at all? You could just put one big UIView in your application, set its multiple-touch property to YES, override its sendEvent method, and you should get every touch in there, even the 4- and 5-finger gestures. You can forward the touches that are not important to you and keep the ones that are.
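For reference, a UIKit-level sketch of that idea, without touching GSEvent at all. This assumes a UIWindow subclass (UIWindow does declare sendEvent:); the class name is made up for illustration:

// Hypothetical UIWindow subclass that inspects every touch before normal delivery.
@interface TouchSpyWindow : UIWindow
@end

@implementation TouchSpyWindow
- (void)sendEvent:(UIEvent *)event
{
    // allTouches contains every active touch, including 4- and 5-finger gestures.
    for (UITouch *touch in [event allTouches]) {
        NSLog(@"phase=%d location=%@", (int)touch.phase,
              NSStringFromCGPoint([touch locationInView:self]));
    }
    // Always pass the event on so normal touch handling still happens.
    [super sendEvent:event];
}
@end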
Hope this helps.

What to do after only touchesBegan?

In my iPad app I am dragging an object around. For dragging I need two methods:
touchesBegan: some stuff1
touchesMoved: some stuff2
Sometimes the user does not move the object after touching it, so whatever stuff1 has done needs to be reversed. How would I do that? Is there any event or notification I can fire if the user triggers touchesBegan but not touchesEnded?
You will always receive a touchesEnded:withEvent: message or a touchesCancelled:withEvent: message after you have received a touchesBegan:withEvent: message. You need to override both methods if you want to know when the user has lifted his finger.
If you want to track whether the user moved the touch before lifting his finger, you have to do that yourself. You can set a flag in your touchesMoved:withEvent: method, or you can save the original position of the touch in touchesBegan:withEvent: and then compare it with the final position of the touch in touchesEnded:withEvent: and touchesCancelled:withEvent:.
There is also a - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event method; use it to check whether the user hasn't made any moves.
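A minimal sketch of the flag approach mentioned above (touchDidMove is a made-up property name; adapt it to wherever your touch handling lives):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchDidMove = NO;
    // ... stuff1 ...
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchDidMove = YES;
    // ... stuff2 ...
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.touchDidMove) {
        // The finger went down and came back up without moving: undo stuff1 here.
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // A cancelled touch never gets touchesEnded:, so handle it the same way.
    [self touchesEnded:touches withEvent:event];
}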

Missing touchEnd events on iPhone

I was trying to implement palm rejection functionality for a drawing app I developed for iPhone and noticed some weird behavior in touch events. When I place my palm on the screen and continuously lift some region of my hand up and then lower it down again, I get lots of touchBegin events but only a few touchEnd events. Is there something I don't know about the touch handling mechanism of iOS? Shouldn't the number of touchEnd and touchBegin events belonging to each UITouch object be equal?
There is only one view on my window and it occupies the entire screen. Both the view and the window have multitouch enabled. I'm counting the events by printing the number of touches with NSLogs at the beginning of the touchesBegan and touchesEnded methods, so I'm taking into account the fact that a single event may contain info about multiple touches.
Don't forget to provide a handler for touchesCancelled events. You can get a touchesCancelled call after a touchesBegan and without a matching touchesEnded event.
Rather than looking at the number of touchesBegan:withEvent: and touchesEnded:withEvent: calls, you should look at the NSSet of UITouch objects passed to those methods. So, for example, if you placed one finger then a second finger on the screen, you'd get two touchesBegan:withEvent: calls. If you lifted both fingers from the screen simultaneously, you'd get a single touchesEnded:withEvent: call; the NSSet of UITouch objects passed in would indicate that two fingers were lifted.
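One way to see the pairing concretely is to log both the per-call count and the total number of active touches in every callback, including touchesCancelled. A logging sketch for the full-screen view mentioned above:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"began: %lu in this call, %lu active overall",
          (unsigned long)[touches count],
          (unsigned long)[[event allTouches] count]);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"ended: %lu touches lifted", (unsigned long)[touches count]);
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Cancelled touches never get a matching touchesEnded: call.
    NSLog(@"cancelled: %lu touches", (unsigned long)[touches count]);
}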

Not sure how to use ccTouchesBegan to do what I want it to do

So when I see ccTouchesBegan (or touchesBegan, for that matter) I usually see something like this:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
}
But what I'm not getting is: how do you detect whether one particular object has been touched? For example, how would I check whether a specific CCSprite I declared has been touched?
Sorry if this is a newbish question, but I just don't understand; if you need more clarification, just ask.
I'm not familiar with cocos2d, but in the standard API touches are sent first to the view that was touched and then up the responder chain to a view that has a controller. If that controller does not handle the touch, it goes up to the next view until it ends up at the window object.
See Responder Objects in the Responder Chain
The best place to trap touches for specific objects is in the objects themselves. In the case of a sprite-like view, the sprite itself most likely needs to respond to the touch, e.g. by moving itself. If you need the touch to be communicated to another object, you should use the delegate pattern so that the sprite can tell its delegate how it's been touched.
That last sentence sounded weird.
I don't have the samples in front of me but there should be an example in the Cocos2D download package which demonstrates a touch event and how it propagates down to sprites.
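To directly answer the "was my sprite touched" part, a common pattern is to convert the touch into cocos2d (GL) coordinates and test it against the sprite's bounding box. This sketch assumes a cocos2d 1.x-style API and a mySprite ivar that is a direct, untransformed child of the layer:

- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    // Convert from UIKit coordinates to cocos2d (GL) coordinates.
    location = [[CCDirector sharedDirector] convertToGL:location];
    if (CGRectContainsPoint([mySprite boundingBox], location)) {
        // mySprite was touched.
    }
}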

Implement custom zooming for a UIScrollView

I have a UIScrollView with the requirement that, when zooming, the contentSize.height should remain the same. Zooming in from 200x100 should result in a new contentSize of 400x100 instead of 400x200, for instance. I'd like to do my own drawing while the user is zooming.
I don't think I can use the normal zooming behaviour of UIScrollView to achieve this, so I'm trying to roll my own. (I could just let it do its thing and then redraw my contents when -scrollViewDidEndZooming:withView:atScale: gets called, but that wouldn't be very pretty).
Currently I am subclassing UIScrollView and trying to do my own zooming when two fingers are on the screen:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] != 2) {
        [super touchesMoved:touches withEvent:event];
    } else {
        // do my own stuff
    }
}
I thought that overriding touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent: and touchesCancelled:withEvent: in this way would work, but it doesn't.
An earlier, failed attempt was to place a transparent view on top of the scroll view and send the touches I'm not interested in to the scroll view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] != 2) {
        [self.theScrollView touchesMoved:touches withEvent:event];
    } else {
        // do my own stuff
    }
}
Any help would be appreciated.
Thanks,
Thomas
You probably won't be able to maintain decent performance during zooming if you attempt to redraw your content on every frame of a pinch-zooming event. I'd recommend taking the approach of letting the UIScrollView zoom a scaled version of your drawing in or out, then redraw the content to be sharp at the end of the zoom in the -scrollViewDidEndZooming:withView:atScale: delegate method. This is what I do in my application, and it ends up working very well.
There are some tricks to resizing your content view properly at the end of a zoom, which I describe in this answer. Basically, you need to intercept the setting of a transform to your content view so that you can set it to a scale factor of 1 when you redraw your content. You'll also need to keep track of that scale factor, because the UIScrollView doesn't, and use that scale factor to adjust the transform that UIScrollView tries to apply to your content view with subsequent zoom operations.
You could use a modification of this technique to force a redraw of your content during the pinch-zooming, but in my tests this ended up being far too jerky to provide a good user experience.
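For illustration, a rough sketch of the end-of-zoom redraw described above. currentScale and baseSize are hypothetical ivars, and the finer points of resetting the transform are covered in the answer linked above:

- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale
{
    // Keep track of the cumulative scale ourselves; UIScrollView does not.
    self.currentScale *= scale;

    // Reset the transform and resize the content view so it can redraw sharply at 1:1.
    view.transform = CGAffineTransformIdentity;
    view.frame = CGRectMake(0.0f, 0.0f,
                            self.baseSize.width * self.currentScale,
                            self.baseSize.height * self.currentScale);
    scrollView.contentSize = view.frame.size;
    [view setNeedsDisplay];
}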
I'm not sure what you mean by this:
    "I thought that overriding touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent: and touchesCancelled:withEvent: in this way would work, but it doesn't."
Do you not get the events? You should receive the events, but I think there is a logic error in your if statement that may be preventing this from working.
if ([touches count] != 2)
This is a problem, because the likelihood of the two touches arriving at precisely the same time is low. You'll want to be able to handle touches that happen independently, as well as the case where a user holds one finger stationary and moves the other one. In these scenarios (which are common) you may only get one touch in that NSSet, even though the other is still valid.
The solution to handling touches properly is to remember which touches came in and which touches left. Remember, the address of a UITouch does not change for the life of the touch, so you can safely compare addresses to ensure you are still dealing with the same touch as before, and track its lifecycle.
If you are not getting the touch events at all, then that is a different problem altogether; you may need to set multipleTouchEnabled to YES.
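A sketch of that bookkeeping, comparing touches by address without retaining them (activeTouches is a hypothetical NSMutableSet ivar; forwarding to super is omitted to keep the sketch short):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        // Store only the address; a UITouch keeps the same address for its whole lifetime.
        [self.activeTouches addObject:[NSValue valueWithNonretainedObject:touch]];
    }
    if ([self.activeTouches count] == 2) {
        // Both fingers are now down, even if they did not arrive in the same event.
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        [self.activeTouches removeObject:[NSValue valueWithNonretainedObject:touch]];
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesEnded:touches withEvent:event];
}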
I'm trying to do the same thing as this and I really want to be able to redraw while it's zooming. Fixing it up at the end in scrollViewDidEndZooming:withView:atScale is not really good enough.
The way I do it is to pass a dummy view in viewForZoomingInScrollView:, read the height of this dummy view, and set the frame on the actual content view to whatever I want. Because the frame changes, drawRect: gets called every time. It seems fine in the simulator; I'm just drawing a few lines. But I don't actually own a device, so I can't test it properly.
Also, in the code you've got, you have touchesBegan:withEvent: but then you are forwarding to super with touchesMoved:withEvent: instead of touchesBegan:withEvent:.
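In other words, the forwarding branch presumably wants to match the method it lives in, along these lines:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] != 2) {
        // Forward began events as began events, not as moved events.
        [super touchesBegan:touches withEvent:event];
    } else {
        // do my own stuff
    }
}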