Cancel UITouch Events When View Covered By Modal UIViewController - iphone

I am writing an application where the user has to move some items around the screen with their fingers and drop them. To do this, I am using the touchesBegan, touchesMoved, touchesEnded... methods of each view that has to be moved.
The problem is that sometimes the views are covered by a view displayed using the [UIViewController presentModalViewController] function. As soon as that happens, the UIView that I was moving stops receiving the touch events, since it is covered up. But there is no event telling me that it stopped receiving them, so I have no chance to reset the state of the moved view.
The following is an example that demonstrates this. The methods are part of a UIView that is shown in the main window. It listens for touch events and, when I drag the finger for some distance, presents a modal view that covers everything. It prints the touch events it receives in the Run Log.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touchesBegan");
    touchStart = [[touches anyObject] locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint touchAt = [[touches anyObject] locationInView:self];
    float xx = (touchAt.x - touchStart.x) * (touchAt.x - touchStart.x);
    float yy = (touchAt.y - touchStart.y) * (touchAt.y - touchStart.y);
    float rr = xx + yy;
    NSLog(@"touchesMoved %f", rr);
    if (rr > 100) {
        NSLog(@"Show modal");
        [viewController presentModalViewController:[UIViewController new] animated:NO];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touchesEnded");
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touchesCancelled");
}
But when I test the application and trigger the modal dialog to be displayed, the following is the output in the Run Log.
[Session started at 2010-03-27 16:17:14 -0700.]
2010-03-27 16:17:18.831 modelTouchCancel[2594:207] touchesBegan
2010-03-27 16:17:19.485 modelTouchCancel[2594:207] touchesMoved 2.000000
2010-03-27 16:17:19.504 modelTouchCancel[2594:207] touchesMoved 4.000000
2010-03-27 16:17:19.523 modelTouchCancel[2594:207] touchesMoved 16.000000
2010-03-27 16:17:19.538 modelTouchCancel[2594:207] touchesMoved 26.000000
2010-03-27 16:17:19.596 modelTouchCancel[2594:207] touchesMoved 68.000000
2010-03-27 16:17:19.624 modelTouchCancel[2594:207] touchesMoved 85.000000
2010-03-27 16:17:19.640 modelTouchCancel[2594:207] touchesMoved 125.000000
2010-03-27 16:17:19.641 modelTouchCancel[2594:207] Show modal
Any suggestions on how to reset the state of a UIView when its touch events are interrupted by a modal view?

If you are controlling when the modal view is displayed, can you also post a notification at the same time to tell the rest of your app that it should reset the moved view?
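For example, a minimal sketch of that idea: post a notification right before presenting the modal view, and have the draggable view observe it and reset itself. The notification name and the resetDragState method are made up for illustration.

// Wherever the modal view is presented:
[[NSNotificationCenter defaultCenter] postNotificationName:@"WillPresentModalView" object:nil];
[viewController presentModalViewController:[UIViewController new] animated:NO];

// In the draggable view, observe the notification and reset the drag state.
// resetDragState is a hypothetical method that puts the view back where it started.
- (id)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(resetDragState)
                                                     name:@"WillPresentModalView"
                                                   object:nil];
    }
    return self;
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [super dealloc];
}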

Related

How to detect touch end of image view when touches moving

I have an image view. I detect touches in the image view like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    int viewTag = [touch view].tag;
    if ([[touch view] isKindOfClass:[UIImageView class]])
    {
        // My code
    }
}
and I track touchesMoved on the image view. Whenever the touch moves out of the image view, I need to show an alert view. How can I detect, while the touches are moving, that the touch has left the image view?
I recommend using a UIPanGestureRecognizer and adding it to a larger superview of the image view you want to detect on. That way, even if the touch starts outside and moves into and out of your image view, you can follow the movement of the touch in your gesture handler.
It's pretty easy: make a method called handlePan:, for example, create the gesture recognizer with that handler method, and add it to the appropriate superview. Now, whenever the gesture is active and the touch moves, your handler method will get called and you can check whether the touch is inside your image view.
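A rough sketch of that approach, assuming a containerView that encloses the image view and an imageView property (both names are illustrative):

UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                      action:@selector(handlePan:)];
[self.containerView addGestureRecognizer:pan];
[pan release];

- (void)handlePan:(UIPanGestureRecognizer *)gesture {
    // Where is the finger, in the image view's coordinate space?
    CGPoint location = [gesture locationInView:self.imageView];
    BOOL inside = CGRectContainsPoint(self.imageView.bounds, location);
    if (!inside && gesture.state == UIGestureRecognizerStateChanged) {
        // The finger has moved outside the image view - show the alert here
        // (in a real app, guard this with a flag so the alert only appears once per gesture).
    }
}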
You should use this method...
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    int viewTag = [touch view].tag;
    if ([[touch view] isKindOfClass:[UIImageView class]])
    {
        // My code
    }
    else
    {
        // show the alertView here
    }
}
To check that the initial touch was on the image view, set a flag in the touchesBegan: method and check it accordingly in touchesMoved:.
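A sketch of that flag idea. Note that [touch view] keeps pointing at the view the touch started in for the whole gesture, so this sketch compares the touch's location against the image view's bounds instead; imageView and touchStartedOnImageView are assumed ivars:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Remember whether the gesture started on the image view
    touchStartedOnImageView = [[touch view] isKindOfClass:[UIImageView class]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!touchStartedOnImageView) return;
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:imageView];
    if (!CGRectContainsPoint(imageView.bounds, point)) {
        // The finger has left the image view - show the alert view here
    }
}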
You may add a transparent UIButton of the same size on top of the UIImageView and track UIControlEventTouchDragExit:
[button addTarget:self action:@selector(draggedOutside:) forControlEvents:UIControlEventTouchDragExit];
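A minimal sketch of that setup; imageView and draggedOutside: are assumed names:

UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
button.frame = imageView.frame;
button.backgroundColor = [UIColor clearColor];
[button addTarget:self
           action:@selector(draggedOutside:)
 forControlEvents:UIControlEventTouchDragExit];
[imageView.superview addSubview:button];

- (void)draggedOutside:(UIButton *)sender {
    // The finger was dragged out of the button's bounds - show the alert view here
}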

Alphabetic gesture recognition in all screens of my app

I am working on detecting alphabetic gestures in my app, so when the user draws a C on the screen a special action takes place, and so on. I am using a recognizer class that has predefined data about each letter's touch points, and the detection works fine. I want this feature on all of my screens, so I added the methods below to the app delegate class and detect touches on the window only. The problem is that other views, such as table views and scroll views inside the screens, block the touch events from reaching the window. If the window received every touch, my code would work like a charm. Any help is appreciated.
- (void)processGestureData
{
    NSString *gestureName = [recognizer findBestMatchCenter:&center angle:&angle score:&score];
    NSLog(@"gesture Name: %@", gestureName);
    if ([gestureName isEqualToString:@"N"] || [gestureName isEqualToString:@"n"])
    {
        // handle N gesture
    }
    if ([gestureName isEqualToString:@"C"])
    {
        // handle C gesture
    }
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [recognizer resetTouches];
    [recognizer addTouches:touches fromView:self.window];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [recognizer addTouches:touches fromView:self.window];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [recognizer addTouches:touches fromView:self.window];
    [self processGestureData];
}
I think you need a touch-intercepting window that sits above all touches.
If your gesture is recognised, process it; otherwise pass the touch on to your view controller. Refer to this link for details.
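A rough sketch of such an intercepting window: subclass UIWindow and override sendEvent:, feed the touches to the recognizer there, then let the event continue as normal. GestureWindow is an illustrative name, and the sketch assumes the window can reach the recognizer and the processGestureData method shown above (for example through the app delegate):

@interface GestureWindow : UIWindow {
    id recognizer;   // assumed: the same recognizer object used in the app delegate
}
@end

@implementation GestureWindow

// Every touch in the app passes through the key window's sendEvent:, even when it
// ends up in a table view or scroll view, so the recognizer sees all touches here.
- (void)sendEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if (touch.phase == UITouchPhaseBegan) {
        [recognizer resetTouches];
    }
    [recognizer addTouches:[event allTouches] fromView:self];
    if (touch.phase == UITouchPhaseEnded) {
        // hand the finished stroke back to whoever owns processGestureData
        [(id)[[UIApplication sharedApplication] delegate] performSelector:@selector(processGestureData)];
    }
    [super sendEvent:event];   // let the event reach the views as usual
}

@end

For this to work, the app's main window has to actually be a GestureWindow (set its class in MainWindow.xib or create it in code).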

Multiple touch on the iPhone issue

How can I make it so that while the user is playing on the joystick to move the character at the centre, they can also touch the bottom right of the screen (a second touch) to fire the gun? I have looked at other questions but I still can't figure it out...
This is basically the code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // make the touch point...
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];
    if (/* touching a certain area (on the joystick) */) {
        // do stuff
    }
    else if (point.x > 300 && point.y > 200) {
        // fire method
    }
}
So basically, how do I get touchesBegan called again for the second touch, find its CGPoint, and work out whether the fire method should be called?
Thanks
EDIT:
I have tried the 2nd option and done this:
in the view controller's .h I added:
@interface fireView : UIView
@end
and in .m I added:
@implementation fireView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"hi");
}

@end
....but it doesn't log/print "hi"?
Use 2 UIGestureRecognizers. You can create 2 invisible views of the needed size - one for the joystick and one for the fire button. For each view, use a single gesture recognizer. Then you will be able to handle taps on these views in different methods, without checking whether it was the fire button or the joystick.
Let's say you already have the 2 views - joystickView and fireView. Then do it like this:
UITapGestureRecognizer *fireTapGestureRec = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(fireTapped:)];
fireTapGestureRec.delegate = self;
fireTapGestureRec.numberOfTapsRequired = 1;
fireTapGestureRec.numberOfTouchesRequired = 1;
[fireView addGestureRecognizer:fireTapGestureRec];
[fireTapGestureRec release];
and write fireTapped: to handle the event, as sketched below. Do the same thing for the joystick.
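A minimal handler could look like this; fireGun is a made-up method standing in for your actual firing code:

- (void)fireTapped:(UITapGestureRecognizer *)sender {
    // called once the tap on fireView has been recognised
    [self fireGun];
}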
EDIT
Second option (suggested in comments)
Make subclasses of UIView, like
@interface fireView : UIView
@interface joystickView : UIView
and for each subclass write its own touchesBegan: or touchesEnded:

How to wait for touch input on iOS

I am running an animation but I don't want to start the animation until the user touches the screen. I thought of using a loop but that takes a lot of overhead and I couldn't even get it to work for this.
I am aware of the touchesEnded and touchesBegan methods but I am not sure how to use them in this manner.
Thanks in advance.
You can just use touchesBegan / touchesEnded in your view controller code and they will trigger when you touch the screen (touchesBegan) or lift your finger off (touchesEnded):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // start animation
}
OR
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // start animation
}
Gesture recognizers are your friend: Gesture Recognizers
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(didTap:)];
[viewToTap addGestureRecognizer:tapGesture];

- (void)didTap:(UIGestureRecognizer *)sender {
    // start your animation here
}
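If the animation should only run once, the handler can also remove the recognizer before starting it. A small sketch, where someView and the half-second fade are placeholders for your own animation:

- (void)didTap:(UIGestureRecognizer *)sender {
    [sender.view removeGestureRecognizer:sender];   // only react to the first tap
    [UIView animateWithDuration:0.5 animations:^{
        self.someView.alpha = 0.0;   // placeholder for the real animation
    }];
}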

iPhone: Click view behind transparent UIScrollView

I have a UIScrollView that is set to have a clear background. Part of the scrollview does have content, but part does not (so it shows other views behind it). I would like to be able to click through the UIScrollView and to the MKMapView behind, but only for the transparent portion of the UIScrollView.
I have found some code, but I am having a really hard time understanding how to get it working:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (![self yourMethodThatDeterminesInterestingTouches:touches withEvent:event])
        [self.nextResponder touchesBegan:touches withEvent:event];
}
Could someone help me wrap my mind around how to forward a touch event to a view that is behind another view? Can I call - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event from a UIViewController?
What we did was subclass UIScrollView and implement logic that passes responsibility down to the views under it if the touch happens inside the transparent area.
In our case the transparent area is defined by a contentOffset of 120 on the Y axis, meaning our content starts 120 points below the top of the UIScrollView, and the code looks like this:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    if (self.contentOffset.y < 0 && point.y < 0.0) {
        return NO;
    } else {
        return YES;
    }
}
Obviously this response is well past its prime but hopefully this is helpful to anyone searching for a solution.
Basically, it's up to you to determine what touch events you care to forward to another responder. If you simply want to forward all touch events, just remove that if statement in the code you posted so the next responder will receive all the touch events.
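Applied to the snippet from the question, forwarding everything would look roughly like this; PassthroughScrollView is just an illustrative name for the UIScrollView subclass:

@interface PassthroughScrollView : UIScrollView
@end

@implementation PassthroughScrollView

// Forward every touch to the next responder (the view behind / its view controller)
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesCancelled:touches withEvent:event];
}

@end

Keep the if statement from the original snippet if you only want to forward the touches that fall in the transparent region.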