hitTest:withEvent: Not Working - iPhone

I'm making an app where I have a background view with six UIImageViews as subviews. I have a UITapGestureRecognizer to detect when one of the UIImageViews is tapped, and the handleTap: method below is what the gesture recognizer calls. However, when I run this, hitTest:withEvent: always returns the background view even when I tap on one of the image views. Does it have something to do with the event when I call hitTest?
Thanks
- (void)handleTap:(UITapGestureRecognizer *)sender
{
    if (sender.state == UIGestureRecognizerStateEnded)
    {
        CGPoint location = [sender locationInView:sender.view];
        UIView *viewHit = [sender.view hitTest:location withEvent:nil];
        NSLog(@"%@", [viewHit class]);
        if (viewHit == sender.view) {}
        else if ([viewHit isKindOfClass:[UIImageView class]])
        {
            [self imageViewTapped:viewHit];
            NSLog(@"ImageViewTapped!");
        }
    }
}

UIImageViews are, by default, configured not to register user interaction.
From the UIImageView documentation:
New image view objects are configured to disregard user events by
default. If you want to handle events in a custom subclass of
UIImageView, you must explicitly change the value of the
userInteractionEnabled property to YES after initializing the object.
So, right after you initialize your views you should have:
view.userInteractionEnabled = YES;
This will turn the interaction back on and you should be able to register touch events.
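For the six image views from the question, a minimal sketch of that fix (the collection name imageViews is mine, not from the question):

// Enable touch handling on each image view right after creating it.
// "imageViews" is an assumed collection holding the six UIImageViews.
for (UIImageView *imageView in imageViews) {
    imageView.userInteractionEnabled = YES;
}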

There is a rewrite of your approach (a single gesture recognizer on the containing view) that works, but it will make your brain hurt getting the coordinate systems right, which is definitely the problem in the posted code.
The better answer is to attach a separate gesture recognizer to each of the UIImageViews. They can all share the same target and the same handleTap: method, and handleTap: can then get the tapped view without searching any geometry:
UIImageView *viewHit = (UIImageView *)sender.view;
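A minimal sketch of that setup (the collection name imageViews is mine; the rest reuses names from the posted code):

// Attach one tap recognizer per image view, all pointing at the same
// target/action. "imageViews" is an assumed collection of the six views.
for (UIImageView *imageView in imageViews) {
    imageView.userInteractionEnabled = YES; // UIImageView disables this by default
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
    [imageView addGestureRecognizer:tap];
}

// The handler then needs no hitTest:withEvent: at all.
- (void)handleTap:(UITapGestureRecognizer *)sender
{
    if (sender.state == UIGestureRecognizerStateEnded) {
        UIImageView *viewHit = (UIImageView *)sender.view;
        [self imageViewTapped:viewHit];
    }
}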

Pass taps through a UIPanGestureRecognizer

I'd like to detect a swipe on the entire screen; however, the screen contains UIButtons, and if the user taps one of these buttons, I want the Touch Up Inside event to be triggered.
I've created a UIView on top of my screen and added a UIPanGestureRecognizer to it to detect the swipe, but now I need to pass the gesture through that view when I detect that it's a tap rather than a swipe.
I know how to differentiate the gestures, but I have no idea how to pass them to the view below.
Can anyone help on that? Thanks!
Thanks for your answer. The link helped me solve part of my problem.
I've set the buttons as subviews of my gesture recognizer view, and I can now start a swipe from one of the buttons (and continue to use the buttons as well). I managed to prevent the buttons from going to the "down" state with the following code:
UIPanGestureRecognizer *swipe = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(swipeDetected:)];
swipe.maximumNumberOfTouches = 1;
swipe.delaysTouchesBegan = YES;
swipe.cancelsTouchesInView = YES;
[self.gestureRecognitionView addGestureRecognizer:swipe];
UIGestureRecognizer has a BOOL property, cancelsTouchesInView, which defaults to YES. Set it to NO and the touches will pass through to the UIView.
Also have a look at the solution to this question.
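In code, that suggestion is a one-line change on whatever recognizer sits on the overlay view, e.g. the swipe recognizer from the snippet above:

// Let the touches continue to the views underneath while the
// recognizer decides whether the gesture is a pan.
swipe.cancelsTouchesInView = NO; // default is YES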
If you want to prevent the recognizer from receiving the touch at all, UIGestureRecognizerDelegate has a method gestureRecognizer:shouldReceiveTouch: you can use:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // don't override any other recognizers
    if (gestureRecognizer != panRecognizer) {
        return YES;
    }
    CGPoint touchedPoint = [touch locationInView:self.someButton];
    return CGRectContainsPoint(self.someButton.bounds, touchedPoint) == NO;
}
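For that delegate callback to fire, the recognizer's delegate has to be set somewhere; a one-line sketch using the panRecognizer from the snippet above:

// The class containing the delegate method must conform to <UIGestureRecognizerDelegate>.
panRecognizer.delegate = self;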

Dragging UIButtons in UIScrollView?

I have a scroll view in my main view, with three subviews on the scroll view, and UIButtons in all of those subviews.
Now, I want to drag those buttons from one subview to another (and while dragging the buttons, the scroll view should not scroll).
How do I do this?
I'm not completely sure this snippet works for this particular case (a UIControl inside a UIScrollView), but my understanding of the UIResponder chain suggests it should :)
- (void)viewDidLoad { // or wherever you initialize your things
    ...
    // Add a pan gesture recognizer to the UIButton so it captures drag intents
    UIPanGestureRecognizer *panGR = [[UIPanGestureRecognizer alloc] init];
    [panGR addTarget:self action:@selector(panEvent:)];
    [button addGestureRecognizer:panGR];
    [panGR release];
}

- (void)panEvent:(id)sender {
    button.center = [sender locationInView:self.view];
}
If this works (I can't test it right now, but it did work for me in a similar situation), then you should add more code to handle the drag & drop related events: maybe disable the Clip Subviews option on the UIScrollView, add the button to the new superview if the drop location intersects the destination's CGRect, return the button to its original location if it doesn't, etc. (a rough sketch follows below).
So, what's happening in those lines? When you begin touching the UIButton, the event doesn't reach the UIScrollView, because it could still continue either as a touch event (handled by the UIButton) or as a pan event (handled by the UIScrollView). When you move your finger, the event is dismissed by the UIButton's responder because there is no gesture recognizer that knows what to do when the finger moves.
But when you add a gesture recognizer to the UIButton that does know what to do when the finger moves, everything is different: the UIButton will not dismiss the event, and the UIScrollView will never realize that there was a touch moving over it.
I hope my explanation is accurate and comprehensible enough. Let me know if a) it doesn't work or b) something is unclear.
Good luck :)
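A rough sketch of that extra handling, expanding the panEvent: method above (destinationView and originalCenter are hypothetical names, not from the original answer):

// Expanded sketch of panEvent: with basic drop handling.
// "destinationView" and "originalCenter" are assumed properties.
- (void)panEvent:(UIPanGestureRecognizer *)sender
{
    UIView *draggedButton = sender.view;
    draggedButton.center = [sender locationInView:self.view];

    if (sender.state == UIGestureRecognizerStateEnded) {
        CGPoint dropPoint = [sender locationInView:self.destinationView];
        if (CGRectContainsPoint(self.destinationView.bounds, dropPoint)) {
            // Re-parent the button, converting its frame so it stays put on screen.
            draggedButton.frame = [self.destinationView convertRect:draggedButton.frame
                                                           fromView:draggedButton.superview];
            [self.destinationView addSubview:draggedButton];
        } else {
            // Otherwise snap it back to where it started.
            draggedButton.center = self.originalCenter;
        }
    }
}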
Try
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    if (allowAppDrag && [gestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]]) {
        return NO;
    }
    return YES;
}

UIButton interaction inside UIPageViewController

I'm using a UIPageViewController in my application, and I want to have a few UIButtons inside it, sort of like a menu. The problem is that when I put a UIButton (or any other interactive element) near the edge of the screen and tap it, instead of the UIButton's action being triggered, the page changes (because a tap on the edge of the screen turns the page in the UIPageViewController). I'd like to know if there's a way to give the UIButton higher priority than the UIPageViewController, so that when I tap the button, the appropriate action runs instead of the page changing.
I came here with the same problem. Split’s link has the answer.
Make your root view controller the delegate of each of the UIPageViewController’s gesture recognizers, then prevent touches from being delivered if they occur inside any UIControl:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    return ([touch.view isKindOfClass:[UIControl class]] == NO);
}
UIPageViewController has two UIGestureRecognizers. You can access them via its gestureRecognizers property, determine which one is the UITapGestureRecognizer, and then use this. Hope this helps.
For people who just want to copy/paste code, here is mine:
// I don't want the tap on borders to change the page
- (void)desactivatePageChangerGesture {
    for (UIGestureRecognizer *gestureRecognizer in self.pageViewController.gestureRecognizers) {
        if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
            gestureRecognizer.enabled = NO;
        }
    }
}
Just call this function after the UIPageViewController creation.
I had this same problem, and was unsure how to handle the UIGestureRecognizer delegate methods. This short example assumes you are using the "Page Based Application" project type in Xcode 4. Here is what I did:
In RootViewController.h, I made sure to announce that RootViewController would handle the UIGestureRecognizerDelegate protocol:
@interface RootViewController : UIViewController <UIPageViewControllerDelegate, UIGestureRecognizerDelegate>
In RootViewController.m, I assigned RootViewController as the delegate for the UITapGestureRecognizer. This is done at the end of the viewDidLoad method. I did this by iterating over each gestureRecognizer to see which one was the UITapGestureRecognizer.
NSEnumerator *gestureLoop = [self.view.gestureRecognizers objectEnumerator];
id gestureRecognizer;
while ((gestureRecognizer = [gestureLoop nextObject])) {
    if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
        [(UITapGestureRecognizer *)gestureRecognizer setDelegate:self];
    }
}
Finally, I added the gestureRecognizer:shouldReceiveTouch: method to the bottom of RootViewController.m (this is copied directly from Split's link):
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    if ([touch.view isKindOfClass:[UIControl class]]) {
        // we touched a button, slider, or other UIControl
        return NO; // ignore the touch
    }
    return YES; // handle the touch
}
Comment out this line in your code:
self.view.gestureRecognizers = self.pageViewController.gestureRecognizers;
Or use the UIGestureRecognizerDelegate approach described by Split.
Hope this helps.
OLD ANSWER: If your UIPageViewController has a transitionStyle of UIPageViewControllerTransitionStyleScroll and you are in iOS 6.0+, then you can't use the gestureRecognizer:shouldReceiveTouch: method, because there is no way to set the delegate to self on the gestureRecognizers since pageViewController.gestureRecognizers will return nil. See UIPageViewController returns no Gesture Recognizers in iOS 6 for more information about that.
If you simply want to make sure your UIPageViewController passes along button touch events to a UIButton, you can use
for (UIScrollView *view in _pageViewController.view.subviews) {
    if ([view isKindOfClass:[UIScrollView class]]) {
        view.delaysContentTouches = NO;
    }
}
if you have a transitionStyle of UIPageViewControllerTransitionStyleScroll and you are in iOS 6.0+.
See this answer about why delaysContentTouches = NO is needed for some cases of a UIButton in a UIScrollView.
UPDATE: After doing a little more research, it appears that if your issue is that the UIButton's action only seems to fire sometimes, that is probably the desired behavior inside a UIScrollView. A UIScrollView uses the delaysContentTouches property to automatically determine whether the user is trying to scroll or trying to press a button inside the scroll view. I would assume it is best not to change this default to NO, since doing so means the user can't start a scroll with a finger that lands on a button.
None of the solutions here where you intercept the UIPageViewController's tap gesture recognizers worked for me. I'm targeting iOS 8 and 9.
What worked was to override touchesBegan, touchesCancelled, touchesMoved, and touchesEnded in my custom button, which is a subclass of UIControl. Then I just manually send the touch-up-inside control event if the touch began and ended within the frame of my custom button (see the sketch after this answer).
I didn't have to do anything special for the containing page view controller, or the view controller that contains the page view controller.
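A sketch of that approach, showing only touchesEnded for brevity (PassthroughButton is a hypothetical class name; the original answer also overrides touchesBegan, touchesMoved, and touchesCancelled to track the touch):

// Hypothetical UIControl subclass that fires UIControlEventTouchUpInside
// itself, so the page view controller's recognizers can't swallow the tap.
@interface PassthroughButton : UIControl
@end

@implementation PassthroughButton

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    CGPoint point = [[touches anyObject] locationInView:self];
    if (CGRectContainsPoint(self.bounds, point)) {
        // The touch ended inside our bounds: treat it as a tap.
        [self sendActionsForControlEvents:UIControlEventTouchUpInside];
    }
}

@end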
This Swift 5 one-liner should do the job:
pageViewController.view.subviews.compactMap({ $0 as? UIScrollView }).first?.delaysContentTouches = false

Detect if a touch is on a moving UIImageView from a View Controller

I have a UIViewController that is detecting touch events with touchesBegan. There are moving UIImageView objects that float around the screen, and I need to see if the touch event landed on one of them. What I am trying:
UITouch *touch = [touches anyObject];
if ([arrayOfUIImageViewsOnScreen containsObject:[touch view]]) {
    NSLog(@"UIImageView Touched!");
}
But this never happens. Also if I were to do something like this:
int h = [touch view].bounds.size.height;
NSLog([NSString stringWithFormat:@"%d", h]);
it outputs the height of the entire UIViewController's view (the whole screen) every time, even if I touch one of the UIImageViews, so clearly [touch view] is not giving me the UIImageView. How do I detect when only a UIImageView is pressed? Please do not suggest using UIButtons.
Thank you!
If you only want to detect when a UIImageView is pressed, check the class:
if (touch.view.class == [UIImageView class]) {
    // do whatever
}
else {
    // isn't a UIImageView, so do whatever else
}
Edit: you haven't forgotten to set userInteractionEnabled to YES on the UIImageViews, have you?!
I know you said please do not suggest using UIButtons, but buttons sound like the best/easiest way to me.
You could try sending the hitTest: message to the main view's CALayer with the location of one of the touches - it'll return the CALayer furthest down the layer hierarchy at the point you touched.
You could then test whether that layer belongs to one of the UIImageViews, and proceed from there.
This code uses a point generated from a UIGestureRecognizer.
CGPoint thePoint = [r locationInView:self.view];
// CALayer's hitTest: expects the point in the receiver's superlayer's coordinate system
thePoint = [self.view.layer convertPoint:thePoint toLayer:self.view.layer.superlayer];
selectedLayer = [self.view.layer hitTest:thePoint];
If you want to check the touch yourself, use CGRectContainsPoint:
1. Capture the touch event and get the point where you touched.
2. Make a CGRect that bounds the object you want to hit-test.
3. Call CGRectContainsPoint(CGRect, CGPoint) and use the boolean return value, as sketched below.
http://developer.apple.com/library/ios/#DOCUMENTATION/GraphicsImaging/Reference/CGGeometry/Reference/reference.html
Here is the reference for CGRect and the related geometry functions.
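Put together, a sketch of those three steps inside touchesBegan:, reusing the array from the question:

// Hit-test the touch point against each image view's frame.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self.view]; // 1. where the touch landed
    for (UIImageView *imageView in arrayOfUIImageViewsOnScreen) {
        // 2./3. the view's frame is the rect; test containment
        if (CGRectContainsPoint(imageView.frame, point)) {
            NSLog(@"UIImageView touched: %@", imageView);
            break;
        }
    }
}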
Forgot about this question: the problem was that I did not wait until viewDidLoad to set userInteractionEnabled on my UIImageView.

How to handle touch event on UILabel as subview of UITableViewCell?

My app has a custom UITableView. In the cellForRowAtIndexPath: data source method of its UIViewController I instantiate custom UITableViewCell objects that contain multiple custom UILabels (actually a subclass of OHAttributedLabel) as subviews of the content view.
I have tried setting userInteractionEnabled = YES on the label, then adding touch events in the view controller, but that isn't working.
Ideas?
Thanks
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if (CGRectContainsPoint([self.site frame], [touch locationInView:self.view])) {
        // do whatever you want
    }
}
Or
UILabel *label = [[UILabel alloc] init];
label.userInteractionEnabled = YES;
UITapGestureRecognizer *tapGesture =
    [[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(labelTap)] autorelease];
[label addGestureRecognizer:tapGesture];
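The @selector(labelTap) above refers to a method you provide yourself; a trivial sketch:

// Hypothetical handler for the tap recognizer above.
- (void)labelTap
{
    NSLog(@"label tapped");
}

If you need to know which label was tapped, change the selector to labelTap: and take the UITapGestureRecognizer as a parameter; its view property is the tapped label.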
A UILabel isn't a UIControl, so you won't get UIControlEventTouchUpInside or similar events. Why not use a button instead? You can make it look exactly like a label.
Either way, you will probably need to call addTarget:action:forControlEvents: and set a tag on the UIButton in your cellForRowAtIndexPath: method, then detect which cell's button was tapped in the action method by examining the tag value (sketched below).
If you must use UILabel, you need to subclass it and intercept the touchesBegan/touchesEnded methods (inherited from UIResponder) to detect the touch-up-inside yourself.
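A sketch of the button-plus-tag approach mentioned above (buttonTapped: and the frame values are mine, not from the original answer):

// In tableView:cellForRowAtIndexPath:, give each cell a button that
// records its row in its tag. "buttonTapped:" is a hypothetical action name.
UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
button.frame = CGRectMake(10, 5, 200, 34);
[button setTitle:@"Looks like a label" forState:UIControlStateNormal];
[button setTitleColor:[UIColor darkTextColor] forState:UIControlStateNormal];
button.tag = indexPath.row;
[button addTarget:self action:@selector(buttonTapped:) forControlEvents:UIControlEventTouchUpInside];
[cell.contentView addSubview:button];

// The action method recovers the row from the tag.
- (void)buttonTapped:(UIButton *)sender
{
    NSLog(@"tapped button in row %ld", (long)sender.tag);
}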
The problem is in OHAttributedLabel. This label also handles taps on links, so to handle a tap anywhere on the label (not just on a link) you must set:
self.textLabel.onlyCatchTouchesOnLinks = NO;
where self.textLabel is your OHAttributedLabel.
And don't forget userInteractionEnabled.
I don't know if it is the same problem, but... I added a label and could not get it to recognize a touch. I eventually realised it was because I had added it as a subview whose frame was outside its parent's frame, so the touch hierarchy broke.
I just had the same problem with a static-cell settings table where I wanted the whole cell to trigger first responder for the cell's text field.
After not getting any touches with gesture recognizers, I ended up adding a transparent (custom style, blank title) button behind the label (touches disabled on the label) and the text field. I think it should work in a more elegant way, but it solved the task for now and for that limited purpose. (And you can just drag-connect from the button's default action.)
It's a bit ugly. Then again, it just describes the area behind the text field reacting to touch, which was the intention after all, so maybe it's just not that fancy.
I'll keep it until I find the reason the recognizers aren't firing.
You can use TTTAttributedLabel instead; it's very easy.
When you create the UITableViewCell, make it conform to TTTAttributedLabelDelegate, like:
@interface MyTableViewCell : UITableViewCell <TTTAttributedLabelDelegate> {
    UILabel *nameLabel;
    TTTAttributedLabel *valueLabel;
}
When you initialize it, you can add a link to the label:
[valueLabel addLinkToPhoneNumber:valueStr withRange:NSMakeRange(0, valueStr.length)];
Then you can do anything you want:
- (void)attributedLabel:(TTTAttributedLabel *)label didSelectLinkWithPhoneNumber:(NSString *)phoneNumber {
    // do anything you want.
}