GestureRecogniser and disabling drag gesture of a UIView in a UIScrollView - iPhone

Recently I had a problem: how to disable scrolling in a particular area of a UIScrollView, specifically the area occupied by a UIView or subview.
I've read a lot about subclassing and other long-winded approaches to solving this.
But recently I solved the problem in a simpler manner, without subclassing:
UIPanGestureRecognizer *panrecognizer = [[UIPanGestureRecognizer alloc] init];
and then
[panrecognizer setCancelsTouchesInView:NO];
[mySubViewInScroll addGestureRecognizer:panrecognizer];
I created the UIPanGestureRecognizer without passing an action to it and then added the recognizer to the view inside the scroller. This way the gestures on the view are captured, but expressly not handled by the view or its superviews, because we passed no action to the object.
The question is this: is this a correct approach to handle this type of problem, or is it better to do otherwise? I mean, will Apple accept an application that takes this approach?

In effect I think that this, even if not the best, is the most practical solution, since messing around with the classes and subclassing just to achieve a partial scroll lock of the screen seems very odd. So let's see if Apple will accept this kind of solution...

Try overriding the scroll view's
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
to return the appropriate child view.
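For reference, an override along those lines might look like the sketch below. This is a hedged sketch of a UIScrollView subclass, not a definitive implementation; `lockedView` is a hypothetical property you would add to mark the region that should not scroll.

```objectivec
// Sketch: a UIScrollView subclass that stops panning while the touch
// is inside a "locked" subview. `lockedView` is a hypothetical property.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hit = [super hitTest:point withEvent:event];
    // Disable scrolling only while the hit view lies inside the locked area.
    self.scrollEnabled = !(self.lockedView && [hit isDescendantOfView:self.lockedView]);
    return hit;
}
```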

Related

iCarousel in UIScrollView

I added iCarousel to my iOS project and it worked fine: I could scroll pictures.
Then my view gained more content, so I needed to add a UIScrollView which covers the whole view. So now I have some elements (labels, text views, and the UIView for iCarousel) inside my scroll view.
The scroll view works fine, but now iCarousel doesn't switch pictures. The pictures are loaded (I see the first one and part of the second one), but the carousel doesn't work any more.
Has anyone had the same problem? How can I solve it?
Edit:
After @Wain's advice, I tried this:
- (void)viewDidLoad
{
    UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
    [self.view addGestureRecognizer:panRecognizer];
    panRecognizer.delegate = self;
}

- (void)pan:(id)sender {
    NSLog(@"Pan");
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
It doesn't work, but I checked with a breakpoint: the shouldRecognizeSimultaneouslyWithGestureRecognizer: method does run when I attempt a horizontal scroll.
Where am I wrong?
Looks like a clash between gesture recognisers because, by default, only one can be active at any one time.
Not sure why your scroll view covers everything, but you should be able to make your controller the delegate of the gestures and implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: to allow them to recognise concurrently.
UIScrollView has a panGestureRecognizer property that you can access to set yourself as the delegate.
iCarousel is a bit different, as it doesn't make its gesture available publicly; so if setting the delegate on the scroll view doesn't work, you can edit the carousel (which sets itself as the delegate) to implement the delegate method.
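As a rough sketch of that suggestion (note: whether UIKit permits replacing the delegate of a scroll view's own recognizer has varied between iOS versions, so treat this as an experiment rather than a guarantee; `scrollView` is assumed to be an outlet to the covering scroll view):

```objectivec
// In a view controller that adopts UIGestureRecognizerDelegate.
- (void)viewDidLoad {
    [super viewDidLoad];
    // Assumption: self.scrollView is the UIScrollView covering the carousel.
    self.scrollView.panGestureRecognizer.delegate = self;
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES; // let the scroll view's pan and the carousel's pan both run
}
```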

Three Layers of Pan Gesture Recogniser Confusion

Whilst developing an app I have come up against a problem with having too many pan gesture recognisers.
My first pan gesture recogniser is on the MainViewController which is a parent of the RecipeSearchVC. This gesture recogniser slides the whole view left or right.
My second pan gesture recogniser is in the RecipeSearchParametersVC, which is a parent of a page view controller.
The third pan gesture recogniser is added to a UIControl wheel nested inside a view controller managed by the PageViewController.
I know this sounds insane, and it could be argued that this is poor design. However, I believe that if this worked cohesively it would be fine.
When trying to rotate the wheel it will rotate for a second or two before the gesture is overtaken by either the PageViewController or the MainViewController. More often than not it is the MainViewController that takes over. What techniques could I employ to clearly separate each of these gesture recognisers?
EDIT:
Apologies for the vagueness of my description when it comes to the pan gesture recognisers.
The MainViewController has its own UIPanGestureRecogniser to allow it to move everything left or right.
The RecipeSearchParametersVC only has a UIPanGestureRecogniser because of the UIPageViewController it contains. It does not add the gesture recogniser itself, but simply takes the recognisers from the pageViewController.
The UIControl's gesture recognisers allow it to track the rotation it should undergo.
Taking the advice given, I may remove the gestures from the page view controller and substitute them with buttons. I only intended this to work like the images (which can be scrolled to reveal more images) found in iBooks, so I thought it would work fine.
UIControl UIPanGestureRecogniser Code
/**
 * Sent to the control when a touch related to the given event enters the control's bounds.
 *
 * @param touch UITouch object that represents a touch on the receiving control during tracking
 * @param event event object encapsulating the information specific to the user event
 */
- (BOOL)beginTrackingWithTouch:(UITouch *)touch
                     withEvent:(UIEvent *)event
{
    [super beginTrackingWithTouch:touch withEvent:event];
    CGPoint touchPoint = [touch locationInView:self];
    // filter out touches too close to the centre of the wheel
    CGFloat magnitudeFromCentre = [self calculateDistanceFromCentre:touchPoint];
    if (magnitudeFromCentre < 40) return NO;
    // calculate the offset from the centre
    CGFloat deltaX = touchPoint.x - _container.center.x;
    CGFloat deltaY = touchPoint.y - _container.center.y;
    // the arctangent of the opposite (y axis) over the adjacent (x axis) gives the angle
    _deltaAngle = atan2(deltaY, deltaX);
    _startTransform = _container.transform;
    // selection in limbo, so set all sector images to the minimum alpha by changing the current one
    [self getSectorByValue:_currentSector].alpha = kMinimumAlpha;
    return YES;
}
Unfortunately due to the nature of my controller hierarchy I was forced to rethink the design of my app.
The MainViewController with the UIPanGestureRecogniser has stayed as is.
The UIPageViewController with the UIControl has moved to a separate static view controller.
This works far better but is not yet ideal: the UIPageViewController steals any horizontal panning, though this can probably be fixed by implementing buttons as an alternative to scrolling.
The UIControl did not have a gesture recogniser; instead I overrode beginTrackingWithTouch:withEvent: and the other tracking methods to follow the touches.
I suppose the answer should be: if you are layering too many gestures, you're doing it wrong.
You will need to add a container to the wheel, and then you can do something like this. If I am not missing something, this code should work:
UIPanGestureRecognizer *pan1 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan1:)];
[self.view addGestureRecognizer:pan1];
UIPanGestureRecognizer *pan2 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan2:)];
[view2 addGestureRecognizer:pan2];
UIPanGestureRecognizer *pan3 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan3:)];
[view3 addGestureRecognizer:pan3];

- (void)pan1:(UIPanGestureRecognizer *)sender {
    NSLog(@"%@", sender);
}

- (void)pan2:(UIPanGestureRecognizer *)sender {
    NSLog(@"%@", sender);
}

- (void)pan3:(UIPanGestureRecognizer *)sender {
    NSLog(@"%@", sender);
}
I believe you have defined three different UIPanGestureRecognizer objects and attached them to the appropriate views.
While technically this is correct, it can cause confusion: for example, if you have several overlapping views (you have three here) and need a touch to be sent to a view that's not at the top of the view stack, what happens? The gestures can end up conflicting.
Instead, you can attach a single gesture recognizer to the superview of the several target views and delegate the gesture to the correct view based on the coordinates of the touch. For that you need to normalise the touch coordinates originating from any of the subviews to the superview where the UIPanGestureRecognizer is defined. That way you know whether the pan happened on the wheel or elsewhere.
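An illustrative sketch of that idea follows; all the names here (`wheelView`, `rotateWheelWithTouchAtPoint:`, `slideMainViewWithTranslation:`) are hypothetical stand-ins, not part of the original code.

```objectivec
// One pan recognizer on the superview, routed by where the touch lands.
- (void)handlePan:(UIPanGestureRecognizer *)pan {
    CGPoint point = [pan locationInView:self.view]; // superview coordinates
    if (CGRectContainsPoint(self.wheelView.frame, point)) {
        // Normalise into the wheel's own coordinate space before using it.
        CGPoint local = [self.view convertPoint:point toView:self.wheelView];
        [self rotateWheelWithTouchAtPoint:local];
    } else {
        [self slideMainViewWithTranslation:[pan translationInView:self.view]];
    }
}
```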
PS: This being said, I strongly feel this is a design gotcha and it will hurt you. I have done things like this, but eventually you will stumble on some corner case where the user interaction is horrid. If this is the main view of your app, I suggest you rethink the design.
What techniques could I employ to clearly separate each of these gesture recognisers?
You should look into the UIGestureRecognizerDelegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
Asks the delegate if two gesture recognizers should be allowed to recognize gestures simultaneously.
This method is called when recognition of a gesture by either gestureRecognizer or otherGestureRecognizer would block the other gesture recognizer from recognizing its gesture. Note that returning YES is guaranteed to allow simultaneous recognition; returning NO, on the other hand, is not guaranteed to prevent simultaneous recognition because the other gesture recognizer's delegate may return YES.
I believe this will solve your problem of gestures being 'overtaken' by other gestures.

UIPanGestureRecognizer in iOS 4.3 not working

I have a UIImageView subclass and I need to have a pan gesture so I added the following code:
UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan)];
[self addGestureRecognizer:panRecognizer];
but my handlePan selector never gets called.
Is there something else I need to do?
Thanks
If your object is UIImageView subclass, you have to enable user interaction. It is set to NO by default for UIImageView.
self.userInteractionEnabled = YES;
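Putting both pieces together, a minimal sketch of such a subclass might look like this (the class name and the `handlePan:` handler are illustrative, not from the original question):

```objectivec
@implementation MyPannableImageView // hypothetical UIImageView subclass

- (instancetype)initWithImage:(UIImage *)image {
    if ((self = [super initWithImage:image])) {
        // UIImageView disables user interaction by default, so gesture
        // recognizers attached to it never fire without this line.
        self.userInteractionEnabled = YES;
        UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
            initWithTarget:self action:@selector(handlePan:)];
        [self addGestureRecognizer:pan];
    }
    return self;
}

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    // react to the pan here
}

@end
```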
I had EXACTLY the same problem using the StoryBoard - I created a sub-view in my main view, dropped a Pan Gesture Recognizer on it, created an action and joined the pan to it, and it didn't work.
My sub-view had User Interaction Enabled checked and it still didn't work. Frustrated, I deleted the pan gesture, added it back, everything seemed hooked up, and it still didn't work.
Finally, I looked at the SUPERVIEW: its User Interaction Enabled was unchecked. Checking it made everything work.
So as a caveat, if it's not working, look at the parent views too!

MBProgressHUD tap to cancel, longer text?

I'd like to use MBProgressHUD (or similar look) as alternative to default UIAlertView.
I need a canceling capability on this view.
I tried adding the following method to MBProgressHUD class but it didn't get called when touched.
Any idea?
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
I can't use a gesture recognizer since my lowest target version is 3.1.2.
Also, it seems complex to enlarge the label size for MBProgressHUD's text.
Are there alternatives to fixing MBProgressHUD for this purpose?
I just had a quick look at MBProgressHUD and would use that. First, change the size of the HUD by modifying layoutSubviews in MBProgressHUD.m. I would then create a new button class (a UIButton subclass) and add it as a subview of the HUD.
This is a super old thread, but it would be way easier just to set the HUD's userInteractionEnabled to YES and add a tap gesture recognizer to it.
Cheers.
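A sketch of that later suggestion, assuming `hud` is your MBProgressHUD instance, `hudTapped:` is a cancel handler you write yourself, and your deployment target supports gesture recognizers:

```objectivec
// Let the HUD receive touches, then attach a tap recognizer for cancelling.
hud.userInteractionEnabled = YES;
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(hudTapped:)]; // hypothetical handler
[hud addGestureRecognizer:tap];
```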

MKMapview with UIGestureRecognizers

So I have a UIView as the root view on my iPad. I add some subviews to it, among which there is also an MKMapView.
What I am trying to achieve is to detect a 3-finger swipe across the screen so I can react to it. Furthermore, I need to distinguish between a 3-finger swipe to the left and one to the right.
Before I added the map view, I was experimenting with touchesMoved etc. Since I found this to be inaccurate, I moved to using UISwipeGestureRecognizer, which worked well.
Anyway, once I added the map, it ate all my touches. So I kept looking for answers.
This one seemed promising:
Intercepting/Hijacking iPhone Touch Events for MKMapView
as well as subclassing UIWindow and intercepting the touches.
Well, it turns out none of them works well for me, since in both cases I end up either in
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
or in the situation of having to recognise the touch type etc. myself.
This is what I was trying to do in the first place!
So my question is: Is there a way to use UIGestureRecognizers the way I described above to maintain my controls while keeping the functionality of the Mapview?
Have I described my problem accurately enough?
Greetz.
This answer describes subclassing a gesture recogniser to ensure that it does not interfere with other recognisers.
You could add a custom recogniser to MKMapView that intercepts three finger swipe (and reacts appropriately) but then allows all other gestures to be processed as normal by the MKMapView.
The link above gives an example of adding a simple finger press by subclassing UIGestureRecognizer. For a three-finger swipe I would subclass UISwipeGestureRecognizer.
I have had success using this method to add two finger rotate to a MKMapView without messing up pan and zoom etc.
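A sketch of such a subclass is below. This assumes importing UIGestureRecognizerSubclass.h (required to override recognizer behaviour); the class and handler names are illustrative.

```objectivec
#import <UIKit/UIGestureRecognizerSubclass.h>

// A swipe recogniser that never blocks the map's own recognisers.
@interface ThreeFingerSwipeGestureRecognizer : UISwipeGestureRecognizer
@end

@implementation ThreeFingerSwipeGestureRecognizer
- (BOOL)canPreventGestureRecognizer:(UIGestureRecognizer *)preventedGestureRecognizer {
    return NO; // leave pan/zoom on the map untouched
}
@end

// Usage:
ThreeFingerSwipeGestureRecognizer *swipe =
    [[ThreeFingerSwipeGestureRecognizer alloc] initWithTarget:self
                                                       action:@selector(didThreeFingerSwipe:)];
swipe.numberOfTouchesRequired = 3;
swipe.direction = UISwipeGestureRecognizerDirectionLeft;
[mapView addGestureRecognizer:swipe];
```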
The map should not "eat" all your touches. There's something wrong with that - I have successfully added long press and tap recognizers to the MKMapView, and they worked as expected.
Make sure to make your controller the delegate of all the gesture recognizers you create, and be sure to implement the method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer;
You can gain further control by implementing
-(BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer;
After that, it's very much up to the details of your implementation to make the gesture recognizer work with the map view. In my experience you can add your own, and it should not interfere with the map's recognizers.
Some other thoughts:
- Overlay an invisible user interaction enabled view over the MKMapView to take events from it. Depending on how much interaction you need on the map view you could turn this on and off, etc.
- Add another swipe recognizer to the map view itself (in addition to the one on the parent view).
You could also play around with UIView hitTest method but for me that was always too low level.
The question is old, but still relevant. My two cents, especially for OS X.
If you need to detect in code whether a region change comes from a pinch or other user gesture:
(I set an ivar so I can check it in other places in the controller code.)
func mapView(_ mapView: MKMapView, regionWillChangeAnimated animated: Bool) {
    guard let view = mapView.subviews.first else { return }
    self.regionChangeIsFromUserInteraction = false
    #if os(iOS)
    guard let recognizers = view.gestureRecognizers else { return }
    #elseif os(OSX)
    // on OSX they are on the map view itself
    let recognizers = mapView.gestureRecognizers
    #endif
    for recognizer in recognizers {
        let state = recognizer.state
        if state == .began || state == .ended {
            self.regionChangeIsFromUserInteraction = true
            break
        }
    }
    #if DEBUG
    print("regionChangeIsFromUserInteraction ", self.regionChangeIsFromUserInteraction)
    #endif
}
Would removing the swipe gesture recognisers from your MKMapView resolve the issue? Note that you must remove the recognisers the map view already has; creating fresh UISwipeGestureRecognizer instances and passing them to removeGestureRecognizer: is a no-op, since they were never added. Something like:
for (UIGestureRecognizer *recognizer in [YourMKMapView.gestureRecognizers copy]) {
    if ([recognizer isKindOfClass:[UISwipeGestureRecognizer class]]) {
        [YourMKMapView removeGestureRecognizer:recognizer];
    }
}
That way, MKMapView will not respond to swipe gestures and passes them up to the container view controller of the MKMapView.