MKMapView with UIGestureRecognizers - iPhone

So I have a UIView as my root view on my iPad. I add some subviews to it, among which there's also an MKMapView.
The thing I am trying to achieve is to detect a 3-finger swipe across the screen so I can react to it. Furthermore, I need to distinguish between a 3-finger swipe to the left and one to the right.
Before I added the map view, I was experimenting with touchesMoved etc. Since I found this to be inaccurate, I moved to using UISwipeGestureRecognizer, which worked well.
Anyway, once I added the map, it ate all my touches. So I kept looking for answers.
This one seemed promising:
Intercepting/Hijacking iPhone Touch Events for MKMapView
as well as subclassing UIWindow and intercepting the touches.
Well, it turns out neither of them works well for me, since in both cases I end up either in
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
or in the situation of having to recognize the touch type myself, etc.
That is exactly the manual work I was trying to get away from in the first place!
So my question is: Is there a way to use UIGestureRecognizers as described above to keep my controls working while preserving the functionality of the map view?
Have I described my problem accurately enough?
Greetz.

This answer describes subclassing a gesture recogniser to ensure that it does not interfere with other recognisers.
You could add a custom recogniser to the MKMapView that intercepts the three-finger swipe (and reacts appropriately) but then allows all other gestures to be processed as normal by the MKMapView.
The link above gives an example of adding a simple finger press by subclassing UIGestureRecognizer. For a three-finger swipe I would subclass UISwipeGestureRecognizer.
I have had success using this method to add a two-finger rotate to an MKMapView without messing up pan and zoom etc.
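For illustration, a minimal sketch of that approach (the subclass name is hypothetical). The two overrides, declared in UIGestureRecognizerSubclass.h, stop this recognizer from blocking, or being blocked by, the map's built-in recognizers:
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface ThreeFingerSwipeGestureRecognizer : UISwipeGestureRecognizer
@end

@implementation ThreeFingerSwipeGestureRecognizer
// Never prevent the map's built-in recognizers from running.
- (BOOL)canPreventGestureRecognizer:(UIGestureRecognizer *)preventedGestureRecognizer {
    return NO;
}
// And never let the map's recognizers prevent this one either.
- (BOOL)canBePreventedByGestureRecognizer:(UIGestureRecognizer *)preventingGestureRecognizer {
    return NO;
}
@end
It would then be attached to the map view like any other swipe recognizer:
ThreeFingerSwipeGestureRecognizer *swipe =
    [[ThreeFingerSwipeGestureRecognizer alloc] initWithTarget:self
                                                       action:@selector(handleThreeFingerSwipe:)];
swipe.direction = UISwipeGestureRecognizerDirectionLeft;
swipe.numberOfTouchesRequired = 3;
[mapView addGestureRecognizer:swipe];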

The map should not "eat" all your touches. Something is wrong there - I have successfully added long-press and tap recognizers to an MKMapView, and they worked as expected.
Make sure to assign your controller as the delegate of all the gesture recognizers you create, and be sure to implement the method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer;
You can gain further control by implementing
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer;
After that, it's very much down to the details of your implementation to make the gesture recognizer work with the map view. In my experience you can add your own, and it should not interfere with the map's recognizers.
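As a minimal sketch, assuming the controller has been set as the delegate of each recognizer you create, those delegate methods could look like this:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // Let the custom recognizer run alongside the map's internal pan/zoom recognizers.
    return YES;
}

- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    // Add any per-gesture conditions here; returning YES imposes no restriction.
    return YES;
}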
Some other thoughts:
- Overlay an invisible, user-interaction-enabled view over the MKMapView to take events from it. Depending on how much interaction you need on the map view, you could turn this on and off, etc.
- Add another swipe recognizer to the map view itself (in addition to the one on the parent view).
You could also play around with the UIView hitTest: method, but for me that was always too low-level.

The question is old but still relevant. My two cents, especially for OSX:
If you need to detect in code whether a region change comes from a user gesture (e.g. a pinch):
(I set an ivar so I can check it in other places in the controller code.)
func mapView(_ mapView: MKMapView, regionWillChangeAnimated animated: Bool) {
    guard let view = mapView.subviews.first else { return }
    self.regionChangeIsFromUserInteraction = false
    #if os(iOS)
    // On iOS the map's recognizers live on its first subview.
    guard let recognizers = view.gestureRecognizers else { return }
    #elseif os(OSX)
    // On OSX they are on the map view itself.
    let recognizers = mapView.gestureRecognizers
    #endif
    for recognizer in recognizers {
        if recognizer.state == .began || recognizer.state == .ended {
            self.regionChangeIsFromUserInteraction = true
            break
        }
    }
    #if DEBUG
    print("regionChangeIsFromUserInteraction ", self.regionChangeIsFromUserInteraction)
    #endif
}

Would removing the swipe gesture recognizers from your MKMapView resolve the issue?
// Remove any swipe recognizers the map view already has attached
// (removeGestureRecognizer: only works on recognizers actually attached to the view):
for (UIGestureRecognizer *recognizer in YourMKMapView.gestureRecognizers) {
    if ([recognizer isKindOfClass:[UISwipeGestureRecognizer class]]) {
        [YourMKMapView removeGestureRecognizer:recognizer];
    }
}
That way, the MKMapView will not respond to swipe gestures, and the swipes fall through to the MKMapView's container view controller.

Related

Three Layers of Pan Gesture Recogniser Confusion

Whilst developing an app I have come up against a problem with having too many pan gesture recognisers.
My first pan gesture recogniser is on the MainViewController, which is a parent of the RecipeSearchVC. This gesture recogniser slides the whole view left or right.
My second pan gesture recogniser is in the RecipeSearchParametersVC, which is a parent of a Page View Controller.
The third pan gesture recogniser is added to a UIControl wheel nested inside a view controller which is represented by the PageViewController.
I know this sounds insane, and it could be argued that this is poor design. However, I believe that if this worked cohesively it would be fine.
When trying to rotate the wheel, it will rotate for a second or two before the gesture is overtaken by either the PageViewController or the MainViewController. More often than not it is the MainViewController that takes over. What techniques could I employ to clearly separate each of these gesture recognisers?
EDIT:
Apologies for the vagueness of my description when it comes to the pan gesture recognisers.
The MainViewController has its own UIPanGestureRecogniser to allow it to move everything left or right.
The RecipeSearchParametersVC only has a UIPanGestureRecogniser because of the UIPageViewController it contains. It does not add the gesture recogniser itself, but simply takes them from the pageViewController.
The UIControl's gesture recognisers allow it to track the rotation it should undergo.
Taking the advice given, I may remove the gestures from the page view controller and substitute them with buttons. I only intended this to work like the images (which can be scrolled to reveal more images) found in iBooks, and so I thought that it would work fine.
UIControl UIPanGestureRecogniser Code
/**
 * Sent to the control when a touch related to the given event enters the control's bounds.
 *
 * @param touch UITouch object that represents a touch on the receiving control during tracking
 * @param event event object encapsulating the information specific to the user event
 */
- (BOOL)beginTrackingWithTouch:(UITouch *)touch
                     withEvent:(UIEvent *)event
{
    [super beginTrackingWithTouch:touch withEvent:event];
    CGPoint touchPoint = [touch locationInView:self];
    // filter out touches too close to the centre of the wheel
    CGFloat magnitudeFromCentre = [self calculateDistanceFromCentre:touchPoint];
    if (magnitudeFromCentre < 40) return NO;
    // calculate the distance from the centre
    CGFloat deltaX = touchPoint.x - _container.center.x;
    CGFloat deltaY = touchPoint.y - _container.center.y;
    // take the arctangent of the opposite (y axis) over the adjacent (x axis) to get the angle
    _deltaAngle = atan2(deltaY, deltaX);
    _startTransform = _container.transform;
    // selection in limbo, so set all sector images to the minimum value by changing the current one
    [self getSectorByValue:_currentSector].alpha = kMinimumAlpha;
    return YES;
}
Unfortunately, due to the nature of my controller hierarchy, I was forced to rethink the design of my app.
The MainViewController with the UIPanGestureRecogniser has stayed as is.
The UIPageViewController with the UIControl has moved to a separate static view controller.
This works far better but is not yet ideal. The UIPageViewController steals any horizontal panning; however, this can probably be fixed by implementing buttons as an alternative to the scrolling.
The UIControl did not have a gesture recogniser; instead I overrode beginTrackingWithTouch: and the other tracking methods to follow the touches.
I suppose the answer should be: if you are layering too many gestures, you're doing it wrong.
You will need to add a container to the wheel, and then you can do something like this; if I am not missing something, this code should work.
UIPanGestureRecognizer *pan1 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan1:)];
[self.view addGestureRecognizer:pan1];
UIPanGestureRecognizer *pan2 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan2:)];
[view2 addGestureRecognizer:pan2];
UIPanGestureRecognizer *pan3 = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan3:)];
[view3 addGestureRecognizer:pan3];

- (void)pan1:(UIPanGestureRecognizer *)sender {
    NSLog(@"%@", sender);
}
- (void)pan2:(UIPanGestureRecognizer *)sender {
    NSLog(@"%@", sender);
}
- (void)pan3:(UIPanGestureRecognizer *)sender {
    NSLog(@"%@", sender);
}
I believe you will have defined 3 different UIPanGesture objects and attached them to the appropriate views.
While technically this is correct, it can cause some confusion: for example, if you have several overlapping views (you have 3 here) and need a touch to be sent to a view that's not at the top of the view stack, what happens? The gesture handling could end up being confusing.
Instead, it's possible to attach a single gesture recognizer to the superview of the several target views and delegate the gesture to the correct view based on the coordinates of where the user is touching. For that you need to normalize the touch coordinates originating from any of the subviews to the superview where the UIPanGesture is defined. That way you can know whether the pan happened on the wheel or elsewhere, as sketched below.
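A minimal sketch of that idea (the names wheelView, rotateWheelWithPan: and slideMainViewWithPan: are hypothetical): one pan recognizer on the superview, dispatched by touch location:
- (void)handlePan:(UIPanGestureRecognizer *)pan {
    // Normalize to the superview's coordinate space.
    CGPoint location = [pan locationInView:self.view];
    if (CGRectContainsPoint(self.wheelView.frame, location)) {
        [self rotateWheelWithPan:pan];   // the pan is over the wheel
    } else {
        [self slideMainViewWithPan:pan]; // treat it as the main left/right slide
    }
}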
PS: This being said, I strongly feel this is a design gotcha and it will hurt you. I have done things like this, but eventually you will stumble on some corner case where the user interaction is horrid. If this is the main view of your app, I suggest you rethink the design.
What techniques could I employ to clearly separate each of these gesture recognisers?
You should look into the UIGestureRecognizerDelegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
Asks the delegate if two gesture recognizers should be allowed to recognize gestures simultaneously.
This method is called when recognition of a gesture by either gestureRecognizer or otherGestureRecognizer would block the other gesture recognizer from recognizing its gesture. Note that returning YES is guaranteed to allow simultaneous recognition; returning NO, on the other hand, is not guaranteed to prevent simultaneous recognition because the other gesture recognizer's delegate may return YES.
I believe this will solve your problem of gestures being 'overtaken' by other gestures.

How to prioritize gesture recognizers and touches in a UIView

From reading the UIGestureRecognizer Class Reference, it is implied that the API will handle the prioritizing of touches and gestures for you, making sure that your touchesBegan and related methods are not called on the view unless the gesture recognizers have first failed:
A window delivers touch events to a gesture recognizer before it delivers them to the hit-tested view attached to the gesture recognizer. Generally, if a gesture recognizer analyzes the stream of touches in a multi-touch sequence and does not recognize its gesture, the view receives the full complement of touches. If a gesture recognizer recognizes its gesture, the remaining touches for the view are cancelled.
I have added a swipe gesture to my view, and it is working. Via some logging, when I do a single swipe, the method reports as such. However, my touchesBegan method is also reporting via its log, even though the touchesCancelled method is, as expected, also receiving a message.
I want, and expect, the gesture recognizer to prevent touchesBegan or touchesMoved from being called.
So my question is: for the gesture recognizer to in fact delay touches based on its state, is there additional setup necessary? The docs do not suggest anything else is necessary.
My setup is simply:
swipeUpTwoFinger = [[[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(doubleSwipeUp:)] autorelease];
swipeUpTwoFinger.direction = UISwipeGestureRecognizerDirectionUp;
swipeUpTwoFinger.numberOfTouchesRequired = 2;
[self addGestureRecognizer:swipeUpTwoFinger];
I have also tried this test to make sure a recognizer has failed before proceeding with touchesBegan (this test should not be necessary if you believe what the docs say above), but touchesBegan still executes the log line after this test:
if (swipeUpTwoFinger.state == UIGestureRecognizerStateFailed)
It sounds like you need:
swipeUpTwoFinger.delaysTouchesBegan = YES;

Detecting a finger being held on an object

I am trying to have an image that wiggles when the user touches it and stops as soon as the user lifts their finger.
Is there a gesture I can use to detect while the finger is down, not just on the initial touch or when the user moves their finger?
I have tried a long-press gesture, but that does not get called the entire time the finger is on the view. Can anyone help me with the best way to achieve this? Right now I am doing it using touchesBegan, touchesMoved, and touchesEnded, but I was wondering if there is a better way.
Any suggestions are greatly appreciated.
Thanks
EDIT
Based on the comments, I slightly misunderstood the original question, so I have edited my answer to a different solution, which hopefully is a bit clearer (and answers the actual question - not the one that was in my head).
A long-press gesture is continuous (where a tap gesture is not). That means the recognizer callback will continue to be invoked until the gesture is complete - which does not happen until the "long press" is released. So, the following should do what you want. NOTE: I think you want to "start shaking" a view when the long press is recognized, then "stop shaking" the view when the finger is released. I just pretended you have functions for that. Substitute appropriately.
- (void)handleLongPress:(UILongPressGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
        StartShakingView(gestureRecognizer.view);
    } else if (gestureRecognizer.state == UIGestureRecognizerStateEnded) {
        StopShakingView(gestureRecognizer.view);
    }
}
The Apple Touches sample includes code that demonstrates using both UIResponder and UIGestureRecognizer methods.
Either should work for what you're doing.
Simple answer - you could make the image a UIButton, and start the wiggle on TouchDown, and stop it on TouchUpInside or TouchUpOutside
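A minimal sketch of that button approach (startWiggling and stopWiggling are hypothetical methods of your own):
UIButton *imageButton = [UIButton buttonWithType:UIButtonTypeCustom];
[imageButton setImage:[UIImage imageNamed:@"wiggler"] forState:UIControlStateNormal];
// Start on the initial touch...
[imageButton addTarget:self action:@selector(startWiggling)
      forControlEvents:UIControlEventTouchDown];
// ...and stop when the finger lifts, inside or outside the button.
[imageButton addTarget:self action:@selector(stopWiggling)
      forControlEvents:(UIControlEventTouchUpInside | UIControlEventTouchUpOutside)];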
It sounds like you want to subclass UIGestureRecognizer, which, as I recall, gets the touchesBegan:... and associated methods. Read the notes on subclassing in the UIGestureRecognizer reference. Or use a UIButton as SomaMan suggests.

UIView, how to determine when touches entered the view

It appears that all the touch methods of a UIView are only called if the touch began within the bounds of that view. Is there a way to have a view respond to a user who touched down outside the view but then dragged their finger into it?
In case it matters, my specific application is dragging an MKPinAnnotationView (using the built-in 4.0 dragging). I want something to happen if the user drags a pin onto another view (which happens to be an annotation view as well, but it could be anything). No method for dragging is called until I let go of the pin, and no method on the UIView being dragged onto seems to be called unless I started by touching within that view.
Because the superview is an MKMapView, it is difficult to just use its touchesMoved event and check whether the user is in the right location. Thanks!
So after playing around with it for a while, I found that the answer given here actually gave me what I needed, even though the question being asked was different.
It turns out you can subclass UIGestureRecognizer and have it handle all the touches for the view it has been added to (including an MKMapView). This allows all the normal MKMapView interactions to behave without any problem, but it also alerts me of the touches. In touchesMoved, I just check the location of the touch and see whether it is within the bounds of my other view.
From everything I tried, this seems to be the only way to intercept touchesMoved while the user is dragging an MKAnnotation.
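A minimal sketch of such an observing recognizer (the class name and targetView property are hypothetical). It never changes its own state, so it never recognizes, and the overrides keep it from interfering with the map's recognizers:
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface TouchObserverGestureRecognizer : UIGestureRecognizer
@property (nonatomic, assign) UIView *targetView; // the view to test drags against
@end

@implementation TouchObserverGestureRecognizer
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.targetView];
    if ([self.targetView pointInside:point withEvent:event]) {
        // The drag has entered the target view; react here.
    }
}
- (BOOL)canPreventGestureRecognizer:(UIGestureRecognizer *)preventedGestureRecognizer {
    return NO; // never block the map's own recognizers
}
@end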
You sure can:
(HitstateView.h)
#import <UIKit/UIKit.h>

@interface HitstateView : UIView {
    id overrideObject;
}
@property (nonatomic, retain) id overrideObject;
@end

(HitstateView.m)
#import "HitstateView.h"

@implementation HitstateView
@synthesize overrideObject;

- (void)dealloc {
    self.overrideObject = nil;
    [super dealloc];
}

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self) {
        return overrideObject;
    }
    return hitView;
}
@end
Make this view the size of your touch area. Set the overrideObject to the view you want the touches to go to. IIRC it ought to be a subview of the HitstateView.
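Hypothetical usage, covering the whole map and routing any stray hits to a drop-target view (dropTargetView is an assumed name):
HitstateView *hitstateView = [[HitstateView alloc] initWithFrame:mapView.bounds];
hitstateView.overrideObject = dropTargetView; // the view that should receive the touches
[mapView addSubview:hitstateView];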
Every view inherits UIResponder, so every view gets touchesBegan/Moved/Ended - I do not think starting the touch outside the view means the view gets no event when the touch moves over it. If you want a notification that something has been dragged onto your MKMapView, you should make a subclass that handles the touch but then passes the event to super, allowing the hierarchy to do whatever it needs to do with the touch. You don't need to capture or modify the event, just observe it.
It depends on how your views are set up. Generally, leveraging the responder chain is the best way to go. It allows you to play tricks, though it may be too specific to address your particular needs.
You can also play tricks with forwarding events by overriding hit testing:
http://developer.apple.com/library/ios/#documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/MultitouchEvents/MultitouchEvents.html%23//apple_ref/doc/uid/TP40009541-CH3-SW3
Your particular case sounds pretty exotic, so you may have to play tricks like having a parent view whose frame is large enough to contain both views in question.

MKMapView regionDidChangeAnimated not always called!

This is frustrating me!!!
It will be called most of the time, but then it stops responding to pinches. It will be called on a screen rotation and a double tap, but not on a pinch!
Help!
I was working on some code that had the same issue, and it turns out the problem was that a view with a UIGestureRecognizer had been added as a subview of the MKMapView; sometimes this caused some delegate methods not to fire.
So make sure you aren't adding subviews or anything to the MKMapView.
Hope this helps.
I was moving the map in code, and then it appears I needed to call
[mapView setNeedsDisplay];
afterwards!
I think this problem may have something to do with multi-threading.
I had the same problem this morning. I use a gesture recognizer to capture a long-press event and then add a pin to the map view. It works well, but after a few rounds the regionDidChange method stops being called.
I tried a few solutions here, but none worked. Then I recalled some other issues I had before with the multi-threaded nature of actions. So I moved the code that controls the map view in the long-press action into a block that runs on the main thread, and the problem was solved.
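A minimal sketch of that fix (self.mapView and addPinAtCoordinate: are assumed names for your own map view outlet and pin-adding helper):
- (void)handleLongPress:(UILongPressGestureRecognizer *)recognizer {
    if (recognizer.state != UIGestureRecognizerStateBegan) return;
    CGPoint point = [recognizer locationInView:self.mapView];
    CLLocationCoordinate2D coordinate =
        [self.mapView convertPoint:point toCoordinateFromView:self.mapView];
    // Touch the map view from the main thread only.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self addPinAtCoordinate:coordinate];
    });
}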
I managed to solve this problem by disabling the gesture recognizer within the touchesBeganCallback
self.tapInterceptor.touchesBeganCallback = ^(NSSet *touches, UIEvent *event) {
    self.tapInterceptor.enabled = NO;
    // do something
};
and reenabling it in the regionDidChangeAnimated delegate method
- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated {
    self.tapInterceptor.enabled = YES;
    // do something
}
Whenever a tap gesture recognizer is added to the map view, setting
recognizer.cancelsTouchesInView = NO;
takes care of the problem, provided your business logic allows touches on the map view to be processed twice (by the MKMapView AND by the gesture recognizer that was interfering with region[Will,Did]ChangeAnimated:).