How to prevent simultaneous UIGestureRecognizers - iPhone

I am using a UIPanGestureRecognizer on several card-like views to let the user move the views around the screen. It's very nice that they can put down 3 fingers and pick up 3 cards at once; however, some of my functionality isn't designed to work like that.
I'd like to only allow 1 gesture recognizer to run at a time. Is there a preferred way to do this?
I've considered:
gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer:, but it already returns NO by default.
Setting an instance variable when the first gesture begins, but I'm concerned about multithreaded access to this variable (should I use @synchronized, or would that be too much overhead?).
Keeping an array of the gesture recognizers and checking their state in gestureRecognizerShouldBegin: to ensure that none are in progress (see the sketch below).
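For reference, a minimal sketch of that third idea, assuming the recognizers are collected in an array property (the property name is illustrative):

    - (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
        // Veto a new gesture while any other recognizer in our list is active.
        for (UIGestureRecognizer *other in self.cardRecognizers) { // hypothetical array
            if (other == gestureRecognizer) continue;
            if (other.state == UIGestureRecognizerStateBegan ||
                other.state == UIGestureRecognizerStateChanged) {
                return NO;
            }
        }
        return YES;
    }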
Thanks.

The best practice is to use one (global) gesture recognizer on the view that is the superview of your cards, with hitTest: to determine which card was touched. That also lets you handle multiple touches correctly.

Put a single UIPanGestureRecognizer on the common superview of all your cards, and then do hit detection to find the card in question when the gesture starts. That way you only have 1 gesture recognizer, so only one gesture can run at a time.
Edit: BTW, your idea of keeping an ivar, while clumsy, would work. UIGestureRecognizer is part of UIKit and only operates on the main thread, so you don't have to worry about multithreaded access. But like I said, it's clumsy. Using a single "master" UIGestureRecognizer instead is cleaner.
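A minimal sketch of that approach, assuming the cards are direct subviews of a container view (property and selector names are illustrative):

    // Setup, e.g. in viewDidLoad: one pan recognizer on the common superview.
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
        initWithTarget:self action:@selector(handlePan:)];
    [self.cardContainer addGestureRecognizer:pan];

    // The handler hit-tests once when the gesture begins, then drags that card.
    - (void)handlePan:(UIPanGestureRecognizer *)pan {
        if (pan.state == UIGestureRecognizerStateBegan) {
            CGPoint p = [pan locationInView:self.cardContainer];
            // hitTest: returns the deepest subview under the point.
            self.draggedCard = [self.cardContainer hitTest:p withEvent:nil];
        } else if (pan.state == UIGestureRecognizerStateChanged) {
            CGPoint t = [pan translationInView:self.cardContainer];
            self.draggedCard.center = CGPointMake(self.draggedCard.center.x + t.x,
                                                  self.draggedCard.center.y + t.y);
            [pan setTranslation:CGPointZero inView:self.cardContainer];
        }
    }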

Related

Using UITapGestureRecognizer rather than manually calling tapCount

I've been checking for multiple taps, whether it is 2 or 10 by simply calling tapCount on any touch:
[[touches anyObject] tapCount]==2
This simply checks for a double tap.
It works fine. I'm wondering if there is any particular reason to instead start using UITapGestureRecognizer.
It would seem that the UITapGestureRecognizer API provides wrappers around the same functionality as just inspecting touches directly, as above. Things like tapCount and the number of fingers on the screen don't require UITapGestureRecognizer.
For things like swipes, I can see the simplicity in letting UIKit handle recognizing those, as they are harder to code manually, but for a tapCount? Where's the real gain here, what am I missing?
Gesture recognizers provide for coordination in processing multiple gesture types on the same view. See the discussion of the state machine in the documentation.
If a tap is the only gesture of interest, you may not find much value, but the architecture comes in handy if you want to coordinate the recognition of taps with other gestures, provided either by you or by system-supplied classes such as scroll views. Gesture recognizers are given first crack at touches, so you will need to use this architecture if you want, for example, to recognize touches in a child of a scroll view before the scroll view processes them.
The gesture recognizers can also be set to defer recognition, so, for example, the action for a single tap is not called until a double tap has timed out.
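For example, the deferral described above is a one-liner with the standard API (a sketch; the action selectors are assumptions):

    UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleSingleTap:)];
    UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleDoubleTap:)];
    doubleTap.numberOfTapsRequired = 2;

    // The single-tap action fires only after the double-tap recognizer fails.
    [singleTap requireGestureRecognizerToFail:doubleTap];

    [self.view addGestureRecognizer:singleTap];
    [self.view addGestureRecognizer:doubleTap];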
In general, the gesture recognizer approach is a good one to adopt because it allows gestures to be managed in a consistent fashion across apps and code sources. If Apple wanted to add an assistive-technology preference that allowed the user to select a longer interval over which a double tap would be recognized, they could do this without requiring any changes to the code of developers using standard gesture recognizers.
I should add that gesture recognizers can be added directly to your storyboard or nib, so in most cases you only need to code the target action, which could be a time saver in new code.
UITapGestureRecognizer provides a cleaner, easier-to-use API, but no new functionality. So for your case, no reason.
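For comparison, the recognizer-based equivalent of the tapCount check above is just a few lines (a sketch; the selector name is an assumption):

    UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleDoubleTap:)];
    doubleTap.numberOfTapsRequired = 2; // same condition as tapCount == 2
    [self.view addGestureRecognizer:doubleTap];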

Confining a swipe gesture to a certain area (iPhone)

I have just managed to implement detection of a swipe gesture for my app. However I would like to confine the area where the gesture is valid. Thinking about this I came up with a possible solution which would be to check whether the start & finish coordinates are within some area. I was just wondering if there's a better or preferred method of doing something like this.
Simply create an invisible UIView (i.e., one with a transparent background) and set its frame so it encloses the region in which you want to detect the gesture.
Then, simply add a UISwipeGestureRecognizer to that view, and you are done.
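A minimal sketch, assuming a hypothetical frame and swipe direction:

    // Transparent view covering only the region where the swipe is valid.
    UIView *swipeZone = [[UIView alloc]
        initWithFrame:CGRectMake(0, 100, 320, 200)]; // illustrative frame
    swipeZone.backgroundColor = [UIColor clearColor];
    [self.view addSubview:swipeZone];

    UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleSwipe:)];
    swipe.direction = UISwipeGestureRecognizerDirectionLeft; // assumption
    [swipeZone addGestureRecognizer:swipe];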
Read the generic UIGestureRecognizer Class Reference and the part of the Event Handling Guide for iOS that talks about UIGestureRecognizers for more info.
Of course you could also manage the detection of the swipe gesture yourself using custom code, as explained in the very same guide, but why bother when UIGestureRecognizer can manage everything for you?

Slide of UIScrollView

I made a slide view using a UIView and detecting touches to move pages. This slide view is almost like this, except that I made it work like a UITableView.
Now I'm using it with UIViews that contain UIScrollViews. The problem is: how do I distribute touch events to the scroll view or to the slide view? I have the logic worked out; basically, the UIScrollViews scroll vertically and the slide view scrolls horizontally.
I tried hitTest: to keep touchesBegan, touchesMoved, and touchesEnded in the slide view. When a touch moves horizontally I keep it in the slide view; when it moves vertically I want to hand it to the UIScrollView. But I cannot figure out how to forward the events to the UIScrollView.
Calling [scrollView touchesBegan:touches withEvent:event] doesn't work. I suppose UIScrollView works in a different way.
If you can't find any clue to an answer, you're probably approaching the problem the wrong way.
UIScrollView receives touchesBegan, touchesMoved, and touchesEnded in its own way, one I don't fully understand, which means that if you expect overriding touchesBegan to simply make UIScrollView stop working, you won't get what you want. Using hitTest: in the scroll view's superview you can see the touches before UIScrollView does, but you cannot change a touch's target while the touch sequence is in progress.
That said, there is a way to solve this; actually, three ways.
1- Simulate touches
I didn't test this; you'll see why below. Events come from UIWindow and are distributed to subviews by - (void)sendEvent:(UIEvent *)event. We don't know how a touch's target is stored, and changing it directly is completely out of the question. But we can build on the idea of overriding the superview's hitTest: to detect what the user is doing and work around the fixed target: simulate a touch-ended event, after which the target should be reset, then simulate a touch-began event, and this time make sure hitTest: delivers the touch to the scroll view.
You can find out how to simulate events here. The problem is that your app will probably be rejected for using private methods.
2- Make your own UIScrollView
This could be the best or the worst option, depending on what you want to do. I believe it's painful, and it isn't what you want to do right now.
3- Surrender to 'Nest UIScrollView'
For a slideshow of PDFs, hq, docs, and books, it's the best and most painless way: put one UIScrollView inside another and let them reach an agreement about scrolling. http://developer.apple.com/library/ios/#documentation/WindowsViews/Conceptual/UIScrollView_pg/NestedScrollViews/NestedScrollViews.html
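A rough sketch of the nesting, where pageCount and contentHeightForPage() are illustrative: the outer scroll view pages horizontally, each page holds a vertically scrolling inner scroll view, and UIKit arbitrates between them.

    CGSize pageSize = self.view.bounds.size;

    UIScrollView *outer = [[UIScrollView alloc] initWithFrame:self.view.bounds];
    outer.pagingEnabled = YES; // horizontal page-by-page "slide"
    outer.contentSize = CGSizeMake(pageSize.width * pageCount, pageSize.height);
    [self.view addSubview:outer];

    for (NSUInteger i = 0; i < pageCount; i++) {
        CGRect frame = CGRectMake(i * pageSize.width, 0,
                                  pageSize.width, pageSize.height);
        UIScrollView *inner = [[UIScrollView alloc] initWithFrame:frame];
        // Content taller than the page makes the inner view scroll vertically.
        inner.contentSize = CGSizeMake(pageSize.width, contentHeightForPage(i));
        [outer addSubview:inner];
    }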

Splitting a touch sequence between multiple UIGestureRecognizer instances

I'm developing an iPhone/iPad app that supports dragging items between table views. Since all the tables don't fit on screen, I've written a custom UIScrollView that lays them out horizontally, and supports paging.
While I've gotten the primary drag and drop together, there are a few remaining issues I can't get past.
After the user has selected an item to drag, and is dragging, they cannot scroll the UIScrollView to find the destination UITableView.
Sometimes the user will want to drag the item within the same table view. But once the drag has begun, the table view no longer recognizes the scroll gesture.
I've tried a variety of different options, including implementing a UIGestureRecognizerDelegate and allowing multiple gesture recognizers to recognize gestures simultaneously.
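For reference, the delegate part of that experiment amounts to something like this (a sketch), which alone did not solve the problem:

    // UIGestureRecognizerDelegate: allow the drag recognizer to run alongside
    // the scroll view's own pan recognizer.
    - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
        shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other {
        return YES;
    }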
The problem, as I see it, stems from this description from the Event Handling Guide: "iOS recognizes one or more fingers touching the screen as part of a multitouch sequence. This sequence begins when the first finger touches down on the screen and ends when the last finger is lifted from the screen."
UIGestureRecognizer instances always match against the entire sequence. In my case, I want to split a single sequence down into discrete gestures -- some touches recognize a dragging of an item, while different touches within the same sequence should be recognized as a swipe or scroll gesture. Effectively, I want my gesture recognizers to recognize simultaneously, but only different touches. Once one recognizes a touch as part of a gesture, that touch should be ignored by the others.
I haven't found a way to solve all these issues coherently using the default UIGestureRecognizer subclasses, and am now about to write my own custom multi-part gesture recognizer.
I'd rather not have to though -- is there any more appropriate way to achieve the same result?
Given the silence here, and a blog post I just found, I believe the answer is that, no, there is no way to do sub-gesture recognition with the standard framework.
For those looking to do something similar, take a look at this project/blog post, which is an attempt to create a sub-gesture recognition library:
http://sunetos.com/items/2010/10/31/adding-subgestures-to-ios-gesture-recognition/
I haven't used it -- I ended up hand-crafting my own interactions -- but will consider refactoring to use it if it pans out.

How to intercept touch events globally?

I have a view which is sometimes covered by other views. However, if the user slides a finger across the screen, I want that underlying view to slide across the screen, too.
I could start making custom views for all those covering subviews and forward all kinds of touch events, but that's somewhat cumbersome. Maybe there's some kind of notification or another way that a UIView or UIControl subclass can be aware of touch events happening right now, no matter where they are.
In short: I need a UIView or UIControl subclass which knows about any touch events happening on the entire screen. Or, if that's not possible, at least about any touch events happening above itself in the same superview.
Another description: There are 20 views, all reside inside the same superview. The first view is covered by 19 others. But if the user slides across the screen, that first view must slide too, so it must be aware of touch events.
Is there any better solution than making all 19 views forward touch events? (Yes, all 19 views respond to touch events in this example.)
Perhaps the hitTest:withEvent: method of UIView will help; it has helped me achieve what you're trying to do in the past.
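For example, a hedged sketch of that idea: a custom superview overrides hitTest:withEvent: to route touches to the covered view (the class and property are hypothetical):

    @interface ContainerView : UIView // hypothetical superview subclass
    @property (nonatomic, weak) UIView *slidableView; // the covered view
    @end

    @implementation ContainerView
    - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
        UIView *hit = [super hitTest:point withEvent:event];
        // Example policy: any touch inside our bounds drives the slidable view.
        // If the other views must still receive their touches, a gesture
        // recognizer on this superview with cancelsTouchesInView = NO is an
        // alternative worth considering.
        return hit ? self.slidableView : hit;
    }
    @end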