I'm developing an iPhone/iPad app that supports dragging items between table views. Since not all of the tables fit on screen, I've written a custom UIScrollView that lays them out horizontally and supports paging.
While I've got the basic drag and drop working, there are a few remaining issues I can't get past:

1. Once the user has picked up an item and is dragging it, they can no longer scroll the UIScrollView to find the destination UITableView.
2. Sometimes the user will want to drag the item within the same table view, but once the drag has begun, the table view no longer recognizes the scroll gesture.
I've tried a variety of different options, including implementing a UIGestureRecognizerDelegate and allowing multiple gesture recognizers to recognize gestures simultaneously.
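For reference, the delegate hook I tried looks roughly like this (a minimal sketch; in my case the delegate is set on the drag recognizer):

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    // Let the drag recognizer and the scroll view's pan recognizer both run.
    return YES;
}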
The problem, as I see it, stems from this description in the Event Handling Guide: "iOS recognizes one or more fingers touching the screen as part of a multitouch sequence. This sequence begins when the first finger touches down on the screen and ends when the last finger is lifted from the screen."
UIGestureRecognizer instances always match against the entire sequence. In my case, I want to split a single sequence into discrete gestures: some touches should be recognized as dragging an item, while other touches within the same sequence should be recognized as a swipe or scroll. Effectively, I want my gesture recognizers to recognize simultaneously, but each on different touches: once one recognizer claims a touch as part of its gesture, that touch should be ignored by the others.
I haven't found a way to solve all these issues coherently using the default UIGestureRecognizer subclasses, and am now about to write my own custom multi-part gesture recognizer.
I'd rather not have to though -- is there any more appropriate way to achieve the same result?
Given the silence here, and a blog post I just found, I believe the answer is that no, there is no way to do sub-gesture recognition with the standard framework.
For those looking to do something similar, take a look at this project/blog post, which is an attempt to create a sub-gesture recognition library:
http://sunetos.com/items/2010/10/31/adding-subgestures-to-ios-gesture-recognition/
I haven't used it -- I ended up manually crafting my own interactions -- but will consider refactoring to use it if it pans out.
I tried implementing selecting multiple items with a two-finger pan gesture. However, the checkmarks didn't always appear and disappear when tapping Edit to start the process or Done when finished.
I later discovered that it works fine when using a UITableViewController (chosen from the Cocoa Touch menu) instead of the UIViewController and UITableView I was using before.
So my question is: is it correct for me to now assume that these gestures, when used in a table, are really meant for a dedicated table view controller (with all the extra functionality you only get from it)?
Without any example code, I can't really see what might be going wrong. Have a look at the documentation to see if you are implementing it correctly.
https://developer.apple.com/documentation/uikit/uitableviewdelegate/selecting_multiple_items_with_a_two-finger_pan_gesture
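In case it helps, the delegate methods behind that feature (iOS 13 and later) look roughly like this; this is only a sketch, and it assumes the table view's allowsMultipleSelectionDuringEditing is enabled:

// Allow the two-finger pan to begin a multiple-selection interaction.
- (BOOL)tableView:(UITableView *)tableView shouldBeginMultipleSelectionInteractionAtIndexPath:(NSIndexPath *)indexPath
{
    return YES;
}

// Put the table into editing mode so the checkmarks appear.
- (void)tableView:(UITableView *)tableView didBeginMultipleSelectionInteractionAtIndexPath:(NSIndexPath *)indexPath
{
    [tableView setEditing:YES animated:YES];
}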
I've been checking for multiple taps, whether it is 2 or 10 by simply calling tapCount on any touch:
[[touches anyObject] tapCount] == 2
This simply checks for a double tap.
It works fine. I'm wondering if there is any particular reason to instead start using UITapGestureRecognizer.
It would seem that the UITapGestureRecognizer API provides wrappers around the same functionality as just inspecting touches directly, as above. Things like tapCount and the number of fingers on the screen don't require UITapGestureRecognizer.
For things like swipes, I can see the simplicity in letting UIKit handle recognizing those, as they are harder to code manually, but for a tapCount? Where's the real gain here? What am I missing?
Gesture recognizers provide for coordination in processing multiple gesture types on the same view. See the discussion of the state machine in the documentation.
If a tap is the only gesture of interest, you may not find much value, but the architecture comes in handy if you want to coordinate the recognition of taps with other gestures, provided either by you or by system-supplied classes such as scroll views. Gesture recognizers are given first crack at touches, so you will need to use this architecture if you want, for example, to recognize touches in a child of a scroll view before the scroll view processes them.
The gesture recognizers can also be set to defer recognition, so, for example, the action for a single tap is not called until a double tap has timed out.
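For example, here is a minimal sketch of deferring a single tap until a double tap has failed (the handler selectors are just placeholders):

UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleDoubleTap:)];
doubleTap.numberOfTapsRequired = 2;

UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleSingleTap:)];
singleTap.numberOfTapsRequired = 1;

// The single-tap action only fires once the double-tap recognizer has failed.
[singleTap requireGestureRecognizerToFail:doubleTap];

[self.view addGestureRecognizer:doubleTap];
[self.view addGestureRecognizer:singleTap];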
In general, the gesture recognizer approach is a good one to adopt because it allows gestures to be managed in a consistent fashion across apps and code sources. If Apple wanted to add an assistive-technology preference that allowed the user to select a longer interval over which a double tap would be recognized, they could do this without requiring any changes to the code of developers using standard gesture recognizers.
I should add that gesture recognizers can be added directly to your storyboard or nib, so in most cases you only need to code the target action, which could be a time saver in new code.
UITapGestureRecognizer provides a cleaner, easier to use API, but no new functionality. So for your case, no reason.
I have just managed to implement detection of a swipe gesture in my app. However, I would like to confine the area where the gesture is valid. Thinking about this, I came up with a possible solution, which would be to check whether the start and finish coordinates are within some area. I was just wondering if there's a better or preferred method of doing something like this.
Simply create an invisible UIView (i.e., one with a transparent background) and set its frame so it encloses the region in which you want to detect the gesture.
Then, simply add a UISwipeGestureRecognizer to that view, and you are done.
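A minimal sketch of that setup (the frame, direction, and handler name are placeholders for whatever your app needs):

// Invisible view that defines the region where the swipe is valid.
UIView *swipeRegion = [[UIView alloc] initWithFrame:CGRectMake(0, 100, 320, 200)];
swipeRegion.backgroundColor = [UIColor clearColor];
[self.view addSubview:swipeRegion];

UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
swipe.direction = UISwipeGestureRecognizerDirectionRight;
[swipeRegion addGestureRecognizer:swipe];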
Read the generic UIGestureRecognizer Class Reference and the part of the Event Handling Guide for iOS that talks about UIGestureRecognizers for more info.
Of course, you could also manage the detection of the swipe gesture yourself using custom code, as explained in the very same guide, but why bother when UIGestureRecognizers can manage everything for you?
When you're working with an atypical nested UITableView setup - where you have an outer vertical UITableView that hosts 90°-rotated UITableViews (see: Looking for a UI library to present Data horizontaly in iOS):
is there a way to make iOS process vertical and horizontal touches at the same time?
I found that iOS is very clever in processing touches:
horizontal touches make the relevant horizontal UITableView scroll, while a vertical swipe makes the outer UITableView scroll. Perfect.
Only, I'd love to be able to move my finger diagonally and see the outer UITableView and the inner UITableView scroll at the same time.
I tried a few approaches (playing with canCancelContentTouches, delaysContentTouches, and touch messages) but I haven't found a way to make this happen.
EDIT:
Here's a XCode4 project that shows this behavior: http://marcanton.io/other/stackoverflow/nestedtableviews.zip
EDIT:
I submitted this issue to Apple Developer Technical Support, here's their reply:
Thank you for writing to Apple Worldwide Developer Technical Support. I am responding to your inquiry concerning touch events in embedded UITableViews.
Typically this is an approach that is not recommended. The issue is that UITableView inherits from UIScrollView, and as stated in the documentation for UIScrollView:
"Important: You should not embed UIWebView or UITableView objects in UIScrollView objects. If you do so, unexpected behavior can result because touch events for the two objects can be mixed up and wrongly handled."
http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIScrollView_Class/Reference/UIScrollView.html%23//apple_ref/occ/cl/UIScrollView
So at this time, there is not a workaround for getting both to scroll at the same time. I recommend that you file an enhancement request at http://developer.apple.com/bugreporter/ detailing what you would like to see us add in a future release.
Still, I think that there has to be a way to enable this functionality, even though I understand that it is not recommended. In fact, Apple does not even recommend hosting UITableViews inside another UITableView, but apart from the issue described above, it works quite beautifully.
I'll keep this question updated with our collective findings.
EDIT: There actually is a way, detailed here: http://marcanton.io/blog/nested-orthogonal-tableviews/
This would have to be a custom mirroring of intercepted touch events. Touch events follow the responder chain model, which means that if an object in the responder chain (the topmost, outermost view) cannot handle the event or action, it resends the message to the next responder (in this case the background UITableView in the chain). This is why you are seeing the horizontal events go to the horizontal UITableView and the vertical events go to the vertical UITableView. A diagonal touch has both horizontal and vertical components, so the topmost view (the outer vertical UITableView) responds to the vertical touches and swallows the event.
If you think about it, almost all vertical touches have a little bit of horizontal movement (think about when you flick your finger), so there is likely some work done in the background to determine how to interpret the touch event (either as vertical or horizontal).
I found this thread on passing events down to the next object in the responder chain. You might want to give it a try as a partial solution to your puzzle. The rest is figuring out how to capture the horizontal touch events and pass them along to the next responder.
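Roughly, the forwarding pattern from that thread looks like this in a UIView (or UITableView) subclass; this is only a sketch, and whether the outer table view will actually scroll from forwarded touches is exactly the open question:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Handle the touch locally...
    [super touchesMoved:touches withEvent:event];
    // ...then pass it along to the next responder so the enclosing view gets a chance at it.
    [[self nextResponder] touchesMoved:touches withEvent:event];
}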
Interesting, I haven't played around with this kind of setup yet, but I would try to intercept touch events on the nested UITableViews and delegate any vertical movement to the outer UITableView - and vice-versa.
I'm about to start a new iPhone app that requires a certain functionality but I'm not sure if it's doable. I'm willing to research but first I just wanted to know if I should at least consider it or not.
I haven't seen this in an app before (which is my main concern, even though I haven't seen too many apps since I don't own an iPhone), but an example would be the iPhone home screen panels: you can hold down on an app icon and then drag it to another panel, swiping between panels while still dragging it. But this is the core OS; is it possible to reproduce something similar within a normal app?
I only need to be sure it can be done before I start digging, I don't need code examples or anything, but if you have some exact resources that you consider helpful, that would be appreciated.
Thanks!
Yes. If you have your custom UIView subclass instance inside a UIScrollView, your view controller just needs to set the UIScrollView to delay content touches and not allow it to cancel touch events.
[scrollView setCanCancelContentTouches:NO];
[scrollView setDelaysContentTouches:YES];
When the user taps and holds in the custom view, the event goes to that custom view, which can process the touch events to drag an item around, but if the user quickly swipes, it scrolls the view.
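Inside the custom view, the drag itself can then be handled with the plain touch callbacks, along these lines (a sketch only):

// In the custom UIView subclass: once the scroll view delivers the touch,
// follow the finger by moving the view's center.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    self.center = [touch locationInView:self.superview];
}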
The "panel" view that you're referring to appears to be a UIPageControl view — although, perhaps, the specific incarnation of this view that Apple uses for the iPhone's home page may be customized.
Instances of generic UIView views that you might touch-and-drag will receive touch events. By overriding methods in the view, these events can be processed and passed to the page control, in order to tell it to "sweep" between pages.
If I wanted to do what you're asking about, that's how I might approach it. It seems doable to me, in any case.
Start with this: Swip from one view to the next view
Try using a UIButton that tracks the time since the state of the button changed to "highlighted". You may need to do this in order to track the dragging and move the button around:
Observing pinch multi-touch gestures in a UITableView
Check to see whether the button starts overlapping one side of the screen while being dragged. If a certain amount of time elapses after the button first starts overlapping the edge, manipulate the UIScrollView so that it switches to the next page on the corresponding side of the screen.
You may need to use NSTimer to keep track of how long the button is held down, etc.
In any case there's no reason why this couldn't work.
If UIButton doesn't work, then perhaps try a custom subclass of UIControl (which tracks the same touch-down actions, etc.). If that doesn't work, then use the window event intercept approach to track everything.
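If you do go the UIControl route, the tracking methods to override would be roughly these (a sketch; the edge detection and page switching are left as comments):

- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
{
    return YES; // keep receiving continueTracking... calls while the finger is down
}

- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
{
    // Follow the finger.
    self.center = [touch locationInView:self.superview];
    // Here you would check whether the control overlaps a screen edge and,
    // after enough time has elapsed (e.g. via NSTimer), tell the scroll view
    // to move to the adjacent page.
    return YES;
}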