I know that you can use a UITapGestureRecognizer to detect any time the screen is tapped. But is there a way to detect any time the screen is simply touched? Long, short, whatever; the finger doesn't even have to come up again. Any time the screen is touched at all, I want something to happen.
Yes: implement the UIResponder method touchesBegan(_:with:) and its related methods. You then receive the raw touches (not completely raw, since they are still associated with the hit-test view) and can interpret or respond to them however you like. That's how you'd implement a drawing app, for example.
That in fact is what we used to have to do for all touches, before gesture recognizers were invented.
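For illustration, here is a minimal sketch of that approach (the class name TouchAwareView is made up); touchesBegan:withEvent: fires the instant a finger lands, long before any tap could be recognized:

#import <UIKit/UIKit.h>

@interface TouchAwareView : UIView
@end

@implementation TouchAwareView

// Called the moment a finger touches this view; the finger does not
// have to lift again for this to fire.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    NSLog(@"Touched at %@", NSStringFromCGPoint([[touches anyObject] locationInView:self]));
}

@end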
I've been checking for multiple taps, whether it is 2 or 10 by simply calling tapCount on any touch:
[[touches anyObject] tapCount] == 2
This simply checks for a double tap.
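For context, a sketch of where such a check typically lives, in a touch-handling override of the view (or view controller):

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([[touches anyObject] tapCount] == 2) {
        // Handle the double tap here.
    }
    [super touchesEnded:touches withEvent:event];
}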
It works fine. I'm wondering if there is any particular reason to instead start using UITapGestureRecognizer.
It would seem that the UITapGestureRecognizer API provides wrappers around the same functionality as just inspecting touches directly, as above. Things like tapCount and the number of fingers on the screen don't require UITapGestureRecognizer.
For things like swipes, I can see the simplicity in letting UIKit handle recognizing those, as they are harder to code manually. But for a tap count? Where's the real gain here? What am I missing?
Gesture recognizers provide for coordination in processing multiple gesture types on the same view. See the discussion of the state machine in the documentation.
If a tap is the only gesture of interest, you may not find much value, but the architecture comes in handy if you want to coordinate the recognition of taps with other gestures, provided either by you or by system-supplied classes such as scroll views. Gesture recognizers are given first crack at touches, so you will need to use this architecture if you want, for example, to recognize touches in a child of a scroll view before the scroll view processes them.
The gesture recognizers can also be set to defer recognition, so, for example, the action for a single tap is not called until a double tap has timed out.
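As a sketch of that deferral, set up in a view controller (the two action selectors are hypothetical names):

UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleSingleTap:)];
UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleDoubleTap:)];
doubleTap.numberOfTapsRequired = 2;

// The single-tap action fires only after the double-tap recognizer
// has failed (i.e., the second tap never arrived in time).
[singleTap requireGestureRecognizerToFail:doubleTap];

[self.view addGestureRecognizer:singleTap];
[self.view addGestureRecognizer:doubleTap];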
In general, the gesture recognizer approach is a good one to adopt because it allows gestures to be managed in a consistent fashion across apps and code sources. If Apple wanted to add an assistive-technology preference that allowed the user to select a longer interval over which a double tap would be recognized, they could do this without requiring any changes to the code of developers using the standard gesture recognizers.
I should add that gesture recognizers can be added directly to your storyboard or nib, so in most cases you only need to code the target action, which could be a time saver in new code.
UITapGestureRecognizer provides a cleaner, easier to use API, but no new functionality. So for your case, no reason.
I made a slide view using a UIView and detecting touches to move between pages. This slide view is almost like this one, except that I made it work like a UITableView.
Now I'm using it with UIViews that contain UIScrollViews. The problem is: how do I distribute touch events to either the scroll view or the slide view? I have the logic worked out; basically, the UIScrollViews scroll vertically and the slide view scrolls horizontally.
I tried hitTest: to keep touchesBegan, touchesMoved, and touchesEnded in the slide view. When a touch moves horizontally, I keep it in the slide view; when it moves vertically, I want to hand it off to the UIScrollView. But I cannot figure out how to forward the events to the UIScrollView.
Calling [scrollView touchesBegan:touches withEvent:event] doesn't work. I suppose UIScrollView handles touches in a different way.
If you can't find a clue pointing to your answer, you're probably approaching the problem the wrong way.
UIScrollView uses its own way of receiving touchesBegan, touchesMoved, and touchesEnded, one that I don't fully understand, but it means that if overriding touchesBegan makes the UIScrollView stop working, you won't get what you want. Using hitTest: in the scroll view's superview, you can get the touches before the UIScrollView does, but you can't change a touch's target while the touch is in progress.
Still, there is one way to solve this; actually, three ways.
1- Simulate touches
I didn't test this, for a reason you'll see below. Events come from UIWindow and are distributed to subviews by - (void)sendEvent:(UIEvent *)event. We don't know how a touch's target is stored, and changing that directly is completely out of the question. But we can use the idea of overriding the superview's hitTest: to figure out what the user is about to do, and then use a workaround to change the target: simulate a touch-ended event, so that the target should be reset, then simulate a new touch-began event, this time making sure hitTest: hands the touch to the scroll view.
You can find out how to simulate events here. The problem is that your app will probably be rejected for using private methods.
2- Make your own UIScrollView
This could be the best or the worst option, depending on what you want to do. I believe it's painful, and it isn't what you want to do right now.
3- Surrender to nested UIScrollViews
To make a slideshow of PDFs, comics, docs, and books, this is the best and most painless way. Put one UIScrollView inside another and let them reach an agreement about who scrolls. http://developer.apple.com/library/ios/#documentation/WindowsViews/Conceptual/UIScrollView_pg/NestedScrollViews/NestedScrollViews.html
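A rough sketch of that nested setup, assuming a paging, horizontal outer scroll view with one vertically scrolling inner scroll view per page (sizes and the page count are placeholders):

CGRect bounds = self.view.bounds;
NSUInteger pageCount = 3;

UIScrollView *outer = [[UIScrollView alloc] initWithFrame:bounds];
outer.pagingEnabled = YES; // snap horizontally, page by page
outer.contentSize = CGSizeMake(bounds.size.width * pageCount, bounds.size.height);

for (NSUInteger i = 0; i < pageCount; i++) {
    CGRect pageFrame = CGRectOffset(bounds, i * bounds.size.width, 0);
    UIScrollView *inner = [[UIScrollView alloc] initWithFrame:pageFrame];
    // Content taller than the page, so each page scrolls vertically.
    inner.contentSize = CGSizeMake(bounds.size.width, bounds.size.height * 2);
    [outer addSubview:inner];
}

[self.view addSubview:outer];

The two scroll views sort out between themselves which one a given drag belongs to, which is exactly the agreement mentioned above.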
I have a UIWebView which is embedded in a UIScrollView. The webView is resized so that the scroll view manages all the scrolling (I need control over the scrolling).
In the webView I have disabled userSelection via '-webkit-user-select: none;'
Everything is working fine except one annoying detail. When I hold my finger down on the content for about a second before starting to scroll, the scrollView won't scroll. My best guess is that it has something to do with userSelection. The delay is about the same as it usually takes for the copy/paste/magnifying glass to appear, which usually disables scrolling as well.
I am running out of ideas on how to solve this. Any help would be greatly appreciated!
Thanks!
EDIT: Another aspect of the problem is that the failure to scroll actually triggers JS event handlers (click, mousedown, mouseup) inside my webView, which leads to surprising app behavior. The user puts her finger down, waits, scrolls, nothing happens, removes her finger, and this is perceived as a click, which feels wrong from a user's perspective.
I would guess what is happening is that after that short duration, the scroll view is no longer interpreting the touch as being on its view and instead passes the touch down to its content views.
Have you tried delaying the content touches for the scroll view? This essentially tells the scroll view to delay taking action on the touch event and instead to briefly monitor the touch; if the touch moves, it recognizes it as a swipe gesture for scrolling. If it doesn't move, it will eventually pass the touch along to its subviews.
scrollView.delaysContentTouches = YES;
I think even then, there is a standard delay time before the scrollview will pass the touch events along the responder chain. If you hold for too long, it's going to naturally perceive it as being a press down event rather than a scroll event.
This question is not relevant anymore. As of iOS 5.0 the UIWebView is based on a real UIScrollView and also exposes that UIScrollView via a property. Use that instead.
And don't mess with UIWebViews embedded in UIScrollViews anymore. The documentation explicitly advises against that.
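For example, a minimal sketch of configuring the web view's own scroll view directly (webView is assumed to be an existing UIWebView):

UIScrollView *webScrollView = webView.scrollView; // available since iOS 5
webScrollView.delaysContentTouches = YES;
// Programmatic scrolling goes through the same scroll view:
[webScrollView setContentOffset:CGPointZero animated:YES];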
Relevant Documentation
I have a view which is sometimes covered by some other views. However, if the user slides a finger across the screen, I want to slide that underlying view across the screen, too.
I could start making custom views for all those covering subviews and forward all kinds of touch events, but that's somewhat cumbersome. Maybe there's some kind of notification or another way that a UIView or UIControl subclass can be aware of touch events happening right now, no matter where they are.
In short: I need a UIView or UIControl subclass which knows about any touch events happening on the entire screen. Or, if that's not possible, at least one that knows about any touch events happening on the views above itself in the same superview.
Another description: There are 20 views, all reside inside the same superview. The first view is covered by 19 others. But if the user slides across the screen, that first view must slide too, so it must be aware of touch events.
Is there any better solution than making all 19 views forward touch events? (Yes, all 19 views respond to touch events in this example.)
Perhaps the hitTest:withEvent: method of UIView will help; it's helped me achieve what you're trying to do in the past.
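To illustrate one way of using it, here is a hypothetical sketch of a container view that reroutes every touch to the covered view (ContainerView and touchTarget are made-up names; this assumes the covered view should receive the touches instead of whatever subview was actually hit):

@interface ContainerView : UIView
@property (nonatomic, strong) UIView *touchTarget; // the covered view
@end

@implementation ContainerView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hit = [super hitTest:point withEvent:event];
    // Redirect touches on any subview to the covered view, so it
    // receives touchesBegan/Moved/Ended no matter what lies on top.
    if (hit != nil && self.touchTarget != nil) {
        return self.touchTarget;
    }
    return hit;
}

@end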
I'm about to start a new iPhone app that requires a certain functionality but I'm not sure if it's doable. I'm willing to research but first I just wanted to know if I should at least consider it or not.
I haven't seen this in an app before (which is my main concern, even though I haven't seen too many apps, since I don't own an iPhone), but an example would be the iPhone's home screen panels: you can hold down on an app icon and then drag it to another panel, swiping across panels while still dragging it. But that's part of the core OS; is it possible to reproduce something similar within a normal app?
I only need to be sure it can be done before I start digging, I don't need code examples or anything, but if you have some exact resources that you consider helpful, that would be appreciated.
Thanks!
Yes. If you have your custom UIView subclass instance inside a UIScrollView, your view controller just needs to set the UIScrollView to delay content touches and not allow it to cancel touch events.
[scrollView setCanCancelContentTouches:NO];
[scrollView setDelaysContentTouches:YES];
When the user taps and holds in the custom view, the event goes to that custom view, which can process the touch events to drag an item around, but if the user quickly swipes, it scrolls the view.
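A minimal sketch of the dragging side (DraggableView is an illustrative name; it assumes the scroll view has been configured with the two settings above):

@interface DraggableView : UIView
@end

@implementation DraggableView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Follow the finger: the scroll view no longer cancels these
    // touches, so the view can be dragged around freely.
    self.center = [[touches anyObject] locationInView:self.superview];
}

@end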
The "panel" view that you're referring to appears to be a UIPageControl view — although, perhaps, the specific incarnation of this view that Apple uses for the iPhone's home page may be customized.
Instances of generic UIView views that you might touch-and-drag will receive touch events. By overriding methods in the view, these events can be processed and passed to the page control, in order to tell it to "sweep" between pages.
If I wanted to do what you're asking about, that's how I might approach it. It seems doable to me, in any case.
Start with this: Swipe from one view to the next view
Try using a UIButton that tracks the time since the state of the button changed to "highlighted". You may need to do this in order to track the dragging and move the button around:
Observing pinch multi-touch gestures in a UITableView
Check to see if the button starts overlapping one side of the screen while being dragged. If a certain amount of time elapses after the button first starts overlapping the edge, then manipulate the UIScrollView so that it switches to the next page on the corresponding side of the screen.
You may need to use NSTimer to keep track of how long the button is held down, etc.
In any case there's no reason why this couldn't work.
If UIButton doesn't work, then perhaps try a custom subclass of UIControl (which tracks the same touch-down actions, etc.). If that doesn't work, then intercept events at the window level to track everything.
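A rough sketch of the UIControl variant (DraggableControl is a made-up name; the NSTimer and page-switching logic are left out):

@interface DraggableControl : UIControl
@end

@implementation DraggableControl

- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    return YES; // keep receiving continueTrackingWithTouch:withEvent:
}

- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    // Follow the finger while dragging.
    self.center = [touch locationInView:self.superview];

    // Detect overlap with the right-hand screen edge; a real version
    // would start an NSTimer here and, when it fires, tell the
    // UIScrollView to switch to the next page.
    if (CGRectGetMaxX(self.frame) > CGRectGetMaxX(self.superview.bounds)) {
        NSLog(@"Overlapping the right edge");
    }
    return YES;
}

@end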