Objective-C – Options for handling touch events in Cocoa-Touch – iPhone

So I believe there are numerous options for handling touch events when programming for the iDevices. A few of the options that I have come across are UITapGestureRecognizer and of course using UIButtons. Are there any more options? Which options are most suitable to use? Perhaps someone has a link to a guide or a tutorial that sums this up?
Cheers,
Peter

1) Target-action: UIControl (UIButton is a subclass of that) provides some built-in touch handling by adding a target and action for certain types of touch events. Example:
[myButton addTarget:self
             action:@selector(buttonTapped:)
   forControlEvents:UIControlEventTouchUpInside];
2) Overriding the UIView methods touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent: and touchesCancelled:withEvent: – Provides very fine-grained control but can be difficult to use for complex multitouch handling.
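As a rough sketch of what that looks like (the class name is mine), assuming a plain UIView subclass:

#import <UIKit/UIKit.h>

@interface TouchTrackingView : UIView
@end

@implementation TouchTrackingView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Location of the first touch in this view's coordinate system.
    CGPoint point = [[touches anyObject] locationInView:self];
    NSLog(@"Touch began at (%f, %f)", point.x, point.y);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    NSLog(@"Touch moved to (%f, %f)", point.x, point.y);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Touch ended, tap count %lu", (unsigned long)[[touches anyObject] tapCount]);
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    // The system interrupted the touch (e.g. an incoming call); clean up here.
}

@end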
3) Gesture recognizers (since iOS 3.2): Recognition of multitouch (or single touch) gestures that are usually comprised of multiple touch events. The built-in gesture recognizers provide support for recognizing taps, pinches, rotation gestures, swipes, panning and long presses. It's also possible to create custom gesture recognizers for more complex gestures.
All the gesture recognizer subclasses are customizable to a certain degree, e.g. you can specify a minimum number of taps and touches for a UITapGestureRecognizer.
Generally, gesture recognizers can both provide discrete events (like a tap) and continuous events (like a rotation that changes its angle over time).
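For instance, a minimal sketch of a configured tap recognizer (the selector name is mine):

UITapGestureRecognizer *doubleTap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleDoubleTap:)];
doubleTap.numberOfTapsRequired = 2;     // require a double tap
doubleTap.numberOfTouchesRequired = 2;  // with two fingers
[self.view addGestureRecognizer:doubleTap];

- (void)handleDoubleTap:(UITapGestureRecognizer *)recognizer {
    CGPoint point = [recognizer locationInView:recognizer.view];
    NSLog(@"Two-finger double tap at (%f, %f)", point.x, point.y);
}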

The best resource by far is the WWDC 2011 Video on Multi-Touch (requires a developer account):
http://developer.apple.com/itunes/?destination=adc.apple.com.8270634034.08270634040.8367260921?i=1527940296
This goes over using both gesture recognizers as well as custom touch handling.

I'd use the UIGestureRecognizers for specific gestures (pan, pinch, etc.) or the touch handling methods inherited from the UIResponder class (in a UIViewController, for instance) – see the pan sketch after the method list below:
– touchesBegan:withEvent:
– touchesMoved:withEvent:
– touchesEnded:withEvent:
– touchesCancelled:withEvent:
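For the gesture recognizer route, a rough sketch of a continuous pan handler (the names are mine):

UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handlePan:)];
[self.view addGestureRecognizer:pan];

- (void)handlePan:(UIPanGestureRecognizer *)recognizer {
    // Continuous recognizers fire repeatedly; the state tells you where you are.
    if (recognizer.state == UIGestureRecognizerStateChanged) {
        CGPoint translation = [recognizer translationInView:recognizer.view];
        NSLog(@"Panned by (%f, %f)", translation.x, translation.y);
    } else if (recognizer.state == UIGestureRecognizerStateEnded) {
        NSLog(@"Pan finished");
    }
}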
There are a LOT of resources on handling touches in Cocoa-touch. Just google it or even search here on this site for specifics.

Related

Get CGPoint when tap gesture event happens on Apple Watch

I want to add touch points and gestures in my Apple Watch app if possible.
I read that Apple Watch doesn't provide tap or swipe gesture recognition.
Is there any way to support tap gesture events, and determine CGPoint on the Apple Watch?
watchOS 3 adds gesture support for third-party apps, including taps and swipes.
Your gesture recognizer action can monitor continuous touch tracking through the locationInObject() method:
Returns the point computed as the current position of the touch event.
You can find more information in the WKGestureRecognizer documentation, as well as the WatchKit Catalog sample code.
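As a rough sketch (the recognizer is wired up in the storyboard, and the controller and action names are mine):

#import <WatchKit/WatchKit.h>

@interface TapInterfaceController : WKInterfaceController
@end

@implementation TapInterfaceController

// Connected in the storyboard to a WKTapGestureRecognizer attached to a group.
- (IBAction)handleTap:(WKTapGestureRecognizer *)sender {
    // locationInObject is the touch point in the coordinate space of the
    // interface object the recognizer is attached to.
    CGPoint point = [sender locationInObject];
    NSLog(@"Tap at (%f, %f)", point.x, point.y);
}

@end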

Using UITapGestureRecognizer rather than manually calling tapCount

I've been checking for multiple taps, whether it is 2 or 10, by simply calling tapCount on any touch:
[[touches anyObject] tapCount]==2
This simply checks for a double tap.
It works fine. I'm wondering if there is any particular reason to instead start using UITapGestureRecognizer.
It would seem that the UITapGestureRecognizer API provides wrappers around the same functionality as just inspecting touches directly, as above. Things like tapCount and the number of fingers on the screen don't require UITapGestureRecognizer.
For things like swipes, I can see the simplicity in letting UIKit handle recognizing those, as they are harder to code manually, but for a tapCount? Where's the real gain here? What am I missing?
Gesture recognizers provide for coordination in processing multiple gesture types on the same view. See the discussion of the state machine in the documentation.
If a tap is the only gesture of interest, you may not find much value, but the architecture comes in handy if you want to coordinate the recognition of taps with other gestures provided either by you or by system-supplied classes such as scroll views. Gesture recognizers get first crack at touches, so you will need to use this architecture if you want, for example, to recognize touches in a child of a scroll view before the scroll view processes them.
The gesture recognizers can also be set to defer recognition, so, for example, the action for a single tap is not called until a double tap has timed out.
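A small sketch of that deferral, assuming single- and double-tap recognizers on the same view (selector names are mine):

UITapGestureRecognizer *singleTap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleSingleTap:)];
UITapGestureRecognizer *doubleTap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleDoubleTap:)];
doubleTap.numberOfTapsRequired = 2;

// The single-tap action only fires once the double-tap recognizer has failed,
// i.e. the second tap never arrived in time.
[singleTap requireGestureRecognizerToFail:doubleTap];

[self.view addGestureRecognizer:singleTap];
[self.view addGestureRecognizer:doubleTap];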
In general, the gesture recognizer approach is a good one to adopt because it allows gestures to be managed in a consistent fashion across apps and code sources. If Apple wanted to add an assistive technology preference that allowed the user to select a longer interval over which a double tap would be recognized, they could do this without requiring any changes to the code of developers using standard gesture recognizers.
I should add that gesture recognizers can be added directly to your storyboard or nib, so in most cases you only need to code the target action, which could be a time saver in new code.
UITapGestureRecognizer provides a cleaner, easier to use API, but no new functionality. So for your case, no reason.

Does UISlider (and other UI elements) have inherent gesture recognizers

I was just curious if a UISlider has an inherent pan gesture recognizer in it, or if UI elements are separate from gesture recognizers. My guess is separate.
Basically I have another method developed by one of our consultants who is no longer with us that takes in a gesture recognizer of a scrollView to get its locationInView property.
Gesture recognizers have existed for some time, although they were only made public in iOS 3.2. The UIScrollView gesture recognizers that are actually used by the scroll view were only made accessible through the public SDK in iOS 5.0, although you could get hold of them in previous iOS versions through some code trickery.
It's possible that a UISlider uses gesture recognizers under the hood, or it may just use the touchesBegan, touchesEnded, etc methods. Either way, it's not something that is exposed to you. It would in theory be possible to find this out through experimentation, but if you start using functionality that isn't exposed in the public SDK you run the risk that it will break in a newer iOS version (this has happened in the past).
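For the scroll view case mentioned above, the supported route since iOS 5 is the scroll view's public panGestureRecognizer property; a rough sketch (the action name is mine):

// The pan recognizer the scroll view itself uses, public since iOS 5.
UIPanGestureRecognizer *scrollPan = scrollView.panGestureRecognizer;
[scrollPan addTarget:self action:@selector(scrollViewPanned:)];

- (void)scrollViewPanned:(UIPanGestureRecognizer *)recognizer {
    CGPoint location = [recognizer locationInView:recognizer.view];
    NSLog(@"Scroll view pan at (%f, %f)", location.x, location.y);
}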

Confining a swipe gesture to a certain area (iPhone)

I have just managed to implement detection of a swipe gesture for my app. However I would like to confine the area where the gesture is valid. Thinking about this I came up with a possible solution which would be to check whether the start & finish coordinates are within some area. I was just wondering if there's a better or preferred method of doing something like this.
Simply create an invisible UIView (i.e. with a transparent background) and set its frame so it encloses the region in which you want to detect the gesture.
Then, simply add a UISwipeGestureRecognizer to that view, and you are done.
Read the generic UIGestureRecognizer Class Reference and the part of the Event Handling Guide for iOS that talks about UIGestureRecognizers for more info.
Of course you could also manage the detection of the swipe gesture yourself using custom code, as explained in the very same guide, but why bother when UIGestureRecognizers can manage everything for you?
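A minimal sketch of the invisible-view approach (the frame and selector name are placeholders):

// Transparent view that marks out the region where the swipe should count.
UIView *swipeArea = [[UIView alloc] initWithFrame:CGRectMake(0, 100, 320, 200)];
swipeArea.backgroundColor = [UIColor clearColor];
[self.view addSubview:swipeArea];

UISwipeGestureRecognizer *swipe =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(handleSwipe:)];
swipe.direction = UISwipeGestureRecognizerDirectionRight;
[swipeArea addGestureRecognizer:swipe];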

Disable gesture recognizer iOS

I'm developing a gesture-recognition application for iPad and I want to disable the default iOS gesture recognizers. To be clear, I want to disable them from within my own application, so I need a way to do it using the API rather than through the Settings app.
I don't completely understand your question.
You can add UIGestureRecognizer objects to views, and you can remove them again with:
- (void)removeGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
For example:
[imageView addGestureRecognizer:singleTap];
[imageView removeGestureRecognizer:singleTap];
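Fleshing that out a little (the image view and selector are placeholders); UIGestureRecognizer also has an enabled property if you only want to switch a recognizer off temporarily:

UITapGestureRecognizer *singleTap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(imageTapped:)];
imageView.userInteractionEnabled = YES;   // UIImageView ignores touches by default
[imageView addGestureRecognizer:singleTap];

// Later, either remove the recognizer entirely...
[imageView removeGestureRecognizer:singleTap];

// ...or keep it attached but switched off.
singleTap.enabled = NO;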
The four- and five-finger gestures are not officially part of iOS, and may never be.
Though it would be best to figure out an alternative, you should be able to use these gestures for now and not fear conflicts (save on the iPads of developers who have specifically turned on this feature, whose users know that said features may conflict with apps).
One alternative is to change your design to avoid four- and five-finger swipes. From what I know, the four- and five-finger gesture setting exists so end users can return to the home screen or open the multitasking bar, and you can't do anything about it until Apple releases it to developers; right now it's still at the experimentation stage for end users.
You can use a UITapGestureRecognizer and set the number of touches in the Attributes Inspector if you want multiple touches. Doesn't this do what you want?
When you go to Gestures in the AssistiveTouch menu, you disable one by swiping to the right on it, like you would to delete a song or a note.