(iPhone) How to handle touches on a UITextView?

I'm trying to handle touches on an iPhone UITextView. I successfully managed to handle taps and other touch events by subclassing UIImageView, for example, and implementing the touchesBegan method... however, that apparently doesn't work with UITextView :(
The UITextView has user interaction and multi-touch enabled, just to be sure... but no joy. Has anyone managed to handle this?

UITextView (a subclass of UIScrollView) includes a lot of event processing. It handles copy and paste and data detectors. That said, it is probably a bug that it does not pass unhandled events on.
There is a simple solution: you can subclass UITextView and implement your own touchesBegan, touchesEnded (and the other event handling methods). Inside every touch handling method you should call the corresponding super implementation, e.g. [super touchesBegan:touches withEvent:event];
#import "MyTextView.h" //MyTextView:UITextView
#implementation MyTextView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
NSLog(#"touchesBegan");
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
[super touchesBegan:touches withEvent:event];
NSLog(#"touchesMoved");
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
NSLog(#"****touchesEnded");
[self.nextResponder touchesEnded: touches withEvent:event];
NSLog(#"****touchesEnded");
[super touchesEnded:touches withEvent:event];
NSLog(#"****touchesEnded");
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event{
[super touches... etc];
NSLog(#"touchesCancelled");
}

If you want to handle single/double/triple taps on a UITextView, you can add gesture recognizers to your text view.
Here's sample code (in viewDidLoad):
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleSingleTap)];
// modify this number to change the number of taps required
[singleTap setNumberOfTapsRequired:1];
[self.textView addGestureRecognizer:singleTap];
[singleTap release];
and
- (void)handleSingleTap {
    // handle tap in here
    NSLog(@"Single tap on view");
}
Hope this helps :D

Better solution (without swizzling anything or using any private API :D)
As explained below, adding new UITapGestureRecognizers to the text view does not have the expected result: the handler methods are never called. That is because the UITextView already has tap gesture recognizers set up, and I believe their delegate prevents my own recognizer from working properly; changing their delegate could lead to even worse results.
Luckily, the UITextView already has the gesture recognizer I want set up; the problem is that it changes according to the state of the view (i.e. the set of gesture recognizers is different when inputting Japanese than when inputting English, and also when not in editing mode).
I solved this by overriding these in a subclass of UITextView:
- (void)addGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
    [super addGestureRecognizer:gestureRecognizer];
    // Check whether the new gesture recognizer is the same kind as the one we want to intercept
    // Note:
    // This works because `UITextTapRecognizer` is a subclass of `UITapGestureRecognizer`
    // and the text view has some `UITextTapRecognizer` added :)
    if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
        UITapGestureRecognizer *tgr = (UITapGestureRecognizer *)gestureRecognizer;
        if ([tgr numberOfTapsRequired] == 1 &&
            [tgr numberOfTouchesRequired] == 1) {
            // If found, add self to its targets/actions
            [tgr addTarget:self action:@selector(_handleOneFingerTap:)];
        }
    }
}

- (void)removeGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
    // Check whether the gesture recognizer is the same kind as the one we want to intercept
    // Read the note above
    if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
        UITapGestureRecognizer *tgr = (UITapGestureRecognizer *)gestureRecognizer;
        if ([tgr numberOfTapsRequired] == 1 &&
            [tgr numberOfTouchesRequired] == 1) {
            // If found, remove self from its targets/actions
            [tgr removeTarget:self action:@selector(_handleOneFingerTap:)];
        }
    }
    [super removeGestureRecognizer:gestureRecognizer];
}

- (void)_handleOneFingerTap:(UITapGestureRecognizer *)tgr
{
    NSDictionary *userInfo = [NSDictionary dictionaryWithObject:tgr forKey:@"UITapGestureRecognizer"];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"TextViewOneFingerTapNotification" object:self userInfo:userInfo];
    // Or I could have handled the action here directly ...
}
This way, no matter when the text view changes its gesture recognizers, we will always catch the tap gesture recognizer we want, and hence our handler method will be called accordingly :)
Conclusion:
If you want to add a gesture recognizer to the UITextView, first check whether the text view already has an equivalent one.
If it does not, just do it the regular way (create your gesture recognizer, set it up, and add it to the text view) and you are done!
If it does, then you probably need to do something similar to the above.
Old Answer
I came up with this answer by swizzling a private method, because the previous answers have drawbacks and don't work as expected. Here, rather than modifying the tapping behavior of the UITextView, I just intercept the called method and then call the original method.
Further Explanation
UITextView has a bunch of specialized UIGestureRecognizers. Each of these has a target and an action, but the target is not the UITextView itself; it's an object of the forward-declared class UITextInteractionAssistant. (This assistant is a @package ivar of UITextView, but its forward declaration is in the public header UITextField.h.)
UITextTapRecognizer recognizes taps and calls oneFingerTap: on the UITextInteractionAssistant, so we want to intercept that call :)
#import <objc/runtime.h>

// Prototype and declaration of method that is going be swizzled
// When called: self and sender are supposed to be UITextInteractionAssistant and UITextTapRecognizer objects respectively
void proxy_oneFingerTap(id self, SEL _cmd, id sender);
void proxy_oneFingerTap(id self, SEL _cmd, id sender) {
    [[NSNotificationCenter defaultCenter] postNotificationName:@"TextViewOneFinderTap" object:self userInfo:nil];
    if ([self respondsToSelector:@selector(proxy_oneFingerTap:)]) {
        [self performSelector:@selector(proxy_oneFingerTap:) withObject:sender];
    }
}
...
// subclass of UITextView
// Add the above method and swizzle it in.
- (void)doTrickForCatchingTaps
{
    Class class = [UITextInteractionAssistant class]; // or use the line below to avoid ugly warnings
    //Class class = NSClassFromString(@"UITextInteractionAssistant");
    SEL new_selector = @selector(proxy_oneFingerTap:);
    SEL orig_selector = @selector(oneFingerTap:);
    // Add the method dynamically because UITextInteractionAssistant is a private class
    BOOL success = class_addMethod(class, new_selector, (IMP)proxy_oneFingerTap, "v@:@");
    if (success) {
        Method originalMethod = class_getInstanceMethod(class, orig_selector);
        Method newMethod = class_getInstanceMethod(class, new_selector);
        if ((originalMethod != NULL) && (newMethod != NULL)) {
            method_exchangeImplementations(originalMethod, newMethod); // Method swizzle
        }
    }
}
// ... And in the UIViewController, let's say:
[textView doTrickForCatchingTaps];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(textViewWasTapped:) name:@"TextViewOneFinderTap" object:nil];

- (void)textViewWasTapped:(NSNotification *)notification {
    NSLog(@"%@", NSStringFromSelector(_cmd));
}

You need to assign the UITextView instance's delegate to your controller, e.g. textView.delegate = self (assuming you want to handle the events in the same controller).
And make sure to declare the UITextViewDelegate protocol in the interface... ex:
@interface myController : UIViewController <UITextViewDelegate> {
}
Then you can implement any of the following
- (BOOL)textViewShouldBeginEditing:(UITextView *)textView;
- (BOOL)textViewShouldEndEditing:(UITextView *)textView;
- (void)textViewDidBeginEditing:(UITextView *)textView;
- (void)textViewDidEndEditing:(UITextView *)textView;
- (BOOL)textView:(UITextView *)textView shouldChangeTextInRange:(NSRange)range replacementText:(NSString *)text;
- (void)textViewDidChange:(UITextView *)textView;
- (void)textViewDidChangeSelection:(UITextView *)textView;

I'm using a textview as a subview of a larger view. I need the user to be able to scroll the textview, but not edit it. I want to detect a single tap on the textview's superview, including on the textview itself.
Of course, I ran into the problem that the textview swallows up the touches that begin on it. Disabling user interaction would fix this, but then the user won't be able to scroll the textview.
My solution was to make the text view editable and use the text view's textViewShouldBeginEditing: delegate method to detect a tap in the text view. I simply return NO, thereby preventing editing, but now I know that the text view (and thus the superview) has been tapped. Between this method and the superview's touchesEnded method I have what I need.
I know that this won't work for people who want to get access to the actual touches, but if all you want to do is detect a tap, this approach works!
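A minimal sketch of that idea, assuming the controller is set as the text view's delegate; handleTapOnTextView is a hypothetical helper of your own:
// Assumes textView.delegate = self somewhere (e.g. in viewDidLoad or IB)
- (BOOL)textViewShouldBeginEditing:(UITextView *)textView
{
    [self handleTapOnTextView]; // hypothetical helper: react to the tap here
    return NO;                  // prevent editing, scrolling still works
}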

How about wrapping the text view in a UIScrollView ([scrollView addSubview:textView]), which still makes it possible to scroll the text view?
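A rough sketch of that suggestion, assuming textView is already created and sized (the frame here is a placeholder):
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds]; // placeholder frame
scrollView.contentSize = textView.frame.size; // let the scroll view handle the scrolling
[scrollView addSubview:textView];
[self.view addSubview:scrollView];
[scrollView release];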

You can also use a Touch Down event. Wire up this event through Interface Builder, then add code in your event handler:
- (IBAction)onAppIDTap:(id)sender {
    // Your code
}

Related

Passing through touches to UIViews underneath

I have a UIView with 4 buttons on it and another UIView on top of the buttons view. The topmost view contains a UIImageView with a UITapGestureRecognizer on it.
The behaviour I am trying to create is that when the user taps the UIImageView, it toggles between being small in the bottom right-hand corner of the screen and animating to become larger. When it is large, I want the buttons on the bottom view to be disabled; when it is small and in the bottom right-hand corner, I want the touches to be passed through to the buttons so they work as normal. I am almost there, but I cannot get the touches to pass through to the buttons unless I disable userInteractionEnabled on the top view.
I have this in my initWithFrame: of the top view:
// Add a gesture recognizer to the image view
UITapGestureRecognizer *tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageTapped:)];
tapGestureRecognizer.cancelsTouchesInView = NO;
[imageView addGestureRecognizer:tapGestureRecognizer];
[tapGestureRecognizer release];
and this is my imageTapped: method:
- (void)imageTapped:(UITapGestureRecognizer *)gestureRecognizer {
    // Toggle between expanding and contracting the image
    if (expanded) {
        [self contractAnimated:YES];
        expanded = NO;
        gestureRecognizer.cancelsTouchesInView = NO;
        self.userInteractionEnabled = NO;
        self.exclusiveTouch = NO;
    }
    else {
        [self expandAnimated:YES];
        expanded = YES;
        gestureRecognizer.cancelsTouchesInView = NO;
        self.userInteractionEnabled = YES;
        self.exclusiveTouch = YES;
    }
}
With the above code, when the image is large the buttons are inactive, and when I touch the image it shrinks and the buttons become active. However, the small image doesn't receive the touches and therefore won't expand.
If I set self.userInteractionEnabled = YES in both cases, then the image expands and contracts when touched, but the buttons never receive touches and act as though they are disabled.
Is there a way to get the image to expand and contract when touched, but for the buttons underneath to only receive touches when the image is in its contracted state? Am I doing something stupid here and missing something obvious?
I am going absolutely mad trying to get this to work so any help would be appreciated,
Dave
UPDATE:
For further testing I overrode the touchesBegan: and touchesCancelled: methods and called their super implementations on my view containing the UIImageView. With the code above, the touchesCancelled: is never called and the touchesBegan: is always called.
So it would appear that the view is getting the touches; they are just not passed to the view underneath.
UPDATE
Is this because of the way the responder chain works? My view hierarchy looks like this:
VC - View1
       - View2
           - imageView1 (has tapGestureRecogniser)
           - imageView2
       - View3
           - button1
           - button2
I think the OS first does a hit test and decides that View2 is in front, so it should get all the touches, and these are never passed on to View3 unless userInteractionEnabled is set to NO for View2, in which case imageView1 is also prevented from receiving touches. Is this how it works, and is there a way for View2 to pass its touches through to View3?
The UIGestureRecognizer is a red herring I think. In the end to solve this I overrode the pointInside:withEvent: method of my UIView:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    BOOL pointInside = NO;
    if (CGRectContainsPoint(imageView.frame, point) || expanded) pointInside = YES;
    return pointInside;
}
This causes the view to trap all touches if you touch either the imageView or if its expanded flag is set. If it is not expanded then only trap the touches if they are on the imageView.
By returning NO, the top level VC's View queries the rest of its view hierarchy looking for a hit.
Select your View in the Storyboard or XIB and uncheck User Interaction Enabled in the Attributes inspector.
Or in Swift:
view.isUserInteractionEnabled = false
Look into the UIGestureRecognizerDelegate Protocol. Specifically, gestureRecognizer:shouldReceiveTouch:
You'll want to make each UIGestureRecognizer a property of your UIViewController,
// .h
@property (nonatomic, strong) UITapGestureRecognizer *lowerTap;

// .m
@synthesize lowerTap;

// When you are adding the gesture recognizer to the image view
self.lowerTap = tapGestureRecognizer;
Make sure you make your UIViewController a delegate,
[self.lowerTap setDelegate: self];
Then, you'd have something like this,
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    if (expanded && gestureRecognizer == self.lowerTap) {
        return NO;
    }
    else {
        return YES;
    }
}
Of course, this isn't exact code. But this is the general pattern you'd want to follow.
I have another solution. I have two overlapping views, let's call them CustomSubView, that should both receive the touches. So I have a view controller and a custom UIView class, let's call it ViewControllerView, that I set in Interface Builder; then I added the two views that should receive the touches to that view.
So I intercepted the touches in ViewControllerView by overriding hitTest:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    return self;
}
Then, in ViewControllerView, I overrode:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    for (UIView *subview in [self.subviews reverseObjectEnumerator])
    {
        if ([subview isKindOfClass:[CustomSubView class]])
        {
            [subview touchesBegan:touches withEvent:event];
        }
    }
}
Do the exact same with touchesMoved, touchesEnded and touchesCancelled.
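For completeness, a sketch of the touchesEnded: version under the same assumptions (touchesMoved and touchesCancelled follow the same pattern):
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    for (UIView *subview in [self.subviews reverseObjectEnumerator])
    {
        if ([subview isKindOfClass:[CustomSubView class]])
        {
            [subview touchesEnded:touches withEvent:event];
        }
    }
}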
@Magic Bullet Dave's solution, but in Swift
Swift 3
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    var pointInside = false
    if commentTextField.frame.contains(point) {
        pointInside = true
    } else {
        commentTextField.resignFirstResponder()
    }
    return pointInside
}
I use it in my CameraOverlayView for ImagePickerViewController.cameraOverlay to give the user the ability to comment while taking a new photo.

UIMapView: UIPinchGestureRecognizer not called

I implemented gesture recognizer in UIMapView just as described in the accepted answer to this question: How to intercept touches events on a MKMapView or UIWebView objects?
Single touches are recognized correctly. However, when I changed the superclass of my class from UIGestureRecognizer to UIPinchGestureRecognizer in order to recognize map scaling, everything stopped working.
Now the touchesEnded event occurs only when the user double-taps an annotation on the map (I don't know why!) and doesn't occur when the user pinches the map (zooming in or out doesn't matter).
P.S. I'm using iOS SDK 4.3 and testing my app in the simulator, if that matters.
The code of mapViewController.m - viewDidLoad method:
- (void)viewDidLoad
{
    [super viewDidLoad];
    MapGestureRecognizer *changeMapPositionRecognizer = [[MapGestureRecognizer alloc] init];
    changeMapPositionRecognizer.touchesEndedCallback = ^(NSSet *touches, UIEvent *event)
    {
        ...
    };
    [self.mapView addGestureRecognizer:changeMapPositionRecognizer];
    [changeMapPositionRecognizer release];
}
The code of MapGestureRecognizer.h:
#import <UIKit/UIKit.h>

typedef void (^TouchesEventBlock)(NSSet *touches, UIEvent *event);

@interface MapGestureRecognizer : UIPinchGestureRecognizer

@property (nonatomic, copy) TouchesEventBlock touchesEndedCallback;

@end
The code of MapGestureRecognizer.m:
#import "MapGestureRecognizer.h"
#implementation MapGestureRecognizer
#synthesize touchesEndedCallback = _touchesEndedCallback;
- (id)init
{
self = [super init];
if (self) {
self.cancelsTouchesInView = NO;
}
return self;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
if (self.touchesEndedCallback)
{
self.touchesEndedCallback(touches, event);
NSLog(#"Touches ended, callback done");
}
else
{
NSLog(#"Touches ended, callback skipped");
}
}
- (void) dealloc
{
[super dealloc];
}
#end
What should I correct to make the pinch gesture be recognized?
I'm not sure why you need to subclass UIPinchGestureRecognizer instead of using it directly as-is.
Also, I'm not sure why you need a gesture recognizer to detect map scaling, which you could do using the delegate methods regionWillChangeAnimated and regionDidChangeAnimated and comparing the span before and after, unless you are trying to detect the scaling as it is happening (and don't want to wait until the user finishes the gesture).
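A minimal sketch of that delegate-based alternative, assuming the controller is the map view's delegate and keeps a previousSpan ivar (hypothetical name) of type MKCoordinateSpan:
- (void)mapView:(MKMapView *)mapView regionWillChangeAnimated:(BOOL)animated
{
    previousSpan = mapView.region.span; // remember the span before the change
}
- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated
{
    if (mapView.region.span.latitudeDelta != previousSpan.latitudeDelta)
    {
        NSLog(@"Map zoom level changed");
    }
}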
The gesture recognizer may not be getting called because the map view's own pinch gesture recognizer is getting called instead.
To have your recognizer called as well as the map view's, implement the UIGestureRecognizer delegate method shouldRecognizeSimultaneouslyWithGestureRecognizer and return YES:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
Make sure the gesture recognizer's delegate property is set or that method won't get called either.
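Putting that together, a sketch that uses a plain UIPinchGestureRecognizer directly (no subclass), assuming the view controller adopts UIGestureRecognizerDelegate; handleMapPinch: is a hypothetical method name:
UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handleMapPinch:)];
pinch.delegate = self; // so shouldRecognizeSimultaneouslyWithGestureRecognizer: above gets called
[self.mapView addGestureRecognizer:pinch];
[pinch release];

- (void)handleMapPinch:(UIPinchGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateEnded)
    {
        NSLog(@"Pinch on map ended, scale: %f", recognizer.scale);
    }
}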

How to detect touch on UIWebView

On a UIWebView, how can I detect a touch, but not when the user taps a URL or a control?
Is it possible to handle this?
Use the UIGestureRecognizerDelegate method:
Add UIGestureRecognizerDelegate to the declaration file (i.e. your .h file).
Step 1: Set the delegate of the gesture recognizer (in the .m file, in viewDidLoad):
UITapGestureRecognizer *webViewTapped = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapAction:)];
webViewTapped.numberOfTapsRequired = 1;
webViewTapped.delegate = self;
[offScreenWebView addGestureRecognizer:webViewTapped];
[webViewTapped release];
Step 2: Override this function: (in .m file)
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
Step 3: Now implement the tapAction function:
- (void)tapAction:(UITapGestureRecognizer *)sender
{
    NSLog(@"touched");
    // Get the specific point that was touched
    CGPoint point = [sender locationInView:self.view];
}
The accepted answer is great if you only need to detect taps. If you need to detect all touches, the best way is to create a new UIView subclass and place it over the webview. In the subclass you can detect touches using hitTest:
TouchOverlay.h
@class TouchOverlay;

@protocol TouchOverlayDelegate <NSObject>
@optional
- (void)touchOverlayTouched:(TouchOverlay *)touchOverlay;
@end

@interface TouchOverlay : UIView

@property (nonatomic, unsafe_unretained) id <TouchOverlayDelegate> delegate;

@end
TouchOverlay.m
@implementation TouchOverlay

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    return self;
}

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self) {
        if (self.delegate && [self.delegate respondsToSelector:@selector(touchOverlayTouched:)]) {
            [self.delegate touchOverlayTouched:self];
        }
        return nil; // Tell the OS to keep looking for a responder
    }
    return hitView;
}

@end
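Usage might look something like this, assuming a view controller that adopts TouchOverlayDelegate and already has the web view on screen:
TouchOverlay *overlay = [[TouchOverlay alloc] initWithFrame:webView.frame];
overlay.delegate = self; // self implements TouchOverlayDelegate
[webView.superview addSubview:overlay]; // sits on top of the web view
[overlay release];

// ... and in the same view controller:
- (void)touchOverlayTouched:(TouchOverlay *)touchOverlay
{
    NSLog(@"Web view area was touched");
}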
Note that the accepted answer above will only capture tap gestures (touchDown and touchUp without a drag in between), and that swipe gestures will be ignored.
For my purposes I needed to be informed of both, and so I added swipe gesture recognizers appropriately. (Note that despite being a bit field, you can't OR together swipe gesture recognizers' direction property, so 4 gesture recognizers are required to detect any swipe).
// Note that despite being a bit field, you can't `OR` together swipe gesture
// recognizers' `direction` property, so 4 gesture recognizers are required
// to detect any swipe
for (NSNumber *swipeDirection in @[@(UISwipeGestureRecognizerDirectionUp), @(UISwipeGestureRecognizerDirectionDown), @(UISwipeGestureRecognizerDirectionLeft), @(UISwipeGestureRecognizerDirectionRight)]) {
    UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(timerReset:)];
    swipe.direction = [swipeDirection integerValue];
    swipe.delegate = self;
    [rootWebView addGestureRecognizer:swipe];
}
Everything that inherits from UIResponder can handle touches (so does UIWebView). Read the doc:
http://developer.apple.com/library/ios/#documentation/uikit/reference/UIResponder_Class/Reference/Reference.html
You'll have to use:
touchesBegan:withEvent:
Edit: adding the comment here for clarity.
I believe there's no clean way of doing it; you can either override the hitTest:withEvent: method (as shown elsewhere in this thread) or use a hack such as overlaying a UIView.
Do you mean you want to override the options that pop up when the user holds down on a link? I managed to get one to work with this tutorial/guide, but the one posted here is still slightly buggy and needs some fine tuning:
http://www.icab.de/blog/2010/07/11/customize-the-contextual-menu-of-uiwebview/

Hiding the keyboard when UITextField loses focus

I've seen some threads about how to dismiss the keyboard when a UITextField loses focus, but none of them worked for me and I don't know why. The touchesBegan:withEvent: in the following code never gets called. Why?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if ([self.textFieldOnFocus isFirstResponder] && [touch view] != self.textFieldOnFocus) {
        [textFieldOnFocus resignFirstResponder];
    }
    [super touchesBegan:touches withEvent:event];
}
P.S.: This code is in the view controller, which has a UITableView; the UITextField is in a cell of this table.
So my guess is that this method is not being called because the touch occurs on the UITableView in my view controller. I think I would have to subclass the UITableView to use this method, as I have seen in other threads, but there may be an easier way.
Could you please help me? Thanks a lot!
Make sure you set the delegate of the UITextField to First Responder in IB. And I just put a custom (invisible) UIButton over the screen and set up an IBAction to hide the keyboard. Ex:
- (IBAction)hideKeyboard {
[someTextField resignFirstResponder];
}
With that hooked up to a UIButton.
Here is my solution, somewhat inspired by several posts on SO: simply handle the tap gesture in the context of the view; the user is 'obviously' trying to leave the focus of the UITextField.
- (void)handleViewTapGesture:(UITapGestureRecognizer *)gesture
{
    [self endEditing:YES];
}
This is implemented in the ViewController. The handler is added as a gesture recognizer to the appropriate View in the View property's setter:
- (void)setLoginView:(LoginView *)loginView
{
    _loginView = loginView;
    UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self.loginView action:@selector(handleViewTapGesture:)];
    [tapRecognizer setDelegate:self]; // self implements the UIGestureRecognizerDelegate protocol
    [self.loginView addGestureRecognizer:tapRecognizer];
}
The handler could be defined in the View as well. If you are unfamiliar with handling gestures, see Apple's docs or the tons of samples elsewhere.
I should mention that you will need some additional code to make sure other controls still get taps: you need a delegate that implements the UIGestureRecognizerDelegate protocol and this method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    if ([touch.view isKindOfClass:[UIButton class]]) // Customize appropriately.
        return NO; // Don't let the custom gestureRecognizer handle the touch
    return YES;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UIView *view in self.view.subviews)
    {
        if ([view isKindOfClass:[UITextField class]])
            [view resignFirstResponder];
    }
}

Handling touches inside UIWebview

I have created a subclass of UIWebView and have implemented the touchesBegan, touchesMoved and touchesEnded methods, but the web view subclass is not handling the touch events.
Is there any way to handle the touch events inside the UIWebView subclass?
No subclassing needed, just add a UITapGestureRecognizer:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(didTapMethod)];
[tap setNumberOfTapsRequired:1]; // Set your own number here
[tap setDelegate:self]; // Add the <UIGestureRecognizerDelegate> protocol
[self.myWebView addGestureRecognizer:tap];
Add the <UIGestureRecognizerDelegate> protocol in the header file, and add this method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
If all you need is to handle gestures, while leaving the rest of the UIWebView functionality intact, you can subclass UIWebView and use this strategy:
in the init method of your UIWebView subclass, add a gesture recognizer, e.g.:
UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeGestureRightMethod)];
swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
[self addGestureRecognizer:swipeRight];
swipeRight.delegate = self;
then, add this method to your class:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
Add and implement your designated selector in the class, in this case handleSwipeGestureRightMethod, and you are good to go...
You could put a UIView over your UIWebView, override touchesBegan: etc., and then send them on to your web view. For example, the user touches your UIView, which triggers:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Execute your code, then send a touchesBegan to your webview like so:
    [webView touchesBegan:touches withEvent:event];
    return;
}
Your UIView has to be over the web view.
I'm not sure if this is what you want (it's not what you asked for, but it might work depending on what your end game is), but you could instead interpret the touches in JavaScript from inside the UIWebView, and get javascript to do
document.location='http://null/'+xCoord+'/'+yCoord; // Null is arbitrary.
Then you can catch that using the UIWebView's delegate method
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
And if the request.URL.host (or whatever it is) isEqualToString:@"null", take the relevant action (and return NO instead of YES). You can even add the JS to each page by doing something like:
- (void)webViewDidFinishLoad:(UIWebView *)webView {
    [webView stringByEvaluatingJavaScriptFromString:@"window.ontouchstart=function(/* ... */);"];
}
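A sketch of that delegate method, assuming the URL format produced by the JavaScript above (http://null/x/y):
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
{
    if ([request.URL.host isEqualToString:@"null"])
    {
        // path is @"/x/y", so components are @"", the x coordinate and the y coordinate
        NSArray *parts = [request.URL.path componentsSeparatedByString:@"/"];
        NSLog(@"Touched at %@, %@", [parts objectAtIndex:1], [parts objectAtIndex:2]);
        return NO; // don't actually load the fake URL
    }
    return YES;
}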
Hope this helps?
Handling gestures on a UIWebView is discussed in this Apple Developer forum thread.
Using the info given there, there will be no need for an extra view in most or all cases, and as mentioned here before, overriding UIWebView is not the way to go.
Copy-paste of the most important post in the thread:
This is a known issue. The UIWebView has its own UITapGestureRecognizers, and they're on a private subview of the UIWebView itself. UIGestureRecognizer precedence defines that gestures attached to views deeper in the view hierarchy will exclude ones on superviews, so the web view's tap gestures will always win over yours.
If it's okay in your case to allow your tap to happen along with the web view's normal tap your best solution would be to implement the UIGestureRecognizerDelegate method gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer and return YES for other tap gestures. This way you'll get your tap handler called, and the web view will still get its called.
If you need to be the only one handling the tap, you'll have to subclass UITapGestureRecognizer so you can use the one-way overrides in UIGestureRecognizerSubclass.h, and you can then return NO from canBePreventedByGestureRecognizer: when asked if the web view's tap gesture recognizer can prevent yours.
In any case, we know about this and hope to make it easier in the future.
I've just found that UIWebView checks whether it responds to the - (void)webViewDidNotClick:(id)webBrowserView selector once one taps on the view area (not on a hyperlink, or any other area that should be handled specifically). So you may implement that selector with your handling code :)
Do you mean your sub-classed implementation is not called when touchesBegan, touchesMoved and touchesEnded are called?
It sounds like a problem with how you've created an instance of the object. More details are required I think.
(taken from comments)
Header File
#import <UIKit/UIKit.h>

@interface MyWebView : UIWebView {
}
@end

Implementation File
#import "MyWebView.h"

@implementation MyWebView

- (id)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
    }
    return self;
}

- (void)drawRect:(CGRect)rect {
    NSLog(@"MyWebView is loaded");
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touches began");
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Touches ended");
}

- (void)dealloc {
    [super dealloc];
}

@end
I would try overriding -sendEvent: on UIWindow, to see if you can intercept those touch events.
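A minimal sketch of that idea, in a hypothetical UIWindow subclass that you would install as the app's window:
// MyWindow : UIWindow
- (void)sendEvent:(UIEvent *)event
{
    if (event.type == UIEventTypeTouches)
    {
        NSLog(@"Window saw %lu touch(es)", (unsigned long)[[event allTouches] count]);
    }
    [super sendEvent:event]; // always forward, or nothing else will receive events
}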
Following on from what Unfalkster said, you can use the hitTest method to achieve what you want, but you don't have to subclass UIWindow. Just put this in your web view subclass. You will get a compile time warning but it does work:
- (void)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        // get location info
        CGFloat x = point.x;
        CGFloat y = point.y;
        // get touches
        NSSet *touches = [event allTouches];
        // individual touches
        for (UITouch *touch in touches) {
            if (touch.phase == UITouchPhaseBegan) {
                // touches began
            } else if (touch.phase == UITouchPhaseMoved) {
                // touches moved
            }
            // etc etc
        }
    }
    // call the super
    [super hitTest:point withEvent:event];
}
Hope that helps!
If you want to detect your own taps but disable the UIWebView's taps then you can use my solution:
- (void)recursivelyDisableTapsOnView:(UIView *)v {
    for (UIView *view in v.subviews) {
        for (UIGestureRecognizer *g in view.gestureRecognizers) {
            if (g == self.ownTapRecognizer) {
                continue;
            }
            if ([g isKindOfClass:[UITapGestureRecognizer class]] ||
                [g isKindOfClass:[UILongPressGestureRecognizer class]] ||
                [g isKindOfClass:NSClassFromString(@"UITapAndAHalfRecognizer")]) {
                g.enabled = NO;
            }
        }
        [self recursivelyDisableTapsOnView:view];
    }
}

- (void)webViewDidFinishLoad:(UIWebView *)webView {
    [self recursivelyDisableTapsOnView:webView];
    // disable selection
    [webView stringByEvaluatingJavaScriptFromString:@"document.documentElement.style.webkitUserSelect='none';"];
    // Disable callout
    [webView stringByEvaluatingJavaScriptFromString:@"document.documentElement.style.webkitTouchCallout='none';"];
}