I implemented a gesture recognizer in MKMapView just as described in the accepted answer to this question: How to intercept touches events on a MKMapView or UIWebView objects?
Single touches are recognized correctly. However, when I changed the superclass of my class from UIGestureRecognizer to UIPinchGestureRecognizer in order to recognize map scaling, everything stopped working.
Now the touchesEnded event occurs only when the user double-taps an annotation on the map (I don't know why!) and doesn't occur when the user pinches the map (zooming in or out doesn't matter).
P.S. I'm using iOS SDK 4.3 and testing my app in the simulator, if that matters.
The code of mapViewController.m - viewDidLoad method:
- (void)viewDidLoad
{
[super viewDidLoad];
MapGestureRecognizer *changeMapPositionRecognizer = [[MapGestureRecognizer alloc] init];
changeMapPositionRecognizer.touchesEndedCallback = ^(NSSet * touches, UIEvent * event)
{
...
};
[self.mapView addGestureRecognizer:changeMapPositionRecognizer];
[changeMapPositionRecognizer release];
}
The code of MapGestureRecognizer.h:
#import <UIKit/UIKit.h>
typedef void (^TouchesEventBlock) (NSSet * touches, UIEvent * event);
@interface MapGestureRecognizer : UIPinchGestureRecognizer
@property(nonatomic, copy) TouchesEventBlock touchesEndedCallback;
@end
The code of MapGestureRecognizer.m:
#import "MapGestureRecognizer.h"
@implementation MapGestureRecognizer
@synthesize touchesEndedCallback = _touchesEndedCallback;
- (id)init
{
self = [super init];
if (self) {
self.cancelsTouchesInView = NO;
}
return self;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
if (self.touchesEndedCallback)
{
self.touchesEndedCallback(touches, event);
NSLog(#"Touches ended, callback done");
}
else
{
NSLog(#"Touches ended, callback skipped");
}
}
- (void) dealloc
{
[super dealloc];
}
@end
What should I correct to make the pinch gesture be recognized?
I'm not sure why you need to subclass UIPinchGestureRecognizer instead of using it directly as-is.
I'm also not sure why you need a gesture recognizer to detect map scaling, which you could do by using the delegate methods regionWillChangeAnimated: and regionDidChangeAnimated: and comparing the span before and after, unless you are trying to detect the scaling as it is happening (and don't want to wait until the user finishes the gesture).
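For reference, a minimal sketch of that delegate-based approach (the previousSpan property and the exact comparison are assumptions, not part of the original answer):
// Assumes self is the map view's delegate and declares
// @property (nonatomic, assign) MKCoordinateSpan previousSpan;
- (void)mapView:(MKMapView *)mapView regionWillChangeAnimated:(BOOL)animated
{
    self.previousSpan = mapView.region.span; // remember the span before the change
}

- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated
{
    MKCoordinateSpan newSpan = mapView.region.span;
    if (newSpan.latitudeDelta != self.previousSpan.latitudeDelta) {
        // The span changed, i.e. the map was scaled (zoomed in or out)
    }
}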
The gesture recognizer may not be getting called because the map view's own pinch gesture recognizer is getting called instead.
To have your recognizer called as well as the map view's, implement the UIGestureRecognizerDelegate method gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: and return YES:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
return YES;
}
Make sure the gesture recognizer's delegate property is set or that method won't get called either.
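For example, in the question's viewDidLoad, before the recognizer is released (a sketch assuming the view controller adopts UIGestureRecognizerDelegate and implements the method above):
changeMapPositionRecognizer.delegate = self; // self conforms to <UIGestureRecognizerDelegate>
[self.mapView addGestureRecognizer:changeMapPositionRecognizer];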
Related
I want to be able to recognise ALL touches in an interface, no matter what was touched.
I've tried:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
...but this just recognises when the user taps on something that doesn't respond to taps (UIImageViews, for instance).
The reason I need this ability is that I want to kick in a slide show if the user doesn't touch the screen for 5 minutes, so I want to reset the timer whenever they touch. It seems wrong to put this reset code in each UI event handler individually.
There are several possible solutions but, as @omz said, overriding sendEvent: is the best one.
@interface YourWindow : UIWindow {
NSDate *timeOfLastTouch;
}
@end
@implementation YourWindow
- (void)sendEvent:(UIEvent *)event {
[super sendEvent:event];
NSSet *touches = [event allTouches];
UITouch *touch = [touches anyObject];
if( touch.phase == UITouchPhaseEnded ){
timeOfLastTouch = [NSDate date];
}
}
@end
Do not forget to replace UIWindow with YourWindow, for example as sketched below.
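One way to do that, if you create the window in code in your app delegate (a sketch assuming ARC; YourRootViewController is a placeholder name):
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[YourWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.rootViewController = [[YourRootViewController alloc] init]; // placeholder root controller
    [self.window makeKeyAndVisible];
    return YES;
}
If the window comes from a nib or storyboard instead, change the window object's class to YourWindow in Interface Builder.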
You could subclass UIWindow and override the sendEvent: method.
You could use a tap gesture recognizer.
In your interface, add the UIGestureRecognizerDelegate protocol:
@interface ViewController : UIViewController <UIGestureRecognizerDelegate> {
then in your viewDidLoad add this
UITapGestureRecognizer *tapped = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapped)];
tapped.delegate=self;
tapped.numberOfTapsRequired = 1;
[self.view addGestureRecognizer:tapped];
then do your timer code in the tapped method
-(void)tapped {
//timer code
}
Make sure your UI elements have userInteractionEnabled set to YES.
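For the original five-minute slideshow requirement, the timer code might look something like this (the slideshowTimer property and startSlideshow method are assumed names, not part of the original answer):
- (void)tapped
{
    // Reset the inactivity timer on every tap
    [self.slideshowTimer invalidate];
    self.slideshowTimer = [NSTimer scheduledTimerWithTimeInterval:5 * 60
                                                           target:self
                                                         selector:@selector(startSlideshow)
                                                         userInfo:nil
                                                          repeats:NO];
}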
You can subclass the UIWindow and override the sendEvent: method like this:
- (void)sendEvent:(UIEvent *)event {
if (event.type == UIEventTypeTouches) {
// You got a touch, do whatever you like
}
[super sendEvent:event]; // Let the window do the propagation of the event
}
As @Alladinian said in one of the comments, the iOS reference documentation mentions that subclassing UIApplication is the right approach and thus seems preferable to subclassing UIWindow. See https://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIApplication_Class/Reference/Reference.html :
You might decide to subclass UIApplication to override sendEvent: or
sendAction:to:from:forEvent: to implement custom event and action
dispatching.
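A minimal sketch of that UIApplication approach (MyApplication and AppDelegate are placeholder class names):
// MyApplication.h / MyApplication.m
@interface MyApplication : UIApplication
@end

@implementation MyApplication
- (void)sendEvent:(UIEvent *)event
{
    if (event.type == UIEventTypeTouches) {
        // Reset your inactivity timer here
    }
    [super sendEvent:event]; // let the event propagate normally
}
@end

// main.m -- tell UIApplicationMain to use the subclass
int main(int argc, char *argv[])
{
    @autoreleasepool {
        return UIApplicationMain(argc, argv, @"MyApplication", @"AppDelegate");
    }
}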
In my application I have a lot of little UIImageViews and a view off to the side. What I need is for that view off to the side to be able to detect which image view was tapped and give me all the information about it (much like with (id)sender) as well as its coordinates. What is the best way to do this?
This is the internal class from my project. I think you get the idea.
As an option, you can create a protocol for the owner (or delegate).
You can obtain coords using
-[UITouch locationInView: someView]
Here is the code:
@interface _FaceSelectView : UIImageView {
@protected
FaceSelectVC* _owner;
}
-(id) initWithOwner:(FaceSelectVC*) owner;
@end
@implementation _FaceSelectView
-(id) initWithOwner:(FaceSelectVC*) owner {
if( self = [super init] ) {
_owner = owner;
self.userInteractionEnabled = YES;
}
return self;
}
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
[_owner _touchesBeganIn: self withTouch: (UITouch*)[touches anyObject]];
}
-(void) touchesEnded:(NSSet*) touches withEvent:(UIEvent*) event {
[_owner _touchesEndedIn: self withTouch: (UITouch*)[touches anyObject]];
}
-(void) touchesMoved:(NSSet*) touches withEvent:(UIEvent*) event {
[_owner _touchesMovedIn: self withTouch: (UITouch*)[touches anyObject]];
}
@end
I would subclass UIImageView and add a new property for the target view you would like to message. Then re-enable userInteractionEnabled and add an action for touch-up-inside.
In the action method of the custom subclass, call a method on the target view and pass it the custom subview itself. Basically, you use delegation and pass along, in the delegate call, all the parameters you need.
Add a tag to each UIImageView, like myUIImageView.tag = intNumber;
then inside touchesBegan (or any common method you call for all your UIImageViews) use the tag to identify which view was tapped, as in the sketch below.
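For example, in the view controller's touchesBegan: (a sketch; it assumes the image views have userInteractionEnabled set to YES and don't handle the touches themselves, so the events reach the controller):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    UIView *touchedView = touch.view; // the view the touch landed on
    CGPoint point = [touch locationInView:self.view]; // coordinates in the controller's view
    if ([touchedView isKindOfClass:[UIImageView class]]) {
        NSLog(@"Image view with tag %d tapped at %@", (int)touchedView.tag, NSStringFromCGPoint(point));
    }
}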
On a UIWebView, how can I detect a touch?
But not when the user taps a URL or touches a control.
Is it possible to handle this?
Use the UIGestureRecognizerDelegate method:
Add UIGestureRecognizerDelegate to your declaration file (i.e. your .h file).
Step 1: Set the delegate of the gesture recognizer (in the .m file's viewDidLoad):
UITapGestureRecognizer *webViewTapped = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapAction:)];
webViewTapped.numberOfTapsRequired = 1;
webViewTapped.delegate = self;
[offScreenWebView addGestureRecognizer:webViewTapped];
[webViewTapped release];
Step 2: Override this function: (in .m file)
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
return YES;
}
Step 3: Now implement the tapAction function:
- (void)tapAction:(UITapGestureRecognizer *)sender
{
NSLog(#"touched");
// Get the specific point that was touched
CGPoint point = [sender locationInView:self.view];
}
The accepted answer is great if you only need to detect taps. If you need to detect all touches, the best way is to create a new UIView subclass and place it over the webview. In the subclass you can detect touches using hitTest:
TouchOverlay.h
@class TouchOverlay;
@protocol TouchOverlayDelegate <NSObject>
@optional
- (void)touchOverlayTouched:(TouchOverlay *)touchOverlay;
@end
@interface TouchOverlay : UIView
@property (nonatomic, unsafe_unretained) id <TouchOverlayDelegate> delegate;
@end
TouchOverlay.m
@implementation TouchOverlay
- (id)initWithFrame:(CGRect)frame {
self = [super initWithFrame:frame];
return self;
}
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
UIView *hitView = [super hitTest:point withEvent:event];
if (hitView == self) {
if (self.delegate && [self.delegate respondsToSelector:#selector(touchOverlayTouched:)]) {
[self.delegate touchOverlayTouched:self];
}
return nil; // Tell the OS to keep looking for a responder
}
return hitView;
}
@end
Note that the accepted answer above will only capture tap gestures (touchDown and touchUp without a drag in between), and that swipe gestures will be ignored.
For my purposes I needed to be informed of both, and so I added swipe gesture recognizers appropriately. (Note that despite being a bit field, you can't OR together swipe gesture recognizers' direction property, so 4 gesture recognizers are required to detect any swipe).
// Note that despite being a bit field, you can't `OR` together swipe gesture
// recognizers' `direction` property, so 4 gesture recognizers are required
// to detect any swipe
for (NSNumber * swipeDirection in @[@(UISwipeGestureRecognizerDirectionUp), @(UISwipeGestureRecognizerDirectionDown), @(UISwipeGestureRecognizerDirectionLeft), @(UISwipeGestureRecognizerDirectionRight)]) {
UISwipeGestureRecognizer * swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(timerReset:)];
swipe.direction = [swipeDirection integerValue];
swipe.delegate = self;
[rootWebView addGestureRecognizer:swipe];
}
Everything that inherits from UIResponder can handle touches (and so does UIWebView). Read the docs:
http://developer.apple.com/library/ios/#documentation/uikit/reference/UIResponder_Class/Reference/Reference.html
You'll have to use:
touchesBegan:withEvent:
Edit: adding the comment here for clarity.
I believe there's no clean way of doing it then; you can either override the hitTest:withEvent: method like this, or do a hack like this: overriding UIView.
Do you mean you want to override the options that pop up when they hold down on a link? I managed to get one to work with this tutorial/guide, but the one posted here is still slightly buggy and needs some fine-tuning:
http://www.icab.de/blog/2010/07/11/customize-the-contextual-menu-of-uiwebview/
I have created a subclass of UIWebView and have implemented the
touchesBegan, touchesMoved and touchesEnded methods,
but the web view subclass is not handling the touch events.
Is there any way to handle the touch events inside a UIWebView subclass?
No subclassing needed; just add a UITapGestureRecognizer:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(didTapMethod)];
[tap setNumberOfTapsRequired:1]; // Set your own number here
[tap setDelegate:self]; // Add the <UIGestureRecognizerDelegate> protocol
[self.myWebView addGestureRecognizer:tap];
Add the <UIGestureRecognizerDelegate> protocol in the header file, and add this method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
return YES;
}
If all you need is to handle gestures, while leaving the rest of the UIWebView functionality intact, you can subclass UIWebView and use this strategy:
in the init method of your UIWebView subclass, add a gesture recognizer, e.g.:
UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeGestureRightMethod)];
swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
[self addGestureRecognizer:swipeRight];
swipeRight.delegate = self;
then, add this method to your class:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer{
return YES;
}
Add your designated selector to the class and handle it there (in this case handleSwipeGestureRightMethod), and you are good to go.
You could put a UIView over your UIWebView, override touchesBegan etc., and then send them on to your web view. Example:
User touches your UIView, which provokes a
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
// Execute your code then send a touchesBegan to your webview like so:
[webView touchesBegan:touches withEvent:event];
return;
}
Your UIView has to be on top of the web view.
I'm not sure if this is what you want (it's not what you asked for, but it might work depending on what your end game is), but you could instead interpret the touches in JavaScript from inside the UIWebView, and get JavaScript to do:
document.location='http://null/'+xCoord+'/'+yCoord; // Null is arbitrary.
Then you can catch that using the UIWebView's delegate method
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
And if the request.URL.host (or whatever it is) isEqualToString:@"null", take the relevant action (and return NO instead of YES). You can even add the JS to each page by doing something like:
- (void)webViewDidFinishLoad:(UIWebView *)webView {
[webView stringByEvaluatingJavaScriptFromString:#"window.ontouchstart=function(/* ... */);"];
}
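For completeness, a sketch of the delegate side described above (the "null" host and the /x/y path layout come from the snippet; the parsing details are assumptions):
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
{
    if ([request.URL.host isEqualToString:@"null"]) {
        NSArray *parts = request.URL.pathComponents; // e.g. "/", "120", "340"
        if ([parts count] >= 3) {
            CGFloat x = [[parts objectAtIndex:1] floatValue];
            CGFloat y = [[parts objectAtIndex:2] floatValue];
            NSLog(@"Touch reported by JavaScript at %f, %f", x, y);
        }
        return NO; // swallow the fake navigation
    }
    return YES;
}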
Hope this helps?
Handling gestures on a UIWebView is discussed in this Apple Developer forum thread.
Using the info given there, there will be no need for an extra view in most or all cases, and as mentioned here before, overriding UIWebView is not the way to go.
A copy-paste of the most important post in the thread:
This is a known issue. The UIWebView has its own UITapGestureRecognizers, and they're on a private subview of the UIWebView itself. UIGestureRecognizer precedence defines that gestures attached to views deeper in the view hierarchy will exclude ones on superviews, so the web view's tap gestures will always win over yours.
If it's okay in your case to allow your tap to happen along with the web view's normal tap your best solution would be to implement the UIGestureRecognizerDelegate method gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer and return YES for other tap gestures. This way you'll get your tap handler called, and the web view will still get its called.
If you need to be the only one handling the tap you'll have to subclass UITapGestureRecognizer so you can use the one-way overrides in UIGestureRecognizerSubclass.h, and you can then return NO from canBePreventedByGestureRecognizer: when asked if the web view's tap gesture recognizer can prevent yours.
In any case, we know about this and hope to make it easier in the future.
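A sketch of that last suggestion: a UITapGestureRecognizer subclass (the class name is a placeholder) that refuses to be prevented by the web view's own recognizers:
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface NonPreventableTapGestureRecognizer : UITapGestureRecognizer
@end

@implementation NonPreventableTapGestureRecognizer
- (BOOL)canBePreventedByGestureRecognizer:(UIGestureRecognizer *)preventingGestureRecognizer
{
    return NO; // the web view's own tap recognizers can no longer exclude this one
}
@end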
I've just found that UIWebView checks whether it responds to the - (void)webViewDidNotClick:(id)webBrowserView selector once one taps on the view area (not on a hyperlink, or any other area that should be handled specially). So you may implement that selector with your handling code :)
Do you mean your sub-classed implementation is not called when touchesBegan, touchesMoved and touchesEnded are called?
It sounds like a problem with how you've created an instance of the object. More details are required I think.
(taken form comments)
Header File
#import <UIKit/UIKit.h>
@interface MyWebView : UIWebView { } @end
Implementation File
#import "MyWebView.h"
@implementation MyWebView
- (id)initWithFrame:(CGRect)frame {
if (self = [super initWithFrame:frame]) { } return self;
}
- (void)drawRect:(CGRect)rect {
NSLog(@"MyWebView is loaded");
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
NSLog(@"touches began");
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
NSLog(@"Touches ended");
}
- (void)dealloc {
[super dealloc];
}
@end
I would try overriding -sendEvent: on UIWindow, to see if you can intercept those touch events.
Following on from what Unfalkster said, you can use the hitTest:withEvent: method to achieve what you want, but you don't have to subclass UIWindow. Just put this in your web view subclass:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
if (event.type == UIEventTypeTouches) {
// get location info
CGFloat x = point.x;
CGFloat y = point.y;
// get touches
NSSet *touches = [event allTouches];
// individual touches
for (UITouch *touch in touches) {
if (touch.phase == UITouchPhaseBegan) {
// touches began
} else if (touch.phase == UITouchPhaseMoved) {
}
// etc etc
}
}
// call the super implementation and return its result
return [super hitTest:point withEvent:event];
}
Hope that helps!
If you want to detect your own taps but disable the UIWebView's taps then you can use my solution:
-(void)recursivelyDisableTapsOnView:(UIView*)v{
for(UIView* view in v.subviews){
for(UIGestureRecognizer* g in view.gestureRecognizers){
if(g == self.ownTapRecognizer){
continue;
}
if([g isKindOfClass:[UITapGestureRecognizer class]] ||
[g isKindOfClass:[UILongPressGestureRecognizer class]] ||
[g isKindOfClass:NSClassFromString(@"UITapAndAHalfRecognizer")]){
g.enabled = NO;
}
}
[self recursivelyDisableTapsOnView:view];
}
}
- (void)webViewDidFinishLoad:(UIWebView *)webView{
[self recursivelyDisableTapsOnView:webView];
//disable selection
[webView stringByEvaluatingJavaScriptFromString:#"document.documentElement.style.webkitUserSelect='none';"];
// Disable callout
[webView stringByEvaluatingJavaScriptFromString:#"document.documentElement.style.webkitTouchCallout='none';"];
}
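For context, self.ownTapRecognizer above is assumed to be a tap recognizer you attached yourself, e.g. (handleOwnTap: is a placeholder):
// e.g. in viewDidLoad, before loading the page:
self.ownTapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleOwnTap:)];
self.ownTapRecognizer.delegate = self; // lets it recognize alongside any remaining recognizers
[webView addGestureRecognizer:self.ownTapRecognizer];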
I'm trying to handle touches on an iPhone's UITextView. I successfully managed to handle taps and other touch events by creating a subclass of UIImageView, for example, and implementing the touchesBegan method... however, that doesn't work with UITextView, apparently :(
The UITextView has user interaction and multi-touch enabled, just to be sure... but no joy. Has anyone managed to handle this?
UITextView (subclass of UIScrollView) includes a lot of event processing. It handles copy and paste and data detectors. That said, it is probably a bug that it does not pass unhandled events on.
There is a simple solution: you can subclass UITextView and implement your own touchesEnded (and the other event-handling messages). In your own versions, you should call the corresponding super implementation, e.g. [super touchesBegan:touches withEvent:event];, inside every touch-handling method.
#import "MyTextView.h" //MyTextView:UITextView
#implementation MyTextView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
NSLog(#"touchesBegan");
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
[super touchesBegan:touches withEvent:event];
NSLog(#"touchesMoved");
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
NSLog(#"****touchesEnded");
[self.nextResponder touchesEnded: touches withEvent:event];
NSLog(#"****touchesEnded");
[super touchesEnded:touches withEvent:event];
NSLog(#"****touchesEnded");
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event{
[super touches... etc];
NSLog(#"touchesCancelled");
}
If you want to handle single/double/triple taps on a UITextView, you can use UIGestureRecognizers and add them to your text view.
Here's sample code (in viewDidLoad):
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleSingleTap)];
//modify this number to recognizer number of tap
[singleTap setNumberOfTapsRequired:1];
[self.textView addGestureRecognizer:singleTap];
[singleTap release];
and
-(void)handleSingleTap{
//handle tap in here
NSLog(#"Single tap on view");
}
Hope this helps :D
Better solution (without swizzling anything or using any private API :D)
As explained below, adding new UITapGestureRecognizers to the text view does not have the expected results; the handler methods are never called. That is because the UITextView already has some tap gesture recognizers set up, and I think their delegate does not allow my gesture recognizer to work properly; changing their delegate could lead to even worse results, I believe.
Luckily, the UITextView already has the gesture recognizer I want set up; the problem is that it changes according to the state of the view (i.e. the set of gesture recognizers is different when inputting Japanese than when inputting English, and also when not in editing mode).
I solved this by overriding these in a subclass of UITextView:
- (void)addGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
[super addGestureRecognizer:gestureRecognizer];
// Check the new gesture recognizer is the same kind as the one we want to implement
// Note:
// This works because `UITextTapRecognizer` is a subclass of `UITapGestureRecognizer`
// and the text view has some `UITextTapRecognizer` added :)
if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
UITapGestureRecognizer *tgr = (UITapGestureRecognizer *)gestureRecognizer;
if ([tgr numberOfTapsRequired] == 1 &&
[tgr numberOfTouchesRequired] == 1) {
// If found then add self to its targets/actions
[tgr addTarget:self action:@selector(_handleOneFingerTap:)];
}
}
}
- (void)removeGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
// Check the new gesture recognizer is the same kind as the one we want to implement
// Read above note
if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
UITapGestureRecognizer *tgr = (UITapGestureRecognizer *)gestureRecognizer;
if ([tgr numberOfTapsRequired] == 1 &&
[tgr numberOfTouchesRequired] == 1) {
// If found then remove self from its targets/actions
[tgr removeTarget:self action:@selector(_handleOneFingerTap:)];
}
}
[super removeGestureRecognizer:gestureRecognizer];
}
- (void)_handleOneFingerTap:(UITapGestureRecognizer *)tgr
{
NSDictionary *userInfo = [NSDictionary dictionaryWithObject:tgr forKey:@"UITapGestureRecognizer"];
[[NSNotificationCenter defaultCenter] postNotificationName:@"TextViewOneFingerTapNotification" object:self userInfo:userInfo];
// Or I could have handled the action here directly ...
}
By doing it this way, no matter when the text view changes its gesture recognizers, we will always catch the tap gesture recognizer we want, and hence our handler method will be called accordingly :)
Conclusion:
If you want to add a gesture recognizer to the UITextView, you have to check whether the text view has it already.
If it does not, just do it the regular way (create your gesture recognizer, set it up, and add it to the text view) and you are done!
If it does have it, then you probably need to do something similar as above.
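If you go with the notification posted in _handleOneFingerTap: above, the observing side might look like this (a sketch; register wherever is convenient, e.g. in the owning view controller):
// e.g. in viewDidLoad:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(textViewOneFingerTap:)
                                             name:@"TextViewOneFingerTapNotification"
                                           object:nil];

- (void)textViewOneFingerTap:(NSNotification *)notification
{
    UITapGestureRecognizer *tgr = [notification.userInfo objectForKey:@"UITapGestureRecognizer"];
    CGPoint point = [tgr locationInView:tgr.view];
    NSLog(@"Text view tapped at %@", NSStringFromCGPoint(point));
}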
Old Answer
I came up with this answer by swizzling a private method because previous answers have cons and they don't work as expected. Here, rather than modifying the tapping behavior of the UITextView, I just intercept the called method and then call the original method.
Further Explanation
UITextView has a bunch of specialized UIGestureRecognizers. Each of these has a target and an action, but their target is not the UITextView itself; it's an object of the forward-declared class UITextInteractionAssistant. (This assistant is a @package ivar of UITextView, but its forward declaration is in the public header UITextField.h.)
UITextTapRecognizer recognizes taps and calls oneFingerTap: on the UITextInteractionAssistant, so we want to intercept that call :)
#import <objc/runtime.h>
// Prototype and declaration of method that is going be swizzled
// When called: self and sender are supposed to be UITextInteractionAssistant and UITextTapRecognizer objects respectively
void proxy_oneFingerTap(id self, SEL _cmd, id sender);
void proxy_oneFingerTap(id self, SEL _cmd, id sender){
[[NSNotificationCenter defaultCenter] postNotificationName:@"TextViewOneFinderTap" object:self userInfo:nil];
if ([self respondsToSelector:@selector(proxy_oneFingerTap:)]) {
[self performSelector:#selector(proxy_oneFingerTap:) withObject:sender];
}
}
...
// subclass of UITextView
// Add above method and swizzle it with.
- (void)doTrickForCatchingTaps
{
Class class = [UITextInteractionAssistant class]; // or the line below to avoid ugly warnings
//Class class = NSClassFromString(@"UITextInteractionAssistant");
SEL new_selector = @selector(proxy_oneFingerTap:);
SEL orig_selector = @selector(oneFingerTap:);
// Add the method dynamically because UITextInteractionAssistant is a private class
BOOL success = class_addMethod(class, new_selector, (IMP)proxy_oneFingerTap, "v@:@");
if (success) {
Method originalMethod = class_getInstanceMethod(class, orig_selector);
Method newMethod = class_getInstanceMethod(class, new_selector);
if ((originalMethod != nil) && (newMethod != nil)){
method_exchangeImplementations(originalMethod, newMethod); // Method swizzle
}
}
}
//... And in the UIViewController, let's say
[textView doTrickForCatchingTaps];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(textViewWasTapped:) name:@"TextViewOneFinderTap" object:nil];
- (void)textViewWasTapped:(NSNotification *)noti{
NSLog(@"%@", NSStringFromSelector(_cmd));
}
You need to assign the UITextView instance's delegate = self (assuming you want to take care of the events in the same controller).
And make sure to implement the UITextViewDelegate protocol in the interface... ex:
@interface myController : UIViewController <UITextViewDelegate> {
}
Then you can implement any of the following
- (BOOL)textViewShouldBeginEditing:(UITextView *)textView;
- (BOOL)textViewShouldEndEditing:(UITextView *)textView;
- (void)textViewDidBeginEditing:(UITextView *)textView;
- (void)textViewDidEndEditing:(UITextView *)textView;
- (BOOL)textView:(UITextView *)textView shouldChangeTextInRange:(NSRange)range replacementText:(NSString *)text;
- (void)textViewDidChange:(UITextView *)textView;
- (void)textViewDidChangeSelection:(UITextView *)textView;
I'm using a textview as a subview of a larger view. I need the user to be able to scroll the textview, but not edit it. I want to detect a single tap on the textview's superview, including on the textview itself.
Of course, I ran into the problem that the textview swallows up the touches that begin on it. Disabling user interaction would fix this, but then the user won't be able to scroll the textview.
My solution was to make the textview editable and use the textview's shouldBeginEditing delegate method to detect a tap in the textview. I simply return NO, thereby preventing editing, but now I know that the textview (and thus the superview) has been tapped. Between this method and the superview's touchesEnded method I have what I need.
I know that this won't work for people who want to get access to the actual touches, but if all you want to do is detect a tap, this approach works!
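A minimal sketch of that trick, assuming the controller is the text view's delegate (handleTapOnTextView is a placeholder for whatever you do on tap):
- (BOOL)textViewShouldBeginEditing:(UITextView *)textView
{
    // Called when the user taps the (editable) text view.
    // Treat it as a tap on the superview, then refuse editing.
    [self handleTapOnTextView]; // hypothetical tap handler
    return NO;
}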
How about making a UIScrollView and doing [scrollView addSubview:textview], which makes it possible to scroll the text view?
You can also use a Touch Down event. Wire up this event through Interface Builder.
Then add your code in the event handler:
- (IBAction)onAppIDTap:(id)sender {
//Your code
}