I have a UIView with 4 buttons on it and another UIView on top of the buttons view. The top most view contains a UIImageView with a UITapGestureRecognizer on it.
The behaviour I am trying to create is that when the user taps the UIImageView it toggles between being small in the bottom right-hand corner of the screen and animating to become larger. When it is large I want the buttons on the bottom view to be disabled, and when it is small and in the bottom right-hand corner I want the touches to be passed through to the buttons so they work as normal. I am almost there, but I cannot get the touches to pass through to the buttons unless I disable user interaction on the top view.
I have this in my initWithFrame: of the top view:
// Add a gesture recognizer to the image view
UITapGestureRecognizer *tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageTapped:)];
tapGestureRecognizer.cancelsTouchesInView = NO;
[imageView addGestureRecognizer:tapGestureRecognizer];
[tapGestureRecognizer release];
and this is my imageTapped: method:
- (void)imageTapped:(UITapGestureRecognizer *)gestureRecognizer {
    // Toggle between expanding and contracting the image
    if (expanded) {
        [self contractAnimated:YES];
        expanded = NO;
        gestureRecognizer.cancelsTouchesInView = NO;
        self.userInteractionEnabled = NO;
        self.exclusiveTouch = NO;
    }
    else {
        [self expandAnimated:YES];
        expanded = YES;
        gestureRecognizer.cancelsTouchesInView = NO;
        self.userInteractionEnabled = YES;
        self.exclusiveTouch = YES;
    }
}
With the above code, when the image is large the buttons are inactive; when I touch the image it shrinks and the buttons become active. However, the small image doesn't receive the touches and therefore won't expand.
If I set self.userInteractionEnabled = YES in both cases, then the image expands and contracts when touched but the buttons never receive touches and act as though disabled.
Is there a way to get the image to expand and contract when touched, but for the buttons underneath to only receive touches if the image is in its contracted state? Am I doing something stupid here and missing something obvious?
I am going absolutely mad trying to get this to work so any help would be appreciated,
Dave
UPDATE:
For further testing I overrode the touchesBegan: and touchesCancelled: methods and called their super implementations on my view containing the UIImageView. With the code above, the touchesCancelled: is never called and the touchesBegan: is always called.
So it would appear that the view is getting the touches, they are just not passed to the view underneath.
UPDATE
Is this because of the way the responder chain works? My view hierarchy looks like this:
VC - View1
    - View2
        - imageView1 (has tapGestureRecognizer)
        - imageView2
    - View3
        - button1
        - button2
I think the OS first does a hit test and says View2 is in front, so it should get all the touches, and these are never passed on to View3 unless userInteractionEnabled is set to NO for View2, in which case imageView1 is also prevented from receiving touches. Is this how it works, and is there a way for View2 to pass its touches through to View3?
The UIGestureRecognizer is a red herring I think. In the end to solve this I overrode the pointInside:withEvent: method of my UIView:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    BOOL pointInside = NO;
    if (CGRectContainsPoint(imageView.frame, point) || expanded) pointInside = YES;
    return pointInside;
}
This causes the view to trap all touches when its expanded flag is set; when it is not expanded, it only traps touches that land on the imageView.
By returning NO, the top level VC's View queries the rest of its view hierarchy looking for a hit.
Select your View in Storyboard or XIB and uncheck User Interaction Enabled in the Attributes inspector.
Or in Swift:
view.isUserInteractionEnabled = false
Look into the UIGestureRecognizerDelegate Protocol. Specifically, gestureRecognizer:shouldReceiveTouch:
You'll want to make each UIGestureRecognizer a property of your UIViewController,
// .h
@property (nonatomic, strong) UITapGestureRecognizer *lowerTap;

// .m
@synthesize lowerTap;

// When you are adding the gesture recognizer to the image view
self.lowerTap = tapGestureRecognizer;
Make sure you make your UIViewController a delegate,
[self.lowerTap setDelegate: self];
Then, you'd have something like this,
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    if (expanded && gestureRecognizer == self.lowerTap) {
        return NO;
    }
    else {
        return YES;
    }
}
Of course, this isn't exact code. But this is the general pattern you'd want to follow.
I have another solution. I have two views, let's call them CustomSubView, that were overlapping and should both receive the touches. So I have a view controller and a custom UIView class, let's call it ViewControllerView, that I set in Interface Builder; then I added the two views that should receive the touches to that view.
So I intercepted the touches in ViewControllerView by overriding hitTest:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    return self;
}
Then I overrode this in ViewControllerView:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    for (UIView *subview in [self.subviews reverseObjectEnumerator])
    {
        if ([subview isKindOfClass:[CustomSubView class]])
        {
            [subview touchesBegan:touches withEvent:event];
        }
    }
}
Do the exact same with touchesMoved, touchesEnded and touchesCancelled.
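For example, the touchesMoved: version would look like this (a sketch following the same pattern; touchesEnded: and touchesCancelled: are identical apart from the method name):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    // Walk the subviews and hand the touches to every CustomSubView
    for (UIView *subview in [self.subviews reverseObjectEnumerator])
    {
        if ([subview isKindOfClass:[CustomSubView class]])
        {
            [subview touchesMoved:touches withEvent:event];
        }
    }
}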
@Magic Bullet Dave's solution, but in Swift
Swift 3
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    var pointInside = false
    if commentTextField.frame.contains(point) {
        pointInside = true
    } else {
        commentTextField.resignFirstResponder()
    }
    return pointInside
}
I use it in my CameraOverlayView for ImagePickerViewController.cameraOverlay to give the user the ability to comment while taking a new photo.
Related
In my iPhone app I have a messaging screen. I have added a UITapGestureRecognizer to the UIViewController's view, and I also have a UITableView on the screen. I want to select a UITableViewCell, but I can't because of the UITapGestureRecognizer: when I touch the screen, only the tap gesture action is called and the UITableView delegate's didSelectRowAtIndexPath: is never called. Could anyone please help me get both the tap gesture and tableView:didSelectRowAtIndexPath: to work? Thanks in advance.
While I prefer Matt Meyer's suggestion or my other suggestion of using a custom gesture recognizer, another solution, not involving custom gesture recognizers, would be to have your tap gesture recognizer identify whether you tapped on a cell in your tableview, and if so, manually invoke didSelectRowAtIndexPath, e.g.:
- (void)handleTap:(UITapGestureRecognizer *)sender
{
    CGPoint location = [sender locationInView:self.view];
    if (CGRectContainsPoint([self.view convertRect:self.tableView.frame fromView:self.tableView.superview], location))
    {
        CGPoint locationInTableview = [self.tableView convertPoint:location fromView:self.view];
        NSIndexPath *indexPath = [self.tableView indexPathForRowAtPoint:locationInTableview];
        if (indexPath)
            [self tableView:self.tableView didSelectRowAtIndexPath:indexPath];
        return;
    }
    // otherwise proceed with the rest of your tap handling logic
}
This is suboptimal because if you're doing anything sophisticated with your tableview (e.g. in-cell editing, custom controls, etc.), you lose that behavior, but if you're just looking to receive the didSelectRowAtIndexPath, then this might do the job. The other two approaches (separate views or the custom gesture recognizer) let you retain the full tableview functionality, but this could work if you just need something simple and you don't need the rest of the tableview's built-in capabilities.
You can use the tap gesture's delegate:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    if ([touch.view isDescendantOfView:yourTableView]) {
        return NO;
    }
    return YES;
}
Hope this helps.
An easier way to do this is to have two views: one containing the view that you want the tap gesture to be on, and one containing the tableview. You can attach the UITapGestureRecognizer to the view you want it to work on, and then it won't block your UITableView.
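A minimal sketch of that layout (the view names tapAreaView and tableContainerView, and the handleTap: action, are placeholders):

// Hypothetical sketch: tapAreaView and tableContainerView are sibling subviews
// of the view controller's view. Only tapAreaView gets the recognizer, so taps
// on the table view still reach tableView:didSelectRowAtIndexPath:.
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
[self.tapAreaView addGestureRecognizer:tap];
// Nothing is attached to tableContainerView, so the table view behaves as normal.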
Assuming you want the tap gesture to work everywhere except over the tableview, you could subclass the tap gesture recognizer, creating a recognizer that will ignore any subviews included in an array of excludedViews, preventing them from generating a successful gesture (thus passing it on to didSelectRowAtIndexPath or whatever):
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface MyTapGestureRecognizer : UITapGestureRecognizer
@property (nonatomic, strong) NSMutableArray *excludedViews;
@end

@implementation MyTapGestureRecognizer
@synthesize excludedViews = _excludedViews;

- (id)initWithTarget:(id)target action:(SEL)action
{
    self = [super initWithTarget:target action:action];
    if (self)
    {
        _excludedViews = [[NSMutableArray alloc] init];
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    CGPoint location = [[touches anyObject] locationInView:self.view];
    for (UIView *excludedView in self.excludedViews)
    {
        CGRect frame = [self.view convertRect:excludedView.frame fromView:excludedView.superview];
        if (CGRectContainsPoint(frame, location))
            self.state = UIGestureRecognizerStateFailed;
    }
}

@end
And then, when you want to use it, just specify what controls you want to exclude:
MyTapGestureRecognizer *tap = [[MyTapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
[tap.excludedViews addObject:self.tableView];
[self.view addGestureRecognizer:tap];
On a UIWebView, how can I detect a touch, but not when the user taps a URL or a control? Is it possible to handle this?
Use the UIGestureRecognizerDelegate methods:
Add UIGestureRecognizerDelegate to the declaration (i.e. your .h file).
Step 1: Set the delegate of the gesture recognizer (in the .m file, in viewDidLoad):
UITapGestureRecognizer *webViewTapped = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapAction:)];
webViewTapped.numberOfTapsRequired = 1;
webViewTapped.delegate = self;
[offScreenWebView addGestureRecognizer:webViewTapped];
[webViewTapped release];
Step 2: Implement this delegate method (in the .m file):
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
Step 3: Now implement the tapAction function:
- (void)tapAction:(UITapGestureRecognizer *)sender
{
    NSLog(@"touched");
    // Get the specific point that was touched
    CGPoint point = [sender locationInView:self.view];
}
The accepted answer is great if you only need to detect taps. If you need to detect all touches, the best way is to create a new UIView subclass and place it over the webview. In the subclass you can detect touches using hitTest:
TouchOverlay.h
@class TouchOverlay;

@protocol TouchOverlayDelegate <NSObject>
@optional
- (void)touchOverlayTouched:(TouchOverlay *)touchOverlay;
@end

@interface TouchOverlay : UIView
@property (nonatomic, unsafe_unretained) id <TouchOverlayDelegate> delegate;
@end
TouchOverlay.m
@implementation TouchOverlay

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    return self;
}

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self) {
        if (self.delegate && [self.delegate respondsToSelector:@selector(touchOverlayTouched:)]) {
            [self.delegate touchOverlayTouched:self];
        }
        return nil; // Tell the OS to keep looking for a responder
    }
    return hitView;
}

@end
Note that the accepted answer above will only capture tap gestures (touchDown and touchUp without a drag in between), and that swipe gestures will be ignored.
For my purposes I needed to be informed of both, and so I added swipe gesture recognizers appropriately. (Note that despite being a bit field, you can't OR together swipe gesture recognizers' direction property, so 4 gesture recognizers are required to detect any swipe).
// Note that despite being a bit field, you can't `OR` together swipe gesture
// recognizers' `direction` property, so 4 gesture recognizers are required
// to detect any swipe
for (NSNumber *swipeDirection in @[@(UISwipeGestureRecognizerDirectionUp), @(UISwipeGestureRecognizerDirectionDown), @(UISwipeGestureRecognizerDirectionLeft), @(UISwipeGestureRecognizerDirectionRight)]) {
    UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(timerReset:)];
    swipe.direction = [swipeDirection integerValue];
    swipe.delegate = self;
    [rootWebView addGestureRecognizer:swipe];
}
Everything that inherits from UIResponder can handle touches (so does UIWebView). Read the doc:
http://developer.apple.com/library/ios/#documentation/uikit/reference/UIResponder_Class/Reference/Reference.html
You'll have to use:
touchesBegan:withEvent:
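For instance, a minimal sketch of such an override in a UIView subclass laid over the web view (the class name is just an illustration):

// Sketch: a UIView subclass that logs the touch and passes it along the
// responder chain so normal handling still happens.
@interface TouchLoggingView : UIView
@end

@implementation TouchLoggingView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesBegan on the overlay");
    [super touchesBegan:touches withEvent:event];
}

@end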
Edit: Adding the comment here for clarity.
I believe there's no clean way of doing it; you can either override the hitTest:withEvent: method (as in the answer above) or resort to a hack like overriding UIView.
Do you mean you want to override the options that pop up when the user holds down on a link? I managed to get one to work with this tutorial/guide, but the one posted here is still slightly buggy and needs some fine tuning:
http://www.icab.de/blog/2010/07/11/customize-the-contextual-menu-of-uiwebview/
I have added a tap gesture recognizer to a view. My view has an image and a UIToolbar at the bottom with a few UIBarButtonItems, and I want to cancel any touches on these buttons. I am trying to use the following method to cancel the touch. How do I detect whether the touch is on the toolbar or on any of the bar buttons? frame is also not defined for bar button items...
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    if (gestureRecognizer == tapRecognizer) {
        if (touch.view == barbutton /* toolbar or bar button item */)
        {
            return NO;
        }
    }
    return YES;
}
CGPoint location = [touch locationInView:self.view];
if(CGRectContainsPoint(toolbar.frame, location)) { ... }
This is assuming the toolbar and self.view are in the same coordinate space. If not, you'll have to use UIView's coordinate conversion methods (convertPoint:toView:) to make the spaces match.
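For instance, the conversion could look roughly like this inside the shouldReceiveTouch: delegate method above (a sketch, assuming toolbar is an outlet):

// Sketch: if the toolbar and self.view are in different coordinate spaces,
// convert the touch point into the toolbar's superview's space before
// testing it against the toolbar's frame.
CGPoint location = [touch locationInView:self.view];
CGPoint locationInToolbarSpace = [self.view convertPoint:location toView:toolbar.superview];
if (CGRectContainsPoint(toolbar.frame, locationInToolbarSpace)) {
    return NO; // don't let the tap recognizer swallow toolbar touches
}
return YES;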
Buttons are the first responder, and their touch-up (or other) events will fire first and won't propagate to the backing view.
You can subclass your buttons and have touchesBegan/Moved/Ended do:
[self.nextResponder touchesBegan:touches withEvent:event];
to have your backing view handle all their events for them in which case your gesture code should work.
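A sketch of such a button subclass (the class name is hypothetical):

// Illustrative UIButton subclass: it forwards every touch phase up the
// responder chain so the backing view still sees the touches.
@interface PassThroughButton : UIButton
@end

@implementation PassThroughButton

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    [self.nextResponder touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    [self.nextResponder touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    [self.nextResponder touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    [self.nextResponder touchesCancelled:touches withEvent:event];
}

@end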
I need to do something like this:
A UIScrollView with UIImageViews as its subviews. When the user taps a UIImageView, an action occurs. But when the user wants to scroll, the UIScrollView should scroll even if the scrolling started on a UIImageView (at the location where the UIImageView's UIImage is displayed).
Basically I can get one of two scenarios:
If the user taps a UIImageView (which is a subview of the UIScrollView) an action occurs, but when you try to scroll by dragging a finger from the UIImageView the action also occurs (and I want a scroll to occur instead).
I can make the view scroll regardless of where the user taps, but then if the user taps a UIImageView the action will NOT occur.
I can't give you any of my code because I'm testing a lot of approaches here and there and it's a bit messy, so it would be no use at all (without tons of commenting).
Is there a clean and simple solution for doing this?
Ok here is some code:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (isDragging == NO)
    {
        return [super hitTest:point withEvent:event];
    }
    NSLog(@"dragging ><><><><><>>><><><><>");
    return nil;
}
Now if I return nil then I can scroll, but I can't tap my UIImageView for an action. If I return [super hitTest:point withEvent:event] I can't scroll over my UIImageView.
isDragging is test code to determine whether I'm trying to scroll or just tap. But the hit test occurs before I can set the isDragging property according to the event that is happening.
Here is my init
- (id)initWithCoder:(NSCoder *)aDecoder
{
    if ((self = [super initWithCoder:aDecoder]))
    {
        [self setUserInteractionEnabled:YES];
        UISwipeGestureRecognizer *swipeRecLeft = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipe)];
        swipeRecLeft.direction = UISwipeGestureRecognizerDirectionDown;
        UISwipeGestureRecognizer *swipeRecRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipe)];
        swipeRecRight.direction = UISwipeGestureRecognizerDirectionUp;
        UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapSingle)];
        [self addGestureRecognizer:swipeRecRight];
        [self addGestureRecognizer:swipeRecLeft];
        [self addGestureRecognizer:singleTap];
        [swipeRecLeft release];
        [swipeRecRight release];
        [singleTap release];
        isDragging = NO;
    }
    else {
        self = nil;
    }
    return self;
}
Here are the rest of the actions:
- (void)tapSingle
{
    [self.delegate hitOccur];
}

- (void)swipe
{
    isDragging = YES;
}
And on top of that, I've set a delegate for my UIScrollView, and when scrolling ends I set the isDragging property manually to NO on each subview of my UIScrollView.
It's working... but it's not perfect. To actually scroll the content I must swipe TWICE on a UIImageView (the first swipe only sets isDragging to YES, and only then can I scroll...). How do I do this right and proper?
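For reference, the scroll view delegate callback mentioned above looks roughly like this (a sketch; it assumes isDragging is exposed as a property on ResponsiveBookView):

// Sketch: once scrolling settles, clear the flag on every ResponsiveBookView subview.
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    for (UIView *subview in scrollView.subviews) {
        if ([subview isKindOfClass:[ResponsiveBookView class]]) {
            ((ResponsiveBookView *)subview).isDragging = NO;
        }
    }
}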
LATEST UPDATE:
OK, I've managed to solve this problem. However, I'm dead sure my way isn't a clean or good one (but regardless, it works).
In my UIScrollView subclass I've overridden the hitTest method with this:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (!self.dragging)
    {
        if ([[super hitTest:point withEvent:event] class] == [ResponsiveBookView class])
        {
            container = [super hitTest:point withEvent:event];
        }
    }
    return self;
}
Here container is declared as id container and it holds my UIView subclass, so I can recognise whether the touch was on my image or on the scroll view itself. Now I need to detect whether it is scrolling or just touching, and I do this here:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.dragging) {
        NSLog(@"touch touch touch");
        [container tapSingle];
        [self.nextResponder touchesEnded:touches withEvent:event];
    }
    [super touchesEnded:touches withEvent:event];
}
As you can see, if self.dragging (scrolling) the default behaviour applies. If !self.dragging I manually call tapSingle on my container (which then makes the action "occur"). It works!
Yeah, use gesture recognizers. If you add a UITapGestureRecognizer to each UIImageView (using the UIView -addGestureRecognizer: method), you should get both recognition of the taps and the default scrolling behavior.
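A minimal sketch of that setup (the imageViews array and the imageViewTapped: action are placeholders):

// Sketch: one tap recognizer per image view; the UIScrollView's own pan
// recognizer keeps handling drags, so scrolling still works.
for (UIImageView *imageView in imageViews) {
    imageView.userInteractionEnabled = YES; // image views default to NO
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageViewTapped:)];
    [imageView addGestureRecognizer:tap];
}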
I'm trying to handle touches on an iPhone's UITextView. I successfully managed to handle taps and other touch events by creating a subclass of UIImageView, for example, and implementing the touchesBegan method... however, that doesn't work with the UITextView apparently :(
The UITextView has user interaction and multi-touch enabled, just to be sure... but no joy. Has anyone managed to handle this?
UITextView (subclass of UIScrollView) includes a lot of event processing. It handles copy and paste and data detectors. That said, it is probably a bug that it does not pass unhandled events on.
There is a simple solution: you can subclass UITextView and implement your own touchesEnded (and the other event handling messages). In your own versions, you should call [super touchesBegan:touches withEvent:event]; (or the corresponding super method) inside every touch handling method.
#import "MyTextView.h" //MyTextView:UITextView
#implementation MyTextView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
NSLog(#"touchesBegan");
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
[super touchesBegan:touches withEvent:event];
NSLog(#"touchesMoved");
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
NSLog(#"****touchesEnded");
[self.nextResponder touchesEnded: touches withEvent:event];
NSLog(#"****touchesEnded");
[super touchesEnded:touches withEvent:event];
NSLog(#"****touchesEnded");
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event{
[super touches... etc];
NSLog(#"touchesCancelled");
}
If you want to handle single/double/triple taps on a UITextView, you can use UIGestureRecognizer and add gesture recognizers to your textview.
Here's sample code (in viewDidLoad):
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleSingleTap)];
// modify this number to set the number of taps to recognize
[singleTap setNumberOfTapsRequired:1];
[self.textView addGestureRecognizer:singleTap];
[singleTap release];
and
- (void)handleSingleTap {
    // handle tap in here
    NSLog(@"Single tap on view");
}
Hope this helps :D
Better solution (without swizzling anything or using any private API :D)
As explained below, adding new UITapGestureRecognizers to the text view does not have the expected results; the handler methods are never called. That is because the UITextView has some tap gesture recognizers set up already, and I think their delegate does not allow my gesture recognizer to work properly; changing their delegate could lead to even worse results, I believe.
Luckily the UITextView already has the gesture recognizer I want set up; the problem is that it changes according to the state of the view (i.e. the set of gesture recognizers is different when inputting Japanese than when inputting English, and also when not in editing mode).
I solved this by overriding these in a subclass of UITextView:
- (void)addGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
    [super addGestureRecognizer:gestureRecognizer];
    // Check the new gesture recognizer is the same kind as the one we want to implement
    // Note:
    // This works because `UITextTapRecognizer` is a subclass of `UITapGestureRecognizer`
    // and the text view has some `UITextTapRecognizer` added :)
    if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
        UITapGestureRecognizer *tgr = (UITapGestureRecognizer *)gestureRecognizer;
        if ([tgr numberOfTapsRequired] == 1 &&
            [tgr numberOfTouchesRequired] == 1) {
            // If found then add self to its targets/actions
            [tgr addTarget:self action:@selector(_handleOneFingerTap:)];
        }
    }
}

- (void)removeGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
    // Check the gesture recognizer is the same kind as the one we want to implement
    // Read above note
    if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
        UITapGestureRecognizer *tgr = (UITapGestureRecognizer *)gestureRecognizer;
        if ([tgr numberOfTapsRequired] == 1 &&
            [tgr numberOfTouchesRequired] == 1) {
            // If found then remove self from its targets/actions
            [tgr removeTarget:self action:@selector(_handleOneFingerTap:)];
        }
    }
    [super removeGestureRecognizer:gestureRecognizer];
}

- (void)_handleOneFingerTap:(UITapGestureRecognizer *)tgr
{
    NSDictionary *userInfo = [NSDictionary dictionaryWithObject:tgr forKey:@"UITapGestureRecognizer"];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"TextViewOneFingerTapNotification" object:self userInfo:userInfo];
    // Or I could have handled the action here directly ...
}
By doing it this way, no matter when the text view changes its gesture recognizers, we will always catch the tap gesture recognizer we want → hence, our handler method will be called accordingly :)
Conclusion:
If you want to add a gesture recognizer to the UITextView, you have to check that the text view does not have it already (a quick way to check is sketched after this list).
If it does not have it, just do it the regular way (create your gesture recognizer, set it up, and add it to the text view) and you are done!
If it does have it, then you probably need to do something similar to the above.
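A rough way to do that check (a sketch; textView is whatever instance you are configuring):

// Look for an existing single-finger, single-tap recognizer before adding your own.
BOOL alreadyHasSingleTap = NO;
for (UIGestureRecognizer *recognizer in textView.gestureRecognizers) {
    if ([recognizer isKindOfClass:[UITapGestureRecognizer class]]) {
        UITapGestureRecognizer *tap = (UITapGestureRecognizer *)recognizer;
        if (tap.numberOfTapsRequired == 1 && tap.numberOfTouchesRequired == 1) {
            alreadyHasSingleTap = YES;
            break;
        }
    }
}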
Old Answer
I came up with this answer by swizzling a private method because previous answers have cons and they don't work as expected. Here, rather than modifying the tapping behavior of the UITextView, I just intercept the called method and then call the original method.
Further Explanation
UITextView has a bunch of specialized UIGestureRecognizers; each of these has a target and an action, but their target is not the UITextView itself, it's an object of the forward-declared class UITextInteractionAssistant. (This assistant is a @package ivar of UITextView, but its forward declaration is in the public header: UITextField.h.)
UITextTapRecognizer recognizes taps and calls oneFingerTap: on the UITextInteractionAssistant, so we want to intercept that call :)
#import <objc/runtime.h>

// Prototype and declaration of the method that is going to be swizzled.
// When called: self and sender are supposed to be UITextInteractionAssistant and UITextTapRecognizer objects respectively
void proxy_oneFingerTap(id self, SEL _cmd, id sender);
void proxy_oneFingerTap(id self, SEL _cmd, id sender){
    [[NSNotificationCenter defaultCenter] postNotificationName:@"TextViewOneFinderTap" object:self userInfo:nil];
    if ([self respondsToSelector:@selector(proxy_oneFingerTap:)]) {
        [self performSelector:@selector(proxy_oneFingerTap:) withObject:sender];
    }
}
...
// subclass of UITextView
// Add the above method and swizzle it in.
- (void)doTrickForCatchingTaps
{
    Class class = [UITextInteractionAssistant class]; // or the line below to avoid ugly warnings
    //Class class = NSClassFromString(@"UITextInteractionAssistant");
    SEL new_selector = @selector(proxy_oneFingerTap:);
    SEL orig_selector = @selector(oneFingerTap:);
    // Add the method dynamically because UITextInteractionAssistant is a private class
    BOOL success = class_addMethod(class, new_selector, (IMP)proxy_oneFingerTap, "v@:@");
    if (success) {
        Method originalMethod = class_getInstanceMethod(class, orig_selector);
        Method newMethod = class_getInstanceMethod(class, new_selector);
        if ((originalMethod != nil) && (newMethod != nil)){
            method_exchangeImplementations(originalMethod, newMethod); // Method swizzle
        }
    }
}
//... And in the UIViewController, let's say
[textView doTrickForCatchingTaps];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(textViewWasTapped:) name:@"TextViewOneFinderTap" object:nil];
- (void)textViewWasTapped:(NSNotification *)noti {
    NSLog(@"%@", NSStringFromSelector(_cmd));
}
You need to assign the UITextView instance's delegate = self (assuming you want to take care of the events in the same controller).
And make sure to implement the UITextViewDelegate protocol in the interface... ex:
@interface myController : UIViewController <UITextViewDelegate> {
}
Then you can implement any of the following
- (BOOL)textViewShouldBeginEditing:(UITextView *)textView;
- (BOOL)textViewShouldEndEditing:(UITextView *)textView;
- (void)textViewDidBeginEditing:(UITextView *)textView;
- (void)textViewDidEndEditing:(UITextView *)textView;
- (BOOL)textView:(UITextView *)textView shouldChangeTextInRange:(NSRange)range replacementText:(NSString *)text;
- (void)textViewDidChange:(UITextView *)textView;
- (void)textViewDidChangeSelection:(UITextView *)textView;
I'm using a textview as a subview of a larger view. I need the user to be able to scroll the textview, but not edit it. I want to detect a single tap on the textview's superview, including on the textview itself.
Of course, I ran into the problem that the textview swallows up the touches that begin on it. Disabling user interaction would fix this, but then the user won't be able to scroll the textview.
My solution was to make the textview editable and use the textview's shouldBeginEditing delegate method to detect a tap in the textview. I simply return NO, thereby preventing editing, but now I know that the textview (and thus the superview) has been tapped. Between this method and the superview's touchesEnded method I have what I need.
I know that this won't work for people who want to get access to the actual touches, but if all you want to do is detect a tap, this approach works!
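For what it's worth, the delegate method for the approach described above ends up being just a couple of lines (a sketch; the tap handler name is hypothetical):

// Return NO so the text view never enters editing mode, but use the call
// itself as the "it was tapped" signal.
- (BOOL)textViewShouldBeginEditing:(UITextView *)textView
{
    [self handleTapOnTextViewArea]; // hypothetical handler for the tap
    return NO;
}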
How about making a UIScrollView and calling [scrollView addSubview:textview], which makes it possible to scroll the textview?
You can also use a Touch Down event. Wire up this event through Interface Builder.
Then add code in your event handler
- (IBAction)onAppIDTap:(id)sender {
    // Your code
}
}