Detect touch down & touch up in a view containing UIControls - iPhone

I'm trying to detect when a user touches down and up in a view. The view contains some buttons and other UIKit controls. I would like to detect these touch events but not consume them.
I've tried two approaches, but neither has been sufficient:
First, I added a transparent overlay that overrides -touchesBegan:withEvent: and -touchesEnded:withEvent: and forwards the event to the next responder with
[self.nextResponder touchesBegan:touches withEvent:event]
However, it seems that UIKit objects ignore all forwarded events.
Next, I tried overriding -pointInside:withEvent: and -hitTest:withEvent: . This worked great to detect touch down events, but pointInside:: and hitTest:: are not called on touch up (i.e. [[[event allTouches] anyObject] phase] is never equal to UITouchPhaseEnded ).
What's the best way to detect both touch down and touch up events without disturbing interaction with the underlying UIControls?

According to the Event Handling Guide for iOS, you have 3 options:
1) subclassing UIWindow to override sendEvent:
2) using an overlay view
3) designing so that you don't have to do that... so really more like 2 options.
Here is a simplification of Apple's example, using a UIWindow subclass:
1) change the class of your window in the NIB to your subclass of UIWindow.
2) put this method in the .m file.
- (void)sendEvent:(UIEvent *)event
{
    NSSet *allTouches = [event allTouches];
    NSMutableSet *began = nil;
    NSMutableSet *moved = nil;
    NSMutableSet *ended = nil;
    NSMutableSet *cancelled = nil;
    // sort the touches by phase so we can handle them similarly to normal event dispatch
    for (UITouch *touch in allTouches) {
        switch ([touch phase]) {
            case UITouchPhaseBegan:
                if (!began) began = [NSMutableSet set];
                [began addObject:touch];
                break;
            case UITouchPhaseMoved:
                if (!moved) moved = [NSMutableSet set];
                [moved addObject:touch];
                break;
            case UITouchPhaseEnded:
                if (!ended) ended = [NSMutableSet set];
                [ended addObject:touch];
                break;
            case UITouchPhaseCancelled:
                if (!cancelled) cancelled = [NSMutableSet set];
                [cancelled addObject:touch];
                break;
            default:
                break;
        }
    }
    // call our methods to handle the touches (after the loop, so each phase is handled once per event)
    if (began) {
        NSLog(@"the following touches began: %@", began);
    }
    if (moved) {
        NSLog(@"the following touches were moved: %@", moved);
    }
    if (ended) {
        NSLog(@"the following touches were ended: %@", ended);
    }
    if (cancelled) {
        NSLog(@"the following touches were cancelled: %@", cancelled);
    }
    [super sendEvent:event];
}
It has way too much output, but you will get the idea... and can make your logic fit where you want it to.
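For completeness, option 2 (the overlay view) can be sketched as a transparent view whose hitTest:withEvent: observes the touch point and then returns nil so the event continues to the views underneath. This is only a sketch, not Apple's example, and keep in mind the original question found that this style reports touch-down but not touch-up:
// A sketch of the overlay-view option: observe the point, then decline
// the hit so the controls underneath still receive the touch.
@interface TouchObservingOverlay : UIView
@end

@implementation TouchObservingOverlay

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if ([self pointInside:point withEvent:event]) {
        NSLog(@"observed a touch at %@", NSStringFromCGPoint(point));
    }
    return nil; // let the touch fall through to whatever is below
}

@end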


UITouch returns UIWebBrowserView (Apple internal component) instead of UIWebView

I have a UIWebView and I want to detect any touch on it. (I don't want to use UITapGestureRecognizer or anything like that.)
I am using the sendEvent: method of UIApplication for this purpose
and check whether the touch object contains the web view.
Surprisingly, it points to UIWebBrowserView. I have to check its superview to get the browser, but that makes my code very inefficient because sendEvent is called every time the user taps.
Code snippet:
- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event];
    NSSet *touches = [event allTouches];
    if (touches.count != 1)
        return;
    UITouch *touch = touches.anyObject;
    if ([touch.view isKindOfClass:[UIWebView class]]) { // This fails
    }
}
I want to know: is there a way to make UITouch return the UIWebView as the object instead of returning its child views like UIPDFView or UIWebBrowserView?
Finally I found a solution to this.
We have to use:
if ([touch.view isMemberOfClass:[UIWebView class]]) { // This works
}
instead of:
if ([touch.view isKindOfClass:[UIWebView class]]) { // This fails
}
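If the check fails because the touch actually lands on an internal subview, a hedged alternative (not part of the original answer) is to walk the superview chain from touch.view until a UIWebView turns up:
// A sketch: climb from touch.view until we find a UIWebView or run out
// of superviews. WebViewContainingView is a made-up helper name.
static UIWebView *WebViewContainingView(UIView *view)
{
    UIView *current = view;
    while (current) {
        if ([current isKindOfClass:[UIWebView class]]) {
            return (UIWebView *)current;
        }
        current = current.superview;
    }
    return nil;
}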

NSSet of touchesShouldBegin: of UIScrollView is just one NSString

I have a subclass of UIScrollView, which is also the delegate.
When the following method is called:
- (BOOL)touchesShouldBegin:(NSSet *)touches withEvent:(UIEvent *)event inContentView:(UIView *)view
{
    id element;
    NSEnumerator *setEnum = [touches objectEnumerator];
    while ((element = [setEnum nextObject]) != nil)
    {
        NSLog(@"element: %@", element);
    }
    return [super touchesShouldBegin:touches withEvent:event inContentView:view];
}
The only thing that NSLog is showing is:
element:<UITouch: 0x15a9a0> phase: Ended tap count: 3 window: <UIWindow: 0x392390; frame = (0 0; 320 480); layer = <UIWindowLayer: 0x382cd0>> view: <UIButton: 0x3a5e90; frame = (0 0; 106 138); opaque = NO; layer = <CALayer: 0x3a3e90>> location in window: {228, 126} previous location in window: {227, 126} location in view: {89.6667, 77} previous location in view: {88.3333, 77}
The problem is, it shows the content of the NSSet as one big NSString. If I ask allObjects from the objectEnumerator, I just get one object in an NSArray. Exactly the same object (the NSString) as NSLog is showing.
Could someone please tell me if I am doing something wrong, or whether it is normal that the NSSet gives just one NSString?
Thank you!
The NSLog() output is the result of the description method called on the object. NSLog() output is designed to be human readable for debugging only.
<UITouch: 0x15a9a0> indicates that the object is a UITouch at address 0x15a9a0. The rest of the description just displays some interesting values of the UITouch.
Try:
NSLog(@"tap count: %lu", (unsigned long)[element tapCount]);
etc. and you will see that it is not just a string.
As an aside, there is no need for an explicit NSEnumerator; fast enumeration can be used to simplify the code:
for (UITouch *element in touches)
{
    NSLog(@"element: %@", element);
}
The NSSet touches is a set of UITouch instances, not NSStrings. When you send it to NSLog, it calls the "description" method of the UITouch instance which does return an NSString.

Why does UINavigationBar steal touch events?

I have a custom UIButton with a UILabel added as a subview. The button performs its selector only when I touch it about 15 points below its top edge; when I tap above that area, nothing happens.
I found out that this isn't caused by incorrect creation of the button and label, because after I shift the button down by about 15 px it works correctly.
UPDATE: I forgot to say that the button is located under the UINavigationBar, and the upper third of the button doesn't get touch events.
A view with 4 buttons is located under the navigation bar. When I touch the top of "Basketball", the back button gets the touch event; when I touch the top of "Piano", the right bar button (if it exists) gets the touch. If it doesn't exist, nothing happens.
I didn't find this behavior documented in Apple's docs.
I also found a topic related to my problem, but it has no answer either.
I noticed that if you set userInteractionEnabled to NO, the navigation bar doesn't "steal" the touches anymore.
So you have to subclass your UINavigationBar and in your CustomNavigationBar do this:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if ([self pointInside:point withEvent:event]) {
        self.userInteractionEnabled = YES;
    } else {
        self.userInteractionEnabled = NO;
    }
    return [super hitTest:point withEvent:event];
}
Info about how to subclass UINavigationBar can be found here.
I found the answer here (Apple Developer Forums).
Keith at Apple Developer Technical Support, on 18th May 2010 (iPhone OS 3):
I recommend that you avoid having touch-sensitive UI in such close proximity to the nav bar or toolbar. These areas are typically known as "slop factors" making it easier for users to perform touch events on buttons without the difficulty of performing precision touches. This is also the case for UIButtons for example.
But if you want to capture the touch event before the navigation bar or toolbar receives it, you can subclass UIWindow and override:
-(void)sendEvent:(UIEvent *)event;
I also found out that when I touch the area under the UINavigationBar, location.y is reported as 64, even though the actual touch was lower.
So I made this:
CustomWindow.h
@interface CustomWindow : UIWindow
@end
CustomWindow.m
@implementation CustomWindow

- (void)sendEvent:(UIEvent *)event
{
    BOOL flag = YES;
    switch ([event type])
    {
        case UIEventTypeTouches:
            //[self catchUIEventTypeTouches:event]; perform this if you need to do something with the event
            for (UITouch *touch in [event allTouches]) {
                if ([touch phase] == UITouchPhaseBegan) {
                    for (int i = 0; i < [self.subviews count]; i++) {
                        // GET THE FINGER LOCATION ON THE SCREEN
                        CGPoint location = [touch locationInView:[self.subviews objectAtIndex:i]];
                        // REPORT THE TOUCH
                        NSLog(@"[%@] touchesBegan (%i,%i)", [[self.subviews objectAtIndex:i] class], (NSInteger)location.x, (NSInteger)location.y);
                        if (((NSInteger)location.y) == 64) {
                            flag = NO;
                        }
                    }
                }
            }
            break;
        default:
            break;
    }
    if (!flag) return; // swallow the event
    /*IMPORTANT*/ [super sendEvent:(UIEvent *)event]; /*IMPORTANT*/
}

@end
In the AppDelegate class I use CustomWindow instead of UIWindow, as sketched below.
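For reference, a minimal sketch of that wiring, assuming the window is created in code rather than in a NIB (the root view controller name is hypothetical):
// AppDelegate.m - a sketch; replaces the default UIWindow with CustomWindow.
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[CustomWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.rootViewController = [[MyRootViewController alloc] init]; // hypothetical controller
    [self.window makeKeyAndVisible];
    return YES;
}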
Now when I touch the area under the navigation bar, nothing happens.
My buttons still don't get touch events, because I don't know how to send this event (with adjusted coordinates) to my view with the buttons.
Subclass UINavigationBar and add this method. It will cause taps to be passed through unless they are tapping a subview (such as a button).
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *v = [super hitTest:point withEvent:event];
    return v == self ? nil : v;
}
The solution for me was the following:
First:
Add an extension for UINavigationBar anywhere in your application, like so:
The following code just dispatches a notification with the point and event whenever the navigation bar is tapped.
extension UINavigationBar {
    open override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        NotificationCenter.default.post(name: NSNotification.Name(rawValue: "tapNavigationBar"), object: nil, userInfo: ["point": point, "event": event as Any])
        return super.hitTest(point, with: event)
    }
}
Then in your specific view controller you must listen to this notification by adding this line in your viewDidLoad:
NotificationCenter.default.addObserver(self, selector: #selector(tapNavigationBar), name: NSNotification.Name(rawValue: "tapNavigationBar"), object: nil)
Then you need to create the tapNavigationBar method in your view controller, like so:
@objc func tapNavigationBar(notification: Notification) {
    let pointOpt = notification.userInfo?["point"] as? CGPoint
    let eventOpt = notification.userInfo?["event"] as? UIEvent
    guard let point = pointOpt, let event = eventOpt else { return }
    let convertedPoint = YOUR_VIEW_BEHIND_THE_NAVBAR.convert(point, from: self.navigationController?.navigationBar)
    if YOUR_VIEW_BEHIND_THE_NAVBAR.point(inside: convertedPoint, with: event) {
        // Dispatch whatever you wanted in the first place.
    }
}
PS: Don't forget to remove the observer in deinit, like so:
deinit {
    NotificationCenter.default.removeObserver(self)
}
That's it... It's a little bit 'tricky', but it's a good workaround for getting a notification anytime the navigation bar is tapped, without subclassing.
I just wanted to share another perspective on solving this problem. This behavior is not a bug but a design decision: it helps users get back or navigate without needing precise touches. But when we need to put things tightly in or below the nav bar, it gets in the way.
First let's look at the code.
class MyNavigationBar: UINavigationBar {
    private var secondTap = false
    private var firstTapPoint = CGPoint.zero

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        if !self.secondTap {
            self.firstTapPoint = point
        }
        defer {
            self.secondTap = !self.secondTap
        }
        return super.point(inside: firstTapPoint, with: event)
    }
}
You might wonder why I am handling a second tap. Here is the recipe behind the solution.
The hit test is called twice per touch. On the first pass, the actual point on the window is reported, and everything goes well. On the second pass, this happens:
if the system sees a nav bar and the hit point is roughly 9 points further down on the Y axis, it gradually pulls the point back up to below 44 points, which is where the nav bar sits.
So there is a "nearby" mechanism applied to the second pass of the hit test. If we can detect that it is the second pass and call super with the first hit-test point, the job is done.
The above code does exactly that.
There are 2 things that might be causing problems.
First, did you try setUserInteractionEnabled:NO for the label?
Second, after adding the label on top of the button, you can try sending the label to the back (it might work, though I'm not sure):
[button sendSubviewToBack:label];
Please let me know if the code works :)
Your labels are huge. They start at {0,0} (the top-left corner of the button), extend over the entire width of the button, and have the height of the entire view. Check your frame data and try again.
Also, you have the option of using the UIButton property titleLabel. Maybe you are setting the title later and it goes into this label rather than your own UILabel. That would explain why the text (belonging to the button) would work, while the label would be covering the rest of the button (not letting the taps go through).
titleLabel is a read-only property, but you can customize it just as your own label (except perhaps the frame) including text color, font, shadow, etc.
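For example, a minimal sketch of customizing the built-in titleLabel instead of adding a separate UILabel (title, color, and font values are placeholders):
// A sketch: style the button's own titleLabel rather than adding a custom UILabel.
[button setTitle:@"Basketball" forState:UIControlStateNormal];
[button setTitleColor:[UIColor whiteColor] forState:UIControlStateNormal];
button.titleLabel.font = [UIFont boldSystemFontOfSize:14.0];
button.titleLabel.shadowOffset = CGSizeMake(0, 1);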
This solved my problem.
I added hitTest:withEvent: code to my nav bar subclass:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    int errorMargin = 5; // space left to decrease the click event area
    CGRect smallerFrame = CGRectMake(0, 0 - errorMargin, self.frame.size.width, self.frame.size.height);
    BOOL isTouchAllowed = CGRectContainsPoint(smallerFrame, point);
    if (isTouchAllowed) {
        self.userInteractionEnabled = YES;
    } else {
        self.userInteractionEnabled = NO;
    }
    return [super hitTest:point withEvent:event];
}
Extending Alexander's solution:
Step 1. Subclass UIWindow
@interface ChunyuWindow : UIWindow {
    NSMutableArray *_views;
@private
    UIView *_touchView;
}

- (void)addViewForTouchPriority:(UIView *)view;
- (void)removeViewForTouchPriority:(UIView *)view;

@end

// .m file
// #import "ChunyuWindow.h"

@implementation ChunyuWindow

- (void)dealloc {
    TT_RELEASE_SAFELY(_views);
    [super dealloc];
}

- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (UIEventSubtypeMotionShake == motion
        && [TTNavigator navigator].supportsShakeToReload) {
        // If you're going to use a custom navigator implementation, you need to ensure that you
        // implement the reload method. If you're inheriting from TTNavigator, then you're fine.
        TTDASSERT([[TTNavigator navigator] respondsToSelector:@selector(reload)]);
        [(TTNavigator *)[TTNavigator navigator] reload];
    }
}

- (void)addViewForTouchPriority:(UIView *)view {
    if (!_views) {
        _views = [[NSMutableArray alloc] init];
    }
    if (![_views containsObject:view]) {
        [_views addObject:view];
    }
}

- (void)removeViewForTouchPriority:(UIView *)view {
    if (!_views) {
        return;
    }
    if ([_views containsObject:view]) {
        [_views removeObject:view];
    }
}

- (void)sendEvent:(UIEvent *)event {
    if (!_views || _views.count == 0) {
        [super sendEvent:event];
        return;
    }
    UITouch *touch = [[event allTouches] anyObject];
    switch (touch.phase) {
        case UITouchPhaseBegan: {
            for (UIView *view in _views) {
                if (CGRectContainsPoint(view.frame, [touch locationInView:[view superview]])) {
                    _touchView = view;
                    [_touchView touchesBegan:[event allTouches] withEvent:event];
                    return;
                }
            }
            break;
        }
        case UITouchPhaseMoved: {
            if (_touchView) {
                [_touchView touchesMoved:[event allTouches] withEvent:event];
                return;
            }
            break;
        }
        case UITouchPhaseCancelled: {
            if (_touchView) {
                [_touchView touchesCancelled:[event allTouches] withEvent:event];
                _touchView = nil;
                return;
            }
            break;
        }
        case UITouchPhaseEnded: {
            if (_touchView) {
                [_touchView touchesEnded:[event allTouches] withEvent:event];
                _touchView = nil;
                return;
            }
            break;
        }
        default: {
            break;
        }
    }
    [super sendEvent:event];
}

@end
Step 2: Assign a ChunyuWindow instance to the AppDelegate's window (the same wiring as the CustomWindow sketch earlier).
Step 3: Implement touchesEnded:withEvent: for the view with buttons, for example:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:_buttonsView]; // a subview containing the buttons
    for (UIButton *button in _buttons) {
        if (CGRectContainsPoint(button.frame, point)) {
            [self onTabButtonClicked:button];
            break;
        }
    }
}
Step 4: Call ChunyuWindow's addViewForTouchPriority: when the view we care about appears, and call removeViewForTouchPriority: when it disappears or is deallocated, i.e. in viewDidAppear/viewDidDisappear/dealloc of the view controllers. When no priority view is registered, _touchView in ChunyuWindow stays nil and the window behaves exactly like a plain UIWindow, with no side effects.
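A short sketch of Step 4, going through the app delegate's window because self.view.window may already be nil by viewDidDisappear (_buttonsView is the subview from Step 3):
// A sketch of Step 4 in the view controller that owns _buttonsView.
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    ChunyuWindow *window = (ChunyuWindow *)[[[UIApplication sharedApplication] delegate] window];
    [window addViewForTouchPriority:_buttonsView];
}

- (void)viewDidDisappear:(BOOL)animated
{
    ChunyuWindow *window = (ChunyuWindow *)[[[UIApplication sharedApplication] delegate] window];
    [window removeViewForTouchPriority:_buttonsView];
    [super viewDidDisappear:animated];
}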
An alternative solution that worked for me, based on the answer provided by Alexander:
self.navigationController?.barHideOnTapGestureRecognizer.isEnabled = false
Instead of overriding the UIWindow, you can just disable the gesture recogniser responsible for the "slop area" on the UINavigationBar.
Here is a category version of Bart Whiteley's answer. No need to subclass.
@implementation UINavigationBar (Xxxxxx)

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *v = [super hitTest:point withEvent:event];
    return v == self ? nil : v;
}

@end
The following worked for me:
self.navigationController?.isNavigationBarHidden = true

Why do CALayers move slower than UIViews?

I have a UIView subclass that moves 'event' views around when the user touches them, using the following overridden method:
// In a custom UIView...
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    UIView *eventView = [self.ringOfEvents hitTest:point withEvent:event];
    for (EventView *e in self.events) {
        if (e == eventView) {
            e.position = point;
        }
    }
}
Why is it that when I make EventView a CALayer instead of UIView, the movement slows down to a creep? I can post that code too, but it is so similar that I don't think it is necessary.
I would think that abstracting to a lower level would speed up event handling, but I must be missing something.
By the way, whether EventView is a subclass of UIView or of CALayer, the position property is as follows:
- (void)setPosition:(CGPoint)pos {
    self.layer.position = pos;
}

- (CGPoint)position {
    return self.layer.position;
}
Not sure why I get a huge decrease in latency when using UIView as opposed to CALayer.
Most CALayer property changes are animated implicitly by default, so the extra latency is probably caused by those animations.
You may want to disable animations when the layer position is changed. Possible solutions are discussed, for example, here and here.
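For example, a minimal sketch (not from the original answer) that suppresses the implicit animation around a single position change using CATransaction; eventLayer is an assumed CALayer reference:
// A sketch: disable implicit actions for just this change.
[CATransaction begin];
[CATransaction setDisableActions:YES];
eventLayer.position = point; // eventLayer is an assumed CALayer
[CATransaction commit];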
This is due to implicit animations.
I've implemented a category method which removes the implicit animation for given keys; it can be used like this:
[view.layer setNullAsActionForKeys:@[@"bounds", @"position"]];
Implementation
@implementation CALayer (Extensions)

- (void)setNullAsActionForKeys:(NSArray *)keys
{
    NSMutableDictionary *dict = [self.actions mutableCopy];
    if (dict == nil)
    {
        dict = [NSMutableDictionary dictionaryWithCapacity:[keys count]];
    }
    for (NSString *key in keys)
    {
        [dict setObject:[NSNull null] forKey:key];
    }
    self.actions = dict;
}

@end

How to call a method only once during -(void)touchesMoved?

I am using - (void)touchesMoved to do stuff whenever I enter a specific frame, in this case the area of a button.
My problem is, I only want it to do stuff when I enter the frame - not when I am moving my finger inside the frame.
Does anyone know how I can call my methods only once while I am inside the frame, yet still be able to call them again if I re-enter the frame during the same touch move?
Thank you.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event touchesForView:self.view] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (CGRectContainsPoint(p1.frame, location))
    {
        // I only want the below to be called
        // once while I am inside this frame
        [self pP01];
        [p1 setHighlighted:YES];
    } else {
        [p1 setHighlighted:NO];
    }
}
You can use some attribute to check whether the code was already called when you entered the specific area. It looks like the highlighted state of the p1 object (not sure what it is) may be appropriate for that:
if (CGRectContainsPoint(p1.frame, location))
{
    if (!p1.isHighlighted) { // We entered the area but have not run the highlighting code yet
        // I only want the below to be called
        // once while I am inside this frame
        [self pP01];
        [p1 setHighlighted:YES];
    }
} else { // We left the area - so we'll call the highlighting code when we enter next time
    [p1 setHighlighted:NO];
}
Simply add a BOOL that you check in touchesMoved and reset in touchesEnded (a sketch of the literal BOOL variant follows the snippet below):
if (CGRectContainsPoint([p1 frame], [touch locationInView:self.view])) {
    NSLog(@"Touch Moved over p1");
    if (!p1.isHighlighted) {
        [self action:p1];
        p1.highlighted = YES;
    }
} else {
    p1.highlighted = NO;
}
Try using a UIButton and the 'Touch Drag Enter' connection in Interface Builder.
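The same hookup can be done in code (a sketch; p1 and pP01 are the names from the question):
// A sketch: UIControlEventTouchDragEnter fires when a touch dragged from
// outside enters the button's bounds.
[p1 addTarget:self action:@selector(pP01) forControlEvents:UIControlEventTouchDragEnter];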