How to detect touches in the status bar - iPhone

I have a custom view in my application which can be scrolled by the user. This view, however, does not inherit from UIScrollView. Now I want the user to be able to scroll this view to the top by tapping the status bar, just as any other scrollable view allows. I figured that there is no direct way to do so.
Google turned up one solution: http://cocoawithlove.com/2009/05/intercepting-status-bar-touches-on.html However, this no longer works on iOS 4.x, so that's a no-go.
I had the idea of creating a scroll view and keeping it around somewhere, just to catch its notifications and then forward them to my control. This is not a nice way to solve my problem, so I am looking for "cleaner" solutions. I like the general approach of the aforementioned link of subclassing UIApplication. But what API can give me reliable info?
Any thoughts, help, etc. are welcome.
Edit: Another thing I don't like about my current solution is that it only works as long as the current view does not contain any other scroll views. The scroll-to-top gesture works only if exactly one scroll view is around. As soon as the dummy (see my answer below for details) is added to a view with another scroll view, the gesture is completely disabled. Another reason to look for a better solution...

Finally, I've assembled a working solution from the answers here. Thank you guys.
Declare notification name somewhere (e.g. AppDelegate.h):
static NSString * const kStatusBarTappedNotification = @"statusBarTappedNotification";
Add the following lines to your AppDelegate.m:
#pragma mark - Status bar touch tracking

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];

    CGPoint location = [[[event allTouches] anyObject] locationInView:[self window]];
    CGRect statusBarFrame = [UIApplication sharedApplication].statusBarFrame;
    if (CGRectContainsPoint(statusBarFrame, location)) {
        [self statusBarTouchedAction];
    }
}

- (void)statusBarTouchedAction {
    [[NSNotificationCenter defaultCenter] postNotificationName:kStatusBarTappedNotification
                                                        object:nil];
}
Observe the notification in the controller that needs it (e.g. in viewWillAppear):
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(statusBarTappedAction:)
                                             name:kStatusBarTappedNotification
                                           object:nil];
Remove observer properly (e.g. in viewDidDisappear):
[[NSNotificationCenter defaultCenter] removeObserver:self name:kStatusBarTappedNotification object:nil];
Implement notification-handling callback:
- (void)statusBarTappedAction:(NSNotification *)notification {
    NSLog(@"StatusBar tapped");
    // handle the status bar tap here.
}
Hope it will help.
Swift 3 update
Tested and works on iOS 9+.
Declare notification name somewhere:
let statusBarTappedNotification = Notification(name: Notification.Name(rawValue: "statusBarTappedNotification"))
Track status bar touches and post a notification. Add the following lines to your AppDelegate.swift:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)

    let statusBarRect = UIApplication.shared.statusBarFrame
    guard let touchPoint = event?.allTouches?.first?.location(in: self.window) else { return }
    if statusBarRect.contains(touchPoint) {
        NotificationCenter.default.post(statusBarTappedNotification)
    }
}
Observe notification where necessary:
NotificationCenter.default.addObserver(forName: statusBarTappedNotification.name, object: .none, queue: .none) { _ in
    print("status bar tapped")
}

So this is my current solution, which works amazingly well. But please share other ideas, as I don't really like it...
Add a scroll view somewhere in your view. Maybe hide it or place it below some other view, etc.
Set its contentSize to be larger than its bounds.
Set a non-zero contentOffset.
In your controller, implement the scroll view delegate method shown below.
By always returning NO, the scroll view never scrolls up, and you get a notification whenever the user hits the status bar. The problem is, however, that this does not work with a "real" content scroll view around (see the question).
- (BOOL)scrollViewShouldScrollToTop:(UIScrollView *)scrollView
{
    // Do your action here
    return NO;
}

Adding this to your AppDelegate.swift will do what you want:
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    super.touchesBegan(touches, withEvent: event)

    let events = event!.allTouches()
    let touch = events!.first
    let location = touch!.locationInView(self.window)
    let statusBarFrame = UIApplication.sharedApplication().statusBarFrame
    if CGRectContainsPoint(statusBarFrame, location) {
        NSNotificationCenter.defaultCenter().postNotificationName("statusBarSelected", object: nil)
    }
}
Now you can subscribe to the event wherever you need it:
NSNotificationCenter.defaultCenter().addObserverForName("statusBarSelected", object: nil, queue: nil) { event in
    // scroll to top of a table view
    self.tableView!.setContentOffset(CGPointZero, animated: true)
}

Thanks Max, your solution worked for me after spending ages looking.
For information:
dummyScrollView = [[UIScrollView alloc] init];
dummyScrollView.delegate = self;
[view addSubview:dummyScrollView];
[view sendSubviewToBack:dummyScrollView];
then
dummyScrollView.contentSize = CGSizeMake(view.frame.size.width, view.frame.size.height+200);
// scroll it a bit, otherwise scrollViewShouldScrollToTop not called
dummyScrollView.contentOffset = CGPointMake(0, 1);
// delegate:
- (BOOL)scrollViewShouldScrollToTop:(UIScrollView *)scrollView
{
    // DETECTED! - do what you need to
    NSLog(@"scrollViewShouldScrollToTop");
    return NO;
}
Note that I also had a UIWebView, which I had to hack a bit with a solution I found somewhere:
- (void)webViewDidFinishLoad:(UIWebView *)wv
{
    [super webViewDidFinishLoad:wv];
    UIScrollView *scroller = (UIScrollView *)[[webView subviews] objectAtIndex:0];
    if ([scroller respondsToSelector:@selector(setScrollEnabled:)])
        scroller.scrollEnabled = NO;
}

Found a much better solution, which is iOS 7 compatible, here: http://ruiaureliano.tumblr.com/post/37260346960/uitableview-tap-status-bar-to-scroll-up
Add this method to your AppDelegate:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];

    CGPoint location = [[[event allTouches] anyObject] locationInView:[self window]];
    if (CGRectContainsPoint([UIApplication sharedApplication].statusBarFrame, location)) {
        NSLog(@"STATUS BAR TAPPED!");
    }
}

I implemented this by adding a clear UIView over the status bar and then intercepting the touch events.
First, in your application delegate's application:didFinishLaunchingWithOptions:, add these two lines of code:
self.window.backgroundColor = [UIColor clearColor];
self.window.windowLevel = UIWindowLevelStatusBar+1.f;
Then, in the view controller where you wish to intercept status bar taps (or in the application delegate), add the following code:
UIView *statusBarInterceptView = [[[UIView alloc] initWithFrame:[UIApplication sharedApplication].statusBarFrame] autorelease];
statusBarInterceptView.backgroundColor = [UIColor clearColor];
UITapGestureRecognizer *tapRecognizer = [[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(statusBarClicked)] autorelease];
[statusBarInterceptView addGestureRecognizer:tapRecognizer];
[[[UIApplication sharedApplication].delegate window] addSubview:statusBarInterceptView];
In the statusBarClicked selector, do what you need to do. In my case, I posted a notification to the notification center so that other view controllers can respond to the status bar tap.
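For reference, a minimal sketch of what that selector might look like (the notification name is just an illustration):
- (void)statusBarClicked {
    // Post a notification so interested view controllers can react to the tap.
    // The notification name below is only an example.
    [[NSNotificationCenter defaultCenter] postNotificationName:@"StatusBarTappedNotification"
                                                        object:nil];
}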

Use an invisible UIScrollView. Tested on iOS 10 with Swift 3.
override func viewDidLoad() {
    super.viewDidLoad()

    let scrollView = UIScrollView()
    scrollView.bounds = view.bounds
    scrollView.contentOffset.y = 1
    scrollView.contentSize.height = view.bounds.height + 1
    scrollView.delegate = self
    view.addSubview(scrollView)
}

func scrollViewShouldScrollToTop(_ scrollView: UIScrollView) -> Bool {
    debugPrint("status bar tapped")
    return false
}

You can track status bar taps by using the following code:
[[NSNotificationCenter defaultCenter] addObserverForName:@"_UIApplicationSystemGestureStateChangedNotification"
                                                  object:nil
                                                   queue:nil
                                              usingBlock:^(NSNotification *note) {
    NSLog(@"Status bar pressed!");
}];

One way, though it might not be the best, could be to put a clear UIView on top of the status bar and intercept the touches on that UIView. It might help you out if nothing else comes up...

If you're just trying to have a UIScrollView scroll to the top when the status bar is tapped, it's worth noting that this is the default behavior IF your view controller has exactly one UIScrollView in its subviews that has scrollsToTop set to YES.
So I just had to go and find any other UIScrollView (or subclasses: UITableView, UICollectionView) and set scrollsToTop to NO.
To be fair, I found this info in the post that was linked to in the original question, but it's also dismissed as no longer working so I skipped it and only found the relevant piece on a subsequent search.
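As a rough sketch of that cleanup (the outlet names below are just placeholders), it amounts to leaving scrollsToTop enabled on only one scroll view:
// Hypothetical outlets: only the scroll view that should react to the
// status bar tap keeps scrollsToTop enabled.
self.mainTableView.scrollsToTop = YES;          // the one that should scroll
self.suggestionsCollectionView.scrollsToTop = NO;
self.previewScrollView.scrollsToTop = NO;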

For iOS 13, this has worked for me: an Objective-C category on UIStatusBarManager.
@implementation UIStatusBarManager (CAPHandleTapAction)

- (void)handleTapAction:(id)arg1 {
    // Your code here
}

@end

Related

How to Implement Touch Up Inside in touchesBegan, touchesEnded

I'm wondering if someone knows how to implement the "touch up inside" response when a user pushes down then lifts their finger in the touchesBegan, touchesEnded methods. I know this can be done with UITapGestureRecognizer, but actually I'm trying to make it so that it only works on a quick tap (with UITapGestureRecognizer, if you hold your finger there for a long time, then lift, it still executes). Anyone know how to implement this?
Using a UILongPressGestureRecognizer is actually a much better solution to mimic all of the functionality of a UIButton (touchUpInside, touchUpOutside, touchDown, etc.):
- (void)longPress:(UILongPressGestureRecognizer *)longPressGestureRecognizer
{
    if (longPressGestureRecognizer.state == UIGestureRecognizerStateBegan || longPressGestureRecognizer.state == UIGestureRecognizerStateChanged)
    {
        CGPoint touchedPoint = [longPressGestureRecognizer locationInView:self];
        if (CGRectContainsPoint(self.bounds, touchedPoint))
        {
            [self addHighlights];
        }
        else
        {
            [self removeHighlights];
        }
    }
    else if (longPressGestureRecognizer.state == UIGestureRecognizerStateEnded)
    {
        if (self.highlightView.superview)
        {
            [self removeHighlights];
        }

        CGPoint touchedPoint = [longPressGestureRecognizer locationInView:self];
        if (CGRectContainsPoint(self.bounds, touchedPoint))
        {
            if ([self.delegate respondsToSelector:@selector(buttonViewDidTouchUpInside:)])
            {
                [self.delegate buttonViewDidTouchUpInside:self];
            }
        }
    }
}
I'm not sure when it was added, but the property isTouchInside is a life saver for any UIControl derived object (e.g. UIButton).
override func endTracking(_ touch: UITouch?, with event: UIEvent?) {
    super.endTracking(touch, with: event)
    if isTouchInside {
        // Do the thing you want to do
    }
}
Here are the official Apple docs.
You can implement touchesBegan and touchesEnded by creating a UIView subclass and implementing it there.
However you can also use a UILongPressGestureRecognizer and achieve the same results.
I did this by putting a timer that gets triggered in touchesBegan. If this timer is still running when touchesEnded gets called, then execute whatever code you wanted to. This gives the effect of touchUpInside.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Start a short one-shot timer; if it is still valid in touchesEnded, the touch was a quick tap.
    self.tapTimer = [NSTimer scheduledTimerWithTimeInterval:0.15
                                                     target:self
                                                   selector:@selector(tapTimerFired:)
                                                   userInfo:nil
                                                    repeats:NO];
}

- (void)tapTimerFired:(NSTimer *)timer
{
    // The touch was held longer than the tap threshold; nothing to do here.
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([self.tapTimer isValid])
    {
        // The finger lifted within the interval: treat this as a "touch up inside".
    }
}
You can create a BOOL variable; then in -touchesBegan check whether your view (or whatever you need) was touched and set the BOOL to YES. After that, in -touchesEnded, check whether the variable is YES and whether your view was touched again; that will be your -touchUpInside. And of course set the BOOL back to NO afterwards. A sketch of this idea follows.
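A minimal sketch of that idea, assuming a hypothetical _touchStartedInside ivar and a targetView property:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Remember whether the touch started inside the view we care about.
    CGPoint point = [[touches anyObject] locationInView:self.targetView];
    _touchStartedInside = CGRectContainsPoint(self.targetView.bounds, point);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.targetView];
    if (_touchStartedInside && CGRectContainsPoint(self.targetView.bounds, point)) {
        // Began and ended inside the view: treat it as a "touch up inside".
    }
    _touchStartedInside = NO;
}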
You can add a UITapGestureRecognizer and a UILongPressGestureRecognizer and add a dependency using [tap requireGestureRecognizerToFail:longPress]; (tap and longPress being the added recognizer objects).
This way, the tap will not be detected if the long press fires.
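A short sketch of that setup (handleTap: and handleLongPress: are assumed action methods):
// Tap only fires if the long press fails, so a long hold is not treated as a tap.
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLongPress:)];
[tap requireGestureRecognizerToFail:longPress];
[self.view addGestureRecognizer:tap];
[self.view addGestureRecognizer:longPress];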

-[CustomWindow hitTest:withEvent:] implementation to forward events

I have a custom window (that should be on top of everything, including the keyboard) to show an overlay, something like the overlay you see when pressing the volume up/down buttons on the device.
So I made a custom window, OverlayWindow. So far everything works fine and windows in the back are receiving their events normally. However, hitTest:withEvent: is called several times and sometimes it even returns nil. I wonder if that is normal/correct? If not, how can I fix it?
// A small (WIDTH_MAX:100) window in the center of the screen. If it matters
const CGSize screenSize = [[UIScreen mainScreen] bounds].size;
const CGRect rect = CGRectMake(((int)(screenSize.width - WIDTH_MAX) * 0.5),
                               ((int)(screenSize.height - WIDTH_MAX) * 0.5), WIDTH_MAX, WIDTH_MAX);
overlayWindow = [[CustomWindow alloc] initWithFrame:rect];
overlayWindow.windowLevel = UIWindowLevelStatusBar; // 1000.0
overlayWindow.hidden = NO; // I don't need it to be the key window (no makeKeyAndVisible)

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Find the front-most window (with the highest window level) and
    // call this method on that window. It should make the event be
    // forwarded to it.
    // Situation 1: This method is called twice (or even more, it depends
    // on the number of windows the app has) per event. Why? Is this the
    // *normal* behaviour?
    NSLog(@" ");
    NSLog(@"Point: %@ Event: %p\n", NSStringFromCGPoint(point), event);

    UIView *view = nil;
    if (CGRectContainsPoint(self.bounds, point)) {
        NSLog(@"inside window\n");
        NSArray *wins = [[UIApplication sharedApplication] windows];
        __block UIWindow *frontMostWin = nil;
        [wins enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
            NSLog(@"win: %@\n", obj);
            if ([obj windowLevel] >= [frontMostWin windowLevel] && obj != self) {
                frontMostWin = obj;
            }
        }];
        NSLog(@"frontMostWindow:%@\n finding a new view ...\n", frontMostWin);
        CGPoint p = [frontMostWindow convertPoint:point fromWindow:self];
        view = [frontMostWindow hitTest:p withEvent:event];
        // Situation 2: sometimes view is nil here, is that correct?
    }
    NSLog(@"resultView: %@\n", view);
    return view;
}
EDIT:
Also, I've noticed that:
if hitTest:withEvent: always returns nil, it works too. This is only when I set overlayWindow.hidden = NO;
if I call [overlayWindow makeKeyAndVisible], returning nil in hitTest:withEvent: does not always work. It looks like a key window requires a proper implementation of the hit-testing method?
Am I missing something about event forwarding here?
Does frontMostWindow mean frontMostWin?
It looks like even if we use only one UIWindow, hitTest:withEvent: will be executed on it at least 2 times. So, I guess it is normal.
You can receive nil at
view = [frontMostWindow hitTest:p withEvent:event];
due to the following reasons:
frontMostWindow is nil itself (for example, if you have only one window)
p is outside frontMostWindow's bounds (for example, when frontMostWindow is the keyboard and your touch is somewhere else)
frontMostWindow has its userInteractionEnabled property set to NO
hitTest:withEvent: being called several times is normal. This is probably because you only display a UILabel or UIImageView on the overlay window, and thus touches are automatically forwarded.
However, I think you don't really need another OverlayWindow; instead you might consider a UIView on top of the keyWindow. This should make your application cleaner...
I faced the same problem.
I tried to solve it based on your post, and found another solution.
This code is implemented in a UIWindow overlaying on top.
It will pass events through to the window underneath when the touched area contains no view.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    NSLog(@" ");
    NSLog(@"Point: %@ Event: %p\n", NSStringFromCGPoint(point), event);

    UIView *resultView = [super hitTest:point withEvent:event];
    if (resultView == self) {
        NSLog(@"touched in transparent window !!");
        return nil;
    }
    NSLog(@"touched in view!!");
    return resultView;
}
Thanks again, your post was very helpful.

Why does UINavigationBar steal touch events?

I have a custom UIButton with a UILabel added as a subview. The button performs its selector only when I touch it about 15 points below its top bound. When I tap above that area, nothing happens.
I found out that this isn't caused by wrong creation of the button and label, because after I shift the button down by about 15 px it works correctly.
UPDATE: I forgot to say that the button is located under the UINavigationBar, and the upper third of the button doesn't get touch events.
Image was here
The view with 4 buttons is located under the NavigationBar. When I touch the top of "Basketball", the back button gets the touch event, and when I touch the top of "Piano", the rightBarButton (if it exists) gets the touch. If it doesn't exist, nothing happens.
I didn't find this behaviour documented in Apple's docs.
I also found this topic related to my problem, but there is no answer there either.
I noticed that if you set userInteractionEnabled to OFF, the NavigationBar doesn't "steal" the touches anymore.
So you have to subclass your UINavigationBar and in your CustomNavigationBar do this:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if ([self pointInside:point withEvent:event]) {
        self.userInteractionEnabled = YES;
    } else {
        self.userInteractionEnabled = NO;
    }
    return [super hitTest:point withEvent:event];
}
You can find info about how to subclass UINavigationBar here.
I found the answer here (Apple Developer Forum).
Keith at Apple Developer Technical Support, on 18th May 2010 (iPhone OS 3):
I recommend that you avoid having touch-sensitive UI in such close proximity to the nav bar or toolbar. These areas are typically known as "slop factors" making it easier for users to perform touch events on buttons without the difficulty of performing precision touches. This is also the case for UIButtons for example.
But if you want to capture the touch event before the navigation bar or toolbar receives it, you can subclass UIWindow and override:
-(void)sendEvent:(UIEvent *)event;
I also found out that when I touch the area under the UINavigationBar, location.y is reported as 64, even though the touch was not actually there.
So I made this:
CustomWindow.h
@interface CustomWindow : UIWindow
@end

CustomWindow.m
@implementation CustomWindow

- (void)sendEvent:(UIEvent *)event
{
    BOOL flag = YES;
    switch ([event type])
    {
        case UIEventTypeTouches:
            //[self catchUIEventTypeTouches: event]; perform if you need to do something with the event
            for (UITouch *touch in [event allTouches]) {
                if ([touch phase] == UITouchPhaseBegan) {
                    for (int i = 0; i < [self.subviews count]; i++) {
                        // GET THE FINGER LOCATION ON THE SCREEN
                        CGPoint location = [touch locationInView:[self.subviews objectAtIndex:i]];
                        // REPORT THE TOUCH
                        NSLog(@"[%@] touchesBegan (%i,%i)", [[self.subviews objectAtIndex:i] class], (NSInteger)location.x, (NSInteger)location.y);
                        if (((NSInteger)location.y) == 64) {
                            flag = NO;
                        }
                    }
                }
            }
            break;
        default:
            break;
    }

    if (!flag) return; // do nothing
    /*IMPORTANT*/ [super sendEvent:(UIEvent *)event]; /*IMPORTANT*/
}

@end
In the AppDelegate class I use CustomWindow instead of UIWindow.
Now when I touch the area under the navigation bar, nothing happens.
My buttons still don't get touch events, because I don't know how to send this event (and change the coordinates) to my view with the buttons.
Subclass UINavigationBar and add this method. It will cause taps to be passed through unless they are tapping a subview (such as a button).
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *v = [super hitTest:point withEvent:event];
    return v == self ? nil : v;
}
The solution for me was the following one:
First:
Add an extension for UINavigationBar in your application (it doesn't matter where you put this code), like so:
The following code just dispatches a notification with the point and event when the navigationBar is tapped.
extension UINavigationBar {
    open override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        NotificationCenter.default.post(name: NSNotification.Name(rawValue: "tapNavigationBar"), object: nil, userInfo: ["point": point, "event": event as Any])
        return super.hitTest(point, with: event)
    }
}
Then in your specific view controller you must listen to this notification by adding this line in your viewDidLoad:
NotificationCenter.default.addObserver(self, selector: #selector(tapNavigationBar), name: NSNotification.Name(rawValue: "tapNavigationBar"), object: nil)
Then you need to create the method tapNavigationBar in your view controller, like so:
func tapNavigationBar(notification: Notification) {
    let pointOpt = notification.userInfo?["point"] as? CGPoint
    let eventOpt = notification.userInfo?["event"] as? UIEvent?
    guard let point = pointOpt, let event = eventOpt else { return }

    let convertedPoint = YOUR_VIEW_BEHIND_THE_NAVBAR.convert(point, from: self.navigationController?.navigationBar)
    if YOUR_VIEW_BEHIND_THE_NAVBAR.point(inside: convertedPoint, with: event) {
        // Dispatch whatever you wanted in the first place.
    }
}
PS: Don't forget to remove the observation in deinit, like so:
deinit {
    NotificationCenter.default.removeObserver(self)
}
That's it... It's a little bit 'tricky', but it's a good workaround for getting a notification anytime the navigationBar is tapped without subclassing.
I just wanted to share another perspective on solving this problem. This is not a problem by design; it is meant to help the user go back or navigate. But when we need to put things tightly in or below the nav bar, things look sad.
First, let's look at the code.
class MyNavigationBar: UINavigationBar {

    private var secondTap = false
    private var firstTapPoint = CGPointZero

    override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
        if !self.secondTap {
            self.firstTapPoint = point
        }
        defer {
            self.secondTap = !self.secondTap
        }
        return super.pointInside(firstTapPoint, withEvent: event)
    }
}
You might be wondering why I am doing second-tap handling. Here is the recipe for the solution.
The hit test is called twice per tap. The first time, the actual point on the window is reported and everything goes well. On the second pass, this happens:
If the system sees a nav bar and the hit point is around 9 pixels more on the Y side, it tries to decrease that gradually to below 44 points, which is where the nav bar is.
Take a look at the screen to be clear.
So there's a mechanism that applies nearby-slop logic on the second pass of the hit test. If we can recognize the second pass and call super with the first hit-test point, the job is done.
The code above does exactly that.
There are two things that might be causing problems.
First, did you try setUserInteractionEnabled:NO for the label?
The second thing I think might work: after adding the label on top of the button, you can send the label to the back (it might work, not sure though):
[button sendSubviewToBack:label];
Please let me know if the code works :)
Your labels are huge. They start at {0,0} (the left top corner of the button), extend over the entire width of the button and have a height of the entire view. Check your frame data and try again.
Also, you have the option of using the UIButton property titleLabel. Maybe you are setting the title later and it goes into this label rather than your own UILabel. That would explain why the text (belonging to the button) would work, while the label would be covering the rest of the button (not letting the taps go through).
titleLabel is a read-only property, but you can customize it just as your own label (except perhaps the frame) including text color, font, shadow, etc.
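For illustration, customizing the built-in titleLabel instead of adding your own UILabel could look like this (the values are arbitrary):
[button setTitle:@"Basketball" forState:UIControlStateNormal];
[button setTitleColor:[UIColor darkTextColor] forState:UIControlStateNormal];
[button setTitleShadowColor:[UIColor whiteColor] forState:UIControlStateNormal];
button.titleLabel.font = [UIFont boldSystemFontOfSize:14.0];
button.titleLabel.shadowOffset = CGSizeMake(0.0, 1.0);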
This solved my problem.
I added hitTest:withEvent: code to my navbar subclass:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    int errorMargin = 5; // space left to decrease the click event area
    CGRect smallerFrame = CGRectMake(0, 0 - errorMargin, self.frame.size.width, self.frame.size.height);
    BOOL isTouchAllowed = (CGRectContainsPoint(smallerFrame, point) == 1);
    if (isTouchAllowed) {
        self.userInteractionEnabled = YES;
    } else {
        self.userInteractionEnabled = NO;
    }
    return [super hitTest:point withEvent:event];
}
Extending Alexander's solution:
Step 1. Subclass UIWindow
@interface ChunyuWindow : UIWindow {
    NSMutableArray *_views;
@private
    UIView *_touchView;
}

- (void)addViewForTouchPriority:(UIView *)view;
- (void)removeViewForTouchPriority:(UIView *)view;

@end
// .m File
// #import "ChunyuWindow.h"
@implementation ChunyuWindow

- (void)dealloc {
    TT_RELEASE_SAFELY(_views);
    [super dealloc];
}

- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (UIEventSubtypeMotionShake == motion
        && [TTNavigator navigator].supportsShakeToReload) {
        // If you're going to use a custom navigator implementation, you need to ensure that you
        // implement the reload method. If you're inheriting from TTNavigator, then you're fine.
        TTDASSERT([[TTNavigator navigator] respondsToSelector:@selector(reload)]);
        [(TTNavigator *)[TTNavigator navigator] reload];
    }
}

- (void)addViewForTouchPriority:(UIView *)view {
    if (!_views) {
        _views = [[NSMutableArray alloc] init];
    }
    if (![_views containsObject:view]) {
        [_views addObject:view];
    }
}

- (void)removeViewForTouchPriority:(UIView *)view {
    if (!_views) {
        return;
    }
    if ([_views containsObject:view]) {
        [_views removeObject:view];
    }
}

- (void)sendEvent:(UIEvent *)event {
    if (!_views || _views.count == 0) {
        [super sendEvent:event];
        return;
    }

    UITouch *touch = [[event allTouches] anyObject];
    switch (touch.phase) {
        case UITouchPhaseBegan: {
            for (UIView *view in _views) {
                if (CGRectContainsPoint(view.frame, [touch locationInView:[view superview]])) {
                    _touchView = view;
                    [_touchView touchesBegan:[event allTouches] withEvent:event];
                    return;
                }
            }
            break;
        }
        case UITouchPhaseMoved: {
            if (_touchView) {
                [_touchView touchesMoved:[event allTouches] withEvent:event];
                return;
            }
            break;
        }
        case UITouchPhaseCancelled: {
            if (_touchView) {
                [_touchView touchesCancelled:[event allTouches] withEvent:event];
                _touchView = nil;
                return;
            }
            break;
        }
        case UITouchPhaseEnded: {
            if (_touchView) {
                [_touchView touchesEnded:[event allTouches] withEvent:event];
                _touchView = nil;
                return;
            }
            break;
        }
        default: {
            break;
        }
    }

    [super sendEvent:event];
}

@end
Step 2: Assign a ChunyuWindow instance to the AppDelegate instance.
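The answer doesn't show Step 2 in code; a minimal sketch, assuming the window is created in code and MyRootViewController is a placeholder, might be:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Use the ChunyuWindow subclass instead of a plain UIWindow.
    self.window = [[ChunyuWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.rootViewController = [[MyRootViewController alloc] init]; // placeholder controller
    [self.window makeKeyAndVisible];
    return YES;
}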
Step 3: Implement touchesEnded:withEvent: for the view with the buttons, for example:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];

    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:_buttonsView]; // a subview that contains the buttons
    for (UIButton *button in _buttons) {
        if (CGRectContainsPoint(button.frame, point)) {
            [self onTabButtonClicked:button];
            break;
        }
    }
}
Step 4: Call ChunyuWindow's addViewForTouchPriority when the view we care about appears, and call removeViewForTouchPriority when the view disappears or is deallocated (in viewDidAppear/viewDidDisappear/dealloc of the view controllers), so that _touchView in ChunyuWindow is nil and it behaves the same as UIWindow, with no side effects.
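For illustration, the Step 4 registration might look like this (the cast to ChunyuWindow and the _buttonsView ivar are assumptions):
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    ChunyuWindow *window = (ChunyuWindow *)[[UIApplication sharedApplication].delegate window];
    [window addViewForTouchPriority:_buttonsView];
}

- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
    ChunyuWindow *window = (ChunyuWindow *)[[UIApplication sharedApplication].delegate window];
    [window removeViewForTouchPriority:_buttonsView];
}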
An alternative solution that worked for me, based on the answer provided by Alexandar:
self.navigationController?.barHideOnTapGestureRecognizer.enabled = false
Instead of overriding the UIWindow, you can just disable the gesture recogniser responsible for the "slop area" on the UINavigationBar.
Here is a category version, following Bart Whiteley's answer. No need to subclass.
@implementation UINavigationBar (Xxxxxx)

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *v = [super hitTest:point withEvent:event];
    return v == self ? nil : v;
}

@end
The following worked for me:
self.navigationController?.isNavigationBarHidden = true

Is there a way to prevent the keyboard from dismissing?

I realize that this is the inverse of most posts, but I would like for the keyboard to remain up even if the 'keyboard down' button is pressed.
Specifically, I have a view with two UITextFields. With the following delegate method
- (BOOL)textFieldShouldReturn:(UITextField *)textField {
    return NO;
}
I am able to keep the keyboard up even if the user presses the Done button on the keyboard or taps anywhere else on the screen EXCEPT for that pesky keyboard down button on the bottom right of the keyboard.
I am using this view like a modal view (though the view is associated with a ViewController that gets pushed in a UINavigationController), so it really works best from a user perspective to keep the keyboard up all of the time. If anyone knows how to achieve this, please let me know! Thanks!
UPDATE: Still no solution! When Done is pressed, it triggers textFieldShouldReturn, but when the Dismiss button is pressed, it triggers textFieldDidEndEditing. I cannot block the textField from ending editing or it never goes away. Somehow, I really want to have a method that detects the Dismiss button and ignores it. If you know a way, please enlighten me!
There IS a way to do this. Because UIKeyboard subclasses UIWindow, the only thing big enough to get in UIKeyboard's way is another UIWindow.
- (void)viewDidLoad {
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(coverKey) name:UIKeyboardDidShowNotification object:nil];
    [super viewDidLoad];
}

- (void)coverKey {
    CGRect r = [[UIScreen mainScreen] bounds];
    UIWindow *myWindow = [[UIWindow alloc] initWithFrame:CGRectMake(r.size.width - 50, r.size.height - 50, 50, 50)];
    [myWindow setBackgroundColor:[UIColor clearColor]];
    [super.view addSubview:myWindow];
    [myWindow makeKeyAndVisible];
}
This works on iPhone apps. Haven't tried it with iPad. You may need to adjust the size of myWindow. Also, I didn't do any mem management on myWindow. So, consider doing that, too.
I think I've found a good solution.
Add a BOOL as an instance variable; let's call it shouldBeginCalledBeforeHand.
Then implement the following methods:
- (BOOL)textFieldShouldBeginEditing:(UITextField *)textField
{
    shouldBeginCalledBeforeHand = YES;
    return YES;
}

- (BOOL)textFieldShouldEndEditing:(UITextField *)textField
{
    return shouldBeginCalledBeforeHand;
}

- (void)textFieldDidBeginEditing:(UITextField *)textField
{
    shouldBeginCalledBeforeHand = NO;
}
As well as
- (BOOL)textFieldShouldReturn:(UITextField *)textField
{
    return NO;
}
to prevent the keyboard from disappearing with the return button. The trick is, a focus switch from one textfield to another will trigger a textFieldShouldBeginEditing beforehand. If the dismiss keyboard button is pressed this doesn't happen. The flag is reset after a textfield has gotten focus.
Old, not perfect solution
Listen for the UIKeyboardDidHideNotification notification and make the text field first responder again. This will move the keyboard out of sight and back again. You could keep a record of which text field was the last first responder by listening for UIKeyboardWillHideNotification, and put focus on it in the didHide handler.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(keyboardDidHide:)
                                             name:UIKeyboardDidHideNotification
                                           object:nil];
...
- (void)keyboardDidHide:(id)sender
{
    [myTextField becomeFirstResponder];
}
For iOS 9/10 and Swift 3, use this to create a view that overlaps the "hide keyboard" button:
override func viewDidLoad() {
    super.viewDidLoad()
    NotificationCenter.default.addObserver(self, selector: #selector(coverKey), name: .UIKeyboardDidShow, object: nil)
}

func coverKey() {
    if let keyboardWindow = UIApplication.shared.windows.last {
        let r = UIScreen.main.bounds
        let myWindow = UIWindow.init(frame: CGRect(x: r.size.width - 50, y: r.size.height - 50, width: 50, height: 50))
        myWindow.backgroundColor = UIColor.clear
        myWindow.isHidden = false
        keyboardWindow.addSubview(myWindow)
        keyboardWindow.bringSubview(toFront: myWindow)
    }
}
Notice that this adds a subview to the keyboard window instead of the main window.
Try adding a custom button on top of the keyboard's dismiss button so that the user won't be able to tap the dismiss button. I have used this method in one of my applications.
- (void)addButtonToKeyboard {
    // create custom button
    UIButton *blockButton = [UIButton buttonWithType:UIButtonTypeCustom];
    blockButton.frame = // set the frame here, I don't remember the exact frame
    [blockButton setImage:[UIImage imageNamed:@"block_button.png"] forState:UIControlStateNormal];

    // locate keyboard view
    UIWindow *appWindows = [[[UIApplication sharedApplication] windows] objectAtIndex:1];
    UIView *keyboard;
    for (int i = 0; i < [appWindows.subviews count]; i++) {
        keyboard = [appWindows.subviews objectAtIndex:i];
        // keyboard found, add the button
        if ([[keyboard description] hasPrefix:@"<UIPeripheralHost"] == YES && [self.textField isFirstResponder]) {
            [keyboard addSubview:blockButton];
        }
    }
}
Try this...
- (BOOL)textFieldShouldEndEditing:(UITextField *)textField {
    return NO;
}
You can use notification as mentioned by Nick Weaver.

How to disable multitouch?

My app has several buttons which trigger different events. The user should NOT be able to hold down several buttons. Anyhow, holding down several buttons crashes the app.
And so, I'm trying to disable multi-touch in my app.
I've unchecked 'Multiple Touch' in all the xib files, and as far as I can work out, the properties 'multipleTouchEnabled' and 'exclusiveTouch' control whether the view uses multitouch. So in my applicationDidFinishLaunching I've put this:
self.mainViewController.view.multipleTouchEnabled = NO;
self.mainViewController.view.exclusiveTouch = YES;
And in each of my view controllers I've put this in the viewDidLoad
self.view.multipleTouchEnabled = NO;
self.view.exclusiveTouch = YES;
However, it still accepts multiple touches. I could do something like disable other buttons after getting a touch down event, but this would be an ugly hack. Surely there is a way to properly disable multi-touch?
If you want only one button to respond to touches at a time, you need to set exclusiveTouch for that button, rather than for the parent view. Alternatively, you could disable the other buttons when a button gets the "Touch Down" event.
Here's an example of the latter, which worked better in my testing. Setting exclusiveTouch for the buttons kind-of worked, but led to some interesting problems when you moved your finger off the edge of a button, rather than just clicking it.
You need to have outlets in your controller hooked up to each button, and have the "Touch Down", "Touch Up Inside", and "Touch Up Outside" events hooked to the proper methods in your controller.
#import "multibuttonsViewController.h"
#implementation multibuttonsViewController
// hook this up to "Touch Down" for each button
- (IBAction) pressed: (id) sender
{
if (sender == one)
{
two.enabled = false;
three.enabled = false;
[label setText: #"One"]; // or whatever you want to do
}
else if (sender == two)
{
one.enabled = false;
three.enabled = false;
[label setText: #"Two"]; // or whatever you want to do
}
else
{
one.enabled = false;
two.enabled = false;
[label setText: #"Three"]; // or whatever you want to do
}
}
// hook this up to "Touch Up Inside" and "Touch Up Outside"
- (IBAction) released: (id) sender
{
one.enabled = true;
two.enabled = true;
three.enabled = true;
}
#end
- (void)viewDidLoad {
    [super viewDidLoad];
    for (UIView *v in self.view.subviews)
    {
        if ([v isKindOfClass:[UIButton class]])
        {
            UIButton *btn = (UIButton *)v;
            [btn setExclusiveTouch:YES];
        }
    }
}
This code is tested and works perfectly for me. There is no app crash when pressing more than one button at a time.
Your app crashes for a reason. Investigate further, use the debugger, see what's wrong instead of trying to hide the bug.
Edit:
OK, ok, I have to admit I was a bit harsh. You have to set the exclusiveTouch property on each button. That's all. The multipleTouchEnabled property is irrelevant.
To disable multitouch in Swift:
You first need to have an outlet for every button, and then just set its exclusive touch to true. Therefore your viewDidLoad() would have:
yourButton.exclusiveTouch = true
// not really necessary, but you could also add:
self.view.multipleTouchEnabled = false
If you want to disable multi-touch throughout the application and don't want to write code for each button, you can simply use the appearance proxy for UIButton. Write the line below in didFinishLaunchingWithOptions:
UIButton.appearance().isExclusiveTouch = true
That's great!! UIAppearance.
You can even use it for any UIView class. So if you want to disable multi-touch for only a few buttons, make a custom button subclass and then:
CustomButton.appearance().isExclusiveTouch = true
There is one more advantage which can help you: in case you want to disable multi-touch for buttons in a particular view controller only:
UIButton.appearance(whenContainedInInstancesOf: [ViewController2.self]).isExclusiveTouch = true
Based on neoevoke's answer, only improving it a bit so that it also checks subviews' children, I created this function and added it to my utils file:
// Set exclusive touch to all children
+ (void)setExclusiveTouchToChildrenOf:(NSArray *)subviews
{
    for (UIView *v in subviews) {
        [self setExclusiveTouchToChildrenOf:v.subviews];
        if ([v isKindOfClass:[UIButton class]]) {
            UIButton *btn = (UIButton *)v;
            [btn setExclusiveTouch:YES];
        }
    }
}
Then, a simple call to:
[Utils setExclusiveTouchToChildrenOf:self.view.subviews];
... will do the trick.
This is an issue quite often reported by our testers. One of the approaches I sometimes use, although it should be used consciously, is to create a category on UIView, like this one:
@implementation UIView (ExclusiveTouch)

- (BOOL)isExclusiveTouch
{
    return YES;
}

@end
Pretty simple: you can make use of the exclusiveTouch property in this case:
[youBtn setExclusiveTouch:YES];
This is a Boolean value that indicates whether the receiver handles touch events exclusively.
Setting this property to YES causes the receiver to block the delivery of touch events to other views in the same window. The default value of this property is NO.
For disabling global multitouch in Xamarin.iOS
Copy&Paste the code below:
[DllImport(ObjCRuntime.Constants.ObjectiveCLibrary, EntryPoint = "objc_msgSend")]
internal extern static IntPtr IntPtr_objc_msgSend(IntPtr receiver, IntPtr selector, bool isExclusiveTouch);

static void SetExclusiveTouch(bool isExclusiveTouch)
{
    var selector = new ObjCRuntime.Selector("setExclusiveTouch:");
    IntPtr_objc_msgSend(UIView.Appearance.Handle, selector.Handle, isExclusiveTouch);
}
And set it on AppDelegate:
public override bool FinishedLaunching(UIApplication app, NSDictionary options)
{
    ...
    SetExclusiveTouch(true); // setting exclusive to true disables multitouch
    ...
}
My experience is that, by default, a new project doesn't even allow multitouch; you have to turn it on. But I suppose that depends on how you got started. Did you use a multitouch example as a template?
First of all, are you absolutely sure multitouch is on? It's possible to generate single touches in sequence pretty quickly. Multitouch is more about what you do with two or more fingers once they are on the surface. Perhaps you have single touch on but aren't correctly dealing with what happens if two buttons are pressed at nearly the same time.
I've just had exactly this problem.
The solution we came up with was simply to inherit a new class from UIButton that overrides the initWithCoder method, and use that where we needed one button push at a time (i.e. everywhere):
@implementation ExclusiveButton

- (id)initWithCoder:(NSCoder *)decoder
{
    self = [super initWithCoder:decoder];
    if (self) {
        [self setExclusiveTouch:YES];
    }
    return self;
}

@end
Note that this only works with buttons loaded from nib files.
I created a UIView category and added these two methods. When I want to disable multi-touch for a view, I just call [view makeExclusiveTouch];
- (void)makeExclusiveTouchForViews:(NSArray *)views {
    for (UIView *view in views) {
        [view makeExclusiveTouch];
    }
}

- (void)makeExclusiveTouch {
    self.multipleTouchEnabled = NO;
    self.exclusiveTouch = YES;
    [self makeExclusiveTouchForViews:self.subviews];
}
If you want to disable multitouch programmatically, or if you are using cocos2d (no multipleTouchEnabled option), you can use the following code on your ccTouches delegate:
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSSet *multiTouch = [event allTouches];
    if ([multiTouch count] > 1) {
        // more than one finger down: ignore the event
        return YES;
    }
    else {
        // ...the rest of your code
        return YES;
    }
}
Disable all the buttons on the view in the "Touch Down" event and enable them in the "Touch Up Inside" event.
For example:
- (void)handleTouchDown {
    for (UIButton *btn in views) { // 'views' holds your buttons
        btn.enabled = NO;
    }
}

- (void)handleTouchUpInside {
    for (UIButton *btn in views) {
        btn.enabled = YES;
    }
    // ...
}
I solved this problem this way:
NSTimeInterval intervalButtonPressed;

- (IBAction)buttonPicturePressed:(id)sender {
    if (([[NSDate date] timeIntervalSince1970] - intervalButtonPressed) > 0.1f) {
        intervalButtonPressed = [[NSDate date] timeIntervalSince1970];
        // your code for the button
    }
}
I had struggled with some odd cases when dragging objects around a view, where if you touched another object at the same time it would fire the touchesBegan method. My work-around was to disable user interaction for the parent view until touchesEnded or touchesCancelled is called.
override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    // whatever setup you need
    self.view.userInteractionEnabled = false
}

override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
    // whatever setup you need
    self.view.userInteractionEnabled = true
}

override func touchesCancelled(touches: Set<UITouch>?, withEvent event: UIEvent?) {
    // whatever setup you need
    self.view.userInteractionEnabled = true
}
A Gotcha:
If you are using isExclusiveTouch, be aware that overriding point(inside:) on the button can interfere, effectively making isExclusiveTouch useless.
(Sometimes you need to override point(inside:) for handling the "button not responsive at bottom of iPhone screen" bug/misfeature, which is caused by Apple installing swipe gesture recognizers at the bottom of the screen, interfering with button highlighting.)
See: UIButton fails to properly register touch in bottom region of iPhone screen
Just setting the relevant UIViews' exclusiveTouch property to true does the trick.