I have a UIScrollView that is set to have a clear background. Part of the scroll view has content, but part does not (so it shows the other views behind it). I would like to be able to tap through the UIScrollView to the MKMapView behind it, but only for the transparent portion of the UIScrollView.
I have found some code, but I am having a real hard time understanding how to get it working:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (![self yourMethodThatDeterminesInterestingTouches:touches withEvent:event])
        [self.nextResponder touchesBegan:touches withEvent:event];
}
Could someone help me wrap my mind around how to forward a touch event to a view that is behind another view? Can I call - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event from a UIViewController?
What we did was subclass UIScrollView and implement logic that passes responsibility down to the views under it if the touch happens inside the transparent area.
In our case the transparent area is defined by a contentOffset of 120 on the Y axis, meaning our content starts 120 points below the top of the UIScrollView, and the code looks like this:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    if (self.contentOffset.y < 0 && point.y < 0.0) {
        return NO;
    } else {
        return YES;
    }
}
Obviously this response is well past its prime but hopefully this is helpful to anyone searching for a solution.
Basically, it's up to you to determine what touch events you care to forward to another responder. If you simply want to forward all touch events, just remove that if statement in the code you posted so the next responder will receive all the touch events.
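An alternative to forwarding touches manually is to opt out of hit-testing altogether. Below is a minimal sketch (not from either answer above) of a UIScrollView subclass that returns nil from hitTest: for points in its transparent region, so UIKit keeps looking behind the scroll view and the MKMapView gets the touch. The class name and the 120-point threshold are assumptions for illustration; adapt the geometry check to wherever your content actually starts.

@interface PassThroughScrollView : UIScrollView
@end

@implementation PassThroughScrollView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hit = [super hitTest:point withEvent:event];
    // point is in the scroll view's bounds (content) coordinate space.
    // Anything above the first 120 points of content is treated as transparent.
    if (hit == self && point.y < 120.0) {
        return nil; // let UIKit hit-test the views behind the scroll view
    }
    return hit;
}

@end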
I want to disable touches on all areas of the screen apart from a specific few points (e.g. buttons), i.e. I don't want touchesBegan to trigger at all when I tap anything other than a button. Calling
self.view.userInteractionEnabled = NO;
has the desired effect for not registering touches, but then of course I can't tap any buttons. I basically want the button to still work, even if there are 5 points touching the screen, i.e. all touch inputs have been used up, and the button represents the 6th.
Is this possible?
I've tried inserting a view with userInteraction disabled below my buttons, but it still registers touches when the user taps the screen. It seems the only way to disable touch registering is to do so on the entire screen (on the parent UIView).
UPDATE:
I've tried using gesture recognizers to handle all touch events, and ignore those that don't qualify. Here is my code:
@interface ViewController : UIViewController <UIGestureRecognizerDelegate>
...
- (void)viewDidLoad
{
    [super viewDidLoad];

    UIGestureRecognizer *allRecognizer = [[UIGestureRecognizer alloc] initWithTarget:self action:nil];
    allRecognizer.delegate = self;
    [self.view addGestureRecognizer:allRecognizer];
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    CGPoint coords = [touch locationInView:self.view];
    NSLog(@"Coords: %g, %g", coords.x, coords.y);

    if (coords.y < 200) {
        [self ignoreTouch:touch forEvent:nil];
        return TRUE;
    }
    return FALSE;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"%i touch(es)", [touches count]);
}
However the screen still 'reads' the touches, so if I place 5 fingers down, the 6th one won't trigger a button press...
You need to set up an invisible UIButton and place it between the view that should not register touches and the UIButtons that should still be active.
Now you need to set the invisible button's userInteractionEnabled:
// userInteractionEnabled == NO  => self.view registers touches
// userInteractionEnabled == YES => self.view doesn't register touches
[_invisibleButton setUserInteractionEnabled:NO];
What really matters in this solution is that both the invisible and the visible buttons are direct subviews of the VC's view.
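For reference, here is a rough sketch of that setup, assuming both buttons are held by the view controller; apart from _invisibleButton, the names and frames are made up for illustration:

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Invisible button covering the area whose touches should be controlled.
    _invisibleButton = [UIButton buttonWithType:UIButtonTypeCustom];
    _invisibleButton.frame = self.view.bounds;
    _invisibleButton.backgroundColor = [UIColor clearColor];
    [self.view addSubview:_invisibleButton];

    // The button that must keep working sits above it, as a direct sibling.
    _visibleButton = [UIButton buttonWithType:UIButtonTypeSystem];
    _visibleButton.frame = CGRectMake(20.0, 300.0, 280.0, 44.0);
    [_visibleButton setTitle:@"Tap me" forState:UIControlStateNormal];
    [self.view addSubview:_visibleButton];

    // NO  => self.view still registers touches in that area
    // YES => the invisible button swallows them
    [_invisibleButton setUserInteractionEnabled:NO];
}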
You can download an example project from my dropbox:
https://dl.dropboxusercontent.com/u/99449487/DontTapThat.zip
However, this example just prevents the handling of certain touches. Completely ignoring input isn't technically possible: third-party apps are not responsible for detecting input, only for handling it. The detection of touch input is done by iOS.
The only way to build a case like the one you describe in the comments is to hope that iOS won't interpret the case's contact as a "finger", because it will most likely cover an area much bigger than a finger.
So, in conclusion, the best option would be to change the material of the case you're about to build, or at least give it a non-conductive coating. From a third-party developer's point of view, there is no way to achieve your goal in software if five fingers are already down, as described in the comments.
There are a couple of methods in UIView that you can override:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event; // recursively calls -pointInside:withEvent:. point is in the receiver's coordinate system
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event; // default returns YES if point is in bounds
Overriding these should prevent touchesBegan: and the other touch methods from being called.
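As a sketch of how that could look for the "only buttons should respond" case, here is a pointInside: override that claims the touch only when it lands on one of the buttons. The class and property names are made up; the idea is just to report "not mine" everywhere else so no touches begin there.

@interface TouchFilterView : UIView
@property (nonatomic, copy) NSArray *tappableViews; // the buttons that should stay active
@end

@implementation TouchFilterView

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    for (UIView *v in self.tappableViews) {
        // Convert into each button's coordinate space before testing.
        if ([v pointInside:[self convertPoint:point toView:v] withEvent:event]) {
            return YES;
        }
    }
    // Everywhere else this view refuses the touch, so nothing begins here.
    return NO;
}

@end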
Regards.
I have a nice solution for this. Whatever the area where you want to block interaction, place a transparent button on top of it.
touchesBegan is a default method, so it will be called every time a touch happens on the view; there is no way around that. But you can still set
self.buttonPlayMusic.userInteractionEnabled = NO;
for any object that doesn't need touches; maybe that will help you get your desired output.
Have you tried using a UIScrollView as the background, i.e. for the area where you do not want touch events to be fired?
UIScrollView does not call the touch methods.
You can add a UIImageView over the area where you want to disable touch events; add the UIImageView object on top of self.view's subviews.
Example
// area -- the area where you want to disable touches on self.view
UIImageView *imageView = [[UIImageView alloc] initWithFrame:area];
[self.view addSubview:imageView];
Your touch delegate will still be called this way, but if you are doing some task on touch, you can handle it like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    UIButton *touchObject = (UIButton *)[touch view];
    if ([touchObject isKindOfClass:[UIButton class]])
    {
        // Do whatever you want on button touch
    }
    else
    {
        return;
    }
}
My problem is quite strange but simple.
I subclassed my custom UIScrollView, MyScrollView, where I disabled scrolling:
self.scrollEnabled = NO;
that means that, apart from
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
all other UIScrollViewDelegate methods won't be called.
In MyScrollView I scroll the content myself by tracking the user's touch movement on screen, that is to say no flicking and no bouncing; my implementation is in the touchesMoved:withEvent: method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // NSLog(@"touch.phase: %d", touch.phase);
    CGPoint currentPosition = [touch locationInView:self];
    CGPoint lastTouchPosition = [touch previousLocationInView:self];
    CGFloat deltaY = lastTouchPosition.y - currentPosition.y;
    CGFloat contentYOffset = self.contentOffset.y + deltaY;
    [self setContentOffset:CGPointMake(0, contentYOffset) animated:NO];
}
After the user's drag has finished, I run my own method in touchesEnded:withEvent:, based on the content offset of MyScrollView:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // some implementation according to current content offset
}
When the user moves a finger on the screen and then lifts it off the surface, touchesEnded:withEvent: gets called and my own method runs correctly.
BUT when the user drags a finger from inside the screen to outside it, past the top or bottom edge, and then lifts it, touchesEnded:withEvent: never gets called. It seems iOS doesn't treat moving out of bounds (top or bottom) as a touches-ended event, and it doesn't know what is going on once the touch is outside the screen bounds.
Someone may suggest checking the current position in touchesMoved:withEvent: to see whether it is still in bounds. This MAY WORK WHEN THE MOVEMENT IS VERY SLOW, but when you move very fast the system does not report every point; the movement seems to be sampled at a certain time interval.
Can anyone help me figure out how to detect whether the user's finger has moved out of bounds?
I think the touchesCancelled:withEvent: method will be called!
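If that's the case, one option is to route the cancel through the same code path; a minimal sketch, assuming your handling lives in the touchesEnded: override shown above:

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesCancelled:touches withEvent:event];
    // Treat a cancelled touch (e.g. the finger dragged off the screen edge)
    // the same as a finished one, so the content-offset handling still runs.
    [self touchesEnded:touches withEvent:event];
}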
I have resolved this problem.
Because UIScrollView does so much work internally, we cannot handle some of the events ourselves; fortunately, a plain UIView can detect the touch moving out of bounds and will still invoke touchesEnded:withEvent:.
Considering that replacing MyScrollView's superclass with UIView would be too much work, I figured out a simpler way:
I added a TouchActionDetectView, subclassed from UIView, whose job is to detect all user touch events and deliver them to MyScrollView. Of course, I have to clear TouchActionDetectView's background color so it doesn't block other view content.
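Something along these lines; this is a sketch based on the description above, and the property name and exact forwarding are assumptions rather than the actual code:

@interface TouchActionDetectView : UIView
@property (nonatomic, weak) UIScrollView *scrollTarget; // the MyScrollView instance
@end

@implementation TouchActionDetectView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.scrollTarget touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.scrollTarget touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Unlike the scroll view, a plain UIView still gets this call when the
    // finger ends up outside the screen bounds.
    [self.scrollTarget touchesEnded:touches withEvent:event];
}

@end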
I have several UIImageViews as subviews of a UIView that acts as a canvas. The image views receive touch events, so I can move them.
If I pan two views, one finger on each of them, I can move two views at the same time. I don't want that.
I checked the maximumNumberOfTouches property on the superview, but it only affects that view; it doesn't prevent each of the other subviews from receiving touch events.
Any ideas how to avoid this behavior?
Thanks
How about just setting a flag when an image view is already moving, and having the pan gesture on every view check that flag before actually moving its view? Just set the flag back to false once the view has stopped moving.
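A rough sketch of that flag, assuming all the image views share the same pan target and action method; _viewBeingMoved and handlePan: are made-up names for illustration:

// _viewBeingMoved is an ivar (or property) on the shared gesture target.
- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    if (pan.state == UIGestureRecognizerStateBegan) {
        if (_viewBeingMoved != nil && _viewBeingMoved != pan.view) {
            // Another image view is already being dragged: cancel this pan.
            pan.enabled = NO;
            pan.enabled = YES;
            return;
        }
        _viewBeingMoved = pan.view;
    }

    // ... move pan.view by the translation here ...

    if (pan.state == UIGestureRecognizerStateEnded ||
        pan.state == UIGestureRecognizerStateCancelled) {
        _viewBeingMoved = nil;
    }
}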
You may do it like this: extend UIImageView as a custom view, say YourImageView, and override the touch events:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Disable interaction on every sibling while this view is being touched.
    for (UIView *v in self.superview.subviews)
    {
        if (v != self)
        {
            v.userInteractionEnabled = NO;
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Re-enable interaction on all siblings when the touch ends.
    for (UIView *v in self.superview.subviews)
    {
        v.userInteractionEnabled = YES;
    }
}
done!
I am already iterating over the array of UIGestureRecognizers in the scroll view and setting maximumNumberOfTouches to 2, to allow both one- and two-finger swipe/drag gestures in my custom scroll view.
What I want is to tell a one-finger drag apart from a two-finger drag. Is there a way to achieve this?
The -(NSUInteger)numberOfTouches method of UIGestureRecognizer tells you how many touches it currently has.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSUInteger count = [[event allTouches] count]; // this should be 1 or 2
}
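If you are working with the scroll view's pan gesture recognizer rather than raw touches, the same check could look like this; the handler name is illustrative only:

- (void)handleDrag:(UIPanGestureRecognizer *)pan
{
    if (pan.state == UIGestureRecognizerStateChanged) {
        if ([pan numberOfTouches] >= 2) {
            // two-finger drag
        } else {
            // one-finger drag
        }
    }
}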
I'm making a small game for iPhone in OpenGL.
First I removed the status bar by writing
[[UIApplication sharedApplication] setStatusBarHidden:YES];
Which worked, but it only removed the status bar once my app began to run. Then I modified my project.plist:
<key>UIStatusBarHidden</key>
<true/>
And now the status bar is never shown, just as I wanted. The problem is that I can read touches fine in any portion of the screen except the zone where the status bar used to be.
// This method deals with events when one or more fingers touch the screen
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [myProject newTouch:touches withEvent:event];
    [self.nextResponder touchesEnded:touches withEvent:event];
}

// This method deals with events when one or more fingers move while touching the screen
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [myProject movingTouch:touches withEvent:event];
}

// This method deals with events when one or more fingers stop touching the screen
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [myProject oldTouchEnded:touches withEvent:event];
}

// This method deals with events when the system interrupts the touches (for example, an incoming call)
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    //
}
I guess that hiding the bar is not enough and it must be removed entirely, but how can I do that? Or is there another solution?
What's the size of the view you're reading touches in? Sometimes people hide the status bar but forget to resize their view to cover the appropriate area. The complete screen is 320x480; make sure your height is the full 480px, not 460 or smaller.
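For example, a quick check along these lines; glView here stands in for whatever view you actually render into, so this is a sketch rather than a drop-in fix:

// [[UIScreen mainScreen] applicationFrame] excludes the status bar area,
// so with a hidden bar use bounds to get the full 320x480.
CGRect fullScreen = [[UIScreen mainScreen] bounds];
glView.frame = fullScreen;
NSLog(@"glView frame: %@", NSStringFromCGRect(glView.frame)); // expect height 480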
There is a bug in the simulator: it doesn't register touches where the status bar is (or would be). It works properly on the device, though.
Are you testing on the simulator or on the device?