UIScrollView touches vs subview touches - iPhone

Please can someone help sort a noob out? I've posted this problem on various forums and received no answers, while many searches for other answers have turned up Stack Overflow, so I'm hoping this is the place.
I've got a BeachView.h (subclass of UIScrollView, picture of a sandy beach) covered with a random number of Stone.h (subclass of UIImageView, a random PNG of a stone, userInteractionEnabled = YES to accept touches).
If the user touches and moves on the beach, it should scroll.
If the user taps a stone, it should call method "touchedStone".
If the user taps the beach where there is no stone, it should call method "touchedBeach".
Now, I realize this sounds dead simple. Everyone and everything tells me that if there's something on a UIScrollView that accepts touches, the scroll view should pass control on to it. So when I touch and drag, it should scroll; but if I tap, and the tap is on a stone, it should ignore beach taps and accept stone taps, yes?
However, it seems that both views are accepting the tap and calling both touchedStone AND touchedBeach. Furthermore, the beach tap occurs first, so I can't even put in a "if touchedStone then don't run touchedBeach" type flag.
Here's some code.
On BeachView.m
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (self.decelerating) { didScroll = YES; }
    else { didScroll = NO; }
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:touch.view];
    NSLog(@"touched beach = %@", [touch view]);
    lastTouch = touchLocation;
    [super touchesBegan:touches withEvent:event];
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    didScroll = YES;
    [super touchesMoved:touches withEvent:event];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (didScroll == NO && isPaused == NO) {
        [self touchedBeach:YES location:lastTouch];
    }
    [super touchesEnded:touches withEvent:event];
}
On Stone.m
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [parent stoneWasTouched]; // parent = ivar pointing from stone to beachview
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:touch.view];
    NSLog(@"touched stone = %@", [touch view]);
    [parent touchedStone:YES location:touchLocation];
}
After a stone tap, my NSLog output looks like this:
Touched beach = <BeachView: 0x1276a0>
ran touchedBeach
Touched Stone = <Stone: 0x1480c0>
ran touchedStone
So it's actually running both. What's even stranger is that if I take touchesBegan and touchesEnded out of Stone.m but leave userInteractionEnabled = YES, the BeachView registers both touches itself, but returns the Stone as the view it touched (the second time).
Touched beach = <BeachView: 0x1276a0>
ran touchedBeach
Touched beach = <Stone: 0x1480c0>
ran touchedBeach
So PLEASE, I've been trying to sort this for days. How do I make it so a tapped stone calls only touchedStone and a tapped beach calls only touchedBeach? Where am I going wrong?

It's true: from iPhone SDK 3.0 onwards, touches are no longer passed to the -touchesBegan: and -touchesEnded: methods of **UIScrollView** subclasses. You can use the touchesShouldBegin:withEvent:inContentView: and touchesShouldCancelInContentView: methods, but they are not the same thing.
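For reference, those overrides look roughly like this in a UIScrollView subclass (a sketch only, assuming the Stone class from the question; touchesShouldBegin:withEvent:inContentView: and touchesShouldCancelInContentView: are the documented UIScrollView hooks):
- (BOOL)touchesShouldBegin:(NSSet *)touches withEvent:(UIEvent *)event inContentView:(UIView *)view {
    // Let a touched Stone begin receiving the touches.
    if ([view isKindOfClass:[Stone class]]) {
        return YES;
    }
    return [super touchesShouldBegin:touches withEvent:event inContentView:view];
}
- (BOOL)touchesShouldCancelInContentView:(UIView *)view {
    // Don't cancel a Stone's touches when the scroll view starts to scroll.
    return ![view isKindOfClass:[Stone class]];
}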
If you really want to get those touches, there is a hack that allows it.
In your subclass of UIScrollView override the hitTest method like this:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *result = nil;
    for (UIView *child in self.subviews)
        if ([child pointInside:point withEvent:event])
            if ((result = [child hitTest:point withEvent:event]) != nil)
                break;
    return result;
}
This will pass those touches to your subclass; however, you can't cancel the touches for the UIScrollView superclass.

Prior to iPhone OS 3.0, UIScrollView's hitTest:withEvent: method always returned self so that it received the UIEvent directly, only forwarding it to the appropriate subview if and when it determined the touch wasn't related to scrolling or zooming.
I couldn't really comment on iPhone OS 3.0 as it's under NDA, but check your "iPhone SDK Release notes for iPhone OS 3.0 beta 5" :)
If you need to target pre-3.0, you could override hitTest:withEvent: in BeachView and set a flag to ignore the next beach touch if the CGPoint is actually in a stone.
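A rough sketch of that flag idea (ignoreNextBeachTouch is a hypothetical BOOL ivar, not part of the original code, that touchedBeach: would check and clear):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *child in self.subviews) {
        // If the point lands on a Stone, flag the next beach touch to be ignored.
        if ([child isKindOfClass:[Stone class]] &&
            [child pointInside:[self convertPoint:point toView:child] withEvent:event]) {
            ignoreNextBeachTouch = YES;
            break;
        }
    }
    return [super hitTest:point withEvent:event];
}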
But have you tried simply moving your calls to [super touches*:withEvent:] from the end of your overridden methods to the start? This might cause the stone tap to occur first.

I had a similar problem with a simpleView added to a scrollView: whenever I touched the simpleView, the scrollView got the touch, and instead of the simpleView, the scrollView moved. To avoid this, I disable scrolling on the scrollView when the user touches the simpleView, and enable it otherwise.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *result = [super hitTest:point withEvent:event];
    if (result == simpleView)
    {
        scrollView.scrollEnabled = NO;
    }
    else
    {
        scrollView.scrollEnabled = YES;
    }
    return result;
}

This could be related to a bug in iOS 7. Please review my issue (bug report submitted):
UIScrollView subclass has changed behavior in iOS7

Related

To find what object is underneath our touch when the touch moves

How do I determine the property, or rather the object, underneath the object I am hovering or dragging?
To put my question clearly: let's say I am hovering a UIView; I want to find out what (object or view) is underneath the view I am hovering.
In your custom view you can override the touchesEnded method. This sample code may help with your custom view hit-test problem.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        CGPoint point = [touch locationInView:custom_view];
        if (CGRectContainsPoint(custom_view.bounds, point)) {
            // the touch hit custom_view
        }
    }
    [super touchesEnded:touches withEvent:event];
}
For one, if you know the frames of both objects you can use CGRectIntersectsRect.
if (CGRectIntersectsRect(topObjectsRect, bottomObjectsRect)) {
//
}
Additionally, you could get the point that was touched and then use the following to check if that point is in a certain rectangle.
if (CGRectContainsPoint(CGRectMake(someX, someY, someWidth, someHeight), pointOfTouch))
{
//
}

Passing touch to next responder or other sub-view iOS

I am sure that this problem will be easily resolved; however, I am relatively new to iOS development. I am trying to handle passing touch events to children that are lower in the draw order on a UIView. For example:
I extended UIImageView to create my MoveableImage class. This class is basically a UIImageView that implements touchesBegan, touchesEnded and touchesMoved:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self showFrame];
    // if multitouch, don't move
    if ([[event allTouches] count] > 1)
    {
        return;
    }
    UITouch *touch = [[event touchesForView:self] anyObject];
    // Animate the first touch
    CGPoint colorPoint = [touch locationInView:self];
    CGPoint touchPoint = [touch locationInView:self.superview];
    // if the color has an alpha of 0, they are touching the frame; bubble to the next responder
    UIColor *color = [self colorOfPoint:colorPoint];
    [color getRed:NULL green:NULL blue:NULL alpha:&touchBeganAlpha];
    NSLog(@"alpha : %f", touchBeganAlpha);
    if (touchBeganAlpha > 0)
    {
        [self animateFirstTouchAtPoint:touchPoint];
    }
    else {
        [super.nextResponder touchesBegan:touches withEvent:event];
    }
}
So the end result is basically this: if they are touching the frame of the imageView and not the image inside, the other image that is underneath can respond. See this image for an example.
So far I have tried nextResponder; however, that does not solve the problem. Any help would be greatly appreciated!
Resolved: I stopped checking the alpha in touchesBegan and touchesMoved. Overriding pointInside allowed the UIView to handle that for me.
-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    BOOL superResult = [super pointInside:point withEvent:event];
    if (!superResult)
    {
        return superResult;
    }
    if (CGPointEqualToPoint(point, self.previousTouchPoint))
    {
        return self.previousTouchHitTestResponse;
    } else {
        self.previousTouchPoint = point;
    }
    BOOL response = NO;
    // if image is nil then return YES and fall back to super
    if (self.image == nil)
    {
        response = YES;
    }
    else
    {
        response = [self isAlphaVisibleAtPoint:point];
    }
    self.previousTouchHitTestResponse = response;
    return response;
}
Instead, you can override the - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event method in your subclass of UIImageView. (It is a UIView method that any subclass can override.)
The UIView uses this method in hitTest:withEvent: to determine which subview should receive a touch event. If pointInside:withEvent: returns YES, then the subview’s hierarchy is traversed; otherwise, its branch of the view hierarchy is ignored.
Check the source code of OBShapedButton on GitHub. It handles the tap event only for the opaque part of a button.
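For illustration, the per-pixel check behind that idea can be sketched like this (a hypothetical helper, not OBShapedButton's actual code; it assumes QuartzCore is imported so renderInContext: is available):
- (BOOL)isAlphaVisibleAtPoint:(CGPoint)point {
    // Render just the pixel under `point` into a 1x1 RGBA bitmap and read its alpha.
    unsigned char pixel[4] = {0, 0, 0, 0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
        kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    // Shift the context so the requested point lands on the single pixel we allocated.
    CGContextTranslateCTM(context, -point.x, -point.y);
    [self.layer renderInContext:context];
    CGContextRelease(context);
    return pixel[3] > 0; // alpha channel of the sampled pixel
}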

iOS UIScrollView with 2 finger pan for paging and one finger pan for "fingerpointer"

I spent quite some time trying to figure out how to achieve what I want, but haven't found a proper solution yet. I have a UIScrollView where I changed the panGestureRecognizer from one- to two-finger recognition, so paging only works when two fingers are used. Now I want to add an additional panGestureRecognizer that shows a cursor if I'm panning with one finger. I tried just adding an additional panGestureRecognizer to the UIScrollView, but then the app crashes immediately. So I thought of adding a transparent subview positioned above the UIScrollView and delegating the two-finger gestures to the UIScrollView with something like resignFirstResponder. I also thought of overriding the panGestureRecognizer of the UIScrollView and letting it add a subview where my "fingerpointer" (a little point centered where I'm touching the screen right now) is located. I'm totally clueless which way I should go and how to implement it. Your help is greatly appreciated! Thanks a lot!
Timo
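(Aside: the two-finger paging setup described in the question is usually achieved like this; a minimal sketch assuming iOS 5+, where UIScrollView exposes its panGestureRecognizer, and a scroll view instance named scrollView:)
// Restrict the scroll view's built-in pan recognizer to two-finger pans only.
scrollView.pagingEnabled = YES;
scrollView.panGestureRecognizer.minimumNumberOfTouchesRequired = 2;
scrollView.panGestureRecognizer.maximumNumberOfTouchesRequired = 2;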
Ok, this is the second time editing my response. This might do the trick for you.
If you extend UIScrollView you can override these methods in this way:
// In your .h file
BOOL cursorShown;

// In your .m file
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 1)
    {
        cursorShown = YES;
        CGPoint touchLocation = [[touches anyObject] locationInView:self.superview];
        // Add your cursor to the parent view here and set its location to touchLocation
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (cursorShown == YES)
    {
        CGPoint touchLocation = [[touches anyObject] locationInView:self.superview];
        // Move your cursor's location to touchLocation
    }
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (cursorShown == YES)
    {
        cursorShown = NO;
        // Destroy your cursor
    }
}
-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (cursorShown == YES)
    {
        cursorShown = NO;
        // Destroy your cursor
    }
}

iPhone - ignoring the second touch on an area

I have a view that users are allowed to finger paint on. The code works perfectly if the area is touched with one finger. For example: I touch it with one finger and move the finger; a line is drawn as I move the first finger. If I touch the same view with a second finger, the line that was being drawn by the first finger stops.
I would like to ignore any touch beyond the first, i.e., to track the first touch but ignore all others to the same view.
I am using touchesBegan/moved/ended.
I have used this to detect the touches
UITouch *touch = [[event allTouches] anyObject];
lastPoint = [touch locationInView:myView];
I have also tried this
lastPoint = [[touches anyObject] locationInView:myView];
but nothing changed.
How do I do that - track the first touch and ignore any subsequent touch to a view?
thanks.
NOTE: the view is NOT adjusted to detect multiple touches.
A given touch will maintain the same memory address as long as it is in contact with the screen. This means you can save the address as an instance variable and ignore any events from other objects. However, do not retain the touch. If you do, a different address will be used and your code won't work.
Example:
Add currentTouch to your interface:
@interface MyView : UIView {
    UITouch *currentTouch;
    ...
}
...
@end
Modify touchesBegan: to ignore the touch if one is already being tracked:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (currentTouch) return;
    currentTouch = [touches anyObject];
    ...
}
Modify touchesMoved: to use currentTouch instead of getting a touch from the set:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!currentTouch) return;
    CGPoint currentPoint = [currentTouch locationInView:myView];
    ...
}
Modify touchesEnded: and touchesCancelled: to clear currentTouch, but only if currentTouch has ended or been cancelled.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (currentTouch && currentTouch.phase == UITouchPhaseEnded) {
        ...
        currentTouch = nil;
    }
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    if (currentTouch && currentTouch.phase == UITouchPhaseCancelled) {
        ...
        currentTouch = nil;
    }
}
yourView.multipleTouchEnabled = NO;
From the reference documents on UIView
multipleTouchEnabled
A Boolean value that indicates whether the receiver handles multitouch events.
@property(nonatomic, getter=isMultipleTouchEnabled) BOOL multipleTouchEnabled
Discussion
When set to YES, the receiver receives all touches associated with a multitouch sequence. When set to NO, the receiver receives only the first touch event in a multitouch sequence. The default value of this property is NO.
Other views in the same window can still receive touch events when this property is NO. If you want this view to handle multitouch events exclusively, set the values of both this property and the exclusiveTouch property to YES.

Problem with cocos2D for iPhone and touch detection

I just don't get it.
I use cocos2d for development of a small game on the iPhone/iPod. The framework is just great, but I fail at touch detection. I read that you just need to override the proper functions (e.g. "touchesBegan") in the implementation of a class which subclasses CocosNode. But it doesn't work. What could I be doing wrong?
the function:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"tickle, hihi!"); }
did I get it totally wrong?
Layer is the only cocos2d class which gets touches.
The trick is that ALL instances of Layer get passed the touch events, one after the other, so your code has to handle this.
I did it like this:
-(BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    CGPoint cLoc = [[Director sharedDirector] convertCoordinate:location];
    float labelX = self.position.x - HALF_WIDTH;
    float labelY = self.position.y - HALF_WIDTH;
    float labelXWidth = labelX + WIDTH;
    float labelYHeight = labelY + WIDTH;
    if (labelX < cLoc.x &&
        labelY < cLoc.y &&
        labelXWidth > cLoc.x &&
        labelYHeight > cLoc.y) {
        NSLog(@"WE ARE TOUCHED AND I AM A %@", self.labelString);
        return kEventHandled;
    } else {
        return kEventIgnored;
    }
}
Note that the cocos2d library has a "ccTouchesEnded" implementation, rather than the Apple standard. It allows you to return a BOOL indicating whether or not you handled the event.
Good luck!
Have you added this to your layer's init method?
// isTouchEnabled is an property of Layer (the super class).
// When it is YES, then the touches will be enabled
self.isTouchEnabled = YES;
// isAccelerometerEnabled is property of Layer (the super class).
// When it is YES, then the accelerometer will be enabled
self.isAccelerometerEnabled = YES;
In order to detect touches, you need to subclass from UIResponder (which UIView does as well) . I am not familiar with cocos2D, but a quick look at the documentation reveals that CocosNode does not derive from UIResponder.
Upon further investigation, it looks like Cocos folks created a Layer class that derives from CocosNode. And that class implements the touch event handlers. But those are prefixed by cc.
See http://code.google.com/p/cocos2d-iphone/source/browse/trunk/cocos2d/Layer.h
Also see menu.m code and the below blog post article for more info on this:
http://blog.sapusmedia.com/2008/12/cocos2d-propagating-touch-events.html
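For illustration, a minimal sketch of a Layer subclass wired up for these cc-prefixed callbacks (old cocos2d 0.8-style API, as used elsewhere in this thread; the class name MyLayer is just an example):
@interface MyLayer : Layer
@end

@implementation MyLayer
- (id)init {
    if ((self = [super init])) {
        self.isTouchEnabled = YES; // opt in to the ccTouches* callbacks
    }
    return self;
}
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    NSLog(@"layer touched at %@", NSStringFromCGPoint(location));
    return kEventHandled; // tell cocos2d the touch was consumed
}
@end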
maw, the CGPoint struct members x and y are floats. Use @"%f" to format floats for printf/NSLog.
If you use the 0.9 beta of cocos2d, it has really simple touch detection for CocosNodes. The real beauty of this new detection is that it handles multiple touch tracking really well.
An example of this can be found here
http://code.google.com/p/cocos2d-iphone/source/browse/#svn/trunk/tests/TouchesTest
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Add a new body/atlas sprite at the touched location
    CGPoint tapPosition;
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:[touch view]];
        tapPosition = [self convertToNodeSpace:[[CCDirector sharedDirector] convertToGL:location]]; // get the tapped position
    }
}
I think this can help you:
- Make your scene conform to the protocol CCTargetedTouchDelegate.
- Add this line to your scene's init:
[[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:NO];
- Implement these functions:
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    return YES;
}
-(void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    // here the touch is ended
}