iPhone hitTest broken after rotation

I have a UIView that contains a number of CALayer subclasses. I am using the following code to detect which layer a touch event corresponds to:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self];
    NSLog(@"%@, %@", NSStringFromCGPoint(touchPoint), [self.layer hitTest:touchPoint].name);
}
This works fine until the device is rotated. When the device is rotated, all current layers are removed from the superlayer and new CALayers are created to fit the new orientation. The new layers are inserted correctly and are viewable in the correct orientation.
After the rotation, the hitTest method consistently returns nil for the layer. I have noticed that when rotated 180 degrees, the returned layer is whatever was at that location before the rotation, i.e. touching the top-left layer returns the layer that is now in the bottom right. The coordinates of the hit test print as expected, with (0,0) in the top left. I redraw the layers on every rotation, but for some reason they seem to be mapped as if the device were still the "correct" way up, with the home button at the bottom. Am I missing a function call or something after handling the rotation?
Cheers,
Adam

Okay, I've found that the following code works in all orientations without any issues (in my case there are no overlapping views, so this is appropriate for me):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];
    CGPoint point = [t locationInView:self];
    for (CALayer *layer in self.layer.sublayers) {
        CGPoint convertedPoint = [self.layer convertPoint:point toLayer:layer];
        if ([layer containsPoint:convertedPoint]) {
            NSLog(@"%@", layer.name);
        }
    }
}
While this manual point conversion works correctly, the question still remains as to why the original method call did not. Can anybody enlighten me?
Adam
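One likely explanation, going by the CALayer documentation: -hitTest: expects the point to be in the coordinate space of the receiver's superlayer, not of the layer itself. [touch locationInView:self] returns a point in the view's own (and hence the view's layer's) coordinate space, which only coincides with the superlayer's space in the default orientation; once the rotation transform is applied the two diverge, which would explain the mirrored results described above. A minimal, untested sketch of the conversion:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self];
    // -hitTest: wants the point in the superlayer's coordinate space,
    // so convert it out of the view's own layer first.
    CGPoint superPoint = [self.layer convertPoint:touchPoint toLayer:self.layer.superlayer];
    NSLog(@"%@, %@", NSStringFromCGPoint(touchPoint), [self.layer hitTest:superPoint].name);
}
The manual per-sublayer loop above sidesteps the coordinate-space mismatch entirely, which is why it works in every orientation.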

Related

How to get effective Bounds of a subview in its parent view's co-ordinate system

In my app, I allow the user to annotate a photo by adding arrows (a custom ArrowView). There can be many arrows added, with varying zoom and rotation.
I am trying to implement selecting an arrow by touch. Currently, I am iterating over the arrows and using
CGRectContainsPoint(arrowView.frame, touchPoint)
to decide which arrow to select based on a touch gesture.
But this does not work well when some of the arrows are big and rotated 45 degrees, since the frame grows to enclose the rotated arrow.
Question:
I would like to use the arrow's bounds translated to the parent's coordinates instead of the frame. How can I get this when scaling and rotation are applied?
Alternatively, is there a better method to solve this selection problem?
This code finds the arrow under touchPoint:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    UIView *arrow = [self.view hitTest:touchPoint withEvent:event];
}
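One approach that respects transforms, as a rough sketch: UIView's convertPoint:toView: runs the point through each view's transform chain, so you can map the touch into each arrow's local coordinate space and test it against bounds (which, unlike frame, is unaffected by the transform). Here, arrowViews is a hypothetical array holding the added ArrowView instances:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    for (ArrowView *arrow in self.arrowViews) {
        // Converting into the arrow's local space applies the inverse of any
        // scale/rotation, so bounds stays a plain axis-aligned rectangle.
        CGPoint localPoint = [self.view convertPoint:touchPoint toView:arrow];
        if ([arrow pointInside:localPoint withEvent:event]) {
            // arrow is the one under the touch
            break;
        }
    }
}
pointInside:withEvent: tests against the view's bounds by default, so arbitrarily rotated and scaled arrows are handled without any frame math.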

Jittery movement when using CGAffineTransformMakeRotation

I'm having a problem with a jumping movement when using CGAffineTransformMakeRotation. It seems to work fine until I get to the top of quadrants 1 and 2. Below is the code I am using and a YouTube link that shows what happens. Any insight would be really appreciated.
My goal is to rotate a UIWebView with an SVG inside. Since I can't easily detect touches on the UIWebView alone, I'm putting a blank UIImageView over it. This allows me to detect the touch and prevent the copy dialog from popping up.
http://www.youtube.com/watch?v=x_OmS0MPdEE&feature=youtu.be
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    NSArray *allTouches = [touches allObjects];
    CGPoint location = [[allTouches objectAtIndex:0] locationInView:self.view];
    if (selected == 1) {
        CGFloat rads = atan2f(location.y - (grid1.frame.size.height / 2),
                              location.x - (grid1.frame.size.width / 2));
        grid1.transform = grid1Btn.transform = CGAffineTransformMakeRotation(rads);
    }
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    NSArray *allTouches = [touches allObjects];
    CGPoint location = [[allTouches objectAtIndex:0] locationInView:self.view];
    if (CGRectContainsPoint(grid1Btn.frame, location)) {
        selected = 1;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    selected = -1;
}
You use some frame.size.* values in your computations. Note that those values (unlike bounds.size.*) are affected by the transform. Use grid1.center instead; it is also more logical, since you rotate around the center.
The next thing you need to fix is that the view should not jump to a new position when a touch starts :)
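A sketch of both fixes combined (untested; angleOffset is a new CGFloat ivar, and grid1 is assumed to be a direct subview of self.view so that grid1.center and the touch location share a coordinate space): record the difference between the touch angle and the current rotation when the touch begins, then carry that offset through touchesMoved, measuring angles from grid1.center instead of frame.size:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    CGPoint location = [[touches anyObject] locationInView:self.view];
    if (CGRectContainsPoint(grid1Btn.frame, location)) {
        selected = 1;
        CGFloat touchAngle = atan2f(location.y - grid1.center.y, location.x - grid1.center.x);
        // Extract the current rotation (valid for a pure rotation transform).
        CGFloat currentRotation = atan2f(grid1.transform.b, grid1.transform.a);
        angleOffset = currentRotation - touchAngle; // keeps the view from snapping to the finger
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    CGPoint location = [[touches anyObject] locationInView:self.view];
    if (selected == 1) {
        CGFloat touchAngle = atan2f(location.y - grid1.center.y, location.x - grid1.center.x);
        grid1.transform = grid1Btn.transform = CGAffineTransformMakeRotation(touchAngle + angleOffset);
    }
}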
The jerkiness is caused by the inherent inaccuracy of touch event locations.
Touch events that are located sufficiently close to the center of the frame have a high likelihood of straying to the diagonally opposite quadrant of the frame whilst a circle is being traced about the center of the frame.
This will result in a sudden jump of 90 degrees as observed in your video.
One way to avoid the problem is to introduce a dead zone about the center of the frame. Any touch that is located within a given radius of the center of the frame would not trigger the rotation to be recalculated.
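As a sketch, the dead zone could sit at the top of touchesMoved: (the 30-point radius is an arbitrary example value, and grid1.center is used as suggested in the previous answer):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    CGPoint location = [[touches anyObject] locationInView:self.view];
    CGFloat dx = location.x - grid1.center.x;
    CGFloat dy = location.y - grid1.center.y;
    // Ignore touches too close to the center, where the angle is unstable.
    if (selected == 1 && hypotf(dx, dy) > 30.0f) {
        grid1.transform = grid1Btn.transform = CGAffineTransformMakeRotation(atan2f(dy, dx));
    }
}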
Hope this helps.

Determine How far drag point is

I have a UIImageView and I am trying to determine, when a drag is performed, how far that drag is from the origin. I currently have this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (CGRectContainsPoint(myimage.frame, location)) {
        // user tapped inside the image
    } else {
        // user didn't tap inside the image
    }
}
The image itself does not move; a person can touch the image and then drag their finger. I am just trying to determine that drag distance.
If you want to calculate the distance, you need to remember the point (store it somewhere) in touchesBegan if the user tapped on your image.
Then in touchesMoved or touchesEnded you will be able to get the current point and calculate the distance to your original point.
If you need the distance from the UIImageView's origin, you can call [touch locationInView:myImage];
And I suggest you use the UIGestureRecognizer classes instead of handling touches yourself; they are simpler to implement.
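A minimal sketch of the manual approach described above, if you go that route (startPoint is a new CGPoint ivar that remembers where the drag began; myimage is the UIImageView from the question):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (CGRectContainsPoint(myimage.frame, location)) {
        startPoint = location; // remember where the drag started
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    // straight-line distance from where the drag started
    CGFloat distance = hypotf(location.x - startPoint.x, location.y - startPoint.y);
    NSLog(@"dragged %.1f points", distance);
}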

How do I detect a touch over a specific area

Currently I see that a touch event will tell me the UIView where the touch occurred. But what if I need to detect a touch within some non-rectangular shape, like a circle? How would I go about doing something like that?
Basically I want to do something only if the user touches somewhere within a circular area that's not visible.
Any help/direction is appreciated, TIA!
You would do it like so. Note that locationInView: returns the coordinates of the touch with respect to the specified view, so a touch in the top-left corner of a view will return (0,0) regardless of where that view is onscreen.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // gets the coordinates of the touch with respect to the specified view
    CGPoint touchPoint = [touch locationInView:self];
    // test the coordinates however you wish,
    ...
}
To test against a circle you would calculate the distance from the touch point to the circle's center, then check whether this is less than the circle's radius.
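For example, continuing the method above (center and radius are example values describing the invisible circular area):
CGPoint center = CGPointMake(100.0f, 100.0f); // circle center (example value)
CGFloat radius = 50.0f;                       // circle radius (example value)
CGFloat dx = touchPoint.x - center.x;
CGFloat dy = touchPoint.y - center.y;
if (hypotf(dx, dy) <= radius) {
    // the touch landed inside the circular area
}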

How can I detect touch in cocos2d?

I am developing a 2D game for iPhone using cocos2d.
I use many small sprites (images) in my game. I want to touch two sprites of the same type and then have both of them hidden.
How can I detect a touch on a specific sprite (image)?
A better way to do this is to actually use the bounding box of the sprite itself (which is a CGRect). In this sample code, I put all my sprites in an NSMutableArray and I simply check whether the touch lands in a sprite's bounding box. Make sure you turn on touch detection in the init. Notice that I also accept/reject touches on the layer by returning YES (if I use the touch) or NO (if I don't):
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint location = [self convertTouchToNodeSpace:touch];
    for (CCSprite *station in _objectList)
    {
        if (CGRectContainsPoint(station.boundingBox, location))
        {
            DLog(@"Found sprite");
            return YES;
        }
    }
    return NO;
}
Following Jonas's instructions, and adding on to them a bit more...
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:touch.location];
    CGRect particularSpriteRect = CGMakeRect(particularSprite.position.x, particularSprite.position.y, particularSprite.contentSize.width, particularSprite.contentSize.height);
    if (CGRectContainsPoint(particularSpriteRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
}
You may need to adjust the x/y a little to account for the 'centered positioning' in Cocos
In your layer that contains your sprite, you need to say:
self.isTouchEnabled = YES;
then you can use the same events that you would use in a UIView, but they're named a little differently:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // in your touchesEnded event, you would want to see if you touched
    // down and then up inside the same place, and do your logic there.
}
@david, your code has some typos for cocos2d 0.7.3 and 2.2.1: specifically, it's CGRectMake instead of CGMakeRect, and [touch location] is now [touch locationInView:touch.view].
here's what I did:
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:[touch locationInView:touch.view]];
    CGRect myRect = CGRectMake(sprite.position.x, sprite.position.y, sprite.contentSize.width, sprite.contentSize.height);
    if (CGRectContainsPoint(myRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
}
@Genericrich: CGRectContainsPoint works in CocosLand because of the call two lines above:
[[Director sharedDirector] convertCoordinate:]
The Cocos2D objects use the OpenGL coordinate system, where (0,0) is the lower left, while UIKit coordinates (like where the touch happened) have (0,0) in the upper left. convertCoordinate: makes the flip from UIKit to OpenGL for you.
Here's how it worked for me...
Where spriteSize is obviously the sprite's size... :P
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:[touch locationInView:touch.view]];
    CGRect myRect = CGRectMake(sprite.position.x - spriteSize / 2, sprite.position.y - spriteSize / 2, spriteSize, spriteSize);
    if (CGRectContainsPoint(myRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
}
This is a good tutorial explaining the basic touch system:
http://ganbarugames.com/2010/12/detecting-touch-events-in-cocos2d-iphone/
First, write
self.isTouchEnabled = YES;
Then you need to implement the functions ccTouchesBegan, ccTouchesEnded, etc.
From what I understood, you want to be able to 'match' two sprites that can be at different coordinates on the screen.
One method for doing this (I'm sure there are many others):
Keep two global variables.
Every time a touch lands on a sprite, use the CGRectContainsPoint function mentioned several times above to find which sprite has been touched, then save that sprite's tag in one of the global variables.
Do the same for the second touch, and then compare the two global variables.
You should be able to figure out the rest, but comment if you have problems.
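A rough sketch of that idea, reusing the ccTouchBegan: pattern from the first answer above (firstPick is a new ivar, and it assumes sprites of the same type share the same tag):
// ivar: CCSprite *firstPick; (the first touched sprite, or nil if none yet)
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint location = [self convertTouchToNodeSpace:touch];
    for (CCSprite *sprite in _objectList) {
        if (!CGRectContainsPoint(sprite.boundingBox, location)) continue;
        if (firstPick == nil) {
            firstPick = sprite; // remember the first pick
        } else if (firstPick != sprite && firstPick.tag == sprite.tag) {
            firstPick.visible = sprite.visible = NO; // matched: hide both
            firstPick = nil;
        } else {
            firstPick = sprite; // no match: start over with the new pick
        }
        return YES;
    }
    return NO;
}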