I want to get touch events on a UIImageView, but only within a specified location.
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:CGRectMake(0, 0, 100, 100)];

I need the touch event only inside that specific area, CGRectMake(0, 0, 100, 100). Is that possible? What should I do?
Thanks in advance.
In touchesBegan and touchesMoved:
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:self];
CGRect testRect = CGRectMake(0, 0, 100, 100);
if (CGRectContainsPoint(testRect, currentPoint)) {
    // The touch is inside the 100 x 100 rectangle
}
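For completeness, here is a minimal sketch of how that snippet fits into the full touch handlers, assuming the code lives in a custom UIView subclass (for a UIImageView, userInteractionEnabled must also be set to YES):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];
    // Only react when the touch starts inside the 100 x 100 rectangle
    if (CGRectContainsPoint(CGRectMake(0, 0, 100, 100), currentPoint)) {
        // handle the touch here
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];
    // Only react while the drag stays inside the 100 x 100 rectangle
    if (CGRectContainsPoint(CGRectMake(0, 0, 100, 100), currentPoint)) {
        // handle the drag here
    }
}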
I created an SKShapeNode. How can I detect if the shape I made was touched (clicked)?
Here is the code (I already solved it):
// metung_babi is the name of the SKShapeNode
UITouch *touch = [touches anyObject];
CGPoint nokarin = [touch locationInNode:self];
SKNode *node = [self nodeAtPoint:nokarin];
if ([node.name isEqualToString:@"metung_babi"]) {
    NSLog(@"touch me not");
}
My mistake was that, when I created the shape, I set the SKShapeNode's name before initializing it.
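For reference, a minimal sketch of the corrected order (the rectangular path here is just a placeholder; the real shape will differ):

SKShapeNode *metung_babi = [SKShapeNode node];
CGPathRef rectPath = CGPathCreateWithRect(CGRectMake(0, 0, 100, 100), NULL); // placeholder shape
metung_babi.path = rectPath;
CGPathRelease(rectPath);
metung_babi.name = @"metung_babi"; // set the name after the node has been created
[self addChild:metung_babi];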
Implement the touch handling methods. Then, in the -touchesBegan: method, extract the point of the touch and retrieve the node using the [self nodeAtPoint:] method.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    /* Called when a touch begins */
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    SKNode *yourNode = ....;
    CGRect yourNodeFrame = yourNode.frame;
    if (CGRectContainsPoint(yourNodeFrame, location)) {
        // Your node may be touched; check whether it really is your node
        SKNode *theNode = [self nodeAtPoint:location];
        if ([theNode.name isEqualToString:yourNode.name]) {
            // It's your node that was touched
        }
    }
}
The return value of nodeAtPoint: can be tricky: it returns the deepest descendant that intersects the point. Check the documentation for how the different situations are handled.
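One such situation, as a minimal sketch: when several nodes overlap at the touch point, nodeAtPoint: only gives you the deepest one, while nodesAtPoint: returns all of them, which can be the safer check:

// All nodes intersecting the touch point, not just the deepest one
NSArray *nodesAtLocation = [self nodesAtPoint:location];
for (SKNode *candidate in nodesAtLocation) {
    if ([candidate.name isEqualToString:yourNode.name]) {
        // your node was touched, even if another node is drawn on top of it
    }
}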
I'm having a little problem handling touches in my app.
I set my touchesBegan like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *currentTouch = [[event allTouches] anyObject];
touchPoint = [currentTouch locationInView:self.view];
if (CGRectContainsPoint(image1.frame, touchPoint)) {
image1IsTouched = YES;
}
}
Then I set my touchesMoved: like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *currentTouch = [[event allTouches] anyObject];
currentPoint = [currentTouch locationInView:currentTouch.view];
if(image1IsTouched == YES) {
image1.center = CGPointMake(currentPoint.x,currentPoint.y);
.....
}
}
Now I tried my app on an actual device, and that's where I noticed my problem. While I'm touching image1 with one finger the app works fine, and it checks for collisions every time I drag my finger. The problem occurs when I touch the screen with another finger while touching/dragging the image: the image I'm currently touching jumps to the other finger. I've tried [myView setMultipleTouchEnabled:NO]; and using an NSArray of touches and comparing [touches count] with the touch, but it's not working. Can someone show me how to make a UIImageView respond to a single touch only? Thanks.
First, you should use UITouch *currentTouch = [touches anyObject]; to get the current touch.
Second, you should check that touches.count == 1 to make sure there's only one finger on the screen, and ignore the touch input if there's more than one, unless you want to support multitouch.
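A minimal sketch of that check, using event.allTouches to count every finger currently on the screen and reusing the question's image1, image1IsTouched, and touchPoint names:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([[event allTouches] count] != 1) return; // ignore multi-touch input entirely
    UITouch *currentTouch = [touches anyObject];
    touchPoint = [currentTouch locationInView:self.view];
    if (CGRectContainsPoint(image1.frame, touchPoint)) {
        image1IsTouched = YES;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([[event allTouches] count] != 1) return; // a second finger is down, do nothing
    UITouch *currentTouch = [touches anyObject];
    if (image1IsTouched) {
        image1.center = [currentTouch locationInView:self.view];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    image1IsTouched = NO; // stop dragging once the finger lifts
}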
I have the following code to determine whether a touch is within an image view in my table cell. I compare the touch location with the image view's frame using CGRectContainsPoint, but it doesn't work. Here is the code:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Declare the touch and get its location
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self];
    if (CGRectContainsPoint(myImageView.frame, touchLocation))
    {
        NSLog(@"Tapped image view");
    }
}
Thanks for the help!
However, it doesn't work.
Please be more specific.
UITouch *touch = [touches anyObject];
Why not examine every touch, and not simply a random* pick of them?
*The documentation for anyObject says that you are not guaranteed which one it will give you. You are not even guaranteed that it will be random; it could be the same object every time. Murphy's Law says that, whether it is random or not, it will be the wrong one.
CGPoint touchLocation = [touch locationInView:self];
if (CGRectContainsPoint(myImageView.frame, touchLocation))
Your frame is in your superview's co-ordinate system; [touch locationInView:self] returns the touch point in your co-ordinate system. You want to test within bounds, which is in your co-ordinate system. The documentation explains the difference.
The problem is that you need to be calling [touch locationInView:myImageView] to get the point in the image view's coordinate system. Then do your check to see whether it's within the view's bounds.
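Putting the two previous answers together, a minimal sketch (assuming this method lives in the view that contains myImageView) could look like:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) { // examine every touch, not just anyObject
        // Convert the touch into the image view's own coordinate system...
        CGPoint pointInImageView = [touch locationInView:myImageView];
        // ...and test against bounds, which lives in that same coordinate system
        if (CGRectContainsPoint(myImageView.bounds, pointInImageView)) {
            NSLog(@"Tapped image view");
        }
    }
}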
UITouch *touch = [[event allTouches] anyObject];
CGPoint location = [touch locationInView:self];
if ([touch view] == view) {
    view.center = location;
}

Write it in the touchesMoved: event. Thanks.
Remember that when you're asking a touch for locationInView:, you're getting a point in that view's own coordinate system. So, assuming the code snippet you gave was contained in a subclass of UIViewController, you should be asking for
CGPoint touchLocation = [touch locationInView:self.view];
Which will give you a point relative to your view. The reason you want a point relative to your current view is that the frame of your image view is also relative to its parent view (the same view). So now it should work.
if (CGRectContainsPoint(myImageView.frame, touchLocation)) {
    NSLog(@"Tapped image view");
}
Trying to get some basic drag/drop functionality happening for an iPhone application.
My current code for trying to do this is as follows:
- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event
{
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:self];
self.center = location;
}
This code results in the touched UIView flickering while it follows the touch around. After playing around with it a bit, I also noticed that the UIView seemed to flicker between the 0,0 position on the screen and the currently touched location.
Any ideas what I'm doing wrong?
The center property needs to be specified in the coordinate system of the superview, but you've asked the touch event for the location in terms of your subview. Instead ask for them based on the superview's coordinates, like this:
- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event {
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:self.superview]; // <--- note self.superview
self.center = location;
}
I am trying to build an iPhone app using Cocos2d, and I would like to move an image from one fixed position to another with touch, at whatever speed I choose (fast or slow). I have some code, but it does not work properly.
Any solution would be very helpful.
The question is a little fuzzy, but if you want to set the position of a CocosNode you do:
[myNode setPosition:cpv(x,y)];
If you want the node to be offset from a touch location, you can do this by implementing ccTouchesBegan:withEvent
-(BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView: [touch view]];
CGPoint convertedLocation = [[Director sharedDirector] convertCoordinate:location];
[myNode setPosition: cpv(convertedLocation.x - 100, convertedLocation.y - 100)];
return kEventHandled;
}
That will offset the CocosNode by -100,-100 to where the touch occurred.
The ccTouchesBegan:withEvent: method should be implemented in your Layer, and isTouchEnabled should be set to YES to enable touches.
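For reference, a minimal sketch of enabling touches, assuming a Layer subclass and the old, non-prefixed cocos2d API used in the answer above:

// In your Layer subclass's init: enable touch handling so the
// ccTouchesBegan:withEvent: method above is actually called.
- (id)init {
    if ((self = [super init])) {
        self.isTouchEnabled = YES;
    }
    return self;
}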