I have the following code to determine whether a touch falls within an image view in my table cell. I compare the touch location with the image view's frame using CGRectContainsPoint, but it doesn't work. Here is the code:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Declare the touch and get its location
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self];
    if (CGRectContainsPoint(myImageView.frame, touchLocation))
    {
        NSLog(@"Tapped image view");
    }
}
Thanks for the help!
However, it doesn't work.
Please be more specific.
UITouch *touch = [touches anyObject];
Why not examine every touch, instead of just a random* pick of them?
*The documentation for anyObject says that you are not guaranteed which one it will give you. You are not even guaranteed that it will be random; it could be the same object every time. Murphy's Law says that, whether it is random or not, it will be the wrong one.
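For example, a minimal sketch (not the poster's code) that walks every touch in the set rather than relying on anyObject:

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Look at every touch instead of whichever one anyObject hands back.
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:self];
        NSLog(@"Touch began at %@", NSStringFromCGPoint(touchLocation));
    }
}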
CGPoint touchLocation = [touch locationInView:self];
if (CGRectContainsPoint(myImageView.frame, touchLocation))
Your frame is in your superview's co-ordinate system; [touch locationInView:self] returns the touch point in your co-ordinate system. You want to test within bounds, which is in your co-ordinate system. The documentation explains the difference.
The problem is that you need to call [touch locationInView:myImageView] to get the point in the image view's coordinate system, and then check whether it lies within the image view's bounds (its frame is expressed in the superview's coordinate system).
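A minimal sketch of that corrected check, assuming the handler lives in the cell/view that owns myImageView:

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];

    // Get the location in the image view's own coordinate system and test it
    // against bounds, which lives in that same system.
    CGPoint touchLocation = [touch locationInView:myImageView];
    if (CGRectContainsPoint(myImageView.bounds, touchLocation))
    {
        NSLog(@"Tapped image view");
    }
}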
UITouch *touch = [[event allTouches] anyObject];
CGPoint location = [touch locationInView:self];
if ([touch view] == view) {
    view.center = location;
}
Write this in the touchesMoved: event. Thanks.
Remember that when you're asking a touch for locationInView:, you're getting out a point relative to that view's frame. So, assuming the code snippet you gave was contained in a subclass of UIViewController you should be asking for
CGPoint touchLocation = [touch locationInView:self.view];
Which will give you a point relative to your view. The reason you want a point relative to your current view is because the frame of your image view is also relative to its parent view - the same view. So now it should work.
if (CGRectContainsPoint(myImageView.frame, touchLocation)) {
    NSLog(@"Tapped image view");
}
Related
I have a view controller, and inside it I plan to place some controls like buttons, text boxes, etc. I can drag my view along the x axis (screenshots 1 and 2 showed the view in two positions) with the following code:
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == ViewMain)
    {
        CGPoint location = [touch locationInView:self.view];
        displaceX = location.x - ViewMain.center.x;
        displaceY = ViewMain.center.y;
        startPosX = location.x - displaceX;
    }
    CurrentTime = [[NSDate date] timeIntervalSince1970];
}
- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == ViewMain)
    {
        CGPoint location = [touch locationInView:self.view];
        location.x = location.x - displaceX;
        location.y = displaceY;
        ViewMain.center = location;
    }
}
-(void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    double time = [[NSDate date] timeIntervalSince1970] - CurrentTime;
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == ViewMain)
    {
        CGPoint location = [touch locationInView:self.view];
        location.x = location.x - displaceX;
        location.y = displaceY;
        ViewMain.center = location;
        double speed = (ViewMain.center.x - startPosX) / (time * 2);
        NSLog(@"speed: %f", speed);
    }
}
Note that I have to add these global variables:
float displaceX = 0;
float displaceY = 0;
float startPosX = 0;
float startPosY = 0;
double CurrentTime;
The reason I created those variables is so that when I start dragging, the view moves from the point where I touched it instead of jumping so that its center sits under my finger.
Anyway, if I touch a button or an image, the view will not drag, even though the images have transparent backgrounds. I want to still be able to drag the view regardless of whether there is an image on top of it. I was thinking that maybe I need to place a large transparent view on top of everything, but I still need working buttons, images, etc. I want to be able to drag a view just like you can with the home screen in my last screenshot:
note that there I was able to drag the view regardless of whether I first touched an app icon, an image, or text. How could I do that?
I think your problem is that if you touch a UIButton or a UIImageView with interaction enabled, it doesn't pass the touch along.
For the images, uncheck the User Interaction Enabled property in IB.
For the buttons that are causing touchesBegan:withEvent:, etc. to not get called, then look at the following link: Is there a way to pass touches through on the iPhone?.
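If you prefer to do this in code rather than Interface Builder, a minimal sketch (userInteractionEnabled is the standard UIView property; the view name is a placeholder for your own outlet/ivar):

// Let touches fall through decorative image views to the draggable view underneath.
// backgroundImageView is a placeholder name.
backgroundImageView.userInteractionEnabled = NO;

// Buttons still swallow touches by design; see the linked question for ways of
// forwarding touches from a control up to its superview.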
You may want to consider a different approach to this problem. Rather than trying to manually manage the content scrolling yourself you would probably be better off using a UIScrollView with the pagingEnabled property set to YES. This is the method Apple recommends (and it's probably the method used by Springboard.app in your last screenshot). If you are a member of the iOS developer program check out the WWDC 2010 session on UIScrollView for an example of this. I think they may have also posted sample code on developer.apple.com.
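A rough sketch of that approach, assuming a plain UIViewController and placeholder values (page count, etc.), typically set up in viewDidLoad; not a drop-in replacement for the code above:

// Paging scroll view: the scroll view handles dragging and snapping for you.
UIScrollView *pagingScrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds];
pagingScrollView.pagingEnabled = YES;
pagingScrollView.showsHorizontalScrollIndicator = NO;

NSUInteger numberOfPages = 3; // placeholder: however many screens of content you have
pagingScrollView.contentSize = CGSizeMake(CGRectGetWidth(self.view.bounds) * numberOfPages,
                                          CGRectGetHeight(self.view.bounds));

// Add each page's buttons/images as subviews, offset by one page width per page.
[self.view addSubview:pagingScrollView];
[pagingScrollView release]; // omit this line under ARC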
I am a bit stuck on a piece of functionality that should perform a certain task after a double tap on a certain place on a UIView. I know how to count the number of taps, but I don't know how to determine which place was tapped; I guess I need to compare it with the CGRect of the region specified for this action.
Thanks in advance.
We can detect this with touchesBegan:withEvent:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSUInteger numTaps = [[touches anyObject] tapCount];
    UITouch *t;
    if (numTaps == 2) // double tap
    {
        t = [[[event allTouches] allObjects] objectAtIndex:0];
        CGPoint p1 = [t locationInView:self.view];
    }
}
numTaps gives the number of taps.
p1 has the point where it was tapped.
All the best.
Use:
CGPoint point = [touch locationInView:self.view];
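A minimal sketch combining the two answers above, i.e. only reacting when a double tap lands inside a particular region (targetRect is a placeholder for whatever rect you care about):

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    CGRect targetRect = CGRectMake(20, 20, 100, 100); // placeholder region

    if ([touch tapCount] == 2 && CGRectContainsPoint(targetRect, point))
    {
        NSLog(@"Double tap inside the target rect");
    }
}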
I want to get touch events on a UIImageView only within a specified area, for example CGRectMake(0, 0, 100, 100). I tried:
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:CGRectMake(0,0,100,100)];
but locationInView: takes a view, not a rect. Is it possible to only receive the touch event in that specific region? What should I do?
Thanks in advance.
In touchesBegan and touchesMoved:
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:self];
CGRect testRect = CGRectMake(0, 0, 100, 100);
if (CGRectContainsPoint(testRect, currentPoint)) {
    // The touch is inside the 100 x 100 rectangle.
}
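Also note that UIImageView has userInteractionEnabled set to NO by default, so if this handler lives in a UIImageView subclass you need to turn that on first. A minimal sketch of the full method under that assumption:

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];
    CGRect testRect = CGRectMake(0, 0, 100, 100);

    if (CGRectContainsPoint(testRect, currentPoint)) {
        // The touch landed inside the 100 x 100 region of interest.
        NSLog(@"Touched the active region");
    }
}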
Trying to get some basic drag/drop functionality happening for an iPhone application.
My current code for trying to do this is as follows:
- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    self.center = location;
}
This code results in the touched UIView flickering while it follows the touch around. After playing around with it a bit, I also noticed that the UIView seemed to flicker between the 0,0 position on the screen and the currently touched location.
Any ideas what I'm doing wrong?
The center property needs to be specified in the coordinate system of the superview, but you've asked the touch event for the location in terms of your subview. Instead ask for them based on the superview's coordinates, like this:
- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.superview]; // <--- note self.superview
    self.center = location;
}
I am trying to build an iPhone app using Cocos2d, and I would like to move an image from one fixed position to another fixed position via touch, at whatever speed I choose (fast or slow). I have some code but it does not work properly.
It would be very helpful if I could get a solution to this.
The question is a little fuzzy, but if you want to set the position of a CocosNode you do:
[myNode setPosition:cpv(x,y)];
If you want the node to be offset from a touch location, you can do this by implementing ccTouchesBegan:withEvent:
-(BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    CGPoint convertedLocation = [[Director sharedDirector] convertCoordinate:location];
    [myNode setPosition:cpv(convertedLocation.x - 100, convertedLocation.y - 100)];
    return kEventHandled;
}
That will offset the CocosNode by -100,-100 to where the touch occurred.
The ccTouchesBegan:withEvent: should be implemented in your Layer, and isTouchesEnabled should be set to YES to enable touches.
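For reference, a rough sketch of a layer that turns touch handling on. This follows the old cocos2d-iphone API used in the answer above; check your cocos2d version's headers for the exact property name (it has been spelled isTouchEnabled in several releases):

// Rough sketch only: written against the old cocos2d-iphone API used above.
// Verify the touch-enable property name against your cocos2d headers.
@implementation MyLayer
- (id)init
{
    if ((self = [super init])) {
        self.isTouchEnabled = YES; // without this, ccTouchesBegan:withEvent: is never called
    }
    return self;
}
@end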