I have a UIImageView, and when a drag is performed on it, I am trying to determine how far that drag is from its origin. I currently have this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (CGRectContainsPoint(myimage.frame, location)) {
        // user tapped inside the image
    } else {
        // user didn't tap inside the image
    }
}
The image itself does not move, but a person can put their finger on the image and then drag it across the screen. I am just trying to determine that drag distance.
If you want to calculate the distance, you need to remember the point (store it somewhere) in touchesBegan if the user tapped on your image.
Then, in touchesMoved or touchesEnded, you can get the current point and calculate the distance to your original point.
If you need the distance from the UIImageView's origin, you can call [touch locationInView:myImage];
I also suggest using the UIGestureRecognizer classes instead of handling touches yourself; they are simpler to work with.
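For example (a minimal sketch, assuming a CGPoint ivar named startPoint to hold the stored point, plus the myimage view from your code):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (CGRectContainsPoint(myimage.frame, location)) {
        startPoint = location; // remember where the drag began
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:touch.view];
    // straight-line distance from the stored starting point
    CGFloat distance = hypotf(current.x - startPoint.x, current.y - startPoint.y);
}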
I need to drag an element during an animation. The element falls from the top of the screen, and I need the user to be able to drag it wherever they want, even during the animation.
Thanks
You can use the touchesBegan method to detect when the user touches the element.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (touch != nil && touch.view == elementView) {
        // do your stuff
    }
}
Then set the element's position to the touch location, and remove the animation.
elementView.center = [touch locationInView:self.view];
[elementView.layer removeAllAnimations];
This should work. You can then use touchesMoved in the same way to update the position during the drag, as in the sketch below.
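Putting those pieces together (a minimal sketch, assuming an elementView ivar; while an animation is still in flight you may also need the option shown in the next answer for touches to register):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (touch != nil && touch.view == elementView) {
        [elementView.layer removeAllAnimations]; // stop the fall
        elementView.center = [touch locationInView:self.view];
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (touch.view == elementView) {
        elementView.center = [touch locationInView:self.view]; // follow the finger
    }
}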
Since you're using the UIView block based animations, try using:
animateWithDuration:delay:options:animations:completion:
with the UIViewAnimationOptionAllowUserInteraction option.
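For example (a sketch; the duration and the endCenter destination are made up for illustration):

[UIView animateWithDuration:2.0
                      delay:0.0
                    options:UIViewAnimationOptionAllowUserInteraction
                 animations:^{
                     elementView.center = endCenter; // the fall
                 }
                 completion:nil];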
In my app, I allow the user to annotate a photo by adding arrows (custom ArrowView). There can be many arrows added, with various zoom & rotation.
I am trying to implement selecting an arrow by touch. Currently, I am iterating over the arrows and using
CGRectContainsPoint(arrowView.frame, touchPoint)
to decide which arrow to select based on a touch gesture.
But this does not work well when some of the arrows are large and rotated 45 degrees, since the bounding frame becomes much bigger than the arrow itself.
Question:
I would like to use bounds of the arrow translated to parent co-ordinates instead of frame. How can I get this when scaling & rotation is applied?
Alternatively, is there a better method to solve this selection problem?
This code finds the arrow under touchPoint. Note that hitTest:withEvent: takes each view's transform into account, so it works even when the arrows are scaled or rotated:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    UIView *arrow = [self.view hitTest:touchPoint withEvent:event];
}
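If you would rather iterate yourself, you can convert the touch point into each arrow's own coordinate space; convertPoint:fromView: applies the view's transform, so testing against the untransformed bounds works even for scaled and rotated arrows. A sketch, assuming the arrows are kept in an arrowViews array (a name used here for illustration):

for (UIView *arrowView in self.arrowViews) {
    CGPoint p = [arrowView convertPoint:touchPoint fromView:self.view];
    if (CGRectContainsPoint(arrowView.bounds, p)) {
        // arrowView is under the touch
    }
}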
I have a UIView that contains a number of CALayer subclasses. I am using the following code to detect which layer a touch event corresponds to:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    NSLog(@"%@, %@", NSStringFromCGPoint(point), [self.layer hitTest:point].name);
}
This works fine until the device is rotated. When the device is rotated all current layers are removed from the superlayer, and new CALayers are created to fit the new orientation. The new layers are correctly inserted and viewable in the correct orientation.
After the rotation, the hitTest method consistently returns nil for the layer. I have noticed that when the device is rotated 180 degrees, the returned layer is whatever was at that location before the rotation, i.e. touching the top-left layer gives the layer in the bottom-right after a 180-degree rotation. The coordinates of the hit test print as expected, with (0,0) in the top left. I redraw the layers on every rotation, but for some reason they seem to be mapped as if the device were the "correct" way up, with the home button at the bottom. Am I missing a function call or something after handling the rotation?
Cheers,
Adam
Okay, I've found that the following code works in all orientations without any issues (In my case there are no overlapping views so this is appropriate for me):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];
    CGPoint point = [t locationInView:self];
    for (CALayer *layer in self.layer.sublayers) {
        CGPoint convertedPoint = [self.layer convertPoint:point toLayer:layer];
        if ([layer containsPoint:convertedPoint]) {
            NSLog(@"%@", layer.name);
        }
    }
}
While this manual point conversion works correctly, the question still remains as to why the original method call did not. Can anybody enlighten me?
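One likely explanation (an assumption on my part, worth checking against the docs): -[CALayer hitTest:] expects the point in the coordinate space of the receiver's superlayer, not of the receiver itself. After rotation the superlayer carries the rotation transform, which would produce exactly the mirrored results you saw. Converting the point first should make the original call behave:

CGPoint superPoint = [self.layer convertPoint:point toLayer:self.layer.superlayer];
CALayer *hitLayer = [self.layer hitTest:superPoint];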
Adam
I have drawn a filled circle on the iPhone Simulator using Quartz 2D. Is there a way to detect a touch event on that circle?
Thanks
Harikant Jammi
If you have a CGPath for that circle, you can get the CGPoint where the user's finger fell inside touchesBegan and check whether it lies within that CGPath using the following code.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    // Depending on your code, you may need to check a different view than self.view.
    // The second argument is an optional CGAffineTransform and the last is the
    // even-odd fill rule flag; check the docs if your path is transformed.
    if (CGPathContainsPoint(yourCircle, NULL, location, NO)) {
        // Do something swanky
    } else {
        // Don't do teh swank
    }
}
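If you don't already have the path stored, you can build one when you draw the circle and keep a reference to it (a sketch; yourCircle and the rect are illustrative):

CGMutablePathRef yourCircle = CGPathCreateMutable();
CGPathAddEllipseInRect(yourCircle, NULL, CGRectMake(50.0f, 50.0f, 100.0f, 100.0f));
// keep yourCircle around for hit-testing, and CGPathRelease() it when done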
I haven't tested it, but if the surrounding space around the circle is set to an alpha of 0, then maybe only the circle would accept touches. You'd probably have to turn off isOpaque for the view, and have no background color.
If that doesn't work, then you'll have to process the tap location and write code to see if it is within the circle area or not.
Just put a custom-type (UIButtonTypeCustom) invisible button over the circle; a custom button with no title and a clear background is invisible but still detects touches. Don't set its alpha to 0.0, though: views with an alpha below 0.01 are skipped by hit-testing and won't receive touches.
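For example (a sketch; circleFrame and circleTapped: are illustrative names):

UIButton *hotspot = [UIButton buttonWithType:UIButtonTypeCustom]; // transparent by default
hotspot.frame = circleFrame; // a frame covering the circle
[hotspot addTarget:self action:@selector(circleTapped:) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:hotspot];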
Currently I see that a touch event will tell me the UIView where the touch occurred. But what if I need to detect a touch in some non-rectangular shape, like a circle? How would I go about doing something like that?
Basically I want to do something only if the user touches somewhere within a circular area that's not visible.
Any help/direction is appreciated, TIA!
You would do it like so. Note that 'locationInView' will return the coordinates of the touch with respect to the specified view, so a touch in the top-left corner of a view will return (0,0) regardless of where that view is onscreen.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // gets the coordinates of the touch with respect to the specified view
    CGPoint touchPoint = [touch locationInView:self];
    // test the coordinates however you wish,
    ...
}
To test against a circle, you would calculate the distance from the touch point to the circle's center, then check whether it is less than the circle's radius.
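For example (a minimal sketch, assuming circleCenter and circleRadius ivars describe the invisible circle):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self];
    // distance from the touch to the circle's center
    CGFloat distance = hypotf(touchPoint.x - circleCenter.x, touchPoint.y - circleCenter.y);
    if (distance <= circleRadius) {
        // touch landed inside the circle
    }
}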