How to check whether a CGPoint is inside a UIImageView? - iphone

In touchesBegan:
CGPoint touch_point = [[touches anyObject] locationInView:self.view];
There are dozens of UIImageViews around, stored in an NSMutableArray called images. I'd like to know whether there is a built-in function to check if a CGPoint (touch_point) is inside one of the image views, e.g.:
for (UIImageView *image in images) {
    // how to test whether touch_point lands on this image view?
}
Thanks
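For reference, one built-in check is CGRectContainsPoint(), applied to each image view's frame, as long as the frame and the touch point are expressed in the same coordinate space. A minimal sketch, assuming every image view in images is a direct subview of self.view:
CGPoint touch_point = [[touches anyObject] locationInView:self.view];
for (UIImageView *imageView in images) {
    // frame is in the superview's (self.view's) coordinate space,
    // the same space touch_point was resolved into above.
    if (CGRectContainsPoint(imageView.frame, touch_point)) {
        // touch_point lies inside this image view
    }
}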
Follow-up:
For some unknown reason, pointInside: never returns YES. Here is the full code.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touch_point = [touch locationInView:self.view];
    for (UIImageView *image in piece_images) {
        if ([image pointInside:touch_point withEvent:event]) {
            image.hidden = YES;
        } else {
            image.hidden = NO;
        }
        NSLog(@"image %.0f %.0f touch %.0f %.0f", image.center.x, image.center.y, touch_point.x, touch_point.y);
    }
}
even though I can see in the NSLog output that the two points are sometimes identical.
I also tried:
if ([image pointInside:touch_point withEvent:nil])
The result is the same; it never returns YES.
To rule out anything being wrong with the images themselves, I tried the following:
if (YES or [image pointInside:touch_point withEvent:event])
All images are hidden after the first tap on the screen.
EDIT 2:
Really weird. Even when I hard-code this:
point.x = image.center.x;
point.y = image.center.y;
the code becomes:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point; // = [touch locationInView:self.view];
    for (UIImageView *image in piece_images) {
        point.x = image.center.x;
        point.y = image.center.y;
        if ([image pointInside:point withEvent:event]) {
            image.hidden = YES;
            NSLog(@"YES");
        } else {
            image.hidden = NO;
            NSLog(@"NO");
        }
        NSLog(@"image %.0f %.0f touch %.0f %.0f", image.center.x, image.center.y, point.x, point.y);
    }
}
pointInside: still always returns NO ...

Sounds to me like the problem is that you need to convert the point from your view's coordinate system to the UIImageView's coordinate system. You can use UIView's convertPoint:toView: method.
Try replacing
if ([image pointInside:touch_point withEvent:event]) {
with
if ([image pointInside: [self.view convertPoint:touch_point toView: image] withEvent:event]) {
(Note that you are allowed to pass nil instead of event.)
I guess this is a bit late for the questioner, but I hope it is of use to someone.
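Putting that together with the loop from the question, a minimal sketch (assuming the image views in piece_images are subviews of self.view) might look like:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touch_point = [[touches anyObject] locationInView:self.view];
    for (UIImageView *imageView in piece_images) {
        // Convert the point into the image view's own coordinate system
        // before asking pointInside:withEvent:.
        CGPoint local_point = [self.view convertPoint:touch_point toView:imageView];
        imageView.hidden = [imageView pointInside:local_point withEvent:event];
    }
}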

if ([image pointInside:touch_point withEvent:event]) {
    // Point is inside
} else {
    // Point isn't inside
}
In this case, event is taken from:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    for (UIImageView *piece in piece_images) {
        CGFloat x_min = piece.center.x - (piece.bounds.size.width / 2);
        CGFloat x_max = x_min + piece.bounds.size.width;
        CGFloat y_min = piece.center.y - (piece.bounds.size.height / 2);
        CGFloat y_max = y_min + piece.bounds.size.height;
        if (point.x > x_min && point.x < x_max && point.y > y_min && point.y < y_max) {
            piece.hidden = YES;
        } else {
            piece.hidden = NO;
        }
    }
}
Too bad, I had to do it myself...
It's iPhone OS 3.2, Xcode 3.2.2, tried on both the simulator and the iPad.
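For what it's worth, the manual min/max test above is essentially the same as testing the frame directly (assuming no transform is applied to the image views), something like:
if (CGRectContainsPoint(piece.frame, point)) {
    piece.hidden = YES;
} else {
    piece.hidden = NO;
}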

Related

How can I fill the transparent area of the image when my finger moves over it?

I want to draw on the transparent area with a brush, but my code does not work very well. I think someone can help me here. My code:
// Handles the start of a touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    if (![image isPointTransparent:[touch locationInView:self]]
        || ![image isPointTransparent:[touch previousLocationInView:self]])
    {
        return;
    }
    firstTouch = YES;
    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    location = [touch locationInView:self];
    location.y = bounds.size.height - location.y;
}
// Handles the continuation of a touch.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    if (![image isPointTransparent:[touch locationInView:self]]
        || ![image isPointTransparent:[touch previousLocationInView:self]])
    {
        return;
    }
    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    if (firstTouch)
    {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    }
    else
    {
        location = [touch locationInView:self];
        location.y = bounds.size.height - location.y;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    }
    // Render the stroke
    [self renderLineFromPoint:previousLocation toPoint:location];
}
// Handles the end of a touch event when the touch is a tap.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    if (![image isPointTransparent:[touch locationInView:self]]
        || ![image isPointTransparent:[touch previousLocationInView:self]])
    {
        return;
    }
    if (firstTouch)
    {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
        [self renderLineFromPoint:previousLocation toPoint:location];
    }
}
The important thing to know is that you should do your actual drawing in the drawRect: of your UIView. So the renderLineFromPoint:toPoint: method in your code should just be building up an array of lines and telling the view to redraw each time, something like this:
- (void)renderLineFromPoint:(CGPoint)from toPoint:(CGPoint)to
{
    [lines addObject:[Line lineFrom:from to:to]];
    [self setNeedsDisplay];
}
This assumes that you have a Line class which has 2 CGPoint properties. Your drawRect: may look something like this:
- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetRGBStrokeColor(context, 0.0f, 0.0f, 0.0f, 1.0f);
    for (Line *line in lines) {
        CGContextMoveToPoint(context, line.from.x, line.from.y);
        CGContextAddLineToPoint(context, line.to.x, line.to.y);
        CGContextStrokePath(context);
    }
}
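A minimal sketch of the assumed Line class (lines would be an NSMutableArray ivar created in the view's initializer) could look like this:
@interface Line : NSObject
@property (nonatomic, assign) CGPoint from;
@property (nonatomic, assign) CGPoint to;
+ (Line *)lineFrom:(CGPoint)from to:(CGPoint)to;
@end

@implementation Line
@synthesize from, to;
+ (Line *)lineFrom:(CGPoint)from to:(CGPoint)to
{
    // Manual retain/release to match the era of the rest of the code.
    Line *line = [[[Line alloc] init] autorelease];
    line.from = from;
    line.to = to;
    return line;
}
@end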
If you do it like this (without OpenGL) there is no need to flip the y-axis.
The only thing left is to implement the isPointTransparent: method. It's difficult to know from your code how this should work.
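One common approach, sketched here as a hypothetical helper (the name, the alpha threshold, and the assumption that the point is in the image's own top-left-origin coordinate space are all mine, not from the question), is to draw the single pixel under the point into a 1x1 bitmap context and look at its alpha:
- (BOOL)isPointTransparent:(CGPoint)point inImage:(UIImage *)uiImage
{
    unsigned char pixel[4] = {0, 0, 0, 0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // 1x1 RGBA bitmap context; we only care about the pixel under the point.
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    // Shift the image so the requested pixel lands on the context's (0, 0).
    // Core Graphics' origin is bottom-left, hence the y flip.
    CGContextTranslateCTM(context, -point.x, point.y - uiImage.size.height);
    CGContextDrawImage(context,
                       CGRectMake(0, 0, uiImage.size.width, uiImage.size.height),
                       uiImage.CGImage);
    CGContextRelease(context);
    return pixel[3] < 10; // alpha near zero -> treat the point as transparent
}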

How to find out the touch screen position

I am developing an application in which I use a UIImageView, and I change the position of the UIImageView every 0.5 seconds using the code below.
[NSTimer scheduledTimerWithTimeInterval:0.5
                                 target:self
                               selector:@selector(moveImage)
                               userInfo:nil
                                repeats:YES];
- (void)moveImage
{
    //[image1 setCenter: CGPointMake(634, 126)];
    CGFloat x = (CGFloat) (arc4random() % (int) self.view.bounds.size.width);
    CGFloat y = (CGFloat) (arc4random() % (int) self.view.bounds.size.height);
    CGPoint squarePostion = CGPointMake(x, y);
    img.center = squarePostion;
}
Now I can touch the screen. What I need to find out is whether my touch location and that image view's location actually match.
Use this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    if ([touch view] == photo1) {
        // photo1 is the image view; give it a tag
        if (photo1.tag == 3)
        {
            NSLog(@"You have touched the image view");
        }
        photo1.center = touchLocation;
    }
}
// This can be used to find the touch point in the key window
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint pos = [touch locationInView:[UIApplication sharedApplication].keyWindow];
    NSLog(@"%f,%f", pos.x, pos.y);
}
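To actually compare the two, one option (a sketch, assuming img is a subview of self.view with no transform applied) is to test the touch point against the image view's frame in its superview's coordinate space:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Use the coordinate space of img's superview so the frame and the
    // point are directly comparable.
    CGPoint pos = [touch locationInView:img.superview];
    if (CGRectContainsPoint(img.frame, pos)) {
        NSLog(@"Touched the moving image view");
    }
}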

Dragging an Image View

I am trying to drag an image view. I have had a little bit of success, but it is not behaving as I want: the image view should move only if I touch inside the image and then drag.
But it moves even if I touch and drag from anywhere on the screen.
I have written code like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // retrieve touch point
    CGPoint pt = [[touches anyObject] locationInView:[self.view.subviews objectAtIndex:0]];
    startLocation = pt;
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint pt = [[touches anyObject] locationInView:[self.view.subviews objectAtIndex:0]];
    CGRect frame = [[self.view.subviews objectAtIndex:0] frame];
    frame.origin.x += pt.x - startLocation.x;
    frame.origin.y += pt.y - startLocation.y;
    [[self.view.subviews objectAtIndex:0] setFrame:frame];
}
The return value of the locationInView: method is a point relative to that view's own coordinate system, so first check whether it actually falls inside the view.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect targetFrame = [self.view.subviews objectAtIndex:0].frame;
    // retrieve touch point
    CGPoint pt = [[touches anyObject] locationInView:[self.view.subviews objectAtIndex:0]];
    // check whether the point is inside the view
    if (pt.x < 0 || pt.x > targetFrame.size.width || pt.y < 0 || pt.y > targetFrame.size.height)
    {
        isInTargetFrame = NO;
    }
    else
    {
        isInTargetFrame = YES;
        startLocation = pt;
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!isInTargetFrame)
    {
        return;
    }
    // move your view here...
}
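The movement itself could then reuse the offset logic from the question, roughly (a sketch, not tested):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!isInTargetFrame)
    {
        return;
    }
    UIView *target = [self.view.subviews objectAtIndex:0];
    CGPoint pt = [[touches anyObject] locationInView:target];
    CGRect frame = target.frame;
    // Offset the frame by how far the finger has moved, measured in the
    // subview's own coordinate space, since touchesBegan.
    frame.origin.x += pt.x - startLocation.x;
    frame.origin.y += pt.y - startLocation.y;
    target.frame = frame;
}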
Try something like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // retrieve touch point
    startLocation = [[touches anyObject] locationInView:self.view];
    // Now here check to make sure that start location is within the frame of
    // your subview [self.view.subviews objectAtIndex:0].
    // If it is, you need to have a property like dragging = YES.
    // Then in touchesEnded you set dragging = NO.
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint pt = [[touches anyObject] locationInView:[self.view.subviews objectAtIndex:0]];
    CGRect frame = [[self.view.subviews objectAtIndex:0] frame];
    frame.origin.x += pt.x - startLocation.x;
    frame.origin.y += pt.y - startLocation.y;
    [[self.view.subviews objectAtIndex:0] setFrame:frame];
}
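A sketch of what those comments describe; dragging is an assumed BOOL ivar, and touchesMoved would simply return early when it is NO, as in the previous answer:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    startLocation = [[touches anyObject] locationInView:self.view];
    // Both the frame and startLocation are in self.view's coordinate
    // space here, so they can be compared directly.
    dragging = CGRectContainsPoint([[self.view.subviews objectAtIndex:0] frame],
                                   startLocation);
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    dragging = NO;
}
Note that if startLocation is captured in self.view's coordinates, the touchesMoved above should also read the point in self.view's coordinates (and update startLocation after each move), so that the offsets are computed in a single coordinate space.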

Multitouch doesn't work in cocos2d

This is my ccTouchesMoved method.
What's wrong? I am using the cocos2d framework.
- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CCNode *sprite = [self getChildByTag:kTagPlayer];
    CCNode *sprite2 = [self getChildByTag:kTagEnemy];
    CGPoint point;
    // Collect all touches.
    NSSet *allTouches = [event allTouches];
    for (UITouch *touch in allTouches)
    {
        point = [touch locationInView:[touch view]];
        point = [[CCDirector sharedDirector] convertToGL:point];
        if (point.y > 384)
        {
            if (point.x > 992)
                sprite2.position = ccp(992, size.height - 100);
            else if (point.x < 32)
                sprite2.position = ccp(32, size.height - 100);
            else
                sprite2.position = ccp(point.x, size.height - 100);
        }
        else
        {
            if (point.x > 992)
                sprite.position = ccp(992, 100);
            else if (point.x < 32)
                sprite.position = ccp(32, 100);
            else
                sprite.position = ccp(point.x, 100);
        }
    }
}
Have you enabled multiple touches in your glView? By default the glView is instantiated in the app delegate. The code is below.
[glView setMultipleTouchEnabled:YES];
In case you're developing a Retina display App, be aware that all coordinates are in points, not pixels. So even on a Retina display with 960x640 pixels the coordinates in points (your touch location) will be in the range 480x320.
If you want to use pixels, use the "InPixels" version of all coordinates, in this case:
sprite.positionInPixels = ccp(992, 100);
If that's not the problem you should add to your post what the expected outcome is and what happens instead. A little context goes a long way.
What does the debugger say is in allTouches? You could try getting all the touches for the view like this instead:
UITouch* touch = [touches anyObject];
NSSet* allTouches = [touches setByAddingObjectsFromSet:[event touchesForView:[touch view]]];

Multi touch is not working perfectly on UIImageView

I am doing multi-touch on a UIImageView, meaning pinch to zoom in and zoom out on the image view. I am using the following code but it doesn't work very well. Can anyone look at this code?
#import "ZoomingImageView.h"
#implementation ZoomingImageView
#synthesize zoomed;
#synthesize moved;
define HORIZ_SWIPE_DRAG_MIN 24
define VERT_SWIPE_DRAG_MAX 24
define TAP_MIN_DRAG 10
CGPoint startTouchPosition;
CGFloat initialDistance;
- (id)initWithFrame:(CGRect)frame {
if (self = [super initWithFrame:frame]) {
// Initialization code
moved = NO;
zoomed = NO;
}
return self;
}
- (void)drawRect:(CGRect)rect {
// Drawing code
}
- (void)dealloc {
if([timer isValid])
[timer invalidate];
[super dealloc];
}
- (void)setImage:(UIImage *)img
{
    zoomed = NO;
    moved = NO;
    self.transform = CGAffineTransformIdentity;
    [super setImage:img];
}
- (CGFloat)distanceBetweenTwoPoints:(CGPoint)fromPoint toPoint:(CGPoint)toPoint {
    float x = toPoint.x - fromPoint.x;
    float y = toPoint.y - fromPoint.y;
    return sqrt(x * x + y * y);
}
- (CGFloat)scaleAmount:(CGFloat)delta {
    CGFloat pix = sqrt(self.frame.size.width * self.frame.size.height);
    CGFloat scale = 1.0 + (delta / pix);
    return scale;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([timer isValid])
        [timer invalidate];
    moved = NO;
    switch ([touches count]) {
        case 1:
        {
            // single touch
            UITouch *touch = [touches anyObject];
            startTouchPosition = [touch locationInView:self];
            initialDistance = -1;
            break;
        }
        default:
        {
            // multi touch
            UITouch *touch1 = [[touches allObjects] objectAtIndex:0];
            UITouch *touch2 = [[touches allObjects] objectAtIndex:1];
            initialDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:self]
                                                     toPoint:[touch2 locationInView:self]];
            break;
        }
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch1 = [[touches allObjects] objectAtIndex:0];
    if ([timer isValid])
        [timer invalidate];
    /*if ([touches count] == 1) {
        CGPoint pos = [touch1 locationInView:self];
        self.transform = CGAffineTransformTranslate(self.transform, pos.x - startTouchPosition.x, pos.y - startTouchPosition.y);
        moved = YES;
        return;
    }****/
    if ((initialDistance > 0) && ([touches count] > 1)) {
        UITouch *touch2 = [[touches allObjects] objectAtIndex:1];
        CGFloat currentDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:self]
                                                         toPoint:[touch2 locationInView:self]];
        CGFloat movement = currentDistance - initialDistance;
        NSLog(@"Touch moved: %f", movement);
        CGFloat scale = [self scaleAmount:movement];
        self.transform = CGAffineTransformScale(self.transform, scale, scale);
        // }
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch1 = [[touches allObjects] objectAtIndex:0];
    if ([touches count] == 1) {
        // double tap to reset to default size
        if ([touch1 tapCount] > 1) {
            if (zoomed) {
                self.transform = CGAffineTransformIdentity;
                moved = NO;
                zoomed = NO;
            }
            return;
        }
    }
    else {
        // multi-touch
        UITouch *touch2 = [[touches allObjects] objectAtIndex:1];
        CGFloat finalDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:self]
                                                       toPoint:[touch2 locationInView:self]];
        CGFloat movement = finalDistance - initialDistance;
        NSLog(@"Final Distance: %f, movement=%f", finalDistance, movement);
        if (movement != 0) {
            CGFloat scale = [self scaleAmount:movement];
            self.transform = CGAffineTransformScale(self.transform, scale, scale);
            NSLog(@"Scaling: %f", scale);
            zoomed = YES;
        }
    }
}
- (void)singleTap:(NSTimer *)theTimer {
    // must override
}
- (void)animateSwipe:(int)direction {
    // must override
}
@end
It is not working well on the device. Can anyone tell me where I am going wrong?
When you apply any CGAffineTransform, the value of the frame property becomes undefined. In your code you are using frame.size.width and frame.size.height to calculate the change in size, so after the first CGAffineTransformScale you will not get the right scale factor. According to the documentation, bounds is the right property to use for scale calculations.
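Applied to the code above, that would mean basing the calculation on bounds instead of frame, roughly:
- (CGFloat)scaleAmount:(CGFloat)delta {
    // bounds stays meaningful while a transform is applied to the view,
    // unlike frame, so the scale factor remains stable across pinches.
    CGFloat pix = sqrt(self.bounds.size.width * self.bounds.size.height);
    return 1.0 + (delta / pix);
}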