Image view should not move outside its superview using UIPanGestureRecognizer - iPhone

I am using a UIPanGestureRecognizer to move an image view, and this image view is a subview of another view. When I move the image view, it moves outside its superview. I want to restrict the movement so that the image view can only move inside its superview.

I guess that in the selector called by the pan gesture you are setting the view's center or origin according to the location of the pan.
This is roughly how your code should look in order to solve your problem:
- (void)thePanSelector:(id)sender {
    UIPanGestureRecognizer *recognizer = (UIPanGestureRecognizer *)sender;
    CGPoint p = [recognizer locationInView:theSuperView];

    // Clamp the touch point to the superview's bounds.
    float boundedX = MAX(0, MIN(theSuperView.frame.size.width, p.x));
    float boundedY = MAX(0, MIN(theSuperView.frame.size.height, p.y));
    CGPoint boundedPoint = CGPointMake(boundedX, boundedY);

    // Use the clamped point, not the raw pan location.
    view.center = boundedPoint;
}
This code is untested, so you may need to play a little with the values of boundedX and boundedY.
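If the image view should stay entirely inside its superview, rather than only keeping its center inside, the clamping can also take the view's size into account. A rough sketch, untested, assuming the pan gesture is attached to the image view itself (handlePan: and the variable names here are illustrative, not from the code above):

- (void)handlePan:(UIPanGestureRecognizer *)recognizer {
    UIView *movedView = recognizer.view;          // the image view being dragged
    UIView *container = movedView.superview;      // the boundary we want to respect

    CGPoint translation = [recognizer translationInView:container];
    CGPoint newCenter = CGPointMake(movedView.center.x + translation.x,
                                    movedView.center.y + translation.y);

    // Clamp so the whole view stays inside the superview, not just its center.
    CGFloat halfW = movedView.bounds.size.width / 2.0;
    CGFloat halfH = movedView.bounds.size.height / 2.0;
    newCenter.x = MAX(halfW, MIN(container.bounds.size.width - halfW, newCenter.x));
    newCenter.y = MAX(halfH, MIN(container.bounds.size.height - halfH, newCenter.y));

    movedView.center = newCenter;
    [recognizer setTranslation:CGPointZero inView:container];
}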

Related

detect touch of subview of uiview being animated using CAAnimation

I'm trying to detect a touch on a subview of a view that is being animated.
Here's my detection code:
// simple but not very elegant way of getting the correct frame
CGRect keepFrame = self.fishContainer.layer.frame;
self.fishContainer.layer.frame = [[self.fishContainer.layer presentationLayer] frame];
// get the touch location, taking the presentation layer into account (see above)
CGPoint p = [sender locationInView:self.fishContainer];
CALayer *layer = [self.fishContainer.layer presentationLayer];
// apply the relevant transform
p = CGPointApplyAffineTransform(p, layer.affineTransform);
EBLog(@"checking point %@", NSStringFromCGPoint(p));
UIView *vToRemove = nil;
// find the topmost view containing the touched point
for (UIView *v in self.parasites) {
    EBLog(@"-BOUND %@", NSStringFromCGRect(v.frame));
    if (CGRectContainsPoint(v.frame, p)) {
        vToRemove = v;
    }
}
// OK, we have a view. Let's remove it.
if (vToRemove) {
    EBLog(@"found one");
    [vToRemove removeFromSuperview];
    [self.parasites removeObject:vToRemove];
    if ([self.parasites count] == 0) {
        [self showWinnerScreen];
        [self stopGame];
    }
}
// restore the view's frame
self.fishContainer.layer.frame = keepFrame;
Everything works correctly as long as I don't animate parasiteArea's parent view.
When I animate the parent view (a CAAnimation consisting of a move, a scale, and a rotation of the view), the touch falls outside the bounds of the expected subview.
UPDATE
I managed to get the detection working in most cases (see the code above) by using the presentationLayer property and CGPointApplyAffineTransform. There are, however, still some cases where it doesn't work.
I guess I need to translate the touch point to the coordinate space of the CAAnimation.
Or something like that? Any suggestions?
I ended up using UIView's animateWithDuration: instead of CAAnimation. For my purposes, the more limited animation possibilities were enough.
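An alternative that avoids temporarily swapping frames is to hit-test the presentation layer directly, since it reflects the in-flight CAAnimation values. A minimal sketch, assuming the parasite views are subviews of fishContainer, that fishContainer's superview itself is not animated, and that handleTap: is the gesture action (the names are illustrative):

- (void)handleTap:(UITapGestureRecognizer *)sender {
    // -hitTest: expects the point in the receiver's superlayer coordinates,
    // so take the location in fishContainer's superview.
    CGPoint p = [sender locationInView:self.fishContainer.superview];

    // Hit-test the presentation layer, which carries the animated values.
    CALayer *hit = [[self.fishContainer.layer presentationLayer] hitTest:p];

    UIView *vToRemove = nil;
    for (UIView *v in self.parasites) {
        // The hit is a presentation-layer copy; compare its model layer.
        if ([hit modelLayer] == v.layer) {
            vToRemove = v;
            break;
        }
    }
    if (vToRemove) {
        [vToRemove removeFromSuperview];
        [self.parasites removeObject:vToRemove];
    }
}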

Detect tap touches on CALayer

My setup is as follows:
- UIView
  - CALayer (container)
    - CAShapeLayer
    - CAShapeLayer
    - ...
I want to detect taps on every shape layer in order to change its color.
I have put a UITapGestureRecognizer on my UIView and have the following code:
CGPoint point = [self tapWithPoint:[recognizer locationInView:pieView]];
PieSliceLayer* layerThatWasTapped = (PieSliceLayer *)[_containerLayer hitTest:point];
[(PieSliceLayer *)[layerThatWasTapped modelLayer] setFillColor:UIColor.redColor];
But it seems that it only ever changes one CAShapeLayer, always the first one that was added.
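One possible reason for that behaviour: -hitTest: decides containment via -containsPoint:, which by default only checks a layer's bounds, not its drawn shape, so overlapping slices that share the same frame all claim the same points and the frontmost match wins. A hedged sketch of one fix, assuming PieSliceLayer is (or can be made) a CAShapeLayer subclass whose path describes the slice:

@interface PieSliceLayer : CAShapeLayer
@end

@implementation PieSliceLayer

// Only report a hit when the point actually falls inside this slice's path,
// so overlapping sibling slices don't swallow each other's taps.
- (BOOL)containsPoint:(CGPoint)point {
    return self.path != NULL && CGPathContainsPoint(self.path, NULL, point, false);
}

@end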

How to convert SubView's Frame Coordinate System to Self View's Coordinate System

I programmatically create one scroll view and some buttons inside it. When I tap any button, a popover has to be shown.
My button's origin in self.view is something like (100,11), but inside the scroll view it is (9,11), and the scroll view is somewhere in self.view. The popover shows at (9,11), but the right place would be (100,11). I tried to use convert without success.
- (IBAction)showPopover:(id)sender {
    //... popover implemented above
    // Wrong origin:
    NSLog(@"wrong x:%f y:%f", [sender frame].origin.x, [sender frame].origin.y);
    // Transform to correct:
    CGRect frame = [self.view convertRect:[sender frame] toView:nil];
    // Should be right, but is not...
    NSLog(@"new x:%f y:%f", frame.origin.x, frame.origin.y);
}
Can anyone help me?
A view's frame is already in its superview's coordinate system. So if your setup is that self.view contains the scroll view, which contains the sender:
CGRect frame = [sender.superview convertRect:sender.frame toView:self.view];
// or, better:
CGRect frame = [sender convertRect:sender.bounds toView:self.view];
Swift:
let frame = sender.convert(sender.bounds, to: self.view)
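Once the frame is expressed in self.view's coordinate space, it can be used directly as the popover's anchor. A rough usage sketch (the contentVC and the popoverPresentationController-based presentation are assumptions for illustration, not part of the original question):

- (IBAction)showPopover:(id)sender {
    UIButton *button = (UIButton *)sender;
    // Convert the button's bounds into self.view's coordinate space.
    CGRect anchorRect = [button convertRect:button.bounds toView:self.view];

    UIViewController *contentVC = [[UIViewController alloc] init];
    contentVC.modalPresentationStyle = UIModalPresentationPopover;
    contentVC.popoverPresentationController.sourceView = self.view;
    contentVC.popoverPresentationController.sourceRect = anchorRect;
    [self presentViewController:contentVC animated:YES completion:nil];
}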

How to rotate a CGRect or CGImageCreateWithImageInRect?

To describe my project:
I have a rectangular UIImageView frame floating over a white layer. Inside the UIImageView, I'm successfully creating the illusion that it is showing a portion of a background image behind the white layer. You can drag the rectangle around, and it will "redraw" the image so that you can peer into what is behind the white. It's basically this code:
// Whenever the frame is moved, update the CGRect frameRect and run this:
self.newCroppedImage = CGImageCreateWithImageInRect([bgImage.image CGImage], frameRect);
frame.image = [UIImage imageWithCGImage:self.newCroppedImage];
Anyhow, I also have a rotation gesture recognizer that allows the user to rotate the frame (and consequently rotates the image). The problem is that the CGRect sent to CGImageCreateWithImageInRect is still oriented at its original rotation. This breaks the illusion that you're looking through a window, because the image you see is rotated when only the frame should appear that way.
So ideally, I need to take the rotation of my frame and apply it to the image created from my bgImage. Does anyone have any clues or ideas on how I could apply this?
I suggest you take a different approach. Don't constantly create new images to put in your UIImageView. Instead, set up your view hierarchy like this:
White view
  "Hole" view (just a regular UIView)
    Image view
That is, the white view has the hole view as a subview. The hole view has the UIImageView as its subview.
The hole view must have its clipsToBounds property set to YES (you can set it with code or in your nib).
The image view should have its size set to the size of its image. This will of course be larger than the size of the hole view.
And this is very very important: the image view's center must be set to the hole view's center.
Here's the code I used in my test project to set things up. The white view is self.view. I start with the hole centered in the white view, and I set the image view's center to the hole view's center.
- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    CGRect bounds = self.view.bounds;
    self.holeView.center = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
    self.holeView.clipsToBounds = YES;
    bounds = self.holeView.bounds;
    self.imageView.center = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
    self.imageView.bounds = (CGRect){ CGPointZero, self.imageView.image.size };
}
I also set the image view's size to the size of its image. You might want to set it to the size of the white view.
To pan and rotate the hole, I'm going to set holeView.transform. I'm not going to change holeView.frame or holeView.center. I have two instance variables, _holeOffset and _holeRotation, that I use to compute the transform. The trick to making it seem like a hole through the white view, revealing the image view, is to apply the inverse transform to the image view, undoing the effects of the hole view's transform:
- (void)updateTransforms {
    CGAffineTransform holeTransform = CGAffineTransformIdentity;
    holeTransform = CGAffineTransformTranslate(holeTransform, _holeOffset.x, _holeOffset.y);
    holeTransform = CGAffineTransformRotate(holeTransform, _holeRotation);
    self.holeView.transform = holeTransform;
    self.imageView.transform = CGAffineTransformInvert(holeTransform);
}
This trick of using the inverse transform on the subview only works if the center of the subview is at the center of its superview. (Technically the anchor points have to line up, but by default the anchor point of a view is its center.)
I put a UIPanGestureRecognizer on holeView. I configured it to send panGesture: to my view controller:
- (IBAction)panGesture:(UIPanGestureRecognizer *)sender {
    CGPoint offset = [sender translationInView:self.view];
    [sender setTranslation:CGPointZero inView:self.view];
    _holeOffset.x += offset.x;
    _holeOffset.y += offset.y;
    [self updateTransforms];
}
I also put a UIRotationGestureRecognizer on holeView. I configured it to send rotationGesture: to my view controller:
- (IBAction)rotationGesture:(UIRotationGestureRecognizer *)sender {
    _holeRotation += sender.rotation;
    sender.rotation = 0;
    [self updateTransforms];
}

How do I drag a button?

I have a UIButton that I'd like the user to be able to drag with TouchDragInside. How do I get the button to move as the user moves their finger?
As Jamie noted, a pan gesture recognizer is probably the way to go. The code would look something like what follows.
The button's view controller might add a gesture recognizer to the button (possibly in viewDidLoad) as follows:
UIPanGestureRecognizer *pangr = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
[myButton addGestureRecognizer:pangr];
[pangr release];
And, the view controller would have the following target method to handle the gesture:
- (void)pan:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateChanged ||
        recognizer.state == UIGestureRecognizerStateEnded) {
        UIView *draggedButton = recognizer.view;
        CGPoint translation = [recognizer translationInView:self.view];
        CGRect newButtonFrame = draggedButton.frame;
        newButtonFrame.origin.x += translation.x;
        newButtonFrame.origin.y += translation.y;
        draggedButton.frame = newButtonFrame;
        [recognizer setTranslation:CGPointZero inView:self.view];
    }
}
CORRECTED as per rohan-patel's comment.
In the previously posted code, the x and y coordinates of the origin of the button's frame were set directly, as in draggedButton.frame.origin.x += translation.x. That was incorrect: a view's frame can be replaced, but the frame's components cannot be changed directly.
You probably don't want to use TouchDragInside. That is a way of recognizing that a button or other control has been activated in a certain manner. To move the button, you probably want to use a UIPanGestureRecognizer and then change the button's position in its superview as the user's finger moves around.
You have to implement the four touch methods (touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent:, and touchesCancelled:withEvent:) in the view that holds the button. The control event you are referring to cannot be used directly to drag any UIView.
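For completeness, here is roughly what that touch-methods approach could look like if the dragging logic lives in the button itself. DraggableButton is an illustrative subclass, not something from the answers above:

@interface DraggableButton : UIButton
@end

@implementation DraggableButton

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    // Move the button by the amount the finger moved, in superview coordinates.
    CGPoint current = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];
    self.center = CGPointMake(self.center.x + (current.x - previous.x),
                              self.center.y + (current.y - previous.y));
}

@end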