How to prevent the user from panning my view off screen? - iPhone

I've got a UIView that receives user panning from a gesture recognizer. I realized that sometimes the user would "throw" my view almost off the screen, and that looks very bad. I want to prevent this from happening. I know I should do some check in my selector, but I don't know how to do it when the view is translated.
Here is my code:
- (void)panPiece:(UIPanGestureRecognizer *)gestureRecognizer
{
    UIView *piece = [gestureRecognizer view];
    [self adjustAnchorPointForGestureRecognizer:gestureRecognizer];

    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ||
        [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        CGPoint translation = [gestureRecognizer translationInView:[piece superview]];
        [piece setCenter:CGPointMake([piece center].x + translation.x,
                                     [piece center].y + translation.y)];
        [gestureRecognizer setTranslation:CGPointZero inView:[piece superview]];
    }
}
Thanks in advance
Regards
Leo

You can add the following condition in your code:
if ([gestureRecognizer state] == UIGestureRecognizerStateEnded)
Inside that you can check whether any corner of the view is outside the screen. If it is outside, you can bounce it back inside the screen.
The following is a possible example.
if (zl.layer.frame.origin.x > 0) {
    self.moveFactorX = self.moveFactorX - zl.layer.frame.origin.x;
    [zl.layer setValue:[NSNumber numberWithFloat:moveFactorX + totalMoveX]
            forKeyPath:@"transform.translation.x"];
}
else if (zl.layer.frame.origin.x < pageSize.width - zl.layer.frame.size.width) {
    self.moveFactorX = self.moveFactorX - (zl.layer.frame.origin.x - pageSize.width + zl.layer.frame.size.width);
    [zl.layer setValue:[NSNumber numberWithFloat:moveFactorX + totalMoveX]
            forKeyPath:@"transform.translation.x"];
}
This checks whether the left side of the zl layer has moved inside the screen (origin.x > 0); when it has, it pushes the layer back to the left edge.
The else branch checks whether the layer has moved inside the screen from the right side, and when that condition is satisfied it pushes the layer back to the right border.
You have to implement the same thing for the top and bottom as well.
If you find the code confusing, post here and I will modify it accordingly.
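For reference, here is a minimal sketch of an alternative approach that clamps the translation directly in the panPiece: selector from the question, assuming the piece is smaller than its superview and should stay entirely inside the superview's bounds:
- (void)panPiece:(UIPanGestureRecognizer *)gestureRecognizer
{
    UIView *piece = [gestureRecognizer view];
    [self adjustAnchorPointForGestureRecognizer:gestureRecognizer];

    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ||
        [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        CGPoint translation = [gestureRecognizer translationInView:[piece superview]];
        CGRect frame = [piece frame];
        CGRect bounds = [[piece superview] bounds];

        // Limit the translation so the piece's frame never leaves the superview.
        translation.x = MAX(CGRectGetMinX(bounds) - CGRectGetMinX(frame),
                            MIN(translation.x, CGRectGetMaxX(bounds) - CGRectGetMaxX(frame)));
        translation.y = MAX(CGRectGetMinY(bounds) - CGRectGetMinY(frame),
                            MIN(translation.y, CGRectGetMaxY(bounds) - CGRectGetMaxY(frame)));

        [piece setCenter:CGPointMake([piece center].x + translation.x,
                                     [piece center].y + translation.y)];
        [gestureRecognizer setTranslation:CGPointZero inView:[piece superview]];
    }
}
Unlike the gesture-ended check above, this clamps continuously while dragging, so the view can never leave the screen in the first place.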

Related

Get UIPinchGestureRecognizer finger positions

I'm using a UIPinchGestureRecognizer to transform a view and it's working perfectly, but now I want to use it to transform my view's width and height independently. The only way that comes to mind is getting the two finger positions and using an if clause to recognize whether the user is trying to increase the width or the height, but for this I need the position of each finger involved in the pinch gesture. I can't find any method to do this, so I was wondering if it is possible, or if there is another way of achieving this.
- (IBAction)handlePinch:(UIPinchGestureRecognizer *)recognizer {
    // To transform the height instead of the width, swap the second and third parameters.
    recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, 1);
    NSLog(@"%f", recognizer.scale);
    recognizer.scale = 1;
}
Got the answer, if somebody needs to do this: there is a method called locationOfTouch:inView:. With this method you can get each touch's x and y position. But call it when the gesture recognizer's state is Began, because it crashes if you do it in the Changed state (the touch count can drop below two there). Save these values in some variables and you are good to go. Hope it helps someone who is interested.
- (IBAction)handlePinch:(UIPinchGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        NSLog(@"pos 0: %f, %f", [recognizer locationOfTouch:0 inView:self.view].x,
                                [recognizer locationOfTouch:0 inView:self.view].y);
        NSLog(@"pos 1: %f, %f", [recognizer locationOfTouch:1 inView:self.view].x,
                                [recognizer locationOfTouch:1 inView:self.view].y);
    }
    if (recognizer.state == UIGestureRecognizerStateChanged) {
        recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, 1);
        // NSLog(@"%f", recognizer.scale);
        recognizer.scale = 1;
    }
    if (recognizer.state == UIGestureRecognizerStateEnded) {
    }
}
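Building on that, here is a rough sketch of how the saved finger positions might be used to decide whether to scale the width or the height; the two CGPoint ivars (startPoint0, startPoint1) and the dx/dy comparison are assumptions, not part of the original answer:
- (IBAction)handlePinch:(UIPinchGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan && recognizer.numberOfTouches >= 2) {
        // Remember the initial finger positions (startPoint0/startPoint1 are assumed CGPoint ivars).
        startPoint0 = [recognizer locationOfTouch:0 inView:self.view];
        startPoint1 = [recognizer locationOfTouch:1 inView:self.view];
    }
    if (recognizer.state == UIGestureRecognizerStateChanged) {
        // If the fingers are spread mostly horizontally, scale the width; otherwise scale the height.
        CGFloat dx = fabs(startPoint0.x - startPoint1.x);
        CGFloat dy = fabs(startPoint0.y - startPoint1.y);
        if (dx > dy) {
            recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, 1.0f);
        } else {
            recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, 1.0f, recognizer.scale);
        }
        recognizer.scale = 1.0f;
    }
}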

Move image with touch

I have a scroll view with a fixed number of thumbnail images added as subviews. I want to move these images along with the touch to a rectangle in another view. I am able to move an image within the scroll view (i.e., within the same view), but not across to the other view.
Right now I am changing the center of the image depending on the touch position. When the touch point moves beyond the frame of the scroll view, the image disappears. This is my problem.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self];
    NSLog(@"%f, %f", location.x, location.y);
    touch.view.center = location;
}
Any solution to this problem would be a great help!
Please refer to the image for further details.
Here is what I'd do: add a UIPanGestureRecognizer to each image, with the following handlePan: method as its action. You still have to figure out how to get the correct image view (myImageView), but this will make your image follow your finger.
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
    // Find the correct image you're dragging
    // UIImageView *myImageView = ...
    CGPoint translation = [recognizer translationInView:self.view];

    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // What to do when you start the gesture
        // You may also define myImageView here
        // (if so, make sure you put it in your @interface,
        // because this method will be called multiple times,
        // but you will enter this "if" only when you start the touch)
    }

    // Keep your image on the screen (320 x 480 is the screen size used here)
    if (myImageView.frame.origin.x + translation.x >= 0.0f &&
        myImageView.frame.origin.x + myImageView.frame.size.width + translation.x <= 320.0f &&
        myImageView.frame.origin.y + translation.y >= 0.0f &&
        myImageView.frame.origin.y + myImageView.frame.size.height + translation.y <= 480.0f) {
        myImageView.center = CGPointMake(myImageView.center.x + translation.x,
                                         myImageView.center.y + translation.y);
    }

    if (recognizer.state == UIGestureRecognizerStateEnded) {
        // What to do when you remove your finger from the screen
    }

    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
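For completeness, a small sketch of how the recognizers might be attached so that the dragged image view can simply be read from recognizer.view; thumbnailViews is an assumed array holding your image views:
// Somewhere in your setup code, e.g. viewDidLoad
// (thumbnailViews is an assumed NSArray of your UIImageViews):
for (UIImageView *thumbnail in thumbnailViews) {
    thumbnail.userInteractionEnabled = YES;  // UIImageView disables this by default
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                          action:@selector(handlePan:)];
    [thumbnail addGestureRecognizer:pan];
}

// Then, inside handlePan:, the dragged image view is simply the recognizer's view:
// UIImageView *myImageView = (UIImageView *)recognizer.view;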

UIPanGestureRecognizer Collision

I have 6 UIImageViews, each connected to a UIPanGestureRecognizer, and they are all connected to the same method. The method is:
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
I am following Ray Wenderlich's tutorial on using GestureRecognizers. So, I was wondering how to detect collisions so that when one image collides with another image, some code is run. The code is different for each image.
Thanks
If you want to move the image with the recognizer, you should attach the recognizer to that image view.
Given that, the fastest way to detect the collision is the following (in the method that changes the frame of your UIImageView):
for (UIImageView *iv in _imageArray) {
    if (CGRectIntersectsRect(iv.frame, _selectedImageView.frame)) {
        NSLog(@"Collision");
    }
}
_selectedImageView is the image that you are moving, and _imageArray is an array that contains all of your UIImageViews (in your case, 6).
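Since the asker wants different code for each image, one possible approach (a sketch, assuming each UIImageView was given a distinct tag when it was created) is to switch on the tag of the view you collided with, and to skip the dragged view so it does not register a collision with itself:
for (UIImageView *iv in _imageArray) {
    // Skip the view being dragged so it does not collide with itself.
    if (iv == _selectedImageView) {
        continue;
    }
    if (CGRectIntersectsRect(iv.frame, _selectedImageView.frame)) {
        switch (iv.tag) {
            case 1:
                // Code to run when colliding with image 1
                break;
            case 2:
                // Code to run when colliding with image 2
                break;
            // ... and so on for the remaining images
            default:
                break;
        }
    }
}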

How to make dragging faster using velocityInView: in UIPanGestureRecognizer?

I'm having a little trouble understanding how to implement a faster dragging speed for the code I'm using below (written by @PaulSoltz), which allows you to drag an object across the screen. I realize you have to use the velocityInView: method of UIPanGestureRecognizer, and I understand it returns the x and y components of the velocity. If velocity is distance over time, then for instance velocityX = (x2 - x1) / time, but I'm unsure how to use this formula to get what I need. Basically I just want to be able to adjust the speed of my movement to make it a little faster. Maybe I'm overthinking things, but if anyone could help me understand this it would be appreciated. Thanks.
- (void)handlePanGesture:(UIPanGestureRecognizer *)gestureRecognizer {
    UIView *myView = [gestureRecognizer view];
    CGPoint translate = [gestureRecognizer translationInView:[myView superview]];
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ||
        [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        [myView setCenter:CGPointMake(myView.center.x + translate.x, myView.center.y + translate.y)];
        [gestureRecognizer setTranslation:CGPointZero inView:[myView superview]];
    }
}
Just multiply the components of the translation vector by some constant.
[myView setCenter:CGPointMake(myView.center.x + translate.x * 2, myView.center.y + translate.y * 2)];
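If you would rather tie the extra speed to how fast the finger is actually moving instead of a fixed constant, here is a sketch using velocityInView:; the 1000 pt/s divisor and the 3x cap are arbitrary assumptions to tune:
- (void)handlePanGesture:(UIPanGestureRecognizer *)gestureRecognizer {
    UIView *myView = [gestureRecognizer view];
    CGPoint translate = [gestureRecognizer translationInView:[myView superview]];
    CGPoint velocity  = [gestureRecognizer velocityInView:[myView superview]];

    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ||
        [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        // Scale the translation by the finger speed: 1x when slow, up to 3x for fast swipes.
        CGFloat speed  = sqrtf(velocity.x * velocity.x + velocity.y * velocity.y);  // points per second
        CGFloat factor = 1.0f + MIN(speed / 1000.0f, 2.0f);
        [myView setCenter:CGPointMake(myView.center.x + translate.x * factor,
                                      myView.center.y + translate.y * factor)];
        [gestureRecognizer setTranslation:CGPointZero inView:[myView superview]];
    }
}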

iPhone SDK: move image around with the accelerometer

I'm trying to move an image around with the accelerometer by doing this:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    image.center = CGPointMake(acceleration.x, acceleration.y);
}
When I test the app, the image that is supposed to move around just sits at the x = 0, y = 0 position.
I declared the accelerometer, adopted UIAccelerometerDelegate in the .h, and so on.
What am I doing wrong?
Thanks in advance! -DD
You do realize that the accelerometer returns, as the name would suggest, measures of acceleration, not points on the display? Anyway, what you need to do is alter the center (not replace it completely), which will allow you to move the image.
Something along these lines:
image.center = CGPointMake(image.center.x + acceleration.x,
image.center.y - acceleration.y);
It is also important to note that the acceleration usually stays between -1 and 1 (unless the user shakes the device), which is due to gravity being 1 g. Therefore you should probably multiply the acceleration.x and .y values by some constant to make the image move a bit faster than about 1 point at a time.
There are additional things you should think about: what if the image is at the edge of the screen? What if the user wants to use the app in a position other than flat on a surface (which needs calibration of the accelerometer)?
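Putting those points together, here is a rough sketch; the speed multiplier of 10 is an arbitrary assumption to tune, and the clamping uses the superview's bounds so the image stays on screen:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    CGFloat speed = 10.0f;  // assumed multiplier; tune to taste
    CGPoint newCenter = CGPointMake(image.center.x + acceleration.x * speed,
                                    image.center.y - acceleration.y * speed);

    // Clamp the center so the image stays fully inside its superview.
    CGRect bounds = image.superview.bounds;
    CGFloat halfW = image.frame.size.width  / 2.0f;
    CGFloat halfH = image.frame.size.height / 2.0f;
    newCenter.x = MAX(halfW, MIN(newCenter.x, CGRectGetMaxX(bounds) - halfW));
    newCenter.y = MAX(halfH, MIN(newCenter.y, CGRectGetMaxY(bounds) - halfH));

    image.center = newCenter;
}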
// Pan-gesture handler that drags a view around, showing a delete overlay while dragging.
- (void)moveImage:(id)sender
{
    [operationView bringSubviewToFront:[(UIPanGestureRecognizer *)sender view]];
    [[[(UIPanGestureRecognizer *)sender view] layer] removeAllAnimations];

    CGPoint translatedPoint = [(UIPanGestureRecognizer *)sender translationInView:self.view];

    if ([(UIPanGestureRecognizer *)sender state] == UIGestureRecognizerStateBegan) {
        // Remember where the view started so the translation can be applied from there.
        firstX = [[sender view] center].x;
        firstY = [[sender view] center].y;
        [imgDeleteView setHidden:FALSE];
    }
    else if ([(UIPanGestureRecognizer *)sender state] == UIGestureRecognizerStateEnded) {
        [imgDeleteView setHidden:TRUE];
    }

    translatedPoint = CGPointMake(firstX + translatedPoint.x, firstY + translatedPoint.y);
    [[(UIPanGestureRecognizer *)sender view] setCenter:translatedPoint];
}