How to set minimum and maximum zoom scale using UIPinchGestureRecognizer - iPhone

I want to zoom an image view in and out, and I don't want to use UIScrollView for that.
So I used UIPinchGestureRecognizer; here is my code:
[recognizer view].transform = CGAffineTransformScale([[recognizer view] transform], [recognizer scale], [recognizer scale]);
recognizer.scale = 1;
This works fine for zooming in and out.
The problem is that I want to constrain the zoom to a specific range, the way maximumZoomScale and minimumZoomScale work on UIScrollView. I couldn't find a solution for that; every tutorial about UIPinchGestureRecognizer just shows the same code.

Declare two ivars, CGFloat __scale and CGFloat __previousScale, in the interface of the class that handles the gesture. Set __scale to 1.0 in one of the initializers (make sure to call the superclass initializer there).
- (void)zoom:(UIPinchGestureRecognizer *)gesture {
    NSLog(@"Scale: %f", [gesture scale]);
    if ([gesture state] == UIGestureRecognizerStateBegan) {
        __previousScale = __scale;
    }
    // MAX_SCALE and MIN_SCALE are the zoom limits you define elsewhere
    CGFloat currentScale = MAX(MIN([gesture scale] * __scale, MAX_SCALE), MIN_SCALE);
    CGFloat scaleStep = currentScale / __previousScale;
    [self.view setTransform:CGAffineTransformScale(self.view.transform, scaleStep, scaleStep)];
    __previousScale = currentScale;
    if ([gesture state] == UIGestureRecognizerStateEnded ||
        [gesture state] == UIGestureRecognizerStateCancelled ||
        [gesture state] == UIGestureRecognizerStateFailed) {
        // The gesture can fail (or be cancelled?) when a notification appears while the object is being dragged
        __scale = currentScale;
        NSLog(@"Final scale: %f", __scale);
    }
}

- (void)handlePinchGesture:(UIPinchGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
        // Reset the last scale, necessary if there are multiple objects with different scales
        lastScale = [gestureRecognizer scale];
    }
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ||
        [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        CGFloat currentScale = [[[gestureRecognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];
        // Constants to adjust the max/min values of zoom
        const CGFloat kMaxScale = 2.0;
        const CGFloat kMinScale = 1.0;
        CGFloat newScale = 1 - (lastScale - [gestureRecognizer scale]); // incremental scale factor relative to the current transform
        newScale = MIN(newScale, kMaxScale / currentScale);
        newScale = MAX(newScale, kMinScale / currentScale);
        CGAffineTransform transform = CGAffineTransformScale([[gestureRecognizer view] transform], newScale, newScale);
        [gestureRecognizer view].transform = transform;
        lastScale = [gestureRecognizer scale]; // Store the previous scale factor for the next pinch gesture call
    }
}

I had a similar situation. My requirement was that the image view should bounce back to its last transform if it ended up smaller than a minimum size or bigger than a maximum size.
if ((self.frame.size.width > IMAGE_MIN_SIZE) && (self.frame.size.height > IMAGE_MIN_SIZE) &&
    (self.frame.size.width < IMAGE_MAX_SIZE) && (self.frame.size.height < IMAGE_MAX_SIZE)) {
    lastSizeTransform = self.transform;
} else {
    self.transform = lastSizeTransform;
}
Here, self is the image view.
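For context, here is a minimal sketch in Swift of how that bounce-back check can sit inside a pinch handler. The handler name, the concrete size limits, and the lastSizeTransform property are my own assumptions, not part of the original answer.
```swift
// Hypothetical sketch: apply the pinch first, then snap back if the
// resulting frame falls outside assumed size limits. The recognizer is
// expected to be added to this view with user interaction enabled.
import UIKit

final class BounceBackImageView: UIImageView {
    private let minSize: CGFloat = 100   // stand-in for IMAGE_MIN_SIZE
    private let maxSize: CGFloat = 600   // stand-in for IMAGE_MAX_SIZE
    private var lastSizeTransform: CGAffineTransform = .identity

    @objc func handlePinch(_ recognizer: UIPinchGestureRecognizer) {
        // Apply the incremental pinch scale.
        transform = transform.scaledBy(x: recognizer.scale, y: recognizer.scale)
        recognizer.scale = 1

        // Remember the transform while the frame stays within bounds;
        // otherwise bounce back to the last acceptable transform.
        if frame.width > minSize, frame.height > minSize,
           frame.width < maxSize, frame.height < maxSize {
            lastSizeTransform = transform
        } else {
            transform = lastSizeTransform
        }
    }
}
```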

If you log view.transform while pinching, you can see the scale values your image reaches as it zooms in and out. The solutions above didn't work the way I expected, so I made my own solution like this:
Obj-C Version
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)recognizer {
    [recognizer.view setTransform:CGAffineTransformScale(recognizer.view.transform,
                                                         recognizer.scale, recognizer.scale)];
    if (recognizer.view.transform.a > 1.6) {
        CGAffineTransform fooTransform = recognizer.view.transform;
        fooTransform.a = 1.6; // a is the x scale component
        fooTransform.d = 1.6; // d is the y scale component
        recognizer.view.transform = fooTransform;
    }
    if (recognizer.view.transform.a < 0.95) {
        CGAffineTransform fooTransform = recognizer.view.transform;
        fooTransform.a = 0.95; // a is the x scale component
        fooTransform.d = 0.95; // d is the y scale component
        recognizer.view.transform = fooTransform;
    }
    recognizer.scale = 1.0;
}
Swift Version
func handlePinchGesture(recognizer: UIPinchGestureRecognizer) {
    if let view = recognizer.view {
        view.transform = CGAffineTransformScale(view.transform,
                                                recognizer.scale, recognizer.scale)
        if view.transform.a > 1.6 {
            view.transform.a = 1.6 // a is the x scale component
            view.transform.d = 1.6 // d is the y scale component
        }
        if view.transform.d < 0.95 {
            view.transform.a = 0.95 // a is the x scale component
            view.transform.d = 0.95 // d is the y scale component
        }
        recognizer.scale = 1
    }
}

Related

Image gone at a certain zoom-out level

I'm using a UIPinchGestureRecognizer to pinch-zoom images. Zooming in works without any problem, but when I zoom out past a certain level the image disappears from the view.
code:
UIPinchGestureRecognizer *pinchGesture1 = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(ahandlePinch1:)];
[myImageView addGestureRecognizer:pinchGesture1];
-(void)ahandlePinch1:(UIPinchGestureRecognizer*)sender {
mCurrentScale += [sender scale] - mLastScale;
mLastScale = [sender scale];
if (sender.state == UIGestureRecognizerStateEnded)
{
mLastScale = 1.0;
}
CGAffineTransform currentTransform = CGAffineTransformIdentity;
CGAffineTransform newTransform = CGAffineTransformScale(currentTransform, mCurrentScale, mCurrentScale);
myImageView.transform = newTransform;
}
Do this:
if([pinch state] == UIGestureRecognizerStateEnded)
{
lastScale = 1.0;
return;
}
CGFloat scale = 1.0 - (lastScale - [pinch scale]);
CGAffineTransform currentTransform = myImageView.transform;
CGAffineTransform newTransform = CGAffineTransformScale(currentTransform, scale, scale);
[myImageView setTransform:newTransform];
lastScale = [pinch scale];
And it should work for everything (in / out).
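A likely cause of the disappearing image in the original code is that the accumulated mCurrentScale can reach zero or go negative when zooming out, which collapses the transform. As a rough Swift sketch only (the property names and limits are mine, not from the answer), clamping the accumulated scale avoids that:
```swift
// Hypothetical sketch: same accumulation idea, but the total scale is
// clamped so the image can never shrink to nothing or flip.
import UIKit

final class PinchZoomController: UIViewController {
    @IBOutlet var myImageView: UIImageView!
    private var currentScale: CGFloat = 1.0   // plays the role of mCurrentScale
    private var lastScale: CGFloat = 1.0      // plays the role of mLastScale
    private let minScale: CGFloat = 0.5       // assumed lower limit
    private let maxScale: CGFloat = 3.0       // assumed upper limit

    @objc func handlePinch(_ sender: UIPinchGestureRecognizer) {
        currentScale += sender.scale - lastScale
        lastScale = sender.scale
        if sender.state == .ended {
            lastScale = 1.0
        }
        // Clamp the accumulated scale before building the transform.
        currentScale = min(max(currentScale, minScale), maxScale)
        myImageView.transform = CGAffineTransform(scaleX: currentScale, y: currentScale)
    }
}
```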

Make UIPanGestureRecognizer stretch when pulling past certain threshold

I'm trying to implement something similar to how Mac/iOS web pages are implemented, where if you pull past a certain threshold, the main view becomes "stretchy" and moves based on the velocity of the pull.
The difference, though, is I'm trying to do this horizontally for UITableViewCells with a UIPanGestureRecognizer.
I've got a handlePan method:
- (void)handlePan:(UIPanGestureRecognizer *)recognizer
{
[recognizer setMaximumNumberOfTouches:1];
CGPoint velocity = [recognizer velocityInView:self];
int panDelta = ([recognizer locationInView:self].x - [recognizer locationInView:self.superview].x);
if (recognizer.state == UIGestureRecognizerStateBegan) {
_originalCenter = self.center;
}
if (recognizer.state == UIGestureRecognizerStateChanged) {
CGPoint translation = [recognizer translationInView:self];
float xVal = 0;
if (panDelta <= 38) {
xVal = _originalCenter.x + translation.x;
} /*else {
// this is where I struggle.
xVal = _originalCenter.x;
}*/
self.center = CGPointMake(xVal, _originalCenter.y);
}
if (recognizer.state == UIGestureRecognizerStateEnded) {
CGRect newFrame;
int xOffset = 0;
newFrame = CGRectMake(xOffset, self.frame.origin.y,
self.bounds.size.width, self.bounds.size.height);
[UIView animateWithDuration:0.2 animations:^{
self.frame = newFrame;
}];
}
}
I don't have any specific algorithm for the "stretchiness"; all I'm trying to do is make the sliding travel less than the user's actual drag distance, while staying fluid.
Ideas?
I think you are looking for a simple function that flattens out at a maximum offset:
// this is where I struggle.
CGFloat maxOffset = 50.0; // change as you like
CGFloat elementWidth = ?; // fill in the width of the view being animated, e.g. self.frame.size.width
CGFloat absPannedOffset = fabsf(translation.x);
CGFloat rico = powf(elementWidth, 2) / maxOffset;
CGFloat absOffset = (((- 1 / rico) * powf(absPannedOffset - elementWidth, 2)) + maxOffset);
if (translation.x < 0) {
    xVal = -absOffset;
} else {
    xVal = absOffset;
}
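To make the behaviour of that formula concrete, here is a small Swift sketch of the same parabola-based damping (the function name and test values are mine): the offset starts at 0 for no translation and tops out at maxOffset once the translation reaches the element width.
```swift
// Sketch of the damping curve used above: a downward parabola whose
// vertex sits at (elementWidth, maxOffset).
import CoreGraphics

func dampedOffset(for translationX: CGFloat,
                  elementWidth: CGFloat,
                  maxOffset: CGFloat = 50.0) -> CGFloat {
    let absTranslation = abs(translationX)
    let rico = (elementWidth * elementWidth) / maxOffset
    let diff = absTranslation - elementWidth
    let absOffset = (-1 / rico) * (diff * diff) + maxOffset
    return translationX < 0 ? -absOffset : absOffset
}

// Example: for a 320-point-wide cell, a 160-point drag moves the view
// only 37.5 points, and a full-width drag caps out at the 50-point max.
let halfDrag = dampedOffset(for: 160, elementWidth: 320)   // 37.5
let fullDrag = dampedOffset(for: 320, elementWidth: 320)   // 50.0
```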

UIRotationGestureRecognizer

I have a very weird problem: I am using UIRotationGestureRecognizer in my application to rotate an image, and sometimes when you start rotating, the image jumps to the other finger.
Example: I am dragging my image with one finger, and when I then put down a second finger to start rotating, the image jumps to the second finger. Is that normal? Can it be fixed?
The code I am using is:
if([recognizer state] == UIGestureRecognizerStateEnded) {
prevRotation = 0.0;
return;
}
CGFloat newRotation = 0.0 - (prevRotation - [recognizer rotation]);
CGAffineTransform currentTransformation = image.transform;
CGAffineTransform newTransform = CGAffineTransformRotate(currentTransformation, newRotation);
image.transform = newTransform;
prevRotation = [recognizer rotation];
I tried saving the last location of the center and applying it again, but it is not working either, like this:
lastPoint=image.center;
if([recognizer state] == UIGestureRecognizerStateEnded) {
prevRotation = 0.0;
return;
}
CGFloat newRotation = 0.0 - (prevRotation - [recognizer rotation]);
CGAffineTransform currentTransformation = image.transform;
CGAffineTransform newTransform = CGAffineTransformRotate(currentTransformation, newRotation);
image.transform = newTransform;
prevRotation = [recognizer rotation];
image.center=lastPoint;
Try this code:
-(void)rotate:(id)sender
{
[self.view bringSubviewToFront:[(UIRotationGestureRecognizer*)sender view]];
self.imageTag = [[sender view]tag];
NSLog(@"sender tag %d", self.imageTag);
if([(UIRotationGestureRecognizer*)sender state] == UIGestureRecognizerStateEnded) {
lastRotation = 0.0;
return;
}
CGFloat rotation = 0.0 - (lastRotation - [(UIRotationGestureRecognizer*)sender rotation]);
CGAffineTransform currentTransform = [(UIRotationGestureRecognizer*)sender view].transform;
CGAffineTransform newTransform = CGAffineTransformRotate(currentTransform,rotation);
[[(UIRotationGestureRecognizer*)sender view] setTransform:newTransform];
lastRotation = [(UIRotationGestureRecognizer*)sender rotation];
}

How do you determine the frame of a future transformation? UPDATE: More elegant way?

I have a pinch gesture recognizer attached to an image view, which I use to enlarge and shrink the photo. Below is the code I'm using in the handler method:
- (void)scale:(UIPinchGestureRecognizer *)sender
{
if([sender state] == UIGestureRecognizerStateBegan)
{
lastScale = [sender scale];
}
if ([sender state] == UIGestureRecognizerStateBegan ||
[sender state] == UIGestureRecognizerStateChanged)
{
CGFloat currentScale = [[[sender view].layer valueForKeyPath:@"transform.scale"] floatValue];
CGFloat newScale = 1 - (lastScale - [sender scale]) * (UIComicImageViewPinchSpeed);
newScale = MIN(newScale, minScale / currentScale);
newScale = MAX(newScale, maxScale / currentScale);
CGAffineTransform transform = CGAffineTransformScale([[sender view] transform], newScale, newScale);
[sender view].transform = transform;
lastScale = [sender scale];
}
}
I need to determine where the new center of the image view's frame will be before I actually perform the transformation. Is there any way to determine this? Basically, I'm trying to halt the scaling if it's about to move the image off the screen or close to it.
UPDATE
Thanks to Robin below for suggesting that method to figure out the transformed frame. The problem I'm running into now is that the frame becomes invalid after the transform is performed, and I need to keep track of the most recent frame in order to figure out the boundary of my image. Obviously, I can do this manually and store it in an instance variable, but I'm wondering if there is a more "elegant" way to accomplish this.
Use CGRectApplyAffineTransform like this:
CGRect currentFrame = ....;
CGRect newFrame = CGRectApplyAffineTransform(currentFrame, transform);
// Then test if newFrame is within the limits you want
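Building on that idea, here is a rough Swift sketch (the names and structure are mine, not from the answer) that computes the frame a pinch would produce and only applies the transform if that frame stays inside the superview:
```swift
// Hypothetical sketch: work out the frame a pinch *would* produce and
// skip the update if it would leave the superview's bounds.
import UIKit

func applyPinchIfItFits(_ recognizer: UIPinchGestureRecognizer) {
    guard let view = recognizer.view, let container = view.superview else { return }

    // The transform we are about to apply.
    let proposedTransform = view.transform.scaledBy(x: recognizer.scale,
                                                    y: recognizer.scale)

    // For a scale-only transform the view keeps its center, so the future
    // frame is just the scaled bounds size laid out around that center.
    let proposedSize = view.bounds.size.applying(proposedTransform)
    let proposedFrame = CGRect(x: view.center.x - proposedSize.width / 2,
                               y: view.center.y - proposedSize.height / 2,
                               width: proposedSize.width,
                               height: proposedSize.height)

    // Only scale if the resulting frame stays inside the superview.
    if container.bounds.contains(proposedFrame) {
        view.transform = proposedTransform
    }
    recognizer.scale = 1
}
```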

Max/Min Scale of Pinch Zoom in UIPinchGestureRecognizer - iPhone iOS

How would I be able to limit the scale of the UIPinchGestureRecognizer to a min and max level? The scale property below seems to be relative to the last known scale (the delta from the last state) and I can't figure out how to set a limit to the size/height of the object being zoomed.
-(void)scale:(id)sender {
[self.view bringSubviewToFront:[(UIPinchGestureRecognizer*)sender view]];
if([(UIPinchGestureRecognizer*)sender state] == UIGestureRecognizerStateEnded) {
lastScale = 1.0;
return;
}
CGFloat pinchscale = [(UIPinchGestureRecognizer*)sender scale];
CGFloat scale = 1.0 - (lastScale - pinchscale);
CGAffineTransform currentTransform = [(UIPinchGestureRecognizer*)sender view].transform;
CGAffineTransform holderTransform = holderView.transform;
CGAffineTransform newTransform = CGAffineTransformScale(currentTransform, scale, scale);
[[(UIPinchGestureRecognizer*)sender view] setTransform:newTransform];
lastScale = [(UIPinchGestureRecognizer*)sender scale];
}
Here is the solution that I figured out after using Anomie's answer as a starting point.
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)gestureRecognizer {
if([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
// Reset the last scale, necessary if there are multiple objects with different scales
lastScale = [gestureRecognizer scale];
}
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ||
[gestureRecognizer state] == UIGestureRecognizerStateChanged) {
CGFloat currentScale = [[[gestureRecognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];
// Constants to adjust the max/min values of zoom
const CGFloat kMaxScale = 2.0;
const CGFloat kMinScale = 1.0;
CGFloat newScale = 1 - (lastScale - [gestureRecognizer scale]);
newScale = MIN(newScale, kMaxScale / currentScale);
newScale = MAX(newScale, kMinScale / currentScale);
CGAffineTransform transform = CGAffineTransformScale([[gestureRecognizer view] transform], newScale, newScale);
[gestureRecognizer view].transform = transform;
lastScale = [gestureRecognizer scale]; // Store the previous scale factor for the next pinch gesture call
}
}
There isn't a way to limit the scale on a UIPinchGestureRecognizer. To limit the height in your code, you should be able to do something like this:
CGFloat scale = 1.0 - (lastScale - pinchscale);
CGRect bounds = [(UIPinchGestureRecognizer*)sender view].bounds;
scale = MIN(scale, maximumHeight / CGRectGetHeight(bounds));
scale = MAX(scale, minimumHeight / CGRectGetHeight(bounds));
To limit width, change 'Height' to 'Width' in the last two lines.
I took some info gleaned from Paul Solt's and Anomie's answers and added it to an existing category I made for UIViewController that makes any UIView draggable, so it can now also make views pinchable, using gestures and transforms.
Note: this dirties the tag property of the view you are making draggable/pinchable. So if you need the tag for something else, consider storing that value in the NSMutableDictionary used by this technique, which is available as [self dictForView:theView].
Implementing in your project:
You can make any subview within the view controller's view draggable or pinchable (or both).
Place a single line of code in your viewDidLoad, for example:
[self makeView:mySubView draggable:YES pinchable:YES minPinchScale:0.75 maxPinchScale:1.0];
Turn it off in viewDidUnload (releases the gestures and dictionary):
[self makeView:mySubView draggable:NO pinchable:NO minPinchScale:1.0 maxPinchScale:1.0];
DragAndPinchScale.h file
#import <UIKit/UIKit.h>
@interface UIViewController (DragAndPinchScale)
-(void) makeView:(UIView*)aView
draggable:(BOOL)draggable
pinchable:(BOOL)pinchable
minPinchScale:(CGFloat)minPinchScale
maxPinchScale:(CGFloat)maxPinchScale;
-(NSMutableDictionary *) dictForView:(UIView *)theView;
-(NSMutableDictionary *) dictForViewGuestures:(UIGestureRecognizer *)guesture;
@end
DragAndPinchScale.m file
#import "DragAndPinchScale.h"
@implementation UIViewController (DragAndPinchScale)
-(NSMutableDictionary *) dictForView:(UIView *)theView{
NSMutableDictionary *dict = (NSMutableDictionary*) (void*) theView.tag;
if (!dict) {
dict = [[NSMutableDictionary dictionary ] retain];
theView.tag = (NSInteger) (void *) dict;
}
return dict;
}
-(NSMutableDictionary *) dictForViewGuestures:(UIGestureRecognizer *)guesture {
return [self dictForView:guesture.view];
}
- (IBAction)fingersDidPinchInPinchableView:(UIPinchGestureRecognizer *)fingers {
NSMutableDictionary *dict = [self dictForViewGuestures:fingers];
UIView *viewToZoom = fingers.view;
CGFloat lastScale;
if([fingers state] == UIGestureRecognizerStateBegan) {
// Reset the last scale, necessary if there are multiple objects with different scales
lastScale = [fingers scale];
} else {
lastScale = [[dict objectForKey:@"lastScale"] floatValue];
}
if ([fingers state] == UIGestureRecognizerStateBegan ||
[fingers state] == UIGestureRecognizerStateChanged) {
CGFloat currentScale = [[[fingers view].layer valueForKeyPath:@"transform.scale"] floatValue];
// limits to adjust the max/min values of zoom
CGFloat maxScale = [[dict objectForKey:@"maxScale"] floatValue];
CGFloat minScale = [[dict objectForKey:@"minScale"] floatValue];
CGFloat newScale = 1 - (lastScale - [fingers scale]);
newScale = MIN(newScale, maxScale / currentScale);
newScale = MAX(newScale, minScale / currentScale);
CGAffineTransform transform = CGAffineTransformScale([[fingers view] transform], newScale, newScale);
viewToZoom.transform = transform;
lastScale = [fingers scale]; // Store the previous scale factor for the next pinch gesture call
}
[dict setObject:[NSNumber numberWithFloat:lastScale]
forKey:@"lastScale"];
}
- (void)fingerDidMoveInDraggableView:(UIPanGestureRecognizer *)finger {
NSMutableDictionary *dict = [self dictForViewGuestures:finger];
UIView *viewToDrag = finger.view;
if (finger.state == UIGestureRecognizerStateBegan) {
[dict setObject:[NSValue valueWithCGPoint:viewToDrag.frame.origin]
forKey:@"startDragOffset"];
[dict setObject:[NSValue valueWithCGPoint:[finger locationInView:self.view]]
forKey:@"startDragLocation"];
}
else if (finger.state == UIGestureRecognizerStateChanged) {
NSMutableDictionary *dict = (NSMutableDictionary*) (void*) viewToDrag.tag;
CGPoint stopLocation = [finger locationInView:self.view];
CGPoint startDragLocation = [[dict valueForKey:@"startDragLocation"] CGPointValue];
CGPoint startDragOffset = [[dict valueForKey:@"startDragOffset"] CGPointValue];
CGFloat dx = stopLocation.x - startDragLocation.x;
CGFloat dy = stopLocation.y - startDragLocation.y;
// CGFloat distance = sqrt(dx*dx + dy*dy );
CGRect dragFrame = viewToDrag.frame;
CGSize selfViewSize = self.view.frame.size;
if (!UIInterfaceOrientationIsPortrait(self.interfaceOrientation)) {
selfViewSize = CGSizeMake(selfViewSize.height,selfViewSize.width);
}
selfViewSize.width -= dragFrame.size.width;
selfViewSize.height -= dragFrame.size.height;
dragFrame.origin.x = MIN(selfViewSize.width, MAX(0,startDragOffset.x+dx));
dragFrame.origin.y = MIN(selfViewSize.height,MAX(0,startDragOffset.y+dy));
viewToDrag.frame = dragFrame;
}
else if (finger.state == UIGestureRecognizerStateEnded) {
[dict removeObjectForKey:@"startDragLocation"];
[dict removeObjectForKey:@"startDragOffset"];
}
}
-(void) makeView:(UIView*)aView
draggable:(BOOL)draggable
pinchable:(BOOL)pinchable
minPinchScale:(CGFloat)minPinchScale
maxPinchScale:(CGFloat)maxPinchScale{
NSMutableDictionary *dict = (NSMutableDictionary*) (void*) aView.tag;
if (!(pinchable || draggable)) {
if (dict){
[dict release];
aView.tag = 0;
}
return;
}
if (dict) {
UIPanGestureRecognizer *pan = [dict objectForKey:@"UIPanGestureRecognizer"];
if(pan){
if ([aView.gestureRecognizers indexOfObject:pan]!=NSNotFound) {
[aView removeGestureRecognizer:pan];
}
[dict removeObjectForKey:@"UIPanGestureRecognizer"];
}
UIPinchGestureRecognizer *pinch = [dict objectForKey:@"UIPinchGestureRecognizer"];
if(pinch){
if ([aView.gestureRecognizers indexOfObject:pinch]!=NSNotFound) {
[aView removeGestureRecognizer:pinch];
}
[dict removeObjectForKey:@"UIPinchGestureRecognizer"];
}
[dict removeObjectForKey:@"startDragLocation"];
[dict removeObjectForKey:@"startDragOffset"];
[dict removeObjectForKey:@"lastScale"];
[dict removeObjectForKey:@"minScale"];
[dict removeObjectForKey:@"maxScale"];
}
if (draggable) {
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(fingerDidMoveInDraggableView:)];
pan.minimumNumberOfTouches = 1;
pan.maximumNumberOfTouches = 1;
[aView addGestureRecognizer:pan];
[pan release];
dict = [self dictForViewGuestures:pan];
[dict setObject:pan forKey:@"UIPanGestureRecognizer"];
}
if (pinchable) {
CGAffineTransform initialTransform = CGAffineTransformMakeScale(1.0, 1.0);
aView.transform = initialTransform;
UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(fingersDidPinchInPinchableView:)];
[aView addGestureRecognizer:pinch];
[pinch release];
dict = [self dictForViewGuestures:pinch];
[dict setObject:pinch forKey:@"UIPinchGestureRecognizer"];
[dict setObject:[NSNumber numberWithFloat:minPinchScale] forKey:@"minScale"];
[dict setObject:[NSNumber numberWithFloat:maxPinchScale] forKey:@"maxScale"];
}
}
@end
The problem with most of the other answers is that they are trying to deal with scale as a linear value, when in fact it is non-linear due to the way UIPinchGestureRecognizer calculates its scale property based on the touch distance. When this isn't taken into account, the user must use more or less pinch distance to 'undo' the scaling applied by a previous pinch gesture.
Consider: suppose transform.scale = 1.0 and I place my fingers 6cm apart on the screen, then pinch inwards to 3cm apart - the resulting gestureRecognizer.scale is 0.5, and 0.5-1.0 is -0.5, so transform.scale will become 1.0+(-0.5) = 0.5. Now, I lift my fingers, place them back down 3cm apart and pinch outwards to 6cm. The resulting gestureRecognizer.scale will be 2.0, and 2.0-1.0 is 1.0, so transform.scale will become 0.5+1.0 = 1.5. Not what I wanted to happen.
The fix is to calculate the delta pinch scale as a proportion of its previous value. I place my fingers down 6cm apart, and pinch inwards to 3cm, so gestureRecognizer.scale is 0.5. 0.5/1.0 is 0.5, so my new transform.scale is 1.0*0.5 = 0.5. Next, I place my fingers down 3cm apart, and pinch outwards to 6cm. gestureRecognizer.scale is then 2.0, and 2.0/1.0 is 2.0, so my new transform.scale is 0.5*2.0 = 1.0, which is exactly what I wanted to happen.
Here it is in code:
in -(void)viewDidLoad:
self.zoomGestureCurrentZoom = 1.0f;
in -(void)onZoomGesture:(UIPinchGestureRecognizer*)gestureRecognizer:
if ( gestureRecognizer.state == UIGestureRecognizerStateBegan )
{
self.zoomGestureLastScale = gestureRecognizer.scale;
}
else if ( gestureRecognizer.state == UIGestureRecognizerStateChanged )
{
// we have to jump through some hoops to clamp the scale in a way that makes the UX intuitive
float scaleDeltaFactor = gestureRecognizer.scale/self.zoomGestureLastScale;
float currentZoom = self.zoomGestureCurrentZoom;
float newZoom = currentZoom * scaleDeltaFactor;
// clamp
float kMaxZoom = 4.0f;
float kMinZoom = 0.5f;
newZoom = MAX(kMinZoom,MIN(newZoom,kMaxZoom));
self.view.transform = CGAffineTransformScale([[gestureRecognizer view] transform], newZoom, newZoom);
// store for next time
self.zoomGestureCurrentZoom = newZoom;
self.zoomGestureLastScale = gestureRecognizer.scale;
}
Thanks, really useful code snippet above for clamping to a minimum and maximum scale.
I found that when I flipped the view first using:
CGAffineTransformScale(gestureRecognizer.view.transform, -1.0, 1.0);
it would cause a flicker when scaling the view.
Let me know what you think, but the solution for me was to update the code sample above so that, if the view has been flipped (a flag set via a property), the scale value is inverted:
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged)
{
CGFloat currentScale = [[[gestureRecognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];
if(self.isFlipped) // (inverting)
{
currentScale *= -1;
}
CGFloat newScale = 1 - (self.lastScale - [gestureRecognizer scale]);
newScale = MIN(newScale, self.maximumScaleFactor / currentScale);
newScale = MAX(newScale, self.minimumScaleFactor / currentScale);
CGAffineTransform transform = CGAffineTransformScale([[gestureRecognizer view] transform], newScale, newScale);
gestureRecognizer.view.transform = transform;
self.lastScale = [gestureRecognizer scale]; // Store the previous scale factor for the next pinch gesture call
Method 1
gestureRecognizer.scale starts at 1.0 at the beginning of the pinch (gestureRecognizer.state == .began), and its value in later states (.changed or .ended) is always relative to that. For example, if the view size is view_size at the beginning of the pinch (which might differ from the original size orig_view_size), gestureRecognizer.scale still starts at 1.0, and if it later becomes 2.0, the view's size becomes 2 * view_size; the scale is always relative to the view size when the pinch started.
We can get the view's accumulated scale at the beginning of the pinch (gestureRecognizer.state == .began) with lastScale = self.imageView.frame.width/self.imageView.bounds.size.width, so the scale relative to the original image is lastScale * gestureRecognizer.scale.
lastScale: the scale from the last round of pinching (a round runs from .began to .ended), relative to the original view size.
gestureRecognizer.scale: the current scale, relative to the view size after the last round of pinching.
currentScale: the current scale, relative to the original view size.
newScale: the new scale, relative to the original view size. newScale = lastScale * gestureRecognizer.scale, and you can limit the view's scale by comparing your limits against newScale.
```
var lastScale:CGFloat = 1.0
@objc func handlePinch(_ gestureRecognizer: UIPinchGestureRecognizer) {
var newScale = gestureRecognizer.scale
if gestureRecognizer.state == .began {
lastScale = self.imageView.frame.width/self.imageView.bounds.size.width
}
newScale = newScale * lastScale
if newScale < minScale {
newScale = minScale
} else if newScale > maxScale {
newScale = maxScale
}
let currentScale = self.imageView.frame.width/self.imageView.bounds.size.width
self.imageView.transform = CGAffineTransform(scaleX: newScale, y: newScale)
print("last Scale: \(lastScale), current scale: \(currentScale), new scale: \(newScale), gestureRecognizer.scale: \(gestureRecognizer.scale)")
}
```
Method 2
gestureRecognizer.scale starts at 1.0 on each pinch callback; this requires you to reset gestureRecognizer.scale = 1 at the end of each callback, so gestureRecognizer.scale is relative to the view size at the last pinch callback, NOT the view size at the beginning of the pinch. This is the most important difference from Method 1. And since we no longer rely on the scale from the last round, we don't need lastScale anymore.
currentScale: the current scale, relative to the original view size.
gestureRecognizer.scale: the new scale, relative to the view size at the last pinch callback (not the last round); the scale relative to the original view size will be currentScale * gestureRecognizer.scale.
We now use transform.scaledBy, which applies a scale relative to the view size at the last pinch callback (not the last round).
```
@objc func handlePinch(_ gestureRecognizer: UIPinchGestureRecognizer) {
let currentScale = self.imageView.frame.width/self.imageView.bounds.size.width
var newScale = gestureRecognizer.scale
if currentScale * gestureRecognizer.scale < minScale {
newScale = minScale / currentScale
} else if currentScale * gestureRecognizer.scale > maxScale {
newScale = maxScale / currentScale
}
self.imageView.transform = self.imageView.transform.scaledBy(x: newScale, y: newScale)
print("current scale: \(currentScale), new scale: \(newScale)")
gestureRecognizer.scale = 1
}
```
Other approaches mentioned here did not work for me, but taking a couple things from previous answers and (in my opinion) simplifying things, I've got this to work for me. effectiveScale is an ivar set to 1.0 in viewDidLoad.
-(void)zoomScale:(UIPinchGestureRecognizer *)recognizer
{
if([recognizer state] == UIGestureRecognizerStateEnded) {
// Reset last scale
lastScale = 1.0;
return;
}
if ([recognizer state] == UIGestureRecognizerStateBegan ||
[recognizer state] == UIGestureRecognizerStateChanged) {
CGFloat pinchscale = [recognizer scale];
CGFloat scaleDiff = pinchscale - lastScale;
if (scaleDiff < 0)
scaleDiff *= 2; // speed up zoom-out
else
scaleDiff *= 0.7; // slow down zoom-in
effectiveScale += scaleDiff;
// Limit scale between 1 and 2
effectiveScale = effectiveScale < 1 ? 1 : effectiveScale;
effectiveScale = effectiveScale > 2 ? 2 : effectiveScale;
// Handle transform in separate method using new effectiveScale
[self makeAndApplyAffineTransform];
lastScale = pinchscale;
}
}
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer{
//recognizer.scale=1;
CGFloat pinchScale = recognizer.scale;
pinchScale = round(pinchScale * 1000) / 1000.0;
NSLog(@"%lf", pinchScale);
if (pinchScale < 1)
{
currentLabel.font = [UIFont fontWithName:currentLabel.font.fontName size:
(currentLabel.font.pointSize - pinchScale)];
recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale);
[currentLabel sizeToFit];
recognizer.scale=1;
}
else
{
currentLabel.font = [UIFont fontWithName:currentLabel.font.fontName size:(currentLabel.font.pointSize + pinchScale)];
recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale);
[currentLabel sizeToFit];
recognizer.scale=1;
}
//currentLabel.adjustsFontSizeToFitWidth = YES;
// [currentLabel sizeToFit];
NSLog(@"Font: %@", currentLabel.font);
}
- (void)pinchToZoom:(UIPinchGestureRecognizer*)gesture
{
switch (gesture.state)
{
case UIGestureRecognizerStateBegan:
{
lastScale = gesture.scale;
}break;
case UIGestureRecognizerStateChanged:
{
const CGFloat zoomSensitivity = 5;
const CGFloat zoomMin = 1;
const CGFloat zoomMax = 16;
CGFloat objectScale = gesture.view.contentScaleFactor;
CGFloat zoomDiff = lastScale - gesture.scale;
CGFloat zoomDirty = objectScale - zoomDiff * zoomSensitivity;
CGFloat zoomTo = fmaxf(zoomMin, fminf(zoomDirty, zoomMax));
// step round if needed (neutralize elusive changes)
zoomTo = (NSInteger)(zoomTo * 10) * 0.1;
if ( objectScale != zoomTo )
gesture.view.contentScaleFactor = zoomTo;
lastScale = gesture.scale;
}break;
default:
break;
}
}
I took @Paul Solt's solution (which is great, by the way) and adapted it to Swift, for those interested:
@objc func pinchUpdated(recognizer: UIPinchGestureRecognizer) {
if recognizer.state == .began {
// Reset the last scale, necessary if there are multiple objects with different scales
lastScale = recognizer.scale
}
if recognizer.state == .began || recognizer.state == .changed {
let currentScale = recognizer.view!.layer.value(forKeyPath: "transform.scale") as! CGFloat
// Constants to adjust the max/min values of zoom
let maxScale: CGFloat = 4.0
let minScale: CGFloat = 0.9
var newScale: CGFloat = 1 - (lastScale - recognizer.scale)
newScale = min(newScale, maxScale / currentScale)
newScale = max(newScale, minScale / currentScale)
recognizer.view!.transform = recognizer.view!.transform.scaledBy(x: newScale, y: newScale)
lastScale = recognizer.scale // Store the previous scale factor for the next pinch gesture call
}
}
Can you use a scroll view instead? Then you could use scrollView.minimumZoomScale and scrollView.maximumZoomScale.
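For completeness, here is a minimal Swift sketch of the UIScrollView approach; the view names and limit values are placeholders. The scroll view enforces the zoom limits itself through minimumZoomScale, maximumZoomScale, and the viewForZooming delegate method.
```swift
// Minimal sketch of zooming an image with UIScrollView instead of a
// pinch recognizer; the scroll view clamps the zoom for you.
import UIKit

final class ZoomingImageViewController: UIViewController, UIScrollViewDelegate {
    private let scrollView = UIScrollView()
    private let imageView = UIImageView(image: UIImage(named: "photo")) // assumed asset name

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.frame = view.bounds
        scrollView.delegate = self
        scrollView.minimumZoomScale = 1.0   // the limits the question asks about
        scrollView.maximumZoomScale = 2.0
        imageView.frame = scrollView.bounds
        imageView.contentMode = .scaleAspectFit
        scrollView.addSubview(imageView)
        view.addSubview(scrollView)
    }

    // Tell the scroll view which subview to scale when the user pinches.
    func viewForZooming(in scrollView: UIScrollView) -> UIView? {
        return imageView
    }
}
```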