Scaling UIImageView inside UIScrollView while maintaining the rotation - iPhone

I have a problem scaling a UIImageView that is placed inside a UIScrollView. I have googled and checked all the related questions on Stack Overflow, and tried all the answers posted there. Nothing worked for me.
First I place the UIImageView inside the UIScrollView in a nib file, pick an image from the camera roll, and fill the image view with it. Then I use a UIRotationGestureRecognizer to rotate the image.
Here is the code that I am trying:
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSLog(@"%@", [[UIDevice currentDevice] model]);
    // Do any additional setup after loading the view, typically from a nib.
    self.imagePicker = [[[UIImagePickerController alloc] init] autorelease];
    self.picChosenImageView.layer.shouldRasterize = YES;
    self.picChosenImageView.layer.rasterizationScale = [UIScreen mainScreen].scale;
    self.picChosenImageView.layer.contents = (id)[UIImage imageNamed:@"test"].CGImage;
    self.picChosenImageView.layer.shadowColor = [UIColor blackColor].CGColor;
    self.picChosenImageView.layer.shadowOpacity = 0.8f;
    self.picChosenImageView.layer.shadowRadius = 8;
    self.picChosenImageView.layer.shadowPath = [UIBezierPath bezierPathWithRect:self.picChosenImageView.bounds].CGPath;
    UIRotationGestureRecognizer *rotationRecognizer = [[[UIRotationGestureRecognizer alloc] initWithTarget:self
                                                                                                   action:@selector(handleRotate:)] autorelease];
    rotationRecognizer.delegate = self;
    [self.picChosenImageView addGestureRecognizer:rotationRecognizer];
    self.containerView.delegate = self;
    self.containerView.contentSize = self.picChosenImageView.layer.frame.size;
    self.containerView.maximumZoomScale = 4.0f;
    self.containerView.minimumZoomScale = 1.0f;
    angle = 0.0f;
    useRotation = 0.0;
    isRotationStarted = FALSE;
    isZoomingStarted = FALSE;
}
-(void)lockZoom
{
    maximumZoomScale = self.containerView.maximumZoomScale;
    minimumZoomScale = self.containerView.minimumZoomScale;
    self.containerView.maximumZoomScale = 1.0;
    self.containerView.minimumZoomScale = 1.0;
    self.containerView.clipsToBounds = NO;
    self.containerView.scrollEnabled = NO;
}
-(void)unlockZoom
{
    self.containerView.maximumZoomScale = maximumZoomScale;
    self.containerView.minimumZoomScale = minimumZoomScale;
    self.containerView.clipsToBounds = YES;
    self.containerView.scrollEnabled = YES;
}
#pragma mark - ScrollView delegate methods
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    return self.picChosenImageView;
}
- (void)scrollViewDidZoom:(UIScrollView *)scrollView
{
    CGRect frame = self.picChosenImageView.frame;
    frame.origin = CGPointZero;
    self.picChosenImageView.frame = frame;
    //self.picChosenImageView.transform = prevTransform;
}
-(void)scrollViewWillBeginZooming:(UIScrollView *)scrollView withView:(UIView *)view
{
    if(!isZoomingStarted)
    {
        self.picChosenImageView.transform = CGAffineTransformRotate(self.picChosenImageView.transform, angle);
        NSLog(@"The zooming started");
        isZoomingStarted = TRUE;
        CGSize contentSize = self.containerView.bounds.size;
        CGRect contentFrame = self.containerView.bounds;
        NSLog(@"frame on start: %@", NSStringFromCGRect(contentFrame));
        NSLog(@"size on start: %@", NSStringFromCGSize(contentSize));
        //prevTransform = self.picChosenImageView.transform;
    }
}
-(void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale
{
    if(isZoomingStarted)
    {
        self.picChosenImageView.transform = CGAffineTransformRotate(self.picChosenImageView.transform, angle);
        isZoomingStarted = FALSE;
        CGSize contentSize = self.containerView.contentSize;
        CGRect contentFrame = self.containerView.bounds;
        NSLog(@"frame on end: %@", NSStringFromCGRect(contentFrame));
        NSLog(@"size on end: %@", NSStringFromCGSize(contentSize));
    }
}
#pragma mark - GestureRecognizer methods
- (void)handleRotate:(UIRotationGestureRecognizer *)recognizer
{
    if(isZoomingStarted == FALSE)
    {
        if([recognizer state] == UIGestureRecognizerStateBegan)
        {
            angle = 0.0f;
            [self lockZoom];
        }
        useRotation += recognizer.rotation;
        while(useRotation < -M_PI)
        {
            useRotation += M_PI * 2;
        }
        while(useRotation > M_PI)
        {
            useRotation -= M_PI * 2;
        }
        NSLog(@"The rotated value is %f", RADIANS_TO_DEGREES(useRotation));
        self.picChosenImageView.transform = CGAffineTransformRotate([self.picChosenImageView transform],
                                                                    recognizer.rotation);
        [recognizer setRotation:0];
        if([recognizer state] == UIGestureRecognizerStateEnded)
        {
            angle = useRotation;
            useRotation = 0.0f;
            isRotationStarted = FALSE;
            self.containerView.hidden = NO;
            //prevTransform = self.picChosenImageView.transform;
            [self unlockZoom];
        }
    }
}
My problem is: I can successfully zoom in and out, and I can rotate the UIImageView as I want. But after rotating the image view to a certain angle, when I try to zoom in, the image view snaps back to its original orientation (rotates itself back to zero degrees) and only then does the zooming happen. I want to retain the rotation while zooming. I tried saving the previous transform and reapplying it in the scrollViewDidZoom and scrollViewWillBeginZooming delegate methods; none of that worked. Please help me spot the mistake I am overlooking.

Try using CGAffineTransformScale instead of just resizing the frame for zooming:
anImage.transform = CGAffineTransformScale(anImage.transform, 2.0, 2.0);
Changing the transform for scaling might fix your rotation issue.
Hope this helps.

I had the same problem. UIScrollView takes control of the UIImageView and applies a transform without the rotation.
So I don't hand the image view to the scroll view for zooming, and I add a UIPinchGestureRecognizer for scaling instead.
func viewForZoomingInScrollView(scrollView: UIScrollView) -> UIView? {
    return nil
}
Dragging still works :)
// viewDidLoad
var pinchGestureRecognizer = UIPinchGestureRecognizer(target: self, action: #selector(pinchRecognized))
scrollView.addGestureRecognizer(pinchGestureRecognizer)

func pinchRecognized(sender: UIPinchGestureRecognizer) {
    if sender.state == .Began || sender.state == .Changed {
        let scale = sender.scale
        imageView.transform = CGAffineTransformScale(imageView.transform, scale, scale)
        sender.scale = 1
    }
}

Related

Users should be able to drag images and place them on an image view in iOS

I am creating an app in which users can design their own stage with the images present there. I have created buttons as images; now I want some code that enables the user to drag and drop the images (buttons) into a particular image view area.
You can achieve this with tap gestures (e.g. tap the first image view, then tap the destination image view), but it's not very intuitive. Doing a proper drag-and-drop with a UIPanGestureRecognizer will be far more intuitive for your end user.
Technically, you're not dragging an image from the image view; you'll actually be dragging the image view itself. (An image, by itself, has no visual representation without an image view.) And when you let go, you'll animate the swap of the image view frames to complete the illusion.
If you had an NSArray of image views, you could add a gesture to their superview:
@property (nonatomic, strong) NSMutableArray *imageViews;

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self createImageViewArray];
    UIPanGestureRecognizer *gesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
    [self.view addGestureRecognizer:gesture];
}
- (void)createImageViewArray
{
    self.imageViews = [NSMutableArray array];
}
- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
    static UIImageView *draggedImage = nil;
    static CGRect draggedImageOriginalFrame;
    CGPoint location = [gesture locationInView:gesture.view];
    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        draggedImage = [self determineImageForLocation:location];
        if (draggedImage)
        {
            draggedImageOriginalFrame = draggedImage.frame;
            [draggedImage.superview bringSubviewToFront:draggedImage];
        }
    }
    else if (gesture.state == UIGestureRecognizerStateChanged && draggedImage != nil)
    {
        CGPoint translation = [gesture translationInView:gesture.view];
        CGRect frame = draggedImageOriginalFrame;
        frame.origin.x += translation.x;
        frame.origin.y += translation.y;
        draggedImage.frame = frame;
    }
    else if (draggedImage != nil && (gesture.state == UIGestureRecognizerStateEnded ||
                                     gesture.state == UIGestureRecognizerStateCancelled ||
                                     gesture.state == UIGestureRecognizerStateFailed))
    {
        UIImageView *droppedOver = nil;
        if (gesture.state == UIGestureRecognizerStateEnded)
            droppedOver = [self draggedImageView:draggedImage toLocation:location];
        if (droppedOver == nil)
        {
            [UIView animateWithDuration:0.25
                             animations:^{
                                 draggedImage.frame = draggedImageOriginalFrame;
                             }];
        }
        else
        {
            [droppedOver.superview bringSubviewToFront:droppedOver];
            [UIView animateWithDuration:0.25
                             animations:^{
                                 draggedImage.frame = droppedOver.frame;
                                 droppedOver.frame = draggedImageOriginalFrame;
                             }];
        }
    }
}
- (UIImageView *)draggedImageView:(UIImageView *)draggedView toLocation:(CGPoint)location
{
    for (UIImageView *imageview in self.imageViews)
        if (CGRectContainsPoint(imageview.frame, location) && imageview != draggedView)
            return imageview;
    return nil;
}
- (UIImageView *)determineImageForLocation:(CGPoint)location
{
    return [self draggedImageView:nil toLocation:location];
}

Pinch gesture not working with imageView

I am trying to zoom an image view in and out. Here is the code:
- (void)pinch:(UIPinchGestureRecognizer *)gesture
{
    if (handSelected == YES)
    {
        if (gesture.state == UIGestureRecognizerStateEnded || gesture.state == UIGestureRecognizerStateChanged)
        {
            NSLog(@"gesture.scale = %f", gesture.scale);
            CGFloat currentScale = self.imgHand.frame.size.width / self.imgHand.bounds.size.width;
            CGFloat newScale = currentScale * gesture.scale;
            if (newScale < 1.0) {
                newScale = 1.0;
            }
            if (newScale > 4.0) {
                newScale = 4.0;
            }
            CGAffineTransform transform = CGAffineTransformMakeScale(newScale, newScale);
            self.imgHand.transform = transform;
            gesture.scale = 1;
        }
    }
}
- (void)viewDidLoad
{
    [super viewDidLoad];
    [self adjustRingPressed:self];
    self.view.multipleTouchEnabled = YES;
    self.imgHand.multipleTouchEnabled = YES;
    UIPinchGestureRecognizer *gst = [[UIPinchGestureRecognizer alloc] initWithTarget:self.imgHand action:@selector(pinch:)];
    [gst setDelegate:self];
    [self.imgHand addGestureRecognizer:gst];
}
It seems that my pinch code never runs.
Try adding:
self.imgHand.userInteractionEnabled = YES;
By default, UIImageView has userInteractionEnabled set to NO.
Make it a gesture handler like the template code below:
- (IBAction)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
    recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale);
    recognizer.scale = 1;
}

convertRect:toView: works differently on iPad Retina and iPad 2

I have code that works perfectly on an iPad 2 but scales wrongly on a Retina iPad.
I ran the app unchanged on both iPads and the behaviour is totally different.
On the Retina iPad the images come back to their original position and the transform doesn't take place.
The code takes a group of views, adds them to a temporary view, resizes the temporary view, and then adds those views back, so I can resize all of them at the same time.
- (IBAction)scaleParts:(UIPinchGestureRecognizer *)sender {
    if (sender.state == UIGestureRecognizerStateBegan) {
        self.tempCanvasView = [[UIView alloc] initWithFrame:self.canvas.bounds];
        self.tempCanvasView.backgroundColor = [UIColor redColor];
        for (UIView *view in [self.canvas subviews]) {
            CGRect currentFrame = view.bounds;
            CGRect newFrame = [view convertRect:currentFrame toView:self.tempCanvasView];
            view.bounds = newFrame;
            [self.tempCanvasView addSubview:view];
        }
        [self.canvas addSubview:self.tempCanvasView];
    } else if (sender.state == UIGestureRecognizerStateChanged) {
        self.tempCanvasView.transform = CGAffineTransformScale(self.tempCanvasView.transform, sender.scale, sender.scale);
        sender.scale = 1.0;
    } else if (sender.state == UIGestureRecognizerStateEnded) {
        for (UIView *view in [self.tempCanvasView subviews]) {
            CGRect currentFrame = view.bounds;
            CGRect newFrame = [view convertRect:currentFrame toView:self.canvas];
            view.frame = newFrame;
            [self.canvas addSubview:view];
        }
        [self.tempCanvasView removeFromSuperview];
        self.tempCanvasView = nil;
    }
}
Maybe you need to detect which screen scale the app is running at (scale == 1 for iPad 2, scale == 2 for Retina iPad):
CGFloat scale = [[UIScreen mainScreen] scale];
After that:
if (scale == 1) {
    // code from above
} else {
    // code from above with a small modification
}
Here is a correct method to do it:
SWIFT:
extension UIView {
    func convertRectCorrectly(rect: CGRect, toView view: UIView) -> CGRect {
        if UIScreen.mainScreen().scale == 1 {
            return self.convertRect(rect, toView: view)
        }
        else if self == view {
            return rect
        }
        else {
            var rectInParent = self.convertRect(rect, toView: self.superview)
            rectInParent.origin.x /= UIScreen.mainScreen().scale
            rectInParent.origin.y /= UIScreen.mainScreen().scale
            let superViewRect = self.superview!.convertRectCorrectly(self.superview!.frame, toView: view)
            rectInParent.origin.x += superViewRect.origin.x
            rectInParent.origin.y += superViewRect.origin.y
            return rectInParent
        }
    }
}

How to zoom while keeping the previous rotation transform as it is?

I have an imageView to which I have added a UIPinchGestureRecognizer and a UIRotationGestureRecognizer.
In the pinch gesture I scale the view, and in the rotation gesture I apply a rotation transform to it.
The problem is when I rotate the imageView and then start zooming: zooming always begins from the normal state.
What I want is: when I rotate it to, say, 30 degrees clockwise and then zoom, it should zoom while staying at 30 degrees clockwise.
Here is the code:
- (void)viewDidLoad{
    [super viewDidLoad];
    //setting up the image view
    mTotalRotation = 0.0;
    self.imageView.image = self.photo;
    self.imageView.userInteractionEnabled = YES;
    UIRotationGestureRecognizer *twoFingersRotate =
        [[[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(twoFingersRotate:)] autorelease];
    [self.imageView addGestureRecognizer:twoFingersRotate];
    UIPinchGestureRecognizer *pinchGesture = [[[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinchZoom:)] autorelease];
    [self.imageView addGestureRecognizer:pinchGesture];
    // Do any additional setup after loading the view from its nib.
}
// Rotation gesture handler
- (void)twoFingersRotate:(UIRotationGestureRecognizer *)recognizer
{
    if ([recognizer state] == UIGestureRecognizerStateEnded) {
        mTotalRotation += recognizer.rotation;
        return;
    }
    self.imageView.transform = CGAffineTransformMakeRotation(mTotalRotation + recognizer.rotation);
}
// Pinch gesture handler
-(void)pinchZoom:(UIPinchGestureRecognizer*)recognizer{
    self.imageView.transform = CGAffineTransformMakeScale(recognizer.scale, recognizer.scale);
}
Change the line:
self.imageView.transform = CGAffineTransformMakeRotation(mTotalRotation + recognizer.rotation);
with:
self.imageView.transform = CGAffineTransformRotate(self.imageView.transform, recognizer.rotation);
And the line:
self.imageView.transform = CGAffineTransformMakeScale(recognizer.scale, recognizer.scale);
with:
self.imageView.transform = CGAffineTransformScale(self.imageView.transform, recognizer.scale, recognizer.scale);
Edit
To limit the scale, you can do the following:
CGAffineTransform transform = self.imageView.transform;
float newScale = recognizer.scale * sqrt(transform.a * transform.a + transform.c * transform.c);
if (newScale > scaleLimit) {
    self.imageView.transform = CGAffineTransformScale(transform, recognizer.scale, recognizer.scale);
}
Maybe you could have your scale and rotate view as a subview of a view that you zoom?

Max/Min scale of pinch zoom in UIPinchGestureRecognizer - iPhone iOS

How can I limit the scale of the UIPinchGestureRecognizer to a min and max level? The scale property below seems to be relative to the last known state (the delta from the last state), and I can't figure out how to set a limit on the size/height of the object being zoomed.
-(void)scale:(id)sender {
    [self.view bringSubviewToFront:[(UIPinchGestureRecognizer*)sender view]];
    if([(UIPinchGestureRecognizer*)sender state] == UIGestureRecognizerStateEnded) {
        lastScale = 1.0;
        return;
    }
    CGFloat pinchscale = [(UIPinchGestureRecognizer*)sender scale];
    CGFloat scale = 1.0 - (lastScale - pinchscale);
    CGAffineTransform currentTransform = [(UIPinchGestureRecognizer*)sender view].transform;
    CGAffineTransform holderTransform = holderView.transform;
    CGAffineTransform newTransform = CGAffineTransformScale(currentTransform, scale, scale);
    [[(UIPinchGestureRecognizer*)sender view] setTransform:newTransform];
    lastScale = [(UIPinchGestureRecognizer*)sender scale];
}
Here is the solution that I figured out after using Anomie's answer as a starting point.
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)gestureRecognizer {
    if([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
        // Reset the last scale, necessary if there are multiple objects with different scales
        lastScale = [gestureRecognizer scale];
    }
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ||
        [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        CGFloat currentScale = [[[gestureRecognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];
        // Constants to adjust the max/min values of zoom
        const CGFloat kMaxScale = 2.0;
        const CGFloat kMinScale = 1.0;
        CGFloat newScale = 1 - (lastScale - [gestureRecognizer scale]);
        newScale = MIN(newScale, kMaxScale / currentScale);
        newScale = MAX(newScale, kMinScale / currentScale);
        CGAffineTransform transform = CGAffineTransformScale([[gestureRecognizer view] transform], newScale, newScale);
        [gestureRecognizer view].transform = transform;
        lastScale = [gestureRecognizer scale]; // Store the previous scale factor for the next pinch gesture call
    }
}
There isn't a way to limit the scale on the UIPinchGestureRecognizer itself. To limit the height in your code, you should be able to do something like this:
CGFloat scale = 1.0 - (lastScale - pinchscale);
CGRect bounds = [(UIPinchGestureRecognizer*)sender view].bounds;
scale = MIN(scale, maximumHeight / CGRectGetHeight(bounds));
scale = MAX(scale, minimumHeight / CGRectGetHeight(bounds));
To limit width, change 'Height' to 'Width' in the last two lines.
I took some info gleaned from Paul Solt's and Anomie's answers, and added it to an existing category I have for UIViewController that makes any UIView draggable, so it can now also be made pinchable using gestures and transforms.
Note: this dirties the tag property of the view you are making draggable/pinchable. So if you need the tag for something else, consider placing that value in the NSMutableDictionary used by this technique. It's available as [self dictForView:theView].
Implementing in your project:
You can make any subview within the view controller's view draggable or pinchable (or both).
Place a single line of code in your viewDidLoad (for example):
[self makeView:mySubView draggable:YES pinchable:YES minPinchScale:0.75 maxPinchScale:1.0];
Turn it off in viewDidUnload (releases gestures & dictionary):
[self makeView:mySubView draggable:NO pinchable:NO minPinchScale:1.0 maxPinchScale:1.0];
DragAndPinchScale.h file
#import <UIKit/UIKit.h>

@interface UIViewController (DragAndPinchScale)
-(void) makeView:(UIView*)aView
       draggable:(BOOL)draggable
       pinchable:(BOOL)pinchable
   minPinchScale:(CGFloat)minPinchScale
   maxPinchScale:(CGFloat)maxPinchScale;
-(NSMutableDictionary *) dictForView:(UIView *)theView;
-(NSMutableDictionary *) dictForViewGuestures:(UIGestureRecognizer *)guesture;
@end
DragAndPinchScale.m file
#import "DragAndPinchScale.h"

@implementation UIViewController (DragAndPinchScale)
-(NSMutableDictionary *) dictForView:(UIView *)theView{
    NSMutableDictionary *dict = (NSMutableDictionary*) (void*) theView.tag;
    if (!dict) {
        dict = [[NSMutableDictionary dictionary] retain];
        theView.tag = (NSInteger) (void *) dict;
    }
    return dict;
}
-(NSMutableDictionary *) dictForViewGuestures:(UIGestureRecognizer *)guesture {
    return [self dictForView:guesture.view];
}
- (IBAction)fingersDidPinchInPinchableView:(UIPinchGestureRecognizer *)fingers {
    NSMutableDictionary *dict = [self dictForViewGuestures:fingers];
    UIView *viewToZoom = fingers.view;
    CGFloat lastScale;
    if([fingers state] == UIGestureRecognizerStateBegan) {
        // Reset the last scale, necessary if there are multiple objects with different scales
        lastScale = [fingers scale];
    } else {
        lastScale = [[dict objectForKey:@"lastScale"] floatValue];
    }
    if ([fingers state] == UIGestureRecognizerStateBegan ||
        [fingers state] == UIGestureRecognizerStateChanged) {
        CGFloat currentScale = [[[fingers view].layer valueForKeyPath:@"transform.scale"] floatValue];
        // limits to adjust the max/min values of zoom
        CGFloat maxScale = [[dict objectForKey:@"maxScale"] floatValue];
        CGFloat minScale = [[dict objectForKey:@"minScale"] floatValue];
        CGFloat newScale = 1 - (lastScale - [fingers scale]);
        newScale = MIN(newScale, maxScale / currentScale);
        newScale = MAX(newScale, minScale / currentScale);
        CGAffineTransform transform = CGAffineTransformScale([[fingers view] transform], newScale, newScale);
        viewToZoom.transform = transform;
        lastScale = [fingers scale]; // Store the previous scale factor for the next pinch gesture call
    }
    [dict setObject:[NSNumber numberWithFloat:lastScale]
             forKey:@"lastScale"];
}
- (void)fingerDidMoveInDraggableView:(UIPanGestureRecognizer *)finger {
    NSMutableDictionary *dict = [self dictForViewGuestures:finger];
    UIView *viewToDrag = finger.view;
    if (finger.state == UIGestureRecognizerStateBegan) {
        [dict setObject:[NSValue valueWithCGPoint:viewToDrag.frame.origin]
                 forKey:@"startDragOffset"];
        [dict setObject:[NSValue valueWithCGPoint:[finger locationInView:self.view]]
                 forKey:@"startDragLocation"];
    }
    else if (finger.state == UIGestureRecognizerStateChanged) {
        NSMutableDictionary *dict = (NSMutableDictionary*) (void*) viewToDrag.tag;
        CGPoint stopLocation = [finger locationInView:self.view];
        CGPoint startDragLocation = [[dict valueForKey:@"startDragLocation"] CGPointValue];
        CGPoint startDragOffset = [[dict valueForKey:@"startDragOffset"] CGPointValue];
        CGFloat dx = stopLocation.x - startDragLocation.x;
        CGFloat dy = stopLocation.y - startDragLocation.y;
        // CGFloat distance = sqrt(dx*dx + dy*dy);
        CGRect dragFrame = viewToDrag.frame;
        CGSize selfViewSize = self.view.frame.size;
        if (!UIDeviceOrientationIsPortrait(self.interfaceOrientation)) {
            selfViewSize = CGSizeMake(selfViewSize.height, selfViewSize.width);
        }
        selfViewSize.width -= dragFrame.size.width;
        selfViewSize.height -= dragFrame.size.height;
        dragFrame.origin.x = MIN(selfViewSize.width, MAX(0, startDragOffset.x + dx));
        dragFrame.origin.y = MIN(selfViewSize.height, MAX(0, startDragOffset.y + dy));
        viewToDrag.frame = dragFrame;
    }
    else if (finger.state == UIGestureRecognizerStateEnded) {
        [dict removeObjectForKey:@"startDragLocation"];
        [dict removeObjectForKey:@"startDragOffset"];
    }
}
-(void) makeView:(UIView*)aView
       draggable:(BOOL)draggable
       pinchable:(BOOL)pinchable
   minPinchScale:(CGFloat)minPinchScale
   maxPinchScale:(CGFloat)maxPinchScale{
    NSMutableDictionary *dict = (NSMutableDictionary*) (void*) aView.tag;
    if (!(pinchable || draggable)) {
        if (dict){
            [dict release];
            aView.tag = 0;
        }
        return;
    }
    if (dict) {
        UIPanGestureRecognizer *pan = [dict objectForKey:@"UIPanGestureRecognizer"];
        if(pan){
            if ([aView.gestureRecognizers indexOfObject:pan] != NSNotFound) {
                [aView removeGestureRecognizer:pan];
            }
            [dict removeObjectForKey:@"UIPanGestureRecognizer"];
        }
        UIPinchGestureRecognizer *pinch = [dict objectForKey:@"UIPinchGestureRecognizer"];
        if(pinch){
            if ([aView.gestureRecognizers indexOfObject:pinch] != NSNotFound) {
                [aView removeGestureRecognizer:pinch];
            }
            [dict removeObjectForKey:@"UIPinchGestureRecognizer"];
        }
        [dict removeObjectForKey:@"startDragLocation"];
        [dict removeObjectForKey:@"startDragOffset"];
        [dict removeObjectForKey:@"lastScale"];
        [dict removeObjectForKey:@"minScale"];
        [dict removeObjectForKey:@"maxScale"];
    }
    if (draggable) {
        UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(fingerDidMoveInDraggableView:)];
        pan.minimumNumberOfTouches = 1;
        pan.maximumNumberOfTouches = 1;
        [aView addGestureRecognizer:pan];
        [pan release];
        dict = [self dictForViewGuestures:pan];
        [dict setObject:pan forKey:@"UIPanGestureRecognizer"];
    }
    if (pinchable) {
        CGAffineTransform initialTransform = CGAffineTransformMakeScale(1.0, 1.0);
        aView.transform = initialTransform;
        UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(fingersDidPinchInPinchableView:)];
        [aView addGestureRecognizer:pinch];
        [pinch release];
        dict = [self dictForViewGuestures:pinch];
        [dict setObject:pinch forKey:@"UIPinchGestureRecognizer"];
        [dict setObject:[NSNumber numberWithFloat:minPinchScale] forKey:@"minScale"];
        [dict setObject:[NSNumber numberWithFloat:maxPinchScale] forKey:@"maxScale"];
    }
}
@end
The problem with most of the other answers is that they are trying to deal with scale as a linear value, when in fact it is non-linear due to the way UIPinchGestureRecognizer calculates its scale property based on the touch distance. When this isn't taken into account, the user must use more or less pinch distance to 'undo' the scaling applied by a previous pinch gesture.
Consider: suppose transform.scale = 1.0 and I place my fingers 6cm apart on the screen, then pinch inwards to 3cm apart - the resulting gestureRecognizer.scale is 0.5, and 0.5-1.0 is -0.5, so transform.scale will become 1.0+(-0.5) = 0.5. Now, I lift my fingers, place them back down 3cm apart and pinch outwards to 6cm. The resulting gestureRecognizer.scale will be 2.0, and 2.0-1.0 is 1.0, so transform.scale will become 0.5+1.0 = 1.5. Not what I wanted to happen.
The fix is to calculate the delta pinch scale as a proportion of its previous value. I place my fingers down 6cm apart, and pinch inwards to 3cm, so gestureRecognizer.scale is 0.5. 0.5/1.0 is 0.5, so my new transform.scale is 1.0*0.5 = 0.5. Next, I place my fingers down 3cm apart, and pinch outwards to 6cm. gestureRecognizer.scale is then 2.0, and 2.0/1.0 is 2.0, so my new transform.scale is 0.5*2.0 = 1.0, which is exactly what I wanted to happen.
Here it is in code:
in -(void)viewDidLoad:
self.zoomGestureCurrentZoom = 1.0f;
in -(void)onZoomGesture:(UIPinchGestureRecognizer*)gestureRecognizer:
if ( gestureRecognizer.state == UIGestureRecognizerStateBegan )
{
    self.zoomGestureLastScale = gestureRecognizer.scale;
}
else if ( gestureRecognizer.state == UIGestureRecognizerStateChanged )
{
    // we have to jump through some hoops to clamp the scale in a way that makes the UX intuitive
    float scaleDeltaFactor = gestureRecognizer.scale / self.zoomGestureLastScale;
    float currentZoom = self.zoomGestureCurrentZoom;
    float newZoom = currentZoom * scaleDeltaFactor;
    // clamp
    float kMaxZoom = 4.0f;
    float kMinZoom = 0.5f;
    newZoom = MAX(kMinZoom, MIN(newZoom, kMaxZoom));
    self.view.transform = CGAffineTransformScale([[gestureRecognizer view] transform], newZoom, newZoom);
    // store for next time
    self.zoomGestureCurrentZoom = newZoom;
    self.zoomGestureLastScale = gestureRecognizer.scale;
}
Thanks, really useful code snippet above for clamping to a minimum and maximum scale.
I found that when I first flipped the view using:
CGAffineTransformScale(gestureRecognizer.view.transform, -1.0, 1.0);
it would cause a flicker when scaling the view.
Let me know what you think, but the solution for me was to update the code sample above so that, if the view has been flipped (a flag set via a property), the scale value is inverted:
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged)
{
    CGFloat currentScale = [[[gestureRecognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];
    if(self.isFlipped) // (inverting)
    {
        currentScale *= -1;
    }
    CGFloat newScale = 1 - (self.lastScale - [gestureRecognizer scale]);
    newScale = MIN(newScale, self.maximumScaleFactor / currentScale);
    newScale = MAX(newScale, self.minimumScaleFactor / currentScale);
    CGAffineTransform transform = CGAffineTransformScale([[gestureRecognizer view] transform], newScale, newScale);
    gestureRecognizer.view.transform = transform;
    self.lastScale = [gestureRecognizer scale]; // Store the previous scale factor for the next pinch gesture call
}
Method 1
gestureRecognizer.scale starts at 1.0 at the beginning of the pinch (gestureRecognizer.state == .began), and in later states (.changed or .ended) it is always relative to that. For example, if the view size is view_size at the beginning of the pinch (which might not be the original size orig_view_size), gestureRecognizer.scale starts at 1.0, and if it later becomes 2.0 the view's size will be 2 * view_size; the scale is always relative to the size when the pinch started.
We can get the scale at the beginning of the pinch (gestureRecognizer.state == .began) with lastScale = self.imageView.frame.width / self.imageView.bounds.size.width, so the scale relative to the original image is lastScale * gestureRecognizer.scale.
lastScale: the scale left over from the last round of pinching (a round runs from state .began to state .ended), relative to the original view size.
gestureRecognizer.scale: current scale, relative to the view size after the last round of pinching.
currentScale: current scale, relative to the original view size.
newScale: new scale, relative to the original view size. newScale = lastScale * gestureRecognizer.scale, and you can limit the view's scale by comparing the limits against newScale.
```
var lastScale: CGFloat = 1.0

@objc func handlePinch(_ gestureRecognizer: UIPinchGestureRecognizer) {
    var newScale = gestureRecognizer.scale
    if gestureRecognizer.state == .began {
        lastScale = self.imageView.frame.width / self.imageView.bounds.size.width
    }
    newScale = newScale * lastScale
    if newScale < minScale {
        newScale = minScale
    } else if newScale > maxScale {
        newScale = maxScale
    }
    let currentScale = self.imageView.frame.width / self.imageView.bounds.size.width
    self.imageView.transform = CGAffineTransform(scaleX: newScale, y: newScale)
    print("last scale: \(lastScale), current scale: \(currentScale), new scale: \(newScale), gestureRecognizer.scale: \(gestureRecognizer.scale)")
}
```
Method 2
gestureRecognizer.scale starts at 1.0 on each pinch notification. This requires you to reset gestureRecognizer.scale = 1 at the end of each notification handler, so gestureRecognizer.scale is then relative to the view size as of the last pinch notification, NOT the view size at the beginning of the pinch. This is the most important difference from method 1. And since we no longer rely on the scale from the last round, we don't need lastScale anymore.
currentScale: current scale, relative to the original view size.
gestureRecognizer.scale: new scale, relative to the view size as of the last pinch notification (not the last round); the scale relative to the original view size is currentScale * gestureRecognizer.scale.
We now use transform.scaledBy, which applies the scale relative to the view size as of the last pinch notification (not the last round).
```
@objc func handlePinch(_ gestureRecognizer: UIPinchGestureRecognizer) {
    let currentScale = self.imageView.frame.width / self.imageView.bounds.size.width
    var newScale = gestureRecognizer.scale
    if currentScale * gestureRecognizer.scale < minScale {
        newScale = minScale / currentScale
    } else if currentScale * gestureRecognizer.scale > maxScale {
        newScale = maxScale / currentScale
    }
    self.imageView.transform = self.imageView.transform.scaledBy(x: newScale, y: newScale)
    print("current scale: \(currentScale), new scale: \(newScale)")
    gestureRecognizer.scale = 1
}
```
Other approaches mentioned here did not work for me, but by taking a couple of things from previous answers and (in my opinion) simplifying them, I got this working. effectiveScale is an ivar set to 1.0 in viewDidLoad.
-(void)zoomScale:(UIPinchGestureRecognizer *)recognizer
{
    if([recognizer state] == UIGestureRecognizerStateEnded) {
        // Reset last scale
        lastScale = 1.0;
        return;
    }
    if ([recognizer state] == UIGestureRecognizerStateBegan ||
        [recognizer state] == UIGestureRecognizerStateChanged) {
        CGFloat pinchscale = [recognizer scale];
        CGFloat scaleDiff = pinchscale - lastScale;
        if (scaleDiff < 0)
            scaleDiff *= 2; // speed up zoom-out
        else
            scaleDiff *= 0.7; // slow down zoom-in
        effectiveScale += scaleDiff;
        // Limit scale between 1 and 2
        effectiveScale = effectiveScale < 1 ? 1 : effectiveScale;
        effectiveScale = effectiveScale > 2 ? 2 : effectiveScale;
        // Handle transform in separate method using new effectiveScale
        [self makeAndApplyAffineTransform];
        lastScale = pinchscale;
    }
}
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer{
    //recognizer.scale=1;
    CGFloat pinchScale = recognizer.scale;
    pinchScale = round(pinchScale * 1000) / 1000.0;
    NSLog(@"%lf", pinchScale);
    if (pinchScale < 1)
    {
        currentLabel.font = [UIFont fontWithName:currentLabel.font.fontName
                                            size:(currentLabel.font.pointSize - pinchScale)];
        recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale);
        [currentLabel sizeToFit];
        recognizer.scale = 1;
    }
    else
    {
        currentLabel.font = [UIFont fontWithName:currentLabel.font.fontName
                                            size:(currentLabel.font.pointSize + pinchScale)];
        recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale);
        [currentLabel sizeToFit];
        recognizer.scale = 1;
    }
    //currentLabel.adjustsFontSizeToFitWidth = YES;
    // [currentLabel sizeToFit];
    NSLog(@"Font: %@", label.font);
}
- (void)pinchToZoom:(UIPinchGestureRecognizer*)gesture
{
    switch (gesture.state)
    {
        case UIGestureRecognizerStateBegan:
        {
            lastScale = gesture.scale;
        } break;
        case UIGestureRecognizerStateChanged:
        {
            const CGFloat zoomSensitivity = 5;
            const CGFloat zoomMin = 1;
            const CGFloat zoomMax = 16;
            CGFloat objectScale = gesture.view.contentScaleFactor;
            CGFloat zoomDiff = lastScale - gesture.scale;
            CGFloat zoomDirty = objectScale - zoomDiff * zoomSensitivity;
            CGFloat zoomTo = fmaxf(zoomMin, fminf(zoomDirty, zoomMax));
            // step round if needed (neutralize elusive changes)
            zoomTo = (NSInteger)(zoomTo * 10) * 0.1;
            if ( objectScale != zoomTo )
                gesture.view.contentScaleFactor = zoomTo;
            lastScale = gesture.scale;
        } break;
        default:
            break;
    }
}
I took @Paul Solt's solution, which is great by the way, and adapted it to Swift, for those interested:
@objc func pinchUpdated(recognizer: UIPinchGestureRecognizer) {
    if recognizer.state == .began {
        // Reset the last scale, necessary if there are multiple objects with different scales
        lastScale = recognizer.scale
    }
    if recognizer.state == .began || recognizer.state == .changed {
        let currentScale = recognizer.view!.layer.value(forKeyPath: "transform.scale") as! CGFloat
        // Constants to adjust the max/min values of zoom
        let maxScale: CGFloat = 4.0
        let minScale: CGFloat = 0.9
        var newScale: CGFloat = 1 - (lastScale - recognizer.scale)
        newScale = min(newScale, maxScale / currentScale)
        newScale = max(newScale, minScale / currentScale)
        recognizer.view!.transform = recognizer.view!.transform.scaledBy(x: newScale, y: newScale)
        lastScale = recognizer.scale // Store the previous scale factor for the next pinch gesture call
    }
}
Can you use a scroll view instead? Then you could use scrollView.minimumZoomScale and scrollView.maximumZoomScale