Gesture problem: UISwipeGestureRecognizer + UISlider - iPhone

Got a gesture-related problem. I implemented a UISwipeGestureRecognizer to get swipe left and right events, and that is working fine. However, the problem I'm facing is that the UISliders I have in the same view are not playing nice: the sliding motion of the sliders is being mistaken for a left/right swipe.
Has anyone experienced this problem before, and got any ideas how to correct it?
Many thanks.
Here is the code contained within the view controller:
- (void)viewDidLoad {
    [super viewDidLoad];

    // Set up handling of LEFT and RIGHT swipes
    UISwipeGestureRecognizer *recognizer;

    recognizer = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeFrom:)];
    [recognizer setDirection:UISwipeGestureRecognizerDirectionRight];
    [[self view] addGestureRecognizer:recognizer];
    [recognizer release];

    recognizer = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeFrom:)];
    [recognizer setDirection:UISwipeGestureRecognizerDirectionLeft];
    [[self view] addGestureRecognizer:recognizer];
    [recognizer release];
}
- (void)handleSwipeFrom:(UISwipeGestureRecognizer *)recognizer {
    if (recognizer.direction == UISwipeGestureRecognizerDirectionRight) {
        NSLog(@"Swipe Right");
        // Do stuff
    }
    if (recognizer.direction == UISwipeGestureRecognizerDirectionLeft) {
        NSLog(@"Swipe Left");
        // Do stuff
    }
}

The simplest way to handle this is probably to prevent the gesture recognizer from seeing touches on your slider. You can do that by setting yourself as the delegate and then implementing:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    if ([touch.view isKindOfClass:[UISlider class]]) {
        // prevent recognizing touches on the slider
        return NO;
    }
    return YES;
}
If this doesn't work, it's possible the slider actually has subviews that receive the touch, so you could walk up the superview chain, testing each view along the way.
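If you do need to walk the chain, a minimal sketch (untested, in the same delegate method) could look like this:

```objc
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // Walk up from the touched view: if any ancestor is a UISlider,
    // the touch started inside the slider's subview hierarchy.
    UIView *view = touch.view;
    while (view != nil) {
        if ([view isKindOfClass:[UISlider class]]) {
            return NO; // don't let the swipe recognizer see slider touches
        }
        view = view.superview;
    }
    return YES;
}
```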

Swift 4.0 version. Don't forget to conform to UIGestureRecognizerDelegate.
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldReceive touch: UITouch) -> Bool {
    if let touchedView = touch.view, touchedView.isKind(of: UISlider.self) {
        return false
    }
    return true
}

I ended up getting this working just before Lily responded above. Here is the code I used, but Lily's looks cleaner (I haven't tested Lily's, though):
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    BOOL allowSwipes = YES;
    CGPoint point1 = [touch locationInView:_sliderLeft];
    CGPoint point2 = [touch locationInView:_sliderRight];

    // Left slider test
    if ([_sliderLeft hitTest:point1 withEvent:nil] != nil) {
        allowSwipes = NO;
        NSLog(@"On Left Slider");
    }

    // Right slider test
    if ([_sliderRight hitTest:point2 withEvent:nil] != nil) {
        allowSwipes = NO;
        NSLog(@"On Right Slider");
    }

    return allowSwipes;
}

Related

How to get touch position of screen for any control objective c?

Hi, I want to get the touch position/point of any control, or anywhere a touch happens.
I have implemented the following, but I am not getting the correct touch points.
// Create gesture recognizer; notice the selector method
UITapGestureRecognizer *oneFingerTwoTaps =
    [[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(oneFingerTwoTaps)] autorelease];
oneFingerTwoTaps.delegate = self;

// Set required taps and number of touches
[oneFingerTwoTaps setNumberOfTapsRequired:1];
[oneFingerTwoTaps setNumberOfTouchesRequired:1];
[[self view] addGestureRecognizer:oneFingerTwoTaps];
And
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    CGPoint point = [touch locationInView:touch.view];
    NSLog(@"Point - %f, %f", point.x, point.y);
    NSLog(@"Touch");
    return NO; // handle the touch
}
When I try to hit any UIButton, UIImage, or UITableView, it's not giving me the correct point of the hit. Is there anything I am doing wrong? Please help me. Thank you.
Your code prints the location of the touch in the view that it occurred in. So if you touch a 50x100-sized button in the middle, it will print "Point 25.0, 50.0".
If you want the touch location on the screen, you have to convert the value:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    CGPoint point = [touch locationInView:touch.view];
    CGPoint pointOnScreen = [touch.view convertPoint:point toView:nil];
    NSLog(@"Point - %f, %f", pointOnScreen.x, pointOnScreen.y);
    NSLog(@"Touch");
    return NO; // handle the touch
}
Or just immediately get the coordinate in the window (screen) space:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    CGPoint pointOnScreen = [touch locationInView:nil];
    NSLog(@"Point - %f, %f", pointOnScreen.x, pointOnScreen.y);
    NSLog(@"Touch");
    return NO; // handle the touch
}
This is the Swift version of @DrummerB's answer.
// add as property
var tapRecognizer: UIGestureRecognizer!

// add in viewDidLoad()
tapRecognizer = UIGestureRecognizer()
tapRecognizer.delegate = self // conform to UIGestureRecognizerDelegate
self.view.addGestureRecognizer(tapRecognizer)

// listen for the delegate method
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldReceive touch: UITouch) -> Bool {
    let pointOnScreen: CGPoint = touch.location(in: nil)
    print("Point - \(pointOnScreen.x), \(pointOnScreen.y)")
    print("Touch")
    return false
}

// this worked for me: Xcode 9.2, iOS 11.2

Is this a safe way to process gestures using one of the UIGestureRecognizers?

Wondering if the following is ok to process UIGestureRecognizer touches:
if (pinchGestureRecognizer.state == UIGestureRecognizerStateBegan ||
    pinchGestureRecognizer.state == UIGestureRecognizerStateChanged)
{
    // process touches
}
else if (pinchGestureRecognizer.state == UIGestureRecognizerStateEnded)
{
    // do whatever after the gesture recognizer finishes
}
Basically, I'm wondering if UIGestureRecognizerStateEnded occurs should the UIGestureRecognizer still process touches, or at that point have all the touches finished? I'm getting weird values for translationInView so just wanted to ask here.
You asked:
Basically, I'm wondering if UIGestureRecognizerStateEnded occurs should the UIGestureRecognizer still process touches, or at that point have all the touches finished?
When you get UIGestureRecognizerStateEnded, yes, the gesture is done. But obviously, unless you remove the gesture recognizer from the view at that point, if the user starts a new gesture, the recognition process starts over again at UIGestureRecognizerStateBegan.
Furthermore, you said:
I'm getting weird values for translationInView so just wanted to ask here.
Your code sample suggests that you're dealing with a pinch gesture, which doesn't do translationInView, so I'm not sure what "weird values" you're getting. You can, though, have two simultaneous gestures by setting your gestures' delegate and implementing shouldRecognizeSimultaneouslyWithGestureRecognizer:
- (void)viewDidLoad
{
    [super viewDidLoad];

    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                                                                action:@selector(handlePinch:)];
    pinch.delegate = self;
    [self.view addGestureRecognizer:pinch];

    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                          action:@selector(handlePan:)];
    pan.delegate = self;
    [self.view addGestureRecognizer:pan];
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    if ([gestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]] && [otherGestureRecognizer isKindOfClass:[UIPinchGestureRecognizer class]])
        return YES;
    if ([gestureRecognizer isKindOfClass:[UIPinchGestureRecognizer class]] && [otherGestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]])
        return YES;
    return NO;
}
- (void)handlePinch:(UIPinchGestureRecognizer *)gesture
{
    CGFloat scale = [gesture scale];
    NSLog(@"%s: %@: scale=%.2f", __FUNCTION__, [self stringFromGestureState:gesture.state], scale);
}

- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
    CGPoint translation = [gesture translationInView:gesture.view];
    NSLog(@"%s: %@: translation=%@", __FUNCTION__, [self stringFromGestureState:gesture.state], NSStringFromCGPoint(translation));
}
The above code works, where the handlePan returns the pan, and the handlePinch returns the pinch, and the translationInView of handlePan looks unexceptional. Perhaps you can show us how you're using a pinch gesture and getting translationInView and tell us what's odd in the values you're getting.

Zoom with UIScrollView and rotate with UIRotationGestureRecognizer

I'm trying to have a UIImageView rotate inside a UIScrollView, but when I try to zoom/unzoom, my rotation resets back to 0.
Here's the code for my rotation:
- (void)rotateImage:(UIRotationGestureRecognizer *)rotate
{
    if ([rotate state] == UIGestureRecognizerStateEnded)
    {
        rotateAngle += [rotate rotation];
        return;
    }
    myView.transform = CGAffineTransformMakeRotation(rotateAngle + [rotate rotation]);
}
Concerning the UIScrollView, I just return myView in -(UIView*)viewForZoomingInScrollView:
And a last information, in my interface builder, this is my view stack :
UIImageView
UIView (myView)
UIScrollView
Which means that I have a UIView between the UIImageView and the UIScrollView
I would rather suggest you handle zoom using a pinch gesture; it will look neater and more even. Add a pinch gesture to the view, and for zooming add the following code to its selector method:
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
    myView.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale);
    recognizer.scale = 1;
}
For rotating:
- (void)handleRotate:(UIRotationGestureRecognizer *)rec
{
    myView.transform = CGAffineTransformRotate(rec.view.transform, rec.rotation);
    rec.rotation = 0;
}
Make sure you declare self as the delegate for both gestures, and implement the following delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
Remove myView from the -(UIView *)viewForZoomingInScrollView: method before implementing my solution; let the gesture alone handle the zooming, not the scroll view. :) Good luck
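As a sketch of that last point (untested, assuming this controller is the scroll view's delegate), returning nil from viewForZoomingInScrollView: opts the scroll view out of zooming entirely:

```objc
// With no zoomable view, the pinch/rotate gestures alone drive myView's transform.
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    return nil;
}
```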
For the pinch gesture, use the code from the Apple library.
Don't forget to add UIGestureRecognizerDelegate and UIScrollViewDelegate.
self.ScrollView = [[UIScrollView alloc] init];
self.ScrollView.frame = CGRectMake(0, 0, DEVICE_WIDTH, DEVICE_HEIGHT);
[self.view addSubview:self.ScrollView];

// for pinch gesture
self.ScrollView.minimumZoomScale = 0.5;
self.ScrollView.maximumZoomScale = 6.0;
self.ScrollView.contentSize = CGSizeMake(imageView.frame.size.width, imageView.frame.size.height);
self.ScrollView.delegate = self;
[self.ScrollView addSubview:imageView];

- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    return imageView;
}
For rotation:
UIRotationGestureRecognizer *rotation = [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(handleRotate:)];
rotation.delegate = self;
[self.ScrollView addGestureRecognizer:rotation];

- (IBAction)handleRotate:(UIRotationGestureRecognizer *)recognizer {
    recognizer.view.transform = CGAffineTransformRotate(recognizer.view.transform, recognizer.rotation);
    recognizer.rotation = 0;
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

LongPress on UIImageView, create a clone "ghost" image, then drag it around

I have a sample project where I am trying to test out this behavior. I have two UIImageViews side by side. I want to be able to longPress on either the left or right UIImageView, create a semi-transparent clone image, and drag it to the other UIImageView to swap images.
For example, to swap the images, the user could do:
Tap and hold on the left UIImageView
A cloned, smaller "ghost" image will appear at the touch coordinate
The user will drag the cloned image to the right UIImageView
The user will release their finger from the screen to "drop" the cloned image
The left and right UIImageViews can then swap their images.
Here are some pics to illustrate:
Original state:
http://d.pr/i/PNVc
After long press on left-side UIImageView with smaller cloned image added as subview:
http://d.pr/i/jwxj
I can detect the longpress and make the cloned image, but I cannot pan that image around unless I release my finger and do another touch on the screen.
I'd like to be able to do it all in one motion, without the user needing to remove their finger from the screen.
I don't know if this is the right approach, but this is how I'm doing it for now. Thanks for any help!
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [self addLongPressGestureToPiece:leftImageView];
    [self addLongPressGestureToPiece:rightImageView];
}
- (void)addLongPressGestureToPiece:(UIView *)piece
{
    NSLog(@"addLongPressGestureToPiece");
    UILongPressGestureRecognizer *longPressGesture = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(longPressPiece:)];
    [longPressGesture setDelegate:self];
    [piece addGestureRecognizer:longPressGesture];
    [longPressGesture release];
}
- (void)addPanGestureRecognizerToPiece:(UIView *)piece
{
    NSLog(@"addPanGestureRecognizerToPiece");
    UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panPiece:)];
    [panGesture setMaximumNumberOfTouches:1];
    [panGesture setDelegate:self];
    [piece addGestureRecognizer:panGesture];
    [panGesture release];
}
- (void)longPressPiece:(UILongPressGestureRecognizer *)gestureRecognizer
{
    UIImageView *piece = (UIImageView *)[gestureRecognizer view];
    CGPoint point = [gestureRecognizer locationInView:self.view];

    if (gestureRecognizer.state == UIGestureRecognizerStateBegan)
    {
        NSLog(@"UIGestureRecognizerStateBegan");
        // create the semi-transparent image view with the selected pic
        UIImage *longPressImage = [piece image];
        UIImageView *draggableImageView = [[UIImageView alloc] initWithFrame:CGRectMake(point.x - longPressImage.size.width/6/2, point.y - longPressImage.size.height/6/2, longPressImage.size.width/6, longPressImage.size.height/6)];
        draggableImageView.image = longPressImage;
        draggableImageView.alpha = 0.5;
        draggableImageView.userInteractionEnabled = YES;
        [self.view addSubview:draggableImageView];
        [self addPanGestureRecognizerToPiece:draggableImageView];
        photoView.userInteractionEnabled = NO;
    }
    else if (gestureRecognizer.state == UIGestureRecognizerStateChanged)
    {
        NSLog(@"Changed");
    }
    else if (gestureRecognizer.state == UIGestureRecognizerStateEnded)
    {
        NSLog(@"Ended");
        photoView.userInteractionEnabled = YES;
    }
}
- (void)adjustAnchorPointForGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
    NSLog(@"adjustAnchorPointForGestureRecognizer");
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
        UIView *piece = gestureRecognizer.view;
        CGPoint locationInView = [gestureRecognizer locationInView:piece];
        CGPoint locationInSuperview = [gestureRecognizer locationInView:piece.superview];
        piece.layer.anchorPoint = CGPointMake(locationInView.x / piece.bounds.size.width, locationInView.y / piece.bounds.size.height);
        piece.center = locationInSuperview;
    }
}
- (void)panPiece:(UIPanGestureRecognizer *)gestureRecognizer
{
    NSLog(@"pan piece");
    UIView *piece = [gestureRecognizer view];
    [self adjustAnchorPointForGestureRecognizer:gestureRecognizer];
    CGPoint translation = [gestureRecognizer translationInView:[piece superview]];
    // if velocity.y is positive, the user is moving down; if negative, moving up
    CGPoint velocity = [gestureRecognizer velocityInView:[piece superview]];

    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged)
    {
        [piece setCenter:CGPointMake([piece center].x + translation.x, [piece center].y + translation.y)];
        [gestureRecognizer setTranslation:CGPointZero inView:[piece superview]];
    }
    else if ([gestureRecognizer state] == UIGestureRecognizerStateEnded)
    {
        NSLog(@"piece y %f", piece.frame.origin.y);
    }
}
This is probably because the touch was already detected before the image (and thus the gesture recognizer) was created. I would suggest creating the ghost images beforehand, but have them hidden. That way, the gesture recognizer is actually fired as the ghost image is already there.
This may be expensive though, if you have a lot of images. If performance is bad, you can also consider just creating the image views, but only setting the images for those views when they are touched. That way you don't have to load the images into memory unnecessarily.
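A rough sketch of the pre-created hidden ghost idea (untested; ghostImageView is a hypothetical property on the controller, and the 50-point frame is arbitrary):

```objc
// In viewDidLoad, after the long-press setup: create the ghost once, hidden,
// with its pan recognizer already attached so the first touch can drive it.
ghostImageView = [[UIImageView alloc] initWithFrame:CGRectZero];
ghostImageView.alpha = 0.5;
ghostImageView.hidden = YES;
ghostImageView.userInteractionEnabled = YES;
[self.view addSubview:ghostImageView];
[self addPanGestureRecognizerToPiece:ghostImageView];

// In longPressPiece:, on UIGestureRecognizerStateBegan, only configure and show it:
ghostImageView.image = [piece image];
ghostImageView.frame = CGRectMake(point.x - 25.0, point.y - 25.0, 50.0, 50.0);
ghostImageView.hidden = NO;
```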
For me, I discovered I was getting ghost images because longGestureAction was being called multiple times for the .began state. While I knew that .changed would be continuous, I found (and the docs confirmed) that each state can occur multiple times. So in the .began state, when you're creating the image from a graphics context, it's getting interrupted with multiple calls, leading to ghosts where the old one was. Put a guard in your code as I did to prevent this from occurring. Problem solved.
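The guard can be as simple as a flag (sketch; ghostCreated is a hypothetical BOOL property):

```objc
if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
    if (self.ghostCreated) {
        return; // the Began state may be reported more than once; build only one ghost
    }
    self.ghostCreated = YES;
    // ... create the ghost image here ...
}
else if (gestureRecognizer.state == UIGestureRecognizerStateEnded) {
    self.ghostCreated = NO; // allow the next long press to create a new ghost
}
```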

UIImageView Gestures (Zoom, Rotate) Question

I would like to perform two operations on a UIImageView, zoom and rotate, but I have two problems:
A. I apply the zoom operation, for example, and when I try to rotate, the UIImageView is reset to its initial size. I would like to know how to keep the zoomed UIImageView and rotate from the zoomed image.
B. I would like to combine the zoom operation with rotation, and I don't know how to implement this:
- (void)viewDidLoad
{
    foo = [[UIImageView alloc] initWithFrame:CGRectMake(100.0, 100.0, 600.0, 800.0)];
    foo.userInteractionEnabled = YES;
    foo.multipleTouchEnabled = YES;
    foo.image = [UIImage imageNamed:@"earth.jpg"];
    foo.contentMode = UIViewContentModeScaleAspectFit;
    foo.clipsToBounds = YES;
    [self.view addSubview:foo];
}
//---pinch gesture---
UIPinchGestureRecognizer *pinchGesture =
    [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinchGesture:)];
[foo addGestureRecognizer:pinchGesture];
[pinchGesture release];

//---rotate gesture---
UIRotationGestureRecognizer *rotateGesture =
    [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(handleRotateGesture:)];
[foo addGestureRecognizer:rotateGesture];
[rotateGesture release];
//---handle pinch gesture---
- (IBAction)handlePinchGesture:(UIGestureRecognizer *)sender {
    NSLog(@"Pinch");
    CGFloat factor = [(UIPinchGestureRecognizer *)sender scale];
    if (factor > 1) {
        //---zooming in---
        sender.view.transform = CGAffineTransformMakeScale(
            lastScaleFactor + (factor - 1),
            lastScaleFactor + (factor - 1));
    }
    else {
        //---zooming out---
        sender.view.transform = CGAffineTransformMakeScale(lastScaleFactor * factor, lastScaleFactor * factor);
    }
    if (sender.state == UIGestureRecognizerStateEnded) {
        if (factor > 1) {
            lastScaleFactor += (factor - 1);
        } else {
            lastScaleFactor *= factor;
        }
    }
}

//---handle rotate gesture---
- (IBAction)handleRotateGesture:(UIGestureRecognizer *)sender {
    CGFloat rotation = [(UIRotationGestureRecognizer *)sender rotation];
    CGAffineTransform transform = CGAffineTransformMakeRotation(rotation + netRotation);
    sender.view.transform = transform;
    if (sender.state == UIGestureRecognizerStateEnded) {
        netRotation += rotation;
    }
}
Thanks
Hope this can be helpful to you; this is how I usually implement gesture recognizers:
UIRotationGestureRecognizer *rotationGesture = [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(rotatePiece:)];
[piece addGestureRecognizer:rotationGesture];
[rotationGesture release];

UIPinchGestureRecognizer *pinchGesture = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(scalePiece:)];
[pinchGesture setDelegate:self];
[piece addGestureRecognizer:pinchGesture];
[pinchGesture release];
Rotate method: reset the gesture recognizer's rotation to 0 after applying it, so the next callback is a delta from the current rotation.
- (void)rotatePiece:(UIRotationGestureRecognizer *)gestureRecognizer {
    [self adjustAnchorPointForGestureRecognizer:gestureRecognizer];
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        [gestureRecognizer view].transform = CGAffineTransformRotate([[gestureRecognizer view] transform], [gestureRecognizer rotation]);
        [gestureRecognizer setRotation:0];
    }
}
Scale method: at the end, reset the gesture recognizer's scale to 1 after applying it, so the next callback is a delta from the current scale.
- (void)scalePiece:(UIPinchGestureRecognizer *)gestureRecognizer {
    [self adjustAnchorPointForGestureRecognizer:gestureRecognizer];
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        [gestureRecognizer view].transform = CGAffineTransformScale([[gestureRecognizer view] transform], [gestureRecognizer scale], [gestureRecognizer scale]);
        [gestureRecognizer setScale:1];
    }
}
Then ensure that the pinch, pan, and rotate gesture recognizers on a particular view can all recognize simultaneously, while preventing other gesture recognizers from recognizing simultaneously:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // if the gesture recognizers are on different views, don't allow simultaneous recognition
    if (gestureRecognizer.view != otherGestureRecognizer.view)
        return NO;
    // if either of the gesture recognizers is the long press, don't allow simultaneous recognition
    if ([gestureRecognizer isKindOfClass:[UILongPressGestureRecognizer class]] || [otherGestureRecognizer isKindOfClass:[UILongPressGestureRecognizer class]])
        return NO;
    return YES;
}
Scale and rotation transforms are applied relative to the layer's anchor point; this method moves a gesture recognizer's view's anchor point between the user's fingers:
- (void)adjustAnchorPointForGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
        UIView *piece = gestureRecognizer.view;
        CGPoint locationInView = [gestureRecognizer locationInView:piece];
        CGPoint locationInSuperview = [gestureRecognizer locationInView:piece.superview];
        piece.layer.anchorPoint = CGPointMake(locationInView.x / piece.bounds.size.width, locationInView.y / piece.bounds.size.height);
        piece.center = locationInSuperview;
    }
}
Just implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: in your delegate.
I have a UIPinchGestureRecognizer, a UIPanGestureRecognizer and a UIRotationGestureRecognizer set up and I want them all to work at the same time. I also have a UITapGestureRecognizer which I do not want to be recognized simultaneously. All I did was this:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    if (![gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]] && ![otherGestureRecognizer isKindOfClass:[UITapGestureRecognizer class]]) {
        return YES;
    }
    return NO;
}
I found something that may interest you on the Stanford University website:
http://www.stanford.edu/class/cs193p/cgi-bin/drupal/downloads-2010-winter
On this site you will need to scroll down until you see number 14: "Title: Lecture #14 - MultiTouch".
Download the "14_MultiTouchDemo.zip".
In this example you can scale and rotate every image at the same time.
Hope I helped :)
When you use CGAffineTransformMakeScale, you reset the view's transform to the identity every time you use it, and you lose the previous transform information.
Try using CGAffineTransformScale(view.transform, scale, scale) for the pinch zooming. You will need to retain the original frame size to keep the zooming under control, though.
see: How can I use pinch zoom(UIPinchGestureRecognizer) to change width of a UITextView?
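As a sketch (untested), the question's pinch handler could compose with the view's current transform and reset the scale on each callback, so earlier rotation and zoom are preserved:

```objc
//---handle pinch gesture, composing with the current transform---
- (IBAction)handlePinchGesture:(UIPinchGestureRecognizer *)sender {
    if (sender.state == UIGestureRecognizerStateBegan ||
        sender.state == UIGestureRecognizerStateChanged) {
        // Compose with the existing transform instead of starting from identity,
        // so prior zoom/rotation is not thrown away.
        sender.view.transform = CGAffineTransformScale(sender.view.transform, sender.scale, sender.scale);
        sender.scale = 1; // reset so the next callback delivers a delta
    }
}
```

With this approach the lastScaleFactor bookkeeping from the question is no longer needed.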
For rotation similarly:
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
    view.transform = CGAffineTransformRotate([view transform], [gestureRecognizer rotation]);
    [gestureRecognizer setRotation:0];
}
I know this is a pretty old thread. I came across this UIImageView subclass, which works nicely for zoom, rotate, and pan. It uses gesture recognizers on an image view; I am using it in one of my apps.
ZoomRotatePanImageView