I'm using gesture recognizer delegates for pinch, pan, rotate, and long-press on images; UIPinchGestureRecognizer handles the pinching.
Zooming in with a pinch works fine. But once I zoom out past a certain level the image becomes very small, and I can no longer zoom back in by pinching it. After that, when I pan, the pan moves the whole view while my finger is down and only the image once I release it; after releasing, the pan applies only to the image, and after I touch the image again the pan applies to the whole view.
code:
UIPinchGestureRecognizer *pinchGesture1 = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(ahandlePinch1:)];
[myImageView addGestureRecognizer:pinchGesture1];
-(void)ahandlePinch1:(UIPinchGestureRecognizer *)sender {
    mCurrentScale += [sender scale] - mLastScale;
    mLastScale = [sender scale];

    if (sender.state == UIGestureRecognizerStateEnded) {
        mLastScale = 1.0;
    }

    CGAffineTransform currentTransform = CGAffineTransformIdentity;
    CGAffineTransform newTransform = CGAffineTransformScale(currentTransform, mCurrentScale, mCurrentScale);
    myImageView.transform = newTransform;
}
You should modify your ahandlePinch1 method so that you don't reduce the size of the image below a certain amount. It is almost certainly getting so small that it can no longer detect two distinct touches (which are required for the pinch gesture).
Apple generally recommends allowing a minimum of 44x44 points as a touchable area, so I would suggest stopping your image from resizing below 88x88.
Alternatively, if you actually need your image to be less than that then you should add the gesture recognizer to a different view (perhaps the superview), rather than the image itself.
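For illustration, here is a minimal sketch of how the handler above could clamp the scale so the image never shrinks below that size. It reuses mCurrentScale, mLastScale and myImageView from the question; the kMinimumSide constant is a name introduced here for the example:

static const CGFloat kMinimumSide = 88.0; // assumed minimum size, per the 44x44 pt guideline doubled

-(void)ahandlePinch1:(UIPinchGestureRecognizer *)sender {
    CGFloat proposedScale = mCurrentScale + [sender scale] - mLastScale;
    mLastScale = [sender scale];
    if (sender.state == UIGestureRecognizerStateEnded) {
        mLastScale = 1.0;
    }

    // Only apply the new scale if the resulting size stays above the minimum
    // (bounds are unaffected by the transform, so this is the unscaled size).
    CGSize size = myImageView.bounds.size;
    if (size.width * proposedScale >= kMinimumSide && size.height * proposedScale >= kMinimumSide) {
        mCurrentScale = proposedScale;
        myImageView.transform = CGAffineTransformMakeScale(mCurrentScale, mCurrentScale);
    }
}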
Related
I'm trying to detect a touch on a subview of a view that is being animated.
Here's my detection code:
// simple but not very elegant way of getting the correct frame
CGRect keepFrame = self.fishContainer.layer.frame;
self.fishContainer.layer.frame = [[self.fishContainer.layer presentationLayer] frame];

// get the touch location, taking the presentation layer into account (see above)
CGPoint p = [sender locationInView:self.fishContainer];
CALayer *layer = [self.fishContainer.layer presentationLayer];

// apply the relevant transform
p = CGPointApplyAffineTransform(p, layer.affineTransform);
EBLog(@"checking point %@", NSStringFromCGPoint(p));

UIView *vToRemove = nil;

// find the topmost view containing the touched point
for (UIView *v in self.parasites) {
    EBLog(@"-BOUND %@", NSStringFromCGRect(v.frame));
    if (CGRectContainsPoint(v.frame, p)) {
        vToRemove = v;
    }
}

// OK, we have a view. Let's remove it.
if (vToRemove) {
    EBLog(@"found one");
    [vToRemove removeFromSuperview];
    [self.parasites removeObject:vToRemove];
    if ([self.parasites count] == 0) {
        [self showWinnerScreen];
        [self stopGame];
    }
}

// restore the view frame
self.fishContainer.layer.frame = keepFrame;
Everything works correctly as long as I don't animate parasiteArea's parent view.
When I animate parasiteArea's parent view (a CAAnimation consisting of a move, a scale, and a rotation of the view), the touch falls outside the bounds of the expected subview.
UPDATE
I managed to get the detection working in most cases (see the code above) by using the presentationLayer property and CGPointApplyAffineTransform. There are, however, still some cases where it doesn't work.
I guess I need to translate the touch point into the coordinate space of the CAAnimation, or something like that? Any suggestions?
I ended up using UIView's animateWithDuration: instead of CAAnimation. For my purposes the more limited animation possibilities were enough.
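One detail that may matter with block-based UIView animations: user interaction is disabled during the animation unless you pass UIViewAnimationOptionAllowUserInteraction, and a mid-animation hit test still has to look at the presentation layer, because the model values jump to their final state immediately. A rough sketch, with the duration and destination point made up for illustration:

[UIView animateWithDuration:2.0
                      delay:0.0
                    options:UIViewAnimationOptionAllowUserInteraction // keep touches enabled while animating
                 animations:^{
                     self.fishContainer.center = CGPointMake(200.0, 300.0); // hypothetical destination
                 }
                 completion:nil];

// Later, e.g. in a gesture handler, test the on-screen (presentation) position:
CGPoint p = [sender locationInView:self.fishContainer.superview];
CALayer *presentation = [self.fishContainer.layer presentationLayer] ?: self.fishContainer.layer;
BOOL containsTouch = CGRectContainsPoint(presentation.frame, p);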
I am building an iPhone app that has multiple UIViews in a 3x3 matrix (nine UIViews in total) within one UIViewController. I am trying to find a way to let the user move one view to another location in the matrix, which then rearranges the rest of the views accordingly. Think of dragging an app on your springboard to another place and watching all the other icons arrange themselves accordingly.
What is the best way to accomplish something like that?
Use a UIPanGestureRecognizer, using its translationInView: to adjust the coordinates of the item you're dragging. For a discussion of gesture recognizers, see the Event Handling Guide for iOS.
When you let go (i.e. the gesture ends), you can use UIView class method animateWithDuration to animate the moving of various items to their final locations.
So, in viewDidLoad, you might do something like the following (assuming that you had your nine controls in an array called arrayOfViews):
for (UIView *subview in self.arrayOfViews) {
    UIPanGestureRecognizer *gesture = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                              action:@selector(movePiece:)];
    [subview addGestureRecognizer:gesture];
}
And then your gesture recognizer handler might look like:
- (void)movePiece:(UIPanGestureRecognizer *)gesture
{
    static CGPoint originalCenter;

    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        originalCenter = gesture.view.center;
    }
    else if (gesture.state == UIGestureRecognizerStateChanged)
    {
        CGPoint translation = [gesture translationInView:self.view];
        gesture.view.center = CGPointMake(originalCenter.x + translation.x,
                                          originalCenter.y + translation.y);
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        [UIView animateWithDuration:0.25
                         animations:^{
                             // move your views to their final resting places here
                         }];
    }
}
This is the bare bones of what dragging controls around might look like.
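To make the "final resting places" comment concrete, here is one hypothetical way to snap the dropped view to the nearest cell of the 3x3 grid; the 100x100 pt cell size and the centerForRow:column: helper are invented for the example:

// Hypothetical helper: the center of cell (row, col) in a 3x3 grid of 100x100 pt cells.
- (CGPoint)centerForRow:(NSInteger)row column:(NSInteger)col
{
    const CGFloat cellSize = 100.0; // assumed cell size
    return CGPointMake(col * cellSize + cellSize / 2.0,
                       row * cellSize + cellSize / 2.0);
}

// Inside the UIGestureRecognizerStateEnded branch:
CGPoint dropPoint = gesture.view.center;
NSInteger col = MIN(2, MAX(0, (NSInteger)(dropPoint.x / 100.0)));
NSInteger row = MIN(2, MAX(0, (NSInteger)(dropPoint.y / 100.0)));
[UIView animateWithDuration:0.25
                 animations:^{
                     gesture.view.center = [self centerForRow:row column:col];
                     // ...also reposition any views displaced from that cell...
                 }];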
If you are developing for iOS 6, definitely have a look at the UICollectionView class. If you have dealt with table views before, the learning curve should not be too steep. You get the rearranging as well as all the animation for free.
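Purely as an illustration of what the UICollectionView route looks like (the cell size and reuse identifier are arbitrary, and the data source methods are omitted), a minimal setup might be:

UICollectionViewFlowLayout *layout = [[UICollectionViewFlowLayout alloc] init];
layout.itemSize = CGSizeMake(100.0, 100.0); // arbitrary cell size for a 3x3 grid

UICollectionView *grid = [[UICollectionView alloc] initWithFrame:self.view.bounds
                                            collectionViewLayout:layout];
grid.dataSource = self; // collectionView:numberOfItemsInSection: would return 9
[grid registerClass:[UICollectionViewCell class] forCellWithReuseIdentifier:@"Cell"];
[self.view addSubview:grid];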
I have looked through other solutions, but I can't find the right one for my case.
I have a UIImageView within a UIScrollView, where I show big pictures.
I have a pinch gesture enabled on my UIScrollView, as well as left and right swipe gesture recognizers.
Right now, the scroll view's pan gesture seems to disable (or corrupt) the swipe gestures. I don't want to disable horizontal or vertical scrolling on my UIScrollView, and the initial scaling of my photos is too large to disable horizontal scrolling.
What I want is to trigger the swipe gesture when I reach the edge of my UIScrollView.
Here is some code:
- (void)viewDidLoad
{
    // recognizer for pinch gestures
    UIPinchGestureRecognizer *pinchRecognizer = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinch:)];
    [self.myScrollView addGestureRecognizer:pinchRecognizer];
    [self.myScrollView setClipsToBounds:NO];

    // recognizers for swipe gestures
    UISwipeGestureRecognizer *recognizer;

    // left and right swipe recognizers for left and right animation
    recognizer = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleLeftSwipe:)];
    [recognizer setDirection:(UISwipeGestureRecognizerDirectionRight)];
    [[self myScrollView] addGestureRecognizer:recognizer];

    recognizer = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleRightSwipe:)];
    [recognizer setDirection:(UISwipeGestureRecognizerDirectionLeft)];
    [[self myScrollView] addGestureRecognizer:recognizer];
....
My left-swipe handler; currently the left and right swipes don't have any additional features:
-(void)handleLeftSwipe:(UISwipeGestureRecognizer *)recognizer
{
    if (!self.tableView.hidden) self.tableView.hidden = YES;
    [self showRequiredStuff];

    CATransition *transition = [CATransition animation];
    transition.duration = 0.75;
    transition.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseInEaseOut];
    transition.type = kCATransitionPush;
    transition.subtype = kCATransitionFromLeft;
    transition.delegate = self;
    [self.view.layer addAnimation:transition forKey:nil];
}
My initial screen size:
#define IMAGE_VIEW_WIDTH 320.0
#define IMAGE_VIEW_HEIGHT 384.0
I scale the pictures so they are as small as they can be, but most of them are wide images, in which case horizontal scrolling is enabled and vertical scrolling is disabled, even though my swipe gestures are horizontal as well.
I hope I have clearly explained what is going on and what I need. I have posted the code because I am a newbie at iPhone application development; I want to help other people see as much code as they can, and maybe someone will point out some bad programming so we all benefit from it.
Additional findings from related solutions:
After setting
@interface myViewController () <UIScrollViewDelegate>
and
self.myScrollView.delegate = self;
I can detect whether a horizontal edge has been reached:
- (BOOL)hasReachedAHorizontalEdge {
    CGPoint offset = self.myScrollView.contentOffset;
    CGSize contentSize = self.myScrollView.contentSize;
    CGFloat height = self.myScrollView.frame.size.height;
    CGFloat width = self.myScrollView.frame.size.width;

    if (offset.x == 0 ||
        (offset.x + width) == contentSize.width) {
        return YES;
    }
    return NO;
}
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    if ([self hasReachedAHorizontalEdge]) {
        NSLog(@"Reached horizontal edge.");
        // required here
    }
}
At this point, all I need is to disable scrolling at the edge that has been reached; e.g. if I have reached the right edge of the scroll view, I need to disable only scrolling further right, so that the swipe will be triggered.
I added a thin UIView over each edge of my scrollView and had them recognize swipes. I could not get the scrollView to cooperate with another swipe gesture recognizer. This isn't ideal, but it works for now. I will have to investigate whether there is a way to override the scrollView's swipe gesture recognizer.
The trick would be to check whether the swipe starts near the edges, and then decide whether to hand it to the scrollView or to my swipe recognizer.
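Another approach that might be worth trying (an untested sketch building on the hasReachedAHorizontalEdge method above) is to make the view controller the delegate of the swipe recognizers, only let them begin when the scroll view is at a horizontal edge, and allow them to run alongside the scroll view's own pan gesture:

// When creating the swipe recognizers in viewDidLoad:
recognizer.delegate = self; // the view controller adopts UIGestureRecognizerDelegate

// Only let a swipe begin when the scroll view is already at a horizontal edge.
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer isKindOfClass:[UISwipeGestureRecognizer class]]) {
        return [self hasReachedAHorizontalEdge];
    }
    return YES;
}

// Let the swipes be recognized together with the scroll view's pan gesture.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}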
I have a scroll view which shows an image view. I am trying to handle a UIRotationGestureRecognizer on the image view. I get the rotation events and apply the required transform, and the image rotates properly in the scroll view. But when I then do any operation in the scroll view, like zoom or pan, the image's rotation and position get thrown away.
_mainView is the subview of the UIScrollView which is also used for zooming:
UIRotationGestureRecognizer *rotationGesture = [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(rotationGesture:)];
[_mainView addGestureRecognizer:rotationGesture];
[rotationGesture release];
-(void)rotationGesture:(UIRotationGestureRecognizer *)sender {
    if (sender.state == UIGestureRecognizerStateBegan ||
        sender.state == UIGestureRecognizerStateChanged)
    {
        sender.view.transform = CGAffineTransformRotate(sender.view.transform,
                                                        sender.rotation);
        _currRotation = _currRotation + sender.rotation;
        [sender setRotation:0];
    }
}
I would like to understand the right way to handle rotation within the scroll view so that the rotation is maintained even after zoom events in the scroll view.
Implement the gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: method in your UIGestureRecognizerDelegate, and return YES for the combinations of gestures you want to recognize simultaneously. If you still have trouble with it, check out the answers to the UIImageView Gestures (Zoom, Rotate) question.
Good luck!
EDIT: Your comment has me guessing that the issue is that a view can only have one transform at a time, and the scroll view applies a scale transform, replacing the rotation one. You could remove the native zoom recognizer (see this question), or nest another UIView in the scroll view and apply the rotation transform to that. I like option two; it seems easier. If you go with option one, use CGAffineTransformConcat to apply both the zoom and rotate transformations independently.
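A rough sketch of option two, where the names rotationContainer and photoView are placeholders: let the scroll view zoom a plain container, and rotate a view nested inside it, so the two transforms never fight over the same view:

// Hierarchy: scrollView -> rotationContainer (zoomed) -> photoView (rotated)

- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return self.rotationContainer; // the scroll view only ever scales this container
}

- (void)rotationGesture:(UIRotationGestureRecognizer *)sender {
    if (sender.state == UIGestureRecognizerStateBegan ||
        sender.state == UIGestureRecognizerStateChanged) {
        // Rotate the inner view; zooming the outer container no longer overwrites this.
        self.photoView.transform = CGAffineTransformRotate(self.photoView.transform, sender.rotation);
        [sender setRotation:0];
    }
}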
I added a UIScrollView in which I put an image, and I resize (zoom) the image using viewForZoomingInScrollView:. After that, when I rotate the image using a UIRotationGestureRecognizer, the image gets resized automatically. Is there any solution to this problem?
Here is my code for zoom and rotate:
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    return Photo;
}

/* In response to a rotation gesture, show the image view at the rotation given by the recognizer,
   then make it fade out in place while rotating back to horizontal. */
- (void)handleRotationFrom:(UIRotationGestureRecognizer *)recognizer {
    CGAffineTransform transform = CGAffineTransformMakeRotation([recognizer rotation]);
    Photo.transform = transform;
}
Implement a custom EditImageView class and handle the touches there yourself to track them.
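As an alternative sketch (not the EditImageView approach above): the handleRotationFrom: handler in the question builds a brand-new transform with CGAffineTransformMakeRotation, which throws away the scale the scroll view applied while zooming. Rotating relative to the current transform keeps the zoom:

- (void)handleRotationFrom:(UIRotationGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan ||
        recognizer.state == UIGestureRecognizerStateChanged) {
        // Concatenate the new rotation onto whatever transform (including the zoom scale) is already set.
        Photo.transform = CGAffineTransformRotate(Photo.transform, recognizer.rotation);
        recognizer.rotation = 0;
    }
}

Note that zooming again may still replace Photo's transform with a plain scale, so nesting the rotated view inside the view returned from viewForZoomingInScrollView: (as suggested in the previous answer) may still be needed.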