How to implement two-finger panning like the Safari browser? - iPhone

I am trying to implement panning and zooming functionality like the Safari browser on the iPad.
I used UIPinchGestureRecognizer for zooming with a two-finger touch, but I don't know how to implement two-finger panning.
When I touch with two fingers, the tap count is 1.
Please help.
Thanks in advance.

You don't want the tapCount, you want the number of touches. If you touch down with two fingers you get two touch events, each with a tap count of 1.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
In that method, [touches count] would return 2, one for each fingertip.
Read through the Apple guide for touch events.
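As a rough sketch of the two routes (not a definitive implementation): the first snippet checks the finger count inside a UIView subclass, as the answer describes; the second uses a UIPanGestureRecognizer restricted to two fingers as an alternative. The selector handleTwoFingerPan: is a made-up name for this example.

// Sketch A: count the fingers currently down on this view (UIView subclass).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([[event touchesForView:self] count] == 2) {
        // Two fingers are down; record their midpoint here and move the
        // content by the midpoint's movement in touchesMoved:.
    }
}

// Sketch B: let UIKit do the tracking with a two-finger pan recognizer,
// set up once in a view controller (e.g. in viewDidLoad).
- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *twoFingerPan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleTwoFingerPan:)];
    twoFingerPan.minimumNumberOfTouches = 2;   // ignore single-finger drags
    twoFingerPan.maximumNumberOfTouches = 2;
    [self.view addGestureRecognizer:twoFingerPan];
}

- (void)handleTwoFingerPan:(UIPanGestureRecognizer *)recognizer {
    // Cumulative two-finger pan offset since the gesture began.
    CGPoint translation = [recognizer translationInView:self.view];
    NSLog(@"two-finger pan: %@", NSStringFromCGPoint(translation));
}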

Related

How to make certain parts of an image clickable in iOS?

In one of my apps I am using an image that fills the whole screen and can be zoomed to some extent. The image contains eight different shapes (a person, shapes, etc.). What I am trying to do is make each shape of the image clickable; touching each part takes the user to a different screen. I have no idea how to achieve this, and googling turned up no solution.
1.) Is this possible using coordinates? (Will the normal image and the zoomed image differ in coordinates?) How would I achieve this with coordinates?
2.) If not, what would be the best approach to achieve my goal?
Any ideas/samples are much appreciated.
I would add a UITapGestureRecognizer to the image view holding your image, and use the locationOfTouch:inView: method to determine the coordinates of your touch.
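A minimal sketch of that suggestion, under some assumptions: the imageView property, the personRect region, and the imageTapped: selector are placeholders, and the single-touch convenience locationInView: is used here.

// Attach the tap recognizer once, e.g. in viewDidLoad.
- (void)viewDidLoad {
    [super viewDidLoad];
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(imageTapped:)];
    self.imageView.userInteractionEnabled = YES;   // UIImageView ignores touches by default
    [self.imageView addGestureRecognizer:tap];
}

- (void)imageTapped:(UITapGestureRecognizer *)recognizer {
    // locationInView: gives coordinates in the image view's own space,
    // which is what you want to compare against the shape regions.
    CGPoint location = [recognizer locationInView:self.imageView];
    if (CGRectContainsPoint(self.personRect, location)) {
        // Navigate to the screen for the "person" shape, and so on.
    }
}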
Correct me if I misunderstand your question, but this should be very simple: just have a couple of buttons with clear backgrounds, all placed on top of the image.
Check UIResponder and the touches methods there. You'll probably want to hook into something like -touchesEnded:withEvent: to detect when a finger lifts off the screen.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:imageView];
        // Check whether the point falls inside one of the shape rects
        // If it does, react to it
    }
}
Also, a link to UITouch.

How to move a view with my finger on iPhone (Objective-C)

I have a view whose size is 1280 x 345, and I am moving it left and right on the screen.
My main question is: how do I make it move with my finger (not with swipeLeft / swipeRight)? I want it to track my finger, like the home screen of iOS.
Right now I am using this code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:infoView];
    //[self.view setTransform:CGAffineTransformTranslate(CGAffineTransformIdentity, (location.x + self.view.frame.origin.x - dx), 0.0)];
    NSLog(@"touch: %f", location.x);
    infoView.frame = CGRectMake(infoView.frame.origin.x + location.x, 0, 1280, 345);
}
That should be roughly the right approach, but I can't figure it out.
I have also tried to find an answer on Google and here, but, as you know, I didn't find anything useful.
I've also made this so you can understand it better.
Don't write code to do this yourself. Use UIScrollView. This is what it is designed for.
If you want the same effect as the iPhone/iPad home screen, you should use a UIScrollView and enable paging.
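A minimal sketch of that setup, assuming the 1280 x 345 infoView from the question is already created with its frame starting at the origin:

UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds];
scrollView.pagingEnabled = YES;                  // snap to page boundaries, like the home screen
scrollView.showsHorizontalScrollIndicator = NO;
scrollView.contentSize = CGSizeMake(1280, 345);  // full width of infoView
[scrollView addSubview:infoView];
[self.view addSubview:scrollView];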
If you really want to handle it yourself for some reason and avoid UIScrollView, there are UIGestureRecognizers, specifically UIPanGestureRecognizer, which handles most of the work of tracking multiple touch events in one place.
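If you do go the UIPanGestureRecognizer route, a sketch might look like the following; the handlePan: selector and the horizontal-only movement are assumptions for this example.

// Attach the recognizer once, e.g. in viewDidLoad.
UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handlePan:)];
[infoView addGestureRecognizer:pan];

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    CGPoint translation = [pan translationInView:self.view];
    CGPoint center = infoView.center;
    center.x += translation.x;                         // follow the finger horizontally
    infoView.center = center;
    [pan setTranslation:CGPointZero inView:self.view]; // reset so each callback delivers a delta
}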
I used UIPageControl and UIScrollView to make it. Very easy and smart objects!

UIGestureRecognizer that cancels when touchup outside of targetView

I have a UIView for which I want to know when the user does:
touchDownInside (to highlight the view)
touchUpInside (to confirm the action)
touchUpOutside (to cancel and reset the highlight)
Which gesture recognizer can do this for me?
Please also go through these four methods, which your view can override to handle the four distinct touch events:
1) A finger or fingers touch the screen:
-(void)touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event;
2) A finger or fingers move across the screen (this message is sent repeatedly as a finger moves):
-(void)touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event;
3) A finger or fingers are removed from the screen:
-(void)touchesEnded:(NSSet*)touches withEvent:(UIEvent*)event;
4) A system event interrupts a touch before it ends:
-(void)touchesCancelled:(NSSet*)touches withEvent:(UIEvent*)event;
You can do this by implementing the touches methods themselves; why do you need a gesture recognizer?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
The above method handles touch down.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
The above method handles touch up, and the combination of both covers the cancel case.
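A sketch of that combination in a UIView subclass; the background-color change is just a stand-in for whatever highlight you actually use.

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Touch down inside: highlight the view.
    self.backgroundColor = [UIColor lightGrayColor];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    if (CGRectContainsPoint(self.bounds, point)) {
        // Touch up inside: confirm the action.
    } else {
        // Touch up outside: cancel.
    }
    self.backgroundColor = [UIColor clearColor];   // reset the highlight either way
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    self.backgroundColor = [UIColor clearColor];   // system interruption: treat as cancel
}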

How to determine the x & y of the last touch in a multitouch scenario?

I'm new to this site and to iOS programming.
I am working on a percussion app. For this I want to know the x and y location of every finger that touches the screen. I thought this would be straightforward, but multitouch is making things confusing for me.
Suppose the user already has two fingers pressed on the screen and then presses a third finger. How do I determine the location of this third finger?
My feeling is that I need to implement touchesBegan
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
To determine the x and y location, I have to look at the touch that triggered this call to touchesBegan. But the touches are presented in an unordered set. If the third finger triggered this touchesBegan, then I have three touches in the NSSet. But since the set is unordered, how do I determine which touch triggered this third call to touchesBegan? If I understand the documentation correctly, it could be any of those three touches.
Many thanks in advance
Maybe you can add a simple counter property and increase its value in touchesBegan and decrease it in touchesEnded.
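A tiny sketch of that idea (the property name is made up); note it only tracks how many fingers are down, not where they are.

@property (nonatomic, assign) NSUInteger activeTouchCount;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.activeTouchCount += [touches count];   // new fingers went down
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    self.activeTouchCount -= [touches count];   // fingers were lifted
}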
Okay, it now turns out I have been misinterpreting my test data. If two fingers already touch the device when a third finger touches it, only one UITouch object is part of the NSSet passed to touchesBegan, not three as I seemed to observe. That one UITouch represents the latest finger touch.
The only time more than one UITouch object is passed to touchesBegan is when multiple fingers actually begin touching the device at the same time.
Since, in my case, I need to handle all new touches based on their location, I need to handle every UITouch object in the NSSet.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        CGPoint location = [touch locationInView:self.view];
        // Handle finger touch at given location
        // ...
    }
}

Difference between touchesMoved and swipe?

I am rotating a circle on the iPad. I have added a swipe gesture, but I want to perform different operations in touchesMoved and in the swipe handler. When I move my touch, the swipe gesture is called instead. What should I do? Any help please?
swipe:
NSEventTypeSwipe
An event representing a swipe gesture.
Available in Mac OS X v10.6 and later.
Declared in NSEvent.h.
and
touchesMoved:
Sent to the receiver when one or more fingers move in the associated view.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
So a swipe can't be used to run code continuously while something is happening, the way the touches methods can; a swipe gesture is recognized once, as a single discrete touch event.
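Note that the quote above is the macOS event type; on iOS the usual counterpart is UISwipeGestureRecognizer. As a rough sketch of the distinction (view and selector names are placeholders): the recognizer fires once when a discrete swipe is recognized, while touchesMoved: fires repeatedly as the finger moves.

// e.g. in viewDidLoad of the view controller showing the circle
UISwipeGestureRecognizer *swipe =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(handleSwipe:)];
swipe.direction = UISwipeGestureRecognizerDirectionLeft;
swipe.cancelsTouchesInView = NO;   // let touchesMoved: keep receiving the same touches
[self.view addGestureRecognizer:swipe];

- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    // Called once, only when a full left swipe is recognized.
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Called repeatedly while the finger moves; rotate the circle here.
}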