In the iPhone SDK there are touchesBegan, touchesEnded and touchesMoved functions that are called when a touch event occurs. My problem is that when I put a finger on the screen and don't move it, there is no way to know whether a finger is on the screen or not. Is there a way to get the current touch screen state?
In any of the touchesBegan, touchesEnded and touchesMoved functions I can use the following code to get the state of every touch, but outside of them I can't:
NSSet *allTouches = [event allTouches];
for (NSUInteger index = 0; index < [allTouches count]; ++index)
{
    UITouch *touch = [[allTouches allObjects] objectAtIndex:index];
    if ([touch phase] != UITouchPhaseCancelled)
    {
        CGPoint touchPoint = [touch locationInView:self];
        // do something with touchPoint, which has state [touch phase]
    }
}
You should keep a list of all points where the touches currently are. Make sure you don't just retain the UIEvent or UITouch objects (they're not guaranteed to live long) - instead, create your own data structure. Odds are all you need is to keep track of the points where touches are currently down.
Even if you don't want to change your design, keeping a running list of active touches solves your problem:
NSMutableSet *touches;
In touchesBegan you add them, in touchesEnded and touchesCancelled you remove them...
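A minimal sketch of that idea (the class and property names are illustrative, and ARC is assumed): store the current point of each touch in your own dictionary, keyed by the touch's pointer identity so the UITouch itself is never retained.
#import <UIKit/UIKit.h>

@interface TouchStateView : UIView
@property (nonatomic, strong) NSMutableDictionary *activePoints;
@end

@implementation TouchStateView

- (void)updateTouches:(NSSet *)touches remove:(BOOL)remove
{
    if (!self.activePoints) self.activePoints = [NSMutableDictionary dictionary];
    for (UITouch *touch in touches) {
        // Key on the pointer value so the UITouch is not retained
        NSValue *key = [NSValue valueWithNonretainedObject:touch];
        if (remove) {
            [self.activePoints removeObjectForKey:key];
        } else {
            self.activePoints[key] = [NSValue valueWithCGPoint:[touch locationInView:self]];
        }
    }
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event     { [self updateTouches:touches remove:NO];  }
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event     { [self updateTouches:touches remove:NO];  }
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event     { [self updateTouches:touches remove:YES]; }
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event { [self updateTouches:touches remove:YES]; }

@end
With this in place, [self.activePoints count] at any moment tells you whether a finger is on the screen, and the stored values give you the points.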
You might use a "global" object (i.e. singleton) to track those and synchronize data access. I cannot see how querying this set would be different than asking UIKit directly for this information.
If you need information on touches on standard UIKit objects (i.e. UIButton), you probably need to subclass those - and I am not sure if you can get the desired effect with all classes (as they - or their subviews - could handle touches in any way they want).
I am currently using this code to get the locations of all the touches:
NSSet *allTouches = [event allTouches];
NSArray *allObjects = [allTouches allObjects];
for (int i = 0; i < [allObjects count]; i++)
{
    UITouch *touch = [allObjects objectAtIndex:i];
    CGPoint location = [touch locationInView:[touch view]];
    // Add to array....
}
While testing it on the simulator (I don't have an iPad to test on right now), it works perfectly with a single touch. But when trying multiple touches, the first iteration is correct while the second iteration doesn't give the correct position.
For example:
First touch: (536,163) - correct
Second touch: (198,608) - but it should be somewhere around (148,345)
I have a feeling that I should change something with [touch locationInView: [touch view]]; to give the right location but I don't know what to change.
Any help is appreciated.
It's difficult to understand the issue from a set of coordinates without seeing your views, any subviews, and where you're touching within them, but you should know that:
[touch locationInView:aView] gives the coordinates of the touch in the coordinate system of aView. That is, the coordinates you see are relative to the top left of aView which may not be what you're expecting. Try [touch locationInView:self], which is more common.
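As a quick illustration of the difference (a sketch, assuming this sits in a UIView subclass and nothing project-specific):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint inTouchedView = [touch locationInView:[touch view]]; // relative to the subview that was hit
    CGPoint inSelf = [touch locationInView:self];                // relative to this view's top left
    CGPoint inWindow = [touch locationInView:nil];               // relative to the window
    NSLog(@"touched view: %@  self: %@  window: %@",
          NSStringFromCGPoint(inTouchedView),
          NSStringFromCGPoint(inSelf),
          NSStringFromCGPoint(inWindow));
}
The same physical touch produces three different coordinate pairs, which is the kind of mismatch your numbers suggest.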
Have a look at the "Events and Touches" section of the Event Handling Guide for iOS. You generally don't need to iterate over that collection if you're trying to track multiple touches; iOS handles that for you. You can use the tapCount and phase properties to get information about whether touches have moved, how many fingers are down, and so on.
Does any of that help?
I don't know how to get the following type of functionality in my app.
As shown in the figure above, the user can slide (drag) different image parts to assemble the image.
Can anyone tell me which control this is, or point me to a tutorial for it?
Check out the MoveMe sample app. It will show you how to allow movement of subviews by touch and drag.
You can then create imageViews for each of your puzzle pieces that can be moved around.
Once you have accomplished this, you need to check during movement of a piece to see when it gets close enough to its correct spot in the puzzle, so it can snap into place when the user lets go of the piece (a rough sketch of that check follows the link below).
http://developer.apple.com/library/ios/#samplecode/MoveMe/Introduction/Intro.html
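Here is one way that snap check might look; the pieceView and targetCenter names and the 20-point tolerance are assumptions for illustration, not from the sample:
static const CGFloat kSnapDistance = 20.0; // assumed tolerance in points

- (void)snapIfCloseEnough:(UIView *)pieceView toTarget:(CGPoint)targetCenter
{
    CGFloat dx = pieceView.center.x - targetCenter.x;
    CGFloat dy = pieceView.center.y - targetCenter.y;
    if (hypot(dx, dy) < kSnapDistance) {
        // Close enough: animate the piece into its correct spot
        [UIView animateWithDuration:0.15 animations:^{
            pieceView.center = targetCenter;
        }];
    }
}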
That all depends on the kind of implementation you want to do.
It can be a UIImageView, UIButton, etc.
The advantage of UIImageView is that you can easily attach a UIGestureRecognizer to move the views.
For an app I built I used the touches functions:
- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event
- (void) touchesEnded:(NSSet*)touches withEvent:(UIEvent*)event
for dragging UIImageViews around. Then you can use CGRectContainsRect or CGRectContainsPoint to detect whether the puzzle piece is in the right location, as sketched below.
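For example (targetFrame is an assumed property holding the piece's correct slot, and pieceView the dragged view):
if (CGRectContainsPoint(self.targetFrame, pieceView.center)) {
    // The piece's center landed in its slot: center it exactly
    pieceView.center = CGPointMake(CGRectGetMidX(self.targetFrame),
                                   CGRectGetMidY(self.targetFrame));
}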
If we are talking about UIKit, this could be a UIImageView with images that have partially transparent areas.
My first attack on this problem would be as follows:
(1) Subclass UIImageView to hold each piece of the puzzle.
(2) Customize your UIImageView subclass by initializing it with a UIPanGestureRecognizer. Something like:
self.panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                             action:@selector(handlePan:)];
[self addGestureRecognizer:self.panRecognizer];
(3) In the action method associated with the pan gesture recognizer, update the piece's location from the recognizer's translation. Something like the following ought to work:
- (void)handlePan:(UIGestureRecognizer *)sender
{
    UIPanGestureRecognizer *panRecognizer = (UIPanGestureRecognizer *)sender;
    if (panRecognizer.state == UIGestureRecognizerStateBegan ||
        panRecognizer.state == UIGestureRecognizerStateChanged)
    {
        CGPoint currentPoint = self.center;
        CGPoint translation = [panRecognizer translationInView:self.superview];
        self.center = CGPointMake(currentPoint.x + translation.x,
                                  currentPoint.y + translation.y);
        [panRecognizer setTranslation:CGPointZero inView:self.superview];
    }
}
This is not that easy. First you need to write the code for moving the image views - there are a lot of samples in Apple's references. Then you need to get the boundaries of each image to do this.
Here is the code for moving imgTest, which is a UIImageView. Tested; it works like a charm.
// e.g. in viewDidLoad:
[[self imgTest] setUserInteractionEnabled:YES];

// Save the first touch point
CGPoint firstTouchPoint;
// xd, yd: distance between the image center and the touch center
float xd;
float yd;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *bTouch = [touches anyObject];
    if ([bTouch.view isEqual:[self imgTest]]) {
        firstTouchPoint = [bTouch locationInView:[self view]];
        xd = firstTouchPoint.x - [[bTouch view] center].x;
        yd = firstTouchPoint.y - [[bTouch view] center].y;
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mTouch = [touches anyObject];
    if (mTouch.view == [self imgTest]) {
        CGPoint cp = [mTouch locationInView:[self view]];
        [[mTouch view] setCenter:CGPointMake(cp.x - xd, cp.y - yd)];
    }
}
My app has 2 buttons near each other.
I'm trying to touch these two buttons at the same time, using only one finger.
Is there a way to check whether the touch is over these two buttons at the same time?
You can (but really shouldn't) touch 2 buttons with one finger like Abizern said, but you can also call 2 methods for one button touch. For example:
- (void)viewDidLoad {
    self.myButton = /*initialize the button*/;
    [self.myButton addTarget:self
                      action:@selector(callTwoMethods)
            forControlEvents:UIControlEventTouchUpInside];
}

- (void)callTwoMethods {
    [self methodOne];
    [self methodTwo];
}
However, this is not always the correct behavior for what you're trying to do, so starting with iOS 3, we can use a bit of jiggering with the UITouch and event mechanism to figure a lot of it out:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 2) {
        // We are aware of two touches, so get them both
        NSArray *bothTouches = [touches allObjects];
        UITouch *firstTouch = [bothTouches objectAtIndex:0];
        UITouch *secondTouch = [bothTouches objectAtIndex:1];
        // pointInside:withEvent: expects a point in the receiver's own
        // coordinate system, so ask each touch for its location in the button
        CGPoint firstPoint = [firstTouch locationInView:self.firstButton];
        CGPoint secondPoint = [secondTouch locationInView:self.secondButton];
        if (([self.firstButton pointInside:firstPoint withEvent:event] &&
             [self.secondButton pointInside:secondPoint withEvent:event])
            || /*the opposite test for firstButton getting the first and secondButton getting the second touch*/) {
            // Do stuff
        }
    }
}
A touch is a point. Although the recommendation is to make controls of a reasonable size, when you touch the screen, you are getting a point, not a region. That point is going to be in one or other of the controls.
You could try intercepting the touch and inflating it into a larger rectangle, then check whether that rectangle covers both buttons at the same time (see the sketch below).
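One way to sketch that, assuming a view controller with firstButton and secondButton outlets and an assumed 60-point finger size:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self.view];
    // Inflate the touch point into a 60x60 rect centered on the finger
    CGRect touchRect = CGRectInset(CGRectMake(p.x, p.y, 0, 0), -30, -30);
    if (CGRectIntersectsRect(touchRect, self.firstButton.frame) &&
        CGRectIntersectsRect(touchRect, self.secondButton.frame)) {
        // The finger covers both buttons at once
    }
}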
EDIT
Have a look at this SO question to see one way of intercepting touches.
I have a UIImageView and I am trying to determine when a drag is performed, how far that drag is from the origin. I currently have this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (CGRectContainsPoint(myimage.frame, location)) {
        // User tapped inside the image
    } else {
        // User didn't tap inside the image
    }
}
The image itself does not move, but a person can take their finger, click on the image and then drag their finger. I am just trying to determine that drag distance.
If you want to calculate distance, you need to remember the point (store it somewhere) in touchesBegan if the user tapped on your image.
Then in touchesMoved or touchesEnded you will be able to get the current point and calculate the distance to your original point.
If you need the distance from the UIImageView's origin, you can call [touch locationInView:myImage].
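A minimal sketch of that approach, using myimage from your code; the startPoint variable is an assumed ivar, not something UIKit provides:
CGPoint startPoint;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (CGRectContainsPoint(myimage.frame, location)) {
        startPoint = location; // remember where the drag started
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:touch.view];
    CGFloat dragDistance = hypot(current.x - startPoint.x, current.y - startPoint.y);
    NSLog(@"dragged %f points from the start point", dragDistance);
}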
And I suggest you use the UIGestureRecognizer classes instead of handling touches yourself. They are simpler to implement.
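With a recognizer, the same measurement collapses to a few lines; this is only a sketch, and handleDrag: is an assumed selector name:
// e.g. in viewDidLoad:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                      action:@selector(handleDrag:)];
myimage.userInteractionEnabled = YES;
[myimage addGestureRecognizer:pan];

- (void)handleDrag:(UIPanGestureRecognizer *)recognizer {
    // translationInView: is the cumulative drag offset since the gesture began
    CGPoint t = [recognizer translationInView:recognizer.view];
    NSLog(@"dragged %f points", hypot(t.x, t.y));
}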
Is there any way for the touchesEnded method to tell me the position of where I lifted my finger?
Also, can it tell me which finger was lifted if I had 2 (or more) on the device? (e.g. if I put finger 1 down, then finger 2 down, then lifted finger 2 but held down finger 1, could it tell me that I have ended finger 2?)
Thanks
Jon
Look at the touch methods that are inherited by UIViews; specifically, touchesEnded:withEvent: will tell you when and where the touches ended (i.e. where you lifted your finger).
Just override these methods in your view controller like this, and you can find out where the touch ended:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Get a touch
    UITouch *touch = [touches anyObject];
    // Where did it end?
    CGPoint endedAt = [touch locationInView:[self view]];
    ...
The touch object will be the same (isEqual:) as the touch that was sent in the touchesBegan:withEvent: method. This lets you track multiple touches: store the touches you are interested in during touchesBegan and compare them in touchesEnded, as in the sketch below.
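A sketch of that bookkeeping, assuming ARC; since UITouch objects shouldn't be retained, wrap them non-retained so only their identity is stored:
@property (nonatomic, strong) NSMutableSet *trackedTouches;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!self.trackedTouches) self.trackedTouches = [NSMutableSet set];
    for (UITouch *touch in touches) {
        [self.trackedTouches addObject:[NSValue valueWithNonretainedObject:touch]];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        NSValue *key = [NSValue valueWithNonretainedObject:touch];
        if ([self.trackedTouches containsObject:key]) {
            // One of the fingers we were tracking lifted here
            CGPoint endedAt = [touch locationInView:[self view]];
            NSLog(@"tracked finger ended at %@", NSStringFromCGPoint(endedAt));
            [self.trackedTouches removeObject:key];
        }
    }
}
So in your example, the touch that arrives in touchesEnded for finger 2 will match the stored identity from when finger 2 went down, even while finger 1 stays on the screen.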