Restart a UITouch - iPhone

I've tried all morning, and I'm starting to think it's impossible.
So you have a touch on the screen; it entered the responder chain and maybe did something. Then you moved it on the screen, which called the touchesMoved: method (of the UIResponder associated with the touch) and maybe did other stuff.
At this point, is there any way that, as the touch continues to move on the screen and enters some other UIView, the UITouch gets reinitialized? (I mean: call touchesEnded: in the first UIView, then call touchesBegan: in the new UIView, and then touchesMoved: if the movement continues.)
I've tried resignFirstResponder and manually calling the methods, and I just can't get it to work.
I keep thinking that there should be a really simple answer for this. Something like
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:self.superview];
if (!CGRectContainsPoint(self.frame, location)) {
    [touch restart];
}
or
[[touch phase] restart];
But there's nothing like this in the documentation ):

My recommendation would be to have the superview of all the views you want to transition through receive the touches, then hit-test the subviews within the touchesBegan/Moved/Ended methods (probably via CGRectContainsPoint(...)) and pass the touches along accordingly, calling the corresponding touchesBegan/Moved/Ended method on the subview.
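Here is a minimal sketch of that approach, assuming a single active touch; ContainerView, activeSubview, and subviewAtTouch: are illustrative names, not anything from the question:
@interface ContainerView : UIView
@property (nonatomic, weak) UIView *activeSubview; // subview currently "owning" the touch
@end

@implementation ContainerView

// Find the subview (if any) whose frame contains the touch.
- (UIView *)subviewAtTouch:(UITouch *)touch {
    CGPoint location = [touch locationInView:self];
    for (UIView *subview in self.subviews) {
        if (CGRectContainsPoint(subview.frame, location)) {
            return subview;
        }
    }
    return nil;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.activeSubview = [self subviewAtTouch:[touches anyObject]];
    [self.activeSubview touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UIView *hit = [self subviewAtTouch:[touches anyObject]];
    if (hit != self.activeSubview) {
        // The touch crossed a boundary: "end" it in the old subview,
        // then "begin" it in the new one.
        [self.activeSubview touchesEnded:touches withEvent:event];
        [hit touchesBegan:touches withEvent:event];
        self.activeSubview = hit;
    } else {
        [self.activeSubview touchesMoved:touches withEvent:event];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.activeSubview touchesEnded:touches withEvent:event];
    self.activeSubview = nil;
}

@end
For this to work, the subviews themselves must not intercept the touches (set their userInteractionEnabled to NO), so the container is the view that receives the events; note that a forwarded touch's view property will then be the container, not the subview.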

Related

How to move a view with my finger on iPhone, Objective-C

I have a view whose size is 1280 × 345, and I'm moving it left and right on my screen.
Now, my main question is how do I make it move with my finger (not via swipeLeft/swipeRight gestures)? I want it to track my finger, like the home screen of iOS on the iPhone.
Right now I'm using this code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:infoView];
    //[self.view setTransform:CGAffineTransformTranslate(CGAffineTransformIdentity, (location.x + self.view.frame.origin.x - dx), 0.0)];
    NSLog(@"touch: %f", location.x);
    infoView.frame = CGRectMake(infoView.frame.origin.x + location.x, 0, 1280, 345);
}
That should be roughly the right approach, but I can't figure it out.
I've also tried to find an answer on Google and here but, as you know... I didn't find anything useful.
I've also made this so you can understand it better.
Don't write code to do this yourself. Use UIScrollView; this is exactly what it is designed for.
If you want the same effect as the iPhone/iPad home screen, you should use a UIScrollView with paging enabled.
If you really want to handle it yourself for some reason, avoiding UIScrollView, there are UIGestureRecognizers, specifically UIPanGestureRecognizer, which handles most of the work of tracking touch events in one place.
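A minimal sketch of the UIScrollView-with-paging setup, assuming the 1280 × 345 content from the question and a 320-point-wide screen (the variable names are illustrative):
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, 320, 345)];
scrollView.contentSize = CGSizeMake(1280, 345);  // the full width of the sliding content
scrollView.pagingEnabled = YES;                  // snap page by page, like the home screen
scrollView.showsHorizontalScrollIndicator = NO;
infoView.frame = CGRectMake(0, 0, 1280, 345);
[scrollView addSubview:infoView];                // infoView now tracks the finger for free
[self.view addSubview:scrollView];
With pagingEnabled set, the scroll view snaps in multiples of its frame width, so the 1280-point content behaves as four 320-point pages.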
I used UIPageControl and UIScrollView to make it. Very easy and smart objects!

How to determine x&y of last touch in multitouch scenario?

I'm new to this site and to iOS programming.
I am working on a percussion app. For this I want to know the x and y location of every finger that touches the screen. I thought this was straightforward, but multitouch is making things confusing for me.
Suppose the user has two fingers pressed on the screen and the user presses a third finger on the screen. How do I determine the location of this third finger?
My feeling is that I need to implement touchesBegan
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
To determine the x and y location, I have to look at the touch that triggered this call to touchesBegan. But the touches are presented in an unordered set. If the third finger triggered this touchesBegan, then there are three touches in the NSSet. Since the set is unordered, how do I determine which touch triggered this third call to touchesBegan? If I understand the documentation correctly, it could be any of those three touches.
Many thanks in advance
Maybe you can add a simple counter property, increase its value in touchesBegan, and decrease it in touchesEnded.
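A minimal sketch of that suggestion, assuming an NSUInteger property named activeTouchCount (an illustrative name, not from the question); touchesCancelled should be handled too, so the count cannot drift:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.activeTouchCount += touches.count;  // only the NEW touches are in this set
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    self.activeTouchCount -= touches.count;
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    self.activeTouchCount -= touches.count;  // e.g. an incoming call cancels touches
}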
Okay, it now turns out I had been misinterpreting my test data. If two fingers already touch the device when a third finger touches it, only one UITouch object is part of the NSSet passed to touchesBegan, not three as I seemed to experience. This one UITouch represents the latest finger touch.
The only time more than one UITouch object is passed to touchesBegan is when multiple fingers actually begin to touch the device at the same time.
Since, in my case, I need to handle all new touches based on their location, I need to handle every UITouch object in the NSSet.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        CGPoint location = [touch locationInView:self.view];
        // Handle finger touch at given location
        // ...
    }
}

Objects on screen start blinking when dragging objects

I have an object that the user can drag around the screen in my app. But for some reason, when they start to drag it over another UIView, the items on screen start blinking: part of the CAGradientLayer will appear in front of everything else, some things seem to push themselves to the back, all sorts of bizarre activity. I haven't been able to take a screenshot of this, unfortunately. This is the code I've been using to handle the dragging:
NSSet *touches = [event touchesForView:sender];
UITouch *myTouch = [touches anyObject];
CGPoint startPoint = [myTouch locationInView:self.view];
positionX = startPoint.x;
positionY = startPoint.y;
colourDropView.center = CGPointMake(positionX-15, positionY-25);
colourDropView is the object being dragged, as you might guess. It's when that last line executes that the blinking starts, and it happens each time the user moves their finger. No other code runs while they drag; only what is above.
Any ideas as to why this may be happening?
Turns out it's a bug. colourDropView had rounded corners and had masksToBounds set to YES. For some reason this causes a CAGradientLayer to freak out when the view is dragged over it.
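For reference, the combination described above looks like this (the cornerRadius value is illustrative):
#import <QuartzCore/QuartzCore.h> // needed for the CALayer properties on older SDKs
colourDropView.layer.cornerRadius = 8.0;
colourDropView.layer.masksToBounds = YES; // this masking, combined with dragging over a CAGradientLayer, triggered the blinking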

How can an underlying view know if its rectangle got touched, no matter if directly or indirectly?

I have a UIView which is transparent and covers almost the whole screen; I left 50 pixels free at the top. It is a child of the view controller's view.
Underneath that UIView there's MyView, which inherits from UIView and matches the screen size. Inside this MyView class, I check for a touch on it very simply, with this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == self) {
        NSLog(@"MyView touched");
    }
}
Now the funny thing is, of course, that if the user touches the transparent UIView covering MyView, I don't get "MyView touched" in the console. But when the user touches the little uncovered area of MyView at the top of the screen, the touch arrives there.
That's logical to me, because I check for [touch view] == self. But what if I wanted to know that the rectangular area of MyView got touched, no matter whether directly or indirectly?
Is there a way to catch any touch that appears on the screen/window and then just check if it matches the rectangular area of the view?
You should study the iPhone Application Programming Guide's section on touch events for the background you're looking for. The concept you want to master is the responder chain, so also look through the UIResponder reference to understand what it's doing. You can definitely do everything you're talking about; the full discussion is in that guide.
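As a concrete illustration of what the responder chain gives you: a touch that no view along the chain swallows bubbles up to the view controller, where you can test it against MyView's rectangle yourself. A minimal sketch, assuming this sits in the view controller and myView is an illustrative outlet name:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Convert into the coordinate space of myView's superview,
    // then test against myView's frame rectangle.
    CGPoint location = [touch locationInView:self.myView.superview];
    if (CGRectContainsPoint(self.myView.frame, location)) {
        NSLog(@"MyView's rectangle was touched, directly or through the overlay");
    }
}
Note that this only works if no view along the chain swallows the event; a view that overrides touchesBegan: without calling super stops it from travelling further up.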

What happens when dragging off screen?

I have an object that can be dragged around. Once the user's finger goes off screen and comes back, I lose the ability to drag the object. If the user then starts another touch and drag, everything is fine.
How can I get notified once the user's finger drags back onto the screen? Since touchesBegan doesn't fire, I don't get any notification.
Here is my touchesMoved:, which I also call from touchesBegan:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    // stop object dragging at the edge of the screen
    if (location.x > 35) {
        myObject.center = location;
    }
}
The described behaviour seems normal to me, and all built-in Apple apps behave the same way. Since there's no touch screen outside of the touch screen (yep), I believe there's no way the device can distinguish between a touch that begins at the screen's edge and one that moves in from outside the screen.