I'm using touchesBegan and touchesMoved with multitouch.
I have a manual implementation of what is essentially a button.
I bounds-test the point in touchesBegan to set the button as down, and do the same in touchesEnded to reset it.
The problem is that if the user moves their finger outside the bounds of the button before lifting, then touchesEnded fires outside the bounds of the button where the touch started.
I can't just reset everything on touchesEnded, as the user might still be holding another button down with another finger.
What is the recommended solution to this? UIButton must be doing something similar somehow.
You need to watch touchesMoved: and "deactivate" your button when the touch moves outside its bounds, then "reactivate" it when the touch moves back in. See "Handling a Complex Multitouch Sequence" for an explanation of how to track mutations in a multitouch sequence (a fancy way of saying "which finger was that?").
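A minimal sketch of that tracking for a hand-rolled button; PressableView, trackedTouch, and pressed are made-up names standing in for whatever your implementation uses:

@interface PressableView : UIView
// never retain a UITouch, so keep only a weak reference to "our" finger
@property (nonatomic, weak) UITouch *trackedTouch;
@property (nonatomic, assign, getter=isPressed) BOOL pressed;
@end

@implementation PressableView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (self.trackedTouch == nil) {
        self.trackedTouch = [touches anyObject]; // remember *which* finger is ours
        self.pressed = YES;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (self.trackedTouch && [touches containsObject:self.trackedTouch]) {
        CGPoint p = [self.trackedTouch locationInView:self];
        // deactivate when the finger leaves the bounds, reactivate when it returns
        self.pressed = [self pointInside:p withEvent:event];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (self.trackedTouch && [touches containsObject:self.trackedTouch]) {
        if (self.pressed) {
            // finger lifted inside the bounds: fire the button's action here
        }
        self.trackedTouch = nil;
        self.pressed = NO;
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.trackedTouch = nil;
    self.pressed = NO; // a cancelled touch never fires the action
}

@end

Because only the tracked touch is consulted, other fingers holding other buttons are unaffected.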
Related
What I want: touch a button and a view is added right where the touch is. Without having to lift the finger, touchesBegan/touchesMoved automatically begin working on the new UIView, so without lifting the finger I have touched the button and can drag the new view around.
What I don't know how to do:
Stop the touch events on the button and immediately send the touch events to the new view that is directly under the finger.
One option could be to ditch the button and just use the UIView's touches to detect when to add the subview you want to let the user drag...
Daniel
As @Daniel suggested, ditch the button and just use a UIView, but I believe you may need a UIPanGestureRecognizer in place to get your dragging.
You could set a flag when a new UIView is created and then forward any gesture events to that view - only while the user still has their finger down from the initial touch.
After the user has lifted their finger, the new view can deal with gestures by itself once a UIPanGestureRecognizer has been added to it.
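To make that concrete, here is a hedged sketch of the whole flow under those assumptions; ContainerView, newView, and handlePan: are invented names, and the 80x80 frame is arbitrary:

@interface ContainerView : UIView
@property (nonatomic, strong) UIView *newView; // the view created under the finger
@end

@implementation ContainerView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self];
    self.newView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 80, 80)];
    self.newView.center = p; // the view appears right where the touch is
    [self addSubview:self.newView];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // while the original finger is still down, the container drives the drag
    self.newView.center = [[touches anyObject] locationInView:self];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // from now on the new view deals with dragging by itself
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
        initWithTarget:self action:@selector(handlePan:)];
    [self.newView addGestureRecognizer:pan];
    self.newView = nil;
}

- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    // standard pan-to-move: apply the translation, then reset it
    CGPoint t = [pan translationInView:self];
    pan.view.center = CGPointMake(pan.view.center.x + t.x,
                                  pan.view.center.y + t.y);
    [pan setTranslation:CGPointZero inView:self];
}

@end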
I need to rotate an image but I can't use gestures or a slider, so I have thought about using two buttons: one for clockwise rotation and the other for counterclockwise rotation of my UIImageView.
But I'm only notified once for the touch-down event, so my image rotates only once.
Is there a way to get a continuous touch-down event while I press my button?
If there isn't such an option, how can I implement this continuous rotation with the two buttons?
Sorry if I've said something wrong; I'm new to iOS development.
There are a couple of ways you could do this - UIButtons have touchDown and touchUp events, so you could start a process on touchDown and end it on touchUpInside. This discussion might be useful.
Alternatively, rather than using UIButtons, you might want to try detecting touchesBegan in a particular area (say, on a UIImageView containing a button-like image), then running the rotation animation until the touchesEnded event occurs.
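For the first approach, a sketch of the timer pattern; clockwiseButton, imageView, rotateTimer, and the 0.05 s interval are all assumptions of mine rather than anything from your project:

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self.clockwiseButton addTarget:self
                             action:@selector(startRotating:)
                   forControlEvents:UIControlEventTouchDown];
    [self.clockwiseButton addTarget:self
                             action:@selector(stopRotating:)
                   forControlEvents:UIControlEventTouchUpInside |
                                    UIControlEventTouchUpOutside];
}

- (void)startRotating:(UIButton *)sender
{
    // fire ~20 times a second for as long as the finger stays down
    self.rotateTimer = [NSTimer scheduledTimerWithTimeInterval:0.05
                                                        target:self
                                                      selector:@selector(rotateStep:)
                                                      userInfo:nil
                                                       repeats:YES];
}

- (void)rotateStep:(NSTimer *)timer
{
    // rotate 2 degrees per tick; negate the angle for counterclockwise
    self.imageView.transform =
        CGAffineTransformRotate(self.imageView.transform, M_PI / 90.0);
}

- (void)stopRotating:(UIButton *)sender
{
    [self.rotateTimer invalidate];
    self.rotateTimer = nil;
}

The counterclockwise button would use the same pair of handlers with the angle negated.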
I know I can use UIControlEventTouchDragEnter to tell when I have touched a button, dragged my touch outside its bounds, and then re-entered those bounds. However, I was wondering: would it be possible to touch the screen, not on a button, and detect when I have dragged over/inside that button?
Also, could someone tell me the difference between UIControlEventTouchDragExit and UIControlEventTouchDragOutside?
Thanks!
You would have to observe touch events on the button's superview and whenever the user's fingers move, call hitTest:withEvent: to check whether the touch coordinates lie on top of the button.
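A minimal sketch of that in the superview's touchesMoved:, where myButton is a stand-in for your actual button outlet:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self];
    // hitTest:withEvent: returns the deepest descendant under the point
    if ([self hitTest:p withEvent:event] == self.myButton) {
        // the finger has been dragged onto the button
    }
}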
I believe the difference between UIControlEventTouchDragExit and UIControlEventTouchDragOutside is this: when the finger moves from inside the control to the outside, UIControlEventTouchDragExit fires once. Then, as long as the finger remains outside, UIControlEventTouchDragOutside fires on each move. But you should test this yourself.
I have a UIScrollView which does some custom pinch zooming. To do this I subclassed UIScrollView, then overrode the touch methods touchesBegan, touchesMoved, and touchesEnded. Everything works as expected.
My problem comes when I try to add a series of UIView subviews: I can only detect taps on my UIScrollView when each UIView's userInteractionEnabled is set to NO. I would like to be able to continue detecting two-finger taps on my UIScrollView AND single-finger taps on any of my UIView subviews.
Is this possible?
I've tried countless approaches with little success. Does anyone have experience with this?
Cheers,
Brett
Apple's documentation for UIScrollView explains how it does it:
it temporarily intercepts a touch-down event by starting a timer and, before the timer fires, seeing if the touching finger makes any movement. If the timer fires without a significant change in position, the scroll view sends tracking events to the touched subview of the content view. If the user then drags their finger far enough before the timer elapses, the scroll view cancels any tracking in the subview and performs the scrolling itself.
There are a couple of methods for interception given as answers to this question: How to make a superview intercept button touch events?
Any UIView with userInteractionEnabled set to YES will block touches from reaching views under it. You may have to rethink how you are structuring your layout, or subclass the UIView to change how it handles touches.
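One compromise worth trying, sketched here as an assumption rather than a known fix: leave userInteractionEnabled on the subviews, and override UIScrollView's touchesShouldBegin:withEvent:inContentView: in your subclass so that two-finger touches stay with the scroll view while single-finger taps reach the subviews:

- (BOOL)touchesShouldBegin:(NSSet *)touches
                 withEvent:(UIEvent *)event
             inContentView:(UIView *)view
{
    // keep multi-finger touches for the scroll view's own touch handlers
    if ([[event allTouches] count] >= 2)
        return NO;
    // single-finger touches are delivered to the tapped subview as usual
    return [super touchesShouldBegin:touches withEvent:event inContentView:view];
}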
I'm building an iOS board game (similar to Scrabble or Words) that involves moving little tiles around the screen, and I'm finding the user sometimes has a hard time touching and moving the tiles because they're too small. Due to the design of the game I can't increase the size of the tiles, so I've implemented some little tricks that make touching and moving the tiles easier for the user, and they work well. The only remaining issue is that the user sometimes touches just barely outside the tile, and when they try to move it, the tile stays where it is.
I have two ideas for how I can fix this...
If a tile isn't touched when the user touches the screen, I can use the parent view's touch location to find the closest tile to the touch location and somehow forward the touch event to that tile's view.
If a tile isn't touched when the user touches the screen, I can somehow "catch" the tile when the user drags their finger over it as they attempt to move it.
I'd prefer to implement solution #2, since solution #1 has too many problems associated with it, not to mention solution #2 is a more realistic experience. That said, how can I "catch" a tile when the user drags their finger over it and send touch events to it so it moves where the finger is?
Currently, my tiles are implemented as subclasses of UIView and they handle the touch events (touchesBegan, touchesMoved, and touchesEnded) directly. So if the user touches just barely outside a tile and drags their finger over it, the tile doesn't receive any of the touch events, since it didn't receive the initial touchesBegan event.
Thanks in advance for all your wisdom!
Maybe you should have the "board" view handle all the dragging. When a touch begins and there is a tile at that point, start dragging it. Otherwise, check whenever the touch moves, and as soon as you find a tile, start dragging it.
You could override hitTest:withEvent: in the board view so that it can still detect when a touch hits a tile, but always return itself so that touch events go to the board view (e.g. record the subview that was hit in a separate member variable, so that you know what to start dragging later on when touch events start coming in).
More Details
When handling touches, UIView will use hitTest to find the view that should receive touch events. The default implementation checks each subview so that the "deepest" subview in the hierarchy gets the touches. In order for the board view to receive touches, you would have to disable userInteraction on all of the tile views. But that means you can't use hitTest to find the tile that was touched, since it ignores views that have userInteraction disabled.
So what I am saying is: leave userInteraction enabled on the tile views, but override hitTest on the board view so that it first calls the super implementation in order to find a tile (if the result is self, the board itself was hit). No need to implement your own tile searching. Something like this:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == nil)
        return nil; // the touch missed the board entirely
    if (hitView != self)
        self.draggingTile = hitView; // remember the tile that was hit
    return self; // but keep the touch events for the board itself
}
Now you know which tile to move in touchesMoved. However, I don't think hitTest is called as the touch moves, so if no tile has been picked up yet, you may have to call it manually (you can get the point and event from the touch passed to touchesMoved).
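A sketch of that manual call in the board view's touchesMoved:, reusing the draggingTile property from the override above:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self];
    if (self.draggingTile == nil) {
        // no tile picked up yet; see if the finger has reached one.
        // call super's hitTest to bypass our own override and get the tile
        UIView *hitView = [super hitTest:p withEvent:event];
        if (hitView != nil && hitView != self)
            self.draggingTile = hitView;
    }
    self.draggingTile.center = p; // messaging nil is a harmless no-op
}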
Have you looked into the UIGestureRecognizer API? I think your best option would be to add a UIPanGestureRecognizer to your board's view, which would then fire a selector back on your UIViewController.
Set it up in viewDidLoad:
UIPanGestureRecognizer *panGestureRecognizer = [[UIPanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePan:)];
[[self view] addGestureRecognizer:panGestureRecognizer];
And then implement the selector:
- (void)handlePan:(UIPanGestureRecognizer *)gestureRecognizer
{
    CGPoint currentPoint = [gestureRecognizer locationInView:[self view]];
}
When you set up the gesture recognizer you can set parameters to limit the callbacks (only recognize pans with one finger, etc.). In the callback you can check the gesture's properties to see if the pan is continuing or coming to an end. You can also grab the current point and determine whether it's in or near a tile.
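Pulling those checks together, a hedged sketch of a fuller handlePan:; boardView, tileBeingDragged, and the 10-point "near miss" margin are assumptions of mine:

- (void)handlePan:(UIPanGestureRecognizer *)gestureRecognizer
{
    CGPoint boardPoint = [gestureRecognizer locationInView:self.boardView];

    if (self.tileBeingDragged == nil) {
        // look for a tile under, or within 10 points of, the finger
        for (UIView *tile in self.boardView.subviews) {
            if (CGRectContainsPoint(CGRectInset(tile.frame, -10, -10), boardPoint)) {
                self.tileBeingDragged = tile;
                break;
            }
        }
    }

    self.tileBeingDragged.center = boardPoint; // no-op while no tile is held

    if (gestureRecognizer.state == UIGestureRecognizerStateEnded ||
        gestureRecognizer.state == UIGestureRecognizerStateCancelled) {
        self.tileBeingDragged = nil; // drop the tile when the pan finishes
    }
}

Because the board owns the recognizer, a pan that starts just outside a tile still "catches" the tile as soon as the finger reaches it, which matches the behavior solution #2 asks for.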