How to flick through a deck of cards? - iPhone

I'm making an iPhone card game where each user has a deck of cards. I'd like to give the user the option to drag a finger across the cards, with each card getting highlighted while the finger is on it. This effect is already done in Uno for iPhone. My cards are UIButtons; what I tried was to set a small image for the button in the normal state and a bigger image in the highlighted state. It did achieve the effect I was looking for, but the user has to highlight each card individually to see the bigger picture.
Here is the code I used to set the normal and highlighted states of the UIButton:
// player413 is an IBOutlet to a UIButton; img and imgHigh are UIImages
[player413 setImage:img forState:UIControlStateNormal];
[player413 setImage:imgHigh forState:UIControlStateHighlighted];
Any guidelines?

Use a single view to handle all the touch interaction.
EDIT: Oh, all right.
When a view receives a touch, hitTest:withEvent: is called recursively until a view that "accepts" the touch is found.
Once hitTest:withEvent: returns a non-nil value, it's over (by default); that view "owns" the touch (see UITouch.view). Only that view gets touchesBegan/Moved/Ended/Cancelled:withEvent: callbacks.
If you want the touch to affect any card in the deck, the deck should implement touches*:withEvent:, and either set userInteractionEnabled=NO on subviews, or override hitTest:withEvent: so it returns "self" instead of one of the "cards".
Then, in touches*:withEvent:, detect which "card" the touch is on, and then do card.highlighted = YES. If you have multipleTouchEnabled=NO, you can assume that there's only one touch and use UITouch * touch = [touches anyObject].
(There are a handful of UIKit classes that somehow sit between the touch and its owning view: UIScrollView can intercept the touch and scroll instead; gesture recognizers cancel touches when they detect a gesture.)
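Something like this, as a minimal sketch: it assumes the deck is a UIView subclass (I've called it DeckView) whose subviews are the card UIButtons, and that multipleTouchEnabled is NO.
@interface DeckView : UIView
@end

@implementation DeckView

// Return self so the deck, not an individual card, owns every touch.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    return (hitView != nil) ? self : nil;
}

- (void)updateHighlightForTouches:(NSSet *)touches
{
    // Only one touch to consider, since multipleTouchEnabled is NO.
    CGPoint point = [[touches anyObject] locationInView:self];
    for (UIButton *card in self.subviews) {
        // Highlight exactly the card under the finger; the button then
        // shows the image set for UIControlStateHighlighted.
        card.highlighted = CGRectContainsPoint(card.frame, point);
    }
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self updateHighlightForTouches:touches];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self updateHighlightForTouches:touches];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UIButton *card in self.subviews) {
        card.highlighted = NO;
    }
}

@end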

Related

Passing a touch gesture

I'm looking for a solution to help "pass" a touch gesture along.
Basically I have a menu, and I want users to be able to drag and drop items from the menu to the canvas in a single continuous drag.
I have already achieved a draggable image via pan gestures (we'll call these instances Sprites). I can also instantiate a Sprite anywhere on the UIView using a button or UIImageView with touch gestures.
However, this currently requires two touches: one to touch down on the menu item button and release, creating the Sprite, and a second to touch down on the Sprite, drag it, and release it where they want. I would like to merge these so that when a user touches a menu item, the Sprite is instantiated and already inside the pan gesture, or something to that effect.
I've attached a visual description if that helps. Any help is appreciated. Thanks!
There is no way of artificially forcing a UIGestureRecognizer to recognize touches that are passed to a different view.
From the Gesture Recognizer docs:
Delivery of events initially follows the usual path: from operating system to the application object to the window object representing the window in which the touches are occurring. But before sending an event to the hit-tested view, the window object sends it to the gesture recognizer attached to that view or to any of that view's subviews. Figure 3-1 illustrates this general path, with the numbers indicating the order in which touches are received.
[Figure 3-1: event delivery path, not reproduced here]
Event delivery happens automatically by the system and is delivered to the appropriate view.
To do what you want, I would implement the UIGestureRecognizer on the subview (the view that contains your UIButton) and, on press, create an instance of your Sprite object and manipulate that object from within the gesture recognizer of the subview. Alternatively, you could use -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event to reposition the object yourself.
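A rough sketch of the first approach: it assumes Sprite is a UIImageView subclass, that the recognizer is attached to the menu item's view, and that draggedSprite and spriteImage are properties I've made up for illustration.
- (void)handleMenuPan:(UIPanGestureRecognizer *)recognizer
{
    CGPoint point = [recognizer locationInView:self.view];

    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // Create the Sprite under the finger as soon as the drag starts,
        // so one continuous gesture both creates and moves it.
        self.draggedSprite = [[Sprite alloc] initWithImage:self.spriteImage];
        self.draggedSprite.center = point;
        [self.view addSubview:self.draggedSprite];
    } else if (recognizer.state == UIGestureRecognizerStateChanged) {
        self.draggedSprite.center = point;
    } else if (recognizer.state == UIGestureRecognizerStateEnded ||
               recognizer.state == UIGestureRecognizerStateCancelled) {
        self.draggedSprite = nil; // leave the Sprite where the finger lifted
    }
}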

iOS Touches Began

What I want: touch a button and a view is added right where the touch is. Without having to lift the finger, touchesBegan/Moved should immediately start working on the new UIView, so I can touch the button and drag the new view around in one motion.
What I don't know how to do:
Stop the touch events on the button and immediately send the touch events to the new view that is directly under the finger.
One option could be to ditch the button and just use the UIView's touch handling to detect when to add the subview you want to let the user drag...
Daniel
As @Daniel suggested, ditch the button and just use a UIView, but I believe you may need a UIPanGestureRecognizer in place to get your dragging.
You could set a flag when a new UIView is created and then forward any gesture events to that view - only while the user still has their finger down from the initial touch.
After the user has lifted their finger the new view can just deal with gestures by itself by adding a UIPanGestureRecognizer to it.
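A minimal sketch of the "ditch the button" idea, implemented in the containing view (the draggedView property and the 80-point size are placeholders of mine):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self];
    // Create the new view directly under the finger.
    self.draggedView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 80, 80)];
    self.draggedView.center = point;
    [self addSubview:self.draggedView];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // This view still owns the touch, so the drag continues without lifting.
    self.draggedView.center = [[touches anyObject] locationInView:self];
}
Once the finger lifts, touchesEnded can attach a UIPanGestureRecognizer to the new view so it handles later drags by itself.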

How can I "catch" a subview during a swipe gesture on an iOS device?

I'm building an iOS board game (similar to Scrabble or Words) that involves moving little tiles around the screen, and I'm finding the user sometimes has a hard time touching and moving the tiles because they're too small. Due to the design of the game, I can't increase the size of the tiles, so I've implemented some little tricks that make touching and moving the tiles easier for the user, and they work well. The only remaining issue is that the user sometimes touches just barely outside the tile, and when they try to move it, the tile stays where it is.
I have two ideas for how I can fix this...
If a tile isn't touched when the user touches the screen, I can use the parent view's touch location to find the closest tile to the touch location and somehow forward the touch event to that tile's view.
If a tile isn't touched when the user touches the screen, I can somehow "catch" the tile when the user drags their finger over it as they attempt to move it.
I'd prefer to implement solution #2 since solution #1 has too many problems associated with it, not to mention solution #2 is a more realistic experience. That said, how can I "catch" a tile when the user drags their finger over it, and send it touch events to move it where the finger is?
Currently, my tiles are implemented as subclasses of UIView, and they handle the touch events (touchesBegan, touchesMoved, and touchesEnded) directly. So if the user touches just barely outside the view and drags their finger over the view, it doesn't receive any of the touch events, since it didn't receive the initial touchesBegan event.
Thanks in advance for all your wisdom!
Maybe you should have the "board" view handle all the dragging. When a touch begins and there is a tile at that point, then start dragging it. Otherwise check whenever the touch is moved and as soon as you find a tile, start dragging it.
You could override hitTest:withEvent: in the board view so that it can still detect when a touch hits a tile, but always return itself so that touch events go to the board view (e.g. record the subview that was hit in a separate member variable, so that you know what to start dragging later on when touch events start coming in).
More Details
When handling touches, UIView will use hitTest to find the view that should receive touch events. The default implementation checks each subview so that the "deepest" subview in the hierarchy gets the touches. In order for the board view to receive touches, you would have to disable userInteraction on all of the tile views. But that means you can't use hitTest to find the tile that was touched, since it ignores views that have userInteraction disabled.
So what I am saying is leave userInteraction enabled on the tile views. But override hitTest on the board view so that it first calls the super implementation in order to find a tile (if the result is self, the board itself was hit). No need to implement your own tile searching. Something like this:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView != self)
        self.draggingTile = hitView;
    return self;
}
Now you know what tile to move in touchesMoved. However, I don't think hitTest is called as the touch is moved, so if no tile has been picked up yet, you may have to call it manually (you can get the point and event from the touch passed to touchesMoved).
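For instance, something like this in the board view, reusing the draggingTile property from the hitTest: override above (the manual re-hit-testing is my own guess at the details):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];

    if (self.draggingTile == nil) {
        // No tile was picked up in touchesBegan; check whether the finger
        // has since moved over one. The hitTest: override sets draggingTile.
        [self hitTest:point withEvent:event];
    }
    // Messaging nil is a no-op, so this is safe before a tile is caught.
    self.draggingTile.center = point;
}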
Have you looked into the UIGestureRecognizer API? I think your best option would be to add a UIPanGestureRecognizer recognizer to your board's view which would then fire back the selector to your UIViewController.
Setup in ViewDidLoad:
UIPanGestureRecognizer *panGestureRecognizer = [[UIPanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePan:)];
[[self view] addGestureRecognizer:panGestureRecognizer];
And then implement the selector:
- (void)handlePan:(UIPanGestureRecognizer *)gestureRecognizer
{
    CGPoint currentPoint = [gestureRecognizer locationInView:[self view]];
}
When you set up the gesture recognizer you can set parameters to limit the callbacks (only recognize pans with one finger, etc.). And in the callback you can check the gesture's properties to see if the pan is continuing or coming to an end. You can also grab the current point and determine whether it's in or near a tile.
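For example, a fleshed-out callback checking the recognizer's state might look like this (the draggingTile property and the tileAtPoint: helper are hypothetical):
- (void)handlePan:(UIPanGestureRecognizer *)gestureRecognizer
{
    CGPoint currentPoint = [gestureRecognizer locationInView:[self view]];

    switch (gestureRecognizer.state) {
        case UIGestureRecognizerStateBegan:
            // Pick up the tile at (or nearest to) the touch, if any.
            self.draggingTile = [self tileAtPoint:currentPoint];
            break;
        case UIGestureRecognizerStateChanged:
            self.draggingTile.center = currentPoint;
            break;
        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
            self.draggingTile = nil;
            break;
        default:
            break;
    }
}
Setting panGestureRecognizer.maximumNumberOfTouches = 1 when you create the recognizer restricts it to one-finger pans.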

How to filter touch events for a UIScrollView?

I have a view that displays a PDF. It should be zoomable, so I also created a UIScrollView, and in its delegate I implemented viewForZoomingInScrollView to return the PDF view. So far so good.
However, when the user reaches the edge of a zoomed PDF page, I'd like to flip to the next page. Sounds easy, yet I can't seem to figure out how to do it.
I've tried some different approaches:
Using scrollViewDidScroll to detect if scrolling has reached the edge. The problem here is that if zoomScale is 1, and therefore scrolling is not possible, then this function is never called. But the UIScrollView still swallows all touch events, so I also can't detect reaching the edge in touchesMoved. Setting canCancelContentTouches to NO when not zoomed is not an option, as that would also prevent zooming in.
Subclassing UIScrollView, and forwarding some of the touch events to the next responder. Unfortunately when UIScrollView detects a drag operation and cancels the touch, touchesMoved and touchesEnded are not called even for the UIScrollView subclass anymore. Again, setting canCancelContentTouches to NO is not good, as that would also prevent some desired UIScrollView functionality.
Creating a transparent view on top of the scroll view (as a sibling of it), so that this view gets all touch events first, and then forwarding some of the touches to the scroll view. Unfortunately the scroll view doesn't respond to these calls.
I can't use touchesShouldCancelInContentView, because it doesn't get the actual touches as an argument, and whether or not I want the scroll view to handle the touch event also depends on the properties of the touch itself (e.g. a touch moving in a direction where we're already at the edge should not be cancelled by the scroll view, but a movement in the other direction could be).
It looks like whatever UIScrollView is doing is not initiated from touchesBegan / touchesMoved; instead it gets notified about the touches well before that, possibly in some undocumented way that I can't intercept or reproduce.
So is there any way to get notified about all touch movements done over a UIScrollView, while still being able to use (when certain conditions apply) the UIScrollView for zooming and scrolling?
Ok, so here's what I did in the end:
Leaving all scrolling and zooming up to UIScrollView, and handling page turning in the UIScrollViewDelegate's scrollViewDidEndDragging:willDecelerate: is almost a solution, except that this function is never called if the whole content is on-screen, so dragging / scrolling is not possible.
Swipes in this case are handled in a ViewController's touchesBegan / touchesEnded functions, but for this to work, we need to make sure that the UIScrollView does not cancel these events. However, in other cases the UIScrollView should be able to cancel touches so that it can do zooming and scrolling.
The UIScrollView should be able to cancel touches if:
- scrolling is possible (and needed) because the whole content doesn't fit on screen (zoomScale > 1 in my case), or
- the user touched the screen with two fingers, so that zooming in and out works.
When scrolling is not possible and the user single-touched the screen, touches should not be cancelled, and touch events should be forwarded to the view controller.
So I created a UIScrollView subclass:
- The subclass has a property pointing to the ViewController.
- Using the touchesXXX methods, I keep track of the current touch count.
- I forward all touch events to the ViewController.
- Finally, I've overridden touchesShouldCancelInContentView: to return NO when zoomScale <= 1 and touchCount == 1.
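A sketch of such a subclass (assuming ARC; the class and property names are mine):
@interface ForwardingScrollView : UIScrollView
@property (nonatomic, weak) UIViewController *viewController;
@property (nonatomic) NSUInteger touchCount;
@end

@implementation ForwardingScrollView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchCount += [touches count];
    [self.viewController touchesBegan:touches withEvent:event];
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.viewController touchesMoved:touches withEvent:event];
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchCount -= [touches count];
    [self.viewController touchesEnded:touches withEvent:event];
    [super touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchCount -= [touches count];
    [self.viewController touchesCancelled:touches withEvent:event];
    [super touchesCancelled:touches withEvent:event];
}

// Only let the scroll view cancel (and take over) touches when it can
// actually scroll, or when two fingers are down for pinch zooming.
- (BOOL)touchesShouldCancelInContentView:(UIView *)view
{
    if (self.zoomScale <= 1.0 && self.touchCount == 1)
        return NO;
    return [super touchesShouldCancelInContentView:view];
}

@end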

touch multi UIViews

There is a series of UIViews arranged very close together.
[image: http://www.mobilepanda.com/questiontouch.png]
When my finger touches some of them, I want my app to detect which UIViews were touched — maybe one, two, or three, because the displayed part of each UIView is very thin.
I'd like to get the middle x value of the touch, then spread out the UIView located at that x value and the UIViews near it.
[image: http://www.mobilepanda.com/questiontouch1.png]
My idea is to put a transparent UIView over all these UIViews to detect the touch event.
I'm not sure if this is OK, or whether there is a better solution (for example, giving each UIView the capability to detect the touch, then combining the results to decide which UIView was touched).
Any comments are welcome.
Thanks,
interdev
You don't need to do all that. The OS will decide what the center point of the finger touch is and send an event with the touch's x,y coordinates to the correct view. If you make them UIButtons (a subclass of UIView) instead of UIViews, the OS will do all the work for you. All you have to do is attach callbacks from each button to the functions you want called for the various events (like UIControlEventTouchDown, UIControlEventTouchUpInside, etc.).
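A minimal sketch of that wiring (the frame and the cardTouched: handler are placeholders of mine):
// In the view controller that owns the card views:
UIButton *cardButton = [UIButton buttonWithType:UIButtonTypeCustom];
cardButton.frame = CGRectMake(0, 0, 40, 120); // thin, like the stacked views
[cardButton addTarget:self
               action:@selector(cardTouched:)
     forControlEvents:UIControlEventTouchDown];
[self.view addSubview:cardButton];

// The OS delivers the touch to whichever button is under the finger.
- (void)cardTouched:(UIButton *)sender
{
    // Spread this button and its neighbors here.
}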