I have a UIView control (white rectangle in image)
Moreover I am able to drag that control...
When I press the button, I load a subview from another nib I created; I placed random controls in it to illustrate my point...
If you're interested in how I placed that nib file inside that UIView control, take a look at this question. I don't think you have to read it in order to understand my question.
Anyway, the problem when loading that nib file is that I can no longer drag the top UIView. Because of this I changed:
for:
in the UIView of the subview, in other words the UIView of the nib file that I am placing in the UIView that has the white background.
When I did that I was able to drag the control, but the controls inside the subview no longer work. I have also tried placing the touchesMoved method in the subview instead, but when I do that the application behaves strangely. Plus, the whole point of placing the nib file in a UIView control was to avoid repeating the same drag functionality across several nib files.
I actually need to create an application like a PowerPoint presentation: I need to change the slide as the user slides the UIView, and if its coordinates are less than some x, for example, then I load the next slide (nib file) into that UIView. Maybe there is a simpler way of doing what I need, but if I get this drag to work I am done, because I would only have to write that functionality once.
You should leave the userInteractionEnabled flag on for your subview if you want it to respond to events.
One way to achieve this would be to do your dragging using a UIGestureRecognizer.
UIPanGestureRecognizer is perfect for this (UIGestureRecognizer at apple)
Basically you'd attach the gesture recognizer to the view which you want to pan, then adjust its position in the callbacks it provides.
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePanGesture:)];
panGesture.minimumNumberOfTouches = 1;
[draggableSubview addGestureRecognizer:panGesture];
[panGesture release];
Then in the handlePanGesture: method you figure out how far the user panned using the translationInView: method of the recognizer it gets passed, and translate the subview accordingly.
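A minimal sketch of such a handler, assuming the recognizer was attached to the view you want to drag as above (this isn't the poster's exact code):
- (void)handlePanGesture:(UIPanGestureRecognizer *)recognizer
{
    // How far the finger has moved since the last reset.
    CGPoint translation = [recognizer translationInView:recognizer.view.superview];
    // Shift the dragged view by that amount.
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    // Reset so the next callback reports only the new movement.
    [recognizer setTranslation:CGPointZero inView:recognizer.view.superview];
}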
Related
I'm creating an app for the iPhone in Xcode where the background image is much larger than the screen. I've added a pan gesture recognizer to the image view of the background, which allows it to be moved freely so the user can see beyond the character they control.
This is working perfectly, but when I add in additional image views (for the character and other objects throughout the area) they remain stationary. Is there a way to connect multiple image views to a single pan gesture recognizer so that when one is moved, they all move together?
I tried connecting the other image views to the referencing outlet collections of the pan gesture recognizer, but when the program is run it only pans the last image view connected, while the others remain stationary.
I'm new to Xcode, and I'm sure this is something relatively simple that I'm overlooking, but I haven't been able to find an answer on the internet, so any help would be appreciated. :)
Thanks!
The best way is to make the character (and other views) a subview of the backgroundImage.
Note that in the NIB (.xib) file you cannot add a subview to a UIImageView, but you can do it programmatically.
If you prefer working in the NIB file you can make a "container" UIView that holds the background image and other characters inside that view.
Then for the gesture recognisers you'll transform that view.
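Roughly, the container approach could look like this (containerView, backgroundImageView and characterImageView are illustrative names, not from the question):
// e.g. in viewDidLoad: build a container that holds everything that should pan together
UIView *containerView = [[UIView alloc] initWithFrame:self.view.bounds];
[containerView addSubview:backgroundImageView];   // the large background
[containerView addSubview:characterImageView];    // the character and other objects
[self.view addSubview:containerView];

UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePan:)];
[containerView addGestureRecognizer:pan];

// the handler translates the whole container, so everything moves together
- (void)handlePan:(UIPanGestureRecognizer *)recognizer
{
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.transform = CGAffineTransformTranslate(recognizer.view.transform,
                                                           translation.x, translation.y);
    [recognizer setTranslation:CGPointZero inView:self.view];
}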
I have an atypical iOS interface. Perhaps it's not practical but I'm giving it a go. Hope someone can help!
I have a menu in the form of a UIView. It contains 5 small UIImageViews. A UIPinchGestureRecognizer is attached to the UIView. When pinched inward, the 5 UIImageViews animate from off screen to form a circle in the middle of the window. When pinched outward, they animate back offscreen. Everything works great there.
I'd like to be able to, at any point in the application, pinch the screen to reveal the menu, select one of the 'buttons' (UIImageView), and load the associated subview.
The real problem is, if the currently visible view is a UIScrollView or UITableView, my app is having trouble figuring out whether the menu or the other subview should handle the touch event. If I really focus and make sure two fingers touch the screen at the EXACT same time, the pinch will work and pull the menu inward. But otherwise, it attempts to scroll the currently visible view.
I would like all events except the pinch gesture, (and a tap gesture when the menu is visible), to pass through the menu view to the rest of the subviews.
I understand I can override the hitTest:withEvent: method to determine the correct view to handle the event, but I'm unclear at this point how exactly to use it. Neither the Apple docs nor any answers I've read on Stack Overflow have made this method clear to me.
Any help is much appreciated.
As UITableView is a subclass of UIScrollView, it inherits all of UIScrollView's properties including its gesture recognisers.
UIScrollView declares a UIPinchGestureRecognizer and UIPanGestureRecognizer. I'm not sure of the implementation details but I imagine the UITableView disables the pinch gesture recogniser as you are not supposed to be able to zoom a tableview!
In any case, you can attach your own UIPinchGestureRecognizer to the table view:
UIPinchGestureRecognizer *yPGR = [[UIPinchGestureRecognizer alloc]
initWithTarget:probablySelf action:yourMenuShowSelectorHere];
UITableView *tv = ...
// ...
[tv addGestureRecognizer:yPGR];
Then, you can make sure that the UITableView does NOT scroll until your pinch has failed:
[tv.panGestureRecognizer requireGestureRecognizerToFail:yPGR];
This way, the UITableView will not scroll until it is sure that it has not detected a pinch.
EDIT: UIScrollView only uses (or at least declares public access to) UIGestureRecognizers in iOS 5 and up.
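The pinch handler itself (the action you pass for yourMenuShowSelectorHere) could then be something along these lines; showMenu and hideMenu stand in for whatever animation methods you already have:
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateEnded) {
        if (recognizer.scale < 1.0) {
            [self showMenu];   // pinched inward: pull the menu on screen
        } else {
            [self hideMenu];   // pinched outward: push it back off screen
        }
    }
}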
I have a UIButton underneath a (transparent) UIView. The UIView above has a UISwipeGestureRecognizer added to it, and that is its only purpose - to detect certain swipe gestures. I want all other touches to be ignored by that UIView, and passed to other views (such as my UIButton underneath). Currently, the UIView above seems to be detecting the tap (for example), doing nothing (as it should be), and not letting the UIButton underneath get a chance to respond.
I would prefer not to implement my own swipe recognizer, if possible. Any solutions / advice? I basically just want to know how to tell a UIView to pay attention to only a certain type of added gesture recognizer, and ignore (and thus let through to views behind) all other touches.
Have you set:
mySwipeGesture.cancelsTouchesInView = NO;
to allow the touches to be sent to the view hierarchy as well as the gesture?
Additionally, ensure that the view on top is:
theTransparentView.opaque = NO;
theTransparentView.userInteractionEnabled = YES;
I've had pretty good success attaching gestures to the parent view without needing to create a transparent subview on top for the gesture. Are you sure you need to do that?
I must have just been in a funk yesterday - I woke up with a simple solution today. Add the UISwipeGestureRecognizer to a view which is a superview of both the UIView and the UIButton. Then, when processing those swipes, figure out where the swipe originated, and whether that point is in the frame of where I used to have the UIView. (As I mentioned, the only reason for the existence of the UIView was to define a target area for these swipe gestures.)
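A sketch of that check, assuming swipeTargetFrame holds the rect where the old UIView used to sit, expressed in the superview's coordinates:
- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer
{
    // Location of the swipe in the superview the recognizer is attached to.
    CGPoint location = [recognizer locationInView:recognizer.view];
    if (CGRectContainsPoint(swipeTargetFrame, location)) {
        // The swipe happened over the old target area, so treat it as a real swipe.
    }
    // Anything outside that rect is ignored, and normal taps still reach the UIButton.
}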
Can't you put your button on top of the view and add gesture recognisers to that button too?
In the end, your UIButton inherits from UIView via UIControl. Therefore there is practically nothing that you could do with a view but not with a button.
In my case, I fixed it by not using a button, but rather a UITapGestureRecognizer. My pan gesture recognizer was added to the background view, and the tap gesture was added to a view above it.
I have my UIWindow structured in Interface Builder as:
window
    drawingView (custom UIView)
    toolbar (UIToolbar)
Where the drawingView handles touches using touchesBegan etc. However, when I try to add 'someView' (a custom UIView) which has UIButtons on it as a subview of window, the buttons don't receive any of the touches.
I create 'someView' with:
[[[UIApplication sharedApplication] keyWindow] addSubview:someView];
'someView' displays fine on top of the drawingView, but the touches don't seem to register at all, and get passed through to drawingView. Why is that?
Another odd thing is that the backgroundColor of someView always seems to be clear, even if I set it to something else programmatically or in IB.
Also, when I create someView using
[[UIPopoverController alloc] initWithContentViewController:someView];
it works fine, handling touches and all. The reason I want to add it as a subview is because I want a more general way of adding someView that also works on the iPhone.
Make sure the userInteractionEnabled flag is set to YES on all parents of the view that needs to receive touches.
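One quick way to verify that (a debugging sketch; someButton stands in for one of the buttons on someView):
// Walk up from the button and log any ancestor that blocks touches.
UIView *ancestor = someButton;
while (ancestor != nil) {
    if (!ancestor.userInteractionEnabled) {
        NSLog(@"Touches are blocked by %@", ancestor);
    }
    ancestor = ancestor.superview;
}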
I've figured out a fix (in a hackish kind of way).
I had to add 'someView' as a subview of drawingView; it appears on top of drawingView when shown, and allows me to programmatically set up someView with UIButtons that receive touches.
I think it's because drawingView is an OpenGL view that overrides layerClass, which does something funky with how the views are arranged to receive touches.
UIView -> UIImageView
I know I have things somewhat working ok since I can tap on my UIImageView and see an NSLog() statement in my touchesBegan method.
UIView -> UIScrollView -> UIImageView
I drag that same UIImageView into a UIScrollView and touchesBegan no longer gets called when I tap on my UIImageView. (I haven't changed anything else. All the same connections, methods, and code remain unchanged.)
Why does touchesBegan no longer work? And what can I do to get it working again?
Add a UITapGestureRecognizer to receive the tap event.
The code is:
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
[imageView addGestureRecognizer:tapGesture];
Then create an action method named handleTap:; it will be called there when the image view is tapped.
By default, UIImageView instances don't handle user interaction.
Set the UIImageView instance's userInteractionEnabled property to YES.
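For example, assuming imageView is the outlet for your UIImageView:
imageView.userInteractionEnabled = YES;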
Have a look at the documentation for UIScrollView.
Because a scroll view has no scroll bars, it must know whether a touch signals an intent to scroll versus an intent to track a subview in the content. To make this determination, it temporarily intercepts a touch-down event by starting a timer and, before the timer fires, seeing if the touching finger makes any movement. If the timer fires without a significant change in position, the scroll view sends tracking events to the touched subview of the content view. If the user then drags their finger far enough before the timer elapses, the scroll view cancels any tracking in the subview and performs the scrolling itself. Subclasses can override the touchesShouldBegin:withEvent:inContentView:, pagingEnabled, and touchesShouldCancelInContentView: methods (which are called by the scroll view) to affect how the scroll view handles scrolling gestures.
I'd also recommend reading the Scroll View Programming Guide.
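If you do want the image view to keep receiving touchesBegan/touchesMoved while it sits inside the scroll view, one option is a UIScrollView subclass that stops the scroll view from cancelling touches delivered to image views. This is only a sketch built on the methods named in the documentation quoted above, and the image view still needs userInteractionEnabled set to YES:
@interface NonCancellingScrollView : UIScrollView
@end

@implementation NonCancellingScrollView

// Returning NO keeps touches flowing to the image view instead of letting
// the scroll view cancel them once it decides to scroll.
- (BOOL)touchesShouldCancelInContentView:(UIView *)view
{
    if ([view isKindOfClass:[UIImageView class]]) {
        return NO;
    }
    return [super touchesShouldCancelInContentView:view];
}

@end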