I'm trying to overlay a transparent image over my app to explain the controls of that view. When the UIImageView is first touched it should simply disappear. My MainViewController has the touchesBegan/Ended methods implemented, and they work fine until I connect the UIImageView to an IBOutlet in MainViewController. Then they simply stop responding to events. What am I doing wrong?
-MainView
--ScrollView
--OtherView
--UIImageView (this is the overlay)
UIImageViews do not have user interaction turned on by default. If you want it on, either set the User Interaction Enabled property in IB or do it in code:
[myImageView setUserInteractionEnabled:YES];
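For the overlay question above, a minimal sketch of how this could look in MainViewController, assuming the outlet is named overlayImageView (a hypothetical name):

// With interaction enabled, the overlay becomes the hit-test view for touches on it;
// since UIImageView doesn't handle them, they travel up the responder chain to the controller.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.overlayImageView.userInteractionEnabled = YES;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if (touch.view == self.overlayImageView) {
        self.overlayImageView.hidden = YES;   // hide the help overlay on the first touch
        return;
    }
    [super touchesBegan:touches withEvent:event];
}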
Related
I am trying to capture a touch through a UILabel, but am having trouble. Here is my scenario.
I have a UIButton as a subview of a UIScrollView. I also have a UILabel as a subview of the same UIScrollView. The frame of the UILabel overlaps that of the UIButton, and thus (as far as I can tell) occludes the UIButton from being pressed.
I am trying to create a scenario where the user can touch through the UILabel (it has a transparent background, so the button is completely visible, less the label's text).
Is this possible?
I know touches behave differently when there is a UIScrollView involved. Is that impeding the touches?
Anyone have any advice?
Cheers,
Brett
myLabel.userInteractionEnabled = NO;
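UILabel has userInteractionEnabled set to NO by default, so as long as you don't turn it on, touches fall straight through to the button underneath. A minimal sketch of the setup, with placeholder names:

UIButton *button = [UIButton buttonWithType:UIButtonTypeRoundedRect];
button.frame = CGRectMake(20.0f, 20.0f, 200.0f, 44.0f);
[button setTitle:@"Tap me" forState:UIControlStateNormal];
[scrollView addSubview:button];

UILabel *label = [[UILabel alloc] initWithFrame:button.frame];   // overlaps the button
label.backgroundColor = [UIColor clearColor];
label.text = @"Overlay text";
label.userInteractionEnabled = NO;   // the default; touches pass through to the button
[scrollView addSubview:label];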
Creating a transparent UIButton on top of a UILabel, and inside a UIScrollView at that, is a design issue for me. If you have to do it then you don't have a choice, but it won't work seamlessly, so don't expect users not to complain. If I don't see a button there and scrolling through the view accidentally triggers its action, I'm irritated.
It is possible to create such a UI, though.
I have a UIButton underneath a (transparent) UIView. The UIView above has a UISwipeGestureRecognizer added to it, and that is its only purpose - to detect certain swipe gestures. I want all other touches to be ignored by that UIView, and passed to other views (such as my UIButton underneath). Currently, the UIView above seems to be detecting the tap (for example), doing nothing (as it should be), and not letting the UIButton underneath get a chance to respond.
I would prefer not to implement my own swipe recognizer, if possible. Any solutions / advice? I basically just want to know how to tell a UIView to pay attention to only a certain type of added gesture recognizer, and ignore (and thus let through to views behind) all other touches.
Have you set:
mySwipeGesture.cancelsTouchesInView = NO;
to allow the touches to be sent to the view hierarchy as well as the gesture?
Additionally, ensure that the view on top is:
theTransparentView.opaque = NO;
theTransparentView.userInteractionEnabled = YES;
I've had pretty good success attaching gestures to the parent view without needing to create a transparent subview on top for the gesture. Are you sure you need to do that?
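A sketch of that last suggestion, attaching the swipe recognizer straight to the container view (the view and selector names are placeholders):

UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleSwipe:)];
swipe.direction = UISwipeGestureRecognizerDirectionLeft;
swipe.cancelsTouchesInView = NO;   // taps still reach the button underneath
[containerView addGestureRecognizer:swipe];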
I must have just been in a funk yesterday - I woke up with a simple solution today. Add the UISwipeGesture to a view which is a superview to both the UIView and the UIButton. Then, when processing those swipes, figure out where the swipe originated, and whether that point is in the frame of where I used to have the UIView. (As I mentioned, the only reason for the existence of the UIView was to define a target area for these swipe gestures.)
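Roughly, the swipe handler could then filter by location like this (targetRect stands in for the frame the old UIView occupied; purely illustrative):

- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    CGPoint location = [recognizer locationInView:recognizer.view];   // where the swipe started
    CGRect targetRect = CGRectMake(0.0f, 100.0f, 320.0f, 200.0f);     // former frame of the UIView
    if (CGRectContainsPoint(targetRect, location)) {
        // the swipe originated in the old target area, so handle it
    }
}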
Can't you put your button on top of the view and add gesture recognisers to that button too?
In the end, your UIButton inherits from UIView via UIControl, so there is practically nothing you can do with a view that you can't do with a button.
In my case, I fixed it by not using a button, but rather a UITapGestureRecognizer. My pan gesture recognizer was added to the background view, and the tap gesture was added to a view above it.
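Something along those lines, with the pan on the background view and the tap on the view above it (names are placeholders):

UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePan:)];
[backgroundView addGestureRecognizer:pan];

UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleTap:)];
[overlayView addGestureRecognizer:tap];   // takes the place of the button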
I have my UIWindow structured in Interface Builder as:
window
drawingView (custom UIView)
toolbar (UIToolbar)
Where the drawingView handles touches using touchesBegan etc. However, when I try and add 'someView' (a custom UIView) which has UIButtons on it as a subview of window, the buttons don't receive any of the touches.
I create 'someView' with:
[[[UIApplication sharedApplication] keyWindow] addSubview:someView];
'someView' displays fine on top of the drawingView, but the touches don't seem to register at all, and get passed through to drawingView. Why is that?
Another odd thing is that the backgroundColor of someView always seems to be clear, even if I set it to something else programmatically or in IB.
Also, when I create someView using
[[UIPopoverController alloc] initWithContentViewController:someView];
it works fine, handling touches and all. The reason I want to add it as a subview is that I want a more general way of adding someView, one that works on the iPhone as well.
Make sure the userInteractionEnabled flag is set on all parents of the view that needs to receive touches.
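One quick way to verify this at runtime is to walk up the superview chain and log the flag (a debugging sketch, not part of the original setup):

UIView *view = someView;
while (view != nil) {
    NSLog(@"%@ userInteractionEnabled = %d", NSStringFromClass([view class]), view.userInteractionEnabled);
    view = view.superview;
}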
I've figured out a fix (in a hackish kind of way).
I had to add 'someView' as a subview of drawingView; it appears on top of drawingView when shown, and lets me programmatically set up someView with UIButtons that receive touches.
I think it's because drawingView is an OpenGL view that overrides layerClass, which does something funky with how the views are arranged to receive touches. I think.
I have a number of UIImageView which have buttons on top of them.
I would like to enable user interaction on the UIImageView behind these buttons.
I see the option in IB, but would like to know how to trigger some code when the UIImageView is actually touched.
How does one do this and how is it set to enabled and disabled in the code rather than IB?
Thanks
how to trigger some code when the UIImageView is actually touched.
You have two options:
Create an instance of UITapGestureRecognizer (or another gesture recognizer), specifying a target and an action method. Then add the gesture recognizer to the image view with -[UIView addGestureRecognizer:]. Works in OS 3.2+.
Subclass UIImageView and override the -touches... methods. Make sure the image views you create are instances of your custom subclass.
See the documentation for details.
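A minimal sketch of the first option, in the view controller that owns the image view (the outlet and selector names are placeholders):

- (void)viewDidLoad {
    [super viewDidLoad];
    self.imageView.userInteractionEnabled = YES;   // off by default, see below
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(imageViewTapped:)];
    [self.imageView addGestureRecognizer:tap];
}

- (void)imageViewTapped:(UITapGestureRecognizer *)recognizer {
    // recognizer.view is the image view that was tapped
}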
how is it set to enabled and disabled in the code rather than IB
Simple: imageView.userInteractionEnabled = YES;
I have a UIImageView and a referencing outlet. However, although I've enabled User Interaction on the image view, I cannot see any events in IB. Is that normal, or am I missing something?
UIImageView is not a kind of UIControl, so you cannot wire up events to it in IB. You have to use a UIButton with a background image, or subclass UIImageView and override the -touches... methods.
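If you go the subclass route, a minimal sketch might look like this (the class name is illustrative):

@interface TappableImageView : UIImageView
@end

@implementation TappableImageView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // react to the touch; instances still need userInteractionEnabled = YES
    NSLog(@"image view touched");
}
@end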