I am trying to capture a touch through a UILabel, but am having trouble. Here is my scenario.
I have a UIButton as a subview of a UIScrollView. I also have a UILabel as a subview of the same UIScrollView. The frame of the UILabel overlaps that of the UIButton, and thus (as far as I can tell) occludes the UIButton from being pressed.
I am trying to create a scenario where the user can touch through the UILabel (it has a transparent background, so the button is completely visible, aside from the label's text).
Is this possible?
I know touches behave differently when there is a UIScrollView involved. Is that impeding the touches?
Anyone have any advice?
Cheers,
Brett
myLabel.userInteractionEnabled = NO;
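A minimal sketch of the whole setup, with illustrative names (`scrollView`, `actionButton`, `overlayLabel` are not from the question). Disabling user interaction on the label removes it from hit-testing, so touches fall through to the button beneath:

```swift
import UIKit

class OverlayViewController: UIViewController {
    let scrollView = UIScrollView()
    let actionButton = UIButton(type: .system)
    let overlayLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.frame = view.bounds
        view.addSubview(scrollView)

        actionButton.frame = CGRect(x: 20, y: 20, width: 200, height: 44)
        actionButton.setTitle("Tap me", for: .normal)
        scrollView.addSubview(actionButton)

        overlayLabel.frame = actionButton.frame
        overlayLabel.text = "Overlay text"
        overlayLabel.backgroundColor = .clear
        // The key line: the label drops out of hit-testing,
        // so touches land on the button underneath it.
        overlayLabel.isUserInteractionEnabled = false
        scrollView.addSubview(overlayLabel)
    }
}
```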
Creating a transparent UIButton on top of a UILabel, inside a UIScrollView no less, is a design issue in my view. If you have to do it, then you don't have a choice, but it won't work seamlessly, so don't expect users not to complain. If I don't see a button there and scrolling through the view accidentally triggers a button action, I'm irritated.
That said, it is possible to create such a UI.
Related
I have a UIImageView as subview in my scrollView.
I want to be able to pan up/down on the UIImageView to adjust the color of the UIImage.
I also want to be able to pan/zoom around the image (that's why I implemented the scroll view).
I have an adjustColor UIButton whose IBAction adds a UIPanGestureRecognizer targeting the function below:
@objc func panned(_ gesture: UIPanGestureRecognizer) {
...
}
My problem is that the adjustColor button is ignored by scrollView's scroll behaviour.
If I delete the scrollview and add the UIImage, the adjustColor button activates the color adjustment function and the gestures work perfectly.
On the other hand, if I have the scroll view with the image as its subview, my adjustColor button has no functionality.
Any help would be appreciated.
Thank you.
Make sure to connect the UIGestureRecognizer IBOutlet to the scroll view.
I have a subview layering problem where I have a rotating arrow and a UIButton. The arrows rotate and the UIButton changes depending on the rotation of the arrows. The problem is that I need to have the UIButton clickable. At the moment the arrows rotate but the UIButton is not touchable. If I try [self.view sendSubviewToBack:wheelControl]; the arrows are sent to the back and are no longer visible.
Thanks for any help on how I might fix this.
You need to call
[theView bringSubviewToFront:theButton];
as the last call when laying out your subviews.
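A minimal sketch of that fix, reusing the `wheelControl` and `theButton` names from the question and answer (the rest is illustrative). Reordering the subviews keeps the arrows visible while putting the button on top for touches:

```swift
import UIKit

class WheelViewController: UIViewController {
    let wheelControl = UIView()   // stands in for the rotating-arrow view
    let theButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(theButton)
        view.addSubview(wheelControl)   // added last, so it covers the button
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // Reorder instead of sending the arrows to the back:
        // the arrows stay visible, but the button sits on top for touches.
        view.bringSubviewToFront(theButton)
    }
}
```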
I have a UIButton underneath a (transparent) UIView. The UIView above has a UISwipeGestureRecognizer added to it, and that is its only purpose - to detect certain swipe gestures. I want all other touches to be ignored by that UIView, and passed to other views (such as my UIButton underneath). Currently, the UIView above seems to be detecting the tap (for example), doing nothing (as it should be), and not letting the UIButton underneath get a chance to respond.
I would prefer not to implement my own swipe recognizer, if possible. Any solutions / advice? I basically just want to know how to tell a UIView to pay attention to only a certain type of added gesture recognizer, and ignore (and thus let through to views behind) all other touches.
Have you set:
mySwipeGesture.cancelsTouchesInView = NO;
to allow the touches to be sent to the view hierarchy as well as the gesture?
Additionally, ensure that the view on top is:
theTransparentView.opaque = NO;
theTransparentView.userInteractionEnabled = YES;
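Put together, a sketch of the configuration this answer describes (view and gesture names are illustrative, not from the question):

```swift
import UIKit

class SwipeOverlayViewController: UIViewController {
    let underButton = UIButton(type: .system)
    let transparentView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(underButton)
        view.addSubview(transparentView)

        transparentView.isOpaque = false
        transparentView.backgroundColor = .clear
        transparentView.isUserInteractionEnabled = true

        let swipe = UISwipeGestureRecognizer(target: self,
                                             action: #selector(didSwipe))
        // Let touches continue to the view hierarchy instead of being
        // cancelled once the recognizer starts tracking them.
        swipe.cancelsTouchesInView = false
        transparentView.addGestureRecognizer(swipe)
    }

    @objc func didSwipe(_ gesture: UISwipeGestureRecognizer) {
        // handle the swipe
    }
}
```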
I've had pretty good success attaching gestures to the parent view without needing to create a transparent subview on top for the gesture. Are you sure you need to do that?
I must have just been in a funk yesterday - I woke up with a simple solution today. Add the UISwipeGesture to a view which is a superview to both the UIView and the UIButton. Then, when processing those swipes, figure out where the swipe originated, and whether that point is in the frame of where I used to have the UIView. (As I mentioned, the only reason for the existence of the UIView was to define a target area for these swipe gestures.)
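A sketch of that approach, assuming the old UIView's frame is kept as a plain `CGRect` (names are illustrative). The recognizer lives on the common superview, and the handler checks where the swipe happened before acting. Note that `location(in:)` on a swipe recognizer is only an approximation of the swipe's origin:

```swift
import UIKit

class SwipeRegionViewController: UIViewController {
    let underButton = UIButton(type: .system)
    // The frame the transparent UIView used to occupy.
    var swipeTargetFrame = CGRect(x: 0, y: 100, width: 320, height: 200)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(underButton)
        // Attach the recognizer to the common superview, not an overlay.
        let swipe = UISwipeGestureRecognizer(target: self,
                                             action: #selector(didSwipe))
        view.addGestureRecognizer(swipe)
    }

    @objc func didSwipe(_ gesture: UISwipeGestureRecognizer) {
        // Only treat the swipe as meaningful if it happened inside
        // the region the transparent view used to define.
        let point = gesture.location(in: view)
        if swipeTargetFrame.contains(point) {
            // handle the swipe
        }
    }
}
```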
Can't you put your button on top of the view and add gesture recognizers to that button too?
In the end, your UIButton inherits from UIView via UIControl. Therefore there is practically nothing you can do with a view that you can't do with a button.
In my case, I fixed it by not using a button, but rather a UITapGestureRecognizer. My pan gesture recognizer was added to the background view, and the tap gesture was added to a view above it.
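A sketch of that arrangement (names are illustrative): the pan recognizer stays on the background view, and a tap recognizer on the view above takes the place of the button:

```swift
import UIKit

class GestureSplitViewController: UIViewController {
    let backgroundView = UIView()
    let overlayView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(backgroundView)
        view.addSubview(overlayView)

        let pan = UIPanGestureRecognizer(target: self, action: #selector(didPan))
        backgroundView.addGestureRecognizer(pan)

        // A tap recognizer on the overlay replaces the UIButton.
        let tap = UITapGestureRecognizer(target: self, action: #selector(didTap))
        overlayView.addGestureRecognizer(tap)
    }

    @objc func didPan(_ gesture: UIPanGestureRecognizer) { /* handle pan */ }
    @objc func didTap(_ gesture: UITapGestureRecognizer) { /* handle tap */ }
}
```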
As described, I have a UIButton subclass, that I am designing in IB. I have set the button subclass to a UIView, and set an image to the button as well. I have set a UILabel beneath the image, attempting to give it the Finder look. Everything works great, except for the fact that the right 1/3rd of the image won't respond to touch!
It is the strangest thing. The button bounds are set to encapsulate the entire image, but that right 1/3rd won't respond.
Has anyone seen this before? Does anybody know what's going on?
Thanks
How can I add a UIButton in a CALayer and hookup the touch event?
A CALayer is not an event responder, so trying to hook it up to a touch event handler will do nothing.
If you want a button that actually works on top of a CALayer, put that CALayer into a UIView (which is a subclass of UIResponder), and add a UIButton to that view (so it can get added to the event response chain).
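A sketch of that arrangement (names are illustrative): the layer is added as a sublayer of a host view's layer, and the button, being a UIResponder, handles the touch the layer cannot:

```swift
import UIKit

class LayerHostViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Host the CALayer inside a UIView so it sits in the view hierarchy.
        let hostView = UIView(frame: CGRect(x: 0, y: 0, width: 200, height: 200))
        let decorativeLayer = CALayer()
        decorativeLayer.frame = hostView.bounds
        decorativeLayer.backgroundColor = UIColor.systemBlue.cgColor
        hostView.layer.addSublayer(decorativeLayer)
        view.addSubview(hostView)

        // The button sits on top of the layer-backed view and, as a
        // UIResponder, receives the touch events the layer cannot.
        let button = UIButton(type: .system)
        button.frame = hostView.bounds
        button.addTarget(self, action: #selector(buttonTapped),
                         for: .touchUpInside)
        hostView.addSubview(button)
    }

    @objc func buttonTapped(_ sender: UIButton) { /* handle tap */ }
}
```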
In iOS, all UIViews own and draw themselves through a CALayer. You probably want to create a UIView for your button to go in. Everything you can do with raw CALayers, you can do with UIViews.