I have a UILabel. I want to make the touched area in it glow. That is, when I tap a point in that UILabel, a small circular area around the touch point should be shown with a glowing effect. How can I achieve this?
Well, you can do that by creating a CALayer or a CAGradientLayer (depending on how you want your glow to look) and adding it as a sublayer of the label's layer at the location of the touch.
To enable touches on the UILabel, look at the userInteractionEnabled property. You will need to set it to YES.
Then attach a UITapGestureRecognizer to the label to get the tap. Once you have the touch location, add the custom glow layer as a sublayer of the label's layer in an invisible state, then animate it in and out. You might want to repeat that a few times before removing the glow layer on completion.
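A minimal sketch of that approach, assuming the label is exposed as self.label on its view controller (the layer size, colour, and timing values are purely illustrative):

#import <QuartzCore/QuartzCore.h>

// In the view controller that owns the label.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.label.userInteractionEnabled = YES;
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(labelTapped:)];
    [self.label addGestureRecognizer:tap];
}

- (void)labelTapped:(UITapGestureRecognizer *)recognizer {
    CGPoint point = [recognizer locationInView:self.label];

    // Small circular layer centred on the touch point, initially invisible.
    CALayer *glow = [CALayer layer];
    glow.bounds = CGRectMake(0, 0, 40, 40);
    glow.cornerRadius = 20;
    glow.position = point;
    glow.backgroundColor = [UIColor yellowColor].CGColor;
    glow.opacity = 0.0;
    [self.label.layer addSublayer:glow];

    // Pulse the opacity a couple of times, then remove the layer.
    CABasicAnimation *pulse = [CABasicAnimation animationWithKeyPath:@"opacity"];
    pulse.fromValue = [NSNumber numberWithFloat:0.0];
    pulse.toValue = [NSNumber numberWithFloat:0.8];
    pulse.duration = 0.3;
    pulse.autoreverses = YES;
    pulse.repeatCount = 2;
    [glow addAnimation:pulse forKey:@"pulse"];

    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.2 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [glow removeFromSuperlayer];
    });
}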
UILabels don't respond to touches by default.
Use a UIButton with a custom type and provide images for the normal state and the (glowing) highlighted state.
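Something like this, set up wherever you build the UI; the two image names are placeholders for whatever artwork you provide:

UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
button.frame = CGRectMake(20, 100, 200, 40);  // wherever the label used to be
[button setTitle:@"Some text" forState:UIControlStateNormal];
[button setBackgroundImage:[UIImage imageNamed:@"label_normal.png"]
                  forState:UIControlStateNormal];
[button setBackgroundImage:[UIImage imageNamed:@"label_glow.png"]
                  forState:UIControlStateHighlighted];
[self.view addSubview:button];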
Gesture recognizers are nice, but if you want to do something that starts when the finger touches, stops when the finger stops touching, and moves around with the finger in between, then it's hard to think of a good gesture recognizer for that. I think in that case you'd be better off just using touchesBegan, touchesMoved, and touchesEnded (don't forget touchesCancelled).
You can either put those methods on your view controller or subclass UILabel. Either way, set userInteractionEnabled = YES on the label.
As for how to graphically make that effect, I don't have any clever ideas for it at the moment.
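Here is a rough skeleton of the touch-tracking part at least, as a UILabel subclass (the glow drawing itself is left out, as noted above):

@interface GlowLabel : UILabel
@end

@implementation GlowLabel

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    // start the glow at point
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    // move the glow to follow point
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // fade out and remove the glow
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}

@end

Remember that userInteractionEnabled still needs to be YES on the label instance.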
Related
I have two UIImageViews inside a UIView in my application. I need to zoom, rotate, and move the UIImageView that is underneath the top one; I don't need to do anything with the top one. I've already implemented the code for move, rotate, and zoom, but the problem is that I can't enable touches on the bottom UIImageView.
How to solve this problem?
UIImageViews have userInteractionEnabled set to NO by default. You have to explicitly set it to YES on the image view you want to receive touch events.
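For example, assuming topImageView and bottomImageView are your two image views and the bottom one is the one you want to manipulate:

bottomImageView.userInteractionEnabled = YES;  // NO by default for UIImageView
topImageView.userInteractionEnabled = NO;      // let touches fall through to the bottom one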
I have a UIButton underneath a (transparent) UIView. The UIView above has a UISwipeGestureRecognizer added to it, and that is its only purpose - to detect certain swipe gestures. I want all other touches to be ignored by that UIView, and passed to other views (such as my UIButton underneath). Currently, the UIView above seems to be detecting the tap (for example), doing nothing (as it should be), and not letting the UIButton underneath get a chance to respond.
I would prefer not to implement my own swipe recognizer, if possible. Any solutions / advice? I basically just want to know how to tell a UIView to pay attention to only a certain type of added gesture recognizer, and ignore (and thus let through to views behind) all other touches.
Have you set:
mySwipeGesture.cancelsTouchesInView = NO;
to allow the touches to be sent to the view hierarchy as well as the gesture?
Additionally, ensure that the view on top is:
theTransparentView.opaque = NO;
theTransparentView.userInteractionEnabled = YES;
I've had pretty good success attaching gestures to the parent view without needing to create a transparent subview on top for the gesture. Are you sure you need to do that?
I must have just been in a funk yesterday - I woke up with a simple solution today. Add the UISwipeGesture to a view which is a superview to both the UIView and the UIButton. Then, when processing those swipes, figure out where the swipe originated, and whether that point is in the frame of where I used to have the UIView. (As I mentioned, the only reason for the existence of the UIView was to define a target area for these swipe gestures.)
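Roughly like this; swipeTargetFrame is a hypothetical property holding the rect the old transparent UIView used to cover:

- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    CGPoint location = [recognizer locationInView:self.view];
    if (CGRectContainsPoint(self.swipeTargetFrame, location)) {
        // the swipe happened inside the old target area, so handle it;
        // touches elsewhere (e.g. on the UIButton) are unaffected
    }
}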
Can't you put your button on top of the view and add gesture recognisers to that button too?
In the end, your UIButton inherits from UIView via UIControl. Therefore there is practically nothing you could do with a view but not with a button.
In my case, I fixed it by not using a button, but rather a UITapGestureRecognizer. My pan gesture recognizer was added to the background view, and the tap gesture was added to a view above it.
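Roughly, set up wherever you configure the views (the view names here are illustrative):

UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[backgroundView addGestureRecognizer:pan];

UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
[overlayView addGestureRecognizer:tap];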
I have a UIScrollView which is doing some custom pinch zooming. To do this I subclassed UIScrollView and overrode the touch methods touchesBegan, touchesMoved, and touchesEnded. Everything works well and as expected.
My problem comes when I try to add a series of UIView subviews: I can only detect taps on my UIScrollView when the UIViews' userInteractionEnabled is set to NO. I would like to continue detecting two-finger taps on my UIScrollView, AND a single-finger tap on any of my UIView subviews.
Is this possible?
I've tried countless ways with little success. Does anyone have any experience with this?
Cheers,
Brett
Apple's documentation for UIScrollView explains how it does it:
it temporarily intercepts a touch-down event by starting a timer and, before the timer fires, seeing if the touching finger makes any movement. If the timer fires without a significant change in position, the scroll view sends tracking events to the touched subview of the content view. If the user then drags their finger far enough before the timer elapses, the scroll view cancels any tracking in the subview and performs the scrolling itself.
There are a couple of methods for interception given as answers to this question: How to make a superview intercept button touch events?
Any UIView with userInteractionEnabled set to YES will block touches from reaching the views under it. You may have to rethink how you are structuring your layout, or subclass the UIView to change how it handles touches.
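One possible direction for the subclassing idea, not a drop-in fix: have the subviews claim single-finger touches themselves and pass anything else up the responder chain toward the scroll view. A sketch:

@interface TappableView : UIView
@end

@implementation TappableView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 1) {
        // handle the single-finger tap on this subview
    } else {
        // forward multi-finger touches to the next responder
        // (ultimately the custom UIScrollView subclass)
        [self.nextResponder touchesBegan:touches withEvent:event];
    }
}

@end

Note that the two fingers of a two-finger tap may arrive in separate touchesBegan calls, so in practice you may need to track touches across calls before deciding where they belong.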
I'd like to have a UIView where the user can select each of the four corners and stretch the view by independently moving them.
How would I implement such a view?
To do this, you will need to subclass UIView and handle touch events manually. When you get a touch event, you will have to do some math and then set the frame of the view to the new size. I'd recommend making the background image a stretchable image using stretchableImageWithLeftCapWidth:topCapHeight:. It shouldn't actually be that hard.
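A rough sketch of the touch handling and frame math; for simplicity it assumes the bottom-right corner is the one being dragged, so a full version still needs to work out which corner the touch started in:

@interface StretchableView : UIView
@property (nonatomic) CGRect startFrame;    // frame when the drag began
@property (nonatomic) CGPoint startTouch;   // touch location when the drag began
@end

@implementation StretchableView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.startFrame = self.frame;
    self.startTouch = [[touches anyObject] locationInView:self.superview];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self.superview];
    CGFloat dx = p.x - self.startTouch.x;
    CGFloat dy = p.y - self.startTouch.y;

    // Simplified: grow or shrink from the bottom-right corner only.
    self.frame = CGRectMake(self.startFrame.origin.x,
                            self.startFrame.origin.y,
                            self.startFrame.size.width + dx,
                            self.startFrame.size.height + dy);
}

@end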
There are a series UIViews arranged very close.
(image: http://www.mobilepanda.com/questiontouch.png)
I would like my app to detect which UIViews are touched when my finger touches some of them.
Maybe one, two, or three (because the visible part of each UIView is very thin).
I want to get the middle x value of the touch, then spread out the UIView that contains that x value, together with the UIViews near it.
(image: http://www.mobilepanda.com/questiontouch1.png)
My approach is to put a transparent UIView over all of these UIViews to detect the touch event.
I am not sure if this is OK, or whether there is a better solution (for example, giving each UIView the ability to detect the touch, then combining the results to decide which UIView was touched).
Any comments are welcome.
Thanks
interdev
You don't need to do all that. The OS decides what the center point of the finger touch is and sends an event with the touch's x,y coordinates to the correct view. If you make them UIButtons (a subclass of UIView) instead of UIViews, the OS does all the work for you. All you have to do is attach callbacks on each button to the methods you want called for the various events (touch up inside, touch down, etc.).
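For example, in a view controller (the strip count, width, and method names are illustrative placeholders):

- (void)buildStrips {
    NSInteger stripCount = 8;      // illustrative
    CGFloat stripWidth = 12.0;     // illustrative
    for (NSInteger i = 0; i < stripCount; i++) {
        UIButton *strip = [UIButton buttonWithType:UIButtonTypeCustom];
        strip.frame = CGRectMake(i * stripWidth, 0,
                                 stripWidth, self.view.bounds.size.height);
        strip.tag = i;   // so the callback knows which strip was hit
        [strip addTarget:self action:@selector(stripTouchedDown:)
            forControlEvents:UIControlEventTouchDown];
        [strip addTarget:self action:@selector(stripTouchedUp:)
            forControlEvents:UIControlEventTouchUpInside];
        [self.view addSubview:strip];
    }
}

- (void)stripTouchedDown:(UIButton *)sender {
    // sender.tag identifies the strip that was pressed; spread it and its
    // neighbours here
}

- (void)stripTouchedUp:(UIButton *)sender {
    // collapse the strips again, if that's the desired behaviour
}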