I have a UIScrollView in my app that I populate with a lot of UIImageViews, approximately 900. They are all very small and consist of only two different images repeated over and over.
Now I am having a lot of trouble detecting a touch on one of these UIImageViews.
I have assigned each one a unique tag so that I can distinguish between them, but I am really struggling to detect the touch.
The goal is just to change the image of the touched UIImageView.
Because of the large number of views involved, a simple loop that checks the touch coordinates against each UIImageView's frame just hangs my app.
Does anyone have any suggestions?
Thanks in advance.
Ben
In my case the easiest way was to add a gesture recognizer:
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleTapGestureCaptured:)];
// The default value of cancelsTouchesInView is YES, which would prevent buttons from being tapped
singleTap.cancelsTouchesInView = NO;
[myScrollView addGestureRecognizer:singleTap];
Then use this method to capture the touch:
- (void)singleTapGestureCaptured:(UITapGestureRecognizer *)gesture
{
    // touchPoint is in the scroll view's coordinate space
    CGPoint touchPoint = [gesture locationInView:myScrollView];
}
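From there you can resolve the tap to a specific image view and swap its image. A minimal sketch, assuming the image views have userInteractionEnabled set to YES (see the next answer) and assuming a hypothetical replacement image name:

UIView *hit = [myScrollView hitTest:touchPoint withEvent:nil];
if ([hit isKindOfClass:[UIImageView class]]) {
    // hit.tag identifies which of the ~900 image views was tapped
    ((UIImageView *)hit).image = [UIImage imageNamed:@"otherImage.png"]; // hypothetical image name
}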
UIImageView has touch processing turned off by default. To change that, set userInteractionEnabled to YES (see the docs here: http://developer.apple.com/library/ios/#DOCUMENTATION/UIKit/Reference/UIImageView_Class/Reference/Reference.html).
Also, if you want to know which view in a view hierarchy was hit, you can use hitTest:withEvent: (see the docs for UIView), which will be much faster than looping through the hierarchy yourself.
On a wider note, I don't think having 900 UIImageViews on screen all at once is a good long-term strategy. Have you investigated drawing the screen's content with Core Graphics?
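A rough sketch of that alternative, assuming a hypothetical Tile model object holding each tile's frame and image: a single custom view draws every tile in drawRect:, so no subviews are needed at all.

- (void)drawRect:(CGRect)rect
{
    // self.tiles is a hypothetical array of plain model objects, not views
    for (Tile *tile in self.tiles) {
        if (CGRectIntersectsRect(tile.frame, rect)) {
            [tile.image drawInRect:tile.frame]; // UIImage drawing targets the current context
        }
    }
}

A tap can then be mapped to a tile by checking frames in a plain loop over the model objects, which is far cheaper than hit-testing 900 views, and just that tile redrawn with setNeedsDisplayInRect:.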
The best way to solve this issue is to subclass UIScrollView and add touchesBegan: and the related touch methods to it.
@interface MyScroll : UIScrollView
{
}
@end
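A sketch of the corresponding implementation (untested; the image-swapping logic is an assumption based on the question):

@implementation MyScroll

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self];
    UIView *hit = [self hitTest:point withEvent:event];
    if ([hit isKindOfClass:[UIImageView class]]) {
        // Swap the tapped image view's image here, e.g. keyed off its tag
    }
    [super touchesBegan:touches withEvent:event];
}

@end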
You could easily implement this method in a UIImageView subclass:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if ([self pointInside:point withEvent:event]) {
        // The touch landed inside this image view
    }
    return [super hitTest:point withEvent:event];
}
Use the return value to tell the application which view should be the next touch responder (such as the UIImageView or the scroll view).
I'm in the process of creating a universal iOS app that, amongst other functions, allows the user to spawn UIImageViews on touch.
The issue that I'm having is that when I rotate the device, the views that are created do not resize correctly.
I have placed the code in my main view controller, as shown:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // first part - creating the image view
    view = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"MyImage.png"]];
    view.frame = CGRectMake(50, 50, 100, 100);
    [self.view addSubview:view];
    // second part - placing the image at the touch location
    UITouch *touch = [[event allTouches] anyObject];
    view.center = [touch locationInView:self.view];
}
Does anybody know of a way to implement autoresizing for these dynamically created views?
The effect I'm looking for: instead of the images keeping their old portrait coordinates, they should adjust on an orientation change to landscape so that they all remain visible, in positions relative to their portrait ones.
If I were using interface builder then it would be easy enough, but I can't find an obvious solution when doing this programmatically.
It's not as simple as returning YES from shouldAutorotateToInterfaceOrientation:, either.
I've checked a lot of the question/answer resources for iOS development, and not many of the solutions seem to focus on dynamically created UIImageViews and frames such as in my example.
I would be extremely grateful if anybody could take the time to provide me a solution, or even just point me in the right direction.
Thank you in advance,
Rory
Set the autoresizing mask accordingly.
Every UIView has an autoresizingMask property (of type UIViewAutoresizing):
view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
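Applied to the question's code, a sketch (which flags you want is a judgment call: flexible width/height scale the view's size with its superview, while flexible margins keep its position proportional):

view = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"MyImage.png"]];
view.frame = CGRectMake(50, 50, 100, 100);
view.autoresizingMask = UIViewAutoresizingFlexibleWidth
                      | UIViewAutoresizingFlexibleHeight
                      | UIViewAutoresizingFlexibleLeftMargin
                      | UIViewAutoresizingFlexibleRightMargin
                      | UIViewAutoresizingFlexibleTopMargin
                      | UIViewAutoresizingFlexibleBottomMargin;
[self.view addSubview:view];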
I have a UIView and a UIWebView.
The UIWebView is a child view of the UIView.
The UIWebView contains a YouTube video and is set up so that the video fits the UIWebView.
I have a UITapGestureRecognizer attached to the parent UIView for single taps; if the user single-taps anywhere in the view, it invokes action A.
So when the UIWebView loads the YouTube video, there is a play button on top of the video waiting for the user to tap. But now when I tap the button, the video plays, and A is invoked too. This is what I don't want.
How should I solve it?
I thought touch/tap events were delivered in order, and that if the button were tapped it would absorb the event rather than passing it on to the UIView.
I also tried adding another UIView under the UIWebView and attaching the gesture to that underlying view. However, it still doesn't work.
How can I make the play button over the YouTube video independent of the tap gesture?
thanks
Try adding this code in your view's .m file (if you are using a plain UIView, subclass it):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGPoint p = [self convertPoint:point toView:webView];
    if ([webView pointInside:p withEvent:event]) {
        return [webView hitTest:p withEvent:event];
    }
    return [super hitTest:point withEvent:event];
}
Since I did not try this code, tell me if it doesn't work. There are still many ways to deal with the situation. :)
In my app I have a UIView-derived class, Canvas, that uses touchesBegan:withEvent:, touchesMoved:withEvent:, and touchesEnded:withEvent: to draw on the canvas. I also want to use a swipe to load the previous (or next) canvas in an array. I tried setting up the following gesture (and a similar one for the right direction):
UISwipeGestureRecognizer *leftSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(pageFlipNext)];
leftSwipe.direction = UISwipeGestureRecognizerDirectionLeft;
leftSwipe.numberOfTouchesRequired = 2;
[_canvas addGestureRecognizer:leftSwipe];
[leftSwipe release];
But my two fingered swipes are still being treated as one-fingered drawing instructions.
How do I get my app to handle the two-fingered swipe correctly?
First of all, I would start by verifying that _canvas.multipleTouchEnabled is set to YES. If it isn't, set it to YES.
You might also want to consider
leftSwipe.delaysTouchesBegan = YES;
This delays delivery of the touches to _canvas until the swipe gesture fails. You can also use a UIPanGestureRecognizer for the drawing and do something like this:
[pan requireGestureRecognizerToFail:leftSwipe];
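Putting both suggestions together, a sketch (draw: is a hypothetical pan handler that would hold the drawing code currently in the touches methods):

// Two-finger swipe flips pages; delay touches so drawing waits for it to fail
UISwipeGestureRecognizer *leftSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(pageFlipNext)];
leftSwipe.direction = UISwipeGestureRecognizerDirectionLeft;
leftSwipe.numberOfTouchesRequired = 2;
leftSwipe.delaysTouchesBegan = YES;
[_canvas addGestureRecognizer:leftSwipe];
[leftSwipe release];

// One-finger pan does the drawing, and defers to the swipe
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(draw:)];
pan.maximumNumberOfTouches = 1;
[pan requireGestureRecognizerToFail:leftSwipe];
[_canvas addGestureRecognizer:pan];
[pan release];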
I think that you can use UIPanGestureRecognizer
UIPanGestureRecognizer is a concrete subclass of UIGestureRecognizer that looks for panning (dragging) gestures. The user must be pressing one or more fingers on a view while they pan it.
I'm drawing a graph on a UIView, which is contained by a UIScrollView so that the user can scroll horizontally to look around the entire graph.
Now I want to zoom the graph when the user pinches with two fingers, but instead of zooming the view at the same rate in the X and Y directions, I want to zoom only in the X direction, changing the X scale without changing the Y scale.
I think I have to catch the pinch in/out gesture and redraw the graph, overriding the default zooming behavior.
But is there a way to do this?
I've been having a very difficult time catching the pinch gesture on the UIScrollView, as it cancels the touches when it starts to scroll. I want the zooming to work even after the UIScrollView cancels the touches. :(
Thanks,
Kura
Although you cannot delete the existing pinch gesture recognizer, you can disable it and then add your own:
// Disable the existing pinch recognizer
for (UIGestureRecognizer *recognizer in [_scrollView gestureRecognizers]) {
    if ([recognizer isKindOfClass:[UIPinchGestureRecognizer class]]) {
        [recognizer setEnabled:NO];
    }
}

// Add our own
UIPinchGestureRecognizer *pinchRecognizer =
    [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(pinch:)];
[_scrollView addGestureRecognizer:pinchRecognizer];
[pinchRecognizer release];
Then in
- (void) pinch:(UIPinchGestureRecognizer*)recognizer { .. }
use
[recognizer locationOfTouch:0 inView:..]
[recognizer locationOfTouch:1 inView:..]
to figure out whether the user is pinching horizontally or vertically.
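For example, a sketch of such a handler (untested; xScale and graphView are hypothetical stand-ins for however the graph stores its horizontal scale and redraws itself):

- (void)pinch:(UIPinchGestureRecognizer *)recognizer
{
    if (recognizer.numberOfTouches < 2) return;
    CGPoint p0 = [recognizer locationOfTouch:0 inView:_scrollView];
    CGPoint p1 = [recognizer locationOfTouch:1 inView:_scrollView];
    // Mostly-horizontal finger placement means a horizontal pinch
    if (fabs(p1.x - p0.x) > fabs(p1.y - p0.y)) {
        xScale *= recognizer.scale;  // hypothetical ivar holding the graph's X scale
        recognizer.scale = 1.0;      // reset so each callback reports an incremental delta
        [graphView setNeedsDisplay]; // hypothetical graph view, redrawn with the new scale
    }
}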
You could instead access the scroll view's gestureRecognizers (a property defined on UIView); several of them are used by the scroll view. Figure out which one is the pinch recognizer and call removeGestureRecognizer: on the scroll view, then create your own, have it do the work, and add it back with addGestureRecognizer:.
These are all public API, but which recognizers are present and what order they are in are not (currently), so program defensively when accessing them.
(This is a perfectly valid way to manipulate UIKit views, and Apple won't/shouldn't have issues with it, though they will not guarantee it works in any future release.)
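A defensive sketch of that removal (copying the array first, so it isn't mutated while being enumerated):

NSArray *recognizers = [NSArray arrayWithArray:[_scrollView gestureRecognizers]];
for (UIGestureRecognizer *recognizer in recognizers) {
    if ([recognizer isKindOfClass:[UIPinchGestureRecognizer class]]) {
        [_scrollView removeGestureRecognizer:recognizer];
    }
}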
You should be able to subclass UIScrollView and override the touchesBegan: method. Don't call [super touchesBegan:] but instead, adjust the zoom as you like:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Anything you want. Probably you would want to store all the touches
    // or their values, so that you can compare them to the touches
    // in the touchesEnded: method,
    // thus letting you know what the pinch amount was
}
If you like, you can judge whether it's a pinch or not; if it's not, call the super method, and handle only custom pinches yourself.
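A sketch of that idea (untested; _startXDistance is a hypothetical ivar, and only the horizontal finger distance is tracked, matching the question's X-only zoom):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 2) {
        NSArray *pair = [touches allObjects];
        UITouch *t0 = [pair objectAtIndex:0];
        UITouch *t1 = [pair objectAtIndex:1];
        _startXDistance = fabs([t0 locationInView:self].x - [t1 locationInView:self].x);
    } else {
        [super touchesBegan:touches withEvent:event];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 2) {
        NSArray *pair = [touches allObjects];
        UITouch *t0 = [pair objectAtIndex:0];
        UITouch *t1 = [pair objectAtIndex:1];
        CGFloat endXDistance = fabs([t0 locationInView:self].x - [t1 locationInView:self].x);
        CGFloat xZoom = endXDistance / _startXDistance; // horizontal pinch factor
        // Redraw the graph with the new X scale here, using xZoom
    } else {
        [super touchesEnded:touches withEvent:event];
    }
}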
Edsko's & bshirley's answers are good, but they don't say where to place the code.
First, I placed it in the viewDidLoad method, but no pinch gesture recognizer was found in the scroll view (maybe because my scroll view is an IBOutlet).
Then I tried viewWillAppear: and viewDidAppear:, and the UIPinchGestureRecognizer was there.
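In sketch form (replacePinchRecognizer is a hypothetical helper wrapping the disable-and-replace code above):

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    // The scroll view's built-in recognizers exist by this point
    [self replacePinchRecognizer]; // hypothetical helper containing the code above
}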
I have tried various solutions provided on this site and others to implement touch events on a UIWebView, but I am still not able to do it. I have created an article reader application: a UIWebView added on top of a normal UIView. Now I want to trace certain user touch events on that particular web view.
When I do this on a normal view, it works perfectly. But if I try it on the web view, it stops working.
The solutions I tried before are:
1. Implementing the touch methods (touchesBegan:, touchesMoved:, touchesEnded:, touchesCancelled:)
2. Implementing a UIGestureRecognizer
3. Implementing window-level touch handling, such as sendEvent:
Now, if anyone can help me, tell me where I am going wrong, or suggest a different solution, I will be thankful.
Put a transparent UIView on top of your UIWebView to capture touches. You can then act on them or optionally pass them down to the UIWebView from your touchesBegan: override:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.myUIWebView touchesBegan:touches withEvent:event];
}
Check this previous post for details: iOS - forward all touches through a view
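For reference, a sketch of how the overlay might be wired up (TouchOverlayView is a hypothetical UIView subclass containing the touchesBegan: override above, with a myUIWebView property used for forwarding):

TouchOverlayView *overlay = [[TouchOverlayView alloc] initWithFrame:self.webView.frame];
overlay.backgroundColor = [UIColor clearColor]; // transparent, but still receives touches
overlay.myUIWebView = self.webView;             // hypothetical forwarding property
[self.view addSubview:overlay];                 // sits above the web view
[overlay release];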
I subclassed UIWebView and just "leeched" onto the gesture recognizers of its subviews, two levels deep (you could do it recursively, but that's enough for iOS 6 and 7).
Then you can do whatever you want with the touch location and the gesture recognizer's state.
for (UIView *view in self.subviews) {
    for (UIGestureRecognizer *recognizer in view.gestureRecognizers) {
        [recognizer addTarget:self action:@selector(touchEvent:)];
    }
    for (UIView *sview in view.subviews) {
        for (UIGestureRecognizer *recognizer in sview.gestureRecognizers) {
            [recognizer addTarget:self action:@selector(touchEvent:)];
        }
    }
}
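And a sketch of the touchEvent: target the loops register (the body is an assumption; handle the location and state however the reader app needs):

- (void)touchEvent:(UIGestureRecognizer *)recognizer
{
    CGPoint location = [recognizer locationInView:self];
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        NSLog(@"touch began at %@", NSStringFromCGPoint(location));
    }
}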