I'm new to iPhone development, and I wanted to know: what exactly does the UITouch instance method view do?
[touch view];
I read the documentation and understand that it returns the view the touch was 'in', but what happens when there is a view hierarchy? I had originally assumed it would return the subview that is furthest up front. Does this assumption always hold true?
What is the recommended way of determining whether the touch was on a certain view or not?
Yes, it returns the view that was actually touched (the topmost view), regardless of where the touch is handled.
The only exception I can think of would be if the topmost view were invisible (hit-testing skips hidden views and views with user interaction disabled).
Actually, I believe it gives you the view where the touch 'started', even though the touch may now be over a different view. Something to keep in mind if you are watching touchesMoved and touchesEnded events.
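For the second question, a common approach is not to compare [touch view] directly, but to convert the touch location into the candidate view's coordinate system and let that view test the point. A minimal sketch, assuming a hypothetical targetView property:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Ask for the location in the candidate view's own coordinate system...
    CGPoint point = [touch locationInView:self.targetView];
    // ...then let the view decide whether the point falls inside it.
    if ([self.targetView pointInside:point withEvent:event]) {
        // The touch landed on targetView (or one of its subviews).
    }
}

This sidesteps the view-hierarchy question entirely, since [touch view] can be a subview when you were expecting the parent.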
Related
I am using UIPageViewController to load 10 web pages in a web view.
All the web pages load one by one properly, but I am facing a weird problem in
- (UIViewController *)pageViewController:(UIPageViewController *)pageViewController viewControllerAfterViewController:(UIViewController *)viewController
Right now I am on the 1st page, so there is no possibility of going back via
- (UIViewController *)pageViewController:(UIPageViewController *)pageViewController viewControllerBeforeViewController:(UIViewController *)viewController
After I load the 1st page, I was just scrolling up and down in the web view. Suddenly the viewControllerAfterViewController method gets called. And the strange thing is, it does not move to the next view, i.e. the 2nd page is not loaded.
What could be the problem?
I'm not sure, but I think the problem is in the gesture recognizers, or the location of their CGRects. Basically the page view controller listens for touch moves, so if the user swipes while scrolling the web view, the page view controller tries to execute a page turn. (I'm guessing.) I think a solution would be to make sure the page view controller's touch region (CGRect) does not overlap with the web view. Apparently you can set the region in which a touch causes the UIPageViewController to turn the page.
You can start by looking up the UIGestureRecognizer Class Reference in the iOS Reference Library. I hope this helps.
Here's what I found there:
locationInView:
Returns the point computed as the location in a given view of the gesture represented by the receiver.
- (CGPoint)locationInView:(UIView *)view
Parameters
view
A UIView object on which the gesture took place. Specify nil to indicate the window.
Return Value
A point in the local coordinate system of view that identifies the location of the gesture. If nil is specified for view, the method returns the gesture location in the window’s base coordinate system.
Discussion
The returned value is a generic single-point location for the gesture computed by the UIKit framework. It is usually the centroid of the touches involved in the gesture. For objects of the UISwipeGestureRecognizer and UITapGestureRecognizer classes, the location returned by this method has a significance special to the gesture. This significance is documented in the reference for those classes.
Availability
Available in iOS 3.2 and later.
See Also
– locationOfTouch:inView:
Declared In
UIGestureRecognizer.h
locationOfTouch:inView:
Returns the location of one of the gesture’s touches in the local coordinate system of a given view.
- (CGPoint)locationOfTouch:(NSUInteger)touchIndex inView:(UIView *)view
Parameters
touchIndex
The index of a UITouch object in a private array maintained by the receiver. This touch object represents a touch of the current gesture.
view
A UIView object on which the gesture took place. Specify nil to indicate the window.
Return Value
A point in the local coordinate system of view that identifies the location of the touch. If nil is specified for view, the method returns the touch location in the window’s base coordinate system.
Availability
Available in iOS 3.2 and later.
See Also
– locationInView:
Declared In
UIGestureRecognizer.h
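Putting that together, one way to keep the page-turn gesture away from the web view is to become the delegate of the page view controller's gesture recognizers and reject touches that start over the web view. A sketch, assuming hypothetical pageViewController and webView properties (note that the gestureRecognizers array is only populated for the page-curl transition style):

// e.g. in viewDidLoad
for (UIGestureRecognizer *recognizer in self.pageViewController.gestureRecognizers) {
    recognizer.delegate = self;
}

// UIGestureRecognizerDelegate: ignore touches that begin over the web view.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch {
    CGPoint location = [touch locationInView:self.webView];
    return ![self.webView pointInside:location withEvent:nil];
}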
I have two overlapping custom views that both need to receive touch events (e.g. touchesBegan and touchesMoved). However, I can only get one of the views (the top one) to receive the events. I have tried forwarding the events from one view to the other using:
[otherView touchesEnded:touches withEvent:event];
but this does not always work.
I need the touch events to be sent to the two views simultaneously. Can anyone help?
If you intercept a touch, you should usually call [super methodYouAreIntercepting] at the end of the method if you still want the touch to go through to the next layer. If you do this, and the two views are directly on top of each other, then you don't need to manually forward the touches the way you have been doing. Because your comment above suggests that you haven't been calling super in the method, I bet this will solve your issue.
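As a minimal sketch of that call-through pattern (this is the generic shape, not the asker's exact code):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Handle the touch in this view first...

    // ...then pass it along so the next responder still sees it.
    [super touchesBegan:touches withEvent:event];
}

The same applies to touchesMoved:, touchesEnded: and touchesCancelled:.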
I'm using [aSubview touchesBegan] to move aSubview's position around on a screen in relation to its superview. Its superview is not much larger than the subview itself. This is quite straightforward to do as the following snippet shows:
UITouch* touch = [touches anyObject];
CGPoint touchPoint = [touch locationInView:[self superview]];
self.center = touchPoint;
However, once aSubview is moved, as soon as any portion of it falls outside the bounds of its superview, touches in that section no longer register. In other words, touchesBegan no longer fires. I want touches in aSubview to register no matter where it's moved relative to its superview.
Any thoughts?
Howard
mcpunky's answer is almost good, except you cannot make pointInside: always return YES. That way the view will intercept all touches.
Instead, you need to do a finer-grained check:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Accept points inside our own bounds, plus points that fall on the
    // subview that extends beyond them.
    return CGRectContainsPoint(self.subviewOutsideMe.frame, point) || CGRectContainsPoint(self.bounds, point);
}
I just had this problem with subviews not receiving input because the superview was simply not delivering touch events for subviews outside its bounds. Also, it was crucial to keep the bounds of the superview as they were, and moving the subviews up the hierarchy wasn't feasible either. What did the job for me was overriding the superview's
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
to always return YES. Result: pretty much nothing changed in the superview or the subviews, but input is received outside the superview's boundaries.
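For reference, the override amounts to this (the blunt version that the answer above warns will intercept every touch):

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Claim every point so hit-testing descends into subviews
    // even where they lie outside this view's bounds.
    return YES;
}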
I've talked to an Apple engineer about this. touchesBegan won't work in the portion of a subview that's not contained within its superview, because the system clips each subview on the way down the hierarchy as it tries to determine which subview's touchesBegan gets called.
In order to resolve the issue, I removed the intermediate wrapper views that were causing the clipping problem and hoisted the subviews up one level. This necessitated a minor change in logic but ultimately proved to be a cleaner solution -- and more importantly, one that worked.
I'm unsure if this is the correct way, but it's certainly one way.
Listen for the touchesBegan event in your parent object.
If you get an event, pass it on to the child view by calling its touchesBegan yourself.
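A rough sketch of that idea, assuming the parent keeps a hypothetical draggableChild reference to the subview being moved:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    // Forward touches that land on the child, even where the child
    // extends beyond this view's bounds; handle the rest normally.
    if (CGRectContainsPoint(self.draggableChild.frame, point)) {
        [self.draggableChild touchesBegan:touches withEvent:event];
    } else {
        [super touchesBegan:touches withEvent:event];
    }
}

Keep in mind the parent only receives these touches in the first place if its own pointInside:withEvent: admits them, as described in the answers above.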
So when I see ccTouchesBegan (or touchesBegan, for that matter) I usually see something like this:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
}
But what I am not getting is how you detect whether one particular object has been touched. For example, how would I check if a specific CCSprite I declared has been touched?
Sorry if this is a newbish question, but I just don't understand; if you need more clarification, just ask.
I'm not familiar with cocos2d, but in the standard API the touches are sent first to the view that was touched and then up the responder chain to a view that has a controller. If that controller does not handle the touch, it goes up to the next view until it ends up at the window object.
See Responder Objects in the Responder Chain
The best place to trap touches for specific objects is in the objects themselves. In the case of a sprite-like view, the sprite itself most likely needs to respond to the touch, e.g. by moving itself. If you need the touch to be communicated to another object, you should use the delegate pattern so that the sprite can tell its delegate how it's been touched.
That last sentence sounded weird.
I don't have the samples in front of me, but there should be an example in the cocos2d download package that demonstrates a touch event and how it propagates down to sprites.
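In cocos2d itself, the usual pattern is to convert the touch into GL coordinates and test it against the sprite's bounding box. A sketch, assuming a mySprite ivar and the classic cocos2d-iphone APIs:

- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Convert from UIKit coordinates to cocos2d (GL) coordinates.
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];

    if (CGRectContainsPoint([mySprite boundingBox], location)) {
        // mySprite was touched.
    }
}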
I have a UIScrollView with the requirement that, when zooming, the contentSize.height should remain the same. Zooming in from 200x100 should result in a new contentSize of 400x100 instead of 400x200, for instance. I'd like to do my own drawing while the user is zooming.
I don't think I can use the normal zooming behaviour of UIScrollView to achieve this, so I'm trying to roll my own. (I could just let it do its thing and then redraw my contents when -scrollViewDidEndZooming:withView:atScale: gets called, but that wouldn't be very pretty).
Currently I am subclassing UIScrollView and trying to do my own zooming when two fingers are on the screen:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] != 2) {
        [super touchesMoved:touches withEvent:event];
    } else {
        // do my own stuff
    }
}
I thought that overriding touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent: and touchesCancelled:withEvent: in this way would work, but it doesn't.
An earlier failed attempt was to place a transparent view on top of the scroll view and send the touches that I'm not interested in to the scroll view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] != 2) {
        [self.theScrollView touchesMoved:touches withEvent:event];
    } else {
        // do my own stuff
    }
}
Any help would be appreciated.
Thanks,
Thomas
You probably won't be able to maintain decent performance during zooming if you attempt to redraw your content on every frame of a pinch-zooming event. I'd recommend taking the approach of letting the UIScrollView zoom a scaled version of your drawing in or out, then redraw the content to be sharp at the end of the zoom in the -scrollViewDidEndZooming:withView:atScale: delegate method. This is what I do in my application, and it ends up working very well.
There are some tricks to resizing your content view properly at the end of a zoom, which I describe in this answer. Basically, you need to intercept the setting of a transform to your content view so that you can set it to a scale factor of 1 when you redraw your content. You'll also need to keep track of that scale factor, because the UIScrollView doesn't, and use that scale factor to adjust the transform that UIScrollView tries to apply to your content view with subsequent zoom operations.
You could use a modification of this technique to force a redraw of your content during the pinch-zooming, but in my tests this ended up being far too jerky to provide a good user experience.
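As a rough sketch of the end-of-zoom step only (the transform-interception bookkeeping from the linked answer is omitted, and contentScale/baseContentSize are hypothetical ivars tracking the accumulated scale and the unscaled content size):

- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView
                       withView:(UIView *)view
                        atScale:(CGFloat)scale
{
    // Fold this zoom into our own running scale factor, undo the
    // transform the scroll view applied, and redraw sharp at the new size.
    contentScale *= scale;
    view.transform = CGAffineTransformIdentity;
    view.frame = CGRectMake(0, 0,
                            baseContentSize.width * contentScale,
                            baseContentSize.height); // height stays fixed
    scrollView.contentSize = view.frame.size;
    [view setNeedsDisplay];
}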
I'm not sure what you mean by this:
I thought that overriding touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent: and touchesCancelled:withEvent: in this way would work, but it doesn't.
Do you not get the events? You should receive them, but I think there is a logic error in your if statement that may be preventing this from working.
if ([touches count] != 2)
This is a problem because the likelihood of two touches happening at precisely the same time is low. You'll want to handle touches that happen independently, as well as the case where a user holds one finger stationary and moves the other. In these (common) scenarios you may only get one touch in that NSSet, even though the other is still valid.
The solution to handling touches properly is to remember which touches came in and which touches left. Remember, the address of a UITouch does not change for the life of the touch, so you can safely compare addresses to ensure you are still dealing with the same touch as before, and track its lifecycle.
If you are not getting the touch events at all, that is a different problem altogether; you may need to set multipleTouchEnabled to YES.
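A small sketch of that bookkeeping, keeping the live touches in a set so each one can be matched up again in later events (assumes an activeTouches NSMutableSet ivar created in the initializer):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [activeTouches unionSet:touches];   // remember each touch as it lands
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        if ([activeTouches containsObject:touch]) {
            // Same object we saw in touchesBegan; track its movement here.
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [activeTouches minusSet:touches];   // forget touches that lifted
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [activeTouches minusSet:touches];   // treat cancellation like lift-off
}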
I'm trying to do the same thing, and I really want to be able to redraw while it's zooming. Fixing it up at the end in scrollViewDidEndZooming:withView:atScale: is not really good enough.
The way I do it is to pass a dummy view in viewForZoomingInScrollView:, read the height of this dummy view, and set the frame on the actual content view to whatever I want. Because the frame changes, drawRect: gets called every time. It seems fine in the simulator; I'm just drawing a few lines. But I don't actually own a device, so I can't test it properly.
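A sketch of that dummy-view arrangement (names like dummyZoomView, contentView, initialWidth and initialHeight are illustrative, not from the original post):

- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return dummyZoomView; // an empty view the scroll view is free to scale
}

- (void)scrollViewDidZoom:(UIScrollView *)scrollView {
    // Read the current scale off the dummy view and reshape the real
    // content view; with contentMode set to UIViewContentModeRedraw,
    // the frame change makes drawRect: fire on every zoom step.
    CGFloat scale = dummyZoomView.frame.size.height / initialHeight;
    contentView.frame = CGRectMake(0, 0,
                                   initialWidth * scale,
                                   initialHeight); // height kept constant
}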
Also, in the code you've got, you have touchesBegan:withEvent: but you are forwarding to super's touchesMoved:withEvent: instead of touchesBegan:withEvent:.