Calling a method in MainViewController from a custom class file - iPhone

I am using a custom gesture recognizer (as given in the link Intercepting/Hijacking iPhone Touch Events for MKMapView) for detecting touch events on an MKMapView. The gesture recognizer is declared in WildcardGestureRecognizer.h and implemented in WildcardGestureRecognizer.m. When this gesture recognizer is added to the MKMapView, any touch event can be read in the following method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (touchesBeganCallback)
        touchesBeganCallback(touches, event);
    NSLog(@"touchesBegan");
}
Based on this touch detection, I want to call the method tapMethod from MainViewController (which contains the MKMapView).
-(void)tapMethod
{
    dontUpdateLocation = 1; // a variable used to check stopping of location updates
    NSLog(@"map tapped");
}
I tried the following:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (touchesBeganCallback)
        touchesBeganCallback(touches, event);
    NSLog(@"touchesBegan");
    MainViewController *updateController = [[MainViewController alloc] init];
    [updateController tapMethod];
    [updateController release];
}
It does print "map tapped" but doesn't change the value of the variable dontUpdateLocation.
How can I do it?

From what I understand from your comments, I think the problem is due to the fact that when you do this:
MainViewController *updateController = [[MainViewController alloc] init];
[updateController tapMethod];
[updateController release];
you're not getting a reference to the existing MainViewController; you're creating a different pointer that points to another object in memory.
You may use the app delegate to store the variable (declare it with @property and @synthesize) and then access it like this:
YourAppDelegate *appDel=(YourAppDelegate *)[[UIApplication sharedApplication] delegate];
appDel.dontUpdateLocation=1;
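For completeness, here is a minimal sketch of the app-delegate side, assuming your delegate class is called YourAppDelegate (an illustrative name, not necessarily the one in your project):

// YourAppDelegate.h -- a sketch; declares the shared flag
@interface YourAppDelegate : NSObject <UIApplicationDelegate>
@property (nonatomic, assign) int dontUpdateLocation;
@end

// YourAppDelegate.m
@implementation YourAppDelegate
@synthesize dontUpdateLocation;
@end

MainViewController would then read appDel.dontUpdateLocation instead of its own local variable, so both classes see the same value.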
I suggest you take an in-depth look at these patterns: Singletons, MVC, and Delegation.
Hope this helps.

Related

UIGestureRecognizer sample code

I cannot figure out what is wrong. Below is my code; it calls the delegate methods once and then stops.
What should I do? I haven't been able to find sample code that uses these delegate methods. All I've found were gesture recognizers for swipes and taps, using different delegates.
Code so far:
-(void)initTouchesRecognizer {
    DLog(@"");
    recognizer = [[UIGestureRecognizer alloc] init];
    [self addGestureRecognizer:recognizer];
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    DLog(@"");
    NSSet *allTouches = [event allTouches];
    for (UITouch *touch in allTouches)
    {
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    DLog(@"");
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}
I call initTouchesRecognizer from initWithRect for my image view.
What am I doing fundamentally wrong?
UIGestureRecognizer is an abstract class, you're not supposed to add it directly to your view. You need to use a concrete subclass that inherits from UIGestureRecognizer, like UITapGestureRecognizer or UIPanGestureRecognizer for example. You could also make your own concrete subclass but that usually isn't necessary.
Here is an example of adding a UIPanGestureRecognizer to your view (in your view class code, often the gesture is added to the view from the controller):
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(mySelector:)];
[self addGestureRecognizer:panGesture];
In this case, the selector will be called whenever the user pans in this view. If you added a UITapGestureRecognizer, the selector would be called when the user tapped.
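To illustrate, here is a minimal sketch of what that action method could look like; the name mySelector: and the logging are just placeholders matching the selector used above:

- (void)mySelector:(UIPanGestureRecognizer *)gesture {
    // How far the finger has moved since the gesture began
    CGPoint translation = [gesture translationInView:self];
    NSLog(@"panned by %f, %f", translation.x, translation.y);

    if (gesture.state == UIGestureRecognizerStateEnded) {
        NSLog(@"pan finished");
    }
}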
You can check out the apple docs for more info:
http://developer.apple.com/library/ios/#documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/GestureRecognizer_basics/GestureRecognizer_basics.html#//apple_ref/doc/uid/TP40009541-CH2-SW2
Also, I find Paul Hegarty's Stanford lectures to be great; here's one on gesture recognizers:
https://itunes.apple.com/ca/course/6.-views-gestures-january/id593208016?i=132123597&mt=2
You should also understand that none of the methods that you posted are delegate methods, and none of them have anything to do with the UIGestureRecognizer that you added in your code. Those are instance methods of UIResponder (a class that UIView inherits from) that you're overriding. The abstract UIGestureRecognizer also has instance methods with those same names, but it is not the UIGestureRecognizer methods that are getting called in your class.
There was no need to add the gesture recognizer. By overriding the touchesBegan, touchesMoved and touchesEnded methods, I was able to track the user's finger across the screen.
Simply do not call the
-(void)initTouchesRecognizer
method, and the code I originally posted will work.

UIScrollView sending touches to subviews

Note: I already read some questions about UIScrollView sending touches to its subviews (this one included; although I upvoted it, it's no longer working as I intended).
What I have: I have a UIScrollView with a custom UIView (let's call it A) inside, which covers the entire UIScrollView. I can also put other custom UIViews inside A.
On the code I am doing this:
[scrollView setDelaysContentTouches:YES];
scrollView.canCancelContentTouches = NO;
What is happening: At the moment my only issue is that, if I want to move a subview inside A, I have to touch it, wait, and then move it. Exactly as stated here:
Now, the behaviour changes depending on the "length in time" of the first touch on the UIView. If it's short, then the relative dragging is managed as it was a scroll for the UIScrollView. If it's long, then I'm getting the touchesMoved: events inside my UIView.
What I want: The subviews inside A should always receive priority and I shouldn't have to touch and wait. If I touch A and not a subview of it, I want the UIScrollView to receive the touches, like panning and moving around (the contentSize is bigger than the frame).
Edit 1.0
The only reason for me to have this A view inside a generic UIScrollView is that I want to be able to zoom in/out on the A view. So I am doing the following:
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return customView; // this is the A view
}
In the beginning I didn't have the A view inside the UIScrollView; the only thing I did was add A as a subview of my UIViewController's root view, and everything went well. If there is another way to enable zoom in/out, I will gladly accept the answer.
Note: Thank you all for your contributions, especially to Aaron Hayman.
I was able to figure it out by doing the following on the UIScrollView sub-class I had:
-(BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint pointOfContact = [gestureRecognizer locationInView:self];
    // The view with a tag of 200 is my A view.
    return (![[self hitTest:pointOfContact withEvent:nil] isEqual:[self viewWithTag:200]]);
}
I haven't tested this, but I believe how you are handling the touch events in View A (or its subviews) will determine how touch events are passed on. Specifically, if you're trying to use the methods touchesBegan, touchesMoved, touchesEnded, etc. instead of a UIGestureRecognizer, you won't receive the touches in the way you want. Apple designed UIGestureRecognizer to handle problems like the one you're facing. Specifically, UIScrollView uses a UIPanGestureRecognizer to handle the scrolling. If you add a UIPanGestureRecognizer to each of the subviews of View A, any "panning" that occurs on one of those subviews should be sent to that subview instead of the UIScrollView. However, if you're simply using the "raw" touches methods, the UIPanGestureRecognizer in the UIScrollView will never be cancelled.
In general, it's almost always best to use a UIGestureRecognizer instead of processing the touches directly in the view. If you need touches processed in a way that no standard UIGestureRecognizer can provide, subclass UIGestureRecognizer and process the touches there. That way you get all the functionality of a UIGestureRecognizer along with your own custom touch processing. I really think Apple intended UIGestureRecognizer to replace most (if not all) of the custom touch-processing code that developers use in UIViews. It allows for code reuse, and it's a lot easier to manage which code processes which touch event.
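As a rough sketch of the pan-recognizer approach described above (viewA, handleSubviewPan: and the drag behaviour are illustrative names, not something from your project):

// Somewhere after A's subviews have been created, e.g. in the controller
- (void)attachPanRecognizersToSubviewsOfA {
    for (UIView *subview in viewA.subviews) {
        UIPanGestureRecognizer *pan =
            [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(handleSubviewPan:)];
        [subview addGestureRecognizer:pan];
        [pan release];
    }
}

// Drags the touched subview with the finger
- (void)handleSubviewPan:(UIPanGestureRecognizer *)pan {
    CGPoint translation = [pan translationInView:pan.view.superview];
    pan.view.center = CGPointMake(pan.view.center.x + translation.x,
                                  pan.view.center.y + translation.y);
    [pan setTranslation:CGPointZero inView:pan.view.superview];
}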
Jacky, I needed a similar thing: within a building plan (your A; in my case a subclass of UIScrollView), let the user place and resize objects (call them Bs). Here's a sketch of what it took me to get this behavior:
In the superview's (A) initWithFrame: method, set these two:
self.canCancelContentTouches = YES;
self.delaysContentTouches = NO;
This will ensure taps on B are immediately routed to the Bs.
In the embedded B, stop the superview A from cancelling taps, so it does not interfere with a gesture started on the B.
In the touchesBegan: method, search the view hierarchy upwards (using the superview property of the views) until you find a UIScrollView, and set its canCancelContentTouches to NO. Remember the superview you changed, and restore this property in the touchesEnded and touchesCancelled methods of B.
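A minimal sketch of steps 2 and 3, placed in B's implementation; scrollViewToRestore is a hypothetical ivar used to remember which scroll view was changed:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Walk up the hierarchy until we hit a UIScrollView
    UIView *ancestor = self.superview;
    while (ancestor && ![ancestor isKindOfClass:[UIScrollView class]]) {
        ancestor = ancestor.superview;
    }
    if (ancestor) {
        scrollViewToRestore = (UIScrollView *)ancestor;
        scrollViewToRestore.canCancelContentTouches = NO;
    }
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    scrollViewToRestore.canCancelContentTouches = YES;
    scrollViewToRestore = nil;
    [super touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    scrollViewToRestore.canCancelContentTouches = YES;
    scrollViewToRestore = nil;
    [super touchesCancelled:touches withEvent:event];
}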
I'd be interested whether this works for you as well. Good Luck!
nobi
I think you had better use touchesBegan, touchesMoved and touchesEnded to pass the event.
You can do it like this:
You should make a mainView. It has two properties: one is yourScrollView (your A), and one is yourCustomView.
[yourScrollView addSubview:yourCustomView];
[mainView addSubview:yourScrollView];
and then write your touches methods in mainView.m like this (ignore the scrollView statement):
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    if ([mytouch.view isKindOfClass:[yourCustomView class]])
    {
        // do whatever you want
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    if ([mytouch.view isKindOfClass:[yourCustomView class]])
    {
        // do whatever you want
    }
}
The last step: pass the event to the subview of the scrollView (your A).
#import "yourScrollView.h"

@implementation yourScrollView

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code.
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    if (!self.dragging)
        [[self nextResponder] touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    if (!self.dragging)
        [[self nextResponder] touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    if (!self.dragging)
        [[self nextResponder] touchesEnded:touches withEvent:event];
}

- (void)dealloc {
    [super dealloc];
}

@end
Hope this helps.

Recognise all touches in interface

I want to be able to recognise ALL touches in an interface, no matter what was touched.
I've tried:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
.. but this just recognises when the user taps on something that doesn't respond to taps (UIImageViews, for instance).
The reason I need this ability is that I want to kick off a slide show if the user doesn't touch the screen for 5 minutes, so I want to reset the timer whenever they touch. It seems wrong to put this reset code in each UI event handler individually.
There are several possible solutions, but as @omz said, overriding sendEvent: is the best one.
@interface YourWindow : UIWindow {
    NSDate *timeOfLastTouch;
}
@end

@implementation YourWindow

- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event];
    NSSet *touches = [event allTouches];
    UITouch *touch = [touches anyObject];
    if (touch.phase == UITouchPhaseEnded) {
        timeOfLastTouch = [NSDate date];
    }
}

@end
Do not forget to replace UIWindow with YourWindow.
You could subclass UIWindow and override the sendEvent: method.
You could use a tap gesture.
In your interface, adopt the UIGestureRecognizerDelegate protocol:
@interface ViewController : UIViewController <UIGestureRecognizerDelegate> {
then in your viewDidLoad add this
UITapGestureRecognizer *tapped = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapped)];
tapped.delegate = self;
tapped.numberOfTapsRequired = 1;
[self.view addGestureRecognizer:tapped];
then do your timer code in the tapped method
-(void)tapped {
    // timer code
}
Make sure your UI elements have setUserInteractionEnabled:YES.
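As a sketch, that timer code might look like the following, assuming a hypothetical slideShowTimer property (an NSTimer) and a startSlideshow method on the controller:

-(void)tapped {
    // Restart the 5-minute inactivity countdown on every tap
    [self.slideShowTimer invalidate];
    self.slideShowTimer = [NSTimer scheduledTimerWithTimeInterval:300
                                                            target:self
                                                          selector:@selector(startSlideshow)
                                                          userInfo:nil
                                                           repeats:NO];
}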
You can subclass the UIWindow and override the sendEvent: method like this:
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        // You got a touch, do whatever you like
    }
    [super sendEvent:event]; // Let the window do the propagation of the event
}
As @Alladinian said in one of the comments, the iOS reference documentation mentions that subclassing UIApplication is the right approach, and thus it seems preferable to subclassing UIWindow. Cf. https://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIApplication_Class/Reference/Reference.html :
You might decide to subclass UIApplication to override sendEvent: or sendAction:to:from:forEvent: to implement custom event and action dispatching.
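For illustration, a minimal sketch of that UIApplication approach (MyApplication and the timer reset are placeholders):

@interface MyApplication : UIApplication
@end

@implementation MyApplication
- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event];
    if (event.type == UIEventTypeTouches) {
        // reset your inactivity timer here
    }
}
@end

You would then pass @"MyApplication" as the third argument to UIApplicationMain in main.m so that UIKit instantiates your subclass instead of plain UIApplication.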

UIMapView: UIPinchGestureRecognizer not called

I implemented gesture recognizer in UIMapView just as described in the accepted answer to this question: How to intercept touches events on a MKMapView or UIWebView objects?
Single touches are recognized correctly. However, when I changed the superclass of my class from UIGestureRecognizer to UIPinchGestureRecognizer in order to recognize map scaling, everything stopped working.
Now the touchesEnded event occurs only when the user double-taps an annotation on the map (I don't know why!) and doesn't occur when the user pinches the map (zooming in or out doesn't matter).
PS: I'm using iOS SDK 4.3 and testing my app in the simulator, if that matters.
The code of mapViewController.m - viewDidLoad method:
- (void)viewDidLoad
{
    [super viewDidLoad];
    MapGestureRecognizer *changeMapPositionRecognizer = [[MapGestureRecognizer alloc] init];
    changeMapPositionRecognizer.touchesEndedCallback = ^(NSSet *touches, UIEvent *event)
    {
        ...
    };
    [self.mapView addGestureRecognizer:changeMapPositionRecognizer];
    [changeMapPositionRecognizer release];
}
The code of MapGestureRecognizer.h:
#import <UIKit/UIKit.h>

typedef void (^TouchesEventBlock)(NSSet *touches, UIEvent *event);

@interface MapGestureRecognizer : UIPinchGestureRecognizer

@property (nonatomic, copy) TouchesEventBlock touchesEndedCallback;

@end
The code of MapGestureRecognizer.m:
#import "MapGestureRecognizer.h"

@implementation MapGestureRecognizer

@synthesize touchesEndedCallback = _touchesEndedCallback;

- (id)init
{
    self = [super init];
    if (self) {
        self.cancelsTouchesInView = NO;
    }
    return self;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (self.touchesEndedCallback)
    {
        self.touchesEndedCallback(touches, event);
        NSLog(@"Touches ended, callback done");
    }
    else
    {
        NSLog(@"Touches ended, callback skipped");
    }
}

- (void)dealloc
{
    [super dealloc];
}

@end
What should I correct to make the pinch gesture recognized?
I'm not sure why you need to subclass UIPinchGestureRecognizer instead of using it directly as-is.
Also, I'm not sure why you need the gesture recognizer to detect map scaling, which you could do by using the delegate methods regionWillChangeAnimated and regionDidChangeAnimated and comparing the span before and after (unless you are trying to detect the scaling as it happens, and don't want to wait until the user finishes the gesture).
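A minimal sketch of that delegate-based alternative, assuming the view controller is the map view's delegate and keeps a hypothetical previousSpan ivar of type MKCoordinateSpan:

- (void)mapView:(MKMapView *)mapView regionWillChangeAnimated:(BOOL)animated {
    previousSpan = mapView.region.span;
}

- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated {
    MKCoordinateSpan newSpan = mapView.region.span;
    if (newSpan.latitudeDelta != previousSpan.latitudeDelta) {
        NSLog(@"map was zoomed");
    }
    previousSpan = newSpan;
}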
The gesture recognizer may not be getting called because the map view's own pinch gesture recognizer is getting called instead.
To have your recognizer called as well as the map view's, implement the UIGestureRecognizer delegate method shouldRecognizeSimultaneouslyWithGestureRecognizer and return YES:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
Make sure the gesture recognizer's delegate property is set or that method won't get called either.
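For example, back in viewDidLoad you could add a line like the following (with the view controller declared as conforming to UIGestureRecognizerDelegate):

changeMapPositionRecognizer.delegate = self; // self must adopt <UIGestureRecognizerDelegate>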

Handling touches inside UIWebView

I have created a subclass of UIWebView and have implemented the touchesBegan, touchesMoved and touchesEnded methods, but the webview subclass is not handling the touch events.
Is there any way to handle touch events inside a UIWebView subclass?
No subclassing needed, just add a UITapGestureRecognizer :
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(didTapMethod)];
[tap setNumberOfTapsRequired:1]; // Set your own number here
[tap setDelegate:self]; // Add the <UIGestureRecognizerDelegate> protocol
[self.myWebView addGestureRecognizer:tap];
Add the <UIGestureRecognizerDelegate> protocol in the header file, and add this method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
If all you need is to handle gestures, while leaving the rest of the UIWebView functionality intact, you can subclass UIWebView and use this strategy:
in the init method of your UIWebView subclass, add a gesture recognizer, e.g.:
UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipeGestureRightMethod)];
swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
[self addGestureRecognizer:swipeRight];
swipeRight.delegate = self;
then, add this method to your class:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
Add your designated selector to the class and handle it there (in this case handleSwipeGestureRightMethod), and you are good to go...
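A minimal placeholder for that handler, matching the selector used above (the body is up to you):

- (void)handleSwipeGestureRightMethod {
    NSLog(@"swiped right");
}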
You could put a UIView over your UIWebView and override touchesBegan: etc., then forward them to your webview. For example:
The user touches your UIView, which triggers a
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Execute your code, then send a touchesBegan to your webview like so:
    [webView touchesBegan:touches withEvent:event];
    return;
}
Your UIView has to be over the webview.
I'm not sure if this is what you want (it's not what you asked for, but it might work depending on what your end goal is), but you could instead interpret the touches in JavaScript from inside the UIWebView, and have the JavaScript do
document.location='http://null/'+xCoord+'/'+yCoord; // Null is arbitrary.
Then you can catch that using the UIWebView's delegate method
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
And if the request.URL.host (or whatever it is) isEqualToString:@"null", take the relevant action (and return NO instead of YES). You can even add the JS to each page by doing something like:
- (void)webViewDidFinishLoad:(UIWebView *)webView {
    [webView stringByEvaluatingJavaScriptFromString:@"window.ontouchstart=function(/* ... */);"];
}
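A minimal sketch of the delegate check described above, assuming the page appended the coordinates as path components of the fake URL:

- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType {
    if ([request.URL.host isEqualToString:@"null"]) {
        NSArray *parts = request.URL.pathComponents; // e.g. "/", "120", "240"
        if (parts.count > 2) {
            NSLog(@"touched at %@, %@", [parts objectAtIndex:1], [parts objectAtIndex:2]);
        }
        return NO; // swallow the fake navigation
    }
    return YES; // let real requests load normally
}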
Hope this helps?
Handling gestures on a UIWebView is discussed in this Apple Developer forum thread.
Using the info given there, there will be no need for an extra view in most or all cases, and as mentioned here before, overriding UIWebView is not the way to go.
Copypaste of the most important post in the thread:
This is a known issue. The UIWebView has its own UITapGestureRecognizers, and they're on a private subview of the UIWebView itself. UIGestureRecognizer precedence defines that gestures attached to views deeper in the view hierarchy will exclude ones on superviews, so the web view's tap gestures will always win over yours.
If it's okay in your case to allow your tap to happen along with the web view's normal tap your best solution would be to implement the UIGestureRecognizerDelegate method gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer and return YES for other tap gestures. This way you'll get your tap handler called, and the web view will still get its called.
If you need to be the only one handling the tap, you'll have to subclass UITapGestureRecognizer so you can use the one-way overrides in UIGestureRecognizerSubclass.h, and you can then return NO from canBePreventedByGestureRecognizer: when asked if the web view's tap gesture recognizer can prevent yours.
In any case, we know about this and hope to make it easier in the future.
I've just found that UIWebView checks whether it responds to the - (void)webViewDidNotClick:(id)webBrowserView selector once one taps on the view area (not on a hyperlink, or any other area that should be handled specifically). So you may implement that selector with your handling code :)
Do you mean your sub-classed implementation is not called when touchesBegan, touchesMoved and touchesEnded are called?
It sounds like a problem with how you've created an instance of the object. More details are required I think.
(taken from comments)
Header File
#import <UIKit/UIKit.h>

@interface MyWebView : UIWebView {
}
@end
Implementation File
#import "MyWebView.h"

@implementation MyWebView

- (id)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
    }
    return self;
}

- (void)drawRect:(CGRect)rect {
    NSLog(@"MyWebView is loaded");
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touches began");
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Touches ended");
}

- (void)dealloc {
    [super dealloc];
}

@end
I would try overriding -sendEvent: on UIWindow, to see if you can intercept those touch events.
Following on from what Unfalkster said, you can use the hitTest method to achieve what you want, but you don't have to subclass UIWindow. Just put this in your web view subclass. You will get a compile time warning but it does work:
- (void)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        // get location info
        CGFloat x = point.x;
        CGFloat y = point.y;
        // get touches
        NSSet *touches = [event allTouches];
        // individual touches
        for (UITouch *touch in touches) {
            if (touch.phase == UITouchPhaseBegan) {
                // touches began
            } else if (touch.phase == UITouchPhaseMoved) {
                // touches moved
            }
            // etc etc
        }
    }
    // call the super
    [super hitTest:point withEvent:event];
}
Hope that helps!
If you want to detect your own taps but disable the UIWebView's taps then you can use my solution:
-(void)recursivelyDisableTapsOnView:(UIView *)v {
    for (UIView *view in v.subviews) {
        for (UIGestureRecognizer *g in view.gestureRecognizers) {
            if (g == self.ownTapRecognizer) {
                continue;
            }
            if ([g isKindOfClass:[UITapGestureRecognizer class]] ||
                [g isKindOfClass:[UILongPressGestureRecognizer class]] ||
                [g isKindOfClass:NSClassFromString(@"UITapAndAHalfRecognizer")]) {
                g.enabled = NO;
            }
        }
        [self recursivelyDisableTapsOnView:view];
    }
}
- (void)webViewDidFinishLoad:(UIWebView *)webView {
    [self recursivelyDisableTapsOnView:webView];
    // disable selection
    [webView stringByEvaluatingJavaScriptFromString:@"document.documentElement.style.webkitUserSelect='none';"];
    // disable callout
    [webView stringByEvaluatingJavaScriptFromString:@"document.documentElement.style.webkitTouchCallout='none';"];
}