How to detect touch inside a UITableViewCell - iPhone

I have a custom UITableViewCell with four UIImageViews inside it. I want to detect taps on these image views and notify my view controller (the one that contains the UITableView) which image view in which cell was tapped. How can I do that?

Try this:
- (void)createGestureRecognizers {
    NSInteger i = 0;
    for (UIImageView *imageView in imageViews) { // imageViews: your four UIImageViews
        UITapGestureRecognizer *singleFingerTap = [[UITapGestureRecognizer alloc]
            initWithTarget:self action:@selector(handleSingleTap:)];
        singleFingerTap.numberOfTapsRequired = 1;
        imageView.userInteractionEnabled = YES; // UIImageView ignores touches by default
        [imageView addGestureRecognizer:singleFingerTap];
        imageView.tag = i++;
        [singleFingerTap release];
    }
}

- (void)handleSingleTap:(UIGestureRecognizer *)sender {
    UIImageView *imageView = (UIImageView *)sender.view; // this is the tapped image view
}

There are several ways to hit the target, and at the cell level it doesn't really matter which one you pick: buttons, image views with tap gesture recognizers, or custom touch-event handling. I won't post code for that part, since you already have plenty and can find more with a little searching (though I can provide some on request). The real problem is transporting the message to the controller. For that I've found only two reasonable solutions: notifications and delegation. Note that the notification approach can produce a visible lag between the tap and the actual event, since notifications are sometimes delivered to objects with a small delay.
Hope that helps.
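As a sketch of the delegation option (the protocol, class, and method names here are hypothetical, not from the original code): the cell forwards the tap, along with itself and the tapped image view's tag, to a delegate that the view controller implements.

```objc
// In the cell's header (hypothetical names)
@protocol MyCellDelegate <NSObject>
- (void)cell:(UITableViewCell *)cell didTapImageViewWithTag:(NSInteger)tag;
@end

@interface MyCell : UITableViewCell
@property (nonatomic, assign) id<MyCellDelegate> delegate; // assign avoids a retain cycle (pre-ARC)
@end

// In the cell's implementation: the gesture handler forwards the tap
- (void)handleSingleTap:(UIGestureRecognizer *)sender {
    [self.delegate cell:self didTapImageViewWithTag:sender.view.tag];
}

// In the view controller (the delegate): recover the row from the cell
- (void)cell:(UITableViewCell *)cell didTapImageViewWithTag:(NSInteger)tag {
    NSIndexPath *indexPath = [self.tableView indexPathForCell:cell];
    NSLog(@"Image view %ld tapped in row %ld", (long)tag, (long)indexPath.row);
}
```

Set cell.delegate = self in tableView:cellForRowAtIndexPath: so each cell can reach the controller.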

Use four buttons and give them different tag values. Then just check the tag value to see which one was pressed.
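A minimal sketch of that approach (the frames, action name, and button count wiring are assumptions): all four buttons share one action, and the tag identifies which was pressed.

```objc
// In the cell: give each button a distinct tag and a shared target/action
for (NSInteger i = 0; i < 4; i++) {
    UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
    button.tag = i;
    button.frame = CGRectMake(i * 80.0, 0.0, 75.0, 75.0);
    [button addTarget:self action:@selector(buttonTapped:)
     forControlEvents:UIControlEventTouchUpInside];
    [self.contentView addSubview:button];
}

// The shared action: the sender's tag tells the buttons apart
- (void)buttonTapped:(UIButton *)sender {
    NSLog(@"Button with tag %ld was pressed", (long)sender.tag);
}
```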

Related

Creating Photo library like iPhone, but UIImage tap event handler not working

Hi, I am new to iOS programming.
I need to create a photo library like the iPhone's native Photos app. I found the library MWPhotoBrowser,
which provides nearly the same UI for photo browsing, and it is perfect for my requirement. But now I have two problems.
First, I need to create a grid layout of thumbnails. Tapping a thumbnail should display the image full screen with browsing functionality. I tried dynamically adding UIImageViews to a UIScrollView, but the UIScrollView does not scroll and the images go off screen.
Second, I could not get any tap handler on UIImageView so that I can open an image full screen.
Looking for some tips here; I am really stuck.
You can give this library a shot; it has both of the features you are looking for:
https://github.com/gdavis/FGallery-iPhone
Hope it helps :)
For the scroll view's scrolling issue you have to increase scrlViewMain.contentSize dynamically. Create a for loop and put the code below at the end of the loop:
scrlViewMain.contentSize = CGSizeMake(0, incrementAsPerYourNeed);
For the tapping issue you have to add a tap gesture. Put the code below where your image view is created:
imgView.tag = giveAsPerYourRecordsID;
imgView.userInteractionEnabled = YES; // UIImageView has user interaction disabled by default
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleImageTap:)];
tap.cancelsTouchesInView = YES;
tap.numberOfTapsRequired = 1;
tap.delegate = self;
[imgView addGestureRecognizer:tap];
[tap release];
And below is your callback method for the tap:
- (void)handleImageTap:(UIGestureRecognizer *)gestureRecognizer {
    UIImageView *tmpImgView = (UIImageView *)gestureRecognizer.view;
}
I developed similar functionality for my project a few days ago.
For the tap issue you should use UIButton instead of UIImageView; it gives you a tap action event. For the scroll view not scrolling, just make sure you update the contentSize property of the UIScrollView; otherwise it will not scroll to match your subviews.

Using an Image to Perform an Action instead of a Button in an iPhone Application

How can I use an image to perform an action instead of using a button (or adding an image to a button)? I just want to tap the image and perform a particular set of instructions.
(Thanks in advance)
Use a UIGestureRecognizer.
// If your image view is called myImage
UIImageView *myImage = ...;
myImage.userInteractionEnabled = YES; // image views ignore touches by default

// Add a tap gesture recognizer
UITapGestureRecognizer *g = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imagePressed:)];
[myImage addGestureRecognizer:g];
[g release];
When your image is tapped, this method will get called:
- (void)imagePressed:(UIGestureRecognizer *)recognizer {
    NSLog(@"%@", recognizer);
}
Why don't you want to use a UIButton? It inherits from UIControl and has a lot of behavior that you probably don't even know exists, and it can contain just an image, so it would look exactly the same.
Well, the advantage of using UIButton is that it has all the touch events built right in. You can use UIImageViews, but you'll need to subclass them, while in most situations a UIButton with a background image is a perfect fit.

UIButton with two state - touch and long touch

I'd like to make a button perform different methods depending on whether the user taps or long-presses it.
I've tried:
UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(doRewind)];
[uiNextButton addGestureRecognizer:longPress];
[longPress release];
But the app registers my touch only when I touch the button and move my finger a little bit.
What am I doing wrong?
If you set both a "normal" tap and a long-press gesture, there can be interactions between the two.
Have you tried setting the minimumPressDuration property of the UILongPressGestureRecognizer?
Also, requireGestureRecognizerToFail: can be useful to make one of the two gesture handlers fire only if the other one did not.
Have a look at the relevant documentation for those two methods.
If this does not help, please give more details about your view and all the gesture handlers you are defining.
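A sketch of how the two recognizers could be wired together (the tap selector name is hypothetical; only doRewind appears in the question): requireGestureRecognizerToFail: makes the tap wait until the long press has failed, so each gesture fires exactly one action.

```objc
// e.g. in viewDidLoad
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(doPlay)];
UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc]
    initWithTarget:self action:@selector(doRewind:)];
longPress.minimumPressDuration = 0.5; // seconds before the press counts as "long"

// The tap fires only if the long press did not recognize first
[tap requireGestureRecognizerToFail:longPress];

[uiNextButton addGestureRecognizer:tap];
[uiNextButton addGestureRecognizer:longPress];
[tap release];
[longPress release];

// Long press is a continuous gesture: act once, on the Began state
- (void)doRewind:(UILongPressGestureRecognizer *)sender {
    if (sender.state == UIGestureRecognizerStateBegan) {
        // long-press action here
    }
}
```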

Two fingered swipe v. one fingered drag in iOS

In my app I have a UIView subclass, Canvas, that uses touchesBegan:withEvent:, touchesMoved:withEvent:, and touchesEnded:withEvent: to draw in the canvas. I also want to use a swipe to load the previous (or next) canvas in an array. I tried setting up the following gesture (and a similar one for the right):
UISwipeGestureRecognizer* leftSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget: self action: @selector(pageFlipNext)];
leftSwipe.direction = UISwipeGestureRecognizerDirectionLeft;
leftSwipe.numberOfTouchesRequired = 2;
[_canvas addGestureRecognizer: leftSwipe];
[leftSwipe release];
But my two fingered swipes are still being treated as one-fingered drawing instructions.
How do I get my app to handle the two-fingered swipe correctly?
First of all, I would start by verifying that _canvas.multipleTouchEnabled is set to YES. If it isn't, set it to YES.
You might also want to consider
leftSwipe.delaysTouchesBegan = YES;
This delays delivery of the touches to _canvas until the gesture fails. You can also use a UIPanGestureRecognizer and do something like this:
[pan requireGestureRecognizerToFail:leftSwipe];
I think that you can use UIPanGestureRecognizer:
UIPanGestureRecognizer is a concrete subclass of UIGestureRecognizer that looks for panning (dragging) gestures. The user must be pressing one or more fingers on a view while they pan it.
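Putting the suggestions above together, a sketch of the canvas setup (assuming a pageFlipNext action as in the question, plus a matching hypothetical pageFlipPrevious on the controller):

```objc
_canvas.multipleTouchEnabled = YES; // otherwise the second finger is never reported

UISwipeGestureRecognizer *leftSwipe = [[UISwipeGestureRecognizer alloc]
    initWithTarget:self action:@selector(pageFlipNext)];
leftSwipe.direction = UISwipeGestureRecognizerDirectionLeft;
leftSwipe.numberOfTouchesRequired = 2;
leftSwipe.delaysTouchesBegan = YES; // hold touches back until the swipe fails
[_canvas addGestureRecognizer:leftSwipe];
[leftSwipe release];

UISwipeGestureRecognizer *rightSwipe = [[UISwipeGestureRecognizer alloc]
    initWithTarget:self action:@selector(pageFlipPrevious)];
rightSwipe.direction = UISwipeGestureRecognizerDirectionRight;
rightSwipe.numberOfTouchesRequired = 2;
rightSwipe.delaysTouchesBegan = YES;
[_canvas addGestureRecognizer:rightSwipe];
[rightSwipe release];
```

With delaysTouchesBegan set, one-fingered touches still reach the canvas's touchesBegan:withEvent: drawing code once the two-finger swipe has failed to recognize.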

Custom actions for UIGestureRecognizers (with custom parameters)

Short version of my problem:
I cannot figure out how to make the "action" for my UITapGestureRecognizer take additional parameters, and actually use them.
Here's the rundown of my problem:
I am trying to make it so that my iPad app records (with NSLog) the coordinates of the UITouch that occurs whenever they press one of my app's UIButtons. The location of the touch needs to be relative to the button that was touched.
What I've done:
I have implemented a UITapGestureRecognizer and added it to each of my buttons. My problem is with the action to use, since it needs to be dynamic for each and every button.
I currently have this code:
UITapGestureRecognizer *iconClickRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(logIcon:withTag:)];
[iconClickRecognizer setNumberOfTapsRequired:1];
[iconClickRecognizer setNumberOfTouchesRequired:1];
[iconClickRecognizer setDelegate:self];
[[self.view viewWithTag:1] addGestureRecognizer:iconClickRecognizer];
[iconClickRecognizer release];
When I know that this works, I will use a for-loop to add the iconClickRecognizer to all of the buttons by their tag.
The logIcon:withTag: method is shown here:
- (void)logIcon:(UIGestureRecognizer *)gestureRecognizer withTag:(int)tag {
    NSLog(@"tag X: %f", [gestureRecognizer locationInView:[self.view viewWithTag:tag]].x);
    NSLog(@"tag Y: %f", [gestureRecognizer locationInView:[self.view viewWithTag:tag]].y);
}
What isn't working:
When I hard-code a tag into logIcon method, it records the information correctly. However, I do not know how to make this method dynamic, and actually use the "tag" parameter.
Any help would be greatly appreciated.
Thanks,
Alex
The issue is that target/action registration gives you only one argument: the sender (in this case the gesture recognizer itself). You can't pass arbitrary context. But in your action you can check the recognizer's view property and examine that view's tag.
The docs for the UIGestureRecognizer class specify that the action must have one of these forms:
- (void)handleGesture;
- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer;
not your form of:
- (void)logIcon:(UIGestureRecognizer *)gestureRecognizer withTag:(int)tag
So you could ask the gestureRecognizer where it is on the whole window, then compare to your buttons, or you could walk through your buttons and ask the gesture where it is with respect to each button.
Probably best would be to subclass UIButton and make each button itself the target; then you know exactly what view you're in.
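For illustration, a sketch of the single-argument action form described above, recovering the tapped button via the recognizer's view property rather than an extra parameter:

```objc
// Register with a one-argument selector instead of logIcon:withTag:
// [recognizer initWithTarget:self action:@selector(logIcon:)]
- (void)logIcon:(UIGestureRecognizer *)gestureRecognizer {
    UIView *button = gestureRecognizer.view; // the view the recognizer is attached to
    CGPoint p = [gestureRecognizer locationInView:button]; // relative to that button
    NSLog(@"tag %ld X: %f", (long)button.tag, p.x);
    NSLog(@"tag %ld Y: %f", (long)button.tag, p.y);
}
```

Because the recognizer already knows its view, the tag no longer needs to be passed in; the same action can be shared by every button in the for-loop.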