I'm working in a view controller that has a few views added to it as subviews, and I have a touchesBegan: method:
UIImageView *testImage = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"test.png"]];
testImage.frame = CGRectMake(0, 0, 480, 280);
[self.view addSubview:testImage];
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    if (point.y >= 280 && point.y <= 320)
    {
        if (point.x >= 0 && point.x <= 160)
        {
            [self menu1];
        }
        if (point.x >= 161 && point.x <= 320)
        {
            [self menu2];
        }
        if (point.x >= 321 && point.x <= 480)
        {
            [self menu3];
        }
    }
}
My question is: in that method, how can I tell which view was touched? I've been doing it with those screen coordinates, but that won't work if I also move the views at runtime.
Is there a way to see which view was touched from the touches set, from the event, or from this line above:
UITouch *touch = [touches anyObject];
Any help appreciated :)
Say you have a view controller with these ivars (connect to controls in Interface Builder)
IBOutlet UILabel *label;
IBOutlet UIImageView *image;
To tell whether a touch hit these items or the background view, add this method to your view controller:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == label) {
        NSLog(@"touched the label");
    }
    if ([touch view] == image) {
        NSLog(@"touched the image");
    }
    if ([touch view] == self.view) {
        NSLog(@"touched the background");
    }
}
Any view you want to respond to touches (a plain UIView, a UILabel, a UIImageView, etc.) must have its userInteractionEnabled property set to YES; note that UILabel and UIImageView have it set to NO by default.
[touch view] gives you the view in which the touch initially occurred (i.e., this view stays the same even if the user moves the finger off the view during the touch).
If that is not the behavior you require, use:
[self.view hitTest:[touch locationInView:self.view] withEvent:event];
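For example, a minimal sketch of that approach inside touchesMoved: (label and image are the outlets from above; hitTest: returns whichever view is currently under the finger, not the view where the touch started):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    // Ask the view hierarchy which view is under the current touch location.
    UIView *viewUnderFinger = [self.view hitTest:[touch locationInView:self.view] withEvent:event];
    if (viewUnderFinger == label) {
        NSLog(@"finger is over the label");
    } else if (viewUnderFinger == image) {
        NSLog(@"finger is over the image");
    } else if (viewUnderFinger == self.view) {
        NSLog(@"finger is over the background");
    }
}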
I might be missing something, but wouldn't your "menuX" elements have their own rects describing their sizes and locations? Then all you do is ask if the point is within those rectangles.
Why are you implementing your own hit-testing? It's trivial just to place transparent buttons wherever you want them.
Related
I have subclassed UIView. Initially the view is drawn in a default color, and on touch I need to fill it with a different color (from x = 0 up to the point the user touched). The problem is that touchesMoved keeps reporting points even when I drag outside my view's bounds; how do I restrict it to my view's bounds only?
I googled and tried the snippet below, but with no luck:
if ([self pointInside:point withEvent:nil]) {
    [self fillColor];
}
My touchesMoved method is as follows:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    endPoint = point;
    NSLog(@"moved x: %f, y: %f", point.x, point.y);
    if (CGRectContainsPoint([self frame], endPoint)) { // this also not working
        [self fillColor];
    }
}
Any help is appreciated in advance.
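For reference, a hedged sketch of the kind of check being attempted, written inside the UIView subclass itself; the point from locationInView:self is in the view's own coordinate space, so it is tested against bounds (frame is in the superview's coordinates and won't match):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    // bounds is in the view's own coordinate space, so this only
    // succeeds while the finger is inside the view.
    if (CGRectContainsPoint(self.bounds, point)) {
        endPoint = point;
        [self fillColor];
    }
}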
Just set a tag on the view in viewDidLoad: and use the logic below:
fillColorView.tag = 111;
Then use this logic in your touchesMoved: method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *tap = [touches anyObject];
    CGPoint pointToMove = [tap locationInView:fillColorView];
    if ([tap.view isKindOfClass:[UIView class]])
    {
        UIView *tempView = (UIView *)tap.view;
        if (tempView.tag == 111) {
            [self fillColor];
        }
    }
}
Hope this helps you.
In your touchesMoved method, in CGPoint point = [touch locationInView:self]; replace self with the view in which you want the touch to be tracked.
Passing self gives you the complete view; pass your drawingView there instead, so that touches are detected only on that view.
I've added some UIImageViews dynamically and filled them with different images. What I'm trying to do is let the user set the position of any UIImageView. For that I used:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Here I want the particular UIImageView object the user touched.
}
In that method I'm logging:
NSLog(@"%@", [touches anyObject]);
It prints this output:
<UITouch: 0x68b95e0> phase: Began tap count: 1 window: <UIWindow: 0x68875d0; frame = (0 0; 320 480); layer = <UIWindowLayer: 0x68b6470>> view: <UIImageView: 0x6a74cf0; frame = (83.7763 83.7763; 182.447 182.447); transform = [0.968912, -0.247404, 0.247404, 0.968912, 0, 0]; alpha = 0.8; opaque = NO; tag = 3; layer = <CALayer: 0x6a74980>> location in window: {161, 230} previous location in window: {161, 230} location in view: {52.7761, 105.448} previous location in view: {52.7761, 105.448}
Note that the output above shows the UIImageView object I touched, but I want to get hold of that object itself.
How do I get the UIImageView the user touched? I have already set userInteractionEnabled = YES, so the problem isn't that.
I used the code below, but it doesn't give me what I want:
NSInteger tag = [[[touches anyObject] view] tag]; // this only returns the view's tag, not the view itself
I googled but didn't find a solution.
Thank you in advance for any help!
Here you go:
This is for one image view only; you can detect the others with the same kind of if statement.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Convert the touch into the image view's own coordinate space before testing.
    CGPoint touch_point = [touch locationInView:imageView];
    if ([imageView pointInside:touch_point withEvent:event])
    {
        NSLog(@"point inside imageview");
    }
}
or you can also do this :p
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch.view == iv)
    {
        NSLog(@"i got you");
    }
}
like this (iv and iv2 are two different UIImageViews):
if (touch.view == iv)
{
    NSLog(@"i got you");
}
if (touch.view == iv2)
{
    NSLog(@"i got you too :p");
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([[touch valueForKey:@"view"] isKindOfClass:[UIImageView class]])
    {
        UIImageView *viewSelected = (UIImageView *)[touch valueForKey:@"view"]; // returns the touched object
        // to tell the image views apart, you can use viewSelected.tag
    }
}
Code to get only the X and Y coords from the UIImageView:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touch_point = [touch locationInView:self.imgView];
    if ([imgView pointInside:touch_point withEvent:event])
    {
        NSLog(@"point inside imageview");
        cords = [NSString stringWithFormat:@"%f,%f", touch_point.x, touch_point.y];
        NSLog(@"cords are %@", cords);
    }
}
You can use a UIButton with an image instead of a UIImageView.
If you'd like to call your own function, it's easy to handle events like Touch Up Inside or Touch Cancel by adding a target and selector.
UIButton *yourBtn = [UIButton buttonWithType:UIButtonTypeRoundedRect];
/* depends on how you position it dynamically */
yourBtn.frame = CGRectMake(40, 140, 240, 30);
/* if you prefer an image only, there is no need to set a title */
[yourBtn setTitle:@"Your Button Title" forState:UIControlStateNormal];
/* choose your own action method for each UIButton */
[yourBtn addTarget:self action:@selector(yourFunction) forControlEvents:UIControlEventTouchUpInside];
/* the button will appear at the position you defined */
[self.view addSubview:yourBtn];
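If you prefer an image-only button, a minimal sketch (assuming an image named test.png is in the bundle; UIButtonTypeCustom avoids the rounded-rect border):
UIButton *imageBtn = [UIButton buttonWithType:UIButtonTypeCustom];
imageBtn.frame = CGRectMake(40, 180, 240, 30);
/* set an image for the normal state instead of a title */
[imageBtn setImage:[UIImage imageNamed:@"test.png"] forState:UIControlStateNormal];
[imageBtn addTarget:self action:@selector(yourFunction) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:imageBtn];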
Hope It helps you!
1. Subclass UIImageView and implement:
Responding to Touch Events
– touchesBegan:withEvent:
– touchesMoved:withEvent:
– touchesEnded:withEvent:
– touchesCancelled:withEvent:
Responding to Motion Events
– motionBegan:withEvent:
– motionEnded:withEvent:
– motionCancelled:withEvent:
or:
2. Add a UIGestureRecognizer to each UIImageView.
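For option 2, a minimal sketch using a UITapGestureRecognizer (the handler name imageTapped: is illustrative; userInteractionEnabled must be YES on each image view):
// when creating each image view:
imageView.userInteractionEnabled = YES;
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageTapped:)];
[imageView addGestureRecognizer:tap];

// the recognizer's view property is the image view that was tapped
- (void)imageTapped:(UITapGestureRecognizer *)recognizer
{
    UIImageView *tapped = (UIImageView *)recognizer.view;
    NSLog(@"tapped image view with tag %ld", (long)tapped.tag);
}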
You could add the views you create to an NSMutableArray and then just look the touched view up.
I'm not at my Mac, but it's something similar to this:
NSInteger viewID = [_views indexOfObject:[[touches anyObject] view]];
This returns the index of the view; if the view isn't in the array it returns NSNotFound, so check for that:
if (viewID != NSNotFound) {
    // the view exists and is in the array
}
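Put together, a small sketch of that idea (assuming _views is an NSMutableArray ivar you fill as you create the image views):
// when creating each image view:
[_views addObject:imageView];

// in touchesBegan:withEvent::
NSInteger viewID = [_views indexOfObject:[[touches anyObject] view]];
if (viewID != NSNotFound) {
    UIImageView *touched = [_views objectAtIndex:viewID];
    NSLog(@"touched %@ at index %ld", touched, (long)viewID);
}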
I believe I may be dealing with some view issues which are not allowing me to detect and add UIImageView objects to an array. I could really use some suggestions.
I've got a number of UIImageViews with images, linked to a UIView that sits on top of a UIViewController (the UIView was added to help with drawRect: and for some other advantages). So, when I touch an image and drag and drop it, I want to add that image to an array when it's dropped.
My touchesBegan gets the location of the touch, checks which UIImageView is being touched, and then centers that view on the touch, as follows:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self]; // can't use self.view in a UIView, this may be causing issues?
    startLocation = location; // for reference as needed
    NSLog(@"Image x coord: %f", location.x);
    NSLog(@"Image y coord: %f", location.y);
    if ([touch view] == obiWan_Block) {
        obiWan_Block.center = location;
    } else if ([touch view] == r2d2_Block) {
        r2d2_Block.center = location;
    } else if ([touch view] == you_Block) {
        you_Block.center = location;
    }
}
Then I drag the UIImageView around with touchesMoved and finally 'drop' the image with touchesEnded. When the UIImageView is dropped in a certain area of the screen, I 'snap' it to a specific location. At this point I want to place the UIImageView into an array, but I'm having no luck. I believe I'm getting confused about the various views being touched and what's getting added via addObject: to my NSMutableArray, or I could be missing something completely. Here's my touchesEnded method:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self];
    UIView *temp = [touch view];
    UIImageView *currentImageView = (UIImageView *)[touch view]; // DON'T THINK THIS IS RIGHT
    NSLog(@"Current Image View is: %@", currentImageView);
    if ((location.x > dropZone.origin.x) && (location.y >= dropZone.origin.y) && ([touch view] != NULL))
    {
        CGRect frame = temp.frame;
        if (touchCount == 0) {
            frame.origin.x = 15;
            frame.origin.y = 180;
            temp.frame = frame;
            [quoteArray addObject:currentImageView];
            touchCount++;
        }
        else if (touchCount == 1) {
            frame.origin.x = 70;
            frame.origin.y = 180;
            temp.frame = frame;
            [quoteArray addObject:currentImageView];
            touchCount++;
        }
        ....
I have an NSLog statement to check if addObject is working as follows:
NSLog(#"the quote array contains %d items. Contents = %#",[quoteArray count], quoteArray);
The log always says:
the quote array contains 0 items. Contents = (null)
Please advise and thanks!
The last part of your code shows that quoteArray is uninitialized. Check the code where you create it; I guess you missed something in your init method. If the array were initialized correctly, NSLog would print something like this:
NSArray *quoteArray = [NSArray array];
NSLog(#"%#", quoteArray);
2011-11-17 17:37:00.506 TestApp[1382:207] ( )
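A minimal sketch of the fix, assuming quoteArray is meant to be an NSMutableArray ivar of the view controller:
- (void)viewDidLoad {
    [super viewDidLoad];
    // without this, quoteArray stays nil and every addObject: message is silently ignored
    quoteArray = [[NSMutableArray alloc] init];
}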
I have a project where a UITextView (for multilines) can be dragged around the screen. So far my solution to this has been an overlay of an invisible UIButton which when dragged its center is the same as the UITextView's center.
However, I've seen apps that seem to just allow the UITextView itself to be dragged and edited on the fly, so it seems there might not be an overlay in those, but I'm not sure.
Thoughts?
By the way, c in this code is the UIButton and this is how I have moved it thus far:
- (void)draggedOut:(UIControl *)c withEvent:(UIEvent *)ev
{
    if (self.interfaceOrientation == UIInterfaceOrientationPortrait)
    {
        c.center = [[[ev allTouches] anyObject] locationInView:self.view];
        AddedText.center = c.center;
    }
    else if (self.interfaceOrientation == UIInterfaceOrientationPortraitUpsideDown)
    {
        c.center = [[[ev allTouches] anyObject] locationInView:self.view];
        AddedText.center = c.center;
    }
    else if (self.interfaceOrientation == UIInterfaceOrientationLandscapeLeft)
    {
        c.center = [[[ev allTouches] anyObject] locationInView:self.view];
        AddedText.center = c.center;
    }
    else if (self.interfaceOrientation == UIInterfaceOrientationLandscapeRight)
    {
        c.center = [[[ev allTouches] anyObject] locationInView:self.view];
        AddedText.center = c.center;
    }
}
- (void)panTextView:(UIPanGestureRecognizer *)recognizer {
    NSLog(@"panning");
    location1 = [recognizer translationInView:draggableTextView];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + location1.x,
                                         recognizer.view.center.y + location1.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:draggableTextView];
    location1 = [recognizer locationInView:draggableTextView];
    NSLog(@"translation %@", NSStringFromCGPoint(location1));
    [_imgpic addSubview:recognizer.view];
    appDelegate.txt = draggableTextView.text;
}
Attach this method as the action of a UIPanGestureRecognizer after creating the text view.
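For example, a hedged sketch of wiring it up (assuming draggableTextView has already been created and added to its superview):
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panTextView:)];
[draggableTextView addGestureRecognizer:pan];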
Well, I haven't been able to manipulate the actual UITextView.
First I tried making a button overlay that could be moved and pressed to start editing, but it wasn't centered properly.
Then I tried the above method to move the UITextView itself, but it would only work on touches or drags. (Note this was a modified form of touchesBegan and touchesMoved.)
I ended up with a UIScrollView with the UITextView as a subview. Now it moves smoothly; the only drawback is that it can be moved from any place on the screen. Not optimal, but it's the best result so far that keeps everything else intact.
Does the textView need to support scrolling? If so, this could get complicated.
But if not, there are two approaches. 1) subclass the textview and override touchesBegan, touchesMoved, touchesEnded. 2) write a gesture recognizer that processes the same messages and attach it to the textview.
Here's an example of a Gesture recognizer that will do the job:
// Apple's docs say gesture-recognizer subclasses should import UIGestureRecognizerSubclass.h
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface TouchMoveGestureRecognizer : UIGestureRecognizer
{
    CGPoint _ptOffset;
}
@end

@implementation TouchMoveGestureRecognizer

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];
    _ptOffset = [t locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];
    CGPoint pt = [t locationInView:self.view.superview];
    pt.x -= _ptOffset.x;
    pt.y -= _ptOffset.y;
    CGRect r = self.view.frame;
    r.origin = pt;
    self.view.frame = r;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    _ptOffset = CGPointMake(-1, -1);
}

@end
and, how to use it:
- (void)viewDidLoad {
    [super viewDidLoad];
    _textView.scrollEnabled = NO;
    TouchMoveGestureRecognizer *gr = [[[TouchMoveGestureRecognizer alloc] init] autorelease];
    [_textView addGestureRecognizer:gr];
}
I have a UIScrollView which contains some small UIView subclasses. Scrolling is enabled on the UIScrollView, and I want each UIView to be freely draggable inside it.
My UIView subclass has these methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] != self) {
        return;
    }
    CGPoint touchPoint = [touch locationInView:self.superview];
    originalX = self.center.x;
    originalY = self.center.y;
    offsetX = originalX - touchPoint.x;
    offsetY = originalY - touchPoint.y;
    [self.superview bringSubviewToFront:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == self) {
        CGPoint location = [touch locationInView:self.superview];
        CGFloat x = location.x + offsetX;
        CGFloat y = location.y + offsetY;
        self.center = CGPointMake(x, y);
        return;
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == self) {
        self.center = CGPointMake(originalX, originalY);
    }
}
I found that touchesCancelled:withEvent: is called each time I drag the UIView just a few pixels. But this code works correctly if the class is a subclass of UIControl.
Why?
Thanks in advance!
UIScrollView tries to determine what kind of interaction the user has in mind. If you tap a view inside a scroll view, that view gets touchesBegan:. If the user then drags, the scroll view decides that the user wants to scroll, so it sends touchesCancelled: to the view that first got the event, and then handles the dragging itself.
To enable your own dragging of subviews, you can subclass UIScrollView and override touchesShouldBegin:withEvent:inContentView: and touchesShouldCancelInContentView:.
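A minimal sketch of that idea (DraggableItemView stands in for your draggable UIView subclass; the names are illustrative):
// a stand-in for the draggable UIView subclass from the question
@interface DraggableItemView : UIView
@end
@implementation DraggableItemView
@end

@interface DraggableScrollView : UIScrollView
@end

@implementation DraggableScrollView

// let draggable subviews receive the touch immediately
- (BOOL)touchesShouldBegin:(NSSet *)touches withEvent:(UIEvent *)event inContentView:(UIView *)view
{
    if ([view isKindOfClass:[DraggableItemView class]]) {
        return YES;
    }
    return [super touchesShouldBegin:touches withEvent:event inContentView:view];
}

// don't cancel (i.e. don't start scrolling) when the touch is on a draggable subview
- (BOOL)touchesShouldCancelInContentView:(UIView *)view
{
    if ([view isKindOfClass:[DraggableItemView class]]) {
        return NO;
    }
    return [super touchesShouldCancelInContentView:view];
}

@end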