I am a bit stuck with a piece of functionality that should perform a certain task after a double tap on a certain place on a UIView. I know how to count the number of taps, but I do not know how to determine which place has been tapped; I guess I need to compare it with the CGRect of the view that was specified for this action.
Thanks in advance.
You can detect it in touchesBegan:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];
    NSUInteger numTaps = [t tapCount];
    if (numTaps == 2) // double tap
    {
        CGPoint p1 = [t locationInView:self.view];
    }
}
numTaps gives the number of taps, and p1 holds the point where the tap landed.
All the best.
Use
    CGPoint point = [touch locationInView:self.view];
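To tie this back to the original question (deciding whether the double tap landed inside a particular area), here is a minimal sketch, where targetRect is a hypothetical CGRect in self.view's coordinate system that you define for the action:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch.tapCount == 2) // second tap of a double tap
    {
        CGPoint point = [touch locationInView:self.view];
        if (CGRectContainsPoint(targetRect, point))
        {
            // the double tap landed inside the area of interest
        }
    }
}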
In my application I'm using the touchesMoved method to detect left/right swipes.
When the user swipes left and right continuously, the image animation updates automatically.
I was able to detect the swipe action, but sometimes when I start to swipe left and right continuously, the screen doesn't detect the touchesMoved event.
In the swipe area I have placed one hidden button and a few image views for the animations.
I want to know why this happens. Please help me.
Thank you.
Code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    CGPoint newLocation = [touch locationInView:self.view];
    CGPoint oldLocation = [touch previousLocationInView:self.view];
    if (newLocation.x - oldLocation.x > 0) {
        swipe_direction = 1;
        //NSLog(@"left");
    }
    else {
        swipe_direction = 2;
        //NSLog(@"right");
    }
    if (swipe_direction == 1) {
        //animate images
    }
    else if (swipe_direction == 2) {
        //animate images
    }
}
touchesMoved: only detects touches on the empty part of the view, so it will not be called for touches that land on the objects you placed there.
Place a UISwipeGestureRecognizer on the view and handle the swipe from there.
Have you considered using a UISwipeGestureRecognizer instead of touchesMoved:?
Check the Documentation
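A minimal sketch of that approach (handleSwipe: is a hypothetical handler name; each recognizer is attached to the controller's view, so the swipe can also start over the button or the image views):
- (void)viewDidLoad {
    [super viewDidLoad];

    UISwipeGestureRecognizer *swipeLeft =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
    swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
    [self.view addGestureRecognizer:swipeLeft];

    UISwipeGestureRecognizer *swipeRight =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
    swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
    [self.view addGestureRecognizer:swipeRight];
}

- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    if (recognizer.direction == UISwipeGestureRecognizerDirectionLeft) {
        // animate images for a left swipe
    }
    else {
        // animate images for a right swipe
    }
}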
I have a view controller, and inside it I plan to place some controls like buttons, text boxes, etc. I can drag my view along the x axis with the following code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == ViewMain)
    {
        CGPoint location = [touch locationInView:self.view];
        displaceX = location.x - ViewMain.center.x;
        displaceY = ViewMain.center.y;
        startPosX = location.x - displaceX;
    }
    CurrentTime = [[NSDate date] timeIntervalSince1970];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == ViewMain)
    {
        CGPoint location = [touch locationInView:self.view];
        location.x = location.x - displaceX;
        location.y = displaceY;
        ViewMain.center = location;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    double time = [[NSDate date] timeIntervalSince1970] - CurrentTime;
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == ViewMain)
    {
        CGPoint location = [touch locationInView:self.view];
        location.x = location.x - displaceX;
        location.y = displaceY;
        ViewMain.center = location;
        double speed = (ViewMain.center.x - startPosX) / (time * 2);
        NSLog(@"speed: %f", speed);
    }
}
Note that I have to add these variables:
float displaceX = 0;
float displaceY = 0;
float startPosX = 0;
float startPosY = 0;
double CurrentTime;
The reason I created those variables is so that when I start dragging, the view moves from the point where I touch it instead of jumping so its middle is under my finger.
Anyway, if I touch a button or image, the view will not drag, even though the images have transparent backgrounds. I want to still be able to drag the view regardless of whether there is an image on top of it. I was thinking that maybe I need to place a large transparent view on top of everything, but I still need working buttons, images, etc. I want to be able to drag the view just like you can drag the iPhone home screen (SpringBoard).
Note that there I am able to drag regardless of whether I first touch an app icon, image, or text. How could I do that?
I think your problem is that if you touch a UIButton or a UIImageView with interaction enabled, it doesn't pass the touch along.
For the images, uncheck the User Interaction Enabled property in IB.
For the buttons that are causing touchesBegan:withEvent:, etc. not to get called, look at the following link: Is there a way to pass touches through on the iPhone?.
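As an illustration of that, here is a sketch of one possible pass-through approach (PassThroughButton is a hypothetical name, and this is not necessarily what the linked answer describes): the button keeps working, but also forwards its touches up the responder chain so your touchesBegan:/touchesMoved:/touchesEnded: still run. For image views created in code, myImageView.userInteractionEnabled = NO has the same effect as the IB checkbox.
@interface PassThroughButton : UIButton
@end

@implementation PassThroughButton
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    [self.nextResponder touchesBegan:touches withEvent:event]; // let the superview / controller see it too
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    [self.nextResponder touchesMoved:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    [self.nextResponder touchesEnded:touches withEvent:event];
}
@end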
You may want to consider a different approach to this problem. Rather than trying to manually manage the content scrolling yourself you would probably be better off using a UIScrollView with the pagingEnabled property set to YES. This is the method Apple recommends (and it's probably the method used by Springboard.app in your last screenshot). If you are a member of the iOS developer program check out the WWDC 2010 session on UIScrollView for an example of this. I think they may have also posted sample code on developer.apple.com.
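A minimal sketch of that approach, assuming three full-width pages built in code (the page count and page contents are placeholders):
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds];
scrollView.pagingEnabled = YES;                 // snap to page boundaries, like SpringBoard
scrollView.showsHorizontalScrollIndicator = NO;

NSUInteger pageCount = 3;
CGSize pageSize = self.view.bounds.size;
scrollView.contentSize = CGSizeMake(pageSize.width * pageCount, pageSize.height);

for (NSUInteger i = 0; i < pageCount; i++) {
    UIView *page = [[UIView alloc] initWithFrame:
        CGRectMake(i * pageSize.width, 0, pageSize.width, pageSize.height)];
    // add your buttons, image views, etc. to each page here
    [scrollView addSubview:page];
}
[self.view addSubview:scrollView];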
I have a UIImageView icon named IconView. I want to make sure that when I touch and move this icon, its position only changes inside the boundary of another UIImageView named backgroundView.
I thought that after I add IconView as a subview of backgroundView, the boundary would be set automatically, but that seems to be wrong:
[backgroundView addSubview:IconView];
After this, I can still move the icon outside of backgroundView.
How can I set the limitation? Thanks.
Try something like this:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if (CGRectContainsPoint(backgroundView.frame, location)) {
        // do your moving stuff
    }
}
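If you want the icon constrained while the finger is still moving rather than only checked when the touch ends, a sketch along these lines may help. It assumes user interaction is enabled on both image views so the touches are delivered, and that this method lives wherever you currently handle touches; the icon's centre is clamped so its frame never leaves backgroundView:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Work in backgroundView's coordinate system, since IconView is its subview
    CGPoint p = [touch locationInView:backgroundView];

    CGFloat halfW = IconView.bounds.size.width  / 2.0;
    CGFloat halfH = IconView.bounds.size.height / 2.0;

    // Clamp the centre so the icon cannot cross backgroundView's edges
    p.x = MAX(halfW, MIN(p.x, backgroundView.bounds.size.width  - halfW));
    p.y = MAX(halfH, MIN(p.y, backgroundView.bounds.size.height - halfH));

    IconView.center = p;
}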
I have the following code to determine whether a touch is within an image view in my table cell; I compare the two with CGRectContainsPoint. However, it doesn't work. Here is the code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Declare the touch and get its location
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self];
    if (CGRectContainsPoint(myImageView.frame, touchLocation))
    {
        NSLog(@"Tapped image view");
    }
}
Thanks for the help!
However, it doesn't work.
Please be more specific.
UITouch *touch = [touches anyObject];
Why not examine every touch, and not simply a random* pick of them?
*The documentation for anyObject says that you are not guaranteed which one it will give you. You are not even guaranteed that it will be random; it could be the same object every time. Murphy's Law says that, whether it is random or not, it will be the wrong one.
CGPoint touchLocation = [touch locationInView:self];
if (CGRectContainsPoint(myImageView.frame, touchLocation))
Your frame is in your superview's co-ordinate system; [touch locationInView:self] returns the touch point in your co-ordinate system. You want to test within bounds, which is in your co-ordinate system. The documentation explains the difference.
The problem is that you need to call [touch locationInView:myImageView] to get the point in the image view's own coordinate system, and then check whether it falls within the image view's bounds.
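A minimal sketch pulling the above suggestions together (every touch examined, the point converted into the image view's own coordinate system, and the test done against bounds, which lives in that same system):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint p = [touch locationInView:myImageView];
        if (CGRectContainsPoint(myImageView.bounds, p)) {
            NSLog(@"Tapped image view");
        }
    }
}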
UITouch *touch = [[event allTouches] anyObject];
CGPoint location = [touch locationInView:self];
if ([touch view] == view) {
    view.center = location;
}
Write this in the touchesMoved: event. Thanks.
Remember that when you ask a touch for locationInView:, you get back a point in that view's own coordinate system. So, assuming the code snippet you gave is contained in a subclass of UIViewController, you should be asking for
CGPoint touchLocation = [touch locationInView:self.view];
which will give you a point relative to your view. The reason you want a point relative to your current view is that the frame of your image view is also relative to its parent view, the same view. So now it should work:
if (CGRectContainsPoint(myImageView.frame, touchLocation)) {
    NSLog(@"Tapped image view");
}
Trying to get some basic drag/drop functionality happening for an iPhone application.
My current code for trying to do this is as follows:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    self.center = location;
}
This code results in the touched UIView flickering while it follows the touch around. After playing around with it a bit, I also noticed that the UIView seems to flicker between the 0,0 position on the screen and the currently touched location.
Any ideas what I'm doing wrong?
The center property needs to be specified in the coordinate system of the superview, but you've asked the touch event for the location in terms of your subview. Instead ask for them based on the superview's coordinates, like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.superview]; // <--- note self.superview
    self.center = location;
}