I used the code below to detect a touch on the object:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSArray *allTouches = [touches allObjects];
    for (UITouch *touch in allTouches) {
        NSLog(@"TOUCH DETECT");
    }
}
but it is never triggered.
Any comments are welcome.
What kind of object? touchesBegan:withEvent: is only called for UIResponder subclasses.
Also, if it's a UIView, make sure userInteractionEnabled is YES.
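As an illustration, here is a minimal sketch of a UIView subclass that will actually receive the callback; the class name TouchableView is mine, not from the question:
// TouchableView.h -- hypothetical example
#import <UIKit/UIKit.h>

@interface TouchableView : UIView
@end

// TouchableView.m
@implementation TouchableView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"TOUCH DETECT");
}

@end
Remember that userInteractionEnabled defaults to YES for a plain UIView but to NO for UIImageView, so an image view needs it switched on explicitly before any touch methods fire.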
Related
How can I get a touch event only on a specified view, not on other views?
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:drawImage];
}
In this code the touch comes from anyObject. Is there any way I can restrict the touch handling to only a specific view?
You can use the view property on UITouch to find which view the touch occurred in:
@property(nonatomic, readonly, retain) UIView *view;
UITouch
Or you can find all the touches for a particular view from the UIEvent parameter:
- (NSSet *)touchesForView:(UIView *)view;
UIEvent
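A hedged sketch of both approaches inside touchesEnded:withEvent: — the drawImage name is taken from the question, the rest is illustrative:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];

    // Approach 1: only react if the touch belongs to drawImage
    if (touch.view == drawImage) {
        currentPoint = [touch locationInView:drawImage];
    }

    // Approach 2: ask the event for the touches that belong to drawImage
    NSSet *touchesInDrawImage = [event touchesForView:drawImage];
    if ([touchesInDrawImage count] > 0) {
        // handle only the touches that occurred in drawImage
    }
}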
I'm developing a "match the correct pair" quiz app.
There will be two sections (for example, animals and legs).
In the animals section there will be hen, dog, cow, fox, etc. (these options are static).
Under the legs section there will be 2, 4, 5, 6, etc., in random order.
Suppose the user selects "2" and moves it onto "hen": the match is correct and a point is awarded.
If the user selects "2" and moves it onto "dog", the match is wrong and no point is awarded.
How can I implement the moving functionality, and how can I check for a correct match?
I hope I have explained the problem in enough detail; if not, I'm happy to explain more.
Can anyone please help me? Thanks in advance.
Since the layout is static, you will already have the frame of each label (leg). You can find which label has been selected by using CGRectContainsPoint in the touchesBegan:withEvent: method.
You can move the label by updating its frame continuously in touchesMoved:withEvent:.
Then you have to check whether this label intersects any label on the animal side (see the sketch below).
Edit: for moving a view, please refer to Listing 2-4 in this link.
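A minimal sketch of that idea, assuming the leg labels live in a legLabels array, the animal labels in an animalLabels array, and a hypothetical isLegCount:correctForAnimal: helper does the answer lookup (all of these names are mine):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    for (UILabel *legLabel in legLabels) {
        if (CGRectContainsPoint(legLabel.frame, point)) {
            draggedLabel = legLabel;   // remember which leg label was picked up
            break;
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    draggedLabel.center = point;       // follow the finger
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UILabel *animalLabel in animalLabels) {
        if (CGRectIntersectsRect(draggedLabel.frame, animalLabel.frame)) {
            // hypothetical lookup of the correct leg count for this animal
            if ([self isLegCount:draggedLabel.text correctForAnimal:animalLabel.text]) {
                score++;               // correct match: award a point
            }
            break;
        }
    }
    draggedLabel = nil;
}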
If you don't need to drag, just create some buttons for your legs section and set their titles (or images) from your random values. In the callbacks for these buttons, remember which one was pressed (optionally, set the other buttons' userInteractionEnabled property to NO). Then make more buttons for your animals section; when the user presses one of them, just animate the leg button to any position:
[UIView animateWithDuration:1 animations:^{
    [button setCenter:CGPointMake(X, Y)];
}];
Just in case: in your button's callback you have a "sender" argument, which is the pressed button, so you can use it directly:
[UIView animateWithDuration:1 animations:^{
    [sender setCenter:CGPointMake(X, Y)];
}];
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    touchDone = YES;
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self.view];
    [self loadingView];
}

- (void)loadingView {
    if (touchDone) {
        // use the point captured in touchesBegan:withEvent:
        drawImage.frame = CGRectMake(currentPoint.x, currentPoint.y, 70, 70);
    }
}
In the .h file of the view controller:
@interface TouchViewController : UIViewController {
    IBOutlet UIView *viewImage;
    IBOutlet UILabel *touchesLabel;
    CGPoint lastPoint;
}

@property (nonatomic, retain) IBOutlet UIView *viewImage;
@property (nonatomic, retain) IBOutlet UILabel *touchesLabel;

- (void)redrawView:(CGPoint)lpoint;

@end
In the .m implementation file:
@implementation TouchViewController

@synthesize touchesLabel, viewImage;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    lastPoint = [touch locationInView:self.view];
    [self redrawView:lastPoint];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    lastPoint = [touch locationInView:self.view];
    [self redrawView:lastPoint];
}

- (void)redrawView:(CGPoint)lpoint {
    viewImage.frame = CGRectMake(lpoint.x, lpoint.y, 40, 30);
}

- (void)dealloc {
    [viewImage release];
    [touchesLabel release];
    [super dealloc];
}

@end
Note: viewImage is of UIView type but contains the image. Can you tell me how to upload files on Stack Overflow? I will upload my project here if you want.
How can I call a method based on the y coordinate of a UIImageView?
To be more specific, I have a draggable UIImageView. When its y coordinate is greater than a certain value, I want the colour of the screen to change.
Code for dragging:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == toggle) {
        CGPoint location = [touch locationInView:self.view];
        CGPoint newLocation = CGPointMake(toggle.center.x, location.y);
        toggle.center = newLocation;
        NSLog(@"%f\n", toggle.center.y);
    }
}
I can think of two ways of doing this. The first is Key-Value Observing, via the following:
[self.toggle addObserver:inspector
              forKeyPath:@"frame"
                 options:NSKeyValueObservingOptionNew
                 context:NULL];
I'm not sure if the frame property is Key-Value Coding compliant, though.
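If you do try the KVO route, the observer (the inspector object in the snippet above) would implement something along these lines; the threshold value is made up for illustration:
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"frame"]) {
        UIView *observedView = (UIView *)object;
        if (observedView.frame.origin.y > 200.0) {
            // change the screen colour, call your method, etc.
        }
    }
}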
Or, and this is what I would recommend, you can:
(1) Create a [Prefix]ToggleViewLocationDelegate protocol, and add a locationDelegate property to your view/view controller, like so:
// YourViewOrViewControllerClass.h
// ...
@class YourViewOrViewControllerClass;

@protocol [Prefix]ToggleViewLocationDelegate
- (void)toggleViewDidChangeFrame:(UIView *)toggleView;
@end

@interface YourViewOrViewControllerClass : UIViewController // or UIView, as appropriate
@property (nonatomic, assign) id<[Prefix]ToggleViewLocationDelegate> locationDelegate;
...
@end
(2) Make the class interested in the coordinate changes conform to the protocol, and
(3) In the touchesMoved:withEvent: method, inform the locationDelegate, like so:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch.view == toggle)
    {
        /* Fetch the new touch location */
        CGPoint location = [touch locationInView:self.view];

        /* Set the toggle's location */
        CGPoint newLocation = CGPointMake(toggle.center.x, location.y);
        toggle.center = newLocation;

        /* Inform the delegate when applicable */
        if (self.locationDelegate != nil)
        {
            [self.locationDelegate toggleViewDidChangeFrame:toggle];
        }
    }
}
In the following code, self.viewDelegate is 0x0 and I don't know how to solve this problem. Do I have to do something in Interface Builder?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch tapCount] == 2) {
        if ([self.viewDelegate respondsToSelector:@selector(openFlowView:doubleTapOnIndex:itemView:)]) {
            [self.viewDelegate openFlowView:self doubleTapOnIndex:selectedCoverView.number itemView:selectedCoverView];
        }
    }
}
viewDelegate is never set. Usually, in Apple classes, the view delegate is just called delegate; you have probably taken this code from somewhere. You have to assign viewDelegate in your code (not in Interface Builder):
self.viewDelegate = xxxx;
xxxx must be your delegate object; the delegate could even be self. See the Apple documentation for an explanation of delegates.
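As a sketch of what that assignment might look like when the view controller acts as the delegate (the class, protocol, and property names here are assumptions, not from the question):
// In the view controller that owns the open-flow style view:
@interface MyViewController : UIViewController <MyOpenFlowViewDelegate>
@end

@implementation MyViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // The view controller itself acts as the delegate.
    self.openFlowView.viewDelegate = self;
}

// Called by the view on a double tap.
- (void)openFlowView:(MyOpenFlowView *)view doubleTapOnIndex:(int)index itemView:(UIView *)itemView {
    NSLog(@"Double tap on item %d", index);
}

@end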
I am working to make my iPhone app compatible with the iPad. The user can tap or swipe the screen to activate certain functions, and it works fine in my iPhone version. There is a UIScrollView on the page which I have subclassed to make it "swipeable", i.e. it passes all of its touch methods up to its superview like so:
@implementation SwipeableScrollView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    [self.superview touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    [self.superview touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    [self.superview touchesEnded:touches withEvent:event];
}

@end
This works fine on the iPhone version, passing both taps and swipe gestures, but on the iPad, I get the following strange behavior:
Taps are passed to the superview properly.
But, swipe gestures are not passed at all.
Any idea why this would be working on the iPhone but not the iPad?
The problem is that UIScrollView on the iPad cancels content touches very quickly, even if canCancelContentTouches is set to NO. Also, overriding -touchesShouldCancelInContentView: does not help. Read more here: link text
While overriding touch events was necessary on the iPhone, for the iPad Apple has introduced UIGestureRecognizers that make tasks like this much more straightforward and easy to implement. You will probably need to refactor your code to use them.
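A minimal sketch of the gesture-recognizer approach; the scrollView property and the handleSwipe: selector are names I made up:
// In the view controller that owns the scroll view:
- (void)viewDidLoad {
    [super viewDidLoad];
    UISwipeGestureRecognizer *swipe =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(handleSwipe:)];
    swipe.direction = UISwipeGestureRecognizerDirectionLeft;
    [self.scrollView addGestureRecognizer:swipe];
    [swipe release]; // omit this line under ARC
}

- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    NSLog(@"Swipe detected");
}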
You can make a custom UIScrollView and implement your own delegate protocol, so you can do what you want with the scroll view. It works fine for me.
#import <UIKit/UIKit.h>

@protocol ScrollingDelegate <NSObject>
@required
- (void)scrolltouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)scrolltouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)scrolltouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
@end
Implementation
@interface CustomScrollView : UIScrollView

@property (nonatomic, strong) id<ScrollingDelegate> scrollDelegate;

@end
#import "CustomScrollView.h"
#implementation CustomScrollView
#synthesize scrollDelegate;
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self) {
// Initialization code
}
return self;
}
- (id)initWithCoder:(NSCoder *)aDecoder
{
self=[super initWithCoder:aDecoder];
if (self)
{
}
return self;
}
-(BOOL)touchesShouldCancelInContentView:(UIView *)view
{
return NO;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
[self.scrollDelegate scrolltouchesBegan:touches withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
[self.scrollDelegate scrolltouchesMoved:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
[self.scrollDelegate scrolltouchesEnded:touches withEvent:event];
}
-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
}
#end
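To use it, the owning view controller conforms to ScrollingDelegate and sets itself as the scroll delegate. A hedged usage sketch (the controller and property names are mine):
@interface MyViewController : UIViewController <ScrollingDelegate>
@property (nonatomic, strong) CustomScrollView *customScrollView;
@end

@implementation MyViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.customScrollView = [[CustomScrollView alloc] initWithFrame:self.view.bounds];
    self.customScrollView.scrollDelegate = self;   // receive the forwarded touch events
    [self.view addSubview:self.customScrollView];
}

// ScrollingDelegate callbacks forwarded from the scroll view
- (void)scrolltouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touches began in scroll view");
}

- (void)scrolltouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
}

- (void)scrolltouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}

@end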