This seems quite complicated to me, and I can't get it to work.
Basically, I have created 9 objects on the screen and I want to be able to drag all of them, but only one at a time.
E.g., drag item 1, stop dragging item 1. Drag item 3, stop dragging item 3. Drag item 2, stop dragging item 2.
I will put the *very* simplified code below. It will probably be pretty much useless; I just don't want to post all my code, as it is very badly written at the moment and doesn't make much sense.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
// pos is the current touch location and difference is the offset between
// the touch and the view's center; both are ivars set up elsewhere
self.center = CGPointMake(pos.x+difference.x, pos.y+difference.y);
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}
Set exclusiveTouch to YES on all 9 view objects. Then only one view at a time can receive touches.
See the UIView class reference for exclusiveTouch.
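A minimal sketch of what that looks like, assuming the nine views are collected in an array (dragViews is a hypothetical name, not from the question):

```
// Somewhere during setup, e.g. in the view controller's viewDidLoad:
for (UIView *view in dragViews) {
    // While this view has an active touch, the other views
    // will not receive new touches
    view.exclusiveTouch = YES;
}
```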
I am working on detecting alphabetic gestures in my app, so when the user draws a C on the screen a special action takes place, and so on. I am using a recognizer class that has predefined data about each alphabet's touch points, and the detection works fine. I want this feature on all my screens, so I add the methods below to the appDelegate class and detect touches in the window only. What happens here is that other views inside the screens, such as table views and scroll views, block the touch events from being sent to the window. If the window received them reliably, my code would work like a charm. Any help is appreciated.
- (void)processGestureData
{
NSString *gestureName = [recognizer findBestMatchCenter:&center angle:&angle score:&score];
NSLog(@"gesture Name: %@", gestureName);
if ([gestureName isEqualToString:@"N"] || [gestureName isEqualToString:@"n"])
{ // handle N gesture
}
if ([gestureName isEqualToString:@"C"])
{ // handle C gesture
}
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
[recognizer resetTouches];
[recognizer addTouches:touches fromView:self.window];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
[recognizer addTouches:touches fromView:self.window];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
[recognizer addTouches:touches fromView:self.window];
[self processGestureData];
}
I think you need a touch-intercepting window that sits above all views.
If your gesture is recognised, process it; otherwise pass the touch on to your view controller. Refer to this link for details.
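A minimal sketch of such an intercepting window, assuming the recognizer object from the question is reachable from the window subclass (how you get that reference is up to you):

```
@interface GestureWindow : UIWindow
@end

@implementation GestureWindow
- (void)sendEvent:(UIEvent *)event {
    // Feed every touch to the gesture recognizer first ...
    [recognizer addTouches:[event allTouches] fromView:self];
    // ... then let normal hit-testing deliver the event as usual,
    // so table views and scroll views keep working.
    [super sendEvent:event];
}
@end
```

Because sendEvent: sees every event before hit-testing delivers it, subviews can no longer hide touches from your recognizer.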
I am already iterating over the array of UIGestureRecognizers in the scroll view and setting maximumNumberOfTouches to 2, to allow both one-finger and two-finger swipe/drag gestures in my custom scroll view.
What I want is to distinguish a one-finger drag from a two-finger drag. Is there a way to achieve this?
The -(NSUInteger)numberOfTouches method of UIGestureRecognizer tells you how many touches it currently has.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
NSUInteger count = [[event allTouches] count]; // this should be 1 or 2
}
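Since the question is about a gesture recognizer rather than raw touches, a sketch using numberOfTouches in a pan handler may be closer to what you want (handlePan: is an assumed action method wired to the recognizer):

```
- (void)handlePan:(UIPanGestureRecognizer *)pan {
    if (pan.numberOfTouches == 2) {
        // two-finger drag
    } else {
        // one-finger drag
    }
}
```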
I have a UIScrollView that is set to have a clear background. Part of the scrollview does have content, but part does not (so it shows other views behind it). I would like to be able to click through the UIScrollView and to the MKMapView behind, but only for the transparent portion of the UIScrollView.
I have found some code which I am having a real hard time understanding how to get working:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
if (![self yourMethodThatDeterminesInterestingTouches:touches withEvent:event])
[self.nextResponder touchesBegan:touches withEvent:event];
}
Could someone help me wrap my mind around how to forward a touch event to a view that is behind another view? Can I call - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event from a UIViewController?
What we did was to subclass UIScrollView and implement logic that passes responsibility down to views under it, if the touch happens inside of the transparent area.
In our case the transparent area is defined by contentOffset of 120 on Y axis, meaning our content starts 120 points below the start of the UIScrollView, and the code looks like this:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
// Report the touch as outside this view when it lands in the transparent
// area above the content, so it falls through to the map behind
if (self.contentOffset.y < 0 && point.y < 0.0) {
return NO;
} else {
return YES;
}
}
Obviously this response is well past its prime but hopefully this is helpful to anyone searching for a solution.
Basically, it's up to you to determine what touch events you care to forward to another responder. If you simply want to forward all touch events, just remove that if statement in the code you posted so the next responder will receive all the touch events.
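With the condition removed, the override collapses to an unconditional forward. This would live in a UIView subclass (not in a UIViewController, since touches arrive at the view first):

```
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Forward every touch to the next responder in the chain,
    // e.g. the superview or the owning view controller
    [self.nextResponder touchesBegan:touches withEvent:event];
}
```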
I'm making a small game for iPhone in OpenGL.
First I removed the status bar by writing
[[UIApplication sharedApplication] setStatusBarHidden:YES];
Which worked, but it only hid the status bar once my app began to run. Then I modified my project's Info.plist:
<key>UIStatusBarHidden</key>
<true/>
And now the status bar is never shown, just how I wanted. The problem is that I can read touches without problems anywhere on the screen, except in the zone where the status bar used to be.
// This method deals with events when one or more fingers touch the screen
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
[myProject newTouch:touches withEvent:event];
[self.nextResponder touchesBegan:touches withEvent:event]; // forward began, not ended
}
// This method deals with events when one or more fingers moves while touching the screen
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
[myProject movingTouch:touches withEvent:event ];
}
// This method deals with events when one or more fingers stops touching the screen
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
[myProject oldTouchEnded:touches withEvent:event ];
}
// This method deals with events when the system is interrupted ( for example an incomming call)
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
//
}
I guess that hiding the bar is not enough and it must be removed, but how can I do that? Or is there another solution?
What's the size of the view you're reading in? Sometimes people hide the status bar but forget to resize their view to cover the appropriate area. The complete screen is 320x480 - make sure your height is the full 480px, not 460 or smaller.
There is a bug in the simulator: it doesn't register touches where the status bar is (or would be). It works properly on the device, though.
Are you testing on the simulator or on the device?
I have the following code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
NSUInteger touchCount = 0;
// Enumerate through all touch objects
for (UITouch *touch in touches) {
touchCount++;
}
// When there are multiple touches, report the number of touches.
if (touchCount > 1) {
lblStatustouch.text = [NSString stringWithFormat:@"Tracking %lu touches", (unsigned long)touchCount];
} else {
lblStatustouch.text = @"Tracking 1 touch";
}
}
When I run it, it never detects more than one touch. Is there some setting that may prevent my app from taking multiple touches? Or am I missing something here?
You need to enable "Multiple Touch" on your View in InterfaceBuilder
(Screenshot: the "Multiple Touch" checkbox in the Interface Builder inspector: http://img.skitch.com/20090227-rpkafsxtg56pujk1h1583if88i.jpg)
or, if you created the view in code, set it with
[theView setMultipleTouchEnabled:YES];
Also know that you will only get multiple touches in touchesMoved if both fingers are moving at the same time. If you have a single finger fixed on the screen, and move a second finger around, the phone will only report the finger that is moving.