I'm making a small game for iPhone in OpenGL.
First I removed the status bar by writing
[[UIApplication sharedApplication] setStatusBarHidden:YES];
Which worked, but it only hid the status bar once my app began to run. Then I modified my project's plist:
<key>UIStatusBarHidden</key>
<true/>
And now the status bar is never shown, just how I wanted. The problem is that I can read touches without problem in any portion of the screen, except for the zone where the status bar used to be.
// This method deals with events when one or more fingers touch the screen
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
[myProject newTouch:touches withEvent:event];
[self.nextResponder touchesBegan:touches withEvent:event];
}
// This method deals with events when one or more fingers move while touching the screen
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
[myProject movingTouch:touches withEvent:event ];
}
// This method deals with events when one or more fingers stop touching the screen
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
[myProject oldTouchEnded:touches withEvent:event ];
}
// This method deals with events when the system interrupts the touch sequence (for example, an incoming call)
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
//
}
I guess that hiding the bar is not enough and it must be removed entirely, but how can I do that? Or is there another solution?
What's the size of the view you're reading touches in? Sometimes people hide the status bar but forget to resize their view to cover the newly available area. The complete screen is 320x480 - make sure your view's height is the full 480px, not 460 or smaller.
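As a minimal sketch of that fix (assuming an OpenGL view named `glView`, a name invented here), set the view's frame from the full screen bounds after hiding the status bar:

```objc
// [UIScreen mainScreen].bounds is the full 320x480 area;
// applicationFrame excludes the status bar, so it would give you 320x460.
glView.frame = [UIScreen mainScreen].bounds;  // not applicationFrame
```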
There is a bug in the simulator: it doesn't register touches where the status bar is (or would be). It works properly on the device, though.
Are you testing on the simulator or on the device?
Related
I am working on detecting alphabetic gestures in my app: when the user draws a C on the screen, a special action takes place, and so on. I am using a recognizer class that has predefined data about each alphabet's touch points, and the detection works fine. I want this feature on all my screens, so I add the methods below to the app delegate class and detect touches in the window only. What happens here is that other views inside the screens, like table views and scroll views, block the touch events from being sent to the window - if the events reached the window reliably, my code would work like a charm. Any help is appreciated.
- (void)processGestureData
{
NSString *gestureName = [recognizer findBestMatchCenter:&center angle:&angle score:&score];
NSLog(@"gesture name: %@", gestureName);
if ([gestureName isEqualToString:@"N"] || [gestureName isEqualToString:@"n"])
{//handle N gesture
}
if ([gestureName isEqualToString:@"C"])
{//handle C gesture
}
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
[recognizer resetTouches];
[recognizer addTouches:touches fromView:self.window];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
[recognizer addTouches:touches fromView:self.window];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
[recognizer addTouches:touches fromView:self.window];
[self processGestureData];
}
I think you need a touch-intercepting window that sits above all views and sees every touch.
If your gesture is recognised, handle it; otherwise pass the touch on to your view controller. See this link for details.
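One way to build such an intercepting window (a sketch, untested against the question's recognizer class; `recognizer` and `processGestureData` stand in for the question's objects) is to subclass UIWindow and override sendEvent:, which sees every touch before the views do, so scroll views and table views cannot swallow the events first:

```objc
@interface GestureWindow : UIWindow
@end

@implementation GestureWindow
- (void)sendEvent:(UIEvent *)event {
    NSSet *touches = [event allTouches];
    UITouch *touch = [touches anyObject];
    if (touch.phase == UITouchPhaseBegan) {
        [recognizer resetTouches];
    }
    [recognizer addTouches:touches fromView:self];
    if (touch.phase == UITouchPhaseEnded) {
        [self processGestureData];  // or notify the app delegate
    }
    [super sendEvent:event];  // still deliver the event to the views as usual
}
@end
```

The call to [super sendEvent:] keeps normal hit-testing intact, so table views and scroll views behave exactly as before while the recognizer sees every touch.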
This seems quite complicated to me, and I cannot get it to work.
Basically what I want to do, is I have created 9 objects on the screen, and I want to be able to drag all of them, but I only want to be able to drag one at a time.
Eg, drag item 1, stop dragging item 1. Drag item 3, stop dragging item 3. Drag item 2, stop dragging item 2.
I will put the 'VERY' simplified code below, but it will probably be pretty much useless; I just don't want to post all my code, as it is very badly written at the moment and doesn't make much sense.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
// pos and difference are instance variables set up in the real (omitted) code
self.center = CGPointMake(pos.x+difference.x, pos.y+difference.y);
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}
Set exclusiveTouch to YES on all 9 view objects. Then only one of the views can receive touches at a time.
See the reference
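For instance, assuming the nine draggable views are collected in an array (`draggableViews` is a name invented here):

```objc
for (UIView *v in draggableViews) {
    // While v is tracking a touch, no other view receives touch events
    v.exclusiveTouch = YES;
}
```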
I have a UIScrollView that is set to have a clear background. Part of the scrollview does have content, but part does not (so it shows other views behind it). I would like to be able to click through the UIScrollView and to the MKMapView behind, but only for the transparent portion of the UIScrollView.
I have found some code which I am having a real hard time understanding how to get working:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
if (![self yourMethodThatDeterminesInterestingTouches:touches withEvent:event])
[self.nextResponder touchesBegan:touches withEvent:event];
}
Could someone help me wrap my mind around how to forward a touch event to a view that is behind another view? Can I call - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event from a UIViewController?
What we did was to subclass UIScrollView and implement logic that passes responsibility down to views under it, if the touch happens inside of the transparent area.
In our case the transparent area is defined by contentOffset of 120 on Y axis, meaning our content starts 120 points below the start of the UIScrollView, and the code looks like this:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
if (self.contentOffset.y < 0 && point.y < 0.0) {
return NO;
} else {
return YES;
}
}
Obviously this response is well past its prime but hopefully this is helpful to anyone searching for a solution.
Basically, it's up to you to determine which touch events you care to forward to another responder. If you simply want to forward all touch events, just remove the if statement in the code you posted so the next responder receives every touch event.
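A minimal sketch of that forward-everything variant (these are the standard UIResponder callbacks, forwarded unconditionally):

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesBegan:touches withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesMoved:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesEnded:touches withEvent:event];
}
```

Note that the next responder is the view's superview or view controller, not an arbitrary view behind it, so the pointInside: override above is usually the cleaner route for click-through behaviour.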
I am writing an application where the user has to move some stuff on the screen using his fingers and drop them. To do this, I am using the touchesBegan,touchesEnded... function of each view that has to be moved.
The problem is that sometimes the views are covered by a view displayed using the [UIViewController presentModalViewController] function. As soon as that happens, the UIView that I was moving stops receiving touch events, since it was covered up. But there is no event telling me that it has stopped receiving them, so I cannot reset the state of the moved view.
The following is an example that demonstrates this. The functions are part of a UIView that is being shown in the main window. It listens to touch events and when I drag the finger for some distance, it presents a modal view that covers everything. In the Run Log, it prints what touch events are received.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
NSLog(@"touchesBegan");
touchStart=[[touches anyObject] locationInView:self];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
CGPoint touchAt=[[touches anyObject] locationInView:self];
float xx=(touchAt.x-touchStart.x)*(touchAt.x-touchStart.x);
float yy=(touchAt.y-touchStart.y)*(touchAt.y-touchStart.y);
float rr=xx+yy;
NSLog(@"touchesMoved %f", rr);
if (rr > 100) {
NSLog(@"Show modal");
[viewController presentModalViewController:[UIViewController new] animated:NO];
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
NSLog(@"touchesEnded");
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
NSLog(@"touchesCancelled");
}
But when I test the application and trigger the modal dialog to be displayed, the following is the output in the Run Log.
[Session started at 2010-03-27 16:17:14 -0700.]
2010-03-27 16:17:18.831 modelTouchCancel[2594:207] touchesBegan
2010-03-27 16:17:19.485 modelTouchCancel[2594:207] touchesMoved 2.000000
2010-03-27 16:17:19.504 modelTouchCancel[2594:207] touchesMoved 4.000000
2010-03-27 16:17:19.523 modelTouchCancel[2594:207] touchesMoved 16.000000
2010-03-27 16:17:19.538 modelTouchCancel[2594:207] touchesMoved 26.000000
2010-03-27 16:17:19.596 modelTouchCancel[2594:207] touchesMoved 68.000000
2010-03-27 16:17:19.624 modelTouchCancel[2594:207] touchesMoved 85.000000
2010-03-27 16:17:19.640 modelTouchCancel[2594:207] touchesMoved 125.000000
2010-03-27 16:17:19.641 modelTouchCancel[2594:207] Show modal
Any suggestions on how to reset the state of a UIView when its touch events are interrupted by a modal view?
If you control when the modal view is displayed, can you also post a notification at the same time to tell the rest of your app to reset the moved view?
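A sketch of that idea using NSNotificationCenter; the notification name and the `resetDragState` selector are invented for illustration:

```objc
// Post just before presenting the modal view controller:
[[NSNotificationCenter defaultCenter] postNotificationName:@"ModalWillCoverViews"
                                                    object:nil];
[viewController presentModalViewController:[UIViewController new] animated:NO];

// Each draggable view observes the notification and resets its drag state:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(resetDragState)
                                             name:@"ModalWillCoverViews"
                                           object:nil];
```

This also sidesteps the missing touchesCancelled call: the view resets itself when told the modal is coming, rather than waiting for a touch event that never arrives.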
I have the following code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
NSUInteger touchCount = 0;
// Enumerate through all touch objects
for (UITouch *touch in touches) {
touchCount++;
}
// When there are multiple touches, report the number of touches
if (touchCount > 1) {
lblStatustouch.text = [NSString stringWithFormat:@"Tracking %d touches", touchCount];
} else {
lblStatustouch.text = @"Tracking 1 touch";
}
}
When I run it, it never detects more than one touch. Is there some setting that may prevent my app from taking multiple touches? Or am I missing something here?
You need to enable "Multiple Touch" on your view in Interface Builder,
or, if you created the view in code, set it with
[theView setMultipleTouchEnabled:YES];
Also note that you will only get multiple touches in touchesMoved if both fingers are moving at the same time. If you hold one finger fixed on the screen and move a second finger around, the phone will only report the finger that is moving.