Touch Controls working on Simulator, not on Device - iPhone

See topic. It's only happening with one application: two of them work fine, but the third (and largest, go figure) doesn't respond to touch events. I tried changing a UIImageView's location in touchesBegan, and the change doesn't show up (but it does in the Simulator!):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    gestureStartPoint = [touch locationInView:self.view];
    mothership.center = CGPointMake(80, 80); // this move shows up in the Simulator, but not on the device
    // etc...
}
I've tried both debug and release modes. Any idea what would cause this? The rest of the game runs fine (enemy ships appear and shoot at you, so I know the rest of the code is working). Any advice is much appreciated. Thank you!

Don't respond to touches in the individual views; that can be problematic. Instead, use the backing view to handle all touches. A good sample of this technique is Apple's "Touches" sample code (requires dev login).
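A minimal sketch of that approach, assuming a view controller with two hypothetical UIImageView ivars (mothership and enemy): all touch handling lives in the controller's backing view, which hit-tests the subview frames itself. (Note also that UIImageView has userInteractionEnabled set to NO by default, which is a common reason image views behave differently with touches.)

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];

    // Hit-test the game objects in the backing view's coordinate space,
    // instead of giving each subview its own touch handlers.
    if (CGRectContainsPoint(mothership.frame, point)) {
        // e.g. begin dragging the mothership
    } else if (CGRectContainsPoint(enemy.frame, point)) {
        // e.g. register a tap on an enemy ship
    }
}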

Weirdly enough, updating my iPod to OS 3.0 did the trick!

Related

How to move a view with my finger in iPhone Objective-C

I have a view that is 1280 × 345, and I'm moving it left and right on the screen.
My main question is: how do I make it move with my finger (not with swipeLeft/swipeRight gestures)? I want it to track my finger, like the home screen in iOS.
I'm currently using this code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:infoView];
    //[self.view setTransform:CGAffineTransformTranslate(CGAffineTransformIdentity, (location.x + self.view.frame.origin.x - dx), 0.0)];
    NSLog(@"touch: %f", location.x);
    infoView.frame = CGRectMake(infoView.frame.origin.x + location.x, 0, 1280, 345);
}
That should be roughly the right approach, but I can't figure it out.
I've also tried to find an answer on Google and here but, as you know... I didn't find anything useful.
I've also made a diagram so you can understand it better.
Don't write code to do this yourself. Use UIScrollView. This is what it is designed for.
If you want to have the same effect as the iPhone/iPad home screen you should use a UIScrollView and enable paging.
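A minimal setup sketch, assuming the 1280 × 345 infoView from the question and a 320-point-wide screen (four pages); retain/release housekeeping is omitted:

- (void)viewDidLoad {
    [super viewDidLoad];
    UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, 320, 345)];
    scrollView.contentSize = CGSizeMake(1280, 345); // the full width of infoView
    scrollView.pagingEnabled = YES;                 // snap to 320-point pages, like the home screen
    scrollView.showsHorizontalScrollIndicator = NO;
    [scrollView addSubview:infoView];               // infoView is the 1280 x 345 view from the question
    [self.view addSubview:scrollView];
}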
If you really want to handle it yourself for some reason, avoiding UIScrollView, there are UIGestureRecognizers, specifically UIPanGestureRecognizer, which does most of the work of tracking touches in one place.
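A sketch of the gesture-recognizer route, assuming infoView is an ivar on the view controller; translationInView: accumulates, so the handler applies the delta and then resets it on each callback:

- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
        initWithTarget:self action:@selector(handlePan:)];
    [self.view addGestureRecognizer:pan];
}

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    CGPoint translation = [pan translationInView:self.view];
    // Follow the finger horizontally only; the view is wider than it is tall.
    infoView.center = CGPointMake(infoView.center.x + translation.x, infoView.center.y);
    [pan setTranslation:CGPointZero inView:self.view]; // keep the deltas incremental
}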
I used UIPageControl and UIScrollView to make it. Very easy and smart objects!

How to determine the x & y of the last touch in a multitouch scenario?

I'm new to this site and to iOS programming.
I am working on a percussion app. For this I want to know the x and y location of every finger that touches the screen. I thought this was straightforward, but multitouch is making things confusing for me.
Suppose the user has two fingers pressed on the screen and the user presses a third finger on the screen. How do I determine the location of this third finger?
My feeling is that I need to implement touchesBegan
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
To determine the x and y location I have to look at the touch that triggered this call to touchesBegan. But the touches are presented in an unordered set. If the third finger triggered this touchesBegan, then I have three touches in the NSSet. But since the set is unordered, how do I determine which touch triggered this third call to touchesBegan? If I understand the documentation correctly, it could be any of those three touches.
Many thanks in advance
Maybe you can add a simple counter property, increase its value in touchesBegan, and decrease it in touchesEnded.
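A sketch of that counter idea, with a hypothetical activeTouchCount ivar; note that the system can also cancel touches (an incoming call, for instance), so touchesCancelled should decrement too:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    activeTouchCount += [touches count];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    activeTouchCount -= [touches count];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    activeTouchCount -= [touches count]; // cancelled touches never reach touchesEnded
}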
Okay, it now turns out I had been misinterpreting my test data. If two fingers already touch the device when a third finger touches it, only one UITouch object is in the NSSet passed to touchesBegan, not three as I seemed to be seeing. That one UITouch represents the newest finger touch.
The only time when more than one UITouch object is passed to touchesBegan is when in fact multiple fingers begin to touch the device at the same time.
Since, in my case, I need to handle all new touches based on their location, I need to handle all UITouch objects in the NSSet.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        CGPoint location = [touch locationInView:self.view];
        // Handle finger touch at given location
        // ...
    }
}
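One prerequisite worth spelling out: multitouch delivery is opt-in, so the view must have multipleTouchEnabled set to YES (in code or via the checkbox in Interface Builder), or touchesBegan will only ever report a single touch:

    // e.g. in viewDidLoad; without this the view reports one touch at a time
    self.view.multipleTouchEnabled = YES;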

iPad Simulator not receiving touch events outside of iPhone's 320x480 frame

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    gestureStartPoint = [touch locationInView:self.view];
    NSLog(@"test x: %f", gestureStartPoint.x);
    NSLog(@"test y: %f", gestureStartPoint.y);
    // etc...
}
Strangely, I'm not receiving any log statements if I click outside a 320 × 480 frame (measured from the upper-left corner). Elsewhere in touchesBegan I call other methods, passing in the touch, and those weren't responding either, so I put these NSLogs in.
What do I have to do to receive touch events from the full 1024x768 view?
Is your UIView actually the full size of the window?
I think there's a bug in the "Upgrade Current Target for iPad" task.
I fixed this issue by creating a new "Window XIB" with iPad as the product, then replacing the Window object in MainWindow-iPad.xib with the Window object in the new XIB. (Be sure to update the "window" outlet of your app delegate.)
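If you'd rather check or patch this in code than rebuild the XIB, a quick sketch (assuming the usual window outlet on the app delegate) is to log the window's frame at launch and force it to the full screen bounds:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // On iPad this should print {{0, 0}, {768, 1024}}; 320 x 480 means the XIB window is wrong.
    NSLog(@"window frame: %@", NSStringFromCGRect(self.window.frame));
    self.window.frame = [[UIScreen mainScreen] bounds];
    [self.window makeKeyAndVisible];
    return YES;
}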

Detecting image touch (cocos2d)?

In cocos2d how would you detect a touch on an image? I'm having a lot of trouble with this so thanks in advance!
You implement the ccTouchesBegan/Ended/Moved methods within your Layer class, and then check the touch location against the container of the nodes you wish to detect touches for.
For example:
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:[touch locationInView:[touch view]]];
    CGRect mySurface = CGRectMake(100, 100, 50, 50);
    if (CGRectContainsPoint(mySurface, location)) {
        // do something
        return kEventHandled;
    }
    return kEventIgnored;
}
Now, this all changes in cocos2d 0.8 (which is in active beta now) with "touch delegates"; examples can be seen in the Touches Test (which, from the source I just looked over, appears to be a Pong game).
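For reference, a rough sketch of the 0.8-style targeted delegate, assuming the TouchDispatcher API as it appears in the 0.8 betas (names shifted between betas, so verify against the Touches Test sample); mySurface is a hypothetical CGRect ivar:

// e.g. in the layer's init: receive touches one at a time, with priority 0
[[TouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];

// Called once per touch; return YES to claim (and swallow) it.
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint location = [[Director sharedDirector] convertCoordinate:[touch locationInView:[touch view]]];
    return CGRectContainsPoint(mySurface, location);
}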
I'm not sure why Corey said to use UIKit controls to detect touches, since cocos2d has its own way of handling them.
Only layers can receive touches, and it is not advisable to use a Layer for each touchable "game object" (i.e., players and objects).
You need to overlay invisible touch surfaces on top of the game using standard UIKit classes.
You then detect and interpret touches through those objects and pass the input on to your game.
If you have a more specific problem, you can provide more info or ask another question.
This post will give you the answer:
Problem with cocos2D for iPhone and touch detection

What happens when dragging off screen?

I have an object that can be dragged around. Once the user's finger goes off screen and comes back, I lose the ability to drag the object. If the user then starts another touch and drag, everything is fine.
How can I get notified once the user's finger drags back onto the screen? Since touchesBegan doesn't fire, I don't get any notification.
Here is my touchesMoved, which I call in the touchesBegan:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    // stop object dragging at edge of screen
    if (location.x > 35) {
        myObject.center = location;
    }
}
The described behaviour seems normal to me, and all the built-in Apple apps behave the same way. Since there's no touch screen outside of the touch screen (yep), I believe there's no way the device can distinguish a touch that begins on-screen from one that slides in from outside of it.
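One practical mitigation, sketched with a hypothetical dragging flag: treat touchesEnded and touchesCancelled as the end of the drag, so the object is left in a consistent state whenever the finger slides off the edge, and the next touchesBegan simply starts a fresh drag:

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    dragging = NO; // finger lifted, or the touch ended after sliding off the edge
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    dragging = NO; // system cancelled the touch (e.g. an incoming call)
}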