In my game, if I play a particular game several times, my touches take more and more time to be detected. It seems to store all the touches and then apply them all at the same time. Can anybody tell me what the problem is?
In touchesBegan I wrote:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject]; // any touch from the set
    if (CGRectContainsPoint([tapView frame], [touch locationInView:self])
            && tapView.alpha == 1) {
        [self callTapCode];
    }
}
This is the code of touchesEnded. If I tap and release, it registers a single tap event.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (checkTap == TRUE && tapView.alpha == 1)
        tap_effect_view.alpha = 0;
}
- (void)callTapCode {
    // moves the player by 6 pixels
    // (not possible to include all the code here)
}
In tapView I tap continuously; callTapCode moves the player by six pixels. But after some time my touches are detected very slowly, so the player looks like he's jumping around. This happens once I've played the game continuously 15 or 16 times.
You might work through this tutorial to learn how to use the Leaks Instrument. This is part of the Instruments suite that comes with Xcode, which will, among other things, help you track down memory leaks and general performance issues with your application.
I found the solution to my problem. In my game I had enabled multiple touch on tapView, the view where I was continuously tapping:
tapView.multipleTouchEnabled = TRUE;
When I set it to FALSE, it works:
tapView.multipleTouchEnabled = FALSE;
I don't know exactly why, but it works. Thanks for the replies.
Try looking for memory leaks. Memory pressure might be slowing the app down.
Related
Hello, I've made an iOS game named 'Racing Horses' and published it to the App Store. It worked fine on iOS 8.x.x, but after I installed iOS 9 Beta 3, the same game (same code) no longer recognizes multiple touches: I have to lift my finger before I can make the next touch. It wasn't like this before; I could make a new tap while still holding my previous tap. What is the problem, and what should I do?
I had the same problem on a game launched this summer.
I had to explicitly enable multiple touch in the SKScene:
-(void)didMoveToView:(SKView *)view {
self.view.multipleTouchEnabled = YES;
}
Here's more detail -
The game uses sub-classes of SKSpriteNode.
They test for the number of touches, depending on the sprite.
In the sub-class:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(#"TapCount = %lu", (unsigned long)touches.count);
if (touches.count == 2) {
// do something
}
}
It looks like as of iOS 9, multitouch has to be explicitly enabled. I don't think this used to be the case. I now have this issue in all my SpriteKit apps. Just adding self.view.multipleTouchEnabled = YES; in viewDidLoad fixes it for me, as sketched below.
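A minimal sketch of that fix, in the UIViewController that hosts the SKView (standard UIKit/SpriteKit API):
- (void)viewDidLoad {
    [super viewDidLoad];
    // Appears to default to NO as of iOS 9 (per the observation above),
    // so enable multitouch explicitly.
    self.view.multipleTouchEnabled = YES;
}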
Just a simple mistake: I enabled multitouch in Interface Builder and the problem was solved. But I don't know how it got turned off by itself :)
I'm trying to make an app where a short sound sample plays while the user drags his/her finger(s) across the screen. When the finger(s) are lifted from the screen, the sound stops.
This is the current function that triggers the sound (I've tried various methods):
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    NSLog(@"Yes, it's starting...");
    return YES;
}
-(void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    soundFile = [[CDAudioManager sharedManager] audioSourceForChannel:kASC_Right];
    [soundFile load:@"sound.wav"];
    soundFile.backgroundMusic = NO;
soundSourceForFile:#"sound.wav"]retain];
}
This is the function that stops the sound:
-(void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event{
[soundFile stop];
}
I first started out using ccTouchBegan (just to get some kind of sound working), which looped the sound seamlessly. At that point ccTouchEnded worked together with the "Touch Up Inside" event.
The point, as I said, is that the sound should play while the user drags his/her finger(s) across the screen. But when I tied the playSound function to ccTouchMoved, the sound looped repeatedly over itself instead of one at a time, making it hell to use. The stopSound function stopped working after I changed to ccTouchMoved.
I tried to use NSTimer to create some way to handle the loops, but without any success.
I started this project with the regular iOS SDK, and hit its limits when I found out I couldn't handle pitch & gain manipulation without Cocos2d.
I got everything working in the regular SDK by wrapping it in a if-statement:
if (![mySound isPlaying]) {
    [mySound play];
}
This, as I said, worked perfectly fine in the regular SDK, but not now when I'm using Cocos2d.
ccTouchMoved is called continuously as the finger moves along the screen. The problem you are having is that each time it is called you load a new sound file, and the sounds overlap because each is a newly created, individual object. You only have a reference to the last sound you loaded (which is what soundFile points at), and you aren't freeing the memory either.
Example:
(as you drag your finger)
LoadedSoundA created and starts playing
soundFile points to LoadedSoundA
// finger moves
LoadedSoundB created and starts playing
soundFile points to LoadedSoundB
// finger moves
LoadedSoundC created and starts playing
soundFile points to LoadedSoundC
... etc
The only sound you have a pointer to at the moment is the last created sound, since you reassign soundFile each time, so you can only stop the sound you created last.
You are also leaking a lot of memory, since you retain all of these sounds and never release them.
I would suggest a different tactic:
In ccTouchBegan you should load the sound, have it play on a loop, and record the time of the touch in a class-level ivar.
Now, in ccTouchMoved, get the time of the current touch and see if it is close enough to the time you recorded. If it is within, say, 0.5 seconds, just update the recorded timestamp and continue; however, if it has been too long since the last touch, stop the sound that is playing.
This way you have a seamless sound being played, it is only created once and you maintain your ownership of it.
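In code, that might look roughly like this (a sketch assuming cocos2d's SimpleAudioEngine and CDSoundSource; loopingSound and lastTouchTime are illustrative ivar names):
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    // Create the sound once and loop it, instead of loading a new object on every move.
    if (loopingSound == nil) {
        loopingSound = [[[SimpleAudioEngine sharedEngine] soundSourceForFile:@"sound.wav"] retain];
        loopingSound.looping = YES;
    }
    [loopingSound play];
    lastTouchTime = touch.timestamp; // class-level NSTimeInterval ivar
    return YES;
}

-(void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    // Finger still moving recently: refresh the timestamp and keep the loop going.
    if (touch.timestamp - lastTouchTime < 0.5) {
        lastTouchTime = touch.timestamp;
    } else {
        // Too long since the last event: treat the drag as over.
        [loopingSound stop];
    }
}

-(void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    [loopingSound stop];
    [loopingSound release]; // balance the retain above
    loopingSound = nil;
}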
Hope this helps
I'm currently developing a game on iPhone using the Cocos2D API. It's going well. One issue I'm having though is that I have to recompile each time I want to change my variables. This is very tedious, especially now that I am tweaking gameplay.
Is there any existing implementation of a sort of developer-console screen? What I mean is: I want a game screen that I can load, containing a list of variables that I register with it (with a scroller), and I want to be able to modify these variables on the spot.
I remember a presentation at a WWDC event in which they showed such a screen on an iPad. The developer would just press a button and the game screen would change to a developer-console-like screen. I know this presentation had nothing to do with Cocos2D, but still, if this already exists in some shape or form, I would love to re-use that code instead of writing it on my own.
Though if I had to write it on my own, I wouldn't really know where to start. So any help there would be appreciated as well.
Thx!
It was (I believe) Graeme Devine at Apple's WWDC last year who had some suggestions on how to implement such a developer console (check the video on iTunes University). An example called Game Console is included with the example code of WWDC 2010 (232 MB). I've also added a link (57 kb) to GameConsole.zip from DropBox, for convenience.
This is a seriously belated reply, but we implemented a developer console for Mega Run to test out various stages and modify player properties at run time. The design: tap the top-left corner of the screen at any point in the game to bring up the console; from there you can modify whatever you need. The skeleton of the implementation was to override EAGLView and handle the touchesBegan touch callback yourself. Here is the implementation...
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
#ifdef DEBUG // assumed macro name; the intent is to keep the console out of production builds
const CGFloat DEV_DASHBOARD_ENABLE_TOUCH_AREA = 20.0f;
for (UITouch* t in touches)
{
CGPoint pt = [t locationInView:self];
if (pt.x < DEV_DASHBOARD_ENABLE_TOUCH_AREA && pt.y < DEV_DASHBOARD_ENABLE_TOUCH_AREA)
{
ToolSelectorContainer* editorViewController = [[ToolSelectorContainer alloc] initWithNibName:@"ToolSelectorContainer" bundle:nil];
if (editorViewController != nil)
{
CCScene* g = [CCDirector sharedDirector].runningScene;
// Pause the game if we're in it playing
//
if ([g isKindOfClass:[Game class]])
[((Game *)g) menuPause];
[[GSGCocos2d sharedInstance].navigationController pushViewController:editorViewController animated:YES];
[editorViewController release];
break;
}
}
}
#endif
[super touchesBegan:touches withEvent:event];
}
The #ifdef is used so that this code is not compiled into production builds.
I started with the MoveMe sample to get touch input working.
Basically, I define these two callbacks to receive my touch input:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
for ( UITouch* touch in touches )
{
printf("touch down");
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
for ( UITouch* touch in touches )
{
printf("touch up");
}
}
This works fine until you have more than 5 touches on the screen at once. Then it stops working properly: you won't get the "touch down" message if there are more than 5 touches on the screen. What is even worse, you won't reliably get all of the "touch up" messages until you have removed ALL of your fingers from the screen.
If you touch with 6 fingers, then release 3, then touch again with the other 3 still down, you will get the "touch down"; but when you release, sometimes you get the "touch up" and sometimes you don't.
This pretty much makes it impossible to track touches, and usually results in a touch getting 'stuck' permanently down when passed to my touch manager.
Are there better APIs for getting touch input? Is there at least a function you can call to reliably find out whether the screen is currently being touched? That way I could reset my manager when all fingers are released.
EDIT:
Right, there must be something I'm missing, because the built-in Calculator does something I cannot do with those callbacks.
It only accepts one touch at a time: if there is more than one touch on the screen it "cancels" all touches, but it must keep track of them to know that there is "more than one" touch on the screen.
If I touch the screen, the button goes down; if I add another touch, the button releases. Cool, more than one touch isn't allowed. Now, if I add 4 more fingers to the screen, for a total of 6, touch handling should break, and when I release those 6 fingers the app shouldn't get any of the "up" callbacks. Yet when I release all of them and touch again, the button depresses, so it knows I released all those fingers!! How??
The problem you have is that the iPhone and iPod touch only support up to five simultaneous touches (that is, five fingers touching the screen at once). This is probably a hardware limit.
(As St3fan already told you.)
The system will cancel all touches if there are more than 5 at the same time:
touchesCancelled:withEvent:
(This is probably what causes the odd behavior, with only some touches calling touchesEnded:withEvent:.)
If you want to know whether a touch ended because the finger was lifted, make sure to check the UITouch's phase property.
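For example, a minimal sketch (standard UIKit; handleFinishedTouches: is an illustrative helper) that tells a lifted touch apart from a cancelled one:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleFinishedTouches:touches];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleFinishedTouches:touches];
}

- (void)handleFinishedTouches:(NSSet *)touches {
    for (UITouch *touch in touches) {
        if (touch.phase == UITouchPhaseEnded) {
            printf("touch up (finger lifted)\n");
        } else if (touch.phase == UITouchPhaseCancelled) {
            // The system killed the touch (e.g. the five-touch limit
            // was exceeded); the finger was not necessarily lifted.
            printf("touch cancelled\n");
        }
    }
}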
It stops working because 5 is the maximum number of simultaneous touches that the iPhone and iPod touch currently support. No way around that, I'm afraid.
It's really strange. I have a blank UIImageView subclass that implements the -touchesBegan:, -touchesMoved:, and -touchesEnded: methods. The implementations of these methods are empty; they do nothing at all. However, when I run Instruments with "Leaks" and I touch the UIImageView and move my finger outside of it while still touching the screen, I get a memory leak warning from Instruments.
In my demo app there's no object allocation happening when I do that. The methods are empty. Everything I see in Instruments relates to Foundation and run loop stuff. I've checked my class twice and removed any object allocation. It's just a skeleton that only shows an image, and that image is not changed when touching it or moving the finger on the screen. That makes no sense.
Did anyone else encounter problems like this?
UPDATE: I tested around a little more and figured out that memory leaks happen at any spot on the screen when tapping around fast with 5 fingers. Everything I get from Instruments.app refers to run and event loops. It seems as if the device can't handle the touches fast enough and at some point gets stuck releasing the allocated objects. Please try it out and report here if you can see the same problems.
UPDATE: I've now tested a few Apple example apps as well. When I hack around on the screen with 3-5 fingers, like a normal user does (yes, they will!), Instruments shows memory leaks regarding event and run loops. There's definitely a bug, either in the framework or in Instruments. Tested with iPhone OS 2.2.1.
According to a thread on an Apple forum, it's an unsolved problem in the SDK. It happens when the accelerometer delegate is not nil: touch event objects are allocated but never freed. The faster the accelerometer delegate is called, the faster those allocation failures happen. Many of Apple's code samples show the same problem. I had the accelerometer turned on.
But I also found that this kind of leak happens when a touch is tracked from one view onto another. If I keep touching one and the same view and move my finger on that view without leaving it, I don't get the problem.
Solutions: turn the accelerometer off (set its delegate to nil) and reduce the number of views in your app. I don't know whether they fixed this issue in iPhone OS 3.0.
Unfortunately, this will not help:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
[[UIAccelerometer sharedAccelerometer] setDelegate:nil]; // because of framework bug
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
[[UIAccelerometer sharedAccelerometer] setDelegate:self]; // because of framework bug
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
[[UIAccelerometer sharedAccelerometer] setDelegate:self]; // because of framework bug
}
More info at: http://discussions.apple.com/thread.jspa?messageID=9396584t