ccTouchMoved triggers sound repeatedly & ccTouchEnded doesn't work - iphone

I'm trying to do an app where a short sound sample is supposed to be played while the person using the app is dragging his/her finger(s) across the screen. When the finger(s) are lifted away from the screen - the sound will stop.
This is the current function that triggers the sound (I've tried various methods):
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    NSLog(@"Ja, den börjar..."); // "Yes, it's starting..."
    return YES;
}

-(void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    soundFile = [[CDAudioManager sharedManager] audioSourceForChannel:kASC_Right];
    [soundFile load:@"sound.wav"];
    soundFile.backgroundMusic = NO;
    soundSourceForFile:@"sound.wav"] retain]; // (truncated in the original)
}
This is the function that stops the sound:
-(void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    [soundFile stop];
}
I first started out using ccTouchBegan (just to get some kind of sound working), which looped the sound seamlessly. At that point ccTouchEnded worked together with the "Touch Up Inside" event.
The point, as I said, is that the sound is supposed to play while the user drags his/her finger(s) across the screen. But when I tied the playSound function to ccTouchMoved, the sound loops repeatedly over itself instead of one at a time, making it hell to use. And the stopSound function doesn't work since I changed to ccTouchMoved.
I tried to use NSTimer to create some kind of way to handle the loops, but without any success.
I started this project with the regular iOS SDK, and found my limitations when I discovered I wasn't able to handle pitch & gain manipulation without Cocos2d.
I got everything working in the regular SDK by wrapping it in an if-statement:
if (![mySound isPlaying]) {
    [mySound play];
}
This, as I said, worked perfectly fine in the regular SDK, but not now when I'm using Cocos2d.

ccTouchMoved will be called continuously as the finger moves along the screen. The problem you are having is that each time this is called, you are loading a new sound file, and the sounds overlap because they are newly created individual objects. You only have a reference to the final sound you load (which is what soundFile is pointing at), and you aren't freeing the memory either.
Example:
(as you drag your finger)
LoadedSoundA created and starts playing; soundFile points to LoadedSoundA
// finger moves
LoadedSoundB created and starts playing; soundFile points to LoadedSoundB
// finger moves
LoadedSoundC created and starts playing; soundFile points to LoadedSoundC
... etc
The only sound you have a pointer to at the moment is the last created sound, since you reassign soundFile each time. So you can only 'stop' the sound you created last.
You are also leaking a lot of memory since you are retaining all of these sounds and never releasing them.
I would suggest a different tactic:
In touchesBegan, load the sound, have it play on a loop, and record the time of the touch in a class-level ivar.
Then, in touchesMoved, get the time of the current touch and see if it is close enough to the time you recorded. If it is within, say, 0.5 seconds, just update the recorded timestamp and continue. However, if it has been too long since the last touch, stop the sound that is playing.
This way you get a seamless sound: it is only created once, and you retain ownership of it.
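A minimal sketch of that tactic in cocos2d/CocosDenshion terms. This assumes SimpleAudioEngine's soundSourceForFile: (which returns a reusable CDSoundSource); the ivar names dragSound and lastMoveTime are illustrative, not from the original code:

```objc
// In the layer's @interface (illustrative names):
//   CDSoundSource *dragSound;
//   NSTimeInterval lastMoveTime;

- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    if (!dragSound) {
        // Create the sound source once and keep it; never once per move.
        dragSound = [[[SimpleAudioEngine sharedEngine]
                        soundSourceForFile:@"sound.wav"] retain];
        dragSound.looping = YES;
    }
    lastMoveTime = [NSDate timeIntervalSinceReferenceDate];
    [dragSound play];
    return YES;
}

- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    // Only refresh the timestamp; the existing source keeps looping.
    lastMoveTime = [NSDate timeIntervalSinceReferenceDate];
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    [dragSound stop];
}
```

A scheduled update method could additionally compare the current time against lastMoveTime and stop the sound once the finger has been idle for, say, 0.5 seconds, as described above.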
Hope this helps

Related

SpriteKit + AVAudioPlayer in touchesMoved: kills fps

In a SpriteKit app, I am playing a click sound when the user moves a block on the screen, but this causes terrible lagging (fps dropping to near zero).
Code:
• self.audioPlayers is a strong NSMutableArray which holds currently playing AVAudioPlayers and removes them when the audioPlayerDidFinishPlaying: delegate method is called.
• The blockShouldMove method compares the touch location to its previous location and only returns YES if the user has moved the cursor enough distance so we only play a maximum of around 8 sounds simultaneously.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *oneTouch = [touches anyObject];
    CGPoint location = [oneTouch locationInNode:self];
    if (![self blockShouldMove:location]) {
        return;
    }
    NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"click_01.mp3" ofType:nil];
    AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:soundFilePath] error:nil];
    [audioPlayer prepareToPlay];
    audioPlayer.volume = 1.0;
    audioPlayer.numberOfLoops = 0;
    audioPlayer.delegate = self;
    [audioPlayer play];
    [self.audioPlayers addObject:audioPlayer];
}
On both simulators and real devices (iOS and tvOS), if I make circles with my finger, the fps drops to almost nothing until well after I have even released my finger.
If I remove the whole AVAudioPlayer and use [SKAction playSoundFileNamed:@"click_01.mp3" waitForCompletion:NO], everything works fine. But unfortunately, SKAction sound handling is terrible for my purpose because the volume cannot be set.
Is there anything that can be done to make this better?
This is an indirect sort of "answer", really more of a comment, but it's too long to post in that form.
I ran into something like this issue with a SpriteKit game. I solved it by dropping the AVAudioPlayers and instead using the scene's audio engine. I connected a bunch of AVAudioPlayerNodes to the main mixer node and started them playing (nothing). Whenever I'd want a sound effect of some sort, I'd grab the next audio player node (round robin fashion, and there were enough of them so that I was sure it would be idle) and schedule a preloaded sound buffer for it.
The point is that for more complicated audio in SpriteKit, you may need to go through the audio engine and build a sound graph that's appropriate for what you want to do. Multiple AVAudioPlayers existing at once seem to be stepping on each other somehow and causing lag.
However, if you haven't tried the SKAudioNode functionality yet, I'd say do that first. You should be able to run an action on the audio node to adjust the volume. I didn't go that route since I couldn't get the stereo panning to work the way I wanted with that setup, but if it works for what you want to do, it's probably simpler than setting up your own sound graph.
If you want to see the solution I eventually used, you can look at https://github.com/bg2b/RockRats/blob/master/Asteroids/Sounds.swift
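For reference, the node-pool idea sketched above looks roughly like this in AVAudioEngine terms (SpriteKit scenes expose an audioEngine property on iOS 9+). The pool size, the preloaded clickBuffer (an AVAudioPCMBuffer), and the nextIndex counter are assumptions for illustration:

```objc
// One-time setup: attach a small pool of player nodes to the mixer.
AVAudioEngine *engine = scene.audioEngine;        // or your own AVAudioEngine
NSMutableArray<AVAudioPlayerNode *> *pool = [NSMutableArray array];
for (int i = 0; i < 8; i++) {
    AVAudioPlayerNode *node = [[AVAudioPlayerNode alloc] init];
    [engine attachNode:node];
    [engine connect:node to:engine.mainMixerNode format:clickBuffer.format];
    [node play];   // node runs but is silent until a buffer is scheduled
    [pool addObject:node];
}

// Per click: pick the next node round-robin and schedule the preloaded buffer.
AVAudioPlayerNode *node = pool[nextIndex++ % pool.count];
node.volume = 1.0;   // per-node volume control, unlike SKAction
[node scheduleBuffer:clickBuffer atTime:nil options:0 completionHandler:nil];
```

Because the buffer is decoded once up front, there is no per-touch file I/O or decoder setup, which is what kills the frame rate when a fresh AVAudioPlayer is created per click.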

Is CCScene replace during CCTransition possible?

Calling replaceScene: on CCDirector with a CCScene while a CCTransition is in progress results in dealloc not being called on any of the scenes involved; moreover, after this operation no scene is displayed at all.
Here is a link to the sample project
The shortest way to reproduce the problem is a method like this:
SceneOne *destScene = [SceneOne node];
CCTransitionFade *transition = [[CCTransitionFade alloc] initWithDuration:2 scene:destScene];
[[CCDirector sharedDirector] replaceScene:transition];

double delayInSeconds = 1.0;
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInSeconds * NSEC_PER_SEC));
dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
    [[CCDirector sharedDirector] replaceScene:[SceneTwo node]];
});
My question is: is there any feasible way to replace scenes when the CCTransition is being performed by the CCDirector?
I have implemented a delegate callback from CCDirector informing me about ending scene replacement, but this is never called if I push the iPhone home button during CCTransition.
If the answer is no, is there a cocos2d-iphone way to achieve goal described below?
Originally, this problem arose when I wanted to add a "loading scene" in applicationDidEnterBackground (as a background task) or applicationWillEnterForeground, but I realized it has nothing to do with background execution. My ultimate goal is to provide a seamless user experience while the game waits for the Game Center authentication handler to be called; a "loading scene" displayed from the moment the app wakes from the background would be sufficient (not only preventing user interaction, which can be done many ways, but also not showing the previous game UI to the user). However, this solution is susceptible to the problem described above: if the user taps the home button during a scene transition, he is going to get a very strange screen after bringing the game back from the background.
EDIT: After some more research I didn't find any satisfactory solution for replacing the scene during a CCTransition. However, the problem described above was solved not by calling replaceScene but by adding a "loading" CCLayer to the visible CCScene, as @LearnCocos2D (thanks Steffen) suggested. This is not perfect, since adding a child during a scene transition still has a narrow window (in running time) where strange results occur, but it is much better than replacing the scene. I should mention that this only concerns me when testing the game on a 3GS; newer devices are significantly faster, and it is very difficult to reproduce the home-button-during-CCTransition problem on anything faster than a 3GS.

reliable way to get iPhone touch input?

I started using the MoveMe sample to get touch input working.
basically, I define these two callback functions to get my touch input:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        printf("touch down");
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        printf("touch up");
    }
}
This works fine until you have more than 5 touches on the screen at once. Then it stops working properly: you won't get the "touch down" message if there are more than 5 touches on the screen. What is even worse, you won't reliably get all of the "touch up" messages until you have removed ALL your fingers from the screen.
If you touch with 6 fingers, then release 3, then touch again with the other 3 still down, you will get the "touch down"; but when you release, sometimes you get the "touch up" and sometimes you don't.
This pretty much makes it impossible to track touches, and usually results in a touch getting 'stuck' permanently down, when passed to my Touch Manager.
Are there better APIs to use to get touch input? Is there, at the very least, a function you can call to reliably find out whether the screen is currently being touched? That way I could reset my manager when all fingers are released.
EDIT:
Right, there must be something I'm missing, because the Calculator app does something I cannot do with those callbacks.
It only accepts one touch at a time; if there is more than one touch on the screen it "cancels" all touches, but it must keep track of them to know that there is "more than one" touch on the screen.
If I touch the screen, the button goes down. If I add another touch, the button releases; cool, no more than one touch allowed. Now, if I add 4 more fingers to the screen, for a total of 6, the screen should break, and when I release those 6 fingers the app shouldn't get any of the "up" callbacks. Yet when I release all of them and touch again, the button depresses, so it knows I released all those fingers!! How??
The problem you have is that the iPhone and iPod touch only support up to five simultaneous touches (that is, five fingers still touching the screen). This is probably a hardware limit.
(As St3fan told you already.)
The system will cancel all touches if there are more than 5 at the same time:
touchesCancelled:withEvent:
(This is probably what causes the odd behavior with only some touches calling touchesEnded:withEvent:)
If you want to know whether a touch ended because it was actually lifted, make sure to check the UITouch's phase property.
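To keep a touch manager from getting stuck, handle the cancellation case explicitly. A sketch, where forgetTouch: is a hypothetical method on your own manager class:

```objc
// Called when the system cancels touches (e.g. more than 5 fingers,
// an incoming call, or a gesture recognizer taking over).
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [self.touchManager forgetTouch:touch];  // hypothetical cleanup call
    }
}
```

Treating cancelled touches exactly like ended ones is usually enough to explain the Calculator-style behavior described in the question.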
It stops working because 5 is the maximum number of touches that the iPhone and iPod touch currently support. No way around that, I'm afraid.

AVAudioPlayer to play short sounds on my iPhone?

I want to use AVAudioPlayer to play short sounds... because I have more control than System Sound.
I have several buttons which are hooked up to play several different short sounds.
Each sound file is about 1.5 - 2.0 seconds.
Everything works fine... except that I have to wait for the sound to stop before pressing the button will play it again.
Currently the individual AVAudioPlayers are created in viewDidLoad, and each sound is played when its button is pressed.
I tried creating the players when the button is pressed; this solves the above problem, but after a while the app crashes and all sound stops working.
Any ideas?
Thanks
Jonathan
If AVAudioPlayer stops playing sounds, it is usually because too many player instances have been created, and usually leaked. Make sure you release them once you are done.
You need to set the currentTime of the player to 0 before playing. Calling [player play] when it is already playing has no effect.
player.currentTime = 0;
[player play];
Creating new ones is fine, but make sure your header declares the delegate method and your implementation does this:
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)completed {
    [player release];
}
If your app is simple, that will release the memory of anything that's finished and no longer necessary, as per mahboudz's suggestion.
This might not totally prevent crashing, depending on your sound file sizes, and if you dealloc the scene. You might want to add
if (self.view.superview) { //release }
or something.
This is all memory handling. You may want to brush up on pointers so you know how to handle your player objects within any function, feel comfortable placing the release in dealloc OR in the didFinish: delegate method, and know why it doesn't belong in the other one.
The iPhone can be interrupted, stopped, and messed with at any time, so you have to know how to handle the memory and deal with it when that happens. I just had a crash of a similar nature, and everything above applied to it. NSXML delegates would be a nightmare without knowing pointers; same for AVAudioPlayer.
If you plan to use a lot of short sounds, you might want to think about switching to OpenAL.
It's not much more complicated, and you will save yourself some trouble in more complicated audio setups later on.
There's a tutorial and some useful code available here (archived here).
This is maybe because you have created the AVAudioPlayers inside viewDidLoad. I remember I had the same problem. Just initialize them elsewhere in the class, not there. Hope it works!
EDIT:
OK, I saw the date... 2009...

Why is detecting touches getting slower and slower?

In my game, if I play a particular game several times, my touches need more and more time to be detected.
It seems the app stores all the touches and then applies them all at the same time.
Can anybody tell me what's the problem?
In touchesBegan I wrote:
if (CGRectContainsPoint([tapView frame], [touch locationInView:self])
        && tapView.alpha == 1) {
    [self callTapCode];
}
This is the code of touchesEnded. If I tap and release, it registers one tap event.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (checkTap == TRUE && tapView.alpha == 1)
        tap_effect_view.alpha = 0;
}

- (void)callTapCode {
    // Moves the player by 6 pixels
    // (not possible to post all the code)
}
I tap continuously in tapView, and callTapCode moves the player by six pixels. But after some time my touches are detected very slowly, so the player looks like he's jumping around. I had played the game continuously 15 to 16 times.
You might work through this tutorial to learn how to use the Leaks Instrument. This is part of the Instruments suite that comes with Xcode, which will, among other things, help you track down memory leaks and general performance issues with your application.
I found the solution to my problem. In my game I had enabled tapView.multipleTouchEnabled = TRUE;
(tapView is the view where I was continuously tapping.)
When I set it to FALSE, it works, i.e.:
tapView.multipleTouchEnabled = FALSE;
I don't know exactly why, but it works.
Thanks for the replies.
Try to look for any memory leaks. Maybe the iPhone has to use virtual memory a lot.