In very specific but reproducible cases, I'm getting audioPlayerDecodeErrorDidOccur:error: with the following NSError:
Error Domain=NSOSStatusErrorDomain Code=-50 "The operation couldn’t be completed. (OSStatus error -50.)"
This occurs in a game that also uses OpenAL: we play sounds through OpenAL while attempting to leverage hardware AAC decoding. However, the occurrence of the above does not appear linked to anything we do in OpenAL.
This happens about 2-3 seconds after we perform scene (game mode) switching, but only with certain combinations of from-and-to scenes. It is even stranger because we do nothing audio-related of any importance on these events. I've verified that we do nothing with AVAudioPlayer, and nothing of consequence appears to be done with OpenAL either.
I've tried to resolve this by releasing the AVAudioPlayer and replacing it with a new one that references the same file, uses the same volume, and resumes from the same position in the file. However, after a few seconds this player also throws the aforementioned error. Switching to a new song, on the other hand, creates a fully functional player that does not have any problems.
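The swap looked roughly like this (a sketch only; `player` and `soundURL` are hypothetical retained properties, using manual retain/release as required on iOS 3.x):

```objc
// Hypothetical sketch: replace a failing AVAudioPlayer with a fresh one
// for the same file, preserving volume and playback position.
- (void)replaceFailedPlayer {
    NSTimeInterval position = self.player.currentTime;
    float volume = self.player.volume;

    NSError *error = nil;
    AVAudioPlayer *fresh = [[AVAudioPlayer alloc] initWithContentsOfURL:self.soundURL
                                                                  error:&error];
    if (!fresh) {
        NSLog(@"Could not recreate player: %@", error);
        return;
    }
    fresh.volume = volume;
    fresh.currentTime = position;

    self.player = fresh;   // the retain property releases the old player
    [fresh release];
    [self.player play];
}
```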
My question is: what does error -50 mean in this context, or how would you go about figuring out what it means?
(If it means anything, the game must run at minimum on iOS 3.1.2).
We have tried disabling the OpenAL part of the code; it did not help.
Audio library code is publicly available at http://libxal.svn.sf.net/svnroot/libxal/trunk/
It turned out we had messed up something at the C++ level. This probably caused memory corruption inside AVAudioPlayer without actually crashing the game, behaving identically on the Simulator and on the device. We fixed this and AVAudioPlayer now works.
I made an iPhone app that uses the camera. It works fine on iPhone, with no memory warnings at all. It also runs on iPod touch 4G, but there it gives memory warnings and crashes some time after getting the level-2 warning.
Can someone point me to the possible reason for this? Thanks.
The only way you are going to fix this is by being able to debug it on the device. I wrote this blog post to explain how to debug EXC_BAD_ACCESS, which is what I assume you are getting:
http://loufranco.com/blog/files/Understanding-EXC_BAD_ACCESS.html
The simplest things to do:
Run Build and Analyze and fix every problem it finds (or at least rewrite the code so that B&A doesn't think it's a problem). Keeping a clean B&A is a really good way to make sure you catch these problems early.
Turn on Zombies and run your program. This makes the last release sent to an object turn it into a zombie rather than deallocating it. Your program will leak tons of memory, but if you ever send a message to a zombie, execution will stop right there and you will see a bug that you need to fix, because in your real version that would be a crash (a message sent to a deallocated object).
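The switch itself is the standard NSZombieEnabled environment variable, set on the executable (in the Executable info pane of older Xcode versions, or from a shell when launching in the Simulator):

```shell
# Enable zombie objects: deallocated objects become zombies that trap
# any message sent to them instead of crashing unpredictably later.
export NSZombieEnabled=YES
```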
More techniques described at the link
Does it crash on a specific operation, or randomly?
If randomly, use Instruments to check your memory leaks and memory usage. It's hard to figure out where the problem lies without going through your whole app.
My app's main thread displays a movie at startup, while other threads do background tasks.
Some of them, however, use performSelectorOnMainThread: to do some work.
What happens is that sometimes the movie gets stuck indefinitely, sometimes not, and sometimes it frees up after a couple of seconds.
I'm trying to debug it, but when I pause Xcode while the app is stuck, all I see is assembly code and I can't really understand anything from it. (I guess something like "symbols" on Windows would be useful.)
Is there a way to analyze more thoroughly what is running on the main thread and might be blocking my video while it's playing?
Moreover, why does the video get stuck at all? If I'm playing a video from the main thread while another thread calls performSelectorOnMainThread:, what is really happening? (I assumed it would enqueue the selector as an event but wouldn't disturb the movie playing to the end.)
Thanks for your help!!
Have you tried using Shark (one of Apple's performance tools) to analyze samples?
When things get "stuck", it likely means the CPU is churning. Shark samples the CPU very frequently during a short burst (I would keep it under 5-10 seconds) and tells you what percentage of the time the CPU is spending on which tasks.
It does exactly what you mention: it reverse-engineers the assembly to look more like what you see in the debugger (well, not 100%, but enough).
That would be the first step: identifying the processor-heavy task that your performSelectorOnMainThread: code is calling that causes the video to gum up. Once you know what it is, the answer will either be obvious, or you'll have to change your architecture :)
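If the samples point at work funneled through performSelectorOnMainThread:, a common restructuring (sketched here with hypothetical method names) is to keep the heavy computation on the background thread and marshal only a lightweight update back to the main thread:

```objc
// Background thread: do the expensive work here, not on the main thread.
- (void)backgroundTask {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSData *result = [self computeExpensiveResult];   // hypothetical heavy call

    // Only the cheap UI update runs on the main thread, so the movie's
    // run loop is blocked for as little time as possible.
    [self performSelectorOnMainThread:@selector(updateUIWithResult:)
                           withObject:result
                        waitUntilDone:NO];
    [pool release];
}
```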
I am researching for an application at the moment. One of the interesting ideas that came up was to record from both the front-facing camera and the FaceTime camera at the same time. Do any of you know if this is feasible?
Thanks :)
EDIT:
I meant to say the front and rear cameras. I want to record from both cameras at once into two separate streams. I hope that's a little clearer.
It's something the API appears to allow for, but it doesn't work out in practice. I tried three approaches on an iPhone 4 running the latest iOS, 4.2.1.
First, I tried using a single capture session with both video devices attached as inputs. Attaching the second device produces an exception:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** Multiple audio/video AVCaptureInputs are not currently supported.'
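The first attempt looked roughly like this (a sketch only; error handling and output setup omitted):

```objc
// One session, both cameras: adding the second AVCaptureDeviceInput is
// what raises the NSInvalidArgumentException quoted above.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

for (AVCaptureDevice *device in
         [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:device error:NULL];
    [session addInput:input];   // second call throws on iOS 4.2.1
}
```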
Second, I tried setting up two different sessions, each with only one camera, and starting them at the same time. The first session reported frames for about a second, but as soon as the second started, the first stopped of its own volition. The order in which you send startRunning dictates which of the sessions ultimately manages to force the other out.
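The second attempt, sketched (one input and one output per session, setup elided; session names are hypothetical):

```objc
// Two sessions, one camera each, started back to back. In the test
// described above, whichever session starts later eventually forces
// the other to stop delivering frames.
AVCaptureSession *frontSession = [[AVCaptureSession alloc] init];
AVCaptureSession *rearSession  = [[AVCaptureSession alloc] init];
// ... add one AVCaptureDeviceInput and an AVCaptureVideoDataOutput to each ...
[frontSession startRunning];
[rearSession startRunning];   // frontSession soon stops reporting frames
```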
Finally, I tried a simple ping-pong approach: create two sessions, start the first and, as soon as it reports a frame, stop it and start the second; then stop the second and start the first, ad infinitum. Sadly, the latency between requesting a session start and receiving the first frame left me with about one frame every two seconds.
Of course it's possible I erred in my code, but I'm inclined to say that it's not possible on the current hardware or OS. I shall hook the AVCaptureSession notifications to see if I'm given an explicit reason why one stops, and update this post.
Additions: my program receives only AVCaptureSessionDidStartRunningNotification, one from each capture session. The one that halts doesn't report an error, interruption or other stoppage. I am also unable to find an issue in my code, such as an object or dispatch-queue reuse, that might conceivably cause this problem.
It appears to be possible to record from multiple video inputs using the AVFoundation API. According to the documentation, multiple AVCaptureDevice inputs can be used in an AVCaptureSession. On an iPhone 4 this means a session could have AVCaptureDeviceInputs for both cameras. In practice it might not be feasible; I haven't tried it, so I can't tell with certainty.
I don't think the standard UIImagePickerController can be used to record from both at the same time.
Is it possible to keep playing sound/music even after audio has been interrupted, or, more precisely, even if MyInterruptionListener has been called by the OS with the interruption state kAudioSessionBeginInterruption?
Yes, I know that's not a good idea. But I want to know anyway.
By the time you get an interruption message, the audio resources you've been using have been shut down.
For AVAudioPlayer, playback will be stopped until you start it again.
For OpenAL, your context will be invalid. All OpenAL commands will fail with an error until you clear the current context and set it to current again.
For Audio Units, your graph will be in an invalid state. No sound will be played until you set the graph inactive and then active.
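For the OpenAL case, the recovery described above can be sketched as an Audio Session interruption listener (a sketch, assuming an existing ALCcontext passed as client data; the listener name matches the one mentioned in the question):

```objc
#include <OpenAL/alc.h>
#include <AudioToolbox/AudioToolbox.h>

// Sketch: detach the OpenAL context when an interruption begins and
// reattach it when the interruption ends, so OpenAL calls work again.
static void MyInterruptionListener(void *inClientData, UInt32 inInterruptionState) {
    ALCcontext *context = (ALCcontext *)inClientData;
    if (inInterruptionState == kAudioSessionBeginInterruption) {
        alcMakeContextCurrent(NULL);       // context is invalid for the duration
    } else if (inInterruptionState == kAudioSessionEndInterruption) {
        AudioSessionSetActive(true);       // reclaim the audio session first
        alcMakeContextCurrent(context);    // reattach; OpenAL commands succeed again
    }
}
```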
I don't know what would happen if you tried to restart your audio resources upon getting an interruption-began message, but at the very least you'd be rejected in the App Store if the reviewers ever discovered this behavior.
I've noticed that several iPhone apps, especially some of the more graphically intensive games, take quite a while to exit when the home button is pressed.
My question is whether it's possible to artificially recreate this situation. The reason is that I'm trying to implement a sort of "phone protector" that starts making loud noises when certain accelerometer data is read. The idea would be to have the AVAudioPlayer keep playing the sound for as long as possible (i.e. until iPhone OS decides to kill the process for good).
I tried something like this in my app delegate, just to see how it reacts:
- (void)applicationWillTerminate:(UIApplication *)application {
    NSInteger i = 0;   // initialized; the loop exists only to stall exit
    while (true) {
        i++;
    }
}
What happens, though, is that the home screen comes up immediately and the sound stops playing (the AVAudioPlayer instance is in a view controller), but the application's process is still in memory and in fact stops me from launching a new instance of the app until the old one is killed manually (this is all in the Simulator).
Any ideas?
You really need to test this on a real device.
I have a feeling that you get about 6 seconds to exit, after which you will be killed.
By the way, AVAudioPlayer might be being a good citizen and getting out of the way.
It does some strange and even annoying things under the hood. In this case you'll need something lower level, like a RemoteIO audio unit. I know for a fact that if you don't stop that in applicationWillTerminate:, it will happily go on making sound for a moment on the home screen.
As it turns out, AVAudioPlayer doesn't actually behave like a good citizen here. However, I was using its finished-playing callback to loop the sound, and for some reason that stopped working after applicationWillTerminate: was called. Using numberOfLoops = -1, plus adding a loop to the app delegate (i++; didn't work, NSLog(@"qwerty") did), did the trick.
Funny enough, the sound actually continues to play forever. I'm using the official SDK, but with a jailbroken phone.
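The setup that kept playing, sketched (soundURL is a hypothetical file URL):

```objc
// Loop indefinitely via numberOfLoops rather than relying on the
// audioPlayerDidFinishPlaying:successfully: delegate callback, which
// stopped firing once applicationWillTerminate: had been called.
AVAudioPlayer *alarm = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL
                                                              error:NULL];
alarm.numberOfLoops = -1;   // -1 means repeat until stopped or killed
[alarm play];
```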
Unfortunately I couldn't associate the cookie-based user I have at my work computer with my OpenID, so I couldn't reply properly.
Anyway, I'm using 3.0, and the exact same thing happens on both the Simulator and the device. I can't test on a non-jailbroken phone at the moment, because Apple is taking forever and then some to approve our license, even though we already have several apps coded and ready for launch...
I'll let you know as soon as I find out.