I am currently using OpenAL in my game and I am getting a memory leak whose responsible frame is `OALSource:AddPlaybackMessage`. What I do is: after my game finishes, I delete all the buffers attached to the source and free the PCM data pointers, etc. Before starting the game again I re-initialize the source and its buffers with the audio data, and at this stage I get the leak. Any idea why it is happening? Some posts say it is a bug in Apple's OpenAL library, but I don't think so; Apple must have done something about it by now.
Thanks
OK, I am no longer getting any leaks from OpenAL. What I have done is this (a minimal sketch of the teardown follows below):
When initializing the sources again, first delete them along with their respective buffers and free any PCM data pointers.
Regenerate the sources and buffers.
Never delete the OpenAL context and device mid-game. Get the context and device only once when your app starts, and delete them in dealloc.
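A minimal sketch of the teardown, assuming a single source and buffer held in the hypothetical variables source, buffer and pcmData:

#include <OpenAL/al.h>

alSourceStop(source);
alSourcei(source, AL_BUFFER, AL_NONE);   // detach the buffer before deleting it
alDeleteSources(1, &source);
alDeleteBuffers(1, &buffer);
free(pcmData);
pcmData = NULL;
// afterwards, alGenSources/alGenBuffers and alBufferData re-create everything;
// the ALCcontext and ALCdevice created at startup stay untouched.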
Hope this solves your OpenAL leak issues.
Cheers
I need to play sounds (~5 seconds each) throughout my iphone application. When they're triggered, they need to play immediately.
For the moment I'm using AudioServices, and (as you probably know) the first time you play a sound it lags, then every time thereafter it's perfect. Is there some code available that's clever enough to preload an AudioServices sound (by playing it silently, maybe)? I've read that adjusting the system volume programmatically will get your app rejected, so that's not an option, and AudioServices doesn't seem to be made for volume adjustment from what I can see.
I've looked into OpenAL, and while it's feasible it seems like overkill. AVAudioPlayer seems like a slightly better option; I'm already using it for background music. Extending my music player to handle a 'sound board' might be my last resort.
On the topic of OpenAL, does anyone know of a place with a decent (app store friendly) OpenAL wrapper for the iPhone?
Thanks in advance
Finch could be perfect for you. It's a tiny wrapper around OpenAL with very low latency and a simple API. See also all SO questions tagged 'Finch'.
If you use an AVAudioPlayer, you can call prepareToPlay when you initialize the object to reduce the delay between calling play and having the audio start.
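For example (a minimal sketch, assuming a bundled file named sound.caf):

#import <AVFoundation/AVFoundation.h>

NSString *path = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"caf"];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
[player prepareToPlay];   // preloads the buffers so a later play call starts with less delay

// later, when the sound is triggered:
[player play];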
Does it load the entire file at once, or does it load lazily, a chunk at a time? I'm primarily interested in knowing how much memory my audio uses.
From what I understand of it, it buffers the file, and you can get even more control over this process by using Audio Queues. Most implementations of OpenAL will load the entire file at once, which can be pretty intensive.
You can pre-buffer using prepareToPlay, but honestly I've never had any noticeable lag using WAVs, CAFs, or MP4s. Calling the play method has to pre-buffer the audio anyway.
In all of my uses of AVAudioPlayer, it normally only causes a temporary jump in memory allocation. As long as you release the player after it is finished, memory isn't a problem. I've played as many as 15 sounds at once and never had an issue.
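One way to handle that release (a sketch, assuming the owning object is the player's delegate and this is pre-ARC code):

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    [player release];   // let the player go as soon as its sound has finished
}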
More info on audio: http://developer.apple.com/iphone/library/documentation/iphone/conceptual/iphoneosprogrammingguide/AudioandVideoTechnologies/AudioandVideoTechnologies.html
I'm using Media Player Framework to access the user's music library on iPhone. I would like to set the playback starting position so that I can start playing a song from 30 second mark, for example.
I have trouble finding out how to do this. MPMusicPlayerController only offers beginSeekingForward, but that's not quite what I'm looking for, as it simply accelerates the playback speed.
There is probably something really simple that I'm missing.
MPMusicPlayerController's currentPlaybackTime is a writable property, so adjusting the playback starting point can be done with player.currentPlaybackTime = 30.0
You can set player.currentPlaybackTime before you start playing, and playback will start at your desired point.
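A minimal example (the queue setup here is only illustrative):

#import <MediaPlayer/MediaPlayer.h>

MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];
[player setQueueWithQuery:[MPMediaQuery songsQuery]];   // or whatever queue you already use
player.currentPlaybackTime = 30.0;   // start from the 30 second mark
[player play];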
UPDATE
2009 me had some real problems. He didn't really understand properties and missed the fact that MPMusicPlayerController.currentPlaybackTime is writable! And he was angry. Angry because iOS 3.0 had promised iPod Library "Access" and instead delivered MPMusicPlayerController. He had been hoping for speedy access to the music packet data, upon which he would have built many fascinating and magical audio applications. Luckily, iOS 4.1's AVAssetReader came along a year later and he was finally able to stop hating.
WRONG 2009 ANSWER
Nope, this API is deliberately crippled, which is why you don't see any functions for
opening, or streaming from, the media file.
Your only hope is lowering the volume and calling beginSeekingForward until currentPlaybackTime returns >= 30s.
Enjoy!
I have a performance-intensive iPhone game I would like to add sounds to. There seem to be about three main choices: (1) AVAudioPlayer, (2) Audio Queues and (3) OpenAL. I'd hate to write pages of low-level code just to play a sample, so I would like to use AVAudioPlayer. The problem is that it seems to kill the performance: I've done a simple measurement using CFAbsoluteTimeGetCurrent, and the play message seems to take somewhere from 9 to 30 ms to finish. That's quite miserable, considering that 25 ms == 40 fps.
Of course there is the prepareToPlay method, which should speed things up. That's why I wrote a simple class that keeps several AVAudioPlayers at its disposal, prepares them beforehand, and then plays the sample using a prepared player. No cigar; it still takes the ~20 ms I mentioned above.
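A simplified sketch of that idea (not my exact class; the names and pool size are made up):

#import <AVFoundation/AVFoundation.h>

// Several players for the same sample, all prepared up front.
NSMutableArray *pool = [[NSMutableArray alloc] init];
for (int i = 0; i < 4; i++) {
    AVAudioPlayer *p = [[AVAudioPlayer alloc] initWithContentsOfURL:sampleURL error:NULL];
    [p prepareToPlay];
    [pool addObject:p];
    [p release];
}

// When the sample is triggered, use the first player that is idle.
for (AVAudioPlayer *p in pool) {
    if (!p.playing) { [p play]; break; }
}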
Such performance is unusable for games, so what do you use to play sounds with decent performance on the iPhone? Am I doing something wrong with AVAudioPlayer? Do you play sounds with Audio Queues? (I wrote something akin to AVAudioPlayer before 2.2 came out, and I would love to be spared that experience.) Do you use OpenAL? If so, is there a simple way to play sounds with OpenAL, or do you have to write pages of code?
Update: Yes, playing sounds with OpenAL is fairly simple.
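For reference, the core of it boils down to roughly this (context and device setup omitted; pcmData, dataSize and sampleRate stand for your already-decoded audio):

#include <OpenAL/al.h>

ALuint buffer, source;
alGenBuffers(1, &buffer);
alBufferData(buffer, AL_FORMAT_MONO16, pcmData, dataSize, sampleRate);
alGenSources(1, &source);
alSourcei(source, AL_BUFFER, buffer);
alSourcePlay(source);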
AVAudioPlayer is very marginal for game audio. Tackling Audio Queues or OpenAL by adapting one of the examples is definitely the way to go; latency is much more controllable that way.
If you're calling play on the main thread, try running it on a separate thread. What I ended up doing is:
#import <AVFoundation/AVFoundation.h>
#include <dispatch/dispatch.h>

// A serial background queue used only for kicking off playback.
dispatch_queue_t playQueue = dispatch_queue_create("com.example.playqueue", NULL);
AVAudioPlayer *player = ...; // an already initialized (and ideally prepared) player

// Dispatch the play call off the main thread so it can't stall the frame loop.
dispatch_async(playQueue, ^{
    [player play];
});
which fixed the worst of the framerate stuttering I was experiencing.
I use OpenAL and the classes that came with the CrashLanding sample code. It's worked fine so far to play samples and play looped music all at the same time. I'm currently learning how to release the memory I've allocated for a sound (.wav file) when, for example, I want to play some intro music just once.
Use CocosDenshion – it’s free, easy, and works. It wraps AVAudioPlayer for background tracks and OpenAL for sounds.
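Its SimpleAudioEngine facade makes this about as short as it gets (a sketch, assuming a bundled effect.wav and music.mp3):

#import "SimpleAudioEngine.h"

[[SimpleAudioEngine sharedEngine] preloadEffect:@"effect.wav"];        // avoid first-play lag
[[SimpleAudioEngine sharedEngine] playBackgroundMusic:@"music.mp3"];   // AVAudioPlayer underneath
[[SimpleAudioEngine sharedEngine] playEffect:@"effect.wav"];           // OpenAL underneath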
Have you checked the buffering with the implementation you're using? It might be related to the ~20 ms delay you're experiencing, i.e., try playing around with the buffer size.
I'm trying to build a video recorder without jailbreaking my iPhone (I have a developer license).
I began by using the private PhotoLibrary framework, but I can only reach about 2 fps (too slow).
The Cycoder app manages 15 fps, so I think it uses a different approach.
I tried to create a bitmap from the previewView of the CameraController, but it always returns a black bitmap.
I wonder if there's a way to directly access the video buffer, maybe with the IOKit framework.
Thanks
Marco
Here is the code:
image = [window _createCGImageRefRepresentationInFrame:rectToCapture];
Marco
That is the big problem. So far I've solved it by using some fixed-size temporary buffers and detaching a thread for each buffer once it is full; the thread then saves the buffer contents to flash memory. Launching several heavy threads (heavy because each one accesses the flash) slows down the device and the refresh of the camera view.
The buffers cannot be big, because you will get memory warnings, and they cannot be small, because too many threads and flash accesses at a time will freeze the device.
The solution lies in balancing the buffer size against the number of threads.
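Roughly what I mean, as a sketch (the names are only illustrative):

// When a temporary buffer fills up, hand it off to a worker thread and keep capturing.
- (void)bufferDidFill:(NSData *)filledBuffer {
    [NSThread detachNewThreadSelector:@selector(saveBuffer:)
                             toTarget:self
                           withObject:filledBuffer];
}

- (void)saveBuffer:(NSData *)buffer {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"frames-%.0f.dat", CFAbsoluteTimeGetCurrent()]];
    [buffer writeToFile:path atomically:NO];   // the slow flash write happens off the main thread
    [pool drain];
}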
I haven't yet tried using an sqlite3 database to store the images' binary data, but I don't know whether it would be a better solution.
PS: to speed up repeated method calls, avoid the common [object method] syntax (because of how dynamic method dispatch works) and instead get and save the method's implementation address, as below.
From Apple's Objective-C documentation:
"The example below shows how the procedure that implements the setFilled: method might be
called:
void (*setter)(id, SEL, BOOL);
int i;
setter = (void (*)(id, SEL, BOOL))[target methodForSelector:@selector(setFilled:)];
for ( i = 0; i < 1000; i++ )
    setter(targetList[i], @selector(setFilled:), YES);"
Marco
If you intend to ever release your app on the App Store, using a private framework guarantees it will be rejected. Video recording through the SDK simply isn't supported.
Capturing the video you see when the camera is active requires fairly sophisticated techniques that aren't exposed by any framework or library out of the box.
I used an undocumented UIWindow method to get the currently displayed frame as a CGImageRef.
Now it works successfully!
If you want, and if I'm allowed, I can post the code that does the trick.
Marco