CFRelease but still leaking - iPhone

This is doing my head in. I read audio from the iPod library to analyse the audio samples, and while the analysis itself works, the buffer is always leaking: I get a low-memory warning and the app is killed.
I have tried all the suggestions I could find, with no success. The code below is part of a static library, and reading the audio works fine; the buffer just never gets released. I use ARC and also tried NOT calling CFRelease, but the result is the same. Thanks for any suggestion, I am completely stuck!
- (NSInteger)getMP3Samples:(SInt16 *)address {
    AudioBufferList audioBufferList;

    if (_assetReader == nil) {
        return 0;
    }

    _mp3Control.currentSampleBufferCount = 0;
    CMSampleBufferRef nextBuffer = [_assetReaderOutput copyNextSampleBuffer];

    // Has the song ended?
    if (nextBuffer == nil) {
        if ([_assetReader status] == AVAssetReaderStatusCompleted) {
            [_assetReader cancelReading];
        }
        _assetReader = nil;
        _assetReaderOutput = nil;
        return _mp3Control.currentSampleBufferCount;
    }

    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        nextBuffer,
        NULL,
        &audioBufferList,
        sizeof(audioBufferList),
        NULL,
        NULL,
        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        &_mp3Control.blockBuffer);

    if (nextBuffer) {
        CMSampleBufferInvalidate(nextBuffer);
        CFRelease(nextBuffer);
        nextBuffer = NULL;
    }

    for (int b = 0; b < audioBufferList.mNumberBuffers; b++) {
        memcpy((void *)(address + _mp3Control.currentSampleBufferCount),
               (void *)audioBufferList.mBuffers[b].mData,
               audioBufferList.mBuffers[b].mDataByteSize);
        _mp3Control.currentSampleBufferCount += audioBufferList.mBuffers[b].mDataByteSize;
    }

    ///
    /// Return samples and not bytes!!
    ///
    return _mp3Control.currentSampleBufferCount / 2;
}

Are you using and releasing the block buffer returned by CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer in the (not posted) calling code?
If you are not releasing the object stored in &_mp3Control.blockBuffer after calling getMP3Samples:, that could be your memory management problem. (Core Foundation-style objects don't participate in ARC.)
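A minimal sketch of that idea (decoder and sampleBuffer are hypothetical names, and it assumes the caller can reach the stored block buffer; otherwise do the equivalent release inside the library once the copied samples are no longer needed):

SInt16 sampleBuffer[8192];                               // hypothetical destination buffer
NSInteger count = [decoder getMP3Samples:sampleBuffer];  // fills the buffer, retains the block buffer

// ... process the `count` samples copied out by getMP3Samples: ...

if (_mp3Control.blockBuffer) {
    CFRelease(_mp3Control.blockBuffer);   // balances the retain taken by
    _mp3Control.blockBuffer = NULL;       // CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer
}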
You could also run your code through the Allocations and Leaks instruments to see further details (I am just guessing here :) ).

"pointer being freed was not allocated" error with core plot array

I'm drawing a waveform of the incoming microphone audio using Core Plot.
It works great, but I sometimes get this error:
malloc: *** error for object 0x175a1550: pointer being freed was not allocated
*** set a breakpoint in malloc_error_break to debug
It happens occasionally (after 1 minute, 10 minutes, 1 hour... it depends!) when the Core Plot array is being cleared.
-(void)refreshScope:(NSTimer *)theTimer
{
    if ([[audio dataForScope] count] == 500)
    {
        [self performSelectorOnMainThread:@selector(reloadDataOnMainThread) withObject:nil waitUntilDone:YES];
        [[audio dataForScope] removeAllObjects]; // HERE !!!!
    }
}

-(void)reloadDataOnMainThread
{
    [audioScope reloadData];
}
The dataForScope array (mutable) is alloc/init'd in the audio class of my code. It is filled with integers from the audio buffer.
I have tried a lot of different things but nothing seems to work; I always get the same error.
Any ideas?
Thank you.
EDIT:
-(void)processAudio:(AudioBufferList *)bufferList {
    AudioBuffer sourceBuffer = bufferList->mBuffers[0];
    memcpy(tempBuffer.mData, bufferList->mBuffers[0].mData, bufferList->mBuffers[0].mDataByteSize);
    int16_t *samples = (int16_t *)(tempBuffer.mData);
    @autoreleasepool
    {
        for (int i = 0; i < tempBuffer.mDataByteSize / 2; ++i)
        {
            if (i % 5 == 0)
            {
                if ([dataForScope count] == 500)
                {
                    scopeIndex = 0;
                }
                else
                {
                    float scopeTime = scopeIndex * 1000.0 / SampleRate;
                    id xScope = [NSNumber numberWithFloat:scopeTime];
                    id yScope = [NSNumber numberWithInt:samples[i] / 100];
                    [dataForScope addObject:[NSMutableDictionary dictionaryWithObjectsAndKeys:xScope, @"xScope", yScope, @"yScope", nil]];
                }
                scopeIndex = scopeIndex + 1;
            }
        }
    }
}
Make sure [[audio dataForScope] removeAllObjects]; executes before you call -reloadData on the plot. That ensures the dataForScope array is ready whenever the plot requests its new data.
The plot datasource methods are called on the main thread. If they read the dataForScope array directly, you need to make sure all accesses to that array (reads and writes) occur on the main thread so there are no conflicts.
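A minimal sketch of that idea, reusing the names from the question (whether the timer already fires on the main thread is an assumption I can't verify from the post, so the dispatch makes it explicit):

-(void)refreshScope:(NSTimer *)theTimer
{
    dispatch_async(dispatch_get_main_queue(), ^{
        if ([[audio dataForScope] count] == 500) {
            [[audio dataForScope] removeAllObjects];   // clear first, on the main thread
            [audioScope reloadData];                   // then let the plot pull fresh data
        }
    });
}

The addObject: calls in processAudio: would need the same treatment (dispatch onto the main queue, or protect the array with a lock) so writes never run concurrently with the datasource reads.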

How to check if sound is off on iPhone [duplicate]

I am developing an application in which I want to detect, in code, whether the iPhone is in silent mode or not. I am developing it using Cocoa with Objective-C.
If anyone knows, kindly reply.
The reason Pirripli's code does not work is that the simulator does not support the test and the code does not check for errors. Corrected, the code would look like this:
CFStringRef state = nil;
UInt32 propertySize = sizeof(CFStringRef);
AudioSessionInitialize(NULL, NULL, NULL, NULL);
OSStatus status = AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &state);
if (status == kAudioSessionNoError)
{
    return (CFStringGetLength(state) == 0); // YES = silent
}
return NO;
It's possible by testing for a NULL audio route using AudioToolbox:
UInt32 routeSize = sizeof(CFStringRef);
CFStringRef route;
AudioSessionGetProperty(
    kAudioSessionProperty_AudioRoute,
    &routeSize,
    &route
);

if (route == NULL) {
    NSLog(@"Silent switch is on");
}
If route is NULL, there are no available audio outputs. If it is "Headset" or "Headphones", the silent ringer switch could still be on. However, it will never be on when the route is "Speaker".
You're probably best testing for this in your audio route change property listener, which is set below:
AudioSessionAddPropertyListener(
    kAudioSessionProperty_AudioRouteChange,
    audioRouteChangeListenerCallback,
    self
);
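The listener itself uses the standard AudioSessionPropertyListener signature; a hedged sketch of what it might do (audioRouteChangeListenerCallback is the name used above, the body is an assumption):

static void audioRouteChangeListenerCallback(void *inClientData,
                                             AudioSessionPropertyID inID,
                                             UInt32 inDataSize,
                                             const void *inData)
{
    CFStringRef route = NULL;
    UInt32 routeSize = sizeof(route);
    if (AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &routeSize, &route) == kAudioSessionNoError) {
        BOOL silenced = (route == NULL || CFStringGetLength(route) == 0);
        NSLog(@"route changed, silenced: %d", silenced);
        if (route) CFRelease(route);   // avoid leaking the returned string
    }
}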
Note: If you're doing anything funky like overriding audio routes, then this answer may not apply.
Setting up and tearing down an audio session in its entirety is probably beyond the scope of this answer.
For completeness, building off this link from Dan Bon, I implement the following method to solve this problem in my apps. One thing to note is that the code checks for the iPhone simulator first - executing the below code will crash the simulator. Anyone know why?
-(BOOL)silenced {
#if TARGET_IPHONE_SIMULATOR
    // return NO in simulator. Code causes crashes for some reason.
    return NO;
#endif

    CFStringRef state;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &state);
    if (CFStringGetLength(state) > 0)
        return NO;
    else
        return YES;
}
Declaring this right in the view controller, you'd simply check:
if ([self silenced]) {
    NSLog(@"silenced");
} else {
    NSLog(@"not silenced");
}
Or, obviously, you could declare it in some kind of helper class. A more elegant solution might be a category on UIApplication or some other class.
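For example, a hypothetical UIApplication category wrapping the same route check (the category name, the error handling, and the release of the returned string are my assumptions):

#import <UIKit/UIKit.h>
#import <AudioToolbox/AudioToolbox.h>

@interface UIApplication (SilentSwitch)
- (BOOL)isSilenced;
@end

@implementation UIApplication (SilentSwitch)
- (BOOL)isSilenced {
#if TARGET_IPHONE_SIMULATOR
    return NO;                       // same simulator workaround as above
#else
    CFStringRef route = NULL;
    UInt32 routeSize = sizeof(route);
    if (AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &routeSize, &route) != kAudioSessionNoError) {
        return NO;                   // on error, assume not silenced
    }
    BOOL silenced = (route == NULL || CFStringGetLength(route) == 0);
    if (route) CFRelease(route);     // avoid leaking the CFStringRef
    return silenced;
#endif
}
@end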
You can use the Audio Route property as suggested by the previous answers, but keep in mind that:
- It works only if the Audio Category is AmbientSound
- You should not initialize the Audio Session more than once in your app (see the Audio Session Programming Guide)
- You should release those CFStringRefs to avoid memory leaks
If the current audio category is not AmbientSound, though, you can change it temporarily, perform the check on the Audio Route property, and then restore the original audio category.
Note that changing the audio category will restore the default audio route for that category, given the current hardware configuration (i.e. whether headphones are plugged in or not, etc.).
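A rough sketch of that temporary switch, using the same old AudioSession C API as the answers above (treat it as an outline under those assumptions, not a drop-in implementation):

// Remember the current category.
UInt32 originalCategory = 0;
UInt32 categorySize = sizeof(originalCategory);
AudioSessionGetProperty(kAudioSessionProperty_AudioCategory, &categorySize, &originalCategory);

// Temporarily switch to AmbientSound so the route reflects the silent switch.
UInt32 ambient = kAudioSessionCategory_AmbientSound;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(ambient), &ambient);

// Check the route.
CFStringRef route = NULL;
UInt32 routeSize = sizeof(route);
OSStatus status = AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &routeSize, &route);
BOOL silenced = (status == kAudioSessionNoError && (route == NULL || CFStringGetLength(route) == 0));
if (route) CFRelease(route);   // avoid the CFStringRef leak mentioned above

// Restore the original category (this also restores its default route).
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(originalCategory), &originalCategory);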

Getting Potential Leak of an object in CVPixelBuffer

I am creating an app to capture the screen of the iPhone. After writing the code I used profiling and the analyzer to check for memory leaks. I get only one memory leak, in one section of the code. Here is the code that gives me the leak:
-(void)writeSample:(NSTimer *)_timer {
    if (assetWriterInput.readyForMoreMediaData) {
        // CMSampleBufferRef sample = nil;
        CVReturn cvErr = kCVReturnSuccess;

        // get screenshot image!
        CGImageRef image = (CGImageRef)[[self screenshot] CGImage];
        NSLog(@"made screenshot");

        // prepare the pixel buffer
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
        NSLog(@"copied image data");
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32BGRA,
                                             (void *)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             NULL,
                                             NULL,
                                             NULL,
                                             &pixelBuffer);
        NSLog(@"CVPixelBufferCreateWithBytes returned %d", cvErr);

        // calculate the time
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        NSLog(@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime = CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE);

        // write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];

        if (appended) {
            NSLog(@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
        } else {
            NSLog(@"failed to append");
            [self stopRecording];
            self.startStopButton.selected = NO;
        }
    }
}
The analyzer says "Potential leak of an object stored into 'imageData'". Can anyone help me find the error in the above code? The memory management tools also show a leak in this code. If anyone can help, it would be a great help.
Thanks in advance!
From comments -
Do a CFRelease on your imageData when you're done with it.
You can put it right before or right after NSLog(@"CVPixelBufferCreateWithBytes returned %d", cvErr);
CFRelease(imageData);
http://developer.apple.com/library/mac/#documentation/QuartzCore/Reference/CVPixelBufferRef/Reference/reference.html
I am not sure about the rest of your code, but generally when a call has the word "Create" (or "Copy") in it, it has to have a corresponding release. Please check the documentation above.
CVPixelBufferRelease
Releases a pixel buffer.
void CVPixelBufferRelease (
    CVPixelBufferRef texture
);
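Putting the two releases together, a hedged sketch of how the end of writeSample: could look. Note that CVPixelBufferCreateWithBytes wraps the CFData's bytes rather than copying them, so releasing imageData only after the buffer has been appended is my assumption about the safer ordering:

// write the sample, then balance the two Create/Copy calls
BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                                            withPresentationTime:presentationTime];
CVPixelBufferRelease(pixelBuffer);   // balances CVPixelBufferCreateWithBytes
CFRelease(imageData);                // balances CGDataProviderCopyData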

Can I determine if a device has vibration or not, and how?

I have some settings that enable/disable vibration for certain actions, but I find it pointless to display them if the device doesn't have the ability to vibrate. Is there a way to check whether the person is using an iPod touch and whether it supports vibration?
I'm not sure there is a way to do this other than model checks, which is probably not a great approach. I do know that Apple provides:
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
If the device can vibrate, it will. On devices without vibration, it will do nothing. There is another call:
AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
This one will vibrate the device if it has the capability, or beep otherwise.
It might be better to just keep the settings and add some explanation around them, because a user may want the beep even when they don't have a vibrating device. Maybe call the setting something other than "Vibration Alert On/Off".
This code should do it. Be aware it assumes the iPhone is the only device with vibration capability, which it is for the moment...
- (NSString *)machine
{
    static NSString *machine = nil;

    // we keep the name around (it's like 10 bytes...) forever to stop lots of little mallocs
    if (machine == nil)
    {
        char *name = nil;
        size_t size;

        // Set 'oldp' parameter to NULL to get the size of the data
        // returned so we can allocate the appropriate amount of space
        sysctlbyname("hw.machine", NULL, &size, NULL, 0);

        // Allocate the space to store name
        name = malloc(size);

        // Get the platform name
        sysctlbyname("hw.machine", name, &size, NULL, 0);

        // Place name into a string
        machine = [[NSString stringWithUTF8String:name] retain];

        // Done with this
        free(name);
    }
    return machine;
}

-(BOOL)hasVibration
{
    NSString *machine = [self machine];
    if ([[machine uppercaseString] rangeOfString:@"IPHONE"].location != NSNotFound)
    {
        return YES;
    }
    return NO;
}
Just edited to stop the machine call from doing lots of small mallocs each time it's called.
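For the original question, a hypothetical usage would be to hide the vibration row in the settings screen (vibrationCell is an assumed outlet name, not something from the question):

// Hypothetical: only show the vibration setting on hardware that can vibrate.
self.vibrationCell.hidden = ![self hasVibration];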

iOS SDK 4.3 OpenAL alGenSources results in AL_INVALID_OPERATION

I'm trying to get to grips with OpenAL, working through a tutorial here: http://benbritten.com/2008/11/06/openal-sound-on-the-iphone/
My problem is that the sound does not play, although no iOS errors are thrown. There is an OpenAL error though: the code sample below is the body of an IBAction method and results in AL_INVALID_OPERATION at alGenSources(1, &sourceID). sourceID reports as NULL.
I've tried this on the device and the simulator.
This code sample seems to be in pretty wide use, but I can't find anybody complaining of this particular problem. Can anybody throw any light on this? Many thanks for any help.
NSString *audioFileName = [[NSBundle mainBundle] pathForResource:@"1" ofType:@"caf"];
AudioFileID fileID = [self openAudioFile:audioFileName];
UInt32 filesize = [self audioFileSize:fileID];
unsigned char *outData = malloc(filesize);

OSStatus result = noErr;
result = AudioFileReadBytes(fileID, false, 0, &filesize, outData);
AudioFileClose(fileID);
if (result != 0) {
    NSLog(@"Can't load file..");
}

NSUInteger bufferID;
//NSLog(@"bufferID %@", [NSNumber numberWithUnsignedInteger:bufferID]);
alGenBuffers(1, &bufferID);
//NSLog(@"bufferID %@", [NSNumber numberWithUnsignedInteger:bufferID]);
alBufferData(bufferID, AL_FORMAT_STEREO16, outData, filesize, 44100);
[bufferStorageArray addObject:[NSNumber numberWithUnsignedInteger:bufferID]];

alGetError();

ALuint sourceID;
alGenSources(1, &sourceID);
if (alGetError() == AL_INVALID_OPERATION)
{
    printf("\n++++ Error creating buffers INVALID_OPERATION!!\n");
    //exit(1);
}
else
{
    printf("No errors yet.");
}

alSourcei(sourceID, AL_BUFFER, bufferID);
alSourcef(sourceID, AL_PITCH, 1.0f);
alSourcef(sourceID, AL_GAIN, 1.0f);

if (loops) {
    alSourcei(sourceID, AL_LOOPING, AL_TRUE);
}

[soundDictionary setObject:[NSNumber numberWithUnsignedInt:sourceID] forKey:@"sound"];

if (outData) {
    free(outData);
    outData = NULL;
}

[self playSound:@"sound"];
For your pitch problem, make sure the sound file you are loading matches the sample rate you pass to alBufferData. Your caf file is probably saved at 22050 Hz.
The file's AudioStreamBasicDescription (its mSampleRate field) will tell you what the audio file's sample rate really is.
You should also check mChannelsPerFrame to make sure it really is stereo sound.
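A quick sketch of how you might check that, assuming fileID is the AudioFileID already opened in the question's code:

// Read the file's data format to confirm sample rate and channel count
// before handing the bytes to alBufferData.
AudioStreamBasicDescription asbd;
UInt32 asbdSize = sizeof(asbd);
AudioFileGetProperty(fileID, kAudioFilePropertyDataFormat, &asbdSize, &asbd);
NSLog(@"sample rate: %.0f Hz, channels: %u",
      asbd.mSampleRate, (unsigned)asbd.mChannelsPerFrame);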
Also, by default OpenAL on iOS only provides 4 stereo sources. If you try to load more than 4 sources with stereo data, your audio will sound like garbage. You can change that by specifying the ALC_STEREO_SOURCES and ALC_MONO_SOURCES attributes when you create a context. You have a maximum of 32 sources (by default it sets up 28 mono and 4 stereo sources).
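For example, a sketch of requesting a different split when the context is created (the 24/8 split is arbitrary; the zero-terminated attribute list is standard ALC):

ALCdevice *device = alcOpenDevice(NULL);
ALCint attrs[] = { ALC_MONO_SOURCES, 24, ALC_STEREO_SOURCES, 8, 0 };  // 0 terminates the list
ALCcontext *context = alcCreateContext(device, attrs);
alcMakeContextCurrent(context);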
Stupid mistake on my part: I had initialised OpenAL in initWithNibName, which was never being called. Moving the init into viewDidLoad has got everything working, although playback is chipmunk-style high speed.