I am creating an app to capture the screen on the iPhone. After writing the code, I used profiling and analysis to check for memory leaks. I am getting only one memory leak, in one section of the code. Here is the code that gives me the leak:
-(void) writeSample:(NSTimer *)_timer {
    if (assetWriterInput.readyForMoreMediaData) {
        // CMSampleBufferRef sample = nil;
        CVReturn cvErr = kCVReturnSuccess;

        // get screenshot image!
        CGImageRef image = (CGImageRef)[[self screenshot] CGImage];
        NSLog(@"made screenshot");

        // prepare the pixel buffer
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
        NSLog(@"copied image data");
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32BGRA,
                                             (void *)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             NULL,
                                             NULL,
                                             NULL,
                                             &pixelBuffer);
        NSLog(@"CVPixelBufferCreateWithBytes returned %d", cvErr);

        // calculate the time
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        NSLog(@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime = CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE);

        // write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
        if (appended) {
            NSLog(@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
        } else {
            NSLog(@"failed to append");
            [self stopRecording];
            self.startStopButton.selected = NO;
        }
    }
}
The analyzer says "Potential leak of an object stored into 'imageData'". Can anyone help me find the error in the above code? The memory management tools also report a leak here. Any help would be greatly appreciated.
Thanks in advance!
From the comments:
Do a CFRelease on your imageData when you're done with it. You can put it right before or right after the NSLog(@"CVPixelBufferCreateWithBytes returned %d", cvErr); line:
CFRelease(imageData);
http://developer.apple.com/library/mac/#documentation/QuartzCore/Reference/CVPixelBufferRef/Reference/reference.html
I am not sure about the rest of your code, but generally, when a call has Create (or Copy) as a word in its name, it has to have a corresponding release. Please check the documentation above.
CVPixelBufferRelease
Releases a pixel buffer.
void CVPixelBufferRelease ( CVPixelBufferRef texture );
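Since CVPixelBufferCreateWithBytes wraps the existing bytes rather than copying them, imageData has to outlive every use of the pixel buffer, so releasing it after the append is the safer placement. A minimal sketch of how the end of writeSample: could look with the Copy/Create calls balanced (the CVPixelBufferRelease line is an addition; the posted code never releases the buffer either):

// write the sample as before
BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                                            withPresentationTime:presentationTime];
// ... existing appended/failure handling ...

// Balance the ownership-transferring calls made earlier.
CFRelease(imageData);               // balances CGDataProviderCopyData
CVPixelBufferRelease(pixelBuffer);  // balances CVPixelBufferCreateWithBytes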
Related
This one is doing my head in. I read audio from the iPod library for analysis of the audio samples, and I can do what I want, but the buffer always leaks: I get a low-memory warning and the app is killed.
I have tried all suggestions, but with no success. The code below is incorporated in a static library, and reading the audio works fine; the buffer just never gets released. I use ARC and have also tried NOT calling CFRelease, but it's the same thing... Thanks for any suggestion, I am completely stuck!
- (NSInteger)getMP3Samples:(SInt16 *)address {
    AudioBufferList audioBufferList;
    if (_assetReader == nil) {
        return 0;
    }
    _mp3Control.currentSampleBufferCount = 0;
    CMSampleBufferRef nextBuffer = [_assetReaderOutput copyNextSampleBuffer];

    // Has the song ended?
    if (nextBuffer == nil) {
        if ([_assetReader status] == AVAssetReaderStatusCompleted) {
            [_assetReader cancelReading];
        }
        _assetReader = nil;
        _assetReaderOutput = nil;
        return _mp3Control.currentSampleBufferCount;
    }

    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        nextBuffer,
        NULL,
        &audioBufferList,
        sizeof(audioBufferList),
        NULL,
        NULL,
        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        &_mp3Control.blockBuffer);

    if (nextBuffer) {
        CMSampleBufferInvalidate(nextBuffer);
        CFRelease(nextBuffer);
        nextBuffer = NULL;
    }

    for (int b = 0; b < audioBufferList.mNumberBuffers; b++) {
        memcpy((void *)(address + _mp3Control.currentSampleBufferCount),
               (void *)audioBufferList.mBuffers[b].mData,
               audioBufferList.mBuffers[b].mDataByteSize);
        _mp3Control.currentSampleBufferCount += audioBufferList.mBuffers[b].mDataByteSize;
    }

    ///
    /// Return samples, not bytes!
    ///
    return _mp3Control.currentSampleBufferCount / 2;
}
Are you using and releasing the block buffer returned by CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer in the (not posted) calling code?
If you are not releasing the object stored in &_mp3Control.blockBuffer after calling getMP3Samples:, this could be your memory management problem. (Core Foundation-style objects don't participate in ARC.)
You could also run your code through the Allocations and Leaks instruments to see further details (I am just guessing here :) ).
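A minimal sketch of what that caller-side release could look like, assuming _mp3Control.blockBuffer is a CMBlockBufferRef that the calling code can reach (the names come from the posted snippet; the surrounding code is hypothetical):

SInt16 samples[16384];                 // large enough for one buffer's worth
NSInteger count = [reader getMP3Samples:samples];

// The "Retained" in the CoreMedia function name means we now own the
// block buffer; CF objects are not managed by ARC, so release it manually.
if (_mp3Control.blockBuffer) {
    CFRelease(_mp3Control.blockBuffer);
    _mp3Control.blockBuffer = NULL;
}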
I'm trying to get to grips with OpenAL, working through a tutorial here: http://benbritten.com/2008/11/06/openal-sound-on-the-iphone/
My problem is that the sound does not play, although no iOS errors are thrown. There is an OpenAL error, though. The code sample below is the body of an IBAction method, and it results in an AL_INVALID_OPERATION at alGenSources(1, &sourceID); sourceID comes back as 0.
I've tried this on the device and the simulator.
This code sample seems to be in pretty wide use, but I can't find anybody complaining of this particular problem. Can anybody throw any light on this? Many thanks for any help,
NSString *audioFileName = [[NSBundle mainBundle] pathForResource:@"1" ofType:@"caf"];
AudioFileID fileID = [self openAudioFile:audioFileName];
UInt32 filesize = [self audioFileSize:fileID];
unsigned char *outData = malloc(filesize);
OSStatus result = noErr;
result = AudioFileReadBytes(fileID, false, 0, &filesize, outData);
AudioFileClose(fileID);
if (result != 0) {
    NSLog(@"Can't load file..");
}

NSUInteger bufferID;
//NSLog(@"bufferID %@", [NSNumber numberWithUnsignedInteger:bufferID]);
alGenBuffers(1, &bufferID);
//NSLog(@"bufferID %@", [NSNumber numberWithUnsignedInteger:bufferID]);
alBufferData(bufferID, AL_FORMAT_STEREO16, outData, filesize, 44100);
[bufferStorageArray addObject:[NSNumber numberWithUnsignedInteger:bufferID]];
alGetError();

ALuint sourceID;
alGenSources(1, &sourceID);
if (alGetError() == AL_INVALID_OPERATION)
{
    printf("\n++++ Error creating buffers INVALID_OPERATION!!\n");
    //exit(1);
}
else
{
    printf("No errors yet.");
}

alSourcei(sourceID, AL_BUFFER, bufferID);
alSourcef(sourceID, AL_PITCH, 1.0f);
alSourcef(sourceID, AL_GAIN, 1.0f);
if (loops) {
    alSourcei(sourceID, AL_LOOPING, AL_TRUE);
}
[soundDictionary setObject:[NSNumber numberWithUnsignedInt:sourceID] forKey:@"sound"];

if (outData) {
    free(outData);
    outData = NULL;
}
[self playSound:@"sound"];
For your pitch problem, make sure the sound file you are loading matches the sample rate you are feeding into alBufferData. Your caf file is probably saved at 22050 Hz.
AudioStreamBasicDescription's mSampleRate will tell you what the audio file's sample rate really is.
You should also check mChannelsPerFrame to make sure it really is stereo sound.
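For instance, a quick way to print the file's real format (a sketch reusing the fileID from the question, before it is closed):

AudioStreamBasicDescription asbd = {0};
UInt32 propSize = sizeof(asbd);
OSStatus err = AudioFileGetProperty(fileID,
                                    kAudioFilePropertyDataFormat,
                                    &propSize,
                                    &asbd);
if (err == noErr) {
    NSLog(@"sample rate: %.0f Hz, channels: %u",
          asbd.mSampleRate, (unsigned)asbd.mChannelsPerFrame);
}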
Also, OpenAL by default on iOS only generates 4 stereo sources. If you try to load more than 4 sources with stereo data, your audio will sound like garbage. You can change that by specifying attributes ALC_STEREO_SOURCES and ALC_MONO_SOURCES when you create a context. You have a maximum of 32 sources (by default it sets up 28 mono and 4 stereo sources).
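A sketch of requesting a different source split when creating the context (the counts here are arbitrary; the device setup is the standard one):

ALCint attrs[] = {
    ALC_STEREO_SOURCES, 8,   // more stereo sources than the default 4
    ALC_MONO_SOURCES,   24,  // mono + stereo may not exceed 32 total
    0                        // zero terminates the attribute list
};
ALCdevice *device = alcOpenDevice(NULL);
ALCcontext *context = alcCreateContext(device, attrs);
alcMakeContextCurrent(context);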
Stupid mistake on my part: I had initialised OpenAL in initWithNibName:, which was never being called. Moving the init into viewDidLoad has got everything working, although playback is chipmunk-style high speed.
I am going insane with this one. I have looked everywhere and tried anything and everything I can think of.
I am making an iPhone app that uses AVFoundation, specifically AVCapture, to capture video using the iPhone camera.
I need a custom image overlaid on the video feed to be included in the recording.
So far I have the AVCapture session set up: I can display the feed, access the frame, save it as a UIImage, and merge the overlay image onto it. I then convert this new UIImage into a CVPixelBufferRef, and to double-check that the buffer is working, I converted it back to a UIImage, and it still displays the image fine.
The trouble starts when I try to convert the CVPixelBufferRef into a CMSampleBufferRef to append to the AVCaptureSession's assetWriterInput. The CMSampleBufferRef always comes back NULL when I attempt to create it.
Here is the captureOutput: method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *botImage = [self imageFromSampleBuffer:sampleBuffer];
    UIImage *wheel = [self imageFromView:wheelView];
    UIImage *finalImage = [self overlaidImage:botImage :wheel];
    //[previewImage setImage:finalImage]; <- works -- the image is being merged into one UIImage

    CVPixelBufferRef pixelBuffer = NULL;
    CGImageRef cgImage = CGImageCreateCopy(finalImage.CGImage);
    CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

    int status = CVPixelBufferCreateWithBytes(NULL,
                                              self.view.bounds.size.width,
                                              self.view.bounds.size.height,
                                              kCVPixelFormatType_32BGRA,
                                              (void *)CFDataGetBytePtr(image),
                                              CGImageGetBytesPerRow(cgImage),
                                              NULL,
                                              0,
                                              NULL,
                                              &pixelBuffer);
    if (status == 0) {
        OSStatus result = 0;
        CMVideoFormatDescriptionRef videoInfo = NULL;
        result = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);
        NSParameterAssert(result == 0 && videoInfo != NULL);

        CMSampleBufferRef myBuffer = NULL;
        result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                                    pixelBuffer, true, NULL, NULL, videoInfo, NULL, &myBuffer);
        NSParameterAssert(result == 0 && myBuffer != NULL); // always null :S

        NSLog(@"Trying to append");
        if (!CMSampleBufferDataIsReady(myBuffer)) {
            NSLog(@"sampleBuffer data is not ready");
            return;
        }
        if (![assetWriterInput isReadyForMoreMediaData]) {
            NSLog(@"Not ready for data :(");
            return;
        }
        if (![assetWriterInput appendSampleBuffer:myBuffer]) {
            NSLog(@"Failed to append pixel buffer");
        }
    }
}
Another solution I keep hearing about is using an AVAssetWriterInputPixelBufferAdaptor, which eliminates the need to do the messy CMSampleBufferRef wrapping. However, I have scoured Stack Overflow and the Apple developer forums and docs and can't find a clear description or example of how to set it up or use it. If anyone has a working example, could you please show me, or help me work out the above issue? I have been working on this non-stop for a week and am at wits' end.
Let me know if you need any other info.
Thanks in advance,
Michael
You need AVAssetWriterInputPixelBufferAdaptor. Here is the code to create it:
// Create dictionary for pixel buffer adaptor
NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                  (id)kCVPixelBufferPixelFormatTypeKey, nil];

// Create pixel buffer adaptor
m_pixelsBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:assetWriterInput
                                                                   sourcePixelBufferAttributes:bufferAttributes];
And the code to use it:
// If ready for more media data
if (m_pixelsBufferAdaptor.assetWriterInput.readyForMoreMediaData) {
    // Create a pixel buffer from the adaptor's pool
    CVPixelBufferRef pixelsBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, m_pixelsBufferAdaptor.pixelBufferPool, &pixelsBuffer);

    // Lock the pixel buffer's base address
    CVPixelBufferLockBaseAddress(pixelsBuffer, 0);

    // Write your pixel data into the buffer (in your case, fill it with your finalImage data)
    [self yourFunctionToPutDataInPixelBuffer:CVPixelBufferGetBaseAddress(pixelsBuffer)];

    // Unlock the pixel buffer's base address
    CVPixelBufferUnlockBaseAddress(pixelsBuffer, 0);

    // Append the pixel buffer (calculate currentFrameTime as needed; the simplest
    // way is to start the frame time at 0 and increment it by one frame's duration,
    // the inverse of your frame rate, each time you write a frame)
    [m_pixelsBufferAdaptor appendPixelBuffer:pixelsBuffer withPresentationTime:currentFrameTime];

    // Release the pixel buffer
    CVPixelBufferRelease(pixelsBuffer);
}
And don't forget to release your pixelsBufferAdaptor.
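The fill step is left open above; one possible sketch of it, drawing the merged CGImage into the locked buffer with a CGBitmapContext (the method name is hypothetical, and the byte order matches the BGRA attributes set on the adaptor):

- (void)fillPixelBuffer:(CVPixelBufferRef)pixelsBuffer withImage:(CGImageRef)image {
    CVPixelBufferLockBaseAddress(pixelsBuffer, 0);
    void *base = CVPixelBufferGetBaseAddress(pixelsBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Little-endian 32-bit with alpha first is BGRA in memory,
    // matching kCVPixelFormatType_32BGRA.
    CGContextRef ctx = CGBitmapContextCreate(base,
                                             CVPixelBufferGetWidth(pixelsBuffer),
                                             CVPixelBufferGetHeight(pixelsBuffer),
                                             8,
                                             CVPixelBufferGetBytesPerRow(pixelsBuffer),
                                             colorSpace,
                                             kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(ctx,
                       CGRectMake(0, 0,
                                  CVPixelBufferGetWidth(pixelsBuffer),
                                  CVPixelBufferGetHeight(pixelsBuffer)),
                       image);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);

    CVPixelBufferUnlockBaseAddress(pixelsBuffer, 0);
}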
I do it by using CMSampleBufferCreateForImageBuffer():
OSStatus ret = 0;
CMSampleBufferRef sample = NULL;
CMVideoFormatDescriptionRef videoInfo = NULL;
CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
timingInfo.presentationTimeStamp = pts;
timingInfo.duration = duration;

ret = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixel, &videoInfo);
if (ret != 0) {
    NSLog(@"CMVideoFormatDescriptionCreateForImageBuffer failed! %d", (int)ret);
    goto done;
}

ret = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixel, true, NULL, NULL,
                                         videoInfo, &timingInfo, &sample);
if (ret != 0) {
    NSLog(@"CMSampleBufferCreateForImageBuffer failed! %d", (int)ret);
    goto done;
}
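The visible difference from the question's failing call is that a valid CMSampleTimingInfo is passed instead of NULL for the timing parameter. The done: label itself is outside the posted excerpt, so the following completion is an assumption, but the format description created above does need a balancing release:

done:
    // Balance CMVideoFormatDescriptionCreateForImageBuffer.
    if (videoInfo) {
        CFRelease(videoInfo);
    }
    // ... hand `sample` to the writer, then CFRelease(sample) ...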
I have a piece of code that sets up a capture session from the camera, processes the frames using OpenCV, and then sets the image property of a UIImageView with a UIImage generated from each frame. When the app starts, the image view's image is nil, and no frames show up until I push another view controller onto the stack and then pop it off. Then the image stays the same until I do it again. NSLog statements show that the callback is called at approximately the correct frame rate. Any ideas why the frames don't show up? I reduced the frame rate all the way down to 2 frames a second. Is it not processing fast enough?
Here's the code:
- (void)setupCaptureSession {
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetLow;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handling the error appropriately.
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    output.alwaysDiscardsLateVideoFrames = YES;
    [session addOutput:output];

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format
    output.videoSettings =
        [NSDictionary dictionaryWithObject:
                          [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.
    output.minFrameDuration = CMTimeMake(1, 1);

    // Start the session running to start the flow of data
    [session startRunning];

    // Assign session to an ivar.
    [self setSession:session];
}
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    if (!colorSpace)
    {
        NSLog(@"CGColorSpaceCreateDeviceRGB failure");
        return nil;
    }

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the data size for contiguous planes of the pixel buffer.
    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);

    // Create a Quartz direct-access data provider that uses data we supply
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize,
                                                              NULL);
    // Create a bitmap image from data supplied by our data provider
    CGImageRef cgImage =
        CGImageCreate(width,
                      height,
                      8,
                      32,
                      bytesPerRow,
                      colorSpace,
                      kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                      provider,
                      NULL,
                      true,
                      kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);

    // Create and return an image object representing the specified Quartz image
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return image;
}
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    [self.delegate cameraCaptureGotFrame:image];
}
This could be related to threading. Try:
[self.delegate performSelectorOnMainThread:@selector(cameraCaptureGotFrame:) withObject:image waitUntilDone:NO];
This looks like a threading issue. You cannot update your views in any thread other than the main thread. In your setup, which is good, the delegate method captureOutput:didOutputSampleBuffer: is called on a secondary thread, so you cannot set the image view from there. Art Gillespie's answer is one way of solving it, if you can get rid of the bad access error.
Another way is to modify the sample buffer in captureOutput:didOutputSampleBuffer: and have it shown by adding an AVCaptureVideoPreviewLayer instance to your capture session. That's certainly the preferred way if you only modify a small part of the image, such as highlighting something.
BTW: your bad access error could arise because you don't retain the created image in the secondary thread, so it will be freed before cameraCaptureGotFrame: is called on the main thread.
Update:
To properly retain the image, increase the reference count in captureOutput:didOutputSampleBuffer: (in the secondary thread) and decrement it in cameraCaptureGotFrame: (in the main thread).
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // increment ref count
    [image retain];
    [self.delegate performSelectorOnMainThread:@selector(cameraCaptureGotFrame:)
                                    withObject:image waitUntilDone:NO];
}

- (void)cameraCaptureGotFrame:(UIImage *)image
{
    // whatever this function does, e.g.:
    imageView.image = image;

    // decrement ref count
    [image release];
}
If you don't increment the reference count, the image is freed by the auto release pool of the second thread before the cameraCaptureGotFrame: is called in the main thread. If you don't decrement it in the main thread, the images are never freed and you run out of memory within a few seconds.
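As an aside (not part of the answer above), the same hand-off can be written with GCD. A copied block retains the Objective-C objects it captures, even under manual reference counting, so the explicit retain/release pair is handled for you; a sketch:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // The block keeps `image` alive until it has run on the main queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.delegate cameraCaptureGotFrame:image];
    });
}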
Are you doing a setNeedsDisplay on the UIImageView after each new image property update?
Edit:
Where and when are you updating the background image property in your image view?
I am making an application which saves the camera roll images as blobs in an sqlite3 database.
When an image is retrieved, its dimensions change (height and width get interchanged). This happens only on the iPhone and not on the simulator. Please help.
Basically, when I retrieve an image from the db and run the app on an iPhone device, the image changes from portrait to landscape mode.
Hi,
Please see the code below. The first snippet stores the image in the db, the second retrieves it, and the third draws it.
- (int)InsertImageInImagesTable:(UIImage *)taggedImage {
    NSData *imgData = UIImagePNGRepresentation(taggedImage);
    //unsigned char aBuufer[[imgData length]];
    //[imgData getBytes:aBuufer length:[imgData length]];
    NSString *sql = [NSString stringWithFormat:@"INSERT INTO tblImages(Image) values(?)"];
    sqlite3_stmt *statement;
    int imageId = -1;
    if ((statement = [self prepare:sql]))
    {
        sqlite3_bind_blob(statement, 1, [imgData bytes], [imgData length], nil);
        //sqlite3_bind_int(statement, 2, [imgData bytes], [imgData length], nil);
        sqlite3_step(statement);
        imageId = sqlite3_last_insert_rowid(dbh);
    }
    sqlite3_finalize(statement);
    return imageId;
}
- (NSDictionary *)SelectImagesFromImagesTable {
    NSMutableArray *imgArr = [[NSMutableArray alloc] init];
    NSMutableArray *idArr = [[NSMutableArray alloc] init];
    NSString *slctSql = @"Select ImageId, Image from tblImages order by ImageId asc";
    sqlite3_stmt *statement;
    //NSData *imgData;
    int imageId = -1;
    if ((statement = [self prepare:slctSql]))
    {
        //sqlite3_
        while (sqlite3_step(statement) == SQLITE_ROW)
        {
            imageId = sqlite3_column_int(statement, 0);
            NSData *imgData = [[NSData alloc] initWithBytes:sqlite3_column_blob(statement, 1)
                                                     length:sqlite3_column_bytes(statement, 1)];
            UIImage *im = [UIImage imageWithData:imgData];
            [imgArr addObject:[UIImage imageWithData:imgData]];
            [idArr addObject:[NSString stringWithFormat:@"%d", imageId]];
            [imgData release];
        }
        sqlite3_finalize(statement);
    }
    NSMutableDictionary *imgDict = [NSMutableDictionary dictionaryWithObjects:imgArr forKeys:idArr];
    [imgArr release];
    [idArr release];
    //array=idArr;
    return imgDict;
}
- (void)drawRect:(CGRect)rect
{
    float newHeight;
    float newWidth;
    float ratio = 1.0;
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    if (myPic != nil)
    {
        int hh = myPic.size.height;
        int ww = myPic.size.width;
        if (myPic.size.height > 367)
        {
            ratio = myPic.size.height / 367;
            if (myPic.size.width / 320 > ratio) {
                ratio = myPic.size.width / 320;
            }
        }
        newHeight = myPic.size.height / ratio;
        newWidth = myPic.size.width / ratio;
        [myPic drawInRect:CGRectMake(self.center.x - (newWidth / 2),
                                     self.center.y - (newHeight / 2),
                                     newWidth,
                                     newHeight)];
    }
}
It's hard to say without code, but off the top of my head there are a couple of possible sources for this problem:
(1) The problem might be with the image view and not the image itself. Something in the code might be resizing the view; check the view's resizing properties. If the image view is full screen, then the view controller might actually think it is in landscape orientation and be displaying the image correctly, but for the wrong orientation. (Confusing orientation constants is an easy way to make that happen.)
(2) The images are misformatted when they are saved, and when UIImage regenerates them the width and height are swapped. I've never seen that happen, but an image in the wrong format could in theory cause it.
Since this only happens on the device, I would bet that the problem is (1). Ignore the images for a bit and instead look at how the view controllers handle their orientation and sizing.
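One more hedged possibility, in case (2) applies after all: UIImagePNGRepresentation saves only the raw pixels and drops the UIImage's imageOrientation flag, and camera photos are often stored rotated with that flag set, which could explain portrait images coming back landscape on the device only. A sketch of normalizing the image by redrawing it before it goes into the blob (a general technique, not something from the code above):

- (UIImage *)normalizedImage:(UIImage *)image {
    // Already upright; nothing to do.
    if (image.imageOrientation == UIImageOrientationUp) {
        return image;
    }
    // Redraw so the pixels themselves are upright and no
    // orientation flag is needed after the round trip.
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *upright = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return upright;
}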