Any way of having the captureOutput callback on a background thread? - iphone

Is there any way to have the OS call the delegate method
- (void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)psampleBuffer fromConnection:(AVCaptureConnection *)pconnection
on a background thread instead of the main thread? The issue is that the time it takes to copy the data out is affecting the UI. The copy seems to have to happen on the main thread, since the sample buffer appears to be gone by the time a background thread tries to copy from it. Am I missing something here?
CMFormatDescriptionRef format = CMSampleBufferGetFormatDescription(sampleBuffer);
bufSize      = CMSampleBufferGetNumSamples(sampleBuffer);      // number of samples in the buffer
sampleSize   = CMSampleBufferGetSampleSize(sampleBuffer, 0);   // size in bytes of sample 0
sampleLength = CMSampleBufferGetTotalSampleSize(sampleBuffer); // total size in bytes of all samples
blockbuff    = CMSampleBufferGetDataBuffer(sampleBuffer);
// Copy the raw bytes out of the block buffer into our own storage.
CMBlockBufferCopyDataBytes(blockbuff, 0, tocopy * _depth, buffInUse + (offset * 2));

I use CVPixelBuffer functions to lock/unlock the buffer and get image format information. I use memcpy to copy the data (while the buffer is locked) and call performSelectorInBackground to process the data.
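Roughly, the copy-while-locked pattern looks like this (a sketch; processCopiedPixels: and the malloc/free ownership convention are illustrative names, not from the actual code):

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);

// memcpy while the buffer is locked; the copy outlives the callback.
void *copiedData = malloc(bytesPerRow * height);
memcpy(copiedData, baseAddress, bytesPerRow * height);

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

// Hand the private copy to a background method; it must free() it when done.
[self performSelectorInBackground:@selector(processCopiedPixels:)
                       withObject:[NSValue valueWithPointer:copiedData]];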

Update: this looks like the right place to change which queue the callback runs on:
[audioOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
Just need to pass a different queue.
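For example, it could be swapped for a private serial queue (the label string here is made up; the documentation requires a serial queue for this delegate):

// Private serial queue so the callback no longer runs on the main thread.
dispatch_queue_t captureQueue = dispatch_queue_create("com.example.captureQueue", NULL);
[audioOutput setSampleBufferDelegate:self queue:captureQueue];
dispatch_release(captureQueue); // safe: the output retains the queue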

iOS - CMSampleBufferRef is not being released from captureOutput:didOutputSampleBuffer:fromConnection

I am capturing frames from the camera using the code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    if (delegate && [delegate respondsToSelector:@selector(captureManagerCapturedFrame:withFrameImage:withFrameBuffer:)]) {
        [delegate captureManagerCapturedFrame:self withFrameImage:image withFrameBuffer:sampleBuffer];
    }
}
I am doing this because in the delegate method captureManagerCapturedFrame:withFrameImage:withFrameBuffer: I have a flag which tells the app to use either the returned UIImage OR the returned sample buffer.
The delegate method is:
- (void)captureManagerCapturedFrame:(AVCamCaptureManager *)captureManager
                     withFrameImage:(UIImage *)image
                    withFrameBuffer:(CMSampleBufferRef)frameBuffer {
    if (_screen1) {
        NSLog(@"Only display camera image\n");
    }
    else if (_screen2) {
        // Enable IR
        NSLog(@"Display AND Process camera image\n");
        [self imageReconigitionProcessFrame:frameBuffer];
    }
}
where imageReconigitionProcessFrame: is:
-(void)imageReconigitionProcessFrame:(CMSampleBufferRef)frameBuffer {
    //CFRetain(frameBuffer);
    MSImage *qry = [[MSImage alloc] initWithBuffer:frameBuffer orientation:AVCaptureVideoOrientationPortrait]; // MEMORY LEAK HERE???
    qry = nil;
    //CFRelease(frameBuffer);
}
This code effectively works. But here is my problem: when this code is run and profiled in Instruments, I see a rapid increase in the overall bytes used, but the Allocations profiler doesn't appear to increase, nor do I see any leaks using the Leaks tool. But clearly there is a rapid memory gain each time imageReconigitionProcessFrame: is called, and the app crashes after a few seconds. When I set frameBuffer to nil, there is NO increase in memory (of course, I also don't have the frame buffer to do any processing with).
I have tried transferring ownership of frameBuffer using CFRetain and CFRelease (commented out in the above code), but these don't seem to do anything either.
Does anyone have any idea where I could be leaking memory inside this function???
The method [[MSImage alloc] initWithBuffer: is from a third-party SDK (Moodstocks, which is an awesome image recognition SDK) and it works just fine in their demos, so I don't think the problem is inside this function.
First of all, thanks for mentioning Moodstocks (I work for them): we're happy that you find our SDK useful!
To answer your question, I guess your code does indeed contain a leak: at the end of the imageReconigitionProcessFrame method, you should call [qry release]. The rule in Obj-C is quite simple: whenever you manually call alloc on an object, it should also be manually released!
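In code, the fix is one line (a sketch under manual reference counting, which this code appears to use):

-(void)imageReconigitionProcessFrame:(CMSampleBufferRef)frameBuffer {
    MSImage *qry = [[MSImage alloc] initWithBuffer:frameBuffer orientation:AVCaptureVideoOrientationPortrait];
    // ... process qry ...
    [qry release]; // balances the alloc above; without it, every frame leaks
}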
That's BTW what is done in the Moodstocks SDK wrapper: if you look at the [MSScannerSession session: didOutputSampleBuffer:] method, you'll see that we do manually release the MSImage object after it's been processed.
As to why the profiler doesn't find this leak, I guess that it's due to the fact that leaks are analyzed every 10 seconds by default: in this case, the memory leak is so heavy (1280x720 frames, at 15+ FPS if you're on an iPhone 5, for 10 seconds: at least 130 MB leaked) that the code must crash before the first 10 seconds are reached.
Hope this helps!

GCD pattern for shared resource access + UI update?

Hi folks! I'm implementing a shared cache in my app. The idea is to fetch the data from the web in the background and then update the cache and the UI with the newly retrieved data. The trick, of course, is to ensure thread safety, since the main thread will be continuously using the cache. I don't want to modify the cache in any fashion while someone else might be using it.
It's my understanding that using @synchronized to lock access to a shared resource is not the most elegant approach in Objective-C, due to it trapping to the kernel and thus being rather sluggish. I keep reading that GCD is a great alternative (let's ignore its cousin NSOperation for now), and I'd like to figure out what a good pattern for my situation would be. Here's some sample code:
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
// download the data in a background thread
dispatch_async(queue, ^{
    CacheData *data = [Downloader getLatestData];
    // use the downloaded data in the main thread
    dispatch_sync(dispatch_get_main_queue(), ^{
        [AppCache updateCache:data];
        [[NSNotificationCenter defaultCenter] postNotificationName:@"CacheUpdated" object:nil];
    });
});
Would this actually do what I think it does, and if so, is this the cleanest approach as of today of handling this kind of situation? There's a blog post that's quite close to what I'm talking about, but I wanted to double-check with you as well.
I'm thinking that as long as I only ever access the shared resource from the same thread/queue (main, in my case) and only ever update the UI on main, then I will effectively achieve thread safety. Is that correct?
Thanks!
Yes.
Other considerations aside, instead of shunting read/write work onto the main thread, consider using a private dispatch queue.
dispatch_queue_t readwritequeue;
readwritequeue = dispatch_queue_create("com.myApp.cacheAccessQueue", NULL);
Then update your AppCache class:
- (void)updateCache:(id)data {
    dispatch_sync(readwritequeue, ^{ ... code to set data ... });
}
- (id)fetchData:... {
    __block id data = nil;
    dispatch_sync(readwritequeue, ^{ data = ... code to fetch data ... });
    return data;
}
Then update your original code:
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
// download the data in a background thread
dispatch_async(queue, ^{
    CacheData *data = [Downloader getLatestData];
    [AppCache updateCache:data];   // <-- the changed line: now safe off the main thread
    // notify the UI on the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        [[NSNotificationCenter defaultCenter] postNotificationName:@"CacheUpdated" object:nil];
    });
});
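As a variation (a sketch, assuming iOS 4.3+ for DISPATCH_QUEUE_CONCURRENT): if reads vastly outnumber writes, making the private queue concurrent and funneling writes through a barrier lets readers run in parallel while writers still get exclusive access:

// Concurrent queue: parallel readers, exclusive writers via barrier.
dispatch_queue_t rwQueue = dispatch_queue_create("com.myApp.cacheAccessQueue", DISPATCH_QUEUE_CONCURRENT);

- (void)updateCache:(id)data {
    // The barrier waits for in-flight reads, runs alone, then reads resume.
    dispatch_barrier_async(rwQueue, ^{ ... code to set data ... });
}
- (id)fetchData {
    __block id data = nil;
    dispatch_sync(rwQueue, ^{ data = ... code to fetch data ... });
    return data;
}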
If you ask 100 developers here what the most elegant way to do this is, you will get at least 100 different answers (maybe more!)
What I do, and what is working well for me, is to have a singleton class doing my image management. I use Core Data, and save thumbnails directly in the store, but for "large" files I use the file system and keep a URL to them in Core Data. Core Data is set up to use the new block-based interface, so it can do all its work on a private thread managed by itself.
Possible image URLs get registered with a tag on the main thread. Other classes can ask for the image for that tag. If the image is not there, nil is returned, but this class sets a fetching flag and uses a concurrent NSOperation coupled to an NSURLConnection to fetch the image. When the data arrives, it messages the singleton on its thread with the received image data, and the method getting that message uses '[moc performBlock:...]' (no wait) to process it.
When images are finally added to the repository, the moc posts a notification on the main queue with the received image tag. Classes that wanted the image can listen for it, and when they get it (on the main thread) they can ask the moc for the image again, which by then is obviously there.
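A rough sketch of that flow (all names here, imageForTag:, fetchingTags, and the notification name, are illustrative rather than my actual classes):

- (UIImage *)imageForTag:(NSString *)tag {
    UIImage *image = [self cachedImageForTag:tag]; // thumbnail from the Core Data store
    if (!image && ![fetchingTags containsObject:tag]) {
        [fetchingTags addObject:tag];         // the "fetching flag"
        [self startFetchOperationForTag:tag]; // concurrent NSOperation + NSURLConnection
    }
    return image; // nil means "not here yet; listen for the notification"
}

// Called on the singleton's thread when the image data arrives.
- (void)didFetchImageData:(NSData *)data forTag:(NSString *)tag {
    [moc performBlock:^{
        // ... store the thumbnail (or file URL for large images) in Core Data ...
        dispatch_async(dispatch_get_main_queue(), ^{
            [[NSNotificationCenter defaultCenter] postNotificationName:@"ImageReady" object:tag];
        });
    }];
}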

preparing view elements on a background thread

OK, so I know you're not supposed to directly interact with view elements from any thread other than the main thread.
But can you do stuff in a background thread that will be used by a view?
In particular, I have a pretty substantial algorithm that ends up spitting out a string. If I want that string to become the text of a UITextView, do I need to run the whole algorithm on the main thread, or can it be done in the background?
You can certainly run it in the background, just like a graphical application might render images in the background. Once you have the string ready, GCD is your friend:
- (void)backgroundStringGenerator
{
    NSString *expensiveString = ... // do string generation algorithm
    dispatch_async(dispatch_get_main_queue(), ^{
        theLabel.text = expensiveString;
    });
}
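To push the generation off the main thread in the first place, kick the method above onto a global queue (a sketch; pick whatever priority suits you):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    [self backgroundStringGenerator];
});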

How to make a static image appear after 3 seconds?

How would I make an image appear after 3 seconds?
You can use:
performSelector:withObject:afterDelay:
I'm a big fan of using GCD (iOS 4+) because you can simplify your code with inline blocks.
In your case, you should set the image to hidden in Interface Builder, then create an IBOutlet with a connection to an ivar in your class.
Then you can simply run this in viewDidLoad or similar:
dispatch_time_t delay = dispatch_time(DISPATCH_TIME_NOW, NSEC_PER_SEC * 3.0);
dispatch_after(delay, dispatch_get_main_queue(), ^(void){
    yourImage.hidden = NO;
});
This assumes that you are calling performSelector:withObject:afterDelay: from the main thread, and that your UIImageView is initially hidden.
// assumes theImageView.hidden = YES
[self performSelector:@selector(showImage:) withObject:theImageView afterDelay:yourTimeInterval];

- (void)showImage:(UIImageView *)anImageView {
    anImageView.hidden = NO;
}
It is important that performSelector is called from the main thread because the selector that is called after the delay will run on the same thread, and you do not want to update UI from anything other than the main thread as a general rule.
I haven't used Xcode in a while, but I'll take a stab for ya.
In Interface Builder, set the image's visibility to hidden.
When your app starts up, set some global variable to the current time in an init function.
In the main control loop for your UI, check whether that global variable holds a time more than 3 seconds ago; if so, change the image's visibility to shown.
That's the best I can say without actually taking a look, which isn't possible right now.
Good luck!

iphone - main thread freezes for half a second... why?

I have an app that draws lines in a Quartz context. The app starts drawing when the user moves a finger across the screen.
At the time touchesMoved: fires, I save the Quartz context to a PNG file (I know saving a file is slow... I tried doing this in memory, but the app's memory usage skyrocketed, so I am trying to do it on disk).
So, to get the context saved, I do this in touchesMoved:
if (firstMove) { // first movement after touchesBegan
    [NSThread detachNewThreadSelector:@selector(newThreadUNDO)
                             toTarget:self
                           withObject:nil];
    firstMove = NO;
}
and then I have
- (void)newThreadUNDO {
    NSAutoreleasePool *p = [[NSAutoreleasePool alloc] init];
    [NSThread setThreadPriority:0.1];
    [NSThread sleepForTimeInterval:0.0];
    [self performSelectorOnMainThread:@selector(copyUNDOcontext)
                           withObject:nil
                        waitUntilDone:NO];
    [p release];
}
and
- (void)copyUNDOcontext {
    CGFloat w = board.image.size.width;
    CGFloat h = board.image.size.height;
    CGRect superRect = CGRectMake(0, 0, w, h);
    CGSize size = CGSizeMake(w, h);
    UIGraphicsBeginImageContext(size);
    CGContextRef new = UIGraphicsGetCurrentContext();
    // lineLayer is the layer context I need to save
    CGContextDrawLayerInRect(new, superRect, lineLayer);
    UIImage *imagem = UIGraphicsGetImageFromCurrentImageContext();
    [self saveTempImage:imagem :@"UNDO.png"];
    UIGraphicsEndImageContext();
}
The problem is: as soon as the user starts moving, the new thread is fired, but even though this new thread has low priority, the main thread still freezes for about half a second (probably while the file is being saved).
Why is that?
How can I try to solve that?
Thanks.
Have you tried:
performSelector:onThread:withObject:waitUntilDone:
with waitUntilDone set to NO?
If I recall correctly, performing the selector on the main thread always processes the selector in the main run loop of the application. I could be wrong; I have been using GCD for some time now.
If you try this, I believe you will need to put the autorelease pool into that function, as it will serve as the entry and exit point of the thread.
First, a method named like saveTempImage:: is to be discouraged. Make it saveTempImage:fileName: or something.
Your guess is probably right; saving the file is likely where the pause comes from. It could also be the rendering itself, if it were complicated, but it doesn't look like it is.
However, guessing is generally an unproductive way to analyze performance problems. Use the provided tools. The CPU Sampler instrument could tell you what is really going on.
To fix it? First confirm the problem. If it is the file I/O, move it off the main thread (I haven't looked at UIImage's documentation to know whether it is thread-safe in such a context).
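If the file write does turn out to be the cost, one possible split (a sketch; the temp-directory path is illustrative) is to render on the main thread as before but move the PNG encoding and disk write, the slow parts, onto a background queue. UIImagePNGRepresentation and NSData writes don't touch UIKit views:

// Main thread: render the layer into a UIImage as before.
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Background queue: encode and write the PNG off the main thread.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
    NSData *png = UIImagePNGRepresentation(snapshot);
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"UNDO.png"];
    [png writeToFile:path atomically:YES];
});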