Main thread freezes for half a second... why?

I have an app that draws lines in a Quartz context. The app starts drawing when the user moves a finger across the screen.
When touchesMoved is fired, I save the Quartz context to a PNG file. (I know saving a file is slow; I tried keeping it in memory, but the app's memory usage skyrocketed, so I am trying to do it on disk.)
To save the context, I do this in touchesMoved:
if (firstMove) { // first movement after touchesBegan
    [NSThread detachNewThreadSelector:@selector(newThreadUNDO)
                             toTarget:self
                           withObject:nil];
    firstMove = NO;
}
and then I have
- (void) newThreadUNDO {
    NSAutoreleasePool* p = [[NSAutoreleasePool alloc] init];
    [NSThread setThreadPriority:0.1];
    [NSThread sleepForTimeInterval:0.0];
    [self performSelectorOnMainThread:@selector(copyUNDOcontext) withObject:nil waitUntilDone:NO];
    [p release];
}
and
- (void) copyUNDOcontext {
    CGFloat w = board.image.size.width;
    CGFloat h = board.image.size.height;
    CGRect superRect = CGRectMake(0, 0, w, h);
    CGSize size = CGSizeMake(w, h);
    UIGraphicsBeginImageContext(size);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // lineLayer is the layer context I need to save
    CGContextDrawLayerInRect(ctx, superRect, lineLayer);
    UIImage *imagem = UIGraphicsGetImageFromCurrentImageContext();
    [self saveTempImage:imagem :@"UNDO.png"];
    UIGraphicsEndImageContext();
}
The problem is: as soon as the user starts moving, the new thread is detached, but even though the new thread has low priority, the main thread still freezes for about half a second (probably while the file is being saved).
Why is that?
How can I solve it?
Thanks.

Have you tried:
performSelector:onThread:withObject:waitUntilDone:
with waitUntilDone set to NO?
If I recall correctly, performing a selector on the main thread always processes the selector in the application's main run loop. I could be wrong; I have been using GCD for some time now.
If you try this, I believe you will need to put the autorelease pool inside the thread function, as it will serve as the entry and exit point of the thread.
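Something like this, for illustration (an untested sketch: it keeps the work on the detached thread instead of bouncing it back to the main thread, which is what your performSelectorOnMainThread: call does and why the save still blocks the UI; it assumes the UIGraphics calls are safe off the main thread, which Apple only guarantees from iOS 4 on, and that lineLayer is not mutated while the copy is in progress):
- (void)newThreadUNDO {
    NSAutoreleasePool *p = [[NSAutoreleasePool alloc] init]; // the thread entry point owns its own pool
    [NSThread setThreadPriority:0.1];
    [self copyUNDOcontext]; // render and save here, off the main thread
    [p release];
}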

First, a method named like saveTempImage:: is to be discouraged. Make it saveTempImage:fileName: or something.
Your guess is probably good; saving the file is probably where the pause comes from. It could also be the rendering itself, if complicated, but it doesn't look like it is.
However, guessing is generally an unproductive way to analyze performance problems. Use the provided tools; the CPU Sampler instrument can tell you what is really going on.
To fix it? First confirm the problem. If it is the file I/O, move it off the main thread (I haven't checked UIImage's documentation to know whether it is thread-safe in such a context).
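If it does turn out to be the file I/O, one way to split the work is sketched below: keep the rendering on the main thread, but hand only the PNG encode and file write to a background thread. The helper name writeUNDOImage: is made up; UIImagePNGRepresentation and NSData's writeToFile:atomically: don't touch the view hierarchy, so they are commonly run off the main thread, but verify this for your deployment target.
// At the end of copyUNDOcontext, instead of calling saveTempImage:: synchronously:
[self performSelectorInBackground:@selector(writeUNDOImage:) withObject:imagem];

// Hypothetical helper; runs on a background thread, so it needs its own pool
- (void)writeUNDOImage:(UIImage *)image {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSData *png = UIImagePNGRepresentation(image);
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"UNDO.png"];
    [png writeToFile:path atomically:YES];
    [pool release];
}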

Related

iOS - CMSampleBufferRef is not being released from captureOutput:didOutputSampleBuffer:fromConnection

I am capturing frames from the camera using the code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    if (delegate && [delegate respondsToSelector:@selector(captureManagerCapturedFrame:withFrameImage:withFrameBuffer:)]) {
        [delegate captureManagerCapturedFrame:self withFrameImage:image withFrameBuffer:sampleBuffer];
    }
}
I am doing this because in the delegate method captureManagerCapturedFrame:withFrameImage:withFrameBuffer: I have a flag which tells the app to use either the returned UIImage OR the returned sample buffer.
The delegate method is:
- (void) captureManagerCapturedFrame:(AVCamCaptureManager *)captureManager
                      withFrameImage:(UIImage *)image
                     withFrameBuffer:(CMSampleBufferRef)frameBuffer {
    if (_screen1) {
        NSLog(@"Only display camera image\n");
    }
    else if (_screen2) {
        //Enable IR
        NSLog(@"Display AND Process camera image\n");
        [self imageReconigitionProcessFrame:frameBuffer];
    }
}
where imageReconigitionProcessFrame: is:
-(void)imageReconigitionProcessFrame:(CMSampleBufferRef)frameBuffer {
    //CFRetain(frameBuffer);
    MSImage *qry = [[MSImage alloc] initWithBuffer:frameBuffer orientation:AVCaptureVideoOrientationPortrait]; //MEMORY LEAK HERE???
    qry = nil;
    //CFRelease(frameBuffer);
}
This code effectively works. But here is my problem: when this code is run and profiled in Instruments, I see a rapid increase in overall bytes used, but the Allocations profiler doesn't appear to increase, nor do I see any leaks with the Leaks tool. Yet clearly there is a rapid memory gain each time imageReconigitionProcessFrame: is called, and the app crashes after a few seconds. When I set frameBuffer to nil, there is NO increase in memory (of course, I also don't have the frame buffer to do any processing with).
I have tried transferring ownership of frameBuffer using CFRetain and CFRelease (commented out in the above code), but these don't seem to do anything either.
Does anyone have any idea where I could be leaking memory inside this method?
The [[MSImage alloc] initWithBuffer: method is from a third-party SDK (Moodstocks, which is an awesome image recognition SDK) and it works just fine in their demos, so I don't think the problem is inside that method.
First of all, thanks for mentioning Moodstocks (I work for them): we're happy that you find our SDK useful!
To answer your question, I guess your code does indeed contain a leak: at the end of the imageReconigitionProcessFrame method, you should call [qry release]. The rule in Obj-C is quite simple: whenever you manually call alloc on an object, it should also be manually released!
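Concretely, the method from the question with the missing release added (same code otherwise):
-(void)imageReconigitionProcessFrame:(CMSampleBufferRef)frameBuffer {
    MSImage *qry = [[MSImage alloc] initWithBuffer:frameBuffer orientation:AVCaptureVideoOrientationPortrait];
    // ... process qry ...
    [qry release]; // balances the alloc above; without it, every frame leaks
}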
That's BTW what is done in the Moodstocks SDK wrapper: if you look at the [MSScannerSession session: didOutputSampleBuffer:] method, you'll see that we do manually release the MSImage object after it's been processed.
As to why the profiler doesn't find this leak, I guess it's because leaks are analyzed every 10 seconds by default: in this case the leak is so heavy (1280x720 frames, at 15+ FPS if you're on an iPhone 5, for 10 seconds: at least 130 MB leaked) that the app must crash before the first 10-second check is reached.
Hope this helps!

UIImageView performance within UITableViewCell using ASIHTTPRequest

This problem has been vexing me for a little over a day. The gist is that I have a grouped UITableView set up such that there is only one entry per section. Each cell within the table view contains a UIImageView that takes up the entire width and height of the cell. Furthermore, each cell has a fixed height and width.
Here is my init method:
- (id)initWithStyle:(UITableViewCellStyle)style reuseIdentifier:(NSString *)reuseIdentifier {
    self = [super initWithStyle:style reuseIdentifier:reuseIdentifier];
    if (self) {
        [[self layer] setMasksToBounds:NO];
        [[self layer] setShouldRasterize:YES];
        [[self layer] setRasterizationScale:[[UIScreen mainScreen] scale]];
        [[self layer] setCornerRadius:10];
        _bgCoverView = [[UIImageView alloc] initWithFrame:CGRectZero];
        [_bgCoverView setClipsToBounds:YES];
        [[_bgCoverView layer] setCornerRadius:10];
        [_bgCoverView setContentMode:UIViewContentModeScaleAspectFill];
        [self.contentView addSubview:_bgCoverView];
        [_bgCoverView release];
        //Init other parts of the cell, but they're not pertinent to the question
    }
    return self;
}
This is my layout subviews:
- (void)layoutSubviews {
    [super layoutSubviews];
    if (!self.editing) {
        [_bgCoverView setFrame:CGRectMake(0, 0, tableViewCellWidth, self.contentView.frame.size.height)];
    }
}
This is my setter for the cell:
- (void)setGroup:(Group *)group {
    //Set background cover for the contentView
    NSSet *usableCovers = [[group memories] filteredSetUsingPredicate:[NSPredicate predicateWithFormat:@"pictureMedData != nil || pictureFullData != nil || (pictureMedUrl != nil && NOT pictureMedUrl contains[cd] '/missing.png')"]];
    Memory *mem = [usableCovers anyObject];
    [_bgCoverView setImage:[UIImage imageWithContentsOfFile:@"no_pics_yet.png"]];
    if (mem != nil) {
        [Folio cacheImageAtStringURL:[mem pictureMedUrl] forManagedObject:mem withDataKey:@"pictureMedData" inDisplay:_bgCoverView];
    }
}
This is where I think I'm taking a performance hit. First, there's the filteredSetUsingPredicate: call. If there were a lot of objects in the set, this could slow things down, but there tend to be relatively few. Furthermore, when I removed this line of code I saw no performance change, so it's pretty unlikely to be the cause.
So, that pretty much leaves my cache method (which is located in a helper class). Here's the method:
+(void)cacheImageAtStringURL:(NSString *)urlString forManagedObject:(NSManagedObject *)managedObject withDataKey:(NSString *)dataKey inDisplay:(NSObject *)obj {
    int objType = 0; //Assume UIButton
    if (obj == nil)
        objType = -1;
    else if ([obj isKindOfClass:[UIImageView class]])
        objType = 1; //Assign to UIImageView
    NSData *data = (NSData *)[managedObject valueForKey:dataKey];
    if ([data bytes]) {
        if (objType == 0) [(UIButton *)obj setBackgroundImage:[UIImage imageWithData:data] forState:UIControlStateNormal];
        else if (objType == 1) [(UIImageView *)obj setImage:[UIImage imageWithData:data]];
        return;
    }
    //Otherwise, do an ASIHTTPRequest and set the image when it's returned, and save the data into Core Data so that we get a cache hit the next time.
}
Regardless of whether the images are cached, there are major performance issues when scrolling down in this view. If I comment out the cache method, it works pretty well. Additionally, other views that call this method are sluggish, so I'm pretty sure something here is the culprit.
I will also say, however, that I'm suspicious of the cornerRadius code. I heard that setting shouldRasterize to YES results in a performance speed-up, but I'm not sure whether that's 100% true or whether my implementation is off.
Any help would be greatly appreciated.
UPDATE
Still not 100% fixed yet, but we are getting there.
These variables must be set on the request:
[request setCachePolicy:ASIOnlyLoadIfNotCachedCachePolicy];
[request setCacheStoragePolicy:ASICachePermanentlyCacheStoragePolicy];
The first tells the cache to only ping the server for images if they are not cached. The second tells the cache to store the images permanently for the lifecycle of the app.
This got me part-way there. Some images loaded instantly whereas others seemed sluggish. I did more research and added these lines of code in my app delegate:
[[ASIDownloadCache sharedCache] setShouldRespectCacheControlHeaders:NO];
[ASIHTTPRequest setDefaultCache:[ASIDownloadCache sharedCache]];
The first line is crucial. ASIHTTPRequest's default action is to follow what the web server tells you. So, if it tells you in its headers not to cache a response, it won't. This line overrides that. Now, this worked really well on my iOS simulator. Then, I tried it on my 4S and it was slow again.
The reason was that I had set up a block of code in my request that, upon the successful retrieval of an image, saved it to Core Data. This block was being triggered every time, even for cached requests.
This is because ASIHTTPRequest will still make a request regardless of whether or not it uses a cached copy of your data. If it has a cached copy, it won't send out a request to a server, but will populate its request object's response data with cached data instead. So after my block takes care of display work, I call:
if ([request didUseCachedResponse]) return;
This works very well on the 4S and the 4. However, the 3GS is still disastrously slow :-(.
This is because ASIHTTPRequest caches my NSData, not my UIImage, so the image has to be re-created from the data every time the cache is used. This makes sense: ASIHTTPRequest only returns data and doesn't know what you are going to do with it afterwards, so it always runs the completion code you set up when the request finishes. And because my code block converts the image from data, that's the work I pay for on every hit.
A WAY FORWARD
I think to get the performance speedup I want, it's necessary to implement a custom UIImage cache. If we find the file in the cache, we return the UIImage object. If not, then we create a request and save it in the cache on return.
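A minimal sketch of such a cache using NSCache (the class name FolioImageCache is made up; NSCache automatically evicts entries under memory pressure):
@interface FolioImageCache : NSObject
+ (UIImage *)imageForURL:(NSString *)urlString;
+ (void)setImage:(UIImage *)image forURL:(NSString *)urlString;
@end

@implementation FolioImageCache
+ (NSCache *)sharedCache {
    static NSCache *cache = nil;
    static dispatch_once_t once;
    dispatch_once(&once, ^{ cache = [[NSCache alloc] init]; });
    return cache;
}
+ (UIImage *)imageForURL:(NSString *)urlString {
    return [[self sharedCache] objectForKey:urlString];
}
+ (void)setImage:(UIImage *)image forURL:(NSString *)urlString {
    if (image && urlString) [[self sharedCache] setObject:image forKey:urlString];
}
@end
On a hit, return the UIImage directly; on a miss, fire the ASIHTTPRequest and store the decoded image on completion.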
The tip about shouldRasterize is a valid one, though in my experience it decreases the quality of the graphics significantly. Also, since around iOS 4.3 (I can't remember exactly) the cornerRadius call has been enormously improved, so it's been a long time since I last needed to activate rasterizing for this kind of problem.
You could always deactivate cornerRadius and see how it performs.
What I really want to tell you however is that ASIHTTPRequest has built-in cache support, see
http://allseeing-i.com/ASIHTTPRequest/How-to-use#using_a_download_cache
Try it; it worked flawlessly for me. Think about which cache policies fit your needs best. Of course, if you need the images in Core Data you'll have to insert them there, but I wouldn't try to cache them manually out of Core Data.
Oh, and something else: ASIHTTPRequest is no longer under development, so you might want to consider switching to another framework (like AFNetworking). ASIHTTPRequest doesn't support ARC, for example.
Check out this post. I use blocks and GCD for image loading (especially useful in a UITableViewCell). This loads the images on the fly if you put the method call in cellForRowAtIndexPath:, and I get no sluggish UI.
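That pattern looks roughly like this inside tableView:cellForRowAtIndexPath: (a sketch; imageURL stands in for your real URL):
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // The blocking fetch and decode happen off the main thread
    NSData *data = [NSData dataWithContentsOfURL:imageURL];
    UIImage *image = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Back on the main thread: re-fetch the cell, since it may have
        // been reused for another row while the download was in flight
        UITableViewCell *cell = [tableView cellForRowAtIndexPath:indexPath];
        cell.imageView.image = image; // no-op if the cell is no longer visible
    });
});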

Anyway of having captureOutput callback on a Background Thread?

Is there any way to have the OS call back the delegate
- (void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)psampleBuffer fromConnection:(AVCaptureConnection *)pconnection
on a background thread instead of the main thread? The issue is that the time it takes to copy off the data is affecting the UI. This seems to have to be done on the main thread, since the sample buffer seems to be gone if I try to copy it on a background thread. Am I missing something here?
CMFormatDescriptionRef format;
format = CMSampleBufferGetFormatDescription(sampleBuffer);
bufSize = CMSampleBufferGetNumSamples(sampleBuffer);
sampleSize = CMSampleBufferGetSampleSize(sampleBuffer, 0);
sampleLength = CMSampleBufferGetTotalSampleSize(sampleBuffer);
blockbuff = CMSampleBufferGetDataBuffer(sampleBuffer);
// tocopy, _depth, buffInUse, and offset are defined elsewhere in the class
CMBlockBufferCopyDataBytes(blockbuff, 0, tocopy * _depth, buffInUse + (offset * 2));
I use the CVPixelBuffer functions to lock/unlock the buffer and to get image format information. I use memcpy to copy the data (while the buffer is locked) and call performSelectorInBackground: to process the data.
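That pattern looks roughly like this for video frames (a sketch; exact sizes depend on your pixel format):
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
void *src = CVPixelBufferGetBaseAddress(pixelBuffer);
void *copy = malloc(bytesPerRow * height); // caller frees after processing
memcpy(copy, src, bytesPerRow * height);
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
// hand `copy` off to performSelectorInBackground: or a GCD queue for processing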
For a proper answer: this is the call that determines which thread the callback arrives on:
[audioOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
You just need to pass a different queue.
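For example (the queue label is arbitrary; under manual retain/release the dispatch_release is safe here because the capture output retains the queue itself):
// Deliver sample buffers on a private serial queue instead of the main queue
dispatch_queue_t captureQueue = dispatch_queue_create("com.example.capture", NULL);
[audioOutput setSampleBufferDelegate:self queue:captureQueue];
dispatch_release(captureQueue);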

When are a method's GUI operations actually carried out?

I am working on a web-services data processing app and I am trying to make the app run as quickly as possible. When a certain 3-finger pan gesture is performed, I call a method that sends updated information off to the server to get a new batch of images to update the existing ones with.
So let's say there are 15 images in an array. I filter through them with a 2-finger gesture, and then if I want to change something about them, I do the 3-finger gesture and get that same set back, just tweaked a bit (contrast/brightness, etc.).
What I want, though, is to update the imageView that displays the images as soon as the first image has been retrieved, to give the user a feel for what the rest of the series will look like. But no matter what I try, and no matter how many different threads I implement, I can't get the imageView to update before the entire download is complete. Once the batch download is done (which is handled on a separate thread), the imageView updates with the new images and everything is great.
The first step in the process is this:
if (UIGestureRecognizerStateEnded == [recognize state]) {
    [self preDownload:windowCounter Level:levelCounter ForPane:tagNumber]; // gets the first image and tries to set it on the imageView
    [self downloadAllImagesWithWL:windowCounter Level:levelCounter ForPane:tagNumber]; // gets all the rest of the images
}
This is my preDownload method:
-(void)preDownload:(int)window Level:(int)level ForPane:(int)pane {
    int guidIndex = [[globalGuids objectAtIndex:pane] intValue];
    UIImage *img = [DATA_CONNECTION getImageWithSeriesGUID:[guids objectAtIndex:guidIndex] ImageID:counter Window:window Level:level];
    if (pane == 0) {
        NSLog(@"0");
        [imageView3 setImage:img];
    } else if (pane == 1) {
        NSLog(@"1");
        [imageView31 setImage:img];
    } else if (pane == 2) {
        NSLog(@"2");
        [imageView32 setImage:img];
    } else if (pane == 3) {
        NSLog(@"3");
        [imageView33 setImage:img];
    }
}
By separating this into two different methods (no threads are involved at this point; these methods are called before all that), I was thinking that after the preDownload method completed the imageView would update, and control would then continue down into the downloadAllImagesWithWL method. But that doesn't appear to be the case.
Am I missing something simple here? What can I do to update my GUI elements before that second method is done running?
You are right. However, the view won't refresh until your code returns to the run loop. You can do two things:
Make your downloadAllImagesWithWL method asynchronous, so it returns immediately after you call it; your main thread then reaches the run loop, the GUI updates, and the download method reports back to your logic through a callback when it's done (see the sketch below).
OR
A simpler, hackier (and bad) solution would be to run the run loop for some time before you call your download method, something like: [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.1]]; This runs the run loop for 0.1 seconds.
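A sketch of the first option using GCD, keeping the method name from the question (the completion callback is implicit in the main-queue block):
- (void)downloadAllImagesWithWL:(int)window Level:(int)level ForPane:(int)pane {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // fetch the remaining images here, off the main thread
        dispatch_async(dispatch_get_main_queue(), ^{
            // back on the main thread: update the image views (UIKit must be driven from here)
        });
    });
}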
When the image is set, the image view will mark itself as needing display, but the actual drawing won't occur until the beginning of the next run-loop pass. On OS X you can use -display to draw the view immediately, but I don't think Apple provides a public method for that on iOS. However, if the next method simply spawns the background thread, it will return quickly, and the display update will probably occur before the thread finishes.

iPhone : Best way to detect the end of UIImageView image sequence animation

We know that UIImageView has very nice support for image-sequence animation. We can easily create an array of UIImage objects, set the animationImages property, configure the animation duration, repeat count, etc., and then just fire it off. But there seems to be no way to know when this animation has ended.
Say I have ten images and I want to run an animation (repeat count = 1) with them. And when the animation is over, I want to run some other code. What is the best way to know that the animation has ended?
I already understand that I can create an NSTimer and schedule it to fire after the animation duration, but you really cannot rely on a timer if you need good precision.
So my question is: is there any better way to know that a UIImageView image-sequence animation has ended, without using a timer?
The code is something like this:
myImageView.animationImages = images; // images is an NSArray of UIImages
myImageView.animationDuration = 2.0;
myImageView.animationRepeatCount = 1;
[myImageView startAnimating];
The isAnimating property on UIImageView should go to NO when it's done animating. It's not a formal property, though, so you can't set up observation on it. You can poll it on a fine-grained timer (like CADisplayLink's).
There's no "animation completed" delegate for this, if that's the sort of thing you're looking for. The timing can be variable based on loading delay of the images, etc, and no, there's no sure-fire way to know precisely when it's done.
The image animation stuff on UIImageView is a convenience, and not heavyweight enough to do serious animation work with. Consider rolling your own if you need that kind of precision.
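A sketch of that polling idea with CADisplayLink (requires QuartzCore; watchAnimation and animationDidFinish are made-up names):
- (void)watchAnimation {
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self selector:@selector(checkAnimation:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)checkAnimation:(CADisplayLink *)link {
    if (![myImageView isAnimating]) {
        [link invalidate];          // stop polling
        [self animationDidFinish];  // hypothetical completion hook
    }
}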
I made this (a method of my UIImageView subclass):
// Assumes: typedef void (^UIImageViewAnimationCompletionBlock)(void);
- (void)startAnimatingWithCallback:(UIImageViewAnimationCompletionBlock)completionCallback
{
    [self startAnimating];
    dispatch_queue_t animatingQueue = dispatch_get_current_queue(); // the queue to call back on (deprecated as of iOS 6)
    dispatch_queue_t pollingQueue = dispatch_queue_create("pollingQueue", NULL);
    dispatch_async(pollingQueue, ^{
        while (self.isAnimating) { usleep(10000); } // poll roughly every 10 ms
        dispatch_async(animatingQueue, ^{ if (completionCallback) completionCallback(); });
    });
}
Simple usage:
[self.oneView fadeOut];
[self.otherView startAnimatingWithCallback:^
{
[self.oneView fadeIn];
}];
Furthermore, I would recommend setting the UIImageView's default image (the image property) to the first frame of the animation, and changing it to the last frame just after launching the animation (the startAnimating method). This way we avoid the ugly flicker that can occur when the animation has finished but the callback has not yet been invoked.