UIImage within Thread not being Released / Overwritten - iPhone

This appears to be the classic method for scanning images from the iPhone. I have a thread that is dispatched from the main thread to go and scan for codes. It essentially creates a new UIImage each time and then removes it.
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
{
    while (![thread isCancelled]) {
#ifdef DEBUG
        NSLog(@"Decoding Loop");
#endif
        // [self performSelectorOnMainThread:@selector(updateImageBuffer) withObject:nil waitUntilDone:YES];
        CGImageRef cgScreen = UIGetScreenImage();
        UIImage *uiimage = [UIImage imageWithCGImage:cgScreen];
        if (uiimage) {
            CGSize size = [uiimage size];
            CGRect cropRect = CGRectMake(0.0, 80.0, size.width, 360); // Crop to centre of the screen - makes it more robust
#ifdef DEBUG
            NSLog(@"picked image size = (%f, %f)", size.width, size.height);
#endif
            [decoder decodeImage:uiimage cropRect:cropRect];
        }
        [uiimage release];
        CGImageRelease(cgScreen);
    }
}
[pool release];
The problem is that the [pool release] causes an EXC_BAD_ACCESS (that old classic) and the program bombs. I'm told that there is no need to call [uiimage release] as I haven't explicitly allocated a UIImage, but this doesn't seem to be the case: if I take that line out, memory usage goes through the roof and the program quits due to lack of memory. It appears I can't have this work the way I'd like.
Is there a way to create a UIImage "in-place"? I.e., have a buffer that is written to again and again as a UIImage? I suspect that would work?
Update!
Tried executing the UIKit-related calls on the main thread as follows:
-(void)performDecode:(id)arg{
    // Perform the decoding in a separate thread. This should, in theory, bounce back with a
    // decoded or not decoded message. We can quit at the end of this thread.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    {
        while (![thread isCancelled]) {
#ifdef DEBUG
            NSLog(@"Decoding Loop");
#endif
            [self performSelectorOnMainThread:@selector(updateImageBuffer) withObject:nil waitUntilDone:YES];
            if (uiimage) {
                CGSize size = [uiimage size];
                CGRect cropRect = CGRectMake(0.0, 80.0, 320, 360); // Crop to centre of the screen - makes it more robust
#ifdef DEBUG
                NSLog(@"picked image size = (%f, %f)", size.width, size.height);
#endif
                [decoder decodeImage:uiimage cropRect:cropRect];
            }
        }
    }
    [pool drain];
#ifdef DEBUG
    NSLog(@"finished decoding.");
#endif
}

-(void) updateImageBuffer {
    CGImageRef cgScreen = UIGetScreenImage();
    uiimage = [UIImage imageWithCGImage:cgScreen];
    //[uiimage release];
    CGImageRelease(cgScreen);
}
No joy, however, as EXC_BAD_ACCESS rears its ugly head when one tries to grab the size of the UIImage.

As has been stated by others, you should not release the UIImage returned from imageWithCGImage:. It is autoreleased. When your pool drains, it tries sending a release message to your already-released image objects, leading to your crash.
The reason your memory usage keeps climbing is that you only drain the autorelease pool outside of the loop. Your autoreleased objects keep accumulating inside the loop. (By the way, you need to release your autorelease pool at the end of that method, because it is currently being leaked.) To prevent this accumulation, you could drain the pool at regular intervals within the loop.
However, I'd suggest switching to [[UIImage alloc] initWithCGImage:cgScreen] and then releasing the image when done. I try to avoid using autoreleased objects wherever I can within iPhone applications in order to have tighter control over memory usage and overall better performance.
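A minimal sketch of how the loop might look with an explicitly owned image and a per-iteration pool (variable names follow the question; the exact pool placement is an assumption, not something from the original answer):
while (![thread isCancelled]) {
    // One short-lived pool per pass so autoreleased temporaries don't accumulate.
    NSAutoreleasePool *innerPool = [[NSAutoreleasePool alloc] init];

    CGImageRef cgScreen = UIGetScreenImage();
    UIImage *uiimage = [[UIImage alloc] initWithCGImage:cgScreen];
    CGImageRelease(cgScreen);

    if (uiimage) {
        CGRect cropRect = CGRectMake(0.0, 80.0, [uiimage size].width, 360.0);
        [decoder decodeImage:uiimage cropRect:cropRect];
    }
    // We own this image because it came from alloc/init, so releasing it here is correct.
    [uiimage release];

    [innerPool release];
}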

UIGetScreenImage() is private and undocumented, so you flat-out cannot use it. That said, nothing about it suggests that you now own the returned CGImageRef cgScreen, so why do you release it? You also have no way of knowing whether it is thread-safe, and so should assume it isn't. You then go on to release the UIImage *uiimage, which you did not init, retain or copy, so again - you don't own it. Review the docs.

[uiimage release] is definitely wrong in this context. Also, Apple stresses that all UIKit methods must be executed on the main thread. That includes UIGetScreenImage() and +[UIImage imageWithCGImage:].
Edit: So you get an exception when calling -[UIImage size] on the wrong thread. This probably shouldn't surprise you because it is not permitted.
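One way to sidestep that exception, sketched below on the assumption that an extra CGSize ivar (here called imageSize, which is not in the original code) is acceptable, is to read the size inside updateImageBuffer on the main thread and let the worker thread use only that plain value:
// Hypothetical sketch: do all the UIKit work on the main thread and store a plain
// C value for the worker thread to read, so -size is never called off the main thread.
-(void) updateImageBuffer {
    CGImageRef cgScreen = UIGetScreenImage();
    uiimage = [UIImage imageWithCGImage:cgScreen];
    imageSize = [uiimage size];   // assumed CGSize ivar, captured here on the main thread
    CGImageRelease(cgScreen);
}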

UIImage *uiimage = [[UIImage alloc] initWithCGImage: cgScreen];
Explicitly stating that I know best when to release the object seemed to work. Virtual memory still increases, but physical memory now stays constant. Thanks for pointing out the UIKit thread-safety issues, though. That is a point I'd missed, but it seems not to affect the running at this point.
Also, I should point out, Red Laser and Quickmark both use this method of scanning camera information ;)

Related

Lag when scrolling while loading a full-screen image from an asset in a background thread

Here is the thing:
I have a scroll view that lazily loads the full-screen image of the user's photo:
[self.assetsLibrary assetForURL:[NSURL URLWithString:[[self.assets objectAtIndex:index] objectForKey:@"asset_url"]]
                    resultBlock:^(ALAsset *asset) {
                        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                            CGImageRef cgImage = asset.defaultRepresentation.fullScreenImage;
                            UIImage *image = [UIImage imageWithCGImage:cgImage];
                            dispatch_async(dispatch_get_main_queue(), ^{
                                imageView.image = image;
                            });
                        });
                    }
                   failureBlock:^(NSError *error) {
                        NSLog(@"error");
                   }];
I know it is expensive to load a full-screen image, so I put it into a background thread, but it still lags when I scroll. It still lags even if I change it like this:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    CGImageRef cgImage = asset.defaultRepresentation.fullScreenImage;
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    imageView.image = image;
    dispatch_async(dispatch_get_main_queue(), ^{
    });
});
Obviously there is nothing to do in the main queue, but it still lags until I comment out the line:
// CGImageRef cgImage = asset.defaultRepresentation.fullScreenImage;
So I am confused: is there something wrong with how I used GCD?
Can somebody explain it to me? Anything will be helpful.
Thank you, guys.
UPDATE
To @Fogmeister: The size of the photo is the full-screen size; the actual imageView size is around half of that. Even if I comment out the line "imageView.image = image;" it still lags, which means the lag is not from the resizing. I know where the time is being taken: "asset.defaultRepresentation.fullScreenImage;". When I comment that out, everything is fine; there is no more lag.
So, what I don't understand is, I've already put it in the background thread...
OK, I finally solved the problem:
Instead of getting the image directly with
asset.defaultRepresentation.fullScreenImage
I use the method from Apple's example PhotosByLocation (code below) to get the image in the background thread. That works great; there is no more lag when scrolling. But I am still confused, because I don't know exactly why. So I would appreciate it if someone could explain it to me.
- (UIImage *)fullSizeImageForAssetRepresentation:(ALAssetRepresentation *)assetRepresentation {
    UIImage *result = nil;
    NSData *data = nil;

    uint8_t *buffer = (uint8_t *)malloc(sizeof(uint8_t) * [assetRepresentation size]);
    if (buffer != NULL) {
        NSError *error = nil;
        NSUInteger bytesRead = [assetRepresentation getBytes:buffer fromOffset:0 length:[assetRepresentation size] error:&error];
        data = [NSData dataWithBytes:buffer length:bytesRead];
        free(buffer);
    }

    if ([data length]) {
        CGImageSourceRef sourceRef = CGImageSourceCreateWithData((__bridge CFDataRef)data, nil);

        NSMutableDictionary *options = [NSMutableDictionary dictionary];
        [options setObject:(id)kCFBooleanTrue forKey:(id)kCGImageSourceShouldAllowFloat];
        [options setObject:(id)kCFBooleanTrue forKey:(id)kCGImageSourceCreateThumbnailFromImageAlways];
        [options setObject:(id)[NSNumber numberWithFloat:640.0f] forKey:(id)kCGImageSourceThumbnailMaxPixelSize];
        //[options setObject:(id)kCFBooleanTrue forKey:(id)kCGImageSourceCreateThumbnailWithTransform];

        CGImageRef imageRef = CGImageSourceCreateThumbnailAtIndex(sourceRef, 0, (__bridge CFDictionaryRef)options);
        if (imageRef) {
            result = [UIImage imageWithCGImage:imageRef scale:[assetRepresentation scale] orientation:(UIImageOrientation)[assetRepresentation orientation]];
            CGImageRelease(imageRef);
        }

        if (sourceRef) CFRelease(sourceRef);
    }

    return result;
}
Your solution taken from Apple's PhotosByLocation is actually grabbing the biggest-resolution image, not the full-screen image. IOW, it's essentially the same as calling fullResolutionImage instead of fullScreenImage. How that fixes your problem, I'm not sure. I'm struggling with the same performance issue. If I use fullScreenImage, I get lags in my scrolling. But switching to fullResolutionImage gets rid of the lags. fullResolutionImage takes about twice as long as fullScreenImage, but since this is always in the background, it shouldn't really matter how much time it takes. I suspect that fullScreenImage is returning an image that needs some sort of additional processing once it gets rendered to the screen on the main thread - hence the lag.
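If switching representations is worth trying, the change to the question's original block is small; a rough sketch (imageView and asset as in the question, everything else unchanged):
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // fullResolutionImage returns the unadjusted CGImage, so pass the representation's
    // scale and orientation along when wrapping it in a UIImage.
    ALAssetRepresentation *rep = asset.defaultRepresentation;
    CGImageRef cgImage = rep.fullResolutionImage;
    UIImage *image = [UIImage imageWithCGImage:cgImage
                                         scale:[rep scale]
                                   orientation:(UIImageOrientation)[rep orientation]];
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = image;
    });
});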
Do you know the actual size of the photo? What is very expensive is scrolling images that are being resized to fit the screen.
Seeing as you're already loading in a BG thread it might be worth resizing the image to the size you are displaying it at before sticking it on the screen.
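A rough sketch of that idea, assuming a made-up target size (in practice you would use the image view's bounds, captured before dispatching) and iOS 4 or later, where the UIGraphics image-context functions are documented as safe to call off the main thread:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    CGImageRef cgImage = asset.defaultRepresentation.fullScreenImage;
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    // Assumed target size for illustration; resize off the main thread, then hand
    // the already-sized image to the main queue.
    CGSize targetSize = CGSizeMake(160.0f, 240.0f);
    UIGraphicsBeginImageContextWithOptions(targetSize, YES, 0.0f);
    [image drawInRect:CGRectMake(0.0f, 0.0f, targetSize.width, targetSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = resized;
    });
});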
You can see where the time is being taken by using the CoreAnimation tool in Instruments by profiling the app from Xcode. It will even tell you which line of code is causing the slow down and missed animation frames.
From the apple documentation:
DISPATCH_QUEUE_PRIORITY_DEFAULT
Items dispatched to the queue run at the default priority; the queue is scheduled for execution after all high-priority queues have been scheduled, but before any low-priority queues have been scheduled.
DISPATCH_QUEUE_PRIORITY_BACKGROUND
Items dispatched to the queue run at background priority; the queue is scheduled for execution after all high-priority queues have been scheduled, and the system runs items on a thread whose priority is set for background status. Such a thread has the lowest priority, and any disk I/O is throttled to minimize the impact on the system.
You're running it in a separate thread, but that's not necessarily a thread "in the background." A background thread loading something in my experience will be completely blocked by doing a UI update such as scrolling a UIScrollView. Have you tried using DISPATCH_QUEUE_PRIORITY_BACKGROUND?
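For reference, the only change would be the queue identifier; a minimal sketch of the same block on the background-priority queue:
// Same work as before, but on the background-priority global queue so it defers to UI activity.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
    CGImageRef cgImage = asset.defaultRepresentation.fullScreenImage;
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = image;
    });
});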

Asynchronous texture loading iPhone OpenGL ES 2

I'm creating and loading a lot of textures (made of strings). To keep the animation running smoothly, I offload the work to a separate worker thread. It seems to work more or less exactly the way I want, but on older devices (iPhone 3GS) I sometimes notice a long (1 sec) lag. It only occurs sometimes. Now I'm wondering if I'm doing this correctly or if there is any conceptual issue. I paste the source code below.
I should also mention that I do not want to use the GLKit TextureLoader because I also want to offload the texture generating work to the other thread, not just the loading part.
In case you're wondering what I need these textures for, have a look at this video: http://youtu.be/U03p4ZhLjvY?hd=1
NSLock* _textureLock;
NSMutableDictionary* _texturesWithString;
NSMutableArray* _texturesWithStringLoading;

// This is called when I request a new texture from the drawing routine.
// If this function returns 0, it means the texture is not ready and I'm not displaying it.
-(unsigned int)getTextureWithString:(NSString*)string {
    Texture2D* _texture = [_texturesWithString objectForKey:string];
    if (_texture == nil) {
        if (![_texturesWithStringLoading containsObject:string]) {
            [_texturesWithStringLoading addObject:string];
            NSDictionary* dic = [[NSDictionary alloc] initWithObjectsAndKeys:string, @"string", nil];
            NSThread* thread = [[NSThread alloc] initWithTarget:self selector:@selector(loadTextureWithOptions:) object:dic];
            thread.threadPriority = 0.01;
            [thread start];
            [thread release];
        }
        return 0;
    }
    return _texture.name;
}
// This is executed on a separate worker thread.
// The lock makes sure that there are not hundreds of separate threads all creating a texture at the same time and therefore slowing everything down.
// There must be a smarter way of doing that. Please let me know if you know how! ;-)
-(void)loadTextureWithOptions:(NSDictionary*)_dic{
    [_textureLock lock];

    EAGLContext* context = [[SharegroupManager defaultSharegroup] getNewContext];
    [EAGLContext setCurrentContext:context];

    NSString* string = [_dic objectForKey:@"string"];
    Texture2D* _texture = [[Texture2D alloc] initWithStringModified:string];
    if (_texture != nil) {
        NSDictionary* _newdic = [[NSDictionary alloc] initWithObjectsAndKeys:_texture, @"texture", string, @"string", nil];
        [self performSelectorOnMainThread:@selector(doneLoadingTextureWithDictionary:) withObject:_newdic waitUntilDone:NO];
        [_newdic release];
        [_texture release];
    }

    [EAGLContext setCurrentContext:nil];
    [context release];

    [_textureLock unlock];
}
// This callback is executed on the main thread and adds the texture to the texture cache.
-(void)doneLoadingTextureWithDictionary:(NSDictionary*)_dic{
    [_texturesWithString setValue:[_dic objectForKey:@"texture"] forKey:[_dic objectForKey:@"string"]];
    [_texturesWithStringLoading removeObject:[_dic objectForKey:@"string"]];
}
The problem was that too many threads were started at the same time. Now I am using an NSOperationQueue rather than NSThreads. That allows me to set maxConcurrentOperationCount and run only one extra background thread that does the texture loading.
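A minimal sketch of that arrangement, assuming the existing loadTextureWithOptions: method is reused as the operation body (the queue setup and its placement are assumptions, not the poster's exact code):
// Created once, e.g. in init: a serial queue so only one texture is built at a time.
NSOperationQueue *_textureQueue = [[NSOperationQueue alloc] init];
[_textureQueue setMaxConcurrentOperationCount:1];

// Instead of spawning a new NSThread per string, enqueue an operation.
NSDictionary *dic = [[NSDictionary alloc] initWithObjectsAndKeys:string, @"string", nil];
NSInvocationOperation *op =
    [[NSInvocationOperation alloc] initWithTarget:self
                                         selector:@selector(loadTextureWithOptions:)
                                           object:dic];
[_textureQueue addOperation:op];
[op release];
[dic release];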

UIImageView.image = mImage leak

I have a thread2 loop where I assemble (create from raw byte data) a UIImage in every iteration of the loop:
thread2loop()
{
    //make UIImage here
    [self performSelectorOnMainThread:@selector(setUiImage) withObject:nil waitUntilDone:YES];
}
and then I call the setUiImage method on the main thread:
- (void) setUiImage
{
    self.imageView.image = nil;
    self.imageView.image = mImage;
    [mImage release];
}
It is working, but Instruments' Leaks tool shows me that there are UIImage leaks here and I do not know how to ##$! get rid of them! (I'm sad, a little tired and bored.) Help, what to do? Thanks.
Surround your threaded code with...
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
//threaded code....
[pool release];
Classic producer/consumer problem. Your producer thread is probably outrunning the main thread (the consumer). I'd recommend keeping a queue of images (instead of the single mImage), guarded by a lock, which you enqueue images onto from your background thread and dequeue images from on your main thread (a rough sketch of that approach appears after the GCD example below). Or you could use GCD, which makes this even easier. Instead of using mImage to hold onto the created image, you could just use a block that retains the image and then sets it on your image view in the main queue. Something like:
thread2loop() {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    while (...) {
        __block id block_self = self; // (don't want to retain self in the block)
        UIImage *img = [[UIImage alloc] initWithCGImage:quartzImage scale:1.0 orientation:UIImageOrientationUp];
        dispatch_async(dispatch_get_main_queue(), ^{
            block_self.imageView.image = img;
            [img release];
        });
    }
    [pool drain]; // release is outdated for autorelease pools
}
Warning: Doing this too much will quickly run the device out of memory and cause your app to be killed. You probably want to make sure that your use of this technique is limited to creating a small number of images.
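For completeness, here is a rough sketch of the first suggestion above (a lock-guarded queue instead of the single mImage). The ivar names imageLock (an NSLock), pendingImages (an NSMutableArray) and the drainPendingImages method are made up for illustration:
// Producer (background thread): append each finished image under the lock.
[imageLock lock];
[pendingImages addObject:newImage];   // the array retains it; newImage is the just-created UIImage
[imageLock unlock];
[newImage release];
[self performSelectorOnMainThread:@selector(drainPendingImages) withObject:nil waitUntilDone:NO];

// Consumer (main thread): take whatever has accumulated and show the latest one.
- (void)drainPendingImages {
    [imageLock lock];
    UIImage *latest = [[pendingImages lastObject] retain];
    [pendingImages removeAllObjects];
    [imageLock unlock];

    self.imageView.image = latest;
    [latest release];
}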

NSAutoreleasePool leaking

I know this must be something simple that I am overlooking, but why is this leaking:
//add a pool for this since it is on its own thread
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

//add the image
NSURL * imageURL = [NSURL URLWithString:[recipeData objectForKey:@"imagePath"]];
NSData * imageData = [NSData dataWithContentsOfURL:imageURL];
UIImage * image = [UIImage imageWithData:imageData];
UIImageView * myImageView = [[UIImageView alloc] initWithImage:image];

//resize to make it fit the screen space
CGRect frame = myImageView.frame;
frame.size.width = 320;
frame.size.height = 357;
myImageView.frame = frame;

[self.view addSubview:myImageView];
[activity stopAnimating];

[pool drain];
[self placeItems];
I get the error:
_NSAutoreleaseNoPool(): Object 0x4e2fdf0 of class NSPathStore2 autoreleased with no pool in place - just leaking
I tried moving the placement of [pool drain] but that did nothing. I see a lot of code that looks just like this while searching Google for a cause.
Thanks for your help.
Draining the pool has the effect of releasing and subsequently deallocating it, since autorelease pools cannot be retained. I suspect there must be some need for an autorelease pool within placeItems (or some other place called after [pool drain]), since at that point the pool is probably gone already.
So, you might want to try commenting out the drain message to see if that will make the leak go away.
A lot of things to say here:
First, you're leaking myImageView. You have to release it after the -addSubview:.
Next, since you're on another thread, your [pool drain] must be at the end.
Last, since you're not on the main thread, you can't perform any UI operation. Try replacing [self.view addSubview:myImageView] with [self.view performSelectorOnMainThread:@selector(addSubview:) withObject:myImageView waitUntilDone:YES]. Same with [activity stopAnimating].
And like Brian said, the -drain message must be at the end of your thread.
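Putting those points together, the thread body might end up looking roughly like this (still using the question's variable names; the main-thread hops use performSelectorOnMainThread: as suggested above, and the exact ordering is a sketch rather than a definitive fix):
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

NSURL *imageURL = [NSURL URLWithString:[recipeData objectForKey:@"imagePath"]];
NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
UIImage *image = [UIImage imageWithData:imageData];
UIImageView *myImageView = [[UIImageView alloc] initWithImage:image];

CGRect frame = myImageView.frame;
frame.size.width = 320;
frame.size.height = 357;
myImageView.frame = frame;

// UIKit work hops back to the main thread.
[self.view performSelectorOnMainThread:@selector(addSubview:)
                            withObject:myImageView
                         waitUntilDone:YES];
[activity performSelectorOnMainThread:@selector(stopAnimating)
                           withObject:nil
                        waitUntilDone:YES];
[myImageView release];   // balance the alloc once the view hierarchy has retained it

[self placeItems];       // anything that autoreleases here still has a pool in place
[pool drain];            // drain last, at the very end of the thread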

Objective C: EXC_BAD_ACCESS when generating a UIImage

I have a view that generates an image based on a series of layers. I have images for the background, for the thumbnail, and finally for an overlay. Together, it makes one cohesive display.
It seems to work a dream, except for when it doesn't. For seemingly no reason, I get an EXC_BAD_ACCESS on the specified line below after it's generated somewhere between 8 and 20 images. I've run it through the memory leak tool and allocation tool, and it's not eating up tons of memory and it's not leaking. I'm totally stumped.
Here's the relevant code:
- (UIImage *)addLayer:(UIImage *)layer toImage:(UIImage *)background atPoint:(CGPoint)point {
    CGSize size = CGSizeMake(240, 240);
    UIGraphicsBeginImageContext(size);
    [background drawAtPoint:CGPointMake(0, 0)]; // <--- error here
    [layer drawAtPoint:point];
    UIImage* result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
// Build the layered image -- thingPage onto thingBackground,
// then the screenshot on that, then the thingTop on top of it all.
// thingBackground, thingPage and thingTop are all preloaded UIImages.
-(UIImage *)getImageForThing:(Thing *)t {
    [self loadImageCacheIfNecessary];
    if (!t.screenshot) {
        return [UIImage imageNamed:@"NoPreview.png"];
    } else {
        UIImage *screenshot = t.screenshot;
        UIImage *currentImage = [self addLayer:thingPage toImage:thingBackground atPoint:CGPointMake(0, 0)];
        currentImage = [self addLayer:screenshot toImage:currentImage atPoint:CGPointMake(39, 59)];
        currentImage = [self addLayer:thingTop toImage:currentImage atPoint:CGPointMake(0, 1)];
        return currentImage;
    }
}
I can't find anywhere that this is going wrong, and I've been tearing my hair out for a couple of hours on this. It's the final known bug in the system, so you can imagine how antsy I am to fix it! :-)
Thanks in advance for any help.
Personally, I always use -(void)drawInRect: instead of -(void)drawAtPoint:
CGRect rtDraw;
rtDraw.origin = CGPointZero;
rtDraw.size = size;
[background drawInRect:rtDraw];
[layer drawInRect:rtDraw];
And one more thing:
The drawing approach using UIGraphicsBeginImageContext(size) and UIGraphicsEndImageContext() is not thread-safe.
Those functions push and pop a context onto a stack that is managed by the system.
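If the composition really is happening off the main thread, one hedged workaround is to force the drawing onto the main thread; a sketch under that assumption (if addLayer:toImage:atPoint: is only ever called on the main thread, this changes nothing):
- (UIImage *)addLayer:(UIImage *)layer toImage:(UIImage *)background atPoint:(CGPoint)point {
    __block UIImage *result = nil;
    void (^compose)(void) = ^{
        UIGraphicsBeginImageContext(CGSizeMake(240, 240));
        [background drawAtPoint:CGPointMake(0, 0)];
        [layer drawAtPoint:point];
        result = [UIGraphicsGetImageFromCurrentImageContext() retain];
        UIGraphicsEndImageContext();
    };
    if ([NSThread isMainThread]) {
        compose();                                        // already on the main thread
    } else {
        dispatch_sync(dispatch_get_main_queue(), compose); // wait for the main thread to draw
    }
    return [result autorelease];
}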
EXC_BAD_ACCESS is almost always due to accessing an object that has already been released. In your code this seems to be t.screenshot. Check creation (and retaining if it is an instance variable) of the object returned by Thing's screenshot property.
As it turns out, the error wasn't in the code I posted, it was in my caching of the thingBackground, thingPage and thingTop images. I wasn't retaining them. Here's the missing code, fixed:
-(void)loadImageCacheIfNecessary {
    if (!thingBackground) {
        thingBackground = [[UIImage imageNamed:@"ThingBack.png"] retain];
    }
    if (!thingPage) {
        thingPage = [[UIImage imageNamed:@"ThingPage.png"] retain];
    }
    if (!thingTop) {
        thingTop = [[UIImage imageNamed:@"ThingTop.png"] retain];
    }
}
I will admit I'm still not comfortable with the whole retain/release/autorelease stuff in Objective C. Hopefully it'll sink in one day soon. :-)