Passing image data from UIImagePickerController off for background processing - iPhone

I'm using a UIImagePickerController to capture a still image. I then need to do some processing work before saving different copies of the image to a Core Data store. The processing and saving work can take 4-8 seconds on an iPhone 4, so I'm trying to branch the work off to a background queue so the whole app and UI don't block.
At the root of my question is this: is it possible to use a UIImage on a background thread so long as the UIImage object is totally confined to that thread? I found the following in Apple's thread safety summary about NSImage. I'm assuming UIImage would work the same way.
NSImage Restrictions:
One thread can create an NSImage object, draw to the image buffer, and pass it off to the main thread for drawing. The underlying image cache is shared among all threads. For more information about images and how caching works, see Cocoa Drawing Guide.
Can anyone confirm this, or is it just plain wrong to touch something like a UIImage object outside of the main thread? If using a confined UIImage instance is okay, then that leads to another issue. The UIImagePickerController returns an NSDictionary, which is thread safe, but within that NSDictionary is a UIImage object. Is it safe to pass that dictionary off to another thread and then use the contained UIImage within that thread?
If the UIImage in the imagePicker info dictionary is not safe to use then any suggestions on how best to proceed?
I think I have the actual Core Data threading issues figured out. But for reference, I currently write and retrieve the image data using an NSValueTransformer that transforms a UIImage to and from NSData within a custom NSManagedObject subclass.
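For context, here is a minimal sketch of that kind of transformer, assuming PNG encoding is acceptable (the class name UIImageToDataTransformer is just an illustrative choice, not anything Core Data mandates):

#import <UIKit/UIKit.h>

@interface UIImageToDataTransformer : NSValueTransformer
@end

@implementation UIImageToDataTransformer

+ (Class)transformedValueClass {
    return [NSData class];
}

+ (BOOL)allowsReverseTransformation {
    return YES;
}

// UIImage -> NSData, used when Core Data saves the attribute
- (id)transformedValue:(id)value {
    if (value == nil) return nil;
    return UIImagePNGRepresentation(value);
}

// NSData -> UIImage, used when the attribute is read back
- (id)reverseTransformedValue:(id)value {
    if (value == nil) return nil;
    return [UIImage imageWithData:value];
}

@end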

I worked on some "Process an image after UIImagePickerController" code in a background thread just as you are and had no problems. Below are the steps I had success with.
Image Picker Delegate returns user image.
Assign image to the background thread class (In my case it was a #property retain'ed IVAR)
close/dismiss Image Picker
Background processing continues and creates a new UIImage after manipulating the original
new UIImage is returned to appropriate VC
Background process ends.
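A rough sketch of that hand-off with GCD (the processImage: and backgroundProcessingDidFinishWithImage: method names and the backgroundImage property are illustrative, not part of any framework):

// UIImagePickerControllerDelegate callback
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *original = [info objectForKey:UIImagePickerControllerOriginalImage];
    self.backgroundImage = original;                 // retained property, step 2
    [self dismissModalViewControllerAnimated:YES];   // step 3

    // Steps 4-6: the image is only touched on the background queue from here on
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        UIImage *processed = [self processImage:self.backgroundImage];   // your own processing
        dispatch_async(dispatch_get_main_queue(), ^{
            [self backgroundProcessingDidFinishWithImage:processed];     // back on the main thread
        });
    });
}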

Related

How to notify the background thread from the main thread when there is something that needs to be processed

I am working on an app that does image processing and displays the resulting image. I'm using a UIScrollView to let the user scroll through all the images. Because the images are not standard JPGs or PNGs, they take time to load, so I want to use a thread to load the images and then update the views.
For now, I use a timer in the background thread to check whether there are any images that need to be loaded, but it is not working very well. I want to know whether there is a way to notify the background thread from the main thread when there are images that need to be loaded, or any other suggestions?
Thanks in advance.
Provide a method in the class that controls the scroll view; let's call it 'processImage:'. In your background thread, when you have an image, send it to the UI class as follows:
dispatch_async(dispatch_get_main_queue(), ^{ [uiClass processImage:theImage]; });
The background object should keep a weak reference to the uiClass (which is a delegate in this example). The idea is to do the image processing in the background, but provide it to the UI class on the main thread.
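One way to drop the polling timer entirely is to give the loader its own serial dispatch queue and push work onto it from the main thread whenever an image is needed. A rough sketch, where the queue label, loadImageAtURL:, and the uiClass delegate property are all illustrative names:

// A serial queue owned by the loader (create it once, e.g. in init)
_loaderQueue = dispatch_queue_create("com.example.imageloader", NULL);

// Called on the main thread whenever the scroll view needs another image
- (void)enqueueImageLoadForURL:(NSURL *)url {
    dispatch_async(_loaderQueue, ^{
        UIImage *image = [self loadImageAtURL:url];   // slow, custom decoding off the main thread
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.uiClass processImage:image];        // deliver to the UI class on the main thread
        });
    });
}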

Asynchronously loading images on the iPhone

I have a grid of images that are loaded from the web, but there's a bit of lag/stutter while scrolling. I'm using an asynchronous ASIHTTPRequest to make the requests, so the download itself is happening in a separate thread, but because UIKit isn't thread-safe, once I receive the NSData response, I have to call UIImage initWithData on the main thread.
Profiling shows that, by far, the bottleneck consists of the internal PNG parsing functions invoked by UIImage initWithData. I'm interested in doing this in a background thread, so the main UI remains responsive and there's less lag.
But I'm not sure exactly how to do this. It sounds like the right direction is to use CGImageRef, since Core Graphics is thread-safe, but I only see CGImageCreateWithPNGDataProvider and CGImageCreateWithJPEGDataProvider, whereas UIImage initWithData supports a large list of image types.
I want something that has the same functionality as UIImage initWithData but doesn't have the thread-safety issues.
You can call UIImage initWithData in a background thread safely. As a good rule of thumb, what you can't do in a background thread is alter user interface elements. In this case, you should not be setting the image property of a UIImageView, or adding a UIImageView to a superview in a background thread.
However, creating a UIImage instance is safe, and will work with no problems.
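A minimal sketch of that split, assuming responseData is the NSData already delivered by the ASIHTTPRequest callback and self.imageView is the view to update (both names are placeholders):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Safe off the main thread: building the UIImage (and paying the PNG/JPEG decode cost)
    UIImage *image = [[UIImage alloc] initWithData:responseData];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Not safe off the main thread: touching UIKit views
        self.imageView.image = image;
        [image release];   // omit this line under ARC
    });
});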

iPhone - single UIImage, multiple UIImageViews?

I have a single image that needs to be placed on multiple UIImageView.
I am wondering whether there is a way to save the memory in doing this?
For example, I have a picture file. I can create one UIImage object from this file. If I create multiple UIImageViews and init them with the single UIImage object, will that save memory?
thanks
Yes, it will save memory compared to one UIImage per UIImageView. How significant will it be? To find out, you'll have to run it both ways and watch the memory allocations with Instruments.
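A quick sketch of the shared-image approach (the file name and frames are placeholders):

UIImage *sharedImage = [UIImage imageNamed:@"pic.png"];   // one backing image

// Several image views all referencing the same UIImage; the pixel data is not duplicated
UIImageView *firstView  = [[UIImageView alloc] initWithImage:sharedImage];
UIImageView *secondView = [[UIImageView alloc] initWithImage:sharedImage];
firstView.frame  = CGRectMake(0,   0, 100, 100);
secondView.frame = CGRectMake(0, 110, 100, 100);
[self.view addSubview:firstView];
[self.view addSubview:secondView];
[firstView release];    // omit the releases under ARC
[secondView release];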

Pointer to Pointer to UIImage created by NSURLConnection

Hey all, here's the deal...
I've got a UIImage that is being loaded in the background via an NSURLConnection within a subclassed UIImageView. Until the data finishes downloading, a placeholder image is shown. The UIImage needs to be used by another, separate UIImageView, as well.
The problem I'm having is that if I set the second UIImageView's image property to that of the subclassed object before the download is complete, the second UIImageView never displays the downloaded image, since its image property is pointing to the placeholder.
Is there any way to pass a pointer to a pointer between these two UIImageViews?
I've tried things like this:
imageView2.image = &[imageView1 image];
imageView2.image = *[imageView1 image];
But these don't seem to work. Any ideas?
Those pointers look particularly gross. The problem with doing it that way is that, even though you might be updating the data pointed to by those pointers, you are not notifying your UIImageView subclasses that the data has changed, and thus they don't know to redraw.
Instead of playing with that pointer mess, why not use Key-Value Observing instead? If you have a separate thread running that downloads the UIImage, you can just tell your UIImageViews to observe that property and when your downloader class is finished downloading all the data and has put it into the property, the views will get a notification and can then display the image.
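A rough sketch of the KVO approach, assuming the downloader exposes its result through a property named downloadedImage (the downloader and property names are illustrative):

// In the view controller, after creating the downloader:
[self.downloader addObserver:self
                  forKeyPath:@"downloadedImage"
                     options:NSKeyValueObservingOptionNew
                     context:NULL];

// KVO callback: fired when the downloader sets its downloadedImage property
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if ([keyPath isEqualToString:@"downloadedImage"]) {
        UIImage *image = [change objectForKey:NSKeyValueChangeNewKey];
        imageView1.image = image;   // both views now show the real image
        imageView2.image = image;   // (dispatch to the main queue here if the setter ran on a background thread)
    }
}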
Can't you just catch the connectionDidFinishLoading: message from the NSURLConnection and then set the second image?

iPhone - save UIImageView state

I have created a subclass of UIImageView and I am handling the touches for its objects inside the subclass itself.
Now when the user is about to exit the app I want to save the state of the images. And as there are multiple transforms which might have taken place on the images (I am saving all the transforms in an array), I want to be able to save these objects in applicationWillTerminate.
Is there any way I can save these objects? Or do I have to save everything individually? If I do, how do I save all the transformations which have happened on the image view objects till the user exits?
Thanks.
UIImageView conforms to NSCoding. You will need to implement encodeWithCoder: in your UIImageView subclass to allow it to be serialised. You use the passed encoder to serialise the important member variables, in this case your images and transforms, then call the superclass's encodeWithCoder:.
You do the inverse of this in initWithCoder: and you've got your original state back.
If your images are instances of UIImage you'll have a bit more work to do since it doesn't conform to NSCoding. You might be able to get away with converting the image data to an NSData object and encoding that depending on your requirements. Here's an example of how this can be done: http://www.nixwire.com/getting-uiimage-to-work-with-nscoding-encodewithcoder/
Have a look at the Archives and Serializations Programming Guide for Cocoa: http://developer.apple.com/iphone/library/documentation/Cocoa/Conceptual/Archiving/Archiving.html
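A minimal sketch of that pattern for the UIImageView subclass, assuming the transforms are kept in an NSArray property called appliedTransforms (an illustrative name) holding NSValue-wrapped CGAffineTransforms:

- (void)encodeWithCoder:(NSCoder *)coder {
    [super encodeWithCoder:coder];   // archives the inherited UIView/UIImageView state
    // UIImage doesn't conform to NSCoding, so archive its PNG bytes instead
    [coder encodeObject:UIImagePNGRepresentation(self.image) forKey:@"imageData"];
    [coder encodeObject:self.appliedTransforms forKey:@"appliedTransforms"];
}

- (id)initWithCoder:(NSCoder *)coder {
    if ((self = [super initWithCoder:coder])) {
        NSData *imageData = [coder decodeObjectForKey:@"imageData"];
        if (imageData) {
            self.image = [UIImage imageWithData:imageData];
        }
        self.appliedTransforms = [coder decodeObjectForKey:@"appliedTransforms"];
    }
    return self;
}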