iPhone - save UIImageView state

I have created a subclass of UIImageView and I am handling the touches for its objects inside the subclass itself.
Now when the user is about to exit the app I want to save the state of the images. And as there are multiple transforms which might have taken place on the images (I am saving all the transforms in an array), I want to be able to save these objects in applicationWillTerminate.
Is there any way I can save these objects? Or do I have to save everything individually? If I do, how do I save all the transformations which have happened on the image view objects till the user exits?
Thanks.

UIImageView conforms to NSCoding (via UIView). You will need to implement encodeWithCoder: in your UIImageView subclass to allow it to be serialised. Use the passed encoder to serialise the important member variables, in this case your images, then call the superclass's encodeWithCoder:.
Do the inverse of this in initWithCoder: and you've got your original state back.
If your images are instances of UIImage you'll have a bit more work to do, since UIImage doesn't conform to NSCoding. You may be able to get away with converting the image data to an NSData object and encoding that, depending on your requirements. Here's an example of how this can be done: http://www.nixwire.com/getting-uiimage-to-work-with-nscoding-encodewithcoder/
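For reference, here's a minimal sketch of what that can look like, assuming a hypothetical subclass named SavedImageView; the image is archived as PNG data (since UIImage itself doesn't conform to NSCoding) and the view's current transform goes through UIKit's CGAffineTransform additions to NSCoder:

#import <UIKit/UIKit.h>

@interface SavedImageView : UIImageView
@end

@implementation SavedImageView

- (void)encodeWithCoder:(NSCoder *)coder {
    [super encodeWithCoder:coder];
    // Archive the image as data because UIImage is not NSCoding-compliant.
    [coder encodeObject:UIImagePNGRepresentation(self.image) forKey:@"imageData"];
    // UIKit's NSCoder additions can archive CGAffineTransform directly.
    [coder encodeCGAffineTransform:self.transform forKey:@"transform"];
}

- (id)initWithCoder:(NSCoder *)coder {
    if ((self = [super initWithCoder:coder])) {
        NSData *imageData = [coder decodeObjectForKey:@"imageData"];
        if (imageData) {
            self.image = [UIImage imageWithData:imageData];
        }
        self.transform = [coder decodeCGAffineTransformForKey:@"transform"];
    }
    return self;
}

@end

In applicationWillTerminate you could then archive an array of these views with NSKeyedArchiver's archiveRootObject:toFile: and restore it later with NSKeyedUnarchiver's unarchiveObjectWithFile:.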
Have a look at the Archives and Serializations Programming Guide for Cocoa: http://developer.apple.com/iphone/library/documentation/Cocoa/Conceptual/Archiving/Archiving.html

Related

How does UIImage get constructed when created from an XIB?

I'm trying to do some fanciness with XIBs and that includes wanting to somehow get and store the paths of images loaded from the xib. To do this, I made some categories and did some method swizzling to override all the UIImage constructors to save their path before calling their parent constructor.
But due to Apple's black box BS with all their XIB stuff, absolutely none of the exposed constructors for UIImage seem to get called when I create a UIImageView through [UIViewController initWithNib...].
Does anybody know what function call happens or how they do this? I can't find any information whatsoever that exposes what initWithNib actually does behind the scenes.
Thanks!
EDIT:
If you're in a similar situation, you may try using accessibilityLabel / accessibilityHint, which are automatically populated with the image path. The only issue is that accessibility needs to be enabled, or these values are nil.
I know the objects constructed from a Nib are being unarchived according to the NSCoding protocol; you need to override initWithCoder: in this case.
You could use swizzling to replace UIImageView's initWithCoder: method, then snoop around to see if an image name or path is available in any of the coder's keys. This might be more effective than hacking UIImage itself, since for all we know UIImageView could be using a custom subclass that you don't have access to.
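A hedged sketch of that swizzling idea is below; the category and method names are made up, and the key probed on the coder is a guess rather than documented API, so it may turn up nothing on a given OS version:

#import <objc/runtime.h>
#import <UIKit/UIKit.h>

@interface UIImageView (CoderSnooping)
- (id)snoop_initWithCoder:(NSCoder *)coder;
@end

@implementation UIImageView (CoderSnooping)

+ (void)load {
    // Swap UIImageView's initWithCoder: for our snooping version.
    Method original = class_getInstanceMethod(self, @selector(initWithCoder:));
    Method replacement = class_getInstanceMethod(self, @selector(snoop_initWithCoder:));
    method_exchangeImplementations(original, replacement);
}

- (id)snoop_initWithCoder:(NSCoder *)coder {
    // There's no public way to enumerate the coder's keys, so probe a guess.
    if ([coder containsValueForKey:@"UIImage"]) {
        NSLog(@"coder has an object under 'UIImage': %@", [coder decodeObjectForKey:@"UIImage"]);
    }
    // Implementations were exchanged, so this actually calls the original initWithCoder:.
    return [self snoop_initWithCoder:coder];
}

@end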
The initWith... methods are meant for programmatically creating UIImageView objects.
It sounds like you want to catch things as they are instantiated from XIB files. That would be the parent class UIView's initWithCoder: method.
As the UIView documentation says:
initWithCoder: - Implement this method if you load your view from an Interface Builder nib file and your view requires custom initialization.
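If you do control the class placed in the XIB, a plain override is the simpler route; here's a small sketch (the class name is hypothetical, and it only runs if the view in the XIB is set to this custom class):

@interface LoggingImageView : UIImageView
@end

@implementation LoggingImageView

- (id)initWithCoder:(NSCoder *)coder {
    if ((self = [super initWithCoder:coder])) {
        // Runs right after the view is unarchived from the nib.
        NSLog(@"unarchived %@ with image %@", self, self.image);
    }
    return self;
}

@end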
You are both close to right - I tried doing this with UIImageView, which works for initWithCoder: as you are both suggesting. However, UIImage doesn't get initWithCoder: called for some reason; instead it uses initWithCGImageStored:(CGImageRef)cgImage scale:(CGFloat)scale orientation:(UIImageOrientation)orientation.
If and when I actually get the path out of this as I desire I'll post it up here. Thanks for the help, gents.

Passing Image data from UIImagePickerController off for background processing

I'm using a UIImagePickerController to capture a still image. I then need to do some processing work before saving different copies of the image to a Core Data store. The processing and saving work can take up to 4-8 seconds on an iPhone 4, so I'm trying to branch the work off to a background queue so the whole app and UI don't block.
At the root of my question is this: is it possible to use a UIImage on a background thread as long as the UIImage object is totally confined to that thread? I found the following in Apple's thread safety summary about NSImage. I'm assuming UIImage would work the same way.
NSImage Restrictions:
One thread can create an NSImage object, draw to the image buffer, and pass it off to the main thread for drawing. The underlying image cache is shared among all threads. For more information about images and how caching works, see Cocoa Drawing Guide.
Can anyone confirm this, or is it just plain wrong to touch something like a UIImage object outside of the main thread? If using a confined UIImage instance is okay, then that leads to another issue: the UIImagePickerController returns an NSDictionary, which is thread safe, but within that NSDictionary is a UIImage object. Is it safe to pass that dictionary off to another thread and then use the contained UIImage within that thread?
If the UIImage in the imagePicker info dictionary is not safe to use then any suggestions on how best to proceed?
I think I have the actual Core Data threading issues figured out. But for information I currently write and retrieve the image data using an NSValueTransformer to transform a UIImage to and from NSData within a custom NSManagedObject subclass.
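For reference, that kind of reversible transformer can look roughly like this (the class name here is hypothetical); it would back a transformable attribute on the managed object:

#import <UIKit/UIKit.h>

@interface UIImageToDataTransformer : NSValueTransformer
@end

@implementation UIImageToDataTransformer

+ (Class)transformedValueClass {
    return [NSData class];
}

+ (BOOL)allowsReverseTransformation {
    return YES;
}

- (id)transformedValue:(id)value {
    // UIImage -> NSData when Core Data saves the attribute.
    return UIImagePNGRepresentation(value);
}

- (id)reverseTransformedValue:(id)value {
    // NSData -> UIImage when the attribute is read back.
    return [UIImage imageWithData:value];
}

@end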
I worked on some "process an image after UIImagePickerController" code in a background thread just as you are and had no problems. Below are the steps I had success with; a rough sketch follows them.
1. The image picker delegate returns the user's image.
2. Assign the image to the background-processing class (in my case it was a retained @property ivar).
3. Close/dismiss the image picker.
4. Background processing continues and creates a new UIImage after manipulating the original.
5. The new UIImage is returned to the appropriate view controller.
6. The background process ends.
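For what it's worth, here is a rough GCD sketch of that flow; processImage: and finishedProcessingImage: are hypothetical helpers standing in for your own processing and completion code:

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Pull the UIImage out on the main thread, then hand it off; only one
    // thread touches it at a time.
    UIImage *original = [info objectForKey:UIImagePickerControllerOriginalImage];
    [self dismissModalViewControllerAnimated:YES];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // The 4-8 seconds of work happens off the main thread.
        UIImage *processed = [self processImage:original];
        dispatch_async(dispatch_get_main_queue(), ^{
            // Hand the result (and the Core Data save) back to the main thread.
            [self finishedProcessingImage:processed];
        });
    });
}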

Shared UIImageView for background image throughout app - Singleton Property

I have a UIImage that I use throughout my app as a background image for grouped UITableViews.
I thought that for efficiency I would alloc and init a UIImageView with my UIImage in my appDelegate and then access it throughout my app. That way I would only allocate that image view once, and if I was drilling into a nav stack with multiple table views using this image I wouldn't need to worry about releasing and restoring the image as I descend and ascend, or incur that overhead at each step.
As soon as I tried this I noticed that the UITableView class seems to release my shared image view down to a retain count of 0, so it goes away. That makes perfect sense, but I would need to prevent the image view from ever hitting a 0 retain count for this to work.
Is this a totally goofy approach?
If it is not, what's the best way to retain my shared image view? I know I could call retain when I set up each table view's background image, but I was wondering if there is a way to set the retain count of the shared UIImageView to NSUIntegerMax in my appDelegate. I've set up singleton classes before, but in this case I'm trying to have a single property that is never released rather than creating a UIImageView singleton subclass.
Sorry if that's a little muddled and thanks for any pointers.
I would not worry so much, as images loaded with + (UIImage *)imageNamed:(NSString *)name are cached.
From the spec:
This method looks in the system caches for an image object with the specified name and returns that object if it exists. If a matching image object is not already in the cache, this method loads the image data from the specified file, caches it, and then returns the resulting object.
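Given that, here's a small sketch of leaning on the cache instead of a shared view; each table view controller builds its own background view, but the underlying image is only loaded from disk once (the file name is made up):

- (void)viewDidLoad {
    [super viewDidLoad];
    // imageNamed: returns the cached UIImage after the first load.
    UIImageView *background = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"table-background.png"]];
    self.tableView.backgroundView = background;
    [background release];
}

A UIView can only sit in one superview at a time anyway, so sharing the UIImage rather than a single UIImageView sidesteps both the retain-count issue and the view-hierarchy issue.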

Pointer to Pointer to UIImage created by NSURLConnection

Hey all, here's the deal...
I've got a UIImage that is being loaded in the background via an NSURLConnection within a subclassed UIImageView. Until the data finishes downloading, a placeholder image is shown. The UIImage needs to be used by another, separate UIImageView, as well.
The problem I'm having is that if I set the second UIImageView's image property to that of the subclassed object before the download is complete, the second UIImageView never displays the downloaded image, since its image property is pointing to the placeholder.
Is there any way to pass a pointer to a pointer between these two UIImageViews?
I've tried things like this:
imageView2.image = &[imageView1 image];
imageView2.image = *[imageView1 image];
But these don't seem to work. Any ideas?
Those pointers look particularly gross. The problem with doing it that way is that, even though you might be updating the data pointed to by those pointers, you are not notifying your UIImageView subclasses that the data has changed, and thus they don't know to redraw.
Instead of playing with that pointer mess, why not use Key-Value Observing instead? If you have a separate thread running that downloads the UIImage, you can just tell your UIImageViews to observe that property and when your downloader class is finished downloading all the data and has put it into the property, the views will get a notification and can then display the image.
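A hedged sketch of that approach, assuming the downloading subclass sets its inherited image property through the setter once the connection finishes, and that this controller owns both image views:

- (void)viewDidLoad {
    [super viewDidLoad];
    [imageView1 addObserver:self
                 forKeyPath:@"image"
                    options:NSKeyValueObservingOptionNew
                    context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (object == imageView1 && [keyPath isEqualToString:@"image"]) {
        // Re-point the second view once the real image replaces the placeholder.
        imageView2.image = [change objectForKey:NSKeyValueChangeNewKey];
    }
}

- (void)dealloc {
    [imageView1 removeObserver:self forKeyPath:@"image"];
    [super dealloc];
}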
Can't you just catch the connectionDidFinishLoading: message from the NSURLConnection and then set the second image?

iPhone - Using NSCoder with subclass of UIImageView

I have created a subclass of UIImageView and I am handling the touches for its objects inside the subclass itself.
Now when the user is about to exit the app I want to save the state of the images. And as there are multiple transforms which might have taken place on the images (I am saving all the transforms in a dictionary), I want to be able to save these objects in applicationWillTerminate.
I am using the encodeWithCoder: and initWithCoder: methods. The objects are saved and loaded as expected (or at least it looks that way). But when I print the frame size and origin, I get some absurd values on the console.
Can someone please tell me whether the frame is not saved when encodeWithCoder: is called and the contents are stored in a file?
Thanks.
Why don't you serialize the UIImageView's frame along with the actual UIImageView? Just encode another key for the frame in encodeWithCoder: and decode it in initWithCoder:. This way, you can be sure to have the correct frame if you feel something is off.
Please paste the code where you are logging the values of the UIImageView's frame. I believe you may be casting to the wrong type, which will cause your values to be off.
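A minimal sketch of that, reusing the earlier subclass idea; UIKit's NSCoder additions handle CGRect directly:

- (void)encodeWithCoder:(NSCoder *)coder {
    [super encodeWithCoder:coder];
    // Archive the frame explicitly so it can be restored verbatim.
    [coder encodeCGRect:self.frame forKey:@"savedFrame"];
}

- (id)initWithCoder:(NSCoder *)coder {
    if ((self = [super initWithCoder:coder])) {
        self.frame = [coder decodeCGRectForKey:@"savedFrame"];
    }
    return self;
}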
Alternatively, write the dictionary's contents to a plist using NSUserDefaults' setObject:forKey: and objectForKey: methods.