Persisting an array of images - iPhone

What's the best way to save and retrieve an array of images across app restarts?
I'm implementing a caching feature for offline viewing of downloaded images and just want to make sure I'm using the right persisting methods.
Thanks!

The quickest & probably best solution would be to persist your images to disk, no question about it.
You could do something like this to save them as JPEG.
NSData *data = UIImageJPEGRepresentation(image, 1.0f);
[data writeToFile:imagePath atomically:YES];
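If you need the whole array back after a restart, one simple approach is to write each image under a predictable file name and rebuild the array on launch. A minimal sketch, assuming your images are in an NSArray called images (the image_%lu.jpg naming scheme is just for illustration):

// Save every image in the array to the Documents directory as JPEG
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                      NSUserDomainMask, YES) objectAtIndex:0];
for (NSUInteger i = 0; i < [images count]; i++) {
    NSData *data = UIImageJPEGRepresentation([images objectAtIndex:i], 0.8f);
    NSString *path = [docs stringByAppendingPathComponent:
        [NSString stringWithFormat:@"image_%lu.jpg", (unsigned long)i]];
    [data writeToFile:path atomically:YES];
}

// Rebuild the array on the next launch
NSMutableArray *restored = [NSMutableArray array];
NSUInteger index = 0;
UIImage *restoredImage;
while ((restoredImage = [UIImage imageWithContentsOfFile:
        [docs stringByAppendingPathComponent:
            [NSString stringWithFormat:@"image_%lu.jpg", (unsigned long)index]]])) {
    [restored addObject:restoredImage];
    index++;
}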

Related

iPhone, how does one read an image from the photo library as NSData?

On the iPhone I saved an image to the photo library with modifications (appending data after the image code), using the asset library to save the NSData directly to the photo library. Now I want to read the image back from the photo library as NSData. If I read it as a UIImage, it changes the data inside the image.
How can I read the photo from the photo library as NSData? I tried looking into the reference URL in 4.1 but no luck.
Edit: I edited to explain why I'm saving it as NSData. The URL method in the answers works if you just want pure NSData, but it does not help in the context I am asking about.
Edit 2: The getBytes answer was the one that did it for me. The code I used was:
NSError *error = nil;
uint8_t *bytes = malloc((size_t)[representation size]);
NSUInteger length = [representation getBytes:bytes fromOffset:0 length:(NSUInteger)[representation size] error:&error];
NSData *data = [NSData dataWithBytesNoCopy:bytes length:length freeWhenDone:YES];
This got me everything inside the file: the image code PLUS what I added, in NSData form.
Edit 3:
Is there a way to do this now in iOS 10 with PHPhotoLibrary, which replaces AssetsLibrary? I need the image as NSData: the original RAW image with no modifications.
You should look into the getBytes method of ALAssetRepresentation.
This gives you the original RAW image data of your image and also makes it possible to use a buffer to process the image instead of reading the image into memory all at once.
And of course you can always generate NSData from the getBytes method.
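For illustration, a sketch of the buffered approach, assuming you already have an ALAsset in a variable called asset (the 64 KB chunk size is an arbitrary choice):

ALAssetRepresentation *rep = [asset defaultRepresentation];
long long size = [rep size];
NSMutableData *data = [NSMutableData dataWithCapacity:(NSUInteger)size];
NSUInteger chunkSize = 64 * 1024; // arbitrary chunk size
uint8_t *buffer = malloc(chunkSize);
long long offset = 0;
NSError *error = nil;
while (offset < size) {
    // Copy the next chunk of the raw file data into our buffer
    NSUInteger read = [rep getBytes:buffer fromOffset:offset length:chunkSize error:&error];
    if (read == 0) break; // end of data or an error; check `error` in real code
    [data appendBytes:buffer length:read];
    offset += read;
}
free(buffer);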
Cheers,
Hendrik
If you already have your ALAsset, you should be able to get an NSData of the default representation using the following:
NSData *data = [NSData dataWithContentsOfURL:[[asset defaultRepresentation] url]];

ALAssetRepresentation as NSData for GIFs

I'm trying to allow users to pull images out of their Photos collections using ALAssetsLibrary. Users can then upload these images. My goal is to allow users to upload any GIFs they may have in their library without losing any animation they may have.
For PNG and JPEG files I can grab the ALAssetRepresentation, use - (CGImageRef)fullResolutionImage to get a CGImageRef, and then save it to NSData using UIImageJPEGRepresentation or UIImagePNGRepresentation.
However, because no similar function exists for GIF files, all I can do is convert the GIF to either JPEG or PNG, but then I lose the animation.
Is there either
a way to grab the NSData straight from an ALAssetRepresentation object or
a way to go from ALAssetRepresentation -> CGImageRef -> NSData without losing any GIF animation frames?
Thanks in advance!
Yes, there is quite a simple way:
Use the getBytes:fromOffset:length:error: method of ALAssetRepresentation. This gives you the raw file data of the ALAsset, in your case the GIF file.
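For illustration, a one-shot sketch, assuming you already have the asset's representation in a variable called rep:

NSError *error = nil;
long long size = [rep size];
uint8_t *buffer = malloc((size_t)size);
NSUInteger read = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)size error:&error];
// gifData now holds the GIF file byte for byte, animation frames included
NSData *gifData = [NSData dataWithBytesNoCopy:buffer length:read freeWhenDone:YES];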

UIImagePNGRepresentation slow or am I doing something wrong?

I'm working on an iPhone app that uses the camera to take pictures, then saves them to the application's Documents directory. I'm using the following code to convert the UIImage to NSData:
NSData *imageData = UIImagePNGRepresentation(image);
Then I write the NSData using
[imageData writeToFile:path atomically:NO]
It all works. The problem is that UIImagePNGRepresentation() is really slow. It takes 8-9 seconds on my 3G to convert the image to NSData. This seems wrong to me. Does anyone have any experience with this? Is this function just slow, or am I doing something terribly wrong?
Thanks
Are you sure you want to save pictures captured with the camera as PNG?
JPEG is a more appropriate format for photographs. Additionally, it's likely much faster!
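For comparison, the JPEG path looks like this; the second argument is a compression quality from 0.0 to 1.0, and 0.8 here is just an arbitrary choice:

NSData *imageData = UIImageJPEGRepresentation(image, 0.8f); // typically far faster than PNG for photos
[imageData writeToFile:path atomically:NO];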

How does the default iPhone Camera app manage to save a photo so fast?

So far I've managed to create an app for iPhone that takes multiple images with about a 3 second interval between each. I'm processing each image in a separate thread asynchronously, and everything is great until it gets to the moment of saving the image to the iPhone disk. Then it takes about 12 seconds to save the image to disk using JPEG representation.
How does Apple do it? How do they manage to save a single image to disk so fast? Is there a trick they are using? I saw that the animations distract the user for a while, but still, the time needed is below 12 seconds!
Thanks in advance.
Actually, Apple uses its kernel driver AppleJPEGDriver. It is a hardware JPEG encoding API and is much faster than software encoding (UIImageJPEGRepresentation), and some people use it in their jailbreak apps (Cycorder, the video recording application).
Apple should give the same functionality to its users, but they are Apple :)
I haven't tried this but I wouldn't be so sure that Apple isn't using the same methods. A big part of the Apple design philosophy relies on hiding operational interruptions from the user. The Apple code may take as much time as yours but simply be adroit at hiding the entire save time from the perception of the user.
If someone can't tell you how Apple actually saves faster, I would suggest looking at ways to disguise the save time.
If you google around a bit... there are a whole bunch of people with the same problem.
I didn't find an answer. The general conclusion seems to be that Apple either uses some internal API that bypasses the public API's overhead, or uses a hardware encoder.
Guess you are out of luck for fast image saving.
I was having this problem in my app; on saving it would hang, so I used Grand Central Dispatch.
Below is the setImage method from my image cache class. If the UIImage is non-nil it saves it, otherwise it deletes the cached file. You can hopefully adapt this to suit your needs; it will only work on iOS 4+. The code is ARC enabled.
-(void)setImage:(UIImage *)image{
    if (image == nil){
        NSLog(@"Deleting Image");
        // Since we have no image, remove the cached image if it exists
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                                       NSUserDomainMask, YES) objectAtIndex:0];
            [[NSFileManager defaultManager] removeItemAtPath:[cachePath
                stringByAppendingPathComponent:@"capturedimage.jpg"] error:nil];
        });
    }
    else {
        NSLog(@"Saving Image");
        // We've got an image; save it to flash memory on a background queue
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                                       NSUserDomainMask, YES) objectAtIndex:0];
            // Note: this writes PNG data to a file named .jpg; the name is
            // only used as a cache key here, but rename it if that bothers you
            NSData *dataObj = UIImagePNGRepresentation(image);
            [dataObj writeToFile:[cachePath
                stringByAppendingPathComponent:@"capturedimage.jpg"] atomically:NO];
        });
    }
    imageCache = image;
}
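For completeness, a matching read-back method might look like the sketch below. It assumes the same imageCache ivar and cache file name as above; note that because the save happens on a background queue, the file may not exist on disk immediately after setImage: returns.

-(UIImage *)image{
    if (imageCache == nil) {
        // Fall back to the file on disk if nothing is held in memory
        NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                                   NSUserDomainMask, YES) objectAtIndex:0];
        imageCache = [UIImage imageWithContentsOfFile:[cachePath
            stringByAppendingPathComponent:@"capturedimage.jpg"]];
    }
    return imageCache;
}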

ObjC - Post a UIImage by using NSData and NSURLRequest?

I'm trying to post an image to TwitPic.com using their API.
However, I have never posted an image using HTTP POST or by any other means before.
Can anybody enlighten me as to how I can post NSData from a UIImage using their API?
TwitPic expects multipart form data. There is a great open source library called ASIHTTPRequest.
You can use its API and post your image as multipart/form-data.
See below for the sample.
http://allseeing-i.com/ASIHTTPRequest/How-to-use
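For illustration, a minimal sketch using ASIFormDataRequest; the endpoint URL, field names, and placeholder credentials are assumptions based on TwitPic's documented v1 upload API, so verify them against the current docs:

#import "ASIFormDataRequest.h"

// Sketch: upload a UIImage to TwitPic as multipart/form-data.
NSData *twitpicImage = UIImageJPEGRepresentation(image, 0.8f);
ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:
    [NSURL URLWithString:@"http://twitpic.com/api/upload"]]; // assumed endpoint
[request setPostValue:@"yourUsername" forKey:@"username"]; // placeholder
[request setPostValue:@"yourPassword" forKey:@"password"]; // placeholder
[request setData:twitpicImage forKey:@"media"]; // the image bytes
[request setDelegate:self];
[request startAsynchronous];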
Use the UIImagePNGRepresentation function or its JPEG equivalent to turn a UIImage into an NSData:
http://developer.apple.com/iphone/library/documentation/UIKit/Reference/UIKitFunctionReference/Reference/reference.html#//apple_ref/c/func/UIImagePNGRepresentation
@Jamie:
Try:
[request setData:twitpicImage forKey:@"media"];
If the image view is actually displaying an image that is stored in a file on disk, use:
[request setFile:theImagePath forKey:@"media"];
This way, ASIHTTPRequest will stream the image data from disk, so you don't have to keep an NSData instance in memory.
Ta
Ben
For the image itself, use setData instead of setPostValue.