Dump UIImage on iPhone

I wonder how to save the UIImage object from UIImagePickerController into the app's Documents directory. I tried the UIImageJPEGRepresentation() and UIImagePNGRepresentation() methods, but it seemed the image data was changed. Is there any way to keep the original image content without any compression?

PNG employs lossless data compression and is a good option for you; you can read more at the wiki link. A UIImage created from PNG data should be pixel-identical to the UIImage the data was generated from. I'm not sure where your image is coming from, but a PNG file will be a different size than the JPEG it was created from, while the bitmap data should be the same.
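If the goal is simply to persist the picked image losslessly, a minimal sketch might look like this (the file name `photo.png` and the `pickedImage` variable are placeholders, not names from the question):

```objc
// Assumes `pickedImage` is the UIImage from the picker delegate callback.
NSData *pngData = UIImagePNGRepresentation(pickedImage); // lossless encode
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                     NSUserDomainMask, YES);
NSString *path = [[paths objectAtIndex:0]
                  stringByAppendingPathComponent:@"photo.png"];
[pngData writeToFile:path atomically:YES];
```

Note that if the picker handed you a JPEG from the camera, re-encoding as PNG preserves the decoded bitmap exactly, but any lossy step that happened at capture time is already baked in.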

Related

Decompress images to use on a UITableView

I am currently working on a project where I need to display large images in a UITableView. This is a very common problem for many developers, and after reading their threads I arrived at the following procedure:
NOTE: The large images I refer to all measure 300x300 px (600x600 px on retina) and are about 200 KB, JPEG.
1. Create an NSOperationQueue;
2. Download the images asynchronously (each image is 600x600 px, corresponding to the @2x image);
3. Resize and create the non-retina image (300x300 px);
4. Decompress both images;
5. Store all images in an NSCache;
After all those steps have finished, I update my UITableView on the main thread. I'm using a UITableViewCell subclass to draw all my needed content (as seen in Apple's sample code). The main problem is that I'm confused about step 4 (decompressing the images). My doubts:
NOTE:I'm currently storing my decompressed images on a NSCache.
Should I decompress the images and store them as UIImage or NSData objects?
How should I store the decompressed images? (NSCache, NSMutableArray, ...)
What is the best way to pass the decompressed image to my UITableViewCell subclass?
NOTE:I'm using the decompression code presented here:link
You can't really store a UIImage object to disk directly, but you can turn it into NSData using UIImagePNGRepresentation().
Using UIImage will give you caching out of the box; I'd bet it's the most efficient option you can get.
Just put the image into a UIImageView; Apple has spent a lot of time making image rendering fast.
That said, your images are not particularly big, especially for retina devices. I would advise looking at something like the AFNetworking library, which has a complete, tested solution for this problem.
Plus, you can look at the code of AFImageRequestOperation, which does exactly what you need: download, store, cache, reuse.
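One common sketch of the "decompress" step (step 4) is to draw the image once into an offscreen bitmap context on the background queue, so the JPEG decode cost isn't paid during scrolling. This is only an illustration of the technique, not AFNetworking's actual implementation:

```objc
// Force-decodes a UIImage by rendering it once into a bitmap context.
// Run this on the NSOperationQueue, then put the result in the NSCache.
- (UIImage *)decodedImageWithImage:(UIImage *)image {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *decoded = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return decoded;
}
```

Caching the result as a UIImage (not NSData) means the cell subclass receives a ready-to-draw image and nothing has to be decoded on the main thread.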

Saving UIImage With No Quality Loss

My app takes screenshots of entire web pages. When it saves to the Photo Album, the image quality suffers a bit and the text in the still image is blurry and hard to read. I found this solution, which promises better quality when saving to the Photo Album, but the result still isn't perfectly acceptable. Is there a better way than this to save a UIImage to the Photo Album with very minimal quality loss? Thanks.
UIImage *img = [UIImage imageWithCGImage:screenshotImage.CGImage];
NSData *imdata = UIImagePNGRepresentation(img);
UIImage *image = [UIImage imageWithData:imdata]; // wrap UIImage around PNG representation
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
In addition to looking at the image quality settings, you need to be aware of which image formats are best for storing different types of images.
I prefer JPEG for real-world images such as those taken with a camera; JPEG does a better job of retaining high image quality while achieving good compression.
If you are taking a screenshot where large parts of the image are the same color (like this web page or your programming editor), then I prefer PNG. PNG saves the image without the artifacts that often appear in JPEG, while still achieving a good compression ratio for these types of images.
The reasons for the above have to do with the compression schemes each format uses (JPEG is lossy and DCT-based; PNG uses lossless DEFLATE). Try changing the image type as well as the quality settings and see if that makes a difference.
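Blurry text can also come from capturing the screenshot at the wrong scale rather than from the encoder. A hedged sketch, assuming the page lives in a `webView` property (an assumption, not from the question): passing 0.0 as the scale argument renders at the device's native retina scale before the lossless PNG encode:

```objc
// Render the layer at the device's native scale (scale argument 0.0),
// then encode losslessly as PNG. Requires QuartzCore for -renderInContext:.
UIGraphicsBeginImageContextWithOptions(self.webView.bounds.size, NO, 0.0);
[self.webView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *pngData = UIImagePNGRepresentation(screenshot);
```

If the capture was done at scale 1.0 on a retina device, no later encoding choice can bring the lost sharpness back.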

How to get dimension of png file stored in documents folder of iPhone application?

I am facing a problem and hope someone will help me sort it out. I am working on an application that saves PNG files in the Documents folder of the iPhone application, and the user can also view the files that were saved.
So, I want to know: is there any way to find out the dimensions of the file being displayed? That way, if the image dimensions are bigger than my image view, I can make it scroll; currently the image is shrunk.
This may help: the ImageIO framework (CGImageSource...).
A similar question: accessing UIImage properties without loading the image into memory.
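A sketch of the CGImageSource route, which reads the pixel dimensions from the file header without decoding the bitmap (the `pathToPNG` variable is a placeholder for your Documents-folder path):

```objc
#import <ImageIO/ImageIO.h>

NSURL *url = [NSURL fileURLWithPath:pathToPNG];
CGImageSourceRef source = CGImageSourceCreateWithURL((CFURLRef)url, NULL);
if (source) {
    // Ask ImageIO not to cache or decode the pixel data.
    NSDictionary *options = [NSDictionary dictionaryWithObject:(id)kCFBooleanFalse
                                                        forKey:(id)kCGImageSourceShouldCache];
    NSDictionary *properties = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0,
                                                        (CFDictionaryRef)options);
    NSNumber *width  = [properties objectForKey:(id)kCGImagePropertyPixelWidth];
    NSNumber *height = [properties objectForKey:(id)kCGImagePropertyPixelHeight];
    [properties release]; // Copy rule; omit under ARC
    CFRelease(source);
}
```

With the dimensions in hand you can compare against the image view's bounds and enable scrolling only when the image is larger.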
Create a UIImage from the file, using imageWithContentsOfFile:. The size of the image is held in the size property of your new UIImage object.

Should I use png or jpg for my image and thumbnail?

I'm taking images from the camera or the camera roll and I'm saving them to core data using an ImageToDataTransformer class. I need to use these saved images in two different places in my app: 250x250 imageview and 50x50 imageview.
First, should I use png format for both imageviews?
Second, can I compress the image before I save it to core data, and what's the best way?
Third, should I save two different images, one for the big image and another for the thumbnail in a different view?
When Xcode builds your project, it automatically optimizes PNG files included in your project. So, I guess you should use PNG.
I don't know about runtime.
That would be a good idea if you have a table view and want to show thumbnails. You wouldn't want to load the huge files; that would be excruciatingly slow.
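Putting the three answers together, a hedged sketch: encode once per display size and store both blobs (the 0.8 JPEG quality is an arbitrary choice, and `original` is a placeholder for the picked image):

```objc
// Full-size image for the 250x250 view; JPEG suits camera photos.
NSData *fullData = UIImageJPEGRepresentation(original, 0.8);

// Separate 50x50 thumbnail so list views never decode the big file.
UIGraphicsBeginImageContextWithOptions(CGSizeMake(50, 50), YES, 0.0);
[original drawInRect:CGRectMake(0, 0, 50, 50)];
UIImage *thumb = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *thumbData = UIImageJPEGRepresentation(thumb, 0.8);

// Store fullData and thumbData in separate Core Data attributes.
```

Two attributes cost a little extra storage but let the thumbnail view fetch only the small blob.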

iPhone - access location information from a photo

Is it possible, in an iPhone app, to extract location information (geocode, I suppose it's called) from a photo taken with the iPhone camera?
If there is no API call to do it, is there any known way to parse the bytes of data to extract the information? Something I can roll on my own?
Unfortunately no.
The problem is this:
A JPEG file consists of several parts. For this question, the ones we are interested in are the image data and the EXIF data. The image data is the picture; the EXIF data is where things like geocoding, shutter speed, camera type, and so on are stored.
A UIImage (and CGImage) only contain image data, no tags.
When the image picker selects an image (either from the library or the camera), it returns a UIImage, not a JPEG. This UIImage is created from the JPEG image data, but the EXIF data in the JPEG is discarded.
This means this data is not in the UIImage at all and thus is not accessible.
I think the selected answer is wrong, actually. Well, not wrong. Everything it said is correct, but there is a way around that limitation.
UIImagePickerController passes a dictionary along with the UIImage it returns. One of the keys is UIImagePickerControllerMediaURL, which is "the filesystem URL for the movie". However, as noted here, in newer iOS versions it returns a URL for images as well. Couple that with the EXIF library mentioned by @Jasper and you might be able to pull geotags out of photos.
I haven't tried this method, but as @tomtaylor mentioned, this has to be possible somehow, as there are a few apps that do it (e.g. Lab).
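I haven't verified this end to end, but an era-appropriate route would be the UIImagePickerControllerReferenceURL key plus the AssetsLibrary framework, which exposes the asset's location without parsing EXIF by hand (sketch only; also links CoreLocation):

```objc
#import <AssetsLibrary/AssetsLibrary.h>

// In the picker delegate: look up the library asset behind the picked photo.
NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
    // May be nil if the photo carries no geodata.
    CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
    NSLog(@"lat: %f lon: %f", location.coordinate.latitude,
                              location.coordinate.longitude);
} failureBlock:^(NSError *error) {
    NSLog(@"asset lookup failed: %@", error);
}];
```

This only works for library photos; an image fresh from the camera has no asset until it is saved.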