My app takes screenshots of entire web pages. When it saves them to the Photo Album, the image quality suffers and the text in the saved image is blurry and hard to read. I found the solution below, which promises better quality when saving to the Photo Album, but the result is still not really acceptable. Is there a better way to save a UIImage to the Photo Album with minimal quality loss? Thanks.
UIImage *img = [UIImage imageWithCGImage:screenshotImage.CGImage];
NSData *imdata = UIImagePNGRepresentation(img);
UIImage *image = [UIImage imageWithData:imdata]; // wrap UIImage around PNG representation
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
In addition to looking at the image quality settings, you need to be aware of which image formats are best for storing different types of images.
I prefer JPEG for real-world images such as photos taken with a camera. JPEG does a better job of retaining high perceived quality while achieving good compression for that kind of content.
If you are taking a screenshot where large parts of the image are the same color (like this web page or your programming editor), then I prefer PNG. PNG will save the image without the dirty artifacts that often appear with JPEG, while still achieving a good compression ratio for these types of images.
The reasons for the above have to do with the compression algorithms each uses (JPEG uses lossy DCT-based compression; PNG uses lossless DEFLATE). Try changing the image type as well as the quality settings and see if that makes a difference.
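For example, here is a minimal sketch of comparing the two (assuming screenshotImage is the captured UIImage from the question above):

NSData *pngData  = UIImagePNGRepresentation(screenshotImage);       // lossless; best for flat-color screenshots
NSData *jpegData = UIImageJPEGRepresentation(screenshotImage, 0.9); // lossy; quality ranges from 0.0 to 1.0
NSLog(@"PNG: %lu bytes, JPEG: %lu bytes", (unsigned long)pngData.length, (unsigned long)jpegData.length);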
I noticed some articles talking about crushed PNG images and how to uncrush them. What's the purpose of crushing the images in the first place? Also, can the crushed images still be loaded using [UIImage imageNamed:]?
It's used to reduce the file's size, using lossless optimizations and/or compression.
It can evaluate your input image using several optimizations. Basic example: If your input is grayscale and saved as color, it may output a grayscale image. Of course, there are more complex optimizations which it uses.
can the crushed images still be loaded using [UIImage imageNamed:]?
Yes
It's basically a form of compression that doesn't involve any data loss (that is, you don't lose any image quality). Compressing any data reduces its size, which is the reason for doing it in the first place.
I'm working on a comic viewer app that downloads the latest content from a server. It downloads a single file regardless of the screen scale. What I'd like to do is make this file work correctly on both screens.
What's the procedure for this and how should I size the photos to fit? The trouble I'm having is that the graphics are retina screen size, but the iPhone doesn't interpret them as such. That means they're displayed twice as large as they should be.
CGImageRef cgImage = [myImage CGImage];
UIImage *retinaImage = [UIImage imageWithCGImage:cgImage scale:2.0 orientation:UIImageOrientationUp];
You could also change your image's dpi value in an image editor to make UIImage recognize the scale automatically.
Generally though, you should use lower-resolution images on devices that don't have a retina display because otherwise you're wasting precious memory.
For the normal (non-retina) screen, you can resize the images programmatically, for example by adding a scaling category to the UIImage class; there are many code samples on Stack Overflow, such as:
UIImage resize (Scale proportion)
For retina, you need to set the scale of the UIImage so that iOS knows which screen size it is intended for (you can set this with UIImage's initWithCGImage:scale:orientation: method).
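A minimal sketch of both ideas follows; the category name, its method, and downloadedImage are illustrative, not taken from the linked answer:

// Illustrative UIImage category that redraws the image at a new point size.
@interface UIImage (ProportionalResize)
- (UIImage *)imageScaledToSize:(CGSize)newSize;
@end

@implementation UIImage (ProportionalResize)
- (UIImage *)imageScaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [self drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}
@end

// For retina: tag the downloaded full-resolution image with scale 2.0 so it is
// treated like a @2x image and drawn at the correct point size.
UIImage *retinaImage = [[UIImage alloc] initWithCGImage:downloadedImage.CGImage
                                                  scale:2.0
                                            orientation:UIImageOrientationUp];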
I am trying to figure out how I can shrink and save a UIImage, read it back in later, and then convert back to the original size without losing quality. I have managed to do the resizing and saving, but the problem is that if I save it smaller, when I read it back in and expand it, the quality is very poor. Does anyone know how this can be done without losing image quality?
You can't downsize the image and then bring it back without losing quality. You can't make something out of nothing; once you lose the data, it's gone.
You will need to save two versions of the image, one large and one small. This is a very typical scenario when dealing with thumbnails.
Check out the following site, which provides categories for resizing images as well as several other useful things:
http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
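A rough sketch of the two-version approach (originalImage, the file names, and the thumbnail size are just placeholders):

// Save the full-size original once, then render and save a small thumbnail next to it.
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
[UIImagePNGRepresentation(originalImage) writeToFile:[docs stringByAppendingPathComponent:@"photo.png"] atomically:YES];

CGSize thumbSize = CGSizeMake(100.0, 100.0);
UIGraphicsBeginImageContextWithOptions(thumbSize, NO, 0.0);
[originalImage drawInRect:CGRectMake(0, 0, thumbSize.width, thumbSize.height)];
UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[UIImagePNGRepresentation(thumbnail) writeToFile:[docs stringByAppendingPathComponent:@"photo-thumb.png"] atomically:YES];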
Although you cannot upsize a downsized image, you can display a downsized one while retaining a reference to the original image (which you can save):
CGRect destinationRect = CGRectMake(0, 0, 100, 100); // size of the downsized version
UIImage *image = [UIImage imageNamed:@"myImage"];
UIGraphicsBeginImageContextWithOptions(destinationRect.size, NO, 0.0);
[image drawInRect:destinationRect];
UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
The destinationRect will be sized according to the dimensions of the downsized version.
I am using the following code to resize an image, and it all works as expected...
Resize UIImage the right way
I use kCGInterpolationLow as the interpolation quality and UIImageJPEGRepresentation(image, 0.0) to get the NSData for that image.
The problem is that the resulting file is still fairly large, at around 100 KB.
My question is: can I reduce it further?
The images originate from the iPhone Photo Album and are selected via an imagePickerController.
Many thanks,
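For reference, the approach described above amounts to something like this (a sketch only; pickedImage stands in for the image returned by the picker, and the target size is arbitrary):

CGSize targetSize = CGSizeMake(320.0, 480.0); // arbitrary example size
UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0);
CGContextSetInterpolationQuality(UIGraphicsGetCurrentContext(), kCGInterpolationLow);
[pickedImage drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *jpegData = UIImageJPEGRepresentation(resizedImage, 0.0); // lowest quality, smallest file

Shrinking the pixel dimensions generally reduces the file size more than lowering the JPEG quality further.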
I wonder how to save the UIImage object from UIImagePickerController into the app's Documents directory. I tried the UIImageJPEGRepresentation() and UIImagePNGRepresentation() methods, but it seemed the image data was changed. Is there any method to keep the original image content without any compression?
PNG employs lossless data compression and is a good option for you. You can read more at the wiki link. A UIImage created from PNG data should be identical to the UIImage it was created from. I'm not sure where your image is coming from, but a PNG file will have a different size on disk than the JPEG it was created from; the decoded bitmap data, however, should be the same.
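A minimal sketch of writing the picked image to the Documents directory as PNG (pickedImage and the file name are just examples):

NSData *pngData = UIImagePNGRepresentation(pickedImage);
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [docs stringByAppendingPathComponent:@"original.png"];
[pngData writeToFile:path atomically:YES];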