I am trying to implement a coverflow (iCarousel) with AsyncImageView ( https://github.com/nicklockwood/AsyncImageView ) as the cover view. It works well with images up to about 4 MB, but the app crashes when it tries to load images larger than 10 MB. My questions are:
1) Can I load a 10 MB image without tiling? Since the images come from the device camera, is it possible to tile them and then load the tiles? If so, can you share some ideas/code for tiling a large image?
P.S.: I have tried compressing the images with UIImageJPEGRepresentation(image, quality). Although the file size dropped from 10 MB to 100 KB, the memory issues show up again when I load the compressed images. (It looks like iOS decompresses them back to a full bitmap.)
You are confusing the pixel dimensions of the image with its compressed size on disk.
In memory, a decoded image takes WIDTH * HEIGHT * 4 bytes, so a 1000x1000 px image uses about 4 MB of RAM regardless of its file size.
UIImageJPEGRepresentation saves the image with a compression factor, so you end up with a smaller file on disk, but the decoded image still has the same pixel dimensions.
To solve your problem you need to scale the downloaded image down to the dimensions your coverflow item actually displays.
You can do this using the ImageIO framework:
Create a CGImageSource from the downloaded data
Call CGImageSourceCreateThumbnailAtIndex with the two options kCGImageSourceCreateThumbnailFromImageIfAbsent and kCGImageSourceThumbnailMaxPixelSize set
Here is the working code:
#import <ImageIO/ImageIO.h> // link the ImageIO framework

UIImage *result = nil;
if ([data length]) { // NSData containing the downloaded image
    CGImageSourceRef sourceRef = CGImageSourceCreateWithData((CFDataRef)data, nil);
    NSMutableDictionary *options = [NSMutableDictionary dictionary];
    // Generate a thumbnail even if the file doesn't already embed one
    [options setObject:(id)kCFBooleanTrue forKey:(id)kCGImageSourceCreateThumbnailFromImageIfAbsent];
    // Cap the longest side of the thumbnail at 400 pixels
    [options setObject:[NSNumber numberWithInt:400] forKey:(id)kCGImageSourceThumbnailMaxPixelSize];
    CGImageRef imageRef = CGImageSourceCreateThumbnailAtIndex(sourceRef, 0, (CFDictionaryRef)options);
    if (imageRef) {
        result = [UIImage imageWithCGImage:imageRef]; // Resulting image
        CGImageRelease(imageRef);
    }
    if (sourceRef) CFRelease(sourceRef);
}
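As a quick sanity check of the memory math above, you can log the decoded footprint of the thumbnail (a hedged sketch; `result` is the UIImage produced by the snippet):
CGImageRef thumb = result.CGImage;
size_t decodedBytes = CGImageGetBytesPerRow(thumb) * CGImageGetHeight(thumb);
// With a 400 px max side this is under ~640 KB (400 * 400 * 4), versus
// ~32 MB for a full 3264x2448 (8 MP) camera image
NSLog(@"thumbnail decoded size: %zu bytes", decodedBytes);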
Related
Is it possible to lower the resolution of already taken videos/pictures? I need to export low resolution videos/pictures.
You can use the scale parameter for it, i.e. 0.5 for half size:
UIImage *ret = nil;
ALAsset *asset = [_assetImages objectAtIndex:page];
ALAssetRepresentation *defaultRep = [asset defaultRepresentation];
ret = [UIImage imageWithCGImage:[defaultRep fullScreenImage]
                          scale:0.5
                    orientation:UIImageOrientationUp];
You should also look at Joshua Sullivan's answer on:
How to compress/resize image on iPhone OS SDK before uploading to a server?
Note that this only changes the reported size (and scale) of the image; it does not actually change the pixel data. The length of the NSData generated from the original and the "scaled" image is virtually identical.
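If you actually need smaller pixel data rather than just a smaller reported size, one option is to redraw the asset into a smaller bitmap. A minimal sketch, reusing `asset` from the snippet above (the half-size factor is just an example):
// Redraw the asset's image into a half-size bitmap. Unlike the scale:
// trick above, this actually discards pixels, so NSData generated from
// the result shrinks as well.
CGImageRef fullImage = [[asset defaultRepresentation] fullScreenImage];
CGSize targetSize = CGSizeMake(CGImageGetWidth(fullImage) / 2.0,
                               CGImageGetHeight(fullImage) / 2.0);
UIGraphicsBeginImageContext(targetSize);
[[UIImage imageWithCGImage:fullImage] drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *halfSize = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();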
When I do an NSLog on the size of the image after putting it into a UIImage, it comes out at the expected size. When I try this with CGImageSource, however, I get an image twice the size I was expecting. This is the code I'm using for that:
NSString *fullPath = [self fullPathForThumbnail];
NSURL *imageFileURL = [NSURL fileURLWithPath:fullPath];
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)imageFileURL, NULL);
if (imageSource == NULL) {
    // Error loading image
    return NO;
}
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:NO], (NSString *)kCGImageSourceShouldCache,
                         nil];
CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, (__bridge CFDictionaryRef)options);
CGSize originalSize = CGSizeZero;
if (imageProperties) {
    NSNumber *width = (__bridge NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelWidth);
    NSNumber *height = (__bridge NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelHeight);
    originalSize = CGSizeMake(width.floatValue, height.floatValue);
    CFRelease(imageProperties);
}
CFRelease(imageSource); // balance the Create call
This only happens on retina images; non-retina images seem to be the correct size.
To expand on what bnoble said:
There are two different concepts here: the size of the image and the number of pixels in the image. These two concepts are related by the resolution of the image.
The size of an image is given in units such as inches, centimeters or printer's points. There are actually different definitions of the printer's point, but the one commonly used in IT is the one promoted by Adobe: 72 points = 1 inch.
In Cocoa, and previously in NeXTStep, the size of an image was always the physical size; the number of pixels was a separate measure. In a device-independent graphics system you can have a 1 cm x 1 cm image at 72 dpi, 150 dpi, 300 dpi or 2400 dpi. The number of pixels in each of these images is different, but the size is always the same.
The UIImage class on iOS used to equate the two, assuming (as many people do) that pixel size = physical size, or in other words that the resolution is 72 dpi.
However, this changed with the retina display, the documentation for the UIImage size property now has the following to say:
In iOS 4.0 and later, this value reflects the logical size of the image and is measured in points. In iOS 3.x and earlier, this value always reflects the dimensions of the image measured in pixels.
So the size property of UIImage gives you the physical size in points. The kCGImagePropertyPixelWidth property, on the other hand, gives you the number of pixels, which for a 2x retina image is expected to be twice the number of points.
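A quick way to see both numbers side by side (a sketch; it assumes a `photo@2x.png` resource in the bundle, and the name is illustrative):
UIImage *image = [UIImage imageNamed:@"photo"]; // picks up photo@2x.png on a retina device
// size is in points; the backing CGImage is in pixels
NSLog(@"points: %.0f x %.0f (scale %.0f)",
      image.size.width, image.size.height, image.scale);
NSLog(@"pixels: %zu x %zu",
      CGImageGetWidth(image.CGImage), CGImageGetHeight(image.CGImage));
// On a 2x device the second line shows twice the first, matching
// what kCGImagePropertyPixelWidth reports.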
I am taking images from the photo library. I have large images of 4-5 MB, and I want to compress them, since I need to store them in the iPhone's local storage. To use less memory (and get fewer memory warnings) I need to compress these images.
I don't know how to compress images and videos, so I want to know: how do I compress images?
UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
NSData *data = UIImageJPEGRepresentation(image, 1.0);
NSLog(@"found an image");
NSString *path = [destinationPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.jpeg", name]];
[data writeToFile:path atomically:YES];
This is the code for saving my image. I don't want to store the whole image, as it's too big. I want to compress it to a much smaller size, because I'll need to attach multiple images.
Thanks for the reply.
You can choose a lower quality for JPEG encoding
NSData* data = UIImageJPEGRepresentation(image, 0.8);
Something like 0.8 shouldn't be too noticeable, and should really improve file sizes.
On top of this, look into resizing the image before making the JPEG representation, using a method like this:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    // Draw the image into a smaller offscreen context and grab the result.
    // (Use UIGraphicsBeginImageContextWithOptions(newSize, NO, 0) if you
    // want the result rendered at the device's screen scale instead.)
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Source: The simplest way to resize an UIImage?
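To tie this back to the save code in the question, here is a sketch combining both steps (it assumes the scaling method above is available; `MyImageUtils` is a hypothetical class name, and the 640x480 target is just an example):
UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
// 1. Cut the pixel count first (pick a size that fits your UI)...
UIImage *smaller = [MyImageUtils imageWithImage:image scaledToSize:CGSizeMake(640.0, 480.0)];
// 2. ...then encode at a reduced JPEG quality
NSData *data = UIImageJPEGRepresentation(smaller, 0.8);
NSString *path = [destinationPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.jpeg", name]];
[data writeToFile:path atomically:YES];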
UIImageJPEGRepresentation(UIImage, quality);
1.0 means maximum quality and 0.0 means minimum quality.
So change the quality parameter in the line below to reduce the file size of the image:
NSData *data = UIImageJPEGRepresentation(image, 1.0);
NSData *UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality);
or:
NSData *image_Data = UIImageJPEGRepresentation(image_Name, compressionQuality);
This returns the image as JPEG data. It may return nil if the image has no CGImageRef or an invalid bitmap format. compressionQuality goes from 0.0 (most compression, lowest quality) to 1.0 (least compression, best quality).
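To see the trade-off concretely, you can log the encoded length at a few quality settings (a quick sketch; `image` is assumed to be your UIImage):
CGFloat qualities[] = { 1.0f, 0.8f, 0.5f, 0.2f };
for (NSUInteger i = 0; i < 4; i++) {
    NSData *jpeg = UIImageJPEGRepresentation(image, qualities[i]);
    NSLog(@"quality %.1f -> %lu bytes", qualities[i], (unsigned long)[jpeg length]);
}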
I have a table view, and I am loading images into it. The images range from 150 KB to 2 MB. Since this is too much for a table view to handle (it takes a long time to load and makes scrolling slow), I thought of using the ImageIO framework to create thumbnails of the images.
I found code that does this, but I can't understand it.
1) Can someone please explain the code to me?
2) My problem is that I have a table view and I need to load thumbnail images into it. How can I use the following code and display the result in my table view? Can someone show me some sample code or a tutorial that does this?
Here's the code:
- (void)buildGallery
{
    for (NSUInteger i = 0; i < kMaxPictures; i++)
    {
        NSInteger imgTag = i + 1;
        // Thumbnail-sized view that will display the image; x, y and
        // _thumbSize are layout values maintained elsewhere in the class
        NYXPictureView *v = [[NYXPictureView alloc] initWithFrame:(CGRect){.origin.x = x, .origin.y = y, .size = _thumbSize}];
        // Path of the bundled full-size image ("1.jpg", "2.jpg", ...)
        NSString *imgPath = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%d", imgTag] ofType:@"jpg"];
        // Image source that can read the file without fully decoding it
        CGImageSourceRef src = CGImageSourceCreateWithURL((CFURLRef)[NSURL fileURLWithPath:imgPath], NULL);
        // Options: apply the EXIF orientation, create a thumbnail if the
        // file doesn't embed one, and cap the longest side at _maxSize px
        CFDictionaryRef options = (CFDictionaryRef)[[NSDictionary alloc] initWithObjectsAndKeys:
                                   (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                                   (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
                                   (id)[NSNumber numberWithDouble:_maxSize], (id)kCGImageSourceThumbnailMaxPixelSize,
                                   nil];
        CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(src, 0, options); // Create the scaled-down image
        CFRelease(options);
        CFRelease(src);
        UIImage *img = [[UIImage alloc] initWithCGImage:thumbnail];
        [v setImage:img];
        [img release]; // manual retain/release (pre-ARC) code
        CGImageRelease(thumbnail);
    }
}
Basically, the problem is that scaling an image view down doesn't change the number of bytes the decoded image occupies in memory: the hardware still has to read your 2 MB image and then render it at the smaller scale. You need to either change the pixel dimensions of the image (in Photoshop or similar), or, the way I'm suggesting, compress the image and then scale it down. The image will look rough at full size, but will look fine scaled down in a thumbnail view.
To generate an NSData version of your image encoded as a PNG:
NSData *PNGFile = UIImagePNGRepresentation(myImage);
Or a JPEG, with a quality value set
NSData *JPEGFile = UIImageJPEGRepresentation(myImage, 0.9f);
Both of these will give you an image file smaller than the one you currently have, which will be easier to render in the tableView.
In order to get better performance you're going to have to load the image on a background thread and, once it's in memory, add the UIImage to the image view on the main thread. There are a couple of ways to go about this, but the simplest is GCD's block-based methods, as sketched below.
Resizing the image is definitely still important for memory reasons, but get the asynchronous image loading part working first.
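A minimal sketch of that pattern applied to the thumbnail code above (`_imageURLs` and the 80-pixel size are illustrative assumptions; the ImageIO calls are the same ones used in the question):
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    static NSString *cellID = @"ThumbCell";
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:cellID];
    if (cell == nil) {
        cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault
                                       reuseIdentifier:cellID] autorelease];
    }
    cell.imageView.image = nil; // placeholder while the thumbnail loads

    NSURL *url = [_imageURLs objectAtIndex:indexPath.row];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Same ImageIO thumbnailing as in the question, done off the main thread
        CGImageSourceRef src = CGImageSourceCreateWithURL((CFURLRef)url, NULL);
        if (src == NULL) return;
        NSDictionary *opts = [NSDictionary dictionaryWithObjectsAndKeys:
                              (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageIfAbsent,
                              [NSNumber numberWithInt:80], (id)kCGImageSourceThumbnailMaxPixelSize,
                              nil];
        CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(src, 0, (CFDictionaryRef)opts);
        CFRelease(src);
        if (thumb == NULL) return;
        UIImage *img = [UIImage imageWithCGImage:thumb];
        CGImageRelease(thumb);
        dispatch_async(dispatch_get_main_queue(), ^{
            // Re-fetch the cell: it may have been reused for another row by now
            UITableViewCell *visibleCell = [tableView cellForRowAtIndexPath:indexPath];
            if (visibleCell) {
                visibleCell.imageView.image = img;
                [visibleCell setNeedsLayout];
            }
        });
    });
    return cell;
}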
I am getting a UIImage from UIGraphicsGetImageFromCurrentImageContext, but it is very heavyweight memory-wise.
UIGraphicsBeginImageContext([self bounds].size);
// CGContext procedures
_cacheImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext(); // balance the Begin call when done
The size of the image is almost the size of the iPad screen (a little smaller), and when I do something like this:
NSData *data = UIImagePNGRepresentation(_cacheImage);
NSLog(@"%lu", (unsigned long)data.length);
it gives me a length of about 700,000. I'm guessing that's a 0.7 MB file?
Anyway, if there is some way to reduce the image size, please let me know.
If you wish to reduce quality, try using
NSData * UIImageJPEGRepresentation (
UIImage *image,
CGFloat compressionQuality
);
Pass compressionQuality == 1.0 for maximum quality, or compressionQuality == 0.0 for maximum compression.
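If memory rather than file size is the real concern, another option is to render the cache image into a smaller context in the first place. A minimal sketch of the idea, based on the snippet in the question (the 0.5 factor is just an example):
// Render at half the view's size; the decoded bitmap then takes roughly
// a quarter of the memory (width * height * 4 bytes)
CGSize halfSize = CGSizeMake([self bounds].size.width * 0.5,
                             [self bounds].size.height * 0.5);
UIGraphicsBeginImageContext(halfSize);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextScaleCTM(ctx, 0.5, 0.5); // so the existing drawing code still fits
// CGContext procedures (as in the question)
_cacheImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();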