I am binding Photo Library images to my table view using the ALAsset library. Whenever I bind the ALAsset thumbnail as the table view cell image, there is an issue with image clarity: it shows as a low-clarity image.
So I created a full-resolution image from the ALAsset, wrote a thumbnail generation method, and set the resulting image as the table view image. After doing this, I got a full-resolution thumbnail in the table view. But the issue then was the performance of the first table view image bind (I use a cache, so binding is fast after the first time).
So how can I get a full-clarity Photo Library thumbnail image from an ALAsset?
Here is the code in my table view's cellForRowAtIndexPath: method:
UIImageView *importMediaSaveImage = [[[UIImageView alloc] init] autorelease];
importMediaSaveImage.frame = CGRectMake(0, 0, 200, 135);
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    CGImageRef iref = [myasset thumbnail];
    if (iref) {
        importMediaSaveImage.image = [UIImage imageWithCGImage:iref];
    }
    // etc...
I have tried the method below, but it is time-consuming:
UIImageView *importMediaSaveImage = [[[UIImageView alloc] init] autorelease];
importMediaSaveImage.frame = CGRectMake(0, 0, 200, 135);
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    UIImage *images;
    if (iref) {
        images = [UIImage imageWithCGImage:iref];
    }
    NSData *data = [self photolibImageThumbNailData:images];
    importMediaSaveImage.image = [UIImage imageWithData:data];
    // etc....
And the photolibImageThumbNailData: method:
- (NSData *)photolibImageThumbNailData:(UIImage *)image
{
    // begin an image context that will essentially "hold" our new image
    UIGraphicsBeginImageContext(CGSizeMake(200.0, 135.0));
    // now redraw our image in a smaller rectangle.
    [image drawInRect:CGRectMake(0.0, 0.0, 200.0, 135.0)];
    // make a "copy" of the image from the current context
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // now grab the PNG representation of our image
    NSData *thumbData = UIImagePNGRepresentation(newImage);
    return thumbData;
}
Thanks.
I don't think there is much you can do here, as those are the only two options ALAsset gives us. You will have to compromise either on quality or on time. If you are storing the images in the library yourself, you can resize them to be a bit smaller before storing them, to speed up the process.
You should also try [rep fullScreenImage]: it makes for a better-performing table view than fullResolutionImage, with better quality than the thumbnail.
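For example, the result block from the question could be changed like this (a minimal sketch, reusing the importMediaSaveImage view from the question):
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    // fullScreenImage is screen-sized and already decoded, so it is much
    // cheaper than fullResolutionImage but noticeably sharper than thumbnail
    CGImageRef iref = [rep fullScreenImage];
    if (iref) {
        importMediaSaveImage.image = [UIImage imageWithCGImage:iref];
    }
};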
You can use
CGImageRef iref = [myasset aspectRatioThumbnail];
and your thumbnails will look much better!
Related
I am using UIImagePickerController to bring in images from the photo library, and I found this method for saving the UIImage as PNG or JPEG:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    // Create paths to output images
    NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.png"];
    NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.jpg"];
    // Write a UIImage to JPEG with minimum compression (best quality)
    // The value 'image' must be a UIImage object
    // The value '1.0' represents image compression quality as value from 0.0 to 1.0
    [UIImageJPEGRepresentation(image, 1.0) writeToFile:jpgPath atomically:YES];
    // Write image to PNG
    [UIImagePNGRepresentation(image) writeToFile:pngPath atomically:YES];
    // Let's check to see if files were successfully written...
    // Create file manager
    NSError *error;
    NSFileManager *fileMgr = [NSFileManager defaultManager];
    // Point to Document directory
    NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    // Write out the contents of home directory to console
    NSLog(@"Documents directory: %@", [fileMgr contentsOfDirectoryAtPath:documentsDirectory error:&error]);
    [picker release];
}
The problem is that the PNG file is 6.4 MB and the JPEG is 3.2 MB. Is there a way to save the image file at a smaller size than this?
An image returned from the camera has a resolution of 1200*1600 pixels, and one pixel is represented by 4 bytes. The only way to decrease the size is to resize the image and then compress it to JPEG or PNG. You can use the method below to resize the image:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
...
You can decrease the file size of the image by scaling it down:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissModalViewControllerAnimated:YES];
    NSData *imageData = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], .5);
    UIImage *image = [UIImage imageWithData:imageData];
    CGSize size = CGSizeMake(150, 150);
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    imgView.image = scaledImage; // setting this image on imgView, an image view on my view
}
You could resize the image and save it at a smaller size. Here is an example of a method for resizing an image:
- (void)setThumbnailFromImage:(UIImage *)image
{
    CGSize origImageSize = [image size];
    CGRect newRect;
    newRect.origin = CGPointZero;
    newRect.size = CGSizeMake(40, 40);
    // how do we scale the image?
    float ratio = MAX(newRect.size.width / origImageSize.width,
                      newRect.size.height / origImageSize.height);
    // create a bitmap image context
    UIGraphicsBeginImageContext(newRect.size);
    // Round the corners
    UIBezierPath *path = [UIBezierPath bezierPathWithRoundedRect:newRect
                                                    cornerRadius:5.0];
    [path addClip];
    // into what rectangle shall I composite the image?
    CGRect projectRect;
    projectRect.size.width = ratio * origImageSize.width;
    projectRect.size.height = ratio * origImageSize.height;
    projectRect.origin.x = (newRect.size.width - projectRect.size.width) / 2.0;
    projectRect.origin.y = (newRect.size.height - projectRect.size.height) / 2.0;
    // draw the image on it
    [image drawInRect:projectRect];
    // get the image from the image context, retain it as our thumbnail
    UIImage *small = UIGraphicsGetImageFromCurrentImageContext();
    [self setThumbnail:small];
    // cleanup image context resources, we're done
    UIGraphicsEndImageContext();
}
Hope this will help you! =)
Well, the easiest way is to save it as JPEG and reduce the quality. You set it to 1.0, which results in 3.2 MB, but in my experience a JPEG quality of 0.3 is sufficient for most tasks: you can even read the text in a photo of a newspaper article saved with 0.3 quality. The size is then about 700 KB.
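For example, with the image and jpgPath names from the question, the only change is the compression parameter:
// 0.3 instead of 1.0; the file shrinks from ~3.2 MB to roughly 700 KB
[UIImageJPEGRepresentation(image, 0.3) writeToFile:jpgPath atomically:YES];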
For resizing you should have a look at this excellent article:
http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
At a constant interval I want to retrieve a UIImage for some functionality:
if (!_updateTimer) {
    _updateTimer = [[NSTimer scheduledTimerWithTimeInterval:self.updateFrequency
                                                     target:self
                                                   selector:@selector(updateLocations:)
                                                   userInfo:nil
                                                    repeats:YES] retain];
}
where self.updateFrequency = 1 / 20.0;
How can I retrieve an image constantly, at every interval, without tapping the capture button? The following code is of no use to me, as it gives an image at the iPhone screen size of 320*480:
CGImageRef cgoriginal = UIGetScreenImage();
CGImageRef cgimg = CGImageCreateWithImageInRect(cgoriginal, rect);
I need the complete image of whatever is in front of the camera at every interval.
Thanks
Call the takePicture method whenever you want to capture an image. You will get the image in the delegate methods.
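A minimal sketch of that idea, assuming self.picker is a presented UIImagePickerController with sourceType camera and showsCameraControls set to NO (takePicture is meant for use with a custom overlay):
- (void)updateLocations:(NSTimer *)timer
{
    // programmatically capture a frame; the result arrives in
    // imagePickerController:didFinishPickingMediaWithInfo:
    [self.picker takePicture];
}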
extern CGImageRef UIGetScreenImage();
CGImageRef cgoriginal = UIGetScreenImage();
CGImageRef cgimg = CGImageCreateWithImageInRect(cgoriginal, rect);
UIImage *viewImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgoriginal);
CGImageRelease(cgimg);
I have a UIImageView and would like to set it to the contents of an ALAsset. How do I use the ALAsset's contents to set a UIImage?
Try this. asset is an ALAsset, by the way.
UIImage *image = [UIImage imageWithCGImage:[asset thumbnail] scale:1.0 orientation:[[asset defaultRepresentation] orientation]];
UIImageView *image_view = [[UIImageView alloc] initWithImage:image];
As mentioned by Clayton and Aaron Brager, the key is the imageWithCGImage: method of the UIImage class.
There are two ways to get the image content:
- directly, via asset.thumbnail or asset.aspectRatioThumbnail
- via a representation of the asset (there is a default representation, plus representations for specific UTIs), which gives you either:
  - a prepared CGImage: CGImageWithOptions:, fullResolutionImage, or fullScreenImage
  - the raw data: getBytes:fromOffset:length:error: (if you want to keep EXIF etc., use imageWithData: instead of imageWithCGImage:)
You may refer to http://developer.apple.com/library/ios/#documentation/AssetsLibrary/Reference/ALAssetRepresentation_Class/Reference/Reference.html#//apple_ref/doc/c_ref/ALAssetRepresentation for more details.
ALAssetRepresentation *rep = [asset defaultRepresentation];
UIImage *image = [UIImage imageWithCGImage:[rep fullScreenImage]];
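A sketch of the raw-data route mentioned above, which keeps EXIF and other metadata (again assuming asset is an ALAsset):
ALAssetRepresentation *rep = [asset defaultRepresentation];
NSError *error = nil;
long long size = [rep size];
NSMutableData *data = [NSMutableData dataWithLength:(NSUInteger)size];
// copy the asset's raw bytes, then build the UIImage from data so that
// EXIF and other metadata survive
NSUInteger bytesRead = [rep getBytes:(uint8_t *)[data mutableBytes] fromOffset:0 length:(NSUInteger)size error:&error];
UIImage *image = (bytesRead > 0) ? [UIImage imageWithData:data] : nil;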
I created a masked image using a function from an iPhone blog:
UIImage *imgToSave = [self maskImage:[UIImage imageNamed:@"pic.jpg"] withMask:[UIImage imageNamed:@"sd-face-mask.png"]];
It looks good in a UIImageView:
UIImageView *imgView = [[UIImageView alloc] initWithImage:imgToSave];
imgView.center = CGPointMake(160.0f, 140.0f);
[self.view addSubview:imgView];
I use UIImagePNGRepresentation to save it to disk:
[UIImagePNGRepresentation(imgToSave) writeToFile:[self findUniqueSavePath] atomically:YES];
UIImagePNGRepresentation returns NSData of an image that looks different: the output is an inverse of the image mask. The area that was cut out in the app is now visible in the file, and the area that was visible in the app is now removed. Visibility is opposite.
My mask is designed to remove everything but the face area in the picture. The UIImage looks right in the app, but after I save it to disk the file looks the opposite: the face is removed, but everything else is there.
Please let me know if you can help!
In Quartz you can mask either with an image mask (black lets pixels through and white blocks them) or with a normal image used as a mask (white lets through and black blocks), which is the opposite. It seems that, for some reason, saving treats the image mask as a normal image to mask with. One thought is to render to a bitmap context and then create the image to be saved from that.
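A minimal sketch of that idea, reusing imgToSave from the question (the path variable is hypothetical):
// redraw the masked image into a plain bitmap context, so the saved PNG
// is built from ordinary pixels rather than from the mask
UIGraphicsBeginImageContextWithOptions(imgToSave.size, NO, imgToSave.scale);
[imgToSave drawInRect:CGRectMake(0, 0, imgToSave.size.width, imgToSave.size.height)];
UIImage *flattened = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[UIImagePNGRepresentation(flattened) writeToFile:path atomically:YES];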
I had the exact same issue: when I saved the file it was one way, but the image returned in memory was the exact opposite.
The culprit, and the solution, was UIImagePNGRepresentation(). It fixes the in-app image before saving it to disk, so I just inserted that function as the last step in creating the masked image and returned the result.
This may not be the most elegant solution, but it works. I copied some code from my app and condensed it; I'm not sure if the code below works as is, but if not, it's close... maybe just some typos.
Enjoy. :)
// MyImageHelperObj.h
@interface MyImageHelperObj : NSObject
+ (UIImage *) createGrayScaleImage:(UIImage*)originalImage;
+ (UIImage *) createMaskedImageWithSize:(CGSize)newSize sourceImage:(UIImage *)sourceImage maskImage:(UIImage *)maskImage;
@end
// MyImageHelperObj.m
#import <QuartzCore/QuartzCore.h>
#import "MyImageHelperObj.h"
@implementation MyImageHelperObj
+ (UIImage *)createMaskedImageWithSize:(CGSize)newSize sourceImage:(UIImage *)sourceImage maskImage:(UIImage *)maskImage
{
    // create image size rect
    CGRect newRect = CGRectZero;
    newRect.size = newSize;
    // draw source image
    UIGraphicsBeginImageContextWithOptions(newRect.size, NO, 0.0f);
    [sourceImage drawInRect:newRect];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    // draw mask image
    [maskImage drawInRect:newRect blendMode:kCGBlendModeNormal alpha:1.0f];
    maskImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // create grayscale version of mask image to make the "image mask"
    UIImage *grayScaleMaskImage = [MyImageHelperObj createGrayScaleImage:maskImage];
    CGFloat width = CGImageGetWidth(grayScaleMaskImage.CGImage);
    CGFloat height = CGImageGetHeight(grayScaleMaskImage.CGImage);
    CGFloat bitsPerPixel = CGImageGetBitsPerPixel(grayScaleMaskImage.CGImage);
    CGFloat bytesPerRow = CGImageGetBytesPerRow(grayScaleMaskImage.CGImage);
    CGDataProviderRef providerRef = CGImageGetDataProvider(grayScaleMaskImage.CGImage);
    CGImageRef imageMask = CGImageMaskCreate(width, height, 8, bitsPerPixel, bytesPerRow, providerRef, NULL, false);
    CGImageRef maskedImage = CGImageCreateWithMask(newImage.CGImage, imageMask);
    CGImageRelease(imageMask);
    newImage = [UIImage imageWithCGImage:maskedImage];
    CGImageRelease(maskedImage);
    return [UIImage imageWithData:UIImagePNGRepresentation(newImage)];
}
+ (UIImage *)createGrayScaleImage:(UIImage *)originalImage
{
    // create gray device colorspace.
    CGColorSpaceRef space = CGColorSpaceCreateDeviceGray();
    // create 8-bit bitmap context without alpha channel.
    CGContextRef bitmapContext = CGBitmapContextCreate(NULL, originalImage.size.width, originalImage.size.height, 8, 0, space, kCGImageAlphaNone);
    CGColorSpaceRelease(space);
    // Draw image.
    CGRect bounds = CGRectMake(0.0, 0.0, originalImage.size.width, originalImage.size.height);
    CGContextDrawImage(bitmapContext, bounds, originalImage.CGImage);
    // Get image from bitmap context.
    CGImageRef grayScaleImage = CGBitmapContextCreateImage(bitmapContext);
    CGContextRelease(bitmapContext);
    // image is inverted. UIImage inverts orientation while converting CGImage to UIImage.
    UIImage *image = [UIImage imageWithCGImage:grayScaleImage];
    CGImageRelease(grayScaleImage);
    return image;
}
@end
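Hypothetical usage of the helper, reusing the image names and save path from the question (the size is arbitrary):
UIImage *masked = [MyImageHelperObj createMaskedImageWithSize:CGSizeMake(320, 480)
                                                  sourceImage:[UIImage imageNamed:@"pic.jpg"]
                                                    maskImage:[UIImage imageNamed:@"sd-face-mask.png"]];
// the returned image has already been round-tripped through PNG, so it
// saves the same way it displays
[UIImagePNGRepresentation(masked) writeToFile:[self findUniqueSavePath] atomically:YES];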
Has anyone else come across this problem? ObjectAlloc climbs as a result of CGBitmapContextCreateImage. Is Apple's software not fully releasing the allocations?
I am resizing images 12 times a second with an NSTimer. While resizing the image I am also adding a Photoshop-like Gaussian blur effect by setting the interpolation quality.
Instruments does not show any memory leaks, but my object alloc just continues to climb. It points directly to CGBitmapContextCreateImage:
CGBitmapContextCreateImage > create_bitmap_data_provider > malloc
Does anyone know of a solution, or even have possible ideas?
The call within the NSTimer:
NSString *fileLocation = [[NSBundle mainBundle] pathForResource:imgMain ofType:@"jpg"];
NSData * imageData = [NSData dataWithContentsOfFile:fileLocation];
UIImage * blurMe = [UIImage imageWithData:imageData];
CGRect rect = CGRectMake(0, 0, round(blurMe.size.width /dblBlurLevel), round(blurMe.size.width /dblBlurLevel));
UIImage * imageShrink = [self resizedImage: blurMe : rect : 3.0];
CGRect rect2 = CGRectMake(0, 0, blurMe.size.width , blurMe.size.width );
UIImage * imageReize = [self resizedImage: imageShrink : rect2 : 3.0];
imgView.image = imageReize;
The Resize Function
- (UIImage *)resizedImage:(UIImage *)inImage : (CGRect)thumbRect : (double)interpolationQuality
{
    CGImageRef imageRef = [inImage CGImage];
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;
    // Build a bitmap context that's the size of the thumbRect
    CGContextRef bitmap = CGBitmapContextCreate(
        NULL,
        thumbRect.size.width,
        thumbRect.size.height,
        CGImageGetBitsPerComponent(imageRef),
        4 * thumbRect.size.width,
        CGImageGetColorSpace(imageRef),
        alphaInfo
    );
    // Draw into the context, this scales the image
    CGContextSetInterpolationQuality(bitmap, interpolationQuality);
    CGContextDrawImage(bitmap, thumbRect, imageRef);
    // Get an image from the context and a UIImage
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap); // ok if NULL
    CGImageRelease(ref);
    return [result autorelease];
}
That code over-releases result.
That said, it's likely that the issue is that the UIImage is not getting deallocated; the UIImage is holding onto the CGImage, and the CGImage is holding onto the memory that was allocated under CGBitmapContextCreate.
Use Instruments to see whether UIImages are not getting deallocated, and if so, try to debug why.
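Concretely, the tail of resizedImage: would become (a sketch of the fix implied above):
// imageWithCGImage: already returns an autoreleased UIImage, so the extra
// autorelease in the original code is one release too many
UIImage *result = [UIImage imageWithCGImage:ref];
CGContextRelease(bitmap); // ok if NULL
CGImageRelease(ref);
return result;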
I compiled and ran your code as you have it, and I don't see any leaks, nor does my object alloc keep climbing. I run the code a couple of times a second and see no object growth in Instruments. I am only running on the simulator. I also tried kCGInterpolationNone instead of 3.0 in case that was the problem, and there is still no leak.
I'm not sure why I don't see it and you do. You might want to just do this in the method:
- (UIImage *)resizedImage:(UIImage *)inImage : (CGRect)thumbRect : (double)interpolationQuality
{
    CGImageRef imageRef = [inImage CGImage];
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    return inImage;
    ...
This makes the method effectively a no-op; then watch whether the object alloc continues to grow. If it does, the problem is elsewhere.