Compressing images on iPhone programmatically from NSData - iphone

I wish to compress the image before storing it as an NSData object.
Below is the code that gets me the NSData object of an image.
NSURL *referenceURL = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library1 = [[ALAssetsLibrary alloc] init];
[library1 assetForURL:referenceURL resultBlock:^(ALAsset *asset)
{
    // Copy the asset's raw bytes into an NSData buffer
    long long byteArraySize = asset.defaultRepresentation.size;
    NSMutableData *rawData = [[NSMutableData alloc] initWithLength:(NSUInteger)byteArraySize];
    uint8_t *bufferPointer = [rawData mutableBytes];
    NSError *error = nil;
    [asset.defaultRepresentation getBytes:bufferPointer fromOffset:0 length:(NSUInteger)byteArraySize error:&error];
    if (error) {
        NSLog(@"%@", error);
    }
} failureBlock:^(NSError *error) {
    NSLog(@"%@", error);
}];
Any help will be appreciated.

UIImagePickerController does return a compressed image, but you can also control the format and compression yourself with this built-in UIKit function (and a related one for PNGs):
NSData* UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality);
You might need to create an NSURL if referenceURL returns a string.
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:referenceURL]];
NSData *compressedImage = UIImageJPEGRepresentation(image, .1); //.1 is low quality
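If the UIImage is coming from the picker anyway, a minimal sketch (assuming the standard -imagePickerController:didFinishPickingMediaWithInfo: delegate callback, whose info dictionary the question already uses) that skips the URL round-trip:
// Sketch: grab the picked image straight from the info dictionary and compress it
UIImage *pickedImage = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *compressedData = UIImageJPEGRepresentation(pickedImage, 0.1); // 0.1 = low quality, small size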

If you're using a UIImagePickerController, the image it returns will typically be a JPEG, which is already compressed. If not, you can re-encode it yourself as a JPEG or PNG with the functions above.

Simple to use:
-(UIImage *)fireYourImageForCompression:(UIImage *)imgComing
{
    NSData *dataImgBefore = [[NSData alloc] initWithData:UIImageJPEGRepresentation(imgComing, 1.0)]; // 1.0 = full quality, BEFORE compression
    int imageSizeBefore = (int)dataImgBefore.length;
    NSLog(@"SIZE OF IMAGE: %i ", imageSizeBefore);
    NSLog(@"SIZE OF IMAGE in Kb: %i ", imageSizeBefore / 1024);

    NSData *dataCompressedImage = UIImageJPEGRepresentation(imgComing, .1); // .1 is low quality
    int sizeCompressedImage = (int)dataCompressedImage.length;
    NSLog(@"SIZE AFTER COMPRESSION OF IMAGE: %i ", sizeCompressedImage);
    NSLog(@"SIZE AFTER COMPRESSION OF IMAGE in Kb: %i ", sizeCompressedImage / 1024); // AFTER

    // now rebuild your image from the compressed data
    imgComing = [UIImage imageWithData:dataCompressedImage];
    return imgComing;
}
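A possible call site (a sketch; the image name is hypothetical):
// Example usage: compress an existing image and keep the smaller version
UIImage *original = [UIImage imageNamed:@"photo.jpg"];
UIImage *compressed = [self fireYourImageForCompression:original];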

Related

Save images in an iOS app

I am creating a greeting card app. I have some existing templates, and I can also create new greeting cards by adding cliparts, text styles, etc. I have saved the image as a UIImage object. I am stuck at the point of saving these images for future use. Please help.
To save a UIImage to a file, do the following:
NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/img.jpg"];
// Save
NSData *imageData = UIImageJPEGRepresentation(img, 1.0);
[imageData writeToFile:path atomically:YES];
// Load
NSData *imageData = [NSData dataWithContentsOfFile:path];
UIImage *image = [UIImage imageWithData:imageData];
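If you'd rather not hard-code the "Documents" prefix, a sketch of the equivalent path lookup using standard Foundation calls:
// Sketch: build the same Documents path without hard-coding the folder name
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [docsDir stringByAppendingPathComponent:@"img.jpg"];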
Check out this link.
How to save picture to iPhone photo library?
You can also save the image to the iPhone file system by converting the image to data and writing that data to a file:
NSData *data = UIImagePNGRepresentation(yourImage);
[data writeToFile:path atomically:YES];
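To put the image into the device photo library instead of your app's sandbox, UIKit also provides UIImageWriteToSavedPhotosAlbum; a minimal sketch, with the optional completion target and selector left empty:
// Sketch: save yourImage to the Photos library; pass nil/NULL if you don't need a completion callback
UIImageWriteToSavedPhotosAlbum(yourImage, nil, NULL, NULL);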

Converting byte array coming from Web service to UIImage iPhone

I am new to this technology.
I searched a lot but can't find anything relevant.
In my application, I am receiving a byte array from a web service. The byte array I receive is:
[137,80,78,71,13,10,26,10,0,0,0,13,73,72,68,82,0,0,1,195,0,0,1,195,8,2,0,0,0,215,2... ]
and I want to convert this byte array into a UIImage to show it in a UIImageView.
Use the constructor below for UIImage:
+ (UIImage *)imageWithData:(NSData *)data;
NSData *data = [NSData dataWithBytes:YOUR_BYTE_ARRAY length:ARRAY_LENGTH];
UIImage *img = [UIImage imageWithData:data];
UIImageView *imgView = [[UIImageView alloc]initWithImage:img];
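For example, if the bytes arrive as a plain C array, a sketch using the first few values from the question (the rest of the PNG bytes are elided, so the real array must be complete for the image to decode):
// Sketch: wrap a raw C byte array in NSData, then build the image
unsigned char bytes[] = {137, 80, 78, 71, 13, 10, 26, 10 /* ...remaining PNG bytes... */};
NSData *pngData = [NSData dataWithBytes:bytes length:sizeof(bytes)];
UIImage *image = [UIImage imageWithData:pngData];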
The first 8 bytes in the byte array above, \211 P N G \r \n \032 \n (or 137,80,78,71,13,10,26,10 in decimal), reveal this to be a PNG file.
At the very least, you should be able to just save your entire byte sequence to a file, and load it using + (UIImage *)imageNamed:(NSString *)name or + (UIImage *)imageWithContentsOfFile:(NSString *)path. For example:
UIImage *myImage = [UIImage imageNamed:@"myfile.png"];
// myfile.png should be in the main bundle
(Apurv's method is more direct, and better for this reason. But since you are having such difficulty with it, I thought I'd suggest a slightly different approach.)
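A sketch of that save-then-load approach, assuming imageData is an NSData you built from the received bytes and the file name is hypothetical; note that imageNamed: only searches the main bundle, so a file written to Documents needs imageWithContentsOfFile: instead:
// Sketch: write the received bytes to Documents, then load the file back
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filePath = [docsDir stringByAppendingPathComponent:@"myfile.png"];
[imageData writeToFile:filePath atomically:YES];
UIImage *loadedImage = [UIImage imageWithContentsOfFile:filePath];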
You need to Base64 decode the data first. Data returned from many SOAP web services is base64 encoded, as is raw data sometimes embedded in websites. It is pretty simple to do, but it's easiest to use a library someone else has created.
Start by adding this to your project and importing its .h where you need it: https://github.com/nicklockwood/Base64
NSString *base64String = @"**YOUR BYTE ARRAY HERE**";
UIImage *imageOrig = [UIImage imageWithData:[NSData dataFromBase64String:base64String]];
UIImageView *imageView = [[UIImageView alloc]initWithImage:imageOrig];
That should do it. In my previous experience I just put whatever data blob I get over the web service into a string, then create the image using this method, and it works great. There is a good discussion of the details of Base64 encoding and decoding here: http://cocoadev.com/wiki/BaseSixtyFour, which is what I used to create my class, but Nick's code on GitHub is much better since it's ARC compliant.
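On iOS 7 and later you can also skip the third-party category and use NSData's built-in decoder; a sketch, reusing the base64String above:
// Sketch: decode the base64 string with Foundation's own initializer (iOS 7+)
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:base64String options:NSDataBase64DecodingIgnoreUnknownCharacters];
UIImage *imageOrig = [UIImage imageWithData:decoded];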
From the web service we get an array of NSNumbers. We will have to convert it to NSData like this:
NSMutableData *data = [[NSMutableData alloc] initWithCapacity:[strings count]];
for (NSNumber *number in strings) {
    char byte = [number charValue];
    [data appendBytes:&byte length:1];
}
Convert NSData to UIImage:
UIImage *imageOrig = [UIImage imageWithData:data];
We can also parse JSON out of NSData:
NSError *error1 = nil;
NSArray *jsonArray = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&error1];
if (error1 != nil) {
    NSLog(@"Error parsing JSON.");
} else {
    NSLog(@"Array: %@", jsonArray);
}
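Putting the two steps together, a sketch that assumes jsonArray is the array of NSNumbers parsed above:
// Sketch: jsonArray of NSNumbers -> NSData -> UIImage
NSMutableData *imageBytes = [[NSMutableData alloc] initWithCapacity:[jsonArray count]];
for (NSNumber *number in jsonArray) {
    char byte = [number charValue];
    [imageBytes appendBytes:&byte length:1];
}
UIImage *decodedImage = [UIImage imageWithData:imageBytes];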
// Use this if the bytes are raw RGBA pixel data rather than an encoded image
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef bitmapContext = CGBitmapContextCreate(YOUR_BYTE_ARRAY, w, h, 8, 4 * w, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault);
CGColorSpaceRelease(colorSpace);
CGImageRef cgImage = CGBitmapContextCreateImage(bitmapContext);
CGContextRelease(bitmapContext);
free(YOUR_BYTE_ARRAY); // free the pixel buffer only after the context is released
UIImage *newimage = [UIImage imageWithCGImage:cgImage];
[yourImageView setImage:newimage];
CGImageRelease(cgImage);
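Note that this route expects raw, premultiplied RGBA pixels, not an encoded PNG/JPEG; a tiny sketch of such a buffer (a 4x4 solid red image, values chosen arbitrarily for illustration):
// Sketch: a w x h buffer of premultiplied RGBA pixels filled with opaque red
size_t w = 4, h = 4;
unsigned char *pixels = (unsigned char *)malloc(w * h * 4);
for (size_t i = 0; i < w * h; i++) {
    pixels[i * 4 + 0] = 255; // R
    pixels[i * 4 + 1] = 0;   // G
    pixels[i * 4 + 2] = 0;   // B
    pixels[i * 4 + 3] = 255; // A
}
// pass pixels as YOUR_BYTE_ARRAY (and these w/h) to CGBitmapContextCreate above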
Maybe it will help you...
#import "NSDataAdditions.h"
NSData *dataObj = [NSData dataWithBase64EncodedString:StringImage];
UIImage *image = [UIImage imageWithData:dataObj];

Cannot load image with the path URL returned by ALAssets

I am writing an image on iPad using ALAssets. When it finishes, I try to create a UIImage with the returned URL, but it won't load. This is the code:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[anImage CGImage] orientation:(ALAssetOrientation)[anImage imageOrientation] completionBlock:^(NSURL *assetURL, NSError *error) {
    if (!error) {
        CGImageSourceRef src = CGImageSourceCreateWithURL((CFURLRef)[NSURL fileURLWithPath:[assetURL absoluteString]], NULL);
        // ...
    }
}];
My purpose is to save an image to the device, then convert it to another format using ImageIO, and finally send it to a web service. The CGImageSourceRef is NULL; I also tried a standard UIImage with the same result.
What I am doing wrong here?
EDIT: The problem is when creating the CFURLRef.
If I do
CGImageSourceCreateWithURL((CFURLRef) assetURL, NULL);
I got this error
ImageIO: CGImageSourceCreateWithURL CFURLCreateDataAndPropertiesFromResource failed with error code -11.
But if I try to convert the URL with
[NSURL fileURLWithPath:[assetURL absoluteString]]
the path is changed to
assets-library:/asset/asset.JPG%3Fid=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG -- file://localhost/
I cannot find how to properly create the CFURLRef needed by the method. I tried printing all the conversions I could think of, and these are the results:

[assetURL relativePath]   ->  /asset.JPG
[assetURL relativeString] ->  assets-library://asset/asset.JPG?id=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG
[assetURL absoluteURL]    ->  assets-library://asset/asset.JPG?id=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG
[assetURL absoluteString] ->  assets-library://asset/asset.JPG?id=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG

[NSURL fileURLWithPath:[assetURL relativePath]]   ->  file://localhost/asset.JPG
[NSURL fileURLWithPath:[assetURL relativeString]] ->  assets-library:/asset/asset.JPG%3Fid=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG -- file://localhost/
[NSURL fileURLWithPath:[assetURL absoluteString]] ->  assets-library:/asset/asset.JPG%3Fid=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG -- file://localhost/
Help please, I am stuck with this :-(
This is what I did for my case.
UIImage *anImage; // this is the original image
NSData *imgData = UIImageJPEGRepresentation(anImage, 0.7);
CGImageSourceRef src = CGImageSourceCreateWithData((CFDataRef)imgData, NULL);

NSMutableData *data = [NSMutableData data];
CFStringRef imageType = CFSTR("com.microsoft.bmp");
CGImageDestinationRef myImageDest = CGImageDestinationCreateWithData((CFMutableDataRef)data, imageType, 1, nil);

// Convert!
CGImageDestinationAddImageFromSource(myImageDest, src, 0, NULL); // no extra destination options needed
CGImageDestinationFinalize(myImageDest);

// Freeing things
CFRelease(myImageDest);
CFRelease(src);
But this just converts the image; it doesn't store it in any file... Not sure this should be an answer to the original question.
If you already have an ALAsset and your goal is a CGImageRef you can do something like this.
ALAssetRepresentation* rep = [asset defaultRepresentation];
NSDictionary* options = [[NSDictionary alloc] initWithObjectsAndKeys:
(id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
(id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageAlways,
(id)[NSNumber numberWithDouble:400], (id)kCGImageSourceThumbnailMaxPixelSize,
nil];
CGImageRef image = [rep CGImageWithOptions:options];
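If you then need a UIImage (for a UIImageView, say), a short follow-up sketch:
// Sketch: wrap the CGImageRef in a UIImage
UIImage *thumbImage = [UIImage imageWithCGImage:image];
// kCGImageSourceCreateThumbnailWithTransform above should already bake in the
// asset's orientation; otherwise use imageWithCGImage:scale:orientation: instead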

Convert UIImage to NSString (and vice-versa)

I need a method to convert a UIImage into an NSString and then convert the NSString back to a UIImage.
Thanks.
For iOS 7 and later:
- (NSString *)imageToNSString:(UIImage *)image
{
NSData *imageData = UIImagePNGRepresentation(image);
return [imageData base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
}
- (UIImage *)stringToUIImage:(NSString *)string
{
NSData *data = [[NSData alloc]initWithBase64EncodedString:string
options:NSDataBase64DecodingIgnoreUnknownCharacters];
return [UIImage imageWithData:data];
}
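Round-trip usage might look like this (a sketch; someImage is assumed to be an existing UIImage):
// Sketch: encode an image to a string and decode it back again
NSString *encoded = [self imageToNSString:someImage];
UIImage *restored = [self stringToUIImage:encoded];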
Convert it to a binary stream instead (NSData). This will depend on the format of your UIImage. If it's a JPEG/PNG for instance, you do:
NSData *data1 = UIImageJPEGRepresentation(image, 1.0);
NSData *data2 = UIImagePNGRepresentation(image);
UPDATE: Converting the binary data to an NSString is a bad idea; that is why we have the NSData class. The OP wants to be able to send it as a data stream and then reconstruct it again; NSString is not needed for this.
Convert to PNG or JPEG using UIImagePNGRepresentation or UIImageJPEGRepresentation, which will return an NSData, and then convert the NSData to a string (not sure how you want to do that mapping). How about just dealing with the NSData? You can read/write that to a file.

To add UIImage directly to file

I want to write my UIImage directly to a file instead of converting it with UIImagePNGRepresentation or UIImageJPEGRepresentation (as that takes time), like:
UIImage *im = [UIImage imageWithCGImage:ref];
[array addObject:im];
NSData *data = [array objectAtIndex:i];
[data writeToFile:path atomically:YES];
But it is showing an error.
So is there any way I can do this?
Thanks in advance.
Your use of the array only obfuscates that you are basically doing:
NSData *data = im;
which cannot possibly work, because im is a UIImage, not an NSData or a subclass of it.
What you want to do is to create a new NSData and initialize it with the content of the image. Since you got a CGImageRef, I suggest using it directly, without using a UIImage in between.
CGDataProviderRef imageDataProvider = CGImageGetDataProvider(ref);
CFDataRef imageData = CGDataProviderCopyData(imageDataProvider);
NSData *data = (NSData*) imageData;
Note that it is OK to cast the CFDataRef to NSData* because CFData is “toll-free bridged” with its Cocoa Foundation counterpart, NSData.
I hope that helps.
(don't forget to release data when done)
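If the goal is still a file on disk, a sketch of the final step (the path is hypothetical, and note that for a decoded CGImage this data is typically raw bitmap bytes rather than a PNG/JPEG container):
// Sketch: write the copied bytes to Documents, then release the CFData
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *outPath = [docsDir stringByAppendingPathComponent:@"image.raw"];
[data writeToFile:outPath atomically:YES];
CFRelease(imageData); // balances CGDataProviderCopyData (manual retain/release style, as above)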