I took a 13.3MB image and added it to Xcode. I'm aware that Xcode performs some tricks at compile time to bring down the file size. To test how big the image was after being converted into data, I did this:
UIImage *image = [UIImage imageNamed:@"image.jpg"];
NSData *data = UIImageJPEGRepresentation(image, 1.0);
NSLog(@"length: %lu", (unsigned long)data.length);
The length I got back was 26758066. If that's in bytes, then it reads to me as 26.7MB. How is the image so big suddenly? Is there another way for me to get the image in data form without going through UIImage first?
EDIT: Further testing reveals that this works and gives a data length of ~13.3MB, the expected amount:
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"jpg"];
NSData *data = [NSData dataWithContentsOfFile:filePath];
NSLog(@"length: %lu", (unsigned long)data.length);
What your code is doing is decompressing the image into memory and then recompressing as JPEG, with the highest quality ratio (q=1.0). That’s why the image is suddenly so big.
If you want to check up on the file as stored in the resource bundle, ask NSBundle for the full file path and use NSFileManager to read the file size. You can do the same thing by hand on your Mac, just take a look into the BUILD_PRODUCTS_DIR for your project.
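For example, a minimal sketch of that check, assuming the same image.jpg resource as in the question:
// Sketch: read the on-disk size of the bundled resource without decoding it.
NSString *bundlePath = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"jpg"];
NSError *error = nil;
NSDictionary *attributes = [[NSFileManager defaultManager] attributesOfItemAtPath:bundlePath error:&error];
if (attributes) {
    NSLog(@"on-disk size: %llu bytes", [attributes fileSize]); // fileSize is unsigned long long
} else {
    NSLog(@"could not read attributes: %@", error);
}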
Hello, I would like to run a thread and check the current downloaded size of a file.
This is what I use:
UIImage *image = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://lasp.colorado.edu/home/wp-content/uploads/2011/03/suncombo1.jpg"]]];
NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *jpegFilePath = [NSString stringWithFormat:@"%@/test.jpeg", docDir];
NSData *data2 = [NSData dataWithData:UIImageJPEGRepresentation(image, 1.0f)]; // 1.0f = 100% quality
[data2 writeToFile:jpegFilePath atomically:YES];
downloadStatus.text = [NSString stringWithFormat:@"size: %zd", malloc_size(data2)];
[image release];
I have also tried changing malloc_size(data2) to malloc_size(image), but again it does not give the real size. I know this does not use a thread and does not check during the download process, but what am I supposed to use here to see the file size?
A couple of observations:
Your question presumed that your attempts to retrieve the size of the NSData were failing. They are not. The correct way to get the size of a NSData is via length.
Your confusion, though, stems from a faulty assumption that taking an externally generated JPEG on a roundtrip through UIImage and UIImageJPEGRepresentation would yield the identical NSData. This would have been extraordinarily unlikely. There are too many different JPG settings that could have changed (see the JPEG Wikipedia page). We certainly don't know what settings that original file used. I know that UIImage and/or UIImageJPEGRepresentation changed the color space of the file. I'd wager it's doing a lot of other things, too.
So your results are correct. The original file was 2.5MB and the resulting file was 4.3MB. If you change the compressionQuality from 1.0 to 0.99, the resulting file is only 1.4MB! But if you want the original file, just save it first (like I do below).
Consider the following code which downloads the image file, saves it, loads it into a UIImage, re-extracts it via UIImageJPEGRepresentation, and saves another copy of the image:
// let's make filenames where we'll store the files
NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *suncomboOrig = [documentsPath stringByAppendingPathComponent:@"suncombo1-orig.jpg"];
NSString *suncomboReprocessed = [documentsPath stringByAppendingPathComponent:@"suncombo1-reprocessed.jpg"];
// let's download the original suncombo1.jpg, save it in Documents, and log the size
NSURL *url = [NSURL URLWithString:@"http://lasp.colorado.edu/home/wp-content/uploads/2011/03/suncombo1.jpg"];
NSData *data = [NSData dataWithContentsOfURL:url];
NSLog(@"original = %lu", (unsigned long)[data length]);
[data writeToFile:suncomboOrig atomically:NO];
// let's load that into a UIImage
UIImage *image = [UIImage imageWithData:data];
// let's extract data out of the image and write that to Documents, too, also logging its size
NSData *data2 = UIImageJPEGRepresentation(image, 1.0);
NSLog(@"reprocessed = %lu", (unsigned long)[data2 length]);
[data2 writeToFile:suncomboReprocessed atomically:NO];
Running that reports:
2012-12-13 22:30:39.576 imageapp[90647:c07] original = 2569128
2012-12-13 22:30:40.141 imageapp[90647:c07] reprocessed = 4382876
So the first file I saved (which I suspect is identical to what's on your server) was 2.5MB, and the file after a roundtrip through a UIImage and re-extraction via UIImageJPEGRepresentation was 4.3MB. If I look at the two files the above code saved, I can confirm that these NSData sizes are correct.
My original answer was predicated on the presumption that the OP was either unable to retrieve the size of a NSData or that there was some subtle issue underlying the simple question (such as wanting to get the size before the download commenced). Anyway, I've expanded my answer above, but I'll keep my original answer for historical purposes:
Original Answer:
The NSData property length tells you how many bytes were downloaded. E.g. [data2 length].
If it's really big, you can use NSURLConnection to download it asynchronously, which, depending upon your web server, may provide the total file size before the download commences, in the connection:didReceiveResponse: method (via the expectedContentLength property of the NSURLResponse parameter).
The other nice thing about NSURLConnection downloading is that you don't have to load the entire file in memory as you're downloading it, but rather you can stream it directly to persistent storage, which is especially useful if you're downloading multiple large files at the same time. If you're downloading a reasonably sized file, using NSURLConnection to download is overkill, but it can be nice when downloading large files and you want a progress indicator (or want to get the file size before the download commences).
But if you just want to know how many bytes were downloaded to your NSData, use length.
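For reference, a minimal sketch of that delegate callback, assuming your class adopts NSURLConnectionDataDelegate:
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
    // expectedContentLength is NSURLResponseUnknownLength (-1) when the server
    // sends no Content-Length header, so check before trusting it.
    long long expected = [response expectedContentLength];
    if (expected != NSURLResponseUnknownLength) {
        NSLog(@"total download will be %lld bytes", expected);
    }
}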
You can just use the C standard I/O functions to get the file size:
FILE *handle = fopen([jpegFilePath UTF8String], "r");
fseek(handle, 0, SEEK_END);   // seek to end of file
long size = ftell(handle);    // returns position in bytes
fclose(handle);
Because you say "current" size and mention a thread, I'm guessing you're trying to determine the file size as it is received. In that case, you can get the thread for free from an NSURLConnection, and you can get the data size from the delegate methods as it's received...
Create an instance variable for the downloaded data:
@property (nonatomic, strong) NSMutableData *data;
Create and launch a connection:
NSURL *url = [NSURL URLWithString:@"http://lasp.colorado.edu/home/wp-content/uploads/2011/03/suncombo1.jpg"];
NSURLRequest *request = [NSURLRequest requestWithURL:url];
self.data = [NSMutableData data];
NSURLConnection *connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
Implement the required methods in NSURLConnectionDataDelegate. For your question, the special part is this:
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    [self.data appendData:data];
    NSUInteger bytesSoFar = [self.data length];
    // you're on the app main thread here, so you can do something
    // to the UI to indicate progress
    downloadStatus.text = [NSString stringWithFormat:@"size: %lu", (unsigned long)bytesSoFar];
}
A good doc on the rest of the protocol is here. When the connection is complete, you can create the image as you did with the dataWithContentsOfURL...
- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    UIImage *image = [[UIImage alloc] initWithData:self.data];
    downloadStatus.text = [NSString stringWithFormat:@"size: %lu", (unsigned long)[self.data length]];
}
I have a problem decoding image data from a base64-encoded string.
I am using the base64.h and base64.m files downloaded from the following link:
http://cdn.imthi.com/e6cef8/wp-content/uploads/2010/08/base64.zip
This is my code
[Base64 initialize];
NSData * data = [Base64 decode:imageString];
imgview.image=[UIImage imageWithData:data];
but nothing is displayed in the image view.
I tested by decoding the base64 string (taken from the debugger console) with an online base64 decoder, and it gives the correct image.
I also tested by writing the data to a file like this:
[data writeToFile:imagePath atomically:YES];
It gives a .jpg file, but I can't open that image file. It gives an error message like:
The file “test.jpg” could not be opened.
"It may be damaged or use a file format that Preview doesn’t recognize."
What is the problem with my code? Can anyone help me?
Thank you
Try a different base64 implementation; I use the one from the Colloquy open source project:
#import "NSDataAdditions.h"
/* encoded string to image */
NSString *imageString = @"";
NSData *data = [NSData dataWithBase64EncodedString:imageString];
UIImage *image1 = [UIImage imageWithData:data];
/* image to encoded string, back to image */
imageString = [UIImagePNGRepresentation(image1) base64Encoding];
data = [NSData dataWithBase64EncodedString:imageString];
UIImage *image2 = [UIImage imageWithData:data];
Get NSAdditions files: NSAdditions.h + NSAdditions.m
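As an aside, if you can target iOS 7 or later, Foundation has built-in base64 decoding, so no third-party category is needed. A minimal sketch, assuming imageString holds the encoded string:
// Sketch: decode with the built-in NSData initializer (iOS 7+).
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:imageString
                                                      options:NSDataBase64DecodingIgnoreUnknownCharacters];
UIImage *decodedImage = [UIImage imageWithData:decoded];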
I have an NSMutableData containing a JPEG that I want to save in the device's photo library.
I can do it by converting it to a UIImage, but then it loses the EXIF data, so UIImage should be avoided.
EDIT: To reflect what I am trying to accomplish:
I am using the iphone-exif library to edit the metadata of a picture in my main bundle. I want to save the resulting image in the photo library
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"P1080330" ofType:@"JPG"];
NSData *uiJpeg = [NSData dataWithContentsOfFile:filePath];
EXFJpeg *jpegScanner = [[EXFJpeg alloc] init];
[jpegScanner scanImageData:uiJpeg];
NSLog(@"EXIF_Make %@", [jpegScanner.exifMetaData tagValue:[NSNumber numberWithInt:EXIF_Make]]); // OUTPUTS PANASONIC
// Change camera maker to FairyGodmother
[jpegScanner.exifMetaData addTagValue:@"FairyGodmother" forKey:[NSNumber numberWithInt:EXIF_Make]];
NSMutableData *newImageData = [[NSMutableData alloc] init];
[jpegScanner populateImageData:newImageData];
// NOW I HAVE THE NSMUTABLEDATA CONTAINING MY IMAGE WITH ALL MY EXIF DATA. HOW DO I PUT IT INTO THE PHOTO LIBRARY?
UIImageWriteToSavedPhotosAlbum (which takes a UIImage) is the only documented path to getting an image in the device's photo library, as far as I know.
What is your concern with "losing EXIF data"? Are you talking about orientation? (UIImage should take care of that for you.) Or other EXIF data?
The only way you can get an image out of the photo library is as a UIImage also, so I'm not sure what you stand to gain by shoving a raw JPEG stream into the library.
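That said, one API worth checking (not mentioned in the answer above): ALAssetsLibrary, available since iOS 4.1, can write raw image data plus a metadata dictionary to the photo library, which avoids the UIImage roundtrip. A minimal sketch, assuming newImageData is the NSMutableData built in the question:
#import <AssetsLibrary/AssetsLibrary.h>
// Sketch: hand the raw JPEG bytes (EXIF intact) to the photo library.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageDataToSavedPhotosAlbum:newImageData
                                 metadata:nil
                          completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"save failed: %@", error);
    } else {
        NSLog(@"saved to %@", assetURL);
    }
}];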
To write a file use:
[newImageData writeToFile:filePath atomically:YES]; // write the NSData (newImageData from the question), not a UIImage
I am attempting to save and load a UIImage to and from the iPhone documents directory after the image is picked from the iPhone Photo Library. It is able to do so, but for some reason, when I load the image, it rotates it 90 degrees counterclockwise. Here is my saveImage and loadImage methods:
Save Image:
- (void)saveImage:(UIImage *)image {
    if (image != nil)
    {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                             NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *path = [documentsDirectory stringByAppendingPathComponent:@"lePhoto.png"];
        //NSData *data = UIImagePNGRepresentation(image);
        //[data writeToFile:path atomically:YES];
        [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
    }
}
Load Image:
- (NSData *)loadImage {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                         NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *path = [documentsDirectory stringByAppendingPathComponent:@"lePhoto.png"];
    NSData *data = [[NSData alloc] initWithContentsOfFile:path];
    //UIImage *image = [[UIImage alloc] initWithData:data]; //my second try
    //UIImage *image = [UIImage imageWithContentsOfFile:path]; //first try
    //return image;
    return [data autorelease]; // the original [data release] after the return was unreachable
}
And I am now loading the image like this, to see if it would, for some crazy reason, solve my problem (it didn't, obviously):
UIImage* theImage = [[UIImage alloc] initWithData:[self loadImage]];
I have done some testing with two UIImageViews side by side: the one on the left doesn't save and load the image before display (it just straight-up shows the image once a photo is picked from the image picker), and the one on the right displays AFTER the save/load occurs. The one on the right is the only one rotating 90 degrees CCW; the non-save/load picture displays correctly. This only occurs on the actual iPhone; the simulator shows both images correctly. I have spent many, many hours attempting to figure it out, but to no avail. Any ideas as to what could be causing this?
EDIT: It should also be known that the image always says it's oriented up, even when it obviously isn't! It only rotates after ALL of the code has been run. I've tried doing a force redraw in between my code to see if that changes things, but it doesn't (though it is possible I'm even doing that wrong. Who knows, anymore).
Also, the rotation only happens with photos taken by the camera; screenshots taken on the iPhone come through fine. [I haven't tried downloaded photos.] It's the weirdest thing...
And if you're wondering exactly how I'm pulling my picture, here's the method that will at least shed light on what I'm doing (it's not every method needed to do this function, obviously, but it gives insight on how I've written my code):
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    theimageView.image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    [self saveImage:[info objectForKey:@"UIImagePickerControllerOriginalImage"]];
    [picker dismissModalViewControllerAnimated:YES];
}
Any advice is greatly appreciated. And I do know this much: if there's a way to do something crazy to get some stupid, never-before-heard-of problem, chances are, I'll find it, apparently :P
OK, I figured it out. Apparently when you save an image with UIImagePNGRepresentation, as opposed to UIImageJPEGRepresentation, the image orientation information is not saved with it. Because of that, every image you load will default to orientation up. So the easiest fix is to do this:
NSData* data = UIImageJPEGRepresentation(image,0.0);
[data writeToFile:path atomically:YES];
instead of this:
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
And, of course, change all the picture names from .png to .jpg. The 0.0 in the code above is the JPEG compressionQuality parameter: 0.0 gives maximum compression (lowest quality) and 1.0 the least compression (highest quality). Hope this helps people in the future!
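If you do want to stick with PNG, one common workaround (a sketch, not part of the original answer; image and path are the same variables used above) is to redraw the image so the pixels themselves are upright before calling UIImagePNGRepresentation:
// Sketch: bake the orientation into the pixels, then save as PNG.
UIImage *uprightImage = image;
if (image.imageOrientation != UIImageOrientationUp) {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    uprightImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
[UIImagePNGRepresentation(uprightImage) writeToFile:path atomically:YES];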
I assume you're using the Assets Library Framework to get the image. If so, you'll want to get the orientation of the image in your results block, and manipulate the UIImage to adjust. I have something like the following:
ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset* asset)
{
ALAssetRepresentation* rep = [asset defaultRepresentation];
_orientation = rep.orientation;
...
}
Aside: What a great type name, huh? Holy moly, does Objective-C need namespaces!
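To complete the thought: ALAssetOrientation uses the same raw values as UIImageOrientation, so inside that results block you could, as a sketch, build a correctly oriented UIImage straight from the representation:
// Sketch (inside the results block): carry the asset's orientation into the UIImage.
CGImageRef fullImage = [rep fullResolutionImage];
UIImage *oriented = [UIImage imageWithCGImage:fullImage
                                        scale:[rep scale]
                                  orientation:(UIImageOrientation)[rep orientation]];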
Thanks so much...
This code has worked and saved several hours of work for me.
NSData* data = UIImageJPEGRepresentation(image,0.0);
[data writeToFile:path atomically:YES];
in the place of
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
I'm trying to load pictures from the iPhone Photos app and then save the selected pictures into my app's folder using ALAssetsLibrary, and I have two issues:
1. The image file saved to disk is much bigger than the original file in the Photos app. For example, a picture is 2.8MB, but after being saved to my app's folder it's 6.4MB. Following is the code:
CGImageRef cgImage = [localPhoto thumbnail];
NSString *path = @"/documents/test/1.jpeg"; // the path is just an example
BOOL succ;
UIImage *image1 = [UIImage imageWithCGImage:cgImage];
NSData *data1 = UIImageJPEGRepresentation(image1, 1.0);
succ = [data1 writeToFile:path atomically:YES];
2. The above code (saving the 6.4MB image to a file) takes about 1.6 seconds. Is that normal? Is there any way to make it faster?
Try the PNG representation of the image:
NSData *data1 = UIImagePNGRepresentation(image1);
or else reduce the JPEG compression quality:
NSData *data1 = UIImageJPEGRepresentation(image1, 0.5);
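Another option worth considering (a sketch, not from the answer above): since localPhoto appears to be an ALAsset, you can copy the asset's original bytes straight to disk instead of re-encoding through UIImageJPEGRepresentation, which keeps the original file size and skips the slow re-compression. Assuming path is the destination path from the question:
// Sketch: write the asset's original bytes to disk, no re-encoding involved.
ALAssetRepresentation *rep = [localPhoto defaultRepresentation];
long long assetSize = [rep size];
NSMutableData *original = [NSMutableData dataWithLength:(NSUInteger)assetSize];
NSError *error = nil;
NSUInteger copied = [rep getBytes:(uint8_t *)[original mutableBytes]
                       fromOffset:0
                           length:(NSUInteger)assetSize
                            error:&error];
if (copied == (NSUInteger)assetSize) {
    [original writeToFile:path atomically:YES];
} else {
    NSLog(@"could not read asset bytes: %@", error);
}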