EXIF data in NSData image representation - iPhone

I'm trying to create a PhoneGap plugin which uses AVFoundation to capture photos and return the image as a base64-encoded string. This all works fine, but the EXIF data seems to be missing from the returned image. I've tried a few variations and got some pointers from other questions asked here, but it still doesn't seem to work correctly. The current code is:
NSMutableString *stringToReturn = [NSMutableString stringWithString:@""];
NSLog(@"about to request a capture from: %@", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageBuffer, NSError *error)
{
    if (imageBuffer != NULL)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageBuffer, kCGImagePropertyExifDictionary, NULL);
        CFMutableDictionaryRef mutable;
        if (exifAttachments)
        {
            //
            // Set orientation
            //
            mutable = CFDictionaryCreateMutableCopy(NULL, 0, exifAttachments);
            int orientation = (int)[UIDevice currentDevice].orientation;
            CFDictionaryAddValue(mutable, kCGImagePropertyOrientation, CFNumberCreate(NULL, kCFNumberIntType, &orientation));
            CMSetAttachments(imageBuffer, mutable, kCMAttachmentMode_ShouldPropagate);
            NSLog(@"attachments: %@", mutable);
        }
        //
        // Get the image
        //
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageBuffer];
        [stringToReturn setString:[imageData base64EncodedString]];
        //
        // Call the success callback
        //
        [self performSelectorOnMainThread:@selector(doSuccessCallback:) withObject:[NSString stringWithString:stringToReturn] waitUntilDone:NO];
        //
        // Save it to the photo album.
        // It is done after the success callback to avoid accidental issues with
        // the photo album stopping further processing.
        //
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
    }
}];
Can anyone shed some light please? I seem to be completely stuck here.
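For what it's worth, one way to make sure the metadata ends up inside the JPEG bytes themselves (rather than only as sample-buffer attachments) would be to rewrite the data through ImageIO before base64-encoding it. This is only a minimal, untested sketch reusing imageData and exifAttachments from the code above; the exact property keys to carry across are an assumption, and note that kCGImagePropertyOrientation expects an EXIF orientation value (1-8), not a UIDeviceOrientation:

CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)imageData, NULL);
NSMutableData *dataWithExif = [NSMutableData data];
CGImageDestinationRef destination = CGImageDestinationCreateWithData((CFMutableDataRef)dataWithExif, CGImageSourceGetType(source), 1, NULL);

// Top-level properties: the EXIF dictionary plus an EXIF-style orientation tag.
NSDictionary *properties = [NSDictionary dictionaryWithObjectsAndKeys:
                            (id)exifAttachments, (id)kCGImagePropertyExifDictionary,
                            [NSNumber numberWithInt:1], (id)kCGImagePropertyOrientation, // 1 = "up"; substitute the real value
                            nil];

// Copy image 0 across unchanged, attaching the metadata as it goes.
CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)properties);
CGImageDestinationFinalize(destination);
CFRelease(destination);
CFRelease(source);

// dataWithExif now holds JPEG bytes with the metadata embedded; base64-encode this instead of imageData.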

Related

Rotation of video to be sent to Web Service

What I'm trying to do sounds simple enough, but I can't figure out for the life of me how to do it. When sending a video to my web service, it is always rotated 90 degrees. I want to be able to rotate it so it isn't sideways. I have all the code working perfectly up until this point.
I had the same issue with pictures, but I was able to make a category that checked the orientation of the UIImage object and adjusted it accordingly.
The following is the code I'm using to save the movie. At some point before I send the video off, I want to have it rotated.
else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie])
{
    NSLog(@"Movie");
    NSData *videoData = [NSData dataWithContentsOfURL:videoURL];
    NSString *moviePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
        UISaveVideoAtPathToSavedPhotosAlbum(moviePath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        [[WebCall sharedClient] postRequestWithVideo:^(AFHTTPRequestOperation *operation, id responseObject) {
            connectionLabel.text = operation.responseString;
        } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
            connectionLabel.text = operation.responseString;
        } atPath:videoData];
    }
}
Please Help.
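One avenue, offered only as a sketch (the destination path here is hypothetical): export the movie through AVAssetExportSession with a video composition that bakes the track's preferredTransform into the pixels, so the receiving service no longer depends on rotation metadata.

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// One instruction covering the whole movie, applying the track's own preferredTransform.
AVMutableVideoComposition *composition = [AVMutableVideoComposition videoComposition];
composition.frameDuration = CMTimeMake(1, 30);
// For a 90-degree rotation the rendered size swaps width and height.
composition.renderSize = CGSizeMake(track.naturalSize.height, track.naturalSize.width);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:track];
// Depending on the source, an extra translation may be needed to keep the frame in view.
[layerInstruction setTransform:track.preferredTransform atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
composition.instructions = [NSArray arrayWithObject:instruction];

AVAssetExportSession *export = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
export.videoComposition = composition;
export.outputFileType = AVFileTypeQuickTimeMovie;
export.outputURL = [NSURL fileURLWithPath:rotatedPath]; // hypothetical destination path
[export exportAsynchronouslyWithCompletionHandler:^{
    // On AVAssetExportSessionStatusCompleted, read the rotated file and post it instead of videoData.
}];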

UIGraphicsBeginImageContext, email image from app

I am trying to attach an image rendered with UIGraphicsBeginImageContext. As a test, I am adding the image to the photo album as well. It all works perfectly on the simulator, but on the device the correct image gets added to the photo album, yet it will not display correctly in the email attachment. I think it is because it is a big image and it takes some time on an iPhone 3GS, which means I have to check whether it is done rendering the image. Is there a way to do that? Here is my code:
UIGraphicsBeginImageContext(backGround.layer.frame.size);
[backGround.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
UIGraphicsEndImageContext();

MFMailComposeViewController *mailer = [[MFMailComposeViewController alloc] init];
mailer.mailComposeDelegate = self;
NSData *imgData = UIImagePNGRepresentation(image);
[mailer addAttachmentData:imgData mimeType:@"image/png" fileName:@"myfilename"];
I am wondering whether it is not completely finished with the image, so that it still has corrupted data when I make the PNG representation of it. Can I somehow check if the UIImage is 'done'?
Try implementing the completion method and checking it there. One example:
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"SAVE IMAGE COMPLETE");
    if (error != nil) {
        NSLog(@"ERROR SAVING: %@", [error localizedDescription]);
    }
}
Based on the error you can debug further. Check the UIImageWriteToSavedPhotosAlbum documentation for more details.
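If the aim is to guarantee the attachment is only built once the save has finished, one (hypothetical) arrangement is to construct the mail composer inside that same callback, for example:

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error != nil) {
        NSLog(@"ERROR SAVING: %@", [error localizedDescription]);
        return;
    }
    // Only now build and present the mail composer (sketch; assumes this
    // controller is also the MFMailComposeViewControllerDelegate).
    MFMailComposeViewController *mailer = [[MFMailComposeViewController alloc] init];
    mailer.mailComposeDelegate = self;
    NSData *imgData = UIImagePNGRepresentation(image);
    [mailer addAttachmentData:imgData mimeType:@"image/png" fileName:@"myfilename"];
    [self presentModalViewController:mailer animated:YES];
    [mailer release];
}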

Cannot load image with the path URL returned by ALAssets

I am writing an image on the iPad using ALAssets. When it finishes, I try to create a UIImage with the returned URL, but it won't load. This is the code:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[anImage CGImage] orientation:(ALAssetOrientation)[anImage imageOrientation] completionBlock:^(NSURL *assetURL, NSError *error){
    if (!error) {
        CGImageSourceRef src = CGImageSourceCreateWithURL((CFURLRef)[NSURL fileURLWithPath:[assetURL absoluteString]], NULL);
My purpose is to save an image to the device, then convert it to another format using ImageIO, and finally send it to a web service. The CGImageSourceRef is NULL; I also tried with a standard UIImage, with the same result.
What am I doing wrong here?
EDIT: The problem is when creating the CFURLRef.
If I do
CGImageSourceCreateWithURL((CFURLRef) assetURL, NULL);
I got this error
ImageIO: CGImageSourceCreateWithURL CFURLCreateDataAndPropertiesFromResource failed with error code -11.
But if I try to convert the URL with
[NSURL fileURLWithPath:[assetURL absoluteString]]
the path is changed to
assets-library:/asset/asset.JPG%3Fid=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG -- file://localhost/
I cannot find how to properly create the CFURLRef needed by the method. I tried printing all the conversions I could think of, and these are the results:
[assetURL relativePath]   -> /asset.JPG
[assetURL relativeString] -> assets-library://asset/asset.JPG?id=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG
[assetURL absoluteURL]    -> assets-library://asset/asset.JPG?id=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG
[assetURL absoluteString] -> assets-library://asset/asset.JPG?id=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG

[NSURL fileURLWithPath:[assetURL relativePath]]   -> file://localhost/asset.JPG
[NSURL fileURLWithPath:[assetURL relativeString]] -> assets-library:/asset/asset.JPG%3Fid=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG -- file://localhost/
[NSURL fileURLWithPath:[assetURL absoluteString]] -> assets-library:/asset/asset.JPG%3Fid=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG -- file://localhost/
Help please, I am stuck with this :-(
This is what I did for my case.
UIImage *anImage; // this is the original image
NSData *imgData = UIImageJPEGRepresentation(anImage, 0.7);
CGImageSourceRef src = CGImageSourceCreateWithData((CFDataRef)imgData, NULL);

NSMutableData *data = [NSMutableData data];
CFStringRef imageType = CFSTR("com.microsoft.bmp");
CGImageDestinationRef myImageDest = CGImageDestinationCreateWithData((CFMutableDataRef)data, imageType, 1, nil);

// Convert! (no special conversion options are needed, so pass NULL)
CGImageDestinationAddImageFromSource(myImageDest, src, 0, NULL);
CGImageDestinationFinalize(myImageDest);

// Freeing things
CFRelease(myImageDest);
CFRelease(src);
But this just converts the image; it doesn't store it in any file... Not sure this should be an answer to the original question.
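If a file on disk is needed as well, a short hypothetical follow-up would be to write the converted bytes out:

// File name and location are just an example.
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *bmpPath = [docs stringByAppendingPathComponent:@"converted.bmp"];
[data writeToFile:bmpPath atomically:YES];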
If you already have an ALAsset and your goal is a CGImageRef, you can do something like this:
ALAssetRepresentation *rep = [asset defaultRepresentation];
NSDictionary *options = [[NSDictionary alloc] initWithObjectsAndKeys:
                         (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                         (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageAlways,
                         (id)[NSNumber numberWithDouble:400], (id)kCGImageSourceThumbnailMaxPixelSize,
                         nil];
CGImageRef image = [rep CGImageWithOptions:options];
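A small usage note: the resulting CGImageRef can be wrapped for UIKit with imageWithCGImage:, and under manual reference counting the alloc'd options dictionary should be released once it is no longer needed:

UIImage *uiImage = [UIImage imageWithCGImage:image];
[options release]; // options was alloc/init'd above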

AVCaptureMovieFileOutput causing wrong orientation when saving photo to Camera Roll

I have an odd problem with captureStillImageAsynchronouslyFromConnection. If I save the image using jpegStillImageNSDataRepresentation while the video is mirrored, the image in the camera roll is rotated 90 degrees clockwise. However, if it's not mirrored, the orientation is fine.
I'll post the code below; does anyone else have this problem or know of a fix?
Update: Just ran some tests: the heights and widths (640x480) are fine and reflect the device's orientation. When I take a picture in portrait, it reports UIImageOrientationLeft, and when mirrored, UIImageOrientationLeftMirrored.
Update 2: When I view the saved photo in the camera roll, the preview of the image has the right orientation, as does the image when you swipe between photos, but when the photo is fully loaded, it rotates 90 degrees. Could this be a camera roll problem? (I'm on 4.3.3)
- (void)captureImageAndSaveToCameraRoll
{
    AVCaptureConnection *stillImageConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self stillImageOutput] connections]];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:[self orientation]];

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error) {
                if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
                    [[self delegate] captureManager:self didFailWithError:error];
                }
            }
        };

        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            [library writeImageToSavedPhotosAlbum:[image CGImage]
                                      orientation:(ALAssetOrientation)[image imageOrientation]
                                  completionBlock:completionBlock];
            [image release];
            [library release];
        }
        else
            completionBlock(nil, error);

        if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
            [[self delegate] captureManagerStillImageCaptured:self];
        }
    }];
}
I'm wondering if the newly created image really has its orientation set as you are assuming here:
[library writeImageToSavedPhotosAlbum:[image CGImage] orientation: (ALAssetOrientation) [image imageOrientation] completionBlock:completionBlock];
[image imageOrientation] in particular seems a potential source of the problem, especially given the cast required...
You say you are seeing the image orientation somewhere; is it from [image imageOrientation] just after creating it with:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation: imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
I'm wondering if you are assuming that this metadata is in the imageData returned from AVCaptureStillImageOutput jpegStillImageNSDataRepresentation: when that in fact only contains the image pixel data itself?
According to some reports, the iPhone's Photos application back in 2009 didn't support any of the mirrored orientations at all; it's certainly possible that this was partially fixed but still has bugs in some cases (I know, for example, that UIImageView doesn't handle contentStretch correctly in some cases). This should be easy enough to test: just load sample images with mirrored orientation tags into the camera roll and see what happens.
Create the category for UIImage described in "iOS UIImagePickerController result image orientation after upload" and call this method on the image before saving it to your library.
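For reference, a minimal sketch of what such a category typically looks like (the names here are illustrative, not taken from the linked answer): it redraws the image so the pixel data itself is the right way up.

@interface UIImage (NormalizedOrientation)
- (UIImage *)normalizedImage;
@end

@implementation UIImage (NormalizedOrientation)
- (UIImage *)normalizedImage
{
    if (self.imageOrientation == UIImageOrientationUp) return self;
    // Redrawing through UIKit applies the orientation transform for us.
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
@end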

iOS: Select a GIF from the photo library, convert to NSData for use in multipart/form-data

What's currently working in my code:
I select a JPG or PNG from the Photo Library (using standard ImagePicker methods), and convert that image to NSData using:
self.myImageData = UIImageJPEGRepresentation(myImage, 0.9);
which I then post to a server using multipart/form-data.
I now want to do the same for a GIF, while retaining the original GIF data (so that an animated GIF going into the library, comes back out still animating).
In didFinishPickingMediaWithInfo, I am able to get the URL of the original GIF using
self.myGIFURL = [info objectForKey:UIImagePickerControllerReferenceURL].
Here's one example of what that might get me:
assets-library://asset/asset.GIF?id=1000000034&ext=GIF
Here are two ways I've now tried to push this GIF into NSData, and each time myImageData shows (null).
I've tried to use initWithContentsOfURL:
NSData *dataFromGIFURL = [[NSData alloc] initWithContentsOfURL: myGIFURL];
self.myImageData = dataFromGIFURL;
[dataFromGIFURL release];
Then I tried converting the NSURL to a string for initWithContentsOfFile:
NSString *stringFromURL = [NSString stringWithFormat:@"%@", myGIFURL];
NSData *dataFromGIFURL = [[NSData alloc] initWithContentsOfFile: stringFromURL];
self.myImageData = dataFromGIFURL;
[dataFromGIFURL release];
Any suggestions? Thanks.
The UIImagePickerControllerReferenceURL key doesn't appear until iOS 4.1. I therefore take it as implicit in your question that it's fine to use the AssetsLibrary framework, which first appeared in iOS 4.0. In which case, you can use the following:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:[info objectForKey:UIImagePickerControllerReferenceURL]
             resultBlock:^(ALAsset *asset)
             {
                 ALAssetRepresentation *representation = [asset defaultRepresentation];
                 NSLog(@"size of asset in bytes: %d", [representation size]);

                 unsigned char bytes[4];
                 [representation getBytes:bytes fromOffset:0 length:4 error:nil];
                 NSLog(@"first four bytes: %02x (%c) %02x (%c) %02x (%c) %02x (%c)",
                       bytes[0], bytes[0],
                       bytes[1], bytes[1],
                       bytes[2], bytes[2],
                       bytes[3], bytes[3]);

                 [library autorelease];
             }
            failureBlock:^(NSError *error)
            {
                NSLog(@"couldn't get asset: %@", error);
                [library autorelease];
            }
    ];
}
So, you create an ALAssetsLibrary, ask it to find you the asset with the URL specified (it understands the assets-library:// URL scheme), then when you get the asset you grab its default representation and use that to feed you the bytes. They'll be the actual on-disk bytes, the default representation for an asset from the library being its on-disk form.
For example, selecting a particular GIF I grabbed at random from Google images, from an image picker wired up to a delegate with that method in it gives me the output:
2011-03-03 23:17:37.451 IPTest[1199:307] size of asset in bytes: 174960
2011-03-03 23:17:37.459 IPTest[1199:307] first four bytes: 47 (G) 49 (I) 46 (F) 38 (8)
So that's the beginning of the standard GIF header. Picking PNGs or JPGs gives the recognisable first four bytes of the PNG and JPG headers.
EDIT: to finish the thought, obviously you can use ALAssetRepresentation to read all of the bytes describing the file into a suitably malloc'd C array, then use NSData +(id)dataWithBytes:length: (or, more likely, +dataWithBytesNoCopy:length:freeWhenDone:) to wrap that into an NSData.
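A minimal sketch of that final step, assuming the asset has already been fetched as above (error handling elided):

ALAssetRepresentation *representation = [asset defaultRepresentation];
long long size = [representation size];
// Read the on-disk bytes into a malloc'd buffer...
uint8_t *buffer = (uint8_t *)malloc((size_t)size);
NSError *error = nil;
NSUInteger bytesRead = [representation getBytes:buffer fromOffset:0 length:(NSUInteger)size error:&error];
// ...and wrap them in an NSData that takes ownership of the buffer.
NSData *gifData = [NSData dataWithBytesNoCopy:buffer length:bytesRead freeWhenDone:YES];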
Here's a version that uses the newer Photos framework:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSURL *refUrl = [info objectForKey:UIImagePickerControllerReferenceURL];
    if (refUrl) {
        PHAsset *asset = [[PHAsset fetchAssetsWithALAssetURLs:@[refUrl] options:nil] lastObject];
        if (asset) {
            PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
            options.synchronous = YES;
            options.networkAccessAllowed = NO;
            options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
            [[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
                NSNumber *isError = [info objectForKey:PHImageErrorKey];
                NSNumber *isCloud = [info objectForKey:PHImageResultIsInCloudKey];
                if ([isError boolValue] || [isCloud boolValue] || !imageData) {
                    // fail
                } else {
                    // success, data is in imageData
                }
            }];
        }
    }
}
Here's Eli's version using Swift 3:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String: Any]) {
    guard let imageURL = info[UIImagePickerControllerReferenceURL] as? URL else { return }
    guard let asset = PHAsset.fetchAssets(withALAssetURLs: [imageURL], options: nil).lastObject else { return }
    if picker.sourceType == .photoLibrary || picker.sourceType == .savedPhotosAlbum {
        let options = PHImageRequestOptions()
        options.isSynchronous = true
        options.isNetworkAccessAllowed = false
        options.deliveryMode = .highQualityFormat
        PHImageManager.default().requestImageData(for: asset, options: options) { data, uti, orientation, info in
            guard let info = info else { return }
            if let error = info[PHImageErrorKey] as? Error {
                log.error("Cannot fetch data for GIF image: \(error)")
                return
            }
            if let isInCloud = info[PHImageResultIsInCloudKey] as? Bool, isInCloud {
                log.error("Cannot fetch data from cloud. Option for network access not set.")
                return
            }
            // do something with data (it is a Data object)
        }
    } else {
        // do something with media taken via camera
    }
}