App crash when getting fullResolutionImage - iPhone

I am using ALAssetRepresentation in my app, and when I loop over the images there are a couple of images that crash the app:
for (ALAsset *asset in _assets) {
    NSMutableDictionary *workingDictionary = [[NSMutableDictionary alloc] init];
    [workingDictionary setObject:[asset valueForProperty:ALAssetPropertyType] forKey:@"UIImagePickerControllerMediaType"];
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    if (!representation) {
        [workingDictionary release];
        continue;
    }
    CGImageRef imageRef = [representation fullResolutionImage]; // here the app crashes
    UIImage *img = [UIImage imageWithCGImage:imageRef];
    if (!img) {
        [workingDictionary release];
        continue;
    }
    [workingDictionary setObject:img forKey:@"UIImagePickerControllerOriginalImage"];
    [workingDictionary setObject:[asset valueForProperty:ALAssetPropertyOrientation] forKey:@"orientation"];
    [returnArray addObject:workingDictionary];
    [workingDictionary release];
}
On this line I get a crash without any message:
CGImageRef imageRef = [representation fullResolutionImage];
This is the crash message:
Program received signal: “0”.
Data Formatters temporarily unavailable, will re-try after a 'continue'. (Unknown error loading shared library "/Developer/usr/lib/libXcodeDebuggerSupport.dylib")

That is most likely due to running out of memory. How big are the images that cause the crash?
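If a screen-sized image is enough for what you do with the dictionary later, you can keep peak memory much lower by draining temporaries every iteration and avoiding fullResolutionImage. A minimal sketch of that idea (not the asker's exact code):
for (ALAsset *asset in _assets) {
    @autoreleasepool {
        ALAssetRepresentation *rep = [asset defaultRepresentation];
        // fullScreenImage is decoded at screen resolution and needs far less memory
        // than fullResolutionImage; the pool drains the temporaries each pass.
        UIImage *img = [UIImage imageWithCGImage:[rep fullScreenImage]];
        if (img) {
            // ... build workingDictionary and add it to returnArray as in the question ...
        }
    }
}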

I had a similar problem, and after hours of looking for a solution I found this - the best workaround for the too-big-asset bug:
// For details, see http://mindsea.com/2012/12/18/downscaling-huge-alassets-without-fear-of-sigkill
#import <AssetsLibrary/AssetsLibrary.h>
#import <ImageIO/ImageIO.h>
// Helper methods for thumbnailForAsset:maxPixelSize:
static size_t getAssetBytesCallback(void *info, void *buffer, off_t position, size_t count) {
    ALAssetRepresentation *rep = (__bridge id)info;
    NSError *error = nil;
    size_t countRead = [rep getBytes:(uint8_t *)buffer fromOffset:position length:count error:&error];
    if (countRead == 0 && error) {
        // We have no way of passing this info back to the caller, so we log it, at least.
        NSLog(@"thumbnailForAsset:maxPixelSize: got an error reading an asset: %@", error);
    }
    return countRead;
}

static void releaseAssetCallback(void *info) {
    // The info here is an ALAssetRepresentation which we CFRetain in thumbnailForAsset:maxPixelSize:.
    // This release balances that retain.
    CFRelease(info);
}

// Returns a UIImage for the given asset, with size length at most the passed size.
// The resulting UIImage will be already rotated to UIImageOrientationUp, so its CGImageRef
// can be used directly without additional rotation handling.
// This is done synchronously, so you should call this method on a background queue/thread.
- (UIImage *)thumbnailForAsset:(ALAsset *)asset maxPixelSize:(NSUInteger)size {
    NSParameterAssert(asset != nil);
    NSParameterAssert(size > 0);
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    CGDataProviderDirectCallbacks callbacks = {
        .version = 0,
        .getBytePointer = NULL,
        .releaseBytePointer = NULL,
        .getBytesAtPosition = getAssetBytesCallback,
        .releaseInfo = releaseAssetCallback,
    };
    CGDataProviderRef provider = CGDataProviderCreateDirect((void *)CFBridgingRetain(rep), [rep size], &callbacks);
    CGImageSourceRef source = CGImageSourceCreateWithDataProvider(provider, NULL);
    CGImageRef imageRef = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef) @{
        (NSString *)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (NSString *)kCGImageSourceThumbnailMaxPixelSize : @(size),
        (NSString *)kCGImageSourceCreateThumbnailWithTransform : @YES,
    });
    CFRelease(source);
    CFRelease(provider);
    if (!imageRef) {
        return nil;
    }
    UIImage *toReturn = [UIImage imageWithCGImage:imageRef];
    CFRelease(imageRef);
    return toReturn;
}

Related

Memory warning while loading above 100 images from specific album of photo library

Please find below the code for fetching images from the Photo Library:
- (void)initilizeAssetForPhotoLibrary {
    if (!assets) {
        assets = [[NSMutableArray alloc] init];
    } else {
        [assets removeAllObjects];
    }
    ALAssetsGroupEnumerationResultsBlock assetsEnumerationBlock = ^(ALAsset *result, NSUInteger index, BOOL *stop) {
        if (result) {
            [assets addObject:result];
        }
    };
    ALAssetsFilter *onlyPhotosFilter = [ALAssetsFilter allPhotos];
    [assetsGroup setAssetsFilter:onlyPhotosFilter];
    [assetsGroup enumerateAssetsUsingBlock:assetsEnumerationBlock];
}

- (NSMutableArray *)getImagesFromPhotoLibrary {
    for (int i = 0; i < assets.count; i++) {
        ALAsset *asset = [assets objectAtIndex:i];
        ALAssetRepresentation *assetRepresentation = [asset defaultRepresentation];
        UIImage *getImage = [UIImage imageWithCGImage:[assetRepresentation fullScreenImage] scale:[assetRepresentation scale] orientation:(UIImageOrientation)[assetRepresentation orientation]];
        [arrImages addObject:getImage];
    }
    return arrImages;
}
Actually my requirement is that I need to load all images from a specific album into the application for creating a slideshow.
When I load fewer than 100 images it works fine, but above that I get a memory warning and after that it crashes.
Please help me if anyone has a different idea to load images with less memory consumption. Thanks in advance.
Keeping all the images in memory will not do; there is just not enough memory for that.
You will need to fill the array with the ALAssetRepresentation objects of the images and load the image only when you are ready to display it. This way you will only have in memory the image you are actually displaying, as sketched below.
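A minimal sketch of that lazy-loading idea, using a table view as an example (the cell identifier and UI here are illustrative, not from the question):
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"PhotoCell" forIndexPath:indexPath];
    // Keep only the lightweight ALAsset objects in the array...
    ALAsset *asset = [assets objectAtIndex:indexPath.row];
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    // ...and decode the image only for the row that is about to appear on screen.
    cell.imageView.image = [UIImage imageWithCGImage:[rep fullScreenImage]
                                               scale:[rep scale]
                                         orientation:(UIImageOrientation)[rep orientation]];
    return cell;
}
The same idea applies to a slideshow: build the UIImage only when a slide is about to be shown, and let the previous one be released.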
Then try the code below and let me know whether it works:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void) {
    for (int i = 0; i < assets.count; i++) {
        ALAsset *asset = [assets objectAtIndex:i];
        ALAssetRepresentation *assetRepresentation = [asset defaultRepresentation];
        UIImage *getImage = [UIImage imageWithCGImage:[assetRepresentation fullScreenImage] scale:[assetRepresentation scale] orientation:(UIImageOrientation)[assetRepresentation orientation]];
        [arrImages addObject:getImage];
    }
    // The block cannot return a value; arrImages is now populated, so notify the UI
    // (e.g. dispatch back to the main queue) when you are ready to use it.
});
Happy Coding...!!!!

How to retrieve images from Instagram which have a special hashtag?

My client wants to share an image on Instagram. I have implemented sharing an image on Instagram, but I could not share it with a special hashtag. Here is my code so far.
- (IBAction)sharePhotoOnInstagram:(id)sender {
    UIImagePickerController *imgpicker = [[UIImagePickerController alloc] init];
    imgpicker.delegate = self;
    [self storeimage];
    NSURL *instagramURL = [NSURL URLWithString:@"instagram://app"];
    if ([[UIApplication sharedApplication] canOpenURL:instagramURL])
    {
        CGRect rect = CGRectMake(0, 0, 612, 612);
        NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/15717.ig"];
        NSURL *igImageHookFile = [[NSURL alloc] initWithString:[[NSString alloc] initWithFormat:@"file://%@", jpgPath]];
        // Create the controller first, then configure it (setting properties on a nil controller does nothing).
        dic = [self setupControllerWithURL:igImageHookFile usingDelegate:self];
        dic.UTI = @"com.instagram.photo";
        dic.delegate = self;
        [dic presentOpenInMenuFromRect:rect inView:self.view animated:YES];
        // [[UIApplication sharedApplication] openURL:instagramURL];
    }
    else
    {
        // NSLog(@"instagramImageShare");
        UIAlertView *errorToShare = [[UIAlertView alloc] initWithTitle:@"Instagram unavailable " message:@"You need to install Instagram in your device in order to share this image" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
        errorToShare.tag = 3010;
        [errorToShare show];
    }
}
- (void)storeimage
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *savedImagePath = [documentsDirectory stringByAppendingPathComponent:@"15717.ig"];
    UIImage *NewImg = [self resizedImage:picTaken :CGRectMake(0, 0, 612, 612)]; // picTaken is the captured UIImage (ivar)
    NSData *imageData = UIImagePNGRepresentation(NewImg);
    [imageData writeToFile:savedImagePath atomically:NO];
}

- (UIImage *)resizedImage:(UIImage *)inImage :(CGRect)thumbRect
{
    CGImageRef imageRef = [inImage CGImage];
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    // There's a weirdness with kCGImageAlphaNone and CGBitmapContextCreate
    // (see "Supported Pixel Formats" in the Quartz 2D Programming Guide,
    // "Creating a Bitmap Graphics Context" section).
    // Only RGB 8-bit images with alpha of kCGImageAlphaNoneSkipFirst, kCGImageAlphaNoneSkipLast,
    // kCGImageAlphaPremultipliedFirst, and kCGImageAlphaPremultipliedLast,
    // plus a few other oddball image kinds, are supported.
    // The images on input here are likely to be png or jpeg files.
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;
    // Build a bitmap context that's the size of the thumbRect
    CGContextRef bitmap = CGBitmapContextCreate(
        NULL,
        thumbRect.size.width,                 // width
        thumbRect.size.height,                // height
        CGImageGetBitsPerComponent(imageRef), // really needs to always be 8
        4 * thumbRect.size.width,             // rowbytes
        CGImageGetColorSpace(imageRef),
        alphaInfo
    );
    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, thumbRect, imageRef);
    // Get an image from the context and a UIImage
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap); // ok if NULL
    CGImageRelease(ref);
    return result;
}
- (UIDocumentInteractionController *)setupControllerWithURL:(NSURL *)fileURL usingDelegate:(id <UIDocumentInteractionControllerDelegate>)interactionDelegate
{
    UIDocumentInteractionController *interactionController = [UIDocumentInteractionController interactionControllerWithURL:fileURL];
    interactionController.delegate = self;
    return interactionController;
}

- (void)documentInteractionControllerWillPresentOpenInMenu:(UIDocumentInteractionController *)controller
{
}

- (BOOL)documentInteractionController:(UIDocumentInteractionController *)controller canPerformAction:(SEL)action
{
    // NSLog(@"5dsklfjkljas");
    return YES;
}

- (BOOL)documentInteractionController:(UIDocumentInteractionController *)controller performAction:(SEL)action
{
    // NSLog(@"dsfa");
    return YES;
}

- (void)documentInteractionController:(UIDocumentInteractionController *)controller didEndSendingToApplication:(NSString *)application
{
    // NSLog(@"fsafasd;");
}
Note: This is working fine.
I have followed their documentation at http://instagram.com/developer/iphone-hooks/ but couldn't get a clear idea from it. Now I don't know what the next step is for sharing an image with a hashtag and other information.
Secondly, I want to retrieve into the application all the images shared with a particular hashtag.
Please guide me! Thanks in advance!
First, from iPhone Hooks, under 'Document Interaction':
To include a pre-filled caption with your photo, you can set the annotation property on the document interaction request to an NSDictionary containing an NSString under the key "InstagramCaption". Note: this feature will be available on Instagram 2.1 and later.
You'll need to add something like:
dic.annotation = [NSDictionary dictionaryWithObject:@"#yourTagHere" forKey:@"InstagramCaption"];
Second, you'll need to take a look at Tag Endpoints if you want to pull down images with a specific tag.
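For the retrieval part, a minimal sketch of querying a tag endpoint could look like the following; the URL format, JSON key paths, and the accessToken variable are based on the legacy Instagram v1 API and are assumptions you should verify against the current API documentation:
// Hypothetical example: endpoint, token, and JSON keys follow the old v1 API.
NSString *tag = @"yourTagHere";
NSString *urlString = [NSString stringWithFormat:@"https://api.instagram.com/v1/tags/%@/media/recent?access_token=%@", tag, accessToken];
NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithURL:[NSURL URLWithString:urlString]
                                                         completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (error || !data) {
        NSLog(@"Tag request failed: %@", error);
        return;
    }
    NSDictionary *json = [NSJSONSerialization JSONObjectWithData:data options:0 error:nil];
    for (NSDictionary *media in json[@"data"]) {
        // "standard_resolution" was the v1 key for the full-size rendition
        NSString *imageURLString = media[@"images"][@"standard_resolution"][@"url"];
        NSLog(@"Image tagged #%@: %@", tag, imageURLString);
    }
}];
[task resume];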

Crash on CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);

I am making a video of the screen, but it crashes on this line:
CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);
Note: it will work if the pixel buffer is contiguous and has the same bytesPerRow as the input data.
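When those conditions do not hold, copying row by row avoids reading or writing past the end of either buffer. A minimal sketch of that idea (the image, cgImage, and pixelBuffer names are assumed from the asker's context, not taken from their code):
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
const uint8_t *src = CFDataGetBytePtr(image);            // image is the CFDataRef from the question
size_t srcBytesPerRow = CGImageGetBytesPerRow(cgImage);  // cgImage is the CGImageRef the data came from
size_t height = CGImageGetHeight(cgImage);
size_t bytesPerRowToCopy = MIN(srcBytesPerRow, dstBytesPerRow);
for (size_t row = 0; row < height; row++) {
    // Copy each row separately so differing row strides (padding) are handled safely.
    memcpy(dst + row * dstBytesPerRow, src + row * srcBytesPerRow, bytesPerRowToCopy);
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);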
So I am providing my code that grabs frames from the camera - maybe it will help. After grabbing the data, I put it on a queue for further processing. I had to remove some of the code as it was not relevant to you, so what you see here has pieces you should be able to use.
- (void)captureOutput:(AVCaptureVideoDataOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        //NSLog(@"PE: value=%lld timeScale=%d flags=%x", prStamp.value, prStamp.timescale, prStamp.flags);
        /* Lock the image buffer */
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        NSRange captureRange; // set in the code the answerer removed (the rows of interest)
        /* Get information about the image */
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        // Note Apple sample code cheats big time - the phone is big endian so this reverses the "apparent" order of bytes
        // colorSpace is created in the removed code (e.g. CGColorSpaceCreateDeviceRGB())
        CGContextRef newContext = CGBitmapContextCreate(NULL, width, captureRange.length, 8, bytesPerRow, colorSpace, kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little); // Video in ARGB format
        assert(newContext);
        uint8_t *newPtr = (uint8_t *)CGBitmapContextGetData(newContext);
        size_t offset = captureRange.location * bytesPerRow;
        memcpy(newPtr, baseAddress + offset, captureRange.length * bytesPerRow);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        CMTime prStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); // when it was taken?
        //CMTime deStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer); // now?
        NSDictionary *dict = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSValue valueWithBytes:&saveState objCType:@encode(saveImages)], kState,
                              [NSValue valueWithNonretainedObject:(__bridge id)newContext], kImageContext,
                              [NSValue valueWithBytes:&prStamp objCType:@encode(CMTime)], kPresTime,
                              nil];
        dispatch_async(imageQueue, ^
        {
            // could be on any thread now
            OSAtomicDecrement32(&queueDepth);
            if (!isCancelled) {
                saveImages state; [(NSValue *)[dict objectForKey:kState] getValue:&state];
                CGContextRef context; [(NSValue *)[dict objectForKey:kImageContext] getValue:&context];
                CMTime stamp; [(NSValue *)[dict objectForKey:kPresTime] getValue:&stamp];
                CGImageRef newImageRef = CGBitmapContextCreateImage(context);
                CGContextRelease(context);
                UIImageOrientation orient = state == saveOne ? UIImageOrientationLeft : UIImageOrientationUp;
                UIImage *image = [UIImage imageWithCGImage:newImageRef scale:1.0 orientation:orient]; // imageWithCGImage: UIImageOrientationUp UIImageOrientationLeft
                CGImageRelease(newImageRef);
                NSData *data = UIImagePNGRepresentation(image);
                // NSLog(@"STATE:[%d]: value=%lld timeScale=%d flags=%x", state, stamp.value, stamp.timescale, stamp.flags);
                {
                    NSString *name = [NSString stringWithFormat:@"%d.png", num];
                    NSString *path = [[wlAppDelegate snippetsDirectory] stringByAppendingPathComponent:name];
                    BOOL ret = [data writeToFile:path atomically:NO];
                    //NSLog(@"WROTE %d err=%d w/time %f path:%@", num, ret, (double)stamp.value/(double)stamp.timescale, path);
                    if (!ret) {
                        ++errors;
                    } else {
                        dispatch_async(dispatch_get_main_queue(), ^
                        {
                            if (num) [delegate progress:(CGFloat)num/(CGFloat)(MORE_THAN_ONE_REV * SNAPS_PER_SEC) file:path];
                        });
                    }
                    ++num;
                }
            } else NSLog(@"CANCELLED");
        });
    }
}

Get Exif data from UIImage - UIImagePickerController [duplicate]

This question already has answers here:
UIImagePickerController and extracting EXIF data from existing photos
(18 answers)
Closed 7 years ago.
How can we get EXIF information from a UIImage selected via UIImagePickerController?
I have done a lot of R&D on this and got many replies, but I still failed to implement it.
I have gone through this, this, and this link.
Please help me to solve this problem.
Thanks in advance.
Interesting question! I came up with the following solution, which works for images picked from your photo library (note my code uses ARC):
Import AssetsLibrary.framework and ImageIO.framework.
Then include the needed classes inside your .h-file:
#import <AssetsLibrary/ALAsset.h>
#import <AssetsLibrary/ALAssetRepresentation.h>
#import <ImageIO/CGImageSource.h>
#import <ImageIO/CGImageProperties.h>
And put this inside your imagePickerController:didFinishPickingMediaWithInfo: delegate method:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:[info objectForKey:UIImagePickerControllerReferenceURL]
         resultBlock:^(ALAsset *asset) {
             ALAssetRepresentation *image_representation = [asset defaultRepresentation];
             // create a buffer to hold image data
             uint8_t *buffer = (Byte *)malloc(image_representation.size);
             NSUInteger length = [image_representation getBytes:buffer fromOffset:0 length:image_representation.size error:nil];
             if (length != 0) {
                 // buffer -> NSData object; free buffer afterwards
                 NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:image_representation.size freeWhenDone:YES];
                 // identify image type (jpeg, png, RAW file, ...) using UTI hint
                 NSDictionary *sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:(id)[image_representation UTI], (NSString *)kCGImageSourceTypeIdentifierHint, nil];
                 // create CGImageSource with NSData
                 CGImageSourceRef sourceRef = CGImageSourceCreateWithData((__bridge CFDataRef)adata, (__bridge CFDictionaryRef)sourceOptionsDict);
                 // get imagePropertiesDictionary
                 CFDictionaryRef imagePropertiesDictionary;
                 imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);
                 // get exif data
                 CFDictionaryRef exif = (CFDictionaryRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyExifDictionary);
                 NSDictionary *exif_dict = (__bridge NSDictionary *)exif;
                 NSLog(@"exif_dict: %@", exif_dict);
                 // save image WITH meta data
                 NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
                 NSURL *fileURL = nil;
                 CGImageRef imageRef = CGImageSourceCreateImageAtIndex(sourceRef, 0, imagePropertiesDictionary);
                 if (![[sourceOptionsDict objectForKey:(NSString *)kCGImageSourceTypeIdentifierHint] isEqualToString:@"public.tiff"])
                 {
                     fileURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/%@.%@",
                                                       documentsDirectory,
                                                       @"myimage",
                                                       [[[sourceOptionsDict objectForKey:(NSString *)kCGImageSourceTypeIdentifierHint] componentsSeparatedByString:@"."] objectAtIndex:1]
                                                       ]];
                     CGImageDestinationRef dr = CGImageDestinationCreateWithURL((__bridge CFURLRef)fileURL,
                                                                                (__bridge CFStringRef)[sourceOptionsDict objectForKey:(NSString *)kCGImageSourceTypeIdentifierHint],
                                                                                1,
                                                                                NULL
                                                                                );
                     CGImageDestinationAddImage(dr, imageRef, imagePropertiesDictionary);
                     CGImageDestinationFinalize(dr);
                     CFRelease(dr);
                 }
                 else
                 {
                     NSLog(@"no valid kCGImageSourceTypeIdentifierHint found …");
                 }
                 // clean up
                 CFRelease(imageRef);
                 CFRelease(imagePropertiesDictionary);
                 CFRelease(sourceRef);
             }
             else {
                 NSLog(@"image_representation buffer length == 0");
             }
         }
        failureBlock:^(NSError *error) {
            NSLog(@"couldn't get asset: %@", error);
        }
];
One thing I noticed is that iOS will ask the user to allow location services – if they deny, you won't be able to get the image data …
EDIT
Added code to save the image including its meta data. It's a quick approach, so maybe there is a better way, but it works!
These answers all seem extremely complex. If the image has been saved to the Camera Roll, and you have the ALAsset (either from UIImagePicker or ALAssetLibrary) you can get the metadata like so:
asset.defaultRepresentation.metadata;
If you want to save that image from camera roll to another location (say in Sandbox/Documents) simply do:
CGImageDestinationRef imageDestinationRef = CGImageDestinationCreateWithURL((__bridge CFURLRef)urlToSaveTo, kUTTypeJPEG, 1, NULL);
CFDictionaryRef imagePropertiesRef = (__bridge CFDictionaryRef)asset.defaultRepresentation.metadata;
CGImageDestinationAddImage(imageDestinationRef, asset.defaultRepresentation.fullResolutionImage, imagePropertiesRef);
if (!CGImageDestinationFinalize(imageDestinationRef)) NSLog(@"Failed to copy photo on save to %@", urlToSaveTo);
CFRelease(imageDestinationRef);
I found the solution and got the answer from here.
From here we can get GPS info as well.
Amazing, and thanks to all for helping me solve this problem.
UPDATE
This is another function that I created myself; it also returns EXIF data as well as GPS data, and it doesn't need any third-party library. But you have to turn on location services for this and use the current latitude and longitude, so you have to use CoreLocation.framework.
// FOR CAMERA IMAGE
- (NSMutableData *)getImageWithMetaData:(UIImage *)pImage
{
    NSData *pngData = UIImagePNGRepresentation(pImage);
    CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)pngData, NULL);
    NSDictionary *metadata = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSMutableDictionary *metadataAsMutable = [[metadata mutableCopy] autorelease];
    [metadata release];
    // For GPS Dictionary
    NSMutableDictionary *GPSDictionary = [[[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyGPSDictionary] mutableCopy] autorelease];
    if (!GPSDictionary)
        GPSDictionary = [NSMutableDictionary dictionary];
    [GPSDictionary setValue:[NSNumber numberWithDouble:currentLatitude] forKey:(NSString *)kCGImagePropertyGPSLatitude];
    [GPSDictionary setValue:[NSNumber numberWithDouble:currentLongitude] forKey:(NSString *)kCGImagePropertyGPSLongitude];
    NSString *ref;
    if (currentLatitude < 0.0)
        ref = @"S";
    else
        ref = @"N";
    [GPSDictionary setValue:ref forKey:(NSString *)kCGImagePropertyGPSLatitudeRef];
    if (currentLongitude < 0.0)
        ref = @"W";
    else
        ref = @"E";
    [GPSDictionary setValue:ref forKey:(NSString *)kCGImagePropertyGPSLongitudeRef];
    [GPSDictionary setValue:[NSNumber numberWithFloat:location.altitude] forKey:(NSString *)kCGImagePropertyGPSAltitude];
    // For EXIF Dictionary
    NSMutableDictionary *EXIFDictionary = [[[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy] autorelease];
    if (!EXIFDictionary)
        EXIFDictionary = [NSMutableDictionary dictionary];
    [EXIFDictionary setObject:[NSDate date] forKey:(NSString *)kCGImagePropertyExifDateTimeOriginal];
    [EXIFDictionary setObject:[NSDate date] forKey:(NSString *)kCGImagePropertyExifDateTimeDigitized];
    // add our modified EXIF data back into the image's metadata
    [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];
    [metadataAsMutable setObject:GPSDictionary forKey:(NSString *)kCGImagePropertyGPSDictionary];
    CFStringRef UTI = CGImageSourceGetType(source);
    NSMutableData *dest_data = [NSMutableData data];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination)
        dest_data = [[pngData mutableCopy] autorelease];
    else
    {
        CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)metadataAsMutable);
        BOOL success = CGImageDestinationFinalize(destination);
        if (!success)
            dest_data = [[pngData mutableCopy] autorelease];
    }
    if (destination)
        CFRelease(destination);
    CFRelease(source);
    return dest_data;
}
// FOR PHOTO LIBRARY IMAGE
- (NSMutableData *)getImagedataPhotoLibrary:(NSDictionary *)pImgDictionary andImage:(UIImage *)pImage
{
    NSData *data = UIImagePNGRepresentation(pImage);
    CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)data, NULL);
    NSMutableDictionary *metadataAsMutable = [[pImgDictionary mutableCopy] autorelease];
    CFStringRef UTI = CGImageSourceGetType(source);
    NSMutableData *dest_data = [NSMutableData data];
    // For Mutabledata
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination)
        dest_data = [[data mutableCopy] autorelease];
    else
    {
        CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)metadataAsMutable);
        BOOL success = CGImageDestinationFinalize(destination);
        if (!success)
            dest_data = [[data mutableCopy] autorelease];
    }
    if (destination)
        CFRelease(destination);
    CFRelease(source);
    return dest_data;
}
And we will retrieve that data like this:
//FOR CAMERA IMAGE
NSData *originalImgData = [self getImageWithMetaData:imgOriginal];
//FOR PHOTO LIBRARY IMAGE
[self getImagedataPhotoLibrary:[[myasset defaultRepresentation] metadata] andImage:imgOriginal];
For all of this you have to import AssetsLibrary.framework and ImageIO.framework.
I have used this method for getting the EXIF data dictionary from an image; I hope this will also work for you:
- (void)getExifDataFromImage:(UIImage *)currentImage
{
    NSData *jpegData = UIImageJPEGRepresentation(currentImage, 1.0);
    CGImageSourceRef mySourceRef = CGImageSourceCreateWithData((CFDataRef)jpegData, NULL);
    //CGImageSourceRef mySourceRef = CGImageSourceCreateWithURL((__bridge CFURLRef)myURL, NULL);
    if (mySourceRef != NULL)
    {
        NSDictionary *myMetadata = (NSDictionary *)CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(mySourceRef, 0, NULL));
        NSDictionary *exifDic = [myMetadata objectForKey:(NSString *)kCGImagePropertyExifDictionary];
        NSDictionary *tiffDic = [myMetadata objectForKey:(NSString *)kCGImagePropertyTIFFDictionary];
        NSLog(@"exifDic properties: %@", myMetadata); // all data
        float rawShutterSpeed = [[exifDic objectForKey:(NSString *)kCGImagePropertyExifExposureTime] floatValue];
        int decShutterSpeed = (1 / rawShutterSpeed);
        NSLog(@"Camera %@", [tiffDic objectForKey:(NSString *)kCGImagePropertyTIFFModel]);
        NSLog(@"Focal Length %@mm", [exifDic objectForKey:(NSString *)kCGImagePropertyExifFocalLength]);
        NSLog(@"Shutter Speed %@", [NSString stringWithFormat:@"1/%d", decShutterSpeed]);
        NSLog(@"Aperture f/%@", [exifDic objectForKey:(NSString *)kCGImagePropertyExifFNumber]);
        NSNumber *ExifISOSpeed = [[exifDic objectForKey:(NSString *)kCGImagePropertyExifISOSpeedRatings] objectAtIndex:0];
        NSLog(@"ISO %ld", (long)[ExifISOSpeed integerValue]);
        NSLog(@"Taken %@", [exifDic objectForKey:(NSString *)kCGImagePropertyExifDateTimeDigitized]);
        CFRelease(mySourceRef);
    }
}
You need ALAssetsLibrary to actually retrieve the EXIF info from an image. The EXIF is added to an image only when it is saved to the Photo Library. Even if you use ALAssetsLibrary to get an image asset from the library, it will lose all EXIF info once you convert it to a UIImage.
I have tried to insert GPS coordinates into the metadata of an image picked by the iPad camera, as suggested by Mehul.
It works, thank you for your post.
P.S.
Whoever intends to use that code, just substitute the two geolocations at the top of the function -(NSMutableData *)getImageWithMetaData:(UIImage *)pImage {
double currentLatitude = [locationManager location].coordinate.latitude;
double currentLongitude = [locationManager location].coordinate.longitude;
...
This supposes that you have already initialized locationManager somewhere in your code, like this:
locationManager = [[CLLocationManager alloc] init];
[locationManager setDesiredAccuracy:kCLLocationAccuracyBest];
[locationManager setDelegate:self]; // Not necessary in this case
[locationManager startUpdatingLocation]; // Not necessary in this case
and by importing the CoreLocation/CoreLocation.h and ImageIO/ImageIO.h headers with their associated frameworks.

How to write exif metadata to an image (not the camera roll, just a UIImage or JPEG)

I am aware of how to save metadata using ALAssets. But I want to save an image, or upload it somewhere, with the EXIF intact. I have the EXIF data as an NSDictionary, but how can I inject it properly into a UIImage (or, more likely, an NSData JPEG representation)?
I am using UIImagePickerController to get the image from the camera and my flow is a bit different than the one described by Chiquis. Here it is:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[@"UIImagePickerControllerOriginalImage"];
    NSString *fullPhotoFilename = ...; // generate the photo name and path here
    NSData *photoData = [UIImage taggedImageData:UIImageJPEGRepresentation(image, 1.0)
                                        metadata:info[@"UIImagePickerControllerMediaMetadata"]
                                     orientation:image.imageOrientation];
    [photoData writeToFile:fullPhotoFilename atomically:YES];
}
And here is a UIImage category that combines the image data with its metadata:
#import <ImageIO/ImageIO.h>
#import "UIImage+Tagging.h"
#import "LocationHelper.h"
@implementation UIImage (Tagging)

+ (NSData *)writeMetadataIntoImageData:(NSData *)imageData metadata:(NSMutableDictionary *)metadata {
    // create an image source ref
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    // this is the type of image (e.g., public.jpeg)
    CFStringRef UTI = CGImageSourceGetType(source);
    // create a new data object and write the new image into it
    NSMutableData *dest_data = [NSMutableData data];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination) {
        NSLog(@"Error: Could not create image destination");
    }
    // add the image contained in the image source to the destination, overriding the old metadata with our modified metadata
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)metadata);
    BOOL success = NO;
    success = CGImageDestinationFinalize(destination);
    if (!success) {
        NSLog(@"Error: Could not create data from image destination");
    }
    CFRelease(destination);
    CFRelease(source);
    return dest_data;
}

+ (NSData *)taggedImageData:(NSData *)imageData metadata:(NSDictionary *)metadata orientation:(UIImageOrientation)orientation {
    CLLocationManager *locationManager = [CLLocationManager new];
    CLLocation *location = [locationManager location];
    NSMutableDictionary *newMetadata = [NSMutableDictionary dictionaryWithDictionary:metadata];
    if (!newMetadata[(NSString *)kCGImagePropertyGPSDictionary] && location) {
        newMetadata[(NSString *)kCGImagePropertyGPSDictionary] = [LocationHelper gpsDictionaryForLocation:location];
    }
    // Reference: http://sylvana.net/jpegcrop/exif_orientation.html
    int newOrientation;
    switch (orientation) {
        case UIImageOrientationUp:
            newOrientation = 1;
            break;
        case UIImageOrientationDown:
            newOrientation = 3;
            break;
        case UIImageOrientationLeft:
            newOrientation = 8;
            break;
        case UIImageOrientationRight:
            newOrientation = 6;
            break;
        case UIImageOrientationUpMirrored:
            newOrientation = 2;
            break;
        case UIImageOrientationDownMirrored:
            newOrientation = 4;
            break;
        case UIImageOrientationLeftMirrored:
            newOrientation = 5;
            break;
        case UIImageOrientationRightMirrored:
            newOrientation = 7;
            break;
        default:
            newOrientation = -1;
    }
    if (newOrientation != -1) {
        newMetadata[(NSString *)kCGImagePropertyOrientation] = @(newOrientation);
    }
    NSData *newImageData = [self writeMetadataIntoImageData:imageData metadata:newMetadata];
    return newImageData;
}

@end
And finally, here is the method I am using to generate the needed GPS dictionary:
+ (NSDictionary *)gpsDictionaryForLocation:(CLLocation *)location {
    NSTimeZone *timeZone = [NSTimeZone timeZoneWithName:@"UTC"];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    [formatter setTimeZone:timeZone];
    [formatter setDateFormat:@"HH:mm:ss.SS"];
    NSDictionary *gpsDict = @{(NSString *)kCGImagePropertyGPSLatitude: @(fabs(location.coordinate.latitude)),
                              (NSString *)kCGImagePropertyGPSLatitudeRef: ((location.coordinate.latitude >= 0) ? @"N" : @"S"),
                              (NSString *)kCGImagePropertyGPSLongitude: @(fabs(location.coordinate.longitude)),
                              (NSString *)kCGImagePropertyGPSLongitudeRef: ((location.coordinate.longitude >= 0) ? @"E" : @"W"),
                              (NSString *)kCGImagePropertyGPSTimeStamp: [formatter stringFromDate:[location timestamp]],
                              (NSString *)kCGImagePropertyGPSAltitude: @(fabs(location.altitude)),
                              };
    return gpsDict;
}
Hope it helps someone. Thanks to Gustavo Ambrozio, Chiquis, and several other SO members, I was able to piece it together and use it in my project.
UIImage does not contain metadata information (it is stripped). So if you want to save it without using the image picker method (not in the camera roll):
Follow the answer here to write to a file with the metadata intact:
Problem setting exif data for an image
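To see that stripping for yourself, a quick sanity check (a sketch, not part of the linked answer; someImage stands for whatever UIImage you have, and ImageIO.framework is required) is to re-encode the UIImage and log what survives:
NSData *jpegData = UIImageJPEGRepresentation(someImage, 1.0);
CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)jpegData, NULL);
NSDictionary *props = (NSDictionary *)CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(src, 0, NULL));
// The EXIF dictionary now only holds values derived from the pixels (dimensions, color model);
// camera tags from the original file are gone.
NSLog(@"EXIF after round trip: %@", props[(NSString *)kCGImagePropertyExifDictionary]);
CFRelease(src);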
No idea why this would be downvoted, but here is the method.
In this case I'm getting the image through AVFoundation, and this is what goes in the
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
// code here
}
block code:
CFDictionaryRef metaDict = CMCopyDictionaryOfAttachments(NULL, imageSampleBuffer, kCMAttachmentMode_ShouldPropagate);
CFMutableDictionaryRef mutable = CFDictionaryCreateMutableCopy(NULL, 0, metaDict);
// Create formatted date
NSTimeZone *timeZone = [NSTimeZone timeZoneWithName:@"UTC"];
NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
[formatter setTimeZone:timeZone];
[formatter setDateFormat:@"HH:mm:ss.SS"];
// Create GPS Dictionary (loc is a CLLocation obtained elsewhere, e.g. from a CLLocationManager)
NSDictionary *gpsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithFloat:fabs(loc.coordinate.latitude)], kCGImagePropertyGPSLatitude
                         , ((loc.coordinate.latitude >= 0) ? @"N" : @"S"), kCGImagePropertyGPSLatitudeRef
                         , [NSNumber numberWithFloat:fabs(loc.coordinate.longitude)], kCGImagePropertyGPSLongitude
                         , ((loc.coordinate.longitude >= 0) ? @"E" : @"W"), kCGImagePropertyGPSLongitudeRef
                         , [formatter stringFromDate:[loc timestamp]], kCGImagePropertyGPSTimeStamp
                         , [NSNumber numberWithFloat:fabs(loc.altitude)], kCGImagePropertyGPSAltitude
                         , nil];
// The gps info goes into the gps metadata part
CFDictionarySetValue(mutable, kCGImagePropertyGPSDictionary, (__bridge void *)gpsDict);
// Here, just as an example, I'm adding the attitude matrix in the exif comment metadata
// (att is a CMAttitude from CoreMotion)
CMRotationMatrix m = att.rotationMatrix;
GLKMatrix4 attMat = GLKMatrix4Make(m.m11, m.m12, m.m13, 0, m.m21, m.m22, m.m23, 0, m.m31, m.m32, m.m33, 0, 0, 0, 0, 1);
NSMutableDictionary *EXIFDictionary = (__bridge NSMutableDictionary *)CFDictionaryGetValue(mutable, kCGImagePropertyExifDictionary);
[EXIFDictionary setValue:NSStringFromGLKMatrix4(attMat) forKey:(NSString *)kCGImagePropertyExifUserComment];
CFDictionarySetValue(mutable, kCGImagePropertyExifDictionary, (__bridge void *)EXIFDictionary);
NSData *jpeg = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
After this code you will have your image in the jpeg NSData and the corresponding dictionary for that image in the mutable CFDictionary.
All you have to do now is:
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);
CFStringRef UTI = CGImageSourceGetType(source); // this is the type of image (e.g., public.jpeg)
NSMutableData *dest_data = [NSMutableData data];
CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, UTI, 1, NULL);
if (!destination) {
    NSLog(@"***Could not create image destination ***");
}
// add the image contained in the image source to the destination, overriding the old metadata with our modified metadata
CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)mutable);
// tell the destination to write the image data and metadata into our data object.
// It will return false if something goes wrong
BOOL success = CGImageDestinationFinalize(destination);
if (!success) {
    NSLog(@"***Could not create data from image destination ***");
}
// now we have the data ready to go, so do whatever you want with it
// here we just write it to disk at the same path we were passed
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0]; // Get documents folder
NSString *dataPath = [documentsDirectory stringByAppendingPathComponent:@"ImagesFolder"];
NSError *error;
if (![[NSFileManager defaultManager] fileExistsAtPath:dataPath])
    [[NSFileManager defaultManager] createDirectoryAtPath:dataPath withIntermediateDirectories:NO attributes:nil error:&error]; // Create folder
// NSString *imageName = @"ImageName";
NSString *fullPath = [dataPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.jpg", name]]; // add our image to the path
[dest_data writeToFile:fullPath atomically:YES];
// cleanup
CFRelease(destination);
CFRelease(source);
Note how I'm not saving through ALAssets but directly into a folder of my choice.
By the way, most of this code can be found in the link I posted at first.
There is an easier way. If you need to save some EXIF, you can use the SimpleExif pod.
First create an ExifContainer:
ExifContainer *container = [[ExifContainer alloc] init];
and populate it with all required data:
[container addUserComment:@"A long time ago, in a galaxy far, far away"];
[container addCreationDate:[NSDate dateWithTimeIntervalSinceNow:-10000000]];
[container addLocation:locations[0]];
Then you can add this data to the image:
NSData *imageData = [[UIImage imageNamed:@"DemoImage"] addExif:container];
Then you just save this data as a JPEG, as sketched below.
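A minimal way to persist that NSData (the folder and file name here are just examples):
// Write the EXIF-tagged JPEG data into the app's Documents directory.
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *jpegPath = [documentsDirectory stringByAppendingPathComponent:@"DemoImageWithExif.jpg"];
NSError *writeError = nil;
if (![imageData writeToFile:jpegPath options:NSDataWritingAtomic error:&writeError]) {
    NSLog(@"Could not save JPEG: %@", writeError);
}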
I faced the same problem; now I can upload files with EXIF data, and you can also compress the photo if needed. This solved the issue for me:
// Get your image.
UIImage *loImgPhoto = [self getImageFromAsset:loPHAsset];
// Get your metadata (includes the EXIF data).
// loDataFotoOriginal is the original image NSData (e.g. obtained via PHImageManager's requestImageDataForAsset:options:resultHandler:).
CGImageSourceRef loImageOriginalSource = CGImageSourceCreateWithData((__bridge CFDataRef)loDataFotoOriginal, NULL);
NSDictionary *loDicMetadata = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(loImageOriginalSource, 0, NULL);
// Set your compression quality (0.0 to 1.0).
NSMutableDictionary *loDicMutableMetadata = [loDicMetadata mutableCopy];
[loDicMutableMetadata setObject:@(lfCompressionQualityValue) forKey:(__bridge NSString *)kCGImageDestinationLossyCompressionQuality];
// Create an image destination.
NSMutableData *loNewImageDataWithExif = [NSMutableData data];
CGImageDestinationRef loImgDestination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)loNewImageDataWithExif, CGImageSourceGetType(loImageOriginalSource), 1, NULL);
// Add your image to the destination.
CGImageDestinationAddImage(loImgDestination, loImgPhoto.CGImage, (__bridge CFDictionaryRef)loDicMutableMetadata);
// Finalize the destination.
if (CGImageDestinationFinalize(loImgDestination))
{
    NSLog(@"Successful image creation.");
    // process the image rendering, adjustment data creation and finalize the asset edit.
    // Upload photo with EXIF metadata
    [self myUploadMethod:loNewImageDataWithExif];
}
else
{
    NSLog(@"Error -> failed to finalize the image.");
}
CFRelease(loImageOriginalSource);
CFRelease(loImgDestination);
getImageFromAsset method:
- (UIImage *)getImageFromAsset:(PHAsset *)aPHAsset
{
    __block UIImage *limgImageResult;
    PHImageRequestOptions *lPHImageRequestOptions = [PHImageRequestOptions new];
    lPHImageRequestOptions.synchronous = YES;
    [self.imageManager requestImageForAsset:aPHAsset
                                 targetSize:PHImageManagerMaximumSize
                                contentMode:PHImageContentModeDefault // PHImageContentModeAspectFit
                                    options:lPHImageRequestOptions
                              resultHandler:^(UIImage *limgImage, NSDictionary *info) {
                                  limgImageResult = limgImage;
                              }];
    return limgImageResult;
}
Here are the basics of setting Make and Model metadata on a .jpg file in Swift 3: https://gist.github.com/lacyrhoades/09d8a367125b6225df5038aec68ed9e7. The higher-level approaches, like using the ExifContainer pod, did not work for me.