How to Create UIImage from NSData and Avatar Data of XMPP? - iPhone

This question is about the iPhone SDK, NSData, and UIImage.
I am trying to create an image from the avatar data returned by XMPP in a presence stanza like the following:
<presence from='yyy@184.73.164.51/spark' to='ken@184.73.164.51/424978324712783686768453' id='Oj02v-45'><status>Away due to idle.</status><priority>0</priority><show>away</show><x xmlns='vcard-temp:x:update'><photo>a3f549fa9705e7ead2905de0b6a804227ecdd404</photo></x><x xmlns='jabber:x:avatar'><hash>a3f549fa9705e7ead2905de0b6a804227ecdd404</hash></x></presence>
In this case, I assume that a3f549fa9705e7ead2905de0b6a804227ecdd404 is the photo data.
So how can I turn this into NSData? I think that once I have the NSData object, I can easily create the UIImage, right?
This is my code:
NSString *command = @"a3f549fa9705e7ead2905de0b6a804227ecdd404";
command = [command stringByReplacingOccurrencesOfString:@" " withString:@""];

NSMutableData *commandToSend = [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0', '\0', '\0'};
int i;
for (i = 0; i < [command length] / 2; i++) {
    byte_chars[0] = [command characterAtIndex:i * 2];
    byte_chars[1] = [command characterAtIndex:i * 2 + 1];
    whole_byte = strtol(byte_chars, NULL, 16);
    [commandToSend appendBytes:&whole_byte length:1];
}

UIImage *image = [UIImage imageWithData:commandToSend];
However, it doesn't work. Does anyone know what's wrong with it?

In XMPPPresence.m, add this method:
- (NSString *)photo {
    NSXMLElement *xElement = [self elementForName:@"x" xmlns:@"vcard-temp:x:update"];
    NSString *photoHash = [[xElement elementForName:@"photo"] stringValue];
    return photoHash;
}
// In XMPPStream's delegate:
- (void)xmppStream:(XMPPStream *)stream didReceivePresence:(XMPPPresence *)presence {
    NSString *photoHash = [presence photo];
    if ([photoHash length] > 0) { // in case there's no photo hash
        XMPPJID *rosterJID = [presence from];
        BOOL requestPhoto = ... // determine whether you need to request a new photo or not
        if (requestPhoto) {
            NSXMLElement *iqAvatar = [NSXMLElement elementWithName:@"iq"];
            NSXMLElement *queryAvatar = [NSXMLElement elementWithName:@"vCard" xmlns:@"vcard-temp"];
            [iqAvatar addAttributeWithName:@"type" stringValue:@"get"];
            [iqAvatar addAttributeWithName:@"to" stringValue:[rosterJID full]];
            [iqAvatar addChild:queryAvatar];
            XMPPIQ *avatarRequestIQ = [XMPPIQ iqFromElement:iqAvatar];
            [stream sendElement:avatarRequestIQ];
        }
    }
}
// When the buddy sends the photo, it will arrive in the vCard, Base64-encoded.
// You will receive it as an IQ:
- (BOOL)xmppStream:(XMPPStream *)stream didReceiveIQ:(XMPPIQ *)iq {
    XMPPElement *vCardPhotoElement = (XMPPElement *)[[iq elementForName:@"vCard"] elementForName:@"PHOTO"];
    if (vCardPhotoElement != nil) {
        // avatar data
        NSString *base64DataString = [[vCardPhotoElement elementForName:@"BINVAL"] stringValue];
        NSData *imageData = [NSData dataFromBase64String:base64DataString]; // you need the NSData Base64 category for this
        UIImage *avatarImage = [UIImage imageWithData:imageData];
        XMPPJID *senderJID = [iq from];
        [self xmppStream:stream didReceiveImage:avatarImage forBuddy:senderJID]; // this is my custom delegate method where I save the new avatar to a cache
    }
    return NO;
}
Hope this will help you.

That is the picture hash. You now have to send a vCard request; the reply will contain the same hash (for verification) and a BINVAL element containing the picture data in Base64.
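For completeness, here is a minimal sketch of that verification step: computing the SHA-1 of the decoded BINVAL data and comparing it against the hash advertised in the presence stanza. It uses CommonCrypto; the helper name is just an illustration.
#import <CommonCrypto/CommonDigest.h>

// Hypothetical helper: lowercase hex SHA-1 of the decoded photo data, which
// should match the <photo>/<hash> value advertised in the presence stanza.
static NSString *SHA1HexOfData(NSData *data) {
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, (CC_LONG)data.length, digest);

    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}

// Usage (imageData is the Base64-decoded BINVAL from the vCard):
// BOOL matches = [SHA1HexOfData(imageData) isEqualToString:photoHash];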

Related

UIImage Not being Set from NSData

The goal is to pull an image stored as a varbinary in a SQL Server database through a web service that sends the SqlBinary as JSON to an iPhone. I'm having trouble setting the UIImage from the base64Binary sent in the JSON. I'm able to convert the binary to NSData, but the image is not being set from the data.
for (int i = 0; i < array.count; i++) {
    NSDictionary *mealInfo = [array objectAtIndex:i];
    Meal *meal = [[Meal alloc] initWithRestaurant:[mealInfo objectForKey:@"restaurantname"]
                                         mealName:[mealInfo objectForKey:@"itemname"]
                                      description:[mealInfo objectForKey:@"itemdescription"]
                                             Time:[mealInfo objectForKey:@"mealTime"]
                                            price:[mealInfo objectForKey:@"itemprice"]];

    //NSString *str = @"data:image/jpg;base64,";
    //str = [str stringByAppendingString:[mealInfo objectForKey:@"itemImage"]];
    //NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:str]];

    NSString *str = [mealInfo objectForKey:@"itemImage"];
    NSLog(@"%@", str);
    NSData *d = [[NSData alloc] initWithData:[NSData dataFromBase64String:str]];
    UIImage *image = [UIImage imageWithData:d];
    [meal setMealImage:image];
    [meals addObject:meal];
}
NSLog(@"%@", [[meals objectAtIndex:0] mealPrice]);
NSLog(@"This is how many meals %d", meals.count);
Assuming the base64-encoded string is good, your code looks OK. I would look at the dataFromBase64String method to see whether that is causing the problem. Here is a version I use, based on someone else's work:
- (NSData *)dataFromBase64EncodedString:(NSString *)string {
    if (string.length > 0) {
        // The iPhone has base 64 decoding built in, but not obviously. The trick is to
        // create a data URL that's base 64 encoded and ask an NSData to load it.
        NSString *data64URLString = [NSString stringWithFormat:@"data:;base64,%@", string];
        NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:data64URLString]];
        return data;
    }
    return nil;
}
You could add an NSData category with this method to make its use very convenient.
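For example, here is a minimal sketch of such a category (the category name Base64Decoding and the file names are only illustrative; the method is exposed as a class method on NSData):
// NSData+Base64Decoding.h
@interface NSData (Base64Decoding)
+ (NSData *)dataFromBase64EncodedString:(NSString *)string;
@end

// NSData+Base64Decoding.m
@implementation NSData (Base64Decoding)
+ (NSData *)dataFromBase64EncodedString:(NSString *)string {
    if (string.length > 0) {
        // Same data-URL trick as above, wrapped as a class method on NSData.
        NSString *data64URLString = [NSString stringWithFormat:@"data:;base64,%@", string];
        return [NSData dataWithContentsOfURL:[NSURL URLWithString:data64URLString]];
    }
    return nil;
}
@end

// Usage: NSData *d = [NSData dataFromBase64EncodedString:str];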

iOS zip with gzipDeflate

I'm using the NSData+compression.h and Base64Transcoder.h helpers to be able to zip and unzip content, basically to unzip the server responses.
The unzip method works perfectly:
+ (NSString *)unzip:(NSString *)stringValue {
    Byte inputData[[stringValue lengthOfBytesUsingEncoding:NSUTF8StringEncoding]];
    [[stringValue dataUsingEncoding:NSUTF8StringEncoding] getBytes:inputData];
    size_t inputDataSize = (size_t)[stringValue length];
    size_t outputDataSize = EstimateBas64DecodedDataSize(inputDataSize);
    Byte outputData[outputDataSize]; // prepare a Byte[] for the decoded data
    Base64DecodeData(inputData, inputDataSize, outputData, &outputDataSize);
    NSData *theData = [[NSData alloc] initWithBytes:outputData length:outputDataSize];

    // And now we gunzip:
    NSData *result = [theData gzipInflate]; // make bigger == gunzip
    NSString *temp = [[NSString alloc] initWithData:result encoding:NSUTF8StringEncoding];
    return temp;
}
But when I try to zip content the symmetric way, the gzipDeflate step fails and returns an empty or nil value.
This is my zip code:
+ (NSData *)zip:(NSData *)theSourceData {
    // And now we zip:
    NSData *result = [theSourceData gzipDeflate];
    Byte inputData[[result length]];
    [result getBytes:inputData];
    size_t inputDataSize = (size_t)[result length];
    size_t outputDataSize = EstimateBas64DecodedDataSize(inputDataSize);
    char outputData[outputDataSize]; // prepare a Byte[] for the encoded data
    Base64EncodeData(inputData, inputDataSize, outputData, &outputDataSize, NO);
    NSData *theData = [[NSData alloc] initWithBytes:outputData length:outputDataSize];
    return theData;
}
Any suggestions?
Thanks
The problem was in the Base64 encoder.
+ (NSString *)zip:(NSData *)theSourceData {
    // And now we zip:
    NSData *result = [theSourceData gzipDeflate];
    NSString *source = [NSString base64StringFromData:result length:[result length]];
    return source;
}
We've integrated the base64StringFromData:length: method to solve it.
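As a usage illustration, here is a hypothetical round trip through the two methods above (the ZipUtil class name is just an assumption for wherever you keep them):
// gzip + Base64 encode, then Base64 decode + gunzip back to the original string.
NSData *payload = [@"Hello, server!" dataUsingEncoding:NSUTF8StringEncoding];
NSString *zippedBase64 = [ZipUtil zip:payload];
NSString *roundTripped = [ZipUtil unzip:zippedBase64];
NSLog(@"%@", roundTripped); // "Hello, server!"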
Thanks,
Ivan

Get Exif data from UIImage - UIImagePickerController [duplicate]

This question already has answers here: UIImagePickerController and extracting EXIF data from existing photos (18 answers).
Closed 7 years ago.
How can we get EXIF information from a UIImage selected from UIImagePickerController?
I have done a lot of R&D on this and got many replies, but I still failed to implement it. I have gone through this, this, and this link.
Please help me solve this problem. Thanks in advance.
Interesting question! I came up with the following solution, which works for images picked from your photo library (note that my code uses ARC):
Import AssetsLibrary.framework and ImageIO.framework.
Then include the needed headers in your .h file:
#import <AssetsLibrary/ALAsset.h>
#import <AssetsLibrary/ALAssetRepresentation.h>
#import <ImageIO/CGImageSource.h>
#import <ImageIO/CGImageProperties.h>
And put this inside your imagePickerController:didFinishPickingMediaWithInfo: delegate method:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:[info objectForKey:UIImagePickerControllerReferenceURL]
         resultBlock:^(ALAsset *asset) {
             ALAssetRepresentation *image_representation = [asset defaultRepresentation];
             // create a buffer to hold image data
             uint8_t *buffer = (Byte *)malloc(image_representation.size);
             NSUInteger length = [image_representation getBytes:buffer fromOffset:0 length:image_representation.size error:nil];
             if (length != 0) {
                 // buffer -> NSData object; free buffer afterwards
                 NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:image_representation.size freeWhenDone:YES];
                 // identify image type (jpeg, png, RAW file, ...) using UTI hint
                 NSDictionary *sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:(id)[image_representation UTI], kCGImageSourceTypeIdentifierHint, nil];
                 // create CGImageSource with NSData
                 CGImageSourceRef sourceRef = CGImageSourceCreateWithData((__bridge CFDataRef)adata, (__bridge CFDictionaryRef)sourceOptionsDict);
                 // get imagePropertiesDictionary
                 CFDictionaryRef imagePropertiesDictionary;
                 imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);
                 // get exif data
                 CFDictionaryRef exif = (CFDictionaryRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyExifDictionary);
                 NSDictionary *exif_dict = (__bridge NSDictionary *)exif;
                 NSLog(@"exif_dict: %@", exif_dict);

                 // save image WITH meta data
                 NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
                 NSURL *fileURL = nil;
                 CGImageRef imageRef = CGImageSourceCreateImageAtIndex(sourceRef, 0, imagePropertiesDictionary);
                 if (![[sourceOptionsDict objectForKey:@"kCGImageSourceTypeIdentifierHint"] isEqualToString:@"public.tiff"])
                 {
                     fileURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/%@.%@",
                                                       documentsDirectory,
                                                       @"myimage",
                                                       [[[sourceOptionsDict objectForKey:@"kCGImageSourceTypeIdentifierHint"] componentsSeparatedByString:@"."] objectAtIndex:1]]];
                     CGImageDestinationRef dr = CGImageDestinationCreateWithURL((__bridge CFURLRef)fileURL,
                                                                                (__bridge CFStringRef)[sourceOptionsDict objectForKey:@"kCGImageSourceTypeIdentifierHint"],
                                                                                1,
                                                                                NULL);
                     CGImageDestinationAddImage(dr, imageRef, imagePropertiesDictionary);
                     CGImageDestinationFinalize(dr);
                     CFRelease(dr);
                 }
                 else
                 {
                     NSLog(@"no valid kCGImageSourceTypeIdentifierHint found …");
                 }

                 // clean up
                 CFRelease(imageRef);
                 CFRelease(imagePropertiesDictionary);
                 CFRelease(sourceRef);
             }
             else {
                 NSLog(@"image_representation buffer length == 0");
             }
         }
        failureBlock:^(NSError *error) {
            NSLog(@"couldn't get asset: %@", error);
        }];
One thing I noticed: iOS will ask the user to allow location services; if they deny, you won't be able to get the image data.
EDIT
Added code to save the image including its meta data. It's a quick approach, so maybe there is a better way, but it works!
These answers all seem extremely complex. If the image has been saved to the Camera Roll and you have the ALAsset (either from UIImagePicker or ALAssetsLibrary), you can get the metadata like so:
asset.defaultRepresentation.metadata;
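As a usage illustration, a rough sketch of getting to that metadata from the image picker (assuming ARC; the delegate method is the standard UIImagePickerController one):
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:info[UIImagePickerControllerReferenceURL]
             resultBlock:^(ALAsset *asset) {
                 // Full metadata dictionary (EXIF, GPS, TIFF, ...) of the picked photo.
                 NSDictionary *metadata = asset.defaultRepresentation.metadata;
                 NSLog(@"metadata: %@", metadata);
             }
            failureBlock:^(NSError *error) {
                NSLog(@"couldn't load asset: %@", error);
            }];
}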
If you want to save that image from camera roll to another location (say in Sandbox/Documents) simply do:
CGImageDestinationRef imageDestinationRef = CGImageDestinationCreateWithURL((__bridge CFURLRef)urlToSaveTo, kUTTypeJPEG, 1, NULL);
CFDictionaryRef imagePropertiesRef = (__bridge CFDictionaryRef)asset.defaultRepresentation.metadata;
CGImageDestinationAddImage(imageDestinationRef, asset.defaultRepresentation.fullResolutionImage, imagePropertiesRef);
if (!CGImageDestinationFinalize(imageDestinationRef)) NSLog(@"Failed to copy photo on save to %@", urlToSaveTo);
CFRelease(imageDestinationRef);
I found a solution and got the answer from here.
From there we can get GPS info as well.
Amazing, and thanks to everyone for helping me solve this problem.
UPDATE
This is another function that I created myself. It also returns EXIF data as well as GPS data, and it doesn't need any third-party library, but you have to turn on location services and use the current latitude and longitude, so you need CoreLocation.framework.
//FOR CAMERA IMAGE
- (NSMutableData *)getImageWithMetaData:(UIImage *)pImage
{
    NSData *pngData = UIImagePNGRepresentation(pImage);
    CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)pngData, NULL);
    NSDictionary *metadata = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSMutableDictionary *metadataAsMutable = [[metadata mutableCopy] autorelease];
    [metadata release];

    //For GPS Dictionary
    NSMutableDictionary *GPSDictionary = [[[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyGPSDictionary] mutableCopy] autorelease];
    if (!GPSDictionary)
        GPSDictionary = [NSMutableDictionary dictionary];
    [GPSDictionary setValue:[NSNumber numberWithDouble:currentLatitude] forKey:(NSString *)kCGImagePropertyGPSLatitude];
    [GPSDictionary setValue:[NSNumber numberWithDouble:currentLongitude] forKey:(NSString *)kCGImagePropertyGPSLongitude];
    NSString *ref;
    if (currentLatitude < 0.0)
        ref = @"S";
    else
        ref = @"N";
    [GPSDictionary setValue:ref forKey:(NSString *)kCGImagePropertyGPSLatitudeRef];
    if (currentLongitude < 0.0)
        ref = @"W";
    else
        ref = @"E";
    [GPSDictionary setValue:ref forKey:(NSString *)kCGImagePropertyGPSLongitudeRef];
    [GPSDictionary setValue:[NSNumber numberWithFloat:location.altitude] forKey:(NSString *)kCGImagePropertyGPSAltitude];

    //For EXIF Dictionary
    NSMutableDictionary *EXIFDictionary = [[[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy] autorelease];
    if (!EXIFDictionary)
        EXIFDictionary = [NSMutableDictionary dictionary];
    [EXIFDictionary setObject:[NSDate date] forKey:(NSString *)kCGImagePropertyExifDateTimeOriginal];
    [EXIFDictionary setObject:[NSDate date] forKey:(NSString *)kCGImagePropertyExifDateTimeDigitized];

    //add our modified EXIF data back into the image's metadata
    [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];
    [metadataAsMutable setObject:GPSDictionary forKey:(NSString *)kCGImagePropertyGPSDictionary];

    CFStringRef UTI = CGImageSourceGetType(source);
    NSMutableData *dest_data = [NSMutableData data];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination)
        dest_data = [[pngData mutableCopy] autorelease];
    else
    {
        CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)metadataAsMutable);
        BOOL success = CGImageDestinationFinalize(destination);
        if (!success)
            dest_data = [[pngData mutableCopy] autorelease];
    }
    if (destination)
        CFRelease(destination);
    CFRelease(source);
    return dest_data;
}
//FOR PHOTO LIBRARY IMAGE
- (NSMutableData *)getImagedataPhotoLibrary:(NSDictionary *)pImgDictionary andImage:(UIImage *)pImage
{
    NSData *data = UIImagePNGRepresentation(pImage);
    CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)data, NULL);
    NSMutableDictionary *metadataAsMutable = [[pImgDictionary mutableCopy] autorelease];
    CFStringRef UTI = CGImageSourceGetType(source);
    NSMutableData *dest_data = [NSMutableData data];

    //For Mutabledata
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination)
        dest_data = [[data mutableCopy] autorelease];
    else
    {
        CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)metadataAsMutable);
        BOOL success = CGImageDestinationFinalize(destination);
        if (!success)
            dest_data = [[data mutableCopy] autorelease];
    }
    if (destination)
        CFRelease(destination);
    CFRelease(source);
    return dest_data;
}
And we retrieve that data like this:
//FOR CAMERA IMAGE
NSData *originalImgData = [self getImageWithMetaData:imgOriginal];
//FOR PHOTO LIBRARY IMAGE
[self getImagedataPhotoLibrary:[[myasset defaultRepresentation] metadata] andImage:imgOriginal];
For all of this you have to import AssetsLibrary.framework and ImageIO.framework.
I have used this method for getting the EXIF data dictionary from an image; I hope it will also work for you:
- (void)getExifDataFromImage:(UIImage *)currentImage
{
    NSData *pngData = UIImageJPEGRepresentation(currentImage, 1.0);
    CGImageSourceRef mySourceRef = CGImageSourceCreateWithData((CFDataRef)pngData, NULL);
    //CGImageSourceRef mySourceRef = CGImageSourceCreateWithURL((__bridge CFURLRef)myURL, NULL);
    if (mySourceRef != NULL)
    {
        NSDictionary *myMetadata = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(mySourceRef, 0, NULL);
        NSDictionary *exifDic = [myMetadata objectForKey:(NSString *)kCGImagePropertyExifDictionary];
        NSDictionary *tiffDic = [myMetadata objectForKey:(NSString *)kCGImagePropertyTIFFDictionary];
        NSLog(@"exifDic properties: %@", myMetadata); //all data
        float rawShutterSpeed = [[exifDic objectForKey:(NSString *)kCGImagePropertyExifExposureTime] floatValue];
        int decShutterSpeed = (1 / rawShutterSpeed);
        NSLog(@"Camera %@", [tiffDic objectForKey:(NSString *)kCGImagePropertyTIFFModel]);
        NSLog(@"Focal Length %@mm", [exifDic objectForKey:(NSString *)kCGImagePropertyExifFocalLength]);
        NSLog(@"Shutter Speed %@", [NSString stringWithFormat:@"1/%d", decShutterSpeed]);
        NSLog(@"Aperture f/%@", [exifDic objectForKey:(NSString *)kCGImagePropertyExifFNumber]);
        NSNumber *ExifISOSpeed = [[exifDic objectForKey:(NSString *)kCGImagePropertyExifISOSpeedRatings] objectAtIndex:0];
        NSLog(@"ISO %ld", [ExifISOSpeed integerValue]);
        NSLog(@"Taken %@", [exifDic objectForKey:(NSString *)kCGImagePropertyExifDateTimeDigitized]);
    }
}
You need ALAssetsLibrary to actually retrieve the EXIF info from an image. EXIF is added to an image only when it is saved to the Photo Library. Even if you use ALAssetsLibrary to get an image asset from the library, it will lose all EXIF info once you turn it into a UIImage.
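To illustrate the point, here is a rough sketch (assuming asset is an ALAsset obtained from the library): the asset keeps the original metadata, but a JPEG re-encoded from a UIImage built on top of it no longer carries the camera EXIF.
NSDictionary *originalMetadata = asset.defaultRepresentation.metadata; // full EXIF/GPS/TIFF dictionaries
UIImage *image = [UIImage imageWithCGImage:asset.defaultRepresentation.fullResolutionImage];
NSData *reencoded = UIImageJPEGRepresentation(image, 1.0);

CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)reencoded, NULL);
NSDictionary *reencodedProps = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(src, 0, NULL);
NSLog(@"original EXIF: %@", [originalMetadata objectForKey:(NSString *)kCGImagePropertyExifDictionary]);
NSLog(@"re-encoded EXIF: %@", [reencodedProps objectForKey:(NSString *)kCGImagePropertyExifDictionary]); // camera EXIF is gone
CFRelease(src);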
I tried inserting GPS coordinates into the image metadata picked by the iPad camera, as Mehul suggested.
It works, thank you for your post.
P.S.
If you intend to use that code, just substitute the two geolocation values at the top of the function -(NSMutableData *)getImageWithMetaData:(UIImage *)pImage {
double currentLatitude = [locationManager location].coordinate.latitude;
double currentLongitude = [locationManager location].coordinate.longitude;
...
This assumes you have already initialized locationManager somewhere in your code, like this:
locationManager = [[CLLocationManager alloc] init];
[locationManager setDesiredAccuracy:kCLLocationAccuracyBest];
[locationManager setDelegate:self]; // Not necessary in this case
[locationManager startUpdatingLocation]; // Not neccessary in this case
and that you import the CoreLocation/CoreLocation.h and ImageIO/ImageIO.h headers with their associated frameworks.

How to write exif metadata to an image (not the camera roll, just a UIImage or JPEG)

I am aware of how to save metadata using ALAssets. But I want to save an image, or upload it somewhere, with EXIF intact. I have the EXIF data as an NSDictionary, but how can I inject it properly into a UIImage (or, more likely, an NSData JPEG representation)?
I am using UIImagePickerController to get the image from the camera, and my flow is a bit different from the one described by Chiquis. Here it is:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[@"UIImagePickerControllerOriginalImage"];
    NSString *fullPhotoFilename = ...; // generate the photo name and path here
    NSData *photoData = [UIImage taggedImageData:image.jpegData metadata:info[@"UIImagePickerControllerMediaMetadata"] orientation:image.imageOrientation];
    [photoData writeToFile:fullPhotoFilename atomically:YES];
}
And here is the UIImage category used to combine the image data with its metadata:
#import <ImageIO/ImageIO.h>
#import "UIImage+Tagging.h"
#import "LocationHelper.h"

@implementation UIImage (Tagging)

+ (NSData *)writeMetadataIntoImageData:(NSData *)imageData metadata:(NSMutableDictionary *)metadata {
    // create an image source ref
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    // this is the type of image (e.g., public.jpeg)
    CFStringRef UTI = CGImageSourceGetType(source);
    // create a new data object and write the new image into it
    NSMutableData *dest_data = [NSMutableData data];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination) {
        NSLog(@"Error: Could not create image destination");
    }
    // add the image contained in the image source to the destination, overriding the old metadata with our modified metadata
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)metadata);
    BOOL success = NO;
    success = CGImageDestinationFinalize(destination);
    if (!success) {
        NSLog(@"Error: Could not create data from image destination");
    }
    CFRelease(destination);
    CFRelease(source);
    return dest_data;
}
+ (NSData *)taggedImageData:(NSData *)imageData metadata:(NSDictionary *)metadata orientation:(UIImageOrientation)orientation {
    CLLocationManager *locationManager = [CLLocationManager new];
    CLLocation *location = [locationManager location];
    NSMutableDictionary *newMetadata = [NSMutableDictionary dictionaryWithDictionary:metadata];
    if (!newMetadata[(NSString *)kCGImagePropertyGPSDictionary] && location) {
        newMetadata[(NSString *)kCGImagePropertyGPSDictionary] = [LocationHelper gpsDictionaryForLocation:location];
    }
    // Reference: http://sylvana.net/jpegcrop/exif_orientation.html
    int newOrientation;
    switch (orientation) {
        case UIImageOrientationUp:
            newOrientation = 1;
            break;
        case UIImageOrientationDown:
            newOrientation = 3;
            break;
        case UIImageOrientationLeft:
            newOrientation = 8;
            break;
        case UIImageOrientationRight:
            newOrientation = 6;
            break;
        case UIImageOrientationUpMirrored:
            newOrientation = 2;
            break;
        case UIImageOrientationDownMirrored:
            newOrientation = 4;
            break;
        case UIImageOrientationLeftMirrored:
            newOrientation = 5;
            break;
        case UIImageOrientationRightMirrored:
            newOrientation = 7;
            break;
        default:
            newOrientation = -1;
    }
    if (newOrientation != -1) {
        newMetadata[(NSString *)kCGImagePropertyOrientation] = @(newOrientation);
    }
    NSData *newImageData = [self writeMetadataIntoImageData:imageData metadata:newMetadata];
    return newImageData;
}
And finally, here is the method I am using to generate the needed GPS dictionary:
+ (NSDictionary *)gpsDictionaryForLocation:(CLLocation *)location {
    NSTimeZone *timeZone = [NSTimeZone timeZoneWithName:@"UTC"];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    [formatter setTimeZone:timeZone];
    [formatter setDateFormat:@"HH:mm:ss.SS"];
    NSDictionary *gpsDict = @{(NSString *)kCGImagePropertyGPSLatitude: @(fabs(location.coordinate.latitude)),
                              (NSString *)kCGImagePropertyGPSLatitudeRef: ((location.coordinate.latitude >= 0) ? @"N" : @"S"),
                              (NSString *)kCGImagePropertyGPSLongitude: @(fabs(location.coordinate.longitude)),
                              (NSString *)kCGImagePropertyGPSLongitudeRef: ((location.coordinate.longitude >= 0) ? @"E" : @"W"),
                              (NSString *)kCGImagePropertyGPSTimeStamp: [formatter stringFromDate:[location timestamp]],
                              (NSString *)kCGImagePropertyGPSAltitude: @(fabs(location.altitude)),
                              };
    return gpsDict;
}
Hope it helps someone. Thanks to Gustavo Ambrozio, Chiquis, and several other SO members, I was able to piece it together and use it in my project.
UIImage does not contain metadata information (it is stripped). So if you want to save it without using the image picker method (not to the Camera Roll), follow the answer here to write to a file with the metadata intact:
Problem setting exif data for an image
No idea why this would be downvoted, but here is the method.
In this case I'm getting the image through AVFoundation, and this is what goes in the completion handler of
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                      completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    // code here
}];
Block code:
CFDictionaryRef metaDict = CMCopyDictionaryOfAttachments(NULL, imageSampleBuffer, kCMAttachmentMode_ShouldPropagate);
CFMutableDictionaryRef mutable = CFDictionaryCreateMutableCopy(NULL, 0, metaDict);

// Create formatted date
NSTimeZone *timeZone = [NSTimeZone timeZoneWithName:@"UTC"];
NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
[formatter setTimeZone:timeZone];
[formatter setDateFormat:@"HH:mm:ss.SS"];

// Create GPS Dictionary
NSDictionary *gpsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithFloat:fabs(loc.coordinate.latitude)], kCGImagePropertyGPSLatitude,
                         ((loc.coordinate.latitude >= 0) ? @"N" : @"S"), kCGImagePropertyGPSLatitudeRef,
                         [NSNumber numberWithFloat:fabs(loc.coordinate.longitude)], kCGImagePropertyGPSLongitude,
                         ((loc.coordinate.longitude >= 0) ? @"E" : @"W"), kCGImagePropertyGPSLongitudeRef,
                         [formatter stringFromDate:[loc timestamp]], kCGImagePropertyGPSTimeStamp,
                         [NSNumber numberWithFloat:fabs(loc.altitude)], kCGImagePropertyGPSAltitude,
                         nil];

// The gps info goes into the gps metadata part
CFDictionarySetValue(mutable, kCGImagePropertyGPSDictionary, (__bridge void *)gpsDict);

// Here, just as an example, I'm adding the attitude matrix in the EXIF comment metadata
CMRotationMatrix m = att.rotationMatrix;
GLKMatrix4 attMat = GLKMatrix4Make(m.m11, m.m12, m.m13, 0, m.m21, m.m22, m.m23, 0, m.m31, m.m32, m.m33, 0, 0, 0, 0, 1);
NSMutableDictionary *EXIFDictionary = (__bridge NSMutableDictionary *)CFDictionaryGetValue(mutable, kCGImagePropertyExifDictionary);
[EXIFDictionary setValue:NSStringFromGLKMatrix4(attMat) forKey:(NSString *)kCGImagePropertyExifUserComment];
CFDictionarySetValue(mutable, kCGImagePropertyExifDictionary, (__bridge void *)EXIFDictionary);

NSData *jpeg = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
After this code you will have your image in the jpeg NSData and the corresponding dictionary for that image in the mutable CFDictionary.
All you have to do now is:
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);
CFStringRef UTI = CGImageSourceGetType(source); // this is the type of image (e.g., public.jpeg)
NSMutableData *dest_data = [NSMutableData data];
CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, UTI, 1, NULL);
if (!destination) {
    NSLog(@"***Could not create image destination ***");
}

// add the image contained in the image source to the destination, overriding the old metadata with our modified metadata
CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)mutable);

// tell the destination to write the image data and metadata into our data object.
// It will return false if something goes wrong
BOOL success = CGImageDestinationFinalize(destination);
if (!success) {
    NSLog(@"***Could not create data from image destination ***");
}

// now we have the data ready to go, so do whatever you want with it
// here we just write it to disk at the same path we were passed
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0]; // Get documents folder
NSString *dataPath = [documentsDirectory stringByAppendingPathComponent:@"ImagesFolder"];
NSError *error;
if (![[NSFileManager defaultManager] fileExistsAtPath:dataPath])
    [[NSFileManager defaultManager] createDirectoryAtPath:dataPath withIntermediateDirectories:NO attributes:nil error:&error]; // Create folder

// NSString *imageName = @"ImageName";
NSString *fullPath = [dataPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.jpg", name]]; // add our image to the path
[dest_data writeToFile:fullPath atomically:YES];

// cleanup
CFRelease(destination);
CFRelease(source);
Note that I'm not saving using ALAssets but directly into a folder of my choice.
By the way, most of this code can be found in the link I posted at first.
There is an easier way. If you need to save some EXIF, you can use the SimpleExif pod.
First create an ExifContainer:
ExifContainer *container = [[ExifContainer alloc] init];
and populate it with all the required data:
[container addUserComment:@"A long time ago, in a galaxy far, far away"];
[container addCreationDate:[NSDate dateWithTimeIntervalSinceNow:-10000000]];
[container addLocation:locations[0]];
Then you can add this data to the image:
NSData *imageData = [[UIImage imageNamed:@"DemoImage"] addExif:container];
Then you just save this data as a JPEG.
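For example, a minimal sketch of that last step (the Documents path and the photo.jpg file name are only illustrative):
// Write the EXIF-tagged JPEG data to the app's Documents directory.
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *path = [documentsDirectory stringByAppendingPathComponent:@"photo.jpg"];
[imageData writeToFile:path atomically:YES];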
I faced the same problem. Now I can upload files with EXIF data, and you can also compress the photo if needed; this solved the issue for me:
// Get your image.
UIImage *loImgPhoto = [self getImageFromAsset:loPHAsset];

// Get your metadata (includes the EXIF data).
CGImageSourceRef loImageOriginalSource = CGImageSourceCreateWithData((CFDataRef)loDataFotoOriginal, NULL);
NSDictionary *loDicMetadata = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(loImageOriginalSource, 0, NULL);

// Set your compression quality (0.0 to 1.0).
NSMutableDictionary *loDicMutableMetadata = [loDicMetadata mutableCopy];
[loDicMutableMetadata setObject:@(lfCompressionQualityValue) forKey:(__bridge NSString *)kCGImageDestinationLossyCompressionQuality];

// Create an image destination.
NSMutableData *loNewImageDataWithExif = [NSMutableData data];
CGImageDestinationRef loImgDestination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)loNewImageDataWithExif, CGImageSourceGetType(loImageOriginalSource), 1, NULL);

// Add your image to the destination.
CGImageDestinationAddImage(loImgDestination, loImgPhoto.CGImage, (__bridge CFDictionaryRef)loDicMutableMetadata);

// Finalize the destination.
if (CGImageDestinationFinalize(loImgDestination))
{
    NSLog(@"Successful image creation.");
    // process the image rendering, adjustment data creation and finalize the asset edit.
    // Upload photo with EXIF metadata
    [self myUploadMethod:loNewImageDataWithExif];
}
else
{
    NSLog(@"Error -> failed to finalize the image.");
}
CFRelease(loImageOriginalSource);
CFRelease(loImgDestination);
getImageFromAsset method:
- (UIImage *)getImageFromAsset:(PHAsset *)aPHAsset
{
    __block UIImage *limgImageResult;
    PHImageRequestOptions *lPHImageRequestOptions = [PHImageRequestOptions new];
    lPHImageRequestOptions.synchronous = YES;
    [self.imageManager requestImageForAsset:aPHAsset
                                 targetSize:PHImageManagerMaximumSize
                                contentMode:PHImageContentModeDefault //PHImageContentModeAspectFit
                                    options:lPHImageRequestOptions
                              resultHandler:^(UIImage *limgImage, NSDictionary *info) {
                                  limgImageResult = limgImage;
                              }];
    return limgImageResult;
}
Here are the basics of setting Make and Model metadata on a .jpg file in Swift 3: https://gist.github.com/lacyrhoades/09d8a367125b6225df5038aec68ed9e7 The higher-level approaches, like using the ExifContainer pod, did not work for me.

NSString to NSData conversion Problem

I have some bytes of an image in a string and I want to draw it in a UIImageView. Here is my code:
NSString *str = @"<89504e47 0d0a1a0a 0000000d 49484452 ........... 454e44ae 426082>";
NSData *data = [str dataUsingEncoding:NSUTF8StringEncoding];
NSLog(@"My NSDATA %@", data);
imageView.image = [UIImage imageWithData:data];
Now when I look at that data printed on the console, it is not in the same format as what I gave the string. The output is something like:
<3c383935 30346534 37203064 30613161..........
So my image view shows nothing. Please help.
If the question is how to convert string data to an image, then this is the answer:
NSData *imgData = [NSData dataWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"icon" ofType:@"png"]];
// set your string data into the inputString var
NSString *inputString = [imgData description];
NSLog(@"input string %@", inputString);

// clear the string of the surrounding angle brackets
NSString *dataStr = [inputString stringByTrimmingCharactersInSet:[NSCharacterSet characterSetWithCharactersInString:@"<>"]];
// separate into words of 4 bytes
NSArray *words = [dataStr componentsSeparatedByString:@" "];
// calculate the number of bytes
NSArray *sizes = [words valueForKey:@"length"];
int sizeOfBytes = 0;
for (NSNumber *size in sizes) {
    sizeOfBytes += [size intValue] / 2;
}

int bytes[sizeOfBytes];
int counts = 0;
for (NSString *word in words) {
    // convert each word from string to int, reversing the byte order so the
    // little-endian int lays the bytes out in their original order in memory
    NSString *remaining = word;
    NSMutableString *ostr = [NSMutableString stringWithCapacity:[remaining length]];
    while ([remaining length] > 0) {
        [ostr appendFormat:@"%@", [remaining substringFromIndex:[remaining length] - 2]];
        remaining = [remaining substringToIndex:[remaining length] - 2];
    }
    NSScanner *scaner = [NSScanner scannerWithString:ostr];
    unsigned int val;
    [scaner scanHexInt:&val];
    bytes[counts] = val;
    counts++;
}

// get NSData from the C array
NSData *data = [NSData dataWithBytes:bytes length:sizeOfBytes];
NSLog(@"My NSDATA %@", data);
// your image is ready
UIImage *image = [UIImage imageWithData:data];
NSLog(@"image: %@", image);
What you are seeing in the NSLog output are the ASCII codes of the string's characters.
For example:
NSString *str = @"A";
NSData *data = [str dataUsingEncoding:NSUTF8StringEncoding];
NSLog(@"%@", data);
You will see something like:
<41....
That's because 0x41 is the code for the letter A.
The same is happening with your string.
The data is exactly what you're feeding it: a plain string (printed as raw byte values). But I guess your input string is a hexdump, and you need to turn it into bytes manually.
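A minimal sketch of that conversion (the helper name NSDataFromHexDumpString is just an illustration; it reuses the strtol approach from the first question above):
// Hypothetical helper: turns a hexdump string like "<89504e47 0d0a1a0a ...>"
// back into the raw bytes it describes.
static NSData *NSDataFromHexDumpString(NSString *hexDump) {
    // Strip the angle brackets and the spaces between groups.
    NSString *hex = [hexDump stringByTrimmingCharactersInSet:
                     [NSCharacterSet characterSetWithCharactersInString:@"<>"]];
    hex = [hex stringByReplacingOccurrencesOfString:@" " withString:@""];

    NSMutableData *data = [NSMutableData dataWithCapacity:hex.length / 2];
    char byteChars[3] = {'\0', '\0', '\0'};
    for (NSUInteger i = 0; i + 1 < hex.length; i += 2) {
        byteChars[0] = (char)[hex characterAtIndex:i];
        byteChars[1] = (char)[hex characterAtIndex:i + 1];
        unsigned char byte = (unsigned char)strtol(byteChars, NULL, 16);
        [data appendBytes:&byte length:1];
    }
    return data;
}

// Usage: UIImage *image = [UIImage imageWithData:NSDataFromHexDumpString(str)];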