Error saving UIImage to Core Data - iPhone

I am having trouble saving a photo into Core Data. I am trying to save it as an attribute set to 'Transformable' in an entity. I have seen various discussions on this on SO, and the consensus seems to be that in iOS 5 and above I don't need to use a coder, as UIImage now conforms to NSCoding. I am getting an error when I try to save to Core Data. Please see below the code I am using to save the photo...
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    UIImage *originalImage, *editedImage, *imageToSave;
    // Handle a still image capture
    if (CFStringCompare((CFStringRef)mediaType, kUTTypeImage, 0) == kCFCompareEqualTo) {
        editedImage = (UIImage *)[info objectForKey:UIImagePickerControllerEditedImage];
        originalImage = (UIImage *)[info objectForKey:UIImagePickerControllerOriginalImage];
        if (editedImage) {
            imageToSave = editedImage;
        } else {
            imageToSave = originalImage;
        }
        // Convert image to data for entry into Core Data
        NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(imageToSave)];
        // Add image to Core Data
        myEntity.attribute = imageData;
        NSError *error = nil;
        if (![managedObjectContext save:&error]) {
            NSLog(@"Error when saving core data");
            NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
            abort();
        }
    }
    [[picker parentViewController] dismissModalViewControllerAnimated:YES];
    [picker release];
}

I agree with Joseph's answer. But also look at Apple's recommendation for storing an image, depending on its size (courtesy of Marcus S. Zarra's answer here):
Less than 100 KB: store it as a binary property in your main table.
Less than 1 MB: store it as a binary property in an ancillary table to avoid over-fetching.
Greater than 1 MB: store it on disk and keep its file path in the Core Data table.
From your code, what I see is that you are trying to save an image taken with the camera to Core Data. We know images taken with a phone/iPad camera are roughly 2.5 MB nowadays, so it is quite possible that you will run into performance issues. I would advise you to store the image in the Documents directory and save its path as an NSString in your entity. It is a more efficient way.
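For illustration, a minimal sketch of that approach, reusing the imageToSave and myEntity names from the question (imagePath is a hypothetical NSString attribute, and the file name is illustrative):

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *imagePath = [documentsDirectory stringByAppendingPathComponent:@"photo.png"]; // illustrative file name
NSError *writeError = nil;
if ([UIImagePNGRepresentation(imageToSave) writeToFile:imagePath options:NSDataWritingAtomic error:&writeError]) {
    myEntity.imagePath = imagePath; // hypothetical NSString attribute holding the path
} else {
    NSLog(@"Could not write image: %@", writeError);
}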

I have done this many times. Change the storage type from Transformable to Binary Data and you should be fine.
You also want to keep a couple of things in mind. If the image is small (1 MB or less), there should be no issue storing it in your main entity. If it is larger, you should store the image in an entity of its own for performance reasons. If the image is very large, you may want to consider storing it in the Documents directory, as the anonymous answer above suggests.

Related

iPhone, saving data in a to-many relationship, Core Data

I'm working on a small iPhone app using Core Data, with a Person and an Image entity. The relationship between Person and Image is to-many: one person to many images.
The problem happens in the save method. During saving, I first need to ADD more than one image's data (obtained from the UIImagePickerController) to the Person, and then SAVE the Person entity to the actual DB, all in the same method.
if (managedObjectContext == nil)
{
    managedObjectContext = [(CoreDataCombine01AppDelegate *)[[UIApplication sharedApplication] delegate] managedObjectContext];
}
//???: how to init _PersonImage?
_PersonImage = (PersonImage *)[NSEntityDescription insertNewObjectForEntityForName:@"PersonImage" inManagedObjectContext:managedObjectContext];
//_PersonImage = [_PersonImage init];
//_PersonImage = [[_PersonImage alloc] init];
_PersonImage.originalImage = imageData;
managedObjectContext = nil;
[managedObjectContext release];

if (managedObjectContext == nil)
{
    managedObjectContext = [(CoreDataCombine01AppDelegate *)[[UIApplication sharedApplication] delegate] managedObjectContext];
}
person = (Person *)[NSEntityDescription insertNewObjectForEntityForName:@"Person" inManagedObjectContext:managedObjectContext];
[person addPersonToPersonImageObject:_PersonImage]; // runtime error
[_PersonImage release];

NSError *error = nil;
if (![person.managedObjectContext save:&error]) {
    // Handle error
    NSLog(@"Unresolved error at save %@, %@", error, [error userInfo]);
    exit(-1); // Fail
}
then I got an error saying:
"-[NSConcreteMutableData CGImage]: unrecognized selector sent to instance"
and:
"*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[NSConcreteMutableData CGImage]: unrecognized selector sent to instance".
I guessed the error came from the double use of NSEntityDescription with one managedObjectContext. So instead of using the managedObjectContext, I just initialized the Person class (derived automatically from the data model and imported manually) like in the commented-out lines.
That doesn't give any compile error, but it gives me a runtime error when I hit my save button.
When I was using a one-to-one relationship there was no problem with saving, so I guess using the managedObjectContext for Person is the right way. However, once I use a to-many relationship I need to save the image data to the PersonImage entity, and that entity has to be initialized one way or another.
What am I missing?
Alright.
I've had a look at the code you posted on GitHub and found a few issues with your DetailView controller, which I am outlining below.
Firstly, you were not passing your managed object context to the detail view properly, meaning that when you were trying to save your objects there was no context for them to be saved from. Think of the Managed Object Context as a "draft" of your persistent store. Any changes you make to an NSManagedObject will be tracked and kept on your context until you persist them with the [managedObjectContext save] command.
So just to be clear: you are creating your context in your AppDelegate, then passing a reference to it to your RootViewController with rootViewController.managedObjectContext = self.managedObjectContext.
From the RootViewController, you need to pass the managedObjectContext down to the DetailView using the same technique. So in your tableView:didSelectRowAtIndexPath: method, you should pass a reference to the context so any changes you make on that view will be tracked and can eventually be persisted:
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath {
    DetailView *detailView = [[DetailView alloc] initWithNibName:@"DetailView" bundle:nil];
    detailView.event = (Event *)[[self fetchedResultsController] objectAtIndexPath:indexPath];
    detailView.managedObjectContext = self.managedObjectContext;
    // ...
    [self.navigationController pushViewController:detailView animated:YES];
    [detailView release];
}
I've also updated all the other references to managedObjectContext where you were instantiating a new context object to simply point to self.managedObjectContext, i.e.:
JustString *_justString = [NSEntityDescription insertNewObjectForEntityForName:@"JustString" inManagedObjectContext:self.managedObjectContext];
Once that's out of the way, there is only one more thing that was preventing you from saving the image object properly, which TechZen touched on above.
In your DetailView controller, you have an imagePickerController:didFinishPickingMediaWithInfo: method where you transform the UIImage into NSData and then assign it to your Image entity instance.
The problem with that is that you already have a transformer method in your object model (see Event.m) called -(id)transformedValue:(id)value.
So basically you were trying to transform the UIImage into NSData and then passing the NSData to the entity, which was actually expecting a UIImage. In this case, I'd recommend that you let your object model deal with the data transformation, so in your DetailView controller, comment out the UIImage-to-NSData transformer code and pass the image directly to your managed object:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *selectedImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    // Transform the image to NSData
    // ImageToDataTransformer *transformer = [[[ImageToDataTransformer alloc] init] autorelease];
    // NSData *imageData = [transformer transformedValue:selectedImage];
    Image *_image = [NSEntityDescription insertNewObjectForEntityForName:@"Image" inManagedObjectContext:self.managedObjectContext];
    _image.justImage = selectedImage;
    [event addEventToImageObject:_image];
    [picker dismissModalViewControllerAnimated:YES];
}
After that, you should be good to go so give it a try and let us know if it solves your problems.
CoreData can have a bit of a steep learning curve but once you 'get it' you will realise it's one of the most beautifully crafted APIs available in iOS development.
Good luck with it and let us know how you go!
Rog
The error message you are getting is caused by confusing an NSData instance with a UIImage instance.
Like NSString, NSArray and similar core classes, NSData is a class cluster rather than a single class. NSConcreteMutableData is one of the classes in the cluster.
Your problem arises because you are trying to send the message CGImage to an NSData instance, which of course does not have a CGImage method.
Since you have to use a value transformer to store an image in a Core Data attribute, I would suggest looking at your value transformer and/or the method where you assign the image.
Looking at your code, I suspect you have Person.originalImage set as a transformable attribute but you attempt to assign an NSData object to it. When the value transformer attempts to transform the NSData instance while thinking it is a UIImage, it generates the error.
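For reference, a reversible value transformer of the kind both answers describe typically looks something like this (a minimal sketch; the class name mirrors the ImageToDataTransformer mentioned above, and the transformer must be named for the transformable attribute in your model):

@interface ImageToDataTransformer : NSValueTransformer
@end

@implementation ImageToDataTransformer

+ (Class)transformedValueClass {
    return [NSData class]; // Core Data persists the forward-transformed NSData
}

+ (BOOL)allowsReverseTransformation {
    return YES;
}

- (id)transformedValue:(id)value {
    // Forward: UIImage in, NSData out (what gets written to the store).
    return UIImagePNGRepresentation((UIImage *)value);
}

- (id)reverseTransformedValue:(id)value {
    // Reverse: NSData from the store back into a UIImage on fetch.
    return [UIImage imageWithData:(NSData *)value];
}

@end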

iPhone - User Defaults and UIImages

I've been developing an iPhone app for the last few months. Recently I wanted to up performance and cache a few of the images that are used in the UI. The images are downloaded randomly from the web by the user so I can't add specific images to the project. I'm also already using NSUserDefaults to save other info within the app.
So now I'm attempting to save a dictionary of UIImages to my NSUserDefaults object and get...
-[UIImage encodeWithCoder:]: unrecognized selector sent to instance
I then decided to subclass UIImage with a class named UISaveableImage and implement NSCoding. So now I'm at...
@implementation UISaveableImage

- (void)encodeWithCoder:(NSCoder *)encoder
{
    [encoder encodeObject:super forKey:@"image"];
}

- (id)initWithCoder:(NSCoder *)decoder
{
    if (self = [super init]) {
        super = [decoder decodeObjectForKey:@"image"];
    }
    return self;
}

@end
which isn't any better than where I started. If I were able to convert a UIImage to NSData I would be good, but all I can find are functions like UIImagePNGRepresentation, which require me to know what type of image this was, something UIImage doesn't let me find out. Thoughts? I feel like I might have wandered down the wrong path...
You don't want to store images in NSUserDefaults. They're big blobs of data, and NSUserDefaults is stored as a plist; you want to write small bits of info to it.
You should write the images to disk, and then store the filenames to defaults:
NSString *filename = myImageFilename;
[UIImagePNGRepresentation(image) writeToFile:filename atomically:YES];
[[NSUserDefaults standardUserDefaults] setObject:filename forKey:@"lastImageFilename"];
Stumbling upon this a year later. I would add (in case someone else stumbles here as well) that you should store the images in the cache directory and avoid iTunes trying to back them up.
- (NSString *)pathForSearchPath:(NSSearchPathDirectory)searchPath {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(searchPath, NSUserDomainMask, YES);
    NSString *directoryPath = [paths objectAtIndex:0];
    return directoryPath;
}
- (NSString *)cacheDirectoryPath {
    return [self pathForSearchPath:NSCachesDirectory];
}
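Putting the two answers together, a sketch of the save/restore round trip (the file name and defaults key are illustrative; note that the system may purge the caches directory, so be prepared to re-download):

// Save: write the PNG into the caches directory, remember only the file name.
NSString *fileName = @"lastImage.png"; // illustrative
NSString *path = [[self cacheDirectoryPath] stringByAppendingPathComponent:fileName];
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
[[NSUserDefaults standardUserDefaults] setObject:fileName forKey:@"lastImageFilename"];

// Load: rebuild the path and read the image back, if it is still cached.
NSString *savedName = [[NSUserDefaults standardUserDefaults] stringForKey:@"lastImageFilename"];
NSString *savedPath = [[self cacheDirectoryPath] stringByAppendingPathComponent:savedName];
UIImage *restored = [UIImage imageWithContentsOfFile:savedPath]; // nil if purged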

Truncated Core Data NSData objects

I am saving arrays of doubles in an NSData* object that is persisted as a binary property in a Core Data (SQLite) data model. I am doing this to store sampled data for graphing in an iPhone app. Sometimes when there are more than 300 doubles in the binary object not all the doubles are getting saved to disk. When I quit and relaunch my app there may be as few as 25 data points that have persisted or as many as 300.
I am using NSSQLitePragmasOption with synchronous = FULL, and this may be making a difference. It is hard to tell, as the bug is intermittent.
Given the warnings about performance problems as a result of using synchronous = FULL, I am seeking advice and pointers.
Thanks.
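For reference, the pragma is passed as a store option when the persistent store is added (a sketch; storeURL and persistentStoreCoordinator are assumed to already exist):

NSDictionary *pragmaOptions = [NSDictionary dictionaryWithObject:@"FULL" forKey:@"synchronous"];
NSDictionary *storeOptions = [NSDictionary dictionaryWithObject:pragmaOptions forKey:NSSQLitePragmasOption];
NSError *storeError = nil;
if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                              configuration:nil
                                                        URL:storeURL
                                                    options:storeOptions
                                                      error:&storeError]) {
    NSLog(@"Unresolved error %@, %@", storeError, [storeError userInfo]);
}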
[[Edit: here is code.]]
The (as yet unrealized) intent of -addToCache: is to add each new datum to the cache but only flush (fault?) the Data object periodically.
From Data.m
@dynamic dataSet; // NSData * attribute of Data entity

- (void) addDatum:(double_t)datum
{
    DLog(@"-[Data addDatum:%f]", datum);
    [self addToCache:datum];
}

- (void) addToCache:(double_t)datum
{
    if (cache == nil)
    {
        cache = [NSMutableData dataWithData:[self dataSet]];
        [cache retain];
    }
    [cache appendBytes:&datum length:sizeof(double_t)];
    DLog(@"-[Data addToCache:%f] ... [cache length] = %d; cache = %p", datum, [cache length], cache);
    [self flushCache];
}

- (void) wrapup
{
    DLog(@"-[Data wrapup]");
    [self flushCache];
    [cache release];
    cache = nil;
    DLog(@"[self isFault] = %@", [self isFault] ? @"YES" : @"NO"); // [self isFault] is always NO.
}

- (void) flushCache
{
    DLog(@"flushing cache to store");
    [self setDataSet:cache];
    DLog(@"-[Data flushCache:] [[self dataSet] length] = %d", [[self dataSet] length]);
}

- (double*) bytes
{
    return (double*)[[self dataSet] bytes];
}

- (NSInteger) count
{
    return [[self dataSet] length]/sizeof(double);
}

- (void) dump
{
    ALog(@"Dump Data");
    NSInteger numDataPoints = [self count];
    double *data = (double*)[self bytes];
    ALog(@"numDataPoints = %d", numDataPoints);
    for (int i = 0; i < numDataPoints; i++)
        ALog(@"data[%d] = %f", i, data[i]);
}
I was trying to get behavior as if my Core Data entity could have an NSMutableData attribute. To do this, my NSManagedObject (called Data) had an NSData attribute and an NSMutableData ivar. My app takes sample data from a sensor and appends each data point to the data set; this is why I needed this design.
Each new data point was appended to the NSMutableData, and then the NSData attribute was set to the NSMutableData.
I suspect that because the NSData pointer wasn't changing (though its content was), Core Data did not appreciate the extent of the change. Calling -hasChanges on the NSManagedObjectContext showed that there had been changes, and calling -updatedObjects even listed the Data object as having changed. But the actual data that was being written seems to have been truncated (sometimes).
To work around this I changed things slightly. New data points are still appended to the NSMutableData, but the NSData attribute is only set when sampling is completed. This means there is a chance that a crash might result in truncated data, but for the most part this workaround seems to have solved the problem.
Caveat emptor: the bug was always intermittent, so it is possible that it is still there, just harder to manifest.
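A sketch of that workaround against the code above: keep appending to the NSMutableData ivar, but hand Core Data an immutable copy, and only once sampling finishes, so the attribute points at a genuinely new object:

- (void) flushCache
{
    // Copying forces the attribute to reference a fresh NSData instance,
    // making the change unambiguous when the context is saved.
    [self setDataSet:[NSData dataWithData:cache]];
}

- (void) wrapup
{
    [self flushCache]; // flush once, after sampling has completed
    [cache release];
    cache = nil;
}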

iPhone UIImage initWithData fails

I'm trying to code up an async image downloader. I use NSURLConnection to get the data into an NSMutableData and use that data once it is complete to initialize a UIImage.
I checked the bytes and it downloads the entire image correctly (the right number of bytes, at least); however, when I call
[UIImage imageWithData:data]
and then check the properties of the image, it has zero width and a garbage number for height (in fact, the same number no matter what the image is). I tried a bunch of different images, PNG, JPG, different URLs; it always downloads the image completely, but UIImage can't initialize with that data. What could I be doing wrong here?
Thanks.
The code is really as you'd expect it to look:
Connection delegate:
- (void)connectionDidFinishLoading:(NSURLConnection *)theConnection {
    [[ImageManager sharedInstance] dataDownloadedBy:self];
}
ImageManager:
- (void)dataDownloadedBy:(WebConnection *)connection {
    WebImage *image = [[WebImage alloc] initWithLink:connection.url];
    [image setImageFromData:connection.data];
    [images addObject:image];
    [connection release];
}
WebImage:
- (void)setImageFromData:(NSMutableData *)data {
    image = [[UIImage alloc] initWithData:data];
}
First, I'm sure UIImage will not initialize with garbage data. The initWithData: constructor analyzes the data to determine the file format. If your data is corrupted, the image returned will be nil. Check this first.
- (void)dataDownloadedBy:(WebConnection *)connection {
    WebImage *image = [[WebImage alloc] initWithLink:connection.url];
    [image setImageFromData:connection.data];
    if (image.image != nil) {
        [images addObject:image];
    }
    [connection release];
}
Second, make sure you append the data during the download process. Here is the callback method:
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
    [self.receivedData appendData:data];
}
Finally, your code must absolutely handle the second case: a download failure.
- (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error
{
    [connection release];
}
Your problem description lacks some important pieces of code.
The URLCache demo from Apple is a very good project to understand async image download.
http://developer.apple.com/iphone/library/samplecode/URLCache/Introduction/Intro.html
I hope this will help you!
I can't tell what is wrong; it could be that you aren't using release properly. For instance, I don't know why you release the connection at the end of dataDownloadedBy:. Is it supposed to be the image?
It would help if you posted more of your code here.
I used to do the same thing; you can have a look at the post here:
http://blog.163.com/lionyue@126/blog/static/1079307120096895433277/
Hope it helps
If initWithData is failing, most likely the image data you're getting is corrupt. You should save it to a file like this:
[data writeToFile:@"/tmp/foo.jpg" atomically:NO];
and then try to open it in Preview.app.
A different problem that gives the same painful error: if you ask for a file that is not reachable, the HTTP server will give back a 404 error message. Your code reads those bytes (they are NOT valid bytes for an image) and you will get nil.

UIImagePickerController and extracting EXIF data from existing photos

It's well known that UIImagePickerController doesn't return the metadata of the photo after selection. However, a couple of apps in the app store (Mobile Fotos, PixelPipe) seem to be able to read the original files and the EXIF data stored within them, enabling the app to extract the geodata from the selected photo.
They seem to do this by reading the original file from the /private/var/mobile/Media/DCIM/100APPLE/ folder and running it through an EXIF library.
However, I can't work out a way of matching a photo returned from the UIImagePickerController to a file on disk. I've explored file sizes, but the original file is a JPEG, whilst the returned image is a raw UIImage, making it impossible to know the file size of the image that was selected.
I'm considering making a table of hashes and matching against the first x pixels of each image. This seems a bit over the top though, and probably quite slow.
Any suggestions?
Have you taken a look at this EXIF iPhone library?
http://code.google.com/p/iphone-exif/
Gonna try it on my side. I'd like to get the GPS (geotags) coordinates from the picture that has been taken with the UIImagePickerController :/
After a deeper look, this library seems to take NSData as its input, whereas the UIImagePickerController returns a UIImage after taking a snapshot. In theory, if we use this UIKit function for UIImage
NSData * UIImageJPEGRepresentation (
UIImage *image,
CGFloat compressionQuality
);
then we can convert the UIImage into an NSData instance and use it with the iPhone EXIF library.
UPDATE:
I gave the library mentioned above a try, and it seems to work. However, because of my limited knowledge of the EXIF format and the lack of a high-level API in the library, I haven't managed to get the values of the EXIF tags.
Here's my code, in case any of you can get further:
#import "EXFJpeg.h"
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo {
    NSLog(@"image picked %@ with info %@", image, editingInfo);
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.5);
    EXFJpeg *jpegScanner = [[EXFJpeg alloc] init];
    [jpegScanner scanImageData:jpegData];
    EXFMetaData *exifData = jpegScanner.exifMetaData;
    EXFJFIF *jfif = jpegScanner.jfif;
    EXFTag *tagDefinition = [exifData tagDefinition:[NSNumber numberWithInt:EXIF_DateTime]];
    //EXFTag *latitudeDef = [exifData tagDefinition:[NSNumber numberWithInt:EXIF_GPSLatitude]];
    //EXFTag *longitudeDef = [exifData tagDefinition:[NSNumber numberWithInt:EXIF_GPSLongitude]];
    id latitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLatitude]];
    id longitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLongitude]];
    id datetime = [exifData tagValue:[NSNumber numberWithInt:EXIF_DateTime]];
    id t = [exifData tagValue:[NSNumber numberWithInt:EXIF_Model]];
....
....
Retrieving the tag definitions is OK, but all tag values return nil :(
In case you want to give the library a try, you need to define a global variable to get it running (as explained in the doc, but hum.. :/)
BOOL gLogging = FALSE;
UPDATE 2
Answer here : iPhone - access location information from a photo
A UIImage does not encapsulate the meta information, so we're stuck : for sure, no EXIF info will be given through this interface.
FINAL UPDATE
OK, I managed to get it working, at least well enough to properly geotag pictures returned by the picker.
Before triggering the UIImagePickerController, it's up to you to use the CLLocationManager to retrieve the current CLLocation.
Once you have it, you can use this method, which uses the exif-iPhone library, to geotag the UIImage with the CLLocation:
- (NSData *)geotagImage:(UIImage *)image withLocation:(CLLocation *)imageLocation {
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.8);
    EXFJpeg *jpegScanner = [[EXFJpeg alloc] init];
    [jpegScanner scanImageData:jpegData];
    EXFMetaData *exifMetaData = jpegScanner.exifMetaData;
    // adding GPS data to the EXIF object
    NSMutableArray *locArray = [self createLocArray:imageLocation.coordinate.latitude];
    EXFGPSLoc *gpsLoc = [[EXFGPSLoc alloc] init];
    [self populateGPS:gpsLoc :locArray];
    [exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLatitude]];
    [gpsLoc release];
    [locArray release];
    locArray = [self createLocArray:imageLocation.coordinate.longitude];
    gpsLoc = [[EXFGPSLoc alloc] init];
    [self populateGPS:gpsLoc :locArray];
    [exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLongitude]];
    [gpsLoc release];
    [locArray release];
    NSString *ref;
    if (imageLocation.coordinate.latitude < 0.0)
        ref = @"S";
    else
        ref = @"N";
    [exifMetaData addTagValue:ref forKey:[NSNumber numberWithInt:EXIF_GPSLatitudeRef]];
    if (imageLocation.coordinate.longitude < 0.0)
        ref = @"W";
    else
        ref = @"E";
    [exifMetaData addTagValue:ref forKey:[NSNumber numberWithInt:EXIF_GPSLongitudeRef]];
    NSMutableData *taggedJpegData = [[NSMutableData alloc] init];
    [jpegScanner populateImageData:taggedJpegData];
    [jpegScanner release];
    return [taggedJpegData autorelease];
}

// Helper methods for location conversion
- (NSMutableArray *)createLocArray:(double)val {
    val = fabs(val);
    NSMutableArray *array = [[NSMutableArray alloc] init];
    double deg = (int)val;
    [array addObject:[NSNumber numberWithDouble:deg]];
    val = val - deg;
    val = val * 60;
    double minutes = (int)val;
    [array addObject:[NSNumber numberWithDouble:minutes]];
    val = val - minutes;
    val = val * 60;
    double seconds = val;
    [array addObject:[NSNumber numberWithDouble:seconds]];
    return array;
}

- (void)populateGPS:(EXFGPSLoc *)gpsLoc :(NSArray *)locArray {
    long numDenumArray[2];
    long *arrPtr = numDenumArray;
    [EXFUtils convertRationalToFraction:&arrPtr :[locArray objectAtIndex:0]];
    EXFraction *fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
    gpsLoc.degrees = fract;
    [fract release];
    [EXFUtils convertRationalToFraction:&arrPtr :[locArray objectAtIndex:1]];
    fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
    gpsLoc.minutes = fract;
    [fract release];
    [EXFUtils convertRationalToFraction:&arrPtr :[locArray objectAtIndex:2]];
    fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
    gpsLoc.seconds = fract;
    [fract release];
}
This works with iOS 5 (beta 4) and the camera roll (you need typedefs for the blocks in the .h):
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
        if (url) {
            ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset) {
                CLLocation *location = [myasset valueForProperty:ALAssetPropertyLocation];
                // location contains lat/long, timestamp, etc.
                // extracting the image is more tricky and 5.x beta ALAssetRepresentation has bugs!
            };
            ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror) {
                NSLog(@"can't get image - %@", [myerror localizedDescription]);
            };
            ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
            [assetsLib assetForURL:url resultBlock:resultblock failureBlock:failureblock];
        }
    }
}
There is a way in iOS 8 without using any third-party EXIF library.
#import <Photos/Photos.h>
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
    PHFetchResult *fetchResult = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
    PHAsset *asset = fetchResult.firstObject;
    // All you need is:
    // asset.location.coordinate.latitude
    // asset.location.coordinate.longitude
    // Other useful properties of PHAsset:
    // asset.favorite
    // asset.modificationDate
    // asset.creationDate
}
Apple added the Image I/O framework in iOS 4, which can be used to read EXIF data from pictures. I don't know if the UIImagePickerController returns a picture with the EXIF data embedded, though.
Edit: In iOS 4 you can fetch the EXIF data by grabbing the value of the UIImagePickerControllerMediaMetadata key in the info dictionary which is passed to the UIImagePickerController delegate.
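For example, something like this should surface the EXIF dictionary for a fresh camera capture (a minimal sketch; per the documentation the metadata key is only populated when the picker's source type is the camera, and kCGImagePropertyExifDictionary comes from Image I/O):

#import <ImageIO/ImageIO.h>

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Only populated for camera captures, not photo-library picks.
    NSDictionary *metadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
    NSDictionary *exif = [metadata objectForKey:(__bridge NSString *)kCGImagePropertyExifDictionary];
    NSLog(@"EXIF: %@", exif);
}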
I had a similar question where I wanted just the date a picture was taken, and none of the above appeared to solve my problem in a simple way (e.g. with no external libraries), so here is all of the data I could find that you can extract from an image after selecting it with the picker:
// Inside whatever implements UIImagePickerControllerDelegate
@import AssetsLibrary;
// ... your other code here ...
@implementation MYImagePickerDelegate

- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = info[UIImagePickerControllerMediaType];
    UIImage *originalImage = info[UIImagePickerControllerOriginalImage];
    UIImage *editedImage = info[UIImagePickerControllerEditedImage];
    NSValue *cropRect = info[UIImagePickerControllerCropRect];
    NSURL *mediaUrl = info[UIImagePickerControllerMediaURL];
    NSURL *referenceUrl = info[UIImagePickerControllerReferenceURL];
    NSDictionary *mediaMetadata = info[UIImagePickerControllerMediaMetadata];
    NSLog(@"mediaType=%@", mediaType);
    NSLog(@"originalImage=%@", originalImage);
    NSLog(@"editedImage=%@", editedImage);
    NSLog(@"cropRect=%@", cropRect);
    NSLog(@"mediaUrl=%@", mediaUrl);
    NSLog(@"referenceUrl=%@", referenceUrl);
    NSLog(@"mediaMetadata=%@", mediaMetadata);
    if (!referenceUrl) {
        NSLog(@"Media did not have reference URL.");
    } else {
        ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
        [assetsLib assetForURL:referenceUrl
                   resultBlock:^(ALAsset *asset) {
            NSString *type = [asset valueForProperty:ALAssetPropertyType];
            CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
            NSNumber *duration = [asset valueForProperty:ALAssetPropertyDuration];
            NSNumber *orientation = [asset valueForProperty:ALAssetPropertyOrientation];
            NSDate *date = [asset valueForProperty:ALAssetPropertyDate];
            NSArray *representations = [asset valueForProperty:ALAssetPropertyRepresentations];
            NSDictionary *urls = [asset valueForProperty:ALAssetPropertyURLs];
            NSURL *assetUrl = [asset valueForProperty:ALAssetPropertyAssetURL];
            NSLog(@"type=%@", type);
            NSLog(@"location=%@", location);
            NSLog(@"duration=%@", duration);
            NSLog(@"assetUrl=%@", assetUrl);
            NSLog(@"orientation=%@", orientation);
            NSLog(@"date=%@", date);
            NSLog(@"representations=%@", representations);
            NSLog(@"urls=%@", urls);
        }
                  failureBlock:^(NSError *error) {
            NSLog(@"Failed to get asset: %@", error);
        }];
    }
    [picker dismissViewControllerAnimated:YES
                               completion:nil];
}

@end
So when you select an image, you get output that looks like this (including date!):
mediaType=public.image
originalImage=<UIImage: 0x7fb38e00e870> size {1280, 850} orientation 0 scale 1.000000
editedImage=<UIImage: 0x7fb38e09e1e0> size {640, 424} orientation 0 scale 1.000000
cropRect=NSRect: {{0, 0}, {1280, 848}}
mediaUrl=(null)
referenceUrl=assets-library://asset/asset.JPG?id=AC072879-DA36-4A56-8A04-4D467C878877&ext=JPG
mediaMetadata=(null)
type=ALAssetTypePhoto
location=(null)
duration=ALErrorInvalidProperty
assetUrl=assets-library://asset/asset.JPG?id=AC072879-DA36-4A56-8A04-4D467C878877&ext=JPG
orientation=0
date=2014-07-14 04:28:18 +0000
representations=(
"public.jpeg"
)
urls={
"public.jpeg" = "assets-library://asset/asset.JPG?id=AC072879-DA36-4A56-8A04-4D467C878877&ext=JPG";
}
Anyway, hopefully that saves someone else some time.
I spent a while working on this as well for an application I was contracted to build. Basically, as the API currently stands, it is not possible. The basic problem is that the UIImage class STRIPS all EXIF data except the orientation. Also, the function to save to the camera roll strips this data out. So basically the only way to grab and maintain any extra EXIF data is to save it in a private "camera roll" in your application. I have filed this bug with Apple as well and emphasized the need to the app reviewer reps we've been in contact with. Hopefully someday they'll add it in. Otherwise it makes GEO tagging completely useless, as it only works in the "stock" camera application.
NOTE: Some applications on the App Store hack around this by, from what I have found, directly accessing the camera roll and SAVING photos straight to it to preserve GEO data. However, this only works with the camera roll/saved photos and NOT the rest of the photo library. Photos "synced" to your phone from your computer have all EXIF data except orientation stripped.
I still can't understand why those applications were approved (heck they even DELETE from the camera roll) and our application which does none of that is still being held back.
For iOS 8 and later you can use the Photos framework.
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    let url = info[UIImagePickerControllerReferenceURL] as? URL
    if url != nil {
        let fetchResult = PHAsset.fetchAssets(withALAssetURLs: [url!], options: nil)
        let asset = fetchResult.firstObject
        print(asset?.location?.coordinate.latitude)
        print(asset?.creationDate)
    }
}
This is something that the public API does not provide, but could be useful to many people. Your primary recourse is to file a bug with Apple that describes what you need (and it can be helpful to explain why you need it as well). Hopefully your request could make it into a future release.
After filing a bug, you could also use one of the Developer Technical Support (DTS) incidents that came with your iPhone Developer Program membership. If there is a public way to do this, an Apple engineer will know. Otherwise, it may at least help get your plight a bit more attention within the mothership. Best of luck!
Use the UIImagePickerControllerMediaURL dictionary key to get the file URL to the original file. Despite what the documentation says, you can get the file URL for photos and not only movies.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Try to get the original file.
    NSURL *originalFile = [info objectForKey:UIImagePickerControllerMediaURL];
    if (originalFile) {
        NSData *fileData = [NSData dataWithContentsOfURL:originalFile];
    }
}
You might be able to hash the image data returned by the UIImagePickerController and each of the images in the directory and compare them.
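If you try that, CommonCrypto gives a cheap way to fingerprint the bytes for comparison (a sketch; the matching strategy is left to you):

#import <CommonCrypto/CommonDigest.h>

// Compute a SHA-1 hex digest of arbitrary image bytes.
static NSString *SHA1HexForData(NSData *data) {
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}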
Just a thought, but have you tried TTPhotoViewController in the Three20 project on GitHub?
That provides an image picker that can read from multiple sources. You may be able to use it as an alternative to UIImagePickerController, or the source might give you a clue how to work out how to get the info you need.
Is there a specific reason you want to extract the location data from the image? An alternative could be to get the location separately using the CoreLocation framework. If it's only the geodata you need, this might save you some headaches.
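If you go that route, the bare bones look something like this (a sketch; locationManager and lastKnownLocation are hypothetical properties, and self is assumed to adopt CLLocationManagerDelegate):

#import <CoreLocation/CoreLocation.h>

- (void)startLocationUpdates {
    // Start updating before presenting the image picker.
    self.locationManager = [[[CLLocationManager alloc] init] autorelease]; // hypothetical retained property
    self.locationManager.delegate = self;
    self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    [self.locationManager startUpdatingLocation];
}

- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation {
    // Keep the latest fix to pair with whatever photo the user picks next.
    self.lastKnownLocation = newLocation; // hypothetical property
}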
It seems that photos obtained via UIImagePickerControllerMediaURL don't have EXIF tags at all.
In order to get this metadata you'll have to use the lower-level framework AVFoundation.
Take a look at Apple's SquareCam example (http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html)
Find the method below and add the line I've added to the code. The metadata dictionary returned also contains a diagnostics NSDictionary object.
- (BOOL)writeCGImageToCameraRoll:(CGImageRef)cgImage withMetadata:(NSDictionary *)metadata
{
    NSDictionary *exif = [metadata objectForKey:@"Exif"]; // Add this line
}
I'm using this for camera roll images
- (CLLocation *)locationFromAsset:(ALAsset *)asset
{
    if (!asset)
        return nil;
    NSDictionary *pickedImageMetadata = [[asset defaultRepresentation] metadata];
    NSDictionary *gpsInfo = [pickedImageMetadata objectForKey:(__bridge NSString *)kCGImagePropertyGPSDictionary];
    if (gpsInfo) {
        NSNumber *nLat = [gpsInfo objectForKey:(__bridge NSString *)kCGImagePropertyGPSLatitude];
        NSNumber *nLng = [gpsInfo objectForKey:(__bridge NSString *)kCGImagePropertyGPSLongitude];
        if (nLat && nLng)
            return [[CLLocation alloc] initWithLatitude:[nLat doubleValue] longitude:[nLng doubleValue]];
    }
    return nil;
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    //UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    // create the asset library in the init method of your custom object or view controller
    //self.library = [[ALAssetsLibrary alloc] init];
    [self.library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        // try to retrieve gps metadata coordinates
        CLLocation *myLocation = [self locationFromAsset:asset];
        // Do your stuff....
    } failureBlock:^(NSError *error) {
        NSLog(@"Failed to get asset from library");
    }];
}
It works, obviously, only if the image contains GPS meta information.
Hope it helps
This is in Swift 3 if you still want support for iOS 8:
import AssetsLibrary

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    if picker.sourceType == UIImagePickerControllerSourceType.photoLibrary,
        let url = info[UIImagePickerControllerReferenceURL] as? URL {
        let assetLibrary = ALAssetsLibrary()
        assetLibrary.asset(for: url, resultBlock: { (asset) in
            if let asset = asset {
                let assetRep: ALAssetRepresentation = asset.defaultRepresentation()
                let metaData: NSDictionary = assetRep.metadata() as NSDictionary
                print(metaData)
            }
        }, failureBlock: { (error) in
            print(error!)
        })
    }
}
For iOS 10 - Swift 3
The picker's callback has an info dict with a metadata key: UIImagePickerControllerMediaMetadata.
The naughty way to do this is to traverse the UIImagePickerController's views and pick out the selected image in the delegate callback.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    id thumbnailView = [[[[[[[[[[picker.view subviews]
        objectAtIndex:0] subviews]
        objectAtIndex:0] subviews]
        objectAtIndex:0] subviews]
        objectAtIndex:0] subviews]
        objectAtIndex:0];
    NSString *fullSizePath = [[[thumbnailView selectedPhoto] fileGroup] pathForFullSizeImage];
    NSString *thumbnailPath = [[[thumbnailView selectedPhoto] fileGroup] pathForThumbnailFile];
    NSLog(@"%@ and %@", fullSizePath, thumbnailPath);
}
That will give you the path to the full size image, which you can then open with an EXIF library of your choice.
But, this calls a Private API and these method names will be detected by Apple if you submit this app. So don't do this, OK?