How to call a UIImagePickerController delegate method programmatically or forcefully [without user interaction]? - iPhone

In my app, I use UIImagePickerController for taking video. When my app receives a certain web service response mid-capture, I have to stop the video capture and save the video.
I used stopVideoCapture to do this, but it doesn't call the delegate method:

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info

How can I force-call the delegate method above? Or how do I handle interruptions in UIImagePickerController? Any ideas?

The idea with delegate methods is not that you call those methods: "they call you".
So I would not consider calling the delegate method yourself good practice. However, if you present the UIImagePickerController modally (which I guess is common for such a picker), then you can close it like this outside of your delegate method:
[[yourPicker parentViewController] dismissModalViewControllerAnimated:YES];
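For illustration, here is a minimal sketch (mine, not from the original answer) of reacting to the web-service event; handleServiceResponse: and the picker property are hypothetical names. Per Apple's documentation, stopVideoCapture should itself eventually trigger imagePickerController:didFinishPickingMediaWithInfo: once the movie file is written; if it does not, you can at least close the picker from the outside as above:

- (void)handleServiceResponse:(NSNotification *)note
{
    // Ask the picker to finish recording; the delegate normally
    // receives didFinishPickingMediaWithInfo: once the file is saved.
    [self.picker stopVideoCapture];
    // Fallback: dismiss the picker from outside the delegate
    // (pre-iOS 6 API, matching the line above).
    [[self.picker parentViewController] dismissModalViewControllerAnimated:YES];
}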
Update: You can use the ALAssetsLibrary for accessing the stored data in your iPhone media library. I recently had to do a similar project where I had to list all images on the iPhone. The Github project ELCImagePickerController.git was very useful since it shows how the items in your library can be accessed. So you'll do something like this:
#import <AssetsLibrary/AssetsLibrary.h>

// ....

-(void)fetchPhotoAlbums{
    if(!self.assetsGroups){
        self.assetsGroups = [NSMutableDictionary dictionary];
    }
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    NSMutableArray *returnArray = [[NSMutableArray alloc] init];
    @autoreleasepool {
        void (^assetGroupEnumerator)(ALAssetsGroup *, BOOL *) = ^(ALAssetsGroup *group, BOOL *stop){
            if (group == nil){
                // Completed
                [self.delegate pictureService:self fetchedAlbums:returnArray];
                return;
            }
            Album *currentAlbum = [self albumForAssetsGroup:group];
            // Store the group for later retrieval of the album's pictures
            [self.assetsGroups setObject:group forKey:currentAlbum.identifier];
            [returnArray addObject:currentAlbum];
            [self.delegate pictureService:self fetchedAlbums:returnArray];
        };
        void (^assetGroupEnumeratorFailure)(NSError *) = ^(NSError *error) {
            NSLog(@"A problem occurred: %@", [error description]);
        };
        [library enumerateGroupsWithTypes:ALAssetsGroupAll
                               usingBlock:assetGroupEnumerator
                             failureBlock:assetGroupEnumeratorFailure];
    }
}
-(void)fetchPhotosForAlbum:(Album *)album{
    ALAssetsGroup *currentGroup = [self.assetsGroups objectForKey:album.identifier];
    NSMutableArray *photos = [NSMutableArray array];
    [currentGroup enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *stop){
        if(asset == nil){
            [self.delegate pictureService:self fetchedPictures:photos forAlbum:album];
            return;
        }
        [photos addObject:[self pictureForAsset:asset]];
    }];
}
Additionally I use two mapper methods to convert the AL-classes into my own model classes.
- (Album *)albumForAssetsGroup:(ALAssetsGroup *)assetsGroup{
    Album *album = [[Album alloc] init];
    album.title = [assetsGroup valueForProperty:ALAssetsGroupPropertyName];
    album.identifier = [assetsGroup valueForProperty:ALAssetsGroupPropertyPersistentID];
    album.assetsCount = assetsGroup.numberOfAssets;
    album.thumbnail = [UIImage imageWithCGImage:assetsGroup.posterImage];
    return album;
}

- (Picture *)pictureForAsset:(ALAsset *)asset{
    Picture *picture = [[Picture alloc] init];
    picture.identifier = [((NSArray *)[asset valueForProperty:ALAssetPropertyRepresentations]) objectAtIndex:0];
    picture.thumbnail = [UIImage imageWithCGImage:asset.thumbnail];
    return picture;
}
See the AssetsLibrary Documentation

Related

iOS: Save array when app is closed or enters background

I have a cartArray (declared above the @interface in AppDelegate.h) that needs to be saved when the app enters background mode or is closed. The app worked fine when cartArray was empty, but it crashed when I added an item (a Cart object) to it and then entered the background or closed the application by pressing the minus sign. My cartArray contains Cart objects.
Can anyone tell me what is happening? The tutorials online are so complicated, and I always find myself lost in the middle of the explanation.
[AppDelegate.m]

- (void)applicationDidEnterBackground:(UIApplication *)application {
    [AppDelegate saveData];
}

- (void)applicationWillEnterForeground:(UIApplication *)application {
    [AppDelegate getData];
}

+(NSString *) getPathToAchieve{
    NSLog(@"+++++getPathToAchieve");
    static NSString *docsDir = nil;
    if (docsDir == nil) {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        docsDir = [paths objectAtIndex:0];
    }
    NSString *fullFileName = [NSString stringWithFormat:@"%@.plist", docsDir];
    return fullFileName;
}

- (void)applicationWillTerminate:(NSNotification *)notification{
    [cartArray writeToFile:[AppDelegate getPathToAchieve] atomically:YES];
}

-(id)initWithCoder:(NSCoder *)aDecoder
{
    self = [super init];
    if (self != nil)
    {
        cartArray = [aDecoder decodeObjectForKey:@"cartArrayKeys"];
    }
    return self;
}

-(void)encodeWithCoder:(NSCoder *)anEncoder
{
    [anEncoder encodeObject:cartArray forKey:@"cartArrayKeys"];
}

+(void)saveData{
    [NSKeyedArchiver archiveRootObject:cartArray toFile:[self getPathToAchieve]];
}

+(id)getData{
    return [NSKeyedUnarchiver unarchiveObjectWithFile:[self getPathToAchieve]];
}
Your code is pretty messy. First, implement -(id)initWithCoder: and -(void)encodeWithCoder: in your Cart class, not in your AppDelegate class (and make sure Cart conforms to the NSCoding protocol):
- (void) encodeWithCoder:(NSCoder *)encoder {
    [encoder encodeObject:self.title forKey:@"title"];
    [encoder encodeObject:self.description forKey:@"description"];
    .....
}

- (id)initWithCoder:(NSCoder *)decoder {
    if (self = [super init]) {
        self.title = [decoder decodeObjectForKey:@"title"];
        self.description = [decoder decodeObjectForKey:@"description"];
        ....
    }
    return self;
}
Second, implement -(void)saveData and -(void)getData:
-(void)saveData{
    [[NSUserDefaults standardUserDefaults] setObject:[NSKeyedArchiver archivedDataWithRootObject:cartArray]
                                              forKey:@"cartArray"];
}

-(void)getData{
    NSData *savedArray = [[NSUserDefaults standardUserDefaults] objectForKey:@"cartArray"];
    if (savedArray != nil)
    {
        NSArray *oldArray = [NSKeyedUnarchiver unarchiveObjectWithData:savedArray];
        if (oldArray != nil) {
            cartArray = [[NSMutableArray alloc] initWithArray:oldArray];
        } else {
            cartArray = [[NSMutableArray alloc] init];
        }
    }
}
Call saveData when the application is about to be terminated or enters the background.
Call getData when the application has finished loading.
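For completeness, here is a minimal sketch (mine, assuming the cartArray, saveData, and getData names above) of wiring those calls into the app delegate's lifecycle callbacks:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    [self getData];   // restore cartArray from NSUserDefaults
    return YES;
}

- (void)applicationDidEnterBackground:(UIApplication *)application {
    [self saveData];  // persist cartArray before the app may be killed
}

- (void)applicationWillTerminate:(UIApplication *)application {
    [self saveData];
}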
Do all your saving in:

- (void)applicationWillResignActive:(UIApplication *)application

From the documentation:
Tells the delegate that the application is about to become inactive.
This method is called to let your application know that it is about to
move from the active to inactive state. This can occur for certain
types of temporary interruptions (such as an incoming phone call or
SMS message) or when the user quits the application and it begins the
transition to the background state. An application in the inactive
state continues to run but does not dispatch incoming events to
responders. You should use this method to pause ongoing tasks, disable
timers, and throttle down OpenGL ES frame rates. Games should use this
method to pause the game. An application in the inactive state should
do minimal work while it waits to transition to either the active or
background state.
An array is temporary storage; if you want to save data permanently, use NSUserDefaults. After the app closes, an array's contents may vanish, but NSUserDefaults data does not. NSUserDefaults data vanishes only when the app is deleted (or removed from the simulator).

Zeroing ALAssetRepresentation after saving image captured via UIImagePickerController in iOS5

In my app (which worked under iOS 4) I collect pictures selected via UIImagePickerController. Unfortunately, I have a strange problem after upgrading to iOS 5.
In a nutshell, I store ALAssetRepresentation objects in an NSMutableArray. When I add photos from the library, everything is OK. However, when I capture and save a picture, all the ALAssetRepresentations (including the new one) become 0-sized: ALAssetRepresentation.size and getBytes:fromOffset:length:error: return 0, and the error passed back by getBytes: is nil.
I init the ALAssetsLibrary in my AppDelegate, so the condition
"The lifetimes of objects you get back from a library instance are tied to the lifetime of the library instance."
is satisfied.
Is there a way to prevent the ALAssetRepresentations from zeroing? Or how can I read the image's bytes after this happens?
My code:
-(void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info{
    if ([picker sourceType] == UIImagePickerControllerSourceTypePhotoLibrary){
        [self addPhoto:[info valueForKey:UIImagePickerControllerReferenceURL]];
    }
    else if ([picker sourceType] == UIImagePickerControllerSourceTypeCamera){
        [self savePhoto:[info valueForKey:UIImagePickerControllerOriginalImage]];
    }
    [self dismissModalViewControllerAnimated:YES];
}

-(ALAssetsLibrary*) getLibrary{
    if (!library){
        testAppDelegate *appDelegate = (testAppDelegate *)[[UIApplication sharedApplication] delegate];
        library = appDelegate.library;
    }
    NSLog(@"getLibrary: %@", library);
    return library;
}

-(void) addPhoto:(NSURL*) url{
    ALAssetsLibraryAssetForURLResultBlock successBlock = ^(ALAsset *asset_){
        ALAssetRepresentation *assetRepresentation = [[asset_ defaultRepresentation] retain];
        [photos addObject:assetRepresentation];
    };
    ALAssetsLibraryAccessFailureBlock failureBlock = ^(NSError *error){
        NSLog(@"Error: Cannot get image. %@", [error localizedDescription]);
    };
    [[self getLibrary] assetForURL:url resultBlock:successBlock failureBlock:failureBlock];
}

- (void)savePhoto:(UIImage *)image {
    [[self getLibrary] writeImageToSavedPhotosAlbum:[image CGImage]
                                        orientation:(ALAssetOrientation)[image imageOrientation]
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Error: Cannot save image. %@", [error localizedDescription]);
        } else {
            NSLog(@"photo saved");
            [self addPhoto:assetURL];
        }
    }];
}
I solved it! From the documentation for ALAssetsLibraryChangedNotification:
Sent when the contents of the assets library have changed from under the app that is using the data.
When you receive this notification, you should discard any cached information and query the assets library again. You should consider invalid any ALAsset, ALAssetsGroup, or ALAssetRepresentation objects you are referencing after finishing processing the notification.
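A minimal sketch (mine, not from the original answer) of acting on that notification, assuming the library instance from the app delegate and the photos cache array from the question:

// Register once, e.g. in viewDidLoad; remove the observer in dealloc.
[[NSNotificationCenter defaultCenter]
    addObserverForName:ALAssetsLibraryChangedNotification
                object:library
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                // Cached ALAssetRepresentations are now invalid.
                [photos removeAllObjects];
                // ...re-query the library (assetForURL:resultBlock:failureBlock:)
                // for the assets you still need...
            }];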
Have you tried retaining the ALAssets instead of the ALAssetRepresentation? This should work; I have used this approach before.

iPhone: unable to show photos using ALAssetsLibrary

I'm currently sending an .ipa to friends for testing. The funny thing is, one of my testers was able to view the photos stored on her phone, an iPhone 4 running iOS 5.
The other two testers, one with an iPhone 4 (iOS 4.3.3) and one with an iPhone 3GS (iOS 5.0.1), can't see the photos stored on their phones.
This is the code I have used:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

void (^assetEnumerator)(ALAsset *, NSUInteger, BOOL *) = ^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if(result != NULL) {
        //NSLog(@"See Asset: %@", @"ggg");
        [assets addObject:result];
    }
};

NSLog(@"location = %i length = %i ", range->location, range->length);

void (^assetGroupEnumerator)(ALAssetsGroup *, BOOL *) = ^(ALAssetsGroup *group, BOOL *stop) {
    if(group != nil) {
        NSRange *datarange = malloc(sizeof(NSRange));
        range->total = [group numberOfAssets];
        datarange->location = [group numberOfAssets] - range->location - range->length;
        datarange->length = range->length;
        NSLog(@" total = %i", range->total);
        int location = [group numberOfAssets] - range->location - range->length;
        if (location < 0)
        {
            datarange->location = 0;
            datarange->length = [group numberOfAssets] - range->location;
        }
        NSIndexSet *indexset = [[NSIndexSet alloc] initWithIndexesInRange:*datarange];
        [group enumerateAssetsAtIndexes:indexset options:0
                             usingBlock:assetEnumerator];
        [indexset release];
        free(datarange);
        [self loadAssetToScrollView:assets];
    }
};

[assets release];
assets = [[NSMutableArray alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:assetGroupEnumerator
                     failureBlock:^(NSError *error) {
                         NSLog(@"Failure");
                     }];
[library release];
I saw somebody mention asynchronicity in some other threads, but I don't know whether that is the case here. He said to put dispatch_async in the enumerate-groups block.
Does anyone know what is wrong?
Additionally, one of the testers with iOS 4.3.3 can see his photos after enabling Location Services under Settings -> General. Why do we have to enable it? Can we enable it in code? Having to do it manually will be quite disturbing to users of our application.
Also, on iOS 5.x you must retain your ALAssetsLibrary instance for as long as you need to work with the collected assets. If you release your ALAssetsLibrary instance right after calling [library enumerateGroupsWithTypes:…], as in your code, all the collected assets become invalid.
See also the ALAssetsLibrary doc - overview:
"… The lifetimes of objects you get back from a library instance are tied to the lifetime of the library instance. …"
Yes, it is incredibly frustrating, but that is how it is, and you cannot enable location services in code (that is a good thing though).
I would move the first block, ^assetGroupEnumerator, to the heap by sending it [[<#block#> copy] autorelease]. Why? Because this block would otherwise be autoreleased by the run loop if there are many assets to enumerate through.
One more thing: don't use [self loadAssetToScrollView:assets]; inside the block. Instead, take a weak reference to self before the block, like this:
__block YourExampleClassInstance *weakSelf = self;
and further use this weakSelf instance inside the block:
[weakSelf loadAssetToScrollView:assets];
void (^assetGroupEnumerator)… = ^(ALAssetsGroup *group, BOOL *stop) {
…
};
Why? To avoid retain cycles.
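Putting both suggestions together, a hedged sketch (MRC-era, mine rather than the answerer's exact code) might look like this:

__block YourExampleClassInstance *weakSelf = self;
// Copy the block to the heap so it survives the autorelease pool drains
// that happen while many assets are being enumerated.
void (^assetGroupEnumerator)(ALAssetsGroup *, BOOL *) =
    [[^(ALAssetsGroup *group, BOOL *stop) {
        if (group != nil) {
            // ...enumerate assets as before...
            [weakSelf loadAssetToScrollView:assets];
        }
    } copy] autorelease];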

Static variable for communication among like-typed objects

I have a method that asynchronously downloads images. If the images are related to an array of objects (a common use case in the app I'm building), I want to cache them. The idea is, I pass in an index number (based on the indexPath.row of the table I'm making my way through), and I stash the image in a static NSMutableArray, keyed on the row of the table I'm dealing with.
Thusly:
@implementation ImageDownloader
...
@synthesize cacheIndex;

static NSMutableArray *imageCache;

-(void)startDownloadWithImageView:(UIImageView *)imageView andImageURL:(NSURL *)url withCacheIndex:(NSInteger)index
{
    self.theImageView = imageView;
    self.cacheIndex = index;
    NSLog(@"Called to download %@ for imageview %@", url, self.theImageView);
    if ([imageCache objectAtIndex:index]) {
        NSLog(@"We have this image cached--using that instead");
        self.theImageView.image = [imageCache objectAtIndex:index];
        return;
    }
    self.activeDownload = [NSMutableData data];
    NSURLConnection *conn = [[NSURLConnection alloc]
                             initWithRequest:[NSURLRequest requestWithURL:url] delegate:self];
    self.imageConnection = conn;
    [conn release];
}
//build up the incoming data in self.activeDownload with calls to didReceiveData...

- (void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    NSLog(@"Finished downloading.");
    UIImage *image = [[UIImage alloc] initWithData:self.activeDownload];
    self.theImageView.image = image;
    NSLog(@"Caching %@ for %d", self.theImageView.image, self.cacheIndex);
    [imageCache insertObject:image atIndex:self.cacheIndex];
    NSLog(@"Cache now has %d items", [imageCache count]);
    [image release];
}
My index is getting through okay; I can see that from my NSLog output. But even after my insertObject:atIndex: call, [imageCache count] never leaves zero.
This is my first foray into static variables, so I presume I'm doing something wrong.
(The above code is heavily pruned to show only the main thing of what's going on, so bear that in mind as you look at it.)
You seem to never initialize imageCache, and you probably got lucky with it having the value 0 (nil). The initialization is best done in the class's +initialize method, e.g.:
@implementation ImageDownloader
// ...

+(void)initialize {
    if (self == [ImageDownloader class]) { // +initialize can also fire for subclasses
        imageCache = [[NSMutableArray alloc] init];
    }
}
// ...
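Note also that insertObject:atIndex: raises an exception when the index is greater than the array's current count, which can happen when downloads finish out of order. An alternative sketch (my suggestion, not part of the answer above) is to key the cache by row index with a dictionary instead:

static NSMutableDictionary *imageCache;

+(void)initialize {
    if (self == [ImageDownloader class]) {
        imageCache = [[NSMutableDictionary alloc] init];
    }
}

// store (works for any index, in any completion order):
[imageCache setObject:image forKey:[NSNumber numberWithInteger:self.cacheIndex]];
// look up:
UIImage *cached = [imageCache objectForKey:[NSNumber numberWithInteger:index]];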

UIImagePickerController and extracting EXIF data from existing photos

It's well known that UIImagePickerController doesn't return the metadata of the photo after selection. However, a couple of apps in the app store (Mobile Fotos, PixelPipe) seem to be able to read the original files and the EXIF data stored within them, enabling the app to extract the geodata from the selected photo.
They seem to do this by reading the original file from the /private/var/mobile/Media/DCIM/100APPLE/ folder and running it through an EXIF library.
However, I can't work out a way of matching a photo returned from the UIImagePickerController to a file on disk. I've explored file sizes, but the original file is a JPEG, whilst the returned image is a raw UIImage, making it impossible to know the file size of the image that was selected.
I'm considering making a table of hashes and matching against the first x pixels of each image. This seems a bit over the top though, and probably quite slow.
Any suggestions?
Have you taken a look at this EXIF iPhone library?
http://code.google.com/p/iphone-exif/
I'm going to try it on my side. I'd like to get the GPS (geotag) coordinates from the picture that has been taken with the UIImagePickerController :/
After a deeper look, this library seems to take NSData as input, and the UIImagePickerController returns a UIImage after taking a snapshot. In theory, if we use the following function from UIKit's additions to UIImage:

NSData * UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality);

then we can convert the UIImage into an NSData instance and then use it with the iPhone EXIF library.
UPDATE:
I gave the library mentioned above a try, and it seems to work. However, because of my limited knowledge of the EXIF format and the lack of a high-level API in the library, I can't manage to get the values of the EXIF tags.
Here's my code, in case any of you can go further:
#import "EXFJpeg.h"
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo {
NSLog(#"image picked %# with info %#", image, editingInfo);
NSData* jpegData = UIImageJPEGRepresentation (image,0.5);
EXFJpeg* jpegScanner = [[EXFJpeg alloc] init];
[jpegScanner scanImageData: jpegData];
EXFMetaData* exifData = jpegScanner.exifMetaData;
EXFJFIF* jfif = jpegScanner.jfif;
EXFTag* tagDefinition = [exifData tagDefinition: [NSNumber numberWithInt:EXIF_DateTime]];
//EXFTag* latitudeDef = [exifData tagDefinition: [NSNumber numberWithInt:EXIF_GPSLatitude]];
//EXFTag* longitudeDef = [exifData tagDefinition: [NSNumber numberWithInt:EXIF_GPSLongitude]];
id latitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLatitude]];
id longitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLongitude]];
id datetime = [exifData tagValue:[NSNumber numberWithInt:EXIF_DateTime]];
id t = [exifData tagValue:[NSNumber numberWithInt:EXIF_Model]];
....
....
Retrieving the tag definitions is OK, but all the tag values return nil :(
In case you want to give the library a try, you need to define a global variable to get it running (as explained in the doc, but hum... :/):

BOOL gLogging = FALSE;
UPDATE 2
Answered here: iPhone - access location information from a photo
A UIImage does not encapsulate the meta information, so we're stuck: for sure, no EXIF info will be given through this interface.
FINAL UPDATE
OK, I managed to get it working, at least for properly geotagging pictures returned by the picker.
Before triggering the UIImagePickerController, it's up to you to use CLLocationManager to retrieve the current CLLocation.
Once you have it, you can use this method, which uses the exif-iPhone library, to geotag the UIImage from the CLLocation:
-(NSData*) geotagImage:(UIImage*)image withLocation:(CLLocation*)imageLocation {
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.8);
    EXFJpeg *jpegScanner = [[EXFJpeg alloc] init];
    [jpegScanner scanImageData:jpegData];
    EXFMetaData *exifMetaData = jpegScanner.exifMetaData;

    // adding GPS data to the Exif object
    NSMutableArray *locArray = [self createLocArray:imageLocation.coordinate.latitude];
    EXFGPSLoc *gpsLoc = [[EXFGPSLoc alloc] init];
    [self populateGPS:gpsLoc :locArray];
    [exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLatitude]];
    [gpsLoc release];
    [locArray release];

    locArray = [self createLocArray:imageLocation.coordinate.longitude];
    gpsLoc = [[EXFGPSLoc alloc] init];
    [self populateGPS:gpsLoc :locArray];
    [exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLongitude]];
    [gpsLoc release];
    [locArray release];

    NSString *ref;
    if (imageLocation.coordinate.latitude < 0.0)
        ref = @"S";
    else
        ref = @"N";
    [exifMetaData addTagValue:ref forKey:[NSNumber numberWithInt:EXIF_GPSLatitudeRef]];

    if (imageLocation.coordinate.longitude < 0.0)
        ref = @"W";
    else
        ref = @"E";
    [exifMetaData addTagValue:ref forKey:[NSNumber numberWithInt:EXIF_GPSLongitudeRef]];

    NSMutableData *taggedJpegData = [[NSMutableData alloc] init];
    [jpegScanner populateImageData:taggedJpegData];
    [jpegScanner release];
    return [taggedJpegData autorelease];
}
// Helper methods for location conversion
-(NSMutableArray*) createLocArray:(double)val {
    val = fabs(val);
    NSMutableArray *array = [[NSMutableArray alloc] init];
    double deg = (int)val;
    [array addObject:[NSNumber numberWithDouble:deg]];
    val = val - deg;
    val = val * 60;
    double minutes = (int)val;
    [array addObject:[NSNumber numberWithDouble:minutes]];
    val = val - minutes;
    val = val * 60;
    double seconds = val;
    [array addObject:[NSNumber numberWithDouble:seconds]];
    return array;
}

-(void) populateGPS:(EXFGPSLoc*)gpsLoc :(NSArray*)locArray {
    long numDenumArray[2];
    long *arrPtr = numDenumArray;
    [EXFUtils convertRationalToFraction:&arrPtr :[locArray objectAtIndex:0]];
    EXFraction *fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
    gpsLoc.degrees = fract;
    [fract release];
    [EXFUtils convertRationalToFraction:&arrPtr :[locArray objectAtIndex:1]];
    fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
    gpsLoc.minutes = fract;
    [fract release];
    [EXFUtils convertRationalToFraction:&arrPtr :[locArray objectAtIndex:2]];
    fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
    gpsLoc.seconds = fract;
    [fract release];
}
This works with iOS 5 (beta 4) and the camera roll (you need typedefs for the blocks in the .h):
-(void) imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString*)kUTTypeImage]) {
        NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
        if (url) {
            ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset) {
                CLLocation *location = [myasset valueForProperty:ALAssetPropertyLocation];
                // location contains lat/long, timestamp, etc
                // extracting the image is more tricky and 5.x beta ALAssetRepresentation has bugs!
            };
            ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror) {
                NSLog(@"cant get image - %@", [myerror localizedDescription]);
            };
            ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
            [assetsLib assetForURL:url resultBlock:resultblock failureBlock:failureblock];
        }
    }
}
There is a way in iOS 8
Without using any 3rd-party EXIF library.

#import <Photos/Photos.h>

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
    PHFetchResult *fetchResult = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
    PHAsset *asset = fetchResult.firstObject;
    //All you need is
    //asset.location.coordinate.latitude
    //asset.location.coordinate.longitude

    //Other useful properties of PHAsset
    //asset.favorite
    //asset.modificationDate
    //asset.creationDate
}
Apple has added an Image I/O Framework in iOS4 which can be used to read EXIF data from pictures. I don't know if the UIImagePickerController returns a picture with the EXIF data embedded though.
Edit: In iOS4 you can fetch the EXIF data by grabbing the value of the UIImagePickerControllerMediaMetadata key in the info dictionary which is passed to the UIImagePickerControllerDelegate delegate.
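For example, a minimal sketch (mine, not from the original answer) of reading that key for a fresh camera capture; the EXIF sub-dictionary key comes from the ImageIO framework:

#import <ImageIO/ImageIO.h>

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Present for fresh camera captures; typically nil for library picks.
    NSDictionary *metadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
    NSDictionary *exif = [metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary];
    NSLog(@"EXIF: %@", exif);
}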
I had a similar question where I wanted just the date a picture was taken and none of the above appear to solve my problem in a simple way (e.g. no external libraries), so here is all of the data I could find which you can extract from an image after selecting it with the picker:
// Inside whatever implements UIImagePickerControllerDelegate
@import AssetsLibrary;

// ... your other code here ...

@implementation MYImagePickerDelegate

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = info[UIImagePickerControllerMediaType];
    UIImage *originalImage = info[UIImagePickerControllerOriginalImage];
    UIImage *editedImage = info[UIImagePickerControllerEditedImage];
    NSValue *cropRect = info[UIImagePickerControllerCropRect];
    NSURL *mediaUrl = info[UIImagePickerControllerMediaURL];
    NSURL *referenceUrl = info[UIImagePickerControllerReferenceURL];
    NSDictionary *mediaMetadata = info[UIImagePickerControllerMediaMetadata];

    NSLog(@"mediaType=%@", mediaType);
    NSLog(@"originalImage=%@", originalImage);
    NSLog(@"editedImage=%@", editedImage);
    NSLog(@"cropRect=%@", cropRect);
    NSLog(@"mediaUrl=%@", mediaUrl);
    NSLog(@"referenceUrl=%@", referenceUrl);
    NSLog(@"mediaMetadata=%@", mediaMetadata);

    if (!referenceUrl) {
        NSLog(@"Media did not have reference URL.");
    } else {
        ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
        [assetsLib assetForURL:referenceUrl
                   resultBlock:^(ALAsset *asset) {
                       NSString *type = [asset valueForProperty:ALAssetPropertyType];
                       CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
                       NSNumber *duration = [asset valueForProperty:ALAssetPropertyDuration];
                       NSNumber *orientation = [asset valueForProperty:ALAssetPropertyOrientation];
                       NSDate *date = [asset valueForProperty:ALAssetPropertyDate];
                       NSArray *representations = [asset valueForProperty:ALAssetPropertyRepresentations];
                       NSDictionary *urls = [asset valueForProperty:ALAssetPropertyURLs];
                       NSURL *assetUrl = [asset valueForProperty:ALAssetPropertyAssetURL];

                       NSLog(@"type=%@", type);
                       NSLog(@"location=%@", location);
                       NSLog(@"duration=%@", duration);
                       NSLog(@"assetUrl=%@", assetUrl);
                       NSLog(@"orientation=%@", orientation);
                       NSLog(@"date=%@", date);
                       NSLog(@"representations=%@", representations);
                       NSLog(@"urls=%@", urls);
                   }
                  failureBlock:^(NSError *error) {
                      NSLog(@"Failed to get asset: %@", error);
                  }];
    }
    [picker dismissViewControllerAnimated:YES
                               completion:nil];
}
@end
So when you select an image, you get output that looks like this (including date!):
mediaType=public.image
originalImage=<UIImage: 0x7fb38e00e870> size {1280, 850} orientation 0 scale 1.000000
editedImage=<UIImage: 0x7fb38e09e1e0> size {640, 424} orientation 0 scale 1.000000
cropRect=NSRect: {{0, 0}, {1280, 848}}
mediaUrl=(null)
referenceUrl=assets-library://asset/asset.JPG?id=AC072879-DA36-4A56-8A04-4D467C878877&ext=JPG
mediaMetadata=(null)
type=ALAssetTypePhoto
location=(null)
duration=ALErrorInvalidProperty
assetUrl=assets-library://asset/asset.JPG?id=AC072879-DA36-4A56-8A04-4D467C878877&ext=JPG
orientation=0
date=2014-07-14 04:28:18 +0000
representations=(
"public.jpeg"
)
urls={
"public.jpeg" = "assets-library://asset/asset.JPG?id=AC072879-DA36-4A56-8A04-4D467C878877&ext=JPG";
}
Anyway, hopefully that saves someone else some time.
I spent a while working on this as well for an application I was contracted to build. Basically, as the API currently stands, it is not possible. The basic problem is that the UIImage class STRIPS all EXIF data except the orientation. Also, the function to save to the camera roll strips this data out. So, basically, the only way to grab and maintain any extra EXIF data is to save it in a private "camera roll" in your application. I have filed this bug with Apple as well and emphasized the need to the app-reviewer reps we've been in contact with. Hopefully someday they'll add it in. Otherwise it makes GEO tagging completely useless, as it only works in the "stock" camera application.
NOTE: Some applications on the App Store hack around this by, from what I have found, directly accessing the camera roll and SAVING photos straight to it to preserve GEO data. However, this only works with the camera roll/saved photos and NOT the rest of the photo library. Photos "synced" to your phone from your computer have all EXIF data except orientation stripped.
I still can't understand why those applications were approved (heck, they even DELETE from the camera roll) while our application, which does none of that, is still being held back.
For iOS 8 and later, you can use the Photos framework:

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    let url = info[UIImagePickerControllerReferenceURL] as? URL
    if url != nil {
        let fetchResult = PHAsset.fetchAssets(withALAssetURLs: [url!], options: nil)
        let asset = fetchResult.firstObject
        print(asset?.location?.coordinate.latitude)
        print(asset?.creationDate)
    }
}
This is something that the public API does not provide, but could be useful to many people. Your primary recourse is to file a bug with Apple that describes what you need (and it can be helpful to explain why you need it as well). Hopefully your request could make it into a future release.
After filing a bug, you could also use one of the Developer Technical Support (DTS) incidents that came with your iPhone Developer Program membership. If there is a public way to do this, an Apple engineer will know. Otherwise, it may at least help get your plight a bit more attention within the mothership. Best of luck!
Use the UIImagePickerControllerMediaURL dictionary key to get the file URL to the original file. Despite what the documentation says, you can get the file URL for photos and not only movies.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Try to get the original file.
    NSURL *originalFile = [info objectForKey:UIImagePickerControllerMediaURL];
    if (originalFile) {
        NSData *fileData = [NSData dataWithContentsOfURL:originalFile];
    }
}
You might be able to hash the image data returned by the UIImagePickerController and each of the images in the directory and compare them.
Just a thought, but have you tried TTPhotoViewController in the Three20 project on GitHub?
That provides an image picker that can read from multiple sources. You may be able to use it as an alternative to UIImagePickerController, or the source might give you a clue how to work out how to get the info you need.
Is there a specific reason you want to extract the location data from the image? An alternative could be to get the location separately using the CoreLocation framework. If it's only the geodata you need, this might save you some headaches.
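If that route fits your case, here is a minimal sketch using standard CoreLocation API (the locationManager property is illustrative; the delegate callback shown is the pre-iOS 6 one, matching the era of this thread):

#import <CoreLocation/CoreLocation.h>

- (void)startLocationCapture {
    self.locationManager = [[[CLLocationManager alloc] init] autorelease];
    self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    self.locationManager.delegate = self;  // adopt CLLocationManagerDelegate
    [self.locationManager startUpdatingLocation];
}

- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation {
    // Use this as the location of the photo being taken/picked.
    NSLog(@"current location: %@", newLocation);
    [manager stopUpdatingLocation];
}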
It seems that photos obtained via UIImagePickerControllerMediaURL don't have EXIF tags at all.
In order to get this metadata you'll have to use the lower-level framework AVFoundation.
Take a look at Apple's SquareCam example (http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html)
Find the method below and add the line I've marked. The metadata dictionary returned also contains a diagnostics NSDictionary object.

- (BOOL)writeCGImageToCameraRoll:(CGImageRef)cgImage withMetadata:(NSDictionary *)metadata
{
    NSDictionary *Exif = [metadata objectForKey:@"Exif"]; // Add this line
    // ... rest of the original SquareCam implementation ...
}
I'm using this for camera roll images:

-(CLLocation*)locationFromAsset:(ALAsset*)asset
{
    if (!asset)
        return nil;

    NSDictionary *pickedImageMetadata = [[asset defaultRepresentation] metadata];
    NSDictionary *gpsInfo = [pickedImageMetadata objectForKey:(__bridge NSString *)kCGImagePropertyGPSDictionary];
    if (gpsInfo) {
        NSNumber *nLat = [gpsInfo objectForKey:(__bridge NSString *)kCGImagePropertyGPSLatitude];
        NSNumber *nLng = [gpsInfo objectForKey:(__bridge NSString *)kCGImagePropertyGPSLongitude];
        if (nLat && nLng)
            return [[CLLocation alloc] initWithLatitude:[nLat doubleValue] longitude:[nLng doubleValue]];
    }
    return nil;
}

-(void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    //UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    // create the assets library in the init method of your custom object or view controller:
    //self.library = [[ALAssetsLibrary alloc] init];
    [self.library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        // try to retrieve gps metadata coordinates
        CLLocation *myLocation = [self locationFromAsset:asset];
        // Do your stuff....
    } failureBlock:^(NSError *error) {
        NSLog(@"Failed to get asset from library");
    }];
}

Obviously, this works only if the image contains GPS meta information.
Hope it helps!
This is in Swift 3, if you still want to support iOS 8:

import AssetsLibrary

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    if picker.sourceType == UIImagePickerControllerSourceType.photoLibrary,
        let url = info[UIImagePickerControllerReferenceURL] as? URL {
        let assetLibrary = ALAssetsLibrary()
        assetLibrary.asset(for: url, resultBlock: { (asset) in
            if let asset = asset {
                let assetRep: ALAssetRepresentation = asset.defaultRepresentation()
                let metaData: NSDictionary = assetRep.metadata() as NSDictionary
                print(metaData)
            }
        }, failureBlock: { (error) in
            print(error!)
        })
    }
}
For iOS 10 - Swift 3
The picker's callback has an info dict where there is a key with metadata: UIImagePickerControllerMediaMetadata
The naughty way to do this is to traverse the UIImagePickerViewController's views and pick out the selected image in the delegate callback.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    id thumbnailView = [[[[[[[[[[picker.view subviews]
                                 objectAtIndex:0] subviews]
                                 objectAtIndex:0] subviews]
                                 objectAtIndex:0] subviews]
                                 objectAtIndex:0] subviews]
                                 objectAtIndex:0];
    NSString *fullSizePath = [[[thumbnailView selectedPhoto] fileGroup] pathForFullSizeImage];
    NSString *thumbnailPath = [[[thumbnailView selectedPhoto] fileGroup] pathForThumbnailFile];
    NSLog(@"%@ and %@", fullSizePath, thumbnailPath);
}
That will give you the path to the full size image, which you can then open with an EXIF library of your choice.
But, this calls a Private API and these method names will be detected by Apple if you submit this app. So don't do this, OK?