Writing metadata to ALAsset - iPhone

I am developing a video app for iPhone. I am recording a video and saving it to iPhone Camera Roll using AssetsLibrary framework. The API that I have used is:
- (void)writeVideoAtPathToSavedPhotosAlbum:(NSURL *)videoPathURL
completionBlock:(ALAssetsLibraryWriteVideoCompletionBlock)completionBlock
Is there any way to save custom metadata for the video to the Camera Roll using ALAsset? If this is not possible using the AssetsLibrary framework, can it be done some other way? Basically, I am interested in writing details about my app as part of the video metadata.

Since iOS 4+ there is the AVFoundation framework, which also lets you read/write metadata from/to video files. Only specific keys can be used to add metadata this way, but I don't believe that would be a problem.
Here's a small example that you can use to add a title to your videos (however, in this example all older metadata is removed):
// prepare metadata (add title "title")
NSMutableArray *metadata = [NSMutableArray array];
AVMutableMetadataItem *mi = [AVMutableMetadataItem metadataItem];
mi.key = AVMetadataCommonKeyTitle;
mi.keySpace = AVMetadataKeySpaceCommon;
mi.value = @"title";
[metadata addObject:mi];

// prepare video asset (SOME_URL can be an ALAsset URL)
AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:SOME_URL options:nil];

// prepare to export, without transcoding if possible
AVAssetExportSession *_videoExportSession = [[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:AVAssetExportPresetPassthrough];
[videoAsset release];

_videoExportSession.outputURL = [NSURL fileURLWithPath:_outputPath];
_videoExportSession.outputFileType = AVFileTypeQuickTimeMovie;
_videoExportSession.metadata = metadata;

[_videoExportSession exportAsynchronouslyWithCompletionHandler:^{
    switch ([_videoExportSession status]) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", [[_videoExportSession error] localizedDescription]);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export canceled");
            break;
        default:
            break;
    }
    [_videoExportSession release]; _videoExportSession = nil;
    [self finishExport]; // in finishExport you can, for example, call writeVideoAtPathToSavedPhotosAlbum:completionBlock: to save the video from _videoExportSession.outputURL
}];
This also shows some examples: avmetadataeditor

There is no officially supported way to do this.
What you can do: store the info you want to save in a separate database. The downside, however, is that such information is then only available in your app.
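For illustration, a minimal sketch of that sidecar approach, assuming you are inside the writeVideoAtPathToSavedPhotosAlbum:completionBlock: completion block (where assetURL identifies the saved video; the plist name and keys here are made up):

// Map the saved asset's URL to your own app-specific metadata.
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [docs stringByAppendingPathComponent:@"VideoMetadata.plist"];
NSMutableDictionary *store = [NSMutableDictionary dictionaryWithContentsOfFile:path];
if (!store) store = [NSMutableDictionary dictionary];
[store setObject:[NSDictionary dictionaryWithObjectsAndKeys:@"MyApp", @"recordedBy", nil]
          forKey:[assetURL absoluteString]];
[store writeToFile:path atomically:YES];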
What exactly are you trying to accomplish?

You can also set the metadata on the videoWriter, with something like:
NSMutableArray *metadata = [NSMutableArray array];
AVMutableMetadataItem *mi = [AVMutableMetadataItem metadataItem];
mi.key = AVMetadataCommonKeyTitle;
mi.keySpace = AVMetadataKeySpaceCommon;
mi.value = @"title";
[metadata addObject:mi];
videoWriter.metadata = metadata;
where videoWriter is of type AVAssetWriter
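For context, a minimal sketch of how such a writer might be created (outputURL is an assumed file URL; error handling is elided):

NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                       fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];
videoWriter.metadata = metadata; // must be set before writing begins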
and then, when you stop recording, you call:
[videoWriter endSessionAtSourceTime:CMTimeMake(durationInMs, 1000)];
[videoWriter finishWritingWithCompletionHandler:^() {
    ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
    [assetsLib writeVideoAtPathToSavedPhotosAlbum:videoUrl
                                  completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error != nil) {
            NSLog(@"Video not saved");
        }
    }];
}];

Related

How to create a folder on Google Drive using the Google Drive SDK for iPhone?

I am using the Google Drive SDK for iPhone and trying to upload an audio file to a "TestAudio" folder. If the "TestAudio" folder does not exist on Google Drive, it should first be created, and the audio should then be stored in that folder only. Everything is working great except folder creation. Can anybody please help?
I am using the code below to upload the audio file.
GTLUploadParameters *uploadParameters = nil;
NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"honey_bunny_new"
                                                          ofType:@"mp3"];
if (soundFilePath) {
    NSData *fileContent = [[NSData alloc] initWithContentsOfFile:soundFilePath];
    uploadParameters = [GTLUploadParameters uploadParametersWithData:fileContent MIMEType:@"audio/mpeg"];
}
self.driveFile.title = self.updatedTitle;

GTLQueryDrive *query = nil;
if (self.driveFile.identifier == nil || self.driveFile.identifier.length == 0) {
    // This is a new file, instantiate an insert query.
    query = [GTLQueryDrive queryForFilesInsertWithObject:self.driveFile
                                        uploadParameters:uploadParameters];
} else {
    // This file already exists, instantiate an update query.
    query = [GTLQueryDrive queryForFilesUpdateWithObject:self.driveFile
                                                  fileId:self.driveFile.identifier
                                        uploadParameters:uploadParameters];
}

UIAlertView *alert = [DrEditUtilities showLoadingMessageWithTitle:@"Saving file"
                                                         delegate:self];
[self.driveService executeQuery:query completionHandler:^(GTLServiceTicket *ticket,
                                                          GTLDriveFile *updatedFile,
                                                          NSError *error) {
    [alert dismissWithClickedButtonIndex:0 animated:YES];
    if (error == nil) {
        self.driveFile = updatedFile;
        self.originalContent = [self.textView.text copy];
        self.updatedTitle = [updatedFile.title copy];
        [self toggleSaveButton];
        [self.delegate didUpdateFileWithIndex:self.fileIndex
                                    driveFile:self.driveFile];
        [self doneEditing:nil];
    } else {
        NSLog(@"An error occurred: %@", error);
        [DrEditUtilities showErrorMessageWithTitle:@"Unable to save file"
                                           message:error.description
                                          delegate:self];
    }
}];
I don't see your code to create a folder, but I was having the same problem with folder creation myself. As you know, the mimeType must be "application/vnd.google-apps.folder". I ran into assert failures if the NSData parameter to uploadParametersWithData: was nil, so I tried a zero-length NSData object, and that failed; using a 1-byte NSData object also failed. The trick is to create the folder with a files-insert query (queryForFilesInsertWithObject:) passing uploadParameters:nil; then the folder creation works fine.
I also discovered that the Objective-C code shown at the end of:
https://developers.google.com/drive/v2/reference/files/insert
is incorrect. The file.parents should be as follows:
GTLDriveParentReference *parentRef = [GTLDriveParentReference object];
parentRef.identifier = parentID;
if (parentID.length > 0) file.parents = [NSArray arrayWithObjects:parentRef, nil];
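For completeness, a folder-creation sketch along those lines (hedged: the names mirror the question's code; the essential detail is the nil upload parameters):

// Create the "TestAudio" folder; a folder is just a file with a special MIME type.
GTLDriveFile *folder = [GTLDriveFile object];
folder.title = @"TestAudio";
folder.mimeType = @"application/vnd.google-apps.folder";
// Pass nil upload parameters -- a folder has no content to upload.
GTLQueryDrive *query = [GTLQueryDrive queryForFilesInsertWithObject:folder
                                                   uploadParameters:nil];
[self.driveService executeQuery:query
              completionHandler:^(GTLServiceTicket *ticket,
                                  GTLDriveFile *createdFolder,
                                  NSError *error) {
    if (error == nil) {
        // createdFolder.identifier can now be used as the parent for the audio upload.
    }
}];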

Write UIImage along with metadata (EXIF, GPS, TIFF) in iPhone's Photo library

I am developing a project, where the requirements are:
- User will open the camera through the application
- Upon capturing an Image, some data will be appended to the captured image's metadata.
I have gone through some of the forums and tried to code this logic. I think I have nearly reached the point, but something is missing, as I am not able to see the metadata that I am appending to the image.
My code is:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)dictionary
{
    [picker dismissModalViewControllerAnimated:YES];
    NSData *dataOfImageFromGallery = UIImageJPEGRepresentation(image, 0.5);
    NSLog(@"Image length: %d", (int)[dataOfImageFromGallery length]);

    CGImageSourceRef source;
    source = CGImageSourceCreateWithData((CFDataRef)dataOfImageFromGallery, NULL);

    NSDictionary *metadata = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSMutableDictionary *metadataAsMutable = [[metadata mutableCopy] autorelease];
    [metadata release];

    NSMutableDictionary *EXIFDictionary = [[[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy] autorelease];
    NSMutableDictionary *GPSDictionary = [[[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyGPSDictionary] mutableCopy] autorelease];

    if (!EXIFDictionary)
    {
        // if the image does not have an EXIF dictionary (not all images do), create one to use
        EXIFDictionary = [NSMutableDictionary dictionary];
    }
    if (!GPSDictionary)
    {
        GPSDictionary = [NSMutableDictionary dictionary];
    }

    // Set up the GPS dict -
    // I am appending my custom data just to test the logic...
    [GPSDictionary setValue:[NSNumber numberWithFloat:1.1] forKey:(NSString *)kCGImagePropertyGPSLatitude];
    [GPSDictionary setValue:[NSNumber numberWithFloat:2.2] forKey:(NSString *)kCGImagePropertyGPSLongitude];
    [GPSDictionary setValue:@"lat_ref" forKey:(NSString *)kCGImagePropertyGPSLatitudeRef];
    [GPSDictionary setValue:@"lon_ref" forKey:(NSString *)kCGImagePropertyGPSLongitudeRef];
    [GPSDictionary setValue:[NSNumber numberWithFloat:3.3] forKey:(NSString *)kCGImagePropertyGPSAltitude];
    [GPSDictionary setValue:[NSNumber numberWithShort:4] forKey:(NSString *)kCGImagePropertyGPSAltitudeRef];
    [GPSDictionary setValue:[NSNumber numberWithFloat:5.5] forKey:(NSString *)kCGImagePropertyGPSImgDirection];
    [GPSDictionary setValue:@"_headingRef" forKey:(NSString *)kCGImagePropertyGPSImgDirectionRef];
    [EXIFDictionary setValue:@"xml_user_comment" forKey:(NSString *)kCGImagePropertyExifUserComment];

    // add our modified EXIF data back into the image's metadata
    [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];
    [metadataAsMutable setObject:GPSDictionary forKey:(NSString *)kCGImagePropertyGPSDictionary];

    CFStringRef UTI = CGImageSourceGetType(source);
    NSMutableData *dest_data = [NSMutableData data];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination)
    {
        NSLog(@"--------- Could not create image destination ---------");
    }
    CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)metadataAsMutable);

    BOOL success = NO;
    success = CGImageDestinationFinalize(destination);
    if (!success)
    {
        NSLog(@"-------- Could not create data from image destination ----------");
    }
    CFRelease(destination);
    CFRelease(source);

    UIImage *image1 = [[UIImage alloc] initWithData:dest_data];
    UIImageWriteToSavedPhotosAlbum(image1, self, nil, nil);
    [image1 release];
}
Kindly help me to do this and get something positive.
Look at the last line: am I saving the image with my metadata in it?
The image is getting saved at that point, but the metadata that I am appending to it is not getting saved.
Thanks in advance.
Apple has updated their article addressing this issue (Technical Q&A QA1622). If you're using an older version of Xcode, you may still have the article that says, more or less, tough luck, you can't do this without low-level parsing of the image data.
https://developer.apple.com/library/ios/#qa/qa1622/_index.html
I adapted the code there as follows:
- (void)saveImage:(UIImage *)imageToSave withInfo:(NSDictionary *)info
{
    // Get the assets library
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

    // Get the image metadata (EXIF & TIFF)
    NSMutableDictionary *imageMetadata = [[info objectForKey:UIImagePickerControllerMediaMetadata] mutableCopy];

    // add GPS data
    CLLocation *loc = <•••>; // need a location here
    if (loc) {
        [imageMetadata setObject:[self gpsDictionaryForLocation:loc] forKey:(NSString *)kCGImagePropertyGPSDictionary];
    }

    ALAssetsLibraryWriteImageCompletionBlock imageWriteCompletionBlock =
        ^(NSURL *newURL, NSError *error) {
            if (error) {
                NSLog(@"Error writing image with metadata to Photo Library: %@", error);
            } else {
                NSLog(@"Wrote image %@ with metadata %@ to Photo Library", newURL, imageMetadata);
            }
        };

    // Save the new image to the Camera Roll
    [library writeImageToSavedPhotosAlbum:[imageToSave CGImage]
                                 metadata:imageMetadata
                          completionBlock:imageWriteCompletionBlock];
    [imageMetadata release];
    [library release];
}
and I call this from
imagePickerController:didFinishPickingMediaWithInfo:
which is the delegate method for the image picker.
I use a helper method (adapted from GusUtils) to build a GPS metadata dictionary from a location:
- (NSDictionary *)gpsDictionaryForLocation:(CLLocation *)location
{
    CLLocationDegrees exifLatitude = location.coordinate.latitude;
    CLLocationDegrees exifLongitude = location.coordinate.longitude;

    NSString *latRef;
    NSString *longRef;
    if (exifLatitude < 0.0) {
        exifLatitude = exifLatitude * -1.0f;
        latRef = @"S";
    } else {
        latRef = @"N";
    }
    if (exifLongitude < 0.0) {
        exifLongitude = exifLongitude * -1.0f;
        longRef = @"W";
    } else {
        longRef = @"E";
    }

    NSMutableDictionary *locDict = [[NSMutableDictionary alloc] init];
    [locDict setObject:location.timestamp forKey:(NSString *)kCGImagePropertyGPSTimeStamp];
    [locDict setObject:latRef forKey:(NSString *)kCGImagePropertyGPSLatitudeRef];
    [locDict setObject:[NSNumber numberWithFloat:exifLatitude] forKey:(NSString *)kCGImagePropertyGPSLatitude];
    [locDict setObject:longRef forKey:(NSString *)kCGImagePropertyGPSLongitudeRef];
    [locDict setObject:[NSNumber numberWithFloat:exifLongitude] forKey:(NSString *)kCGImagePropertyGPSLongitude];
    [locDict setObject:[NSNumber numberWithFloat:location.horizontalAccuracy] forKey:(NSString *)kCGImagePropertyGPSDOP];
    [locDict setObject:[NSNumber numberWithFloat:location.altitude] forKey:(NSString *)kCGImagePropertyGPSAltitude];

    return [locDict autorelease];
}
So far this is working well for me on iOS4 and iOS5 devices.
Update: and iOS6/iOS7 devices. I built a simple project using this code:
https://github.com/5teev/MetaPhotoSave
The function UIImageWriteToSavedPhotosAlbum only writes the image data.
You need to read up on the ALAssetsLibrary.
The method you ultimately want to call is writeImageToSavedPhotosAlbum:metadata:completionBlock:, e.g.:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:cgImage metadata:metadata completionBlock:completionBlock];
For anyone who comes here trying to take a photo with the camera in your app and saving the image file to the camera roll with GPS metadata, I have a Swift solution that uses the Photos API since ALAssetsLibrary is deprecated as of iOS 9.0.
As mentioned by rickster on this answer, the Photos API does not embed location data directly into a JPG image file even if you set the .location property of the new asset.
Given a CMSampleBuffer buffer, some CLLocation location, and using Morty's suggestion to use CMSetAttachments in order to avoid duplicating the image, we can do the following. The gpsMetadata method extending CLLocation can be found here.
if let location = location {
    // Get the existing metadata dictionary (if there is one)
    var metaDict = CMCopyDictionaryOfAttachments(nil, buffer, kCMAttachmentMode_ShouldPropagate) as? Dictionary<String, Any> ?? [:]

    // Append the GPS metadata to the existing metadata
    metaDict[kCGImagePropertyGPSDictionary as String] = location.gpsMetadata()

    // Save the new metadata back to the buffer without duplicating any data
    CMSetAttachments(buffer, metaDict as CFDictionary, kCMAttachmentMode_ShouldPropagate)
}

// Get JPG image Data from the buffer
guard let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer) else {
    // There was a problem; handle it here, then bail out
    return
}

// Now save this image to the Camera Roll (will save with GPS metadata embedded in the file)
self.savePhoto(withData: imageData, completion: completion)
The savePhoto method is below. Note that the handy addResource(with:data:options:) method is available only in iOS 9. If you are supporting an earlier iOS and want to use the Photos API, then you must make a temporary file and create an asset from the file at that URL if you want the GPS metadata properly embedded (PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL:)). Only setting the PHAsset's .location will NOT embed your new metadata into the actual file itself.
func savePhoto(withData data: Data, completion: (() -> Void)? = nil) {
    // Note that using the Photos API .location property on a request does NOT embed GPS metadata into the image file itself
    PHPhotoLibrary.shared().performChanges({
        if #available(iOS 9.0, *) {
            // For iOS 9+ we can skip the temporary file step and write the image data from the buffer directly to an asset
            let request = PHAssetCreationRequest.forAsset()
            request.addResource(with: PHAssetResourceType.photo, data: data, options: nil)
            request.creationDate = Date()
        } else {
            // Fallback on earlier versions; write a temporary file and then add this file to the Camera Roll using the Photos API
            let tmpURL = URL(fileURLWithPath: NSTemporaryDirectory(), isDirectory: true).appendingPathComponent("tempPhoto").appendingPathExtension("jpg")
            do {
                try data.write(to: tmpURL)
                let request = PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL: tmpURL)
                request?.creationDate = Date()
            } catch {
                // Error writing the data; photo is not appended to the camera roll
            }
        }
    }, completionHandler: { _ in
        DispatchQueue.main.async {
            completion?()
        }
    })
}
Aside:
If you just want to save the image with GPS metadata to your temporary files or documents (as opposed to the camera roll / photo library), you can skip the Photos API and write the imageData directly to a URL.
// Write photo to temporary files with the GPS metadata embedded in the file
let tmpURL = URL(fileURLWithPath: NSTemporaryDirectory(), isDirectory: true).appendingPathComponent("tempPhoto").appendingPathExtension("jpg")
do {
    try data.write(to: tmpURL)
    // Do more work here...
} catch {
    // Error writing the data; handle it here
}
A piece of this involves generating the GPS metadata. Here's a category on CLLocation to do just that:
https://gist.github.com/phildow/6043486
Getting metadata from a camera-captured image within an application:
UIImage *pTakenImage = [info objectForKey:UIImagePickerControllerOriginalImage];
NSMutableDictionary *imageMetadata = [[NSMutableDictionary alloc] initWithDictionary:[info objectForKey:UIImagePickerControllerMediaMetadata]];
Now, to save the image to the library with the extracted metadata:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[sourceImage CGImage] metadata:imageMetadata completionBlock:nil];
[library release];
Or, to save to a local directory, use an ImageIO destination (destination must be a CGImageDestinationRef and source a CGImageSourceRef; the original snippet passed a file path and a UIImage here, which does not compile):
CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)imageMetadata);
CGImageDestinationFinalize(destination);
The problem we are trying to solve is: the user has just taken a picture with the UIImagePickerController camera. What we get is a UIImage. How do we fold metadata into that UIImage as we save it into the camera roll (photo library), now that we don't have the AssetsLibrary framework?
The answer (as far as I can make out) is: use the ImageIO framework. Extract the JPEG data from the UIImage, use it as a source and write it and the metadata dictionary into the destination, and save the destination data as a PHAsset into the camera roll.
In this example, im is the UIImage and meta is the metadata dictionary:
let jpeg = UIImageJPEGRepresentation(im, 1)!
let src = CGImageSourceCreateWithData(jpeg as CFData, nil)!
let data = NSMutableData()
let uti = CGImageSourceGetType(src)!
let dest = CGImageDestinationCreateWithData(data as CFMutableData, uti, 1, nil)!
CGImageDestinationAddImageFromSource(dest, src, 0, meta)
CGImageDestinationFinalize(dest)
let lib = PHPhotoLibrary.shared()
lib.performChanges({
    let req = PHAssetCreationRequest.forAsset()
    req.addResource(with: .photo, data: data as Data, options: nil)
})
A good way to test — and a common use case — is to receive the photo metadata from the UIImagePickerController delegate info dictionary thru the UIImagePickerControllerMediaMetadata key and fold it into the PHAsset as we save it into the photo library.
There are many frameworks that deal with images and metadata.
The Assets framework is deprecated and replaced by the Photos Library framework. If you implement AVCapturePhotoCaptureDelegate to capture photos, you can do this:
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    var metadata = photo.metadata
    metadata[kCGImagePropertyGPSDictionary as String] = gpsMetadata
    photoData = photo.fileDataRepresentation(withReplacementMetadata: metadata,
                                             replacementEmbeddedThumbnailPhotoFormat: photo.embeddedThumbnailPhotoFormat,
                                             replacementEmbeddedThumbnailPixelBuffer: nil,
                                             replacementDepthData: photo.depthData)
    ...
}
The metadata is a dictionary of dictionaries, and you have to refer to CGImageProperties.
I wrote about this topic here.
Here is a slight variation of @matt's answer.
The following code uses only one CGImageDestination and, more interestingly, allows saving in HEIC format on iOS 11+.
Notice that the compression quality is added to the metadata before adding the image; 0.8 is roughly the compression quality of the native camera save.
// img is the UIImage and metadata the metadata received from the picker
NSMutableDictionary *meta_plus = metadata.mutableCopy;
// with CGImage, one can set the compression quality in the metadata
meta_plus[(NSString *)kCGImageDestinationLossyCompressionQuality] = @(0.8);

NSMutableData *img_data = [NSMutableData new];
NSString *type;
if (@available(iOS 11.0, *)) type = AVFileTypeHEIC;
else type = @"public.jpeg";

CGImageDestinationRef dest = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)img_data, (__bridge CFStringRef)type, 1, nil);
CGImageDestinationAddImage(dest, img.CGImage, (__bridge CFDictionaryRef)meta_plus);
CGImageDestinationFinalize(dest);
CFRelease(dest); // image is in img_data
// go for the PHLibrary change request
//go for the PHLibrary change request

Getting exposure values from camera on iPhone OS 4.0

Exposure values from the camera can be acquired when you take a picture (without saving it to Saved Photos). A light meter application on iPhone does this, probably by using some private API.
That application does it on iPhone 3GS only, so I guess it may be somehow related to EXIF data which is populated with this information when the image is created.
This all applies to 3GS.
Has anything changed with iPhone OS 4.0?
Is there a regular way to get these values now?
Does anyone have a working code example for taking these camera/photo setting values?
Thank you
If you want realtime* exposure information, you can capture a video using AVCaptureVideoDataOutput. Each frame CMSampleBuffer is full of interesting data describing the current state of the camera.
*up to 30 fps
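As a rough sketch of that approach (assuming your class is set as the video data output's sample buffer delegate; which attachments you actually get depends on the device and format):

// AVCaptureVideoDataOutputSampleBufferDelegate callback: inspect per-frame attachments.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFDictionaryRef exifAttachments = CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments) {
        // Exposure time, ISO, aperture, etc. live in this dictionary.
        NSLog(@"frame exposure info: %@", (NSDictionary *)exifAttachments);
    }
}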
With AVFoundation in iOS 4.0 you can mess with exposure; refer specifically to AVCaptureDevice (see the AVCaptureDevice reference). Not sure if it's exactly what you want, but you can look around AVFoundation and probably find some useful stuff.
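For example, a minimal (illustrative) snippet that locks the exposure mode via AVCaptureDevice:

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {
        device.exposureMode = AVCaptureExposureModeLocked; // freeze the current exposure
    }
    [device unlockForConfiguration];
}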
I think I finally found the lead to the real EXIF data. It'll be a while before I have actual code to post, but I figured this should be publicized in the meantime.
Google captureStillImageAsynchronouslyFromConnection:. It's a method of AVCaptureStillImageOutput, and the following is an excerpt from the (long-sought) documentation:
imageDataSampleBuffer -
The data that was captured.
The buffer attachments may contain metadata appropriate to the image data format. For example, a buffer containing JPEG data may carry a kCGImagePropertyExifDictionary as an attachment. See ImageIO/CGImageProperties.h for a list of keys and value types.
For an example of working with AVCaptureStillImageOutput see WWDC 2010 sample code, under AVCam.
Peace,
O.
Here is the complete solution. Don't forget to import the appropriate frameworks and headers.
In the exifAttachments variable in the captureNow method you'll find all the data you are looking for.
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>

AVCaptureStillImageOutput *stillImageOutput;
AVCaptureSession *session;

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setupCaptureSession];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            if (exifAttachments)
            {
                // Do something with the attachments.
                NSLog(@"attachments: %@", exifAttachments);
            }
            else
                NSLog(@"no attachments");

            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
        }];
}

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPreset352x288;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    [device lockForConfiguration:&error];
    device.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
    device.focusMode = AVCaptureFocusModeLocked;
    [device unlockForConfiguration];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    stillImageOutput = [AVCaptureStillImageOutput new];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    if ([session canAddOutput:stillImageOutput])
        [session addOutput:stillImageOutput];

    // Start the session running to start the flow of data
    [session startRunning];
    [self captureNow];
}

MPMediaItems raw song data

I was wondering how to access an MPMediaItem's raw data.
Any ideas?
You can obtain the media item's data this way:
- (void)mediaItemToData
{
    // Implement the media item picker in your project
    MPMediaItem *curItem = musicPlayer.nowPlayingItem;

    NSURL *url = [curItem valueForProperty:MPMediaItemPropertyAssetURL];
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:songAsset
                                                                      presetName:AVAssetExportPresetPassthrough];
    exporter.outputFileType = @"public.mpeg-4";

    NSString *exportFile = [[self myDocumentsDirectory] stringByAppendingPathComponent:@"exported.mp4"];
    NSURL *exportURL = [[NSURL fileURLWithPath:exportFile] retain];
    exporter.outputURL = exportURL;

    // do the export
    // (completion handler block omitted)
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        NSData *data = [NSData dataWithContentsOfFile:[[self myDocumentsDirectory]
                                                       stringByAppendingPathComponent:@"exported.mp4"]];
        // Do something with the data
    }];
}
This code will work only on iOS 4.0 and later.
Good luck!
Of course you can access the data of an MPMediaItem. It's not crystal clear at first, but it works. Here's how:
Get the media item's URL from its MPMediaItemPropertyAssetURL property
Initialize an AVURLAsset with this URL
Initialize an AVAssetReader with this asset
Fetch the AVAssetTrack you want to read from the AVURLAsset
Create an AVAssetReaderTrackOutput with this track
Add this output to the AVAssetReader created before and -startReading
Fetch all data with AVAssetReaderTrackOutput's -copyNextSampleBuffer
PROFIT!
Here is some sample code from a project of mine (this is not a code jewel of mine, wrote it some time back in my coding dark ages):
typedef enum {
    kEDSupportedMediaTypeAAC = 'aac ',
    kEDSupportedMediaTypeMP3 = '.mp3'
} EDSupportedMediaType;

- (EDLibraryAssetReaderStatus)prepareAsset {
    // Get the AVURLAsset
    AVURLAsset *uasset = [m_asset URLAsset];

    // Check for DRM protected content
    if (uasset.hasProtectedContent) {
        return kEDLibraryAssetReader_TrackIsDRMProtected;
    }
    if ([[uasset tracks] count] == 0) {
        DDLogError(@"no asset tracks found");
        return AVAssetReaderStatusFailed;
    }

    // Initialize a reader with a track output
    NSError *err = nil;
    m_reader = [[AVAssetReader alloc] initWithAsset:uasset error:&err];
    if (!m_reader || err) {
        DDLogError(@"could not create asset reader (%i)\n", (int)[err code]);
        return AVAssetReaderStatusFailed;
    }

    // Check tracks for a valid format. Currently we only support all MP3 and AAC types; WAV and AIFF are too large to handle.
    for (AVAssetTrack *track in uasset.tracks) {
        NSArray *formats = track.formatDescriptions;
        for (int i = 0; i < [formats count]; i++) {
            CMFormatDescriptionRef format = (CMFormatDescriptionRef)[formats objectAtIndex:i];

            // Check the format types
            CMMediaType mediaType = CMFormatDescriptionGetMediaType(format);
            FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(format);
            DDLogVerbose(@"mediaType: %s, mediaSubType: %s", COFcc(mediaType), COFcc(mediaSubType));

            if (mediaType == kCMMediaType_Audio) {
                if (mediaSubType == kEDSupportedMediaTypeAAC ||
                    mediaSubType == kEDSupportedMediaTypeMP3) {
                    m_track = [track retain];
                    m_format = CFRetain(format);
                    break;
                }
            }
        }
        if (m_track != nil && m_format != NULL) {
            break;
        }
    }
    if (m_track == nil || m_format == NULL) {
        return kEDLibraryAssetReader_UnsupportedFormat;
    }

    // Create an output for the found track
    m_output = [[AVAssetReaderTrackOutput alloc] initWithTrack:m_track outputSettings:nil];
    [m_reader addOutput:m_output];

    // Start reading
    if (![m_reader startReading]) {
        DDLogError(@"could not start reading asset");
        return kEDLibraryAssetReader_CouldNotStartReading;
    }
    return 0;
}
- (OSStatus)copyNextSampleBufferRepresentation:(CMSampleBufferRepresentationRef *)repOut {
    pthread_mutex_lock(&m_mtx);

    OSStatus err = noErr;
    AVAssetReaderStatus status = m_reader.status;

    if (m_invalid) {
        pthread_mutex_unlock(&m_mtx);
        return kEDLibraryAssetReader_Invalidated;
    }
    else if (status != AVAssetReaderStatusReading) {
        pthread_mutex_unlock(&m_mtx);
        return kEDLibraryAssetReader_NoMoreSampleBuffers;
    }

    // Read the next sample buffer
    CMSampleBufferRef sbuf = [m_output copyNextSampleBuffer];
    if (sbuf == NULL) {
        pthread_mutex_unlock(&m_mtx);
        return kEDLibraryAssetReader_NoMoreSampleBuffers;
    }

    CMSampleBufferRepresentationRef srep = CMSampleBufferRepresentationCreateWithSampleBuffer(sbuf);
    if (srep && repOut != NULL) {
        *repOut = srep;
    }
    else {
        DDLogError(@"CMSampleBufferRef corrupted");
        EDCFShow(sbuf);
        err = kEDLibraryAssetReader_BufferCorrupted;
    }
    CFRelease(sbuf);
    pthread_mutex_unlock(&m_mtx);
    return err;
}
You can't, and there is no workaround. An MPMediaItem is not the actual piece of media; it is just the metadata about the media item, communicated to the application via RPC from another process. The data for the item itself is not accessible in your address space.
I should note that even if you have the MPMediaItem, its data probably is not loaded into the device's memory. The flash on the iPhone is slow and memory is scarce. While Apple may not want you to have access to the raw data backing an MPMediaItem, it is just as likely that they didn't bother dealing with it because they didn't want to invest the time necessary to deal with the APIs. If they did provide access to such a thing, it almost certainly would not be as an NSData, but more likely as an NSURL they would give your application that would allow it to open the file and stream through the data.
In any event, if you want the functionality, you should file a bug report asking for it.
Also, as a side note: don't mention your age in a bug report you send to Apple. I think it is very cool that you are writing apps for the phone; when I was your age I loved experimenting with computers (back then I was working on things written in Lisp). The thing is, you cannot legally agree to a contract in the United States, which is why the developer agreement specifically prohibits you from joining. From the first paragraph of the agreement:
You also certify that you are of the legal age of majority in the jurisdiction in which you reside (at least 18 years of age in many countries) and you represent that you are legally permitted to become a Registered iPhone Developer.
If you mention to a WWDR representative that you are not of age of majority they may realize you are in violation of the agreement and be obligated to terminate your developer account. Just a friendly warning.

UIImagePickerController and extracting EXIF data from existing photos

It's well known that UIImagePickerController doesn't return the metadata of the photo after selection. However, a couple of apps in the app store (Mobile Fotos, PixelPipe) seem to be able to read the original files and the EXIF data stored within them, enabling the app to extract the geodata from the selected photo.
They seem to do this by reading the original file from the /private/var/mobile/Media/DCIM/100APPLE/ folder and running it through an EXIF library.
However, I can't work out a way of matching a photo returned from the UIImagePickerController to a file on disk. I've explored file sizes, but the original file is a JPEG, whilst the returned image is a raw UIImage, making it impossible to know the file size of the image that was selected.
I'm considering making a table of hashes and matching against the first x pixels of each image. This seems a bit over the top though, and probably quite slow.
Any suggestions?
Have you taken a look at this EXIF iPhone library?
http://code.google.com/p/iphone-exif/
Gonna try it on my side. I'd like to get the GPS (geotag) coordinates from the picture that has been taken with the UIImagePickerController :/
After a deeper look, this library seems to take NSData as input, and the UIImagePickerController returns a UIImage after taking a snapshot. In theory, if we use the following function from the UIKit category on UIImage:
NSData *UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality);
then we can convert the UIImage into an NSData instance and use it with the iPhone EXIF library.
UPDATE:
I gave the library mentioned above a try and it seems to work. However, because of my limited knowledge of the EXIF format and the lack of a high-level API in the library, I haven't managed to get the values for the EXIF tags.
Here's my code in case any of you can go further:
#import "EXFJpeg.h"
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo {
NSLog(#"image picked %# with info %#", image, editingInfo);
NSData* jpegData = UIImageJPEGRepresentation (image,0.5);
EXFJpeg* jpegScanner = [[EXFJpeg alloc] init];
[jpegScanner scanImageData: jpegData];
EXFMetaData* exifData = jpegScanner.exifMetaData;
EXFJFIF* jfif = jpegScanner.jfif;
EXFTag* tagDefinition = [exifData tagDefinition: [NSNumber numberWithInt:EXIF_DateTime]];
//EXFTag* latitudeDef = [exifData tagDefinition: [NSNumber numberWithInt:EXIF_GPSLatitude]];
//EXFTag* longitudeDef = [exifData tagDefinition: [NSNumber numberWithInt:EXIF_GPSLongitude]];
id latitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLatitude]];
id longitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLongitude]];
id datetime = [exifData tagValue:[NSNumber numberWithInt:EXIF_DateTime]];
id t = [exifData tagValue:[NSNumber numberWithInt:EXIF_Model]];
....
....
Retrieving the tag definitions works, but all tag values return nil :(
In case you want to give the library a try, you need to define a global variable to get it running (as explained in the doc, but hum.. :/)
BOOL gLogging = FALSE;
UPDATE 2
Answer here: iPhone - access location information from a photo
A UIImage does not encapsulate the meta information, so we're stuck: for sure, no EXIF info will be given through this interface.
FINAL UPDATE
OK, I managed to get it working, at least to properly geotag pictures returned by the picker.
Before triggering the UIImagePickerController, it's up to you to use the CLLocationManager to retrieve the current CLLocation.
Once you have it, you can use this method, which uses the exif-iPhone library, to geotag the UIImage from the CLLocation:
- (NSData *)geotagImage:(UIImage *)image withLocation:(CLLocation *)imageLocation {
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.8);
    EXFJpeg *jpegScanner = [[EXFJpeg alloc] init];
    [jpegScanner scanImageData:jpegData];
    EXFMetaData *exifMetaData = jpegScanner.exifMetaData;
    // end of helper methods

    // adding GPS data to the Exif object
    NSMutableArray *locArray = [self createLocArray:imageLocation.coordinate.latitude];
    EXFGPSLoc *gpsLoc = [[EXFGPSLoc alloc] init];
    [self populateGPS:gpsLoc :locArray];
    [exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLatitude]];
    [gpsLoc release];
    [locArray release];

    locArray = [self createLocArray:imageLocation.coordinate.longitude];
    gpsLoc = [[EXFGPSLoc alloc] init];
    [self populateGPS:gpsLoc :locArray];
    [exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLongitude]];
    [gpsLoc release];
    [locArray release];

    NSString *ref;
    if (imageLocation.coordinate.latitude < 0.0)
        ref = @"S";
    else
        ref = @"N";
    [exifMetaData addTagValue:ref forKey:[NSNumber numberWithInt:EXIF_GPSLatitudeRef]];

    if (imageLocation.coordinate.longitude < 0.0)
        ref = @"W";
    else
        ref = @"E";
    [exifMetaData addTagValue:ref forKey:[NSNumber numberWithInt:EXIF_GPSLongitudeRef]];

    NSMutableData *taggedJpegData = [[NSMutableData alloc] init];
    [jpegScanner populateImageData:taggedJpegData];
    [jpegScanner release];
    return [taggedJpegData autorelease];
}
// Helper methods for location conversion
- (NSMutableArray *)createLocArray:(double)val {
    val = fabs(val);
    NSMutableArray *array = [[NSMutableArray alloc] init];
    double deg = (int)val;
    [array addObject:[NSNumber numberWithDouble:deg]];
    val = val - deg;
    val = val * 60;
    double minutes = (int)val;
    [array addObject:[NSNumber numberWithDouble:minutes]];
    val = val - minutes;
    val = val * 60;
    double seconds = val;
    [array addObject:[NSNumber numberWithDouble:seconds]];
    return array;
}

- (void)populateGPS:(EXFGPSLoc *)gpsLoc :(NSArray *)locArray {
    long numDenumArray[2];
    long *arrPtr = numDenumArray;
    [EXFUtils convertRationalToFraction:&arrPtr :[locArray objectAtIndex:0]];
    EXFraction *fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
    gpsLoc.degrees = fract;
    [fract release];
    [EXFUtils convertRationalToFraction:&arrPtr :[locArray objectAtIndex:1]];
    fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
    gpsLoc.minutes = fract;
    [fract release];
    [EXFUtils convertRationalToFraction:&arrPtr :[locArray objectAtIndex:2]];
    fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
    gpsLoc.seconds = fract;
    [fract release];
}
This works with iOS5 (beta 4) and the camera roll (you need type defs for the blocks in the .h):
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
        if (url) {
            ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset) {
                CLLocation *location = [myasset valueForProperty:ALAssetPropertyLocation];
                // location contains lat/long, timestamp, etc.
                // extracting the image is more tricky and 5.x beta ALAssetRepresentation has bugs!
            };
            ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror) {
                NSLog(@"cant get image - %@", [myerror localizedDescription]);
            };
            ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
            [assetsLib assetForURL:url resultBlock:resultblock failureBlock:failureblock];
        }
    }
}
There is a way in iOS 8, without using any 3rd-party EXIF library.
#import <Photos/Photos.h>

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
    PHFetchResult *fetchResult = [PHAsset fetchAssetsWithALAssetURLs:@[url] options:nil];
    PHAsset *asset = fetchResult.firstObject;
    // All you need is:
    //   asset.location.coordinate.latitude
    //   asset.location.coordinate.longitude
    // Other useful properties of PHAsset:
    //   asset.favorite
    //   asset.modificationDate
    //   asset.creationDate
}
Apple added the Image I/O framework in iOS 4, which can be used to read EXIF data from pictures. I don't know whether UIImagePickerController returns a picture with the EXIF data embedded, though.
Edit: In iOS4 you can fetch the EXIF data by grabbing the value of the UIImagePickerControllerMediaMetadata key in the info dictionary which is passed to the UIImagePickerControllerDelegate delegate.
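A minimal sketch of that approach (note: this key is generally only populated when the picker's source is the camera, not the photo library):

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSDictionary *metadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
    NSDictionary *exif = [metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary];
    NSLog(@"EXIF: %@", exif);
}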
I had a similar question where I wanted just the date a picture was taken, and none of the above appeared to solve my problem in a simple way (e.g. no external libraries), so here is all of the data I could find which you can extract from an image after selecting it with the picker:
// Inside whatever implements UIImagePickerControllerDelegate
@import AssetsLibrary;

// ... your other code here ...

@implementation MYImagePickerDelegate

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = info[UIImagePickerControllerMediaType];
    UIImage *originalImage = info[UIImagePickerControllerOriginalImage];
    UIImage *editedImage = info[UIImagePickerControllerEditedImage];
    NSValue *cropRect = info[UIImagePickerControllerCropRect];
    NSURL *mediaUrl = info[UIImagePickerControllerMediaURL];
    NSURL *referenceUrl = info[UIImagePickerControllerReferenceURL];
    NSDictionary *mediaMetadata = info[UIImagePickerControllerMediaMetadata];

    NSLog(@"mediaType=%@", mediaType);
    NSLog(@"originalImage=%@", originalImage);
    NSLog(@"editedImage=%@", editedImage);
    NSLog(@"cropRect=%@", cropRect);
    NSLog(@"mediaUrl=%@", mediaUrl);
    NSLog(@"referenceUrl=%@", referenceUrl);
    NSLog(@"mediaMetadata=%@", mediaMetadata);

    if (!referenceUrl) {
        NSLog(@"Media did not have reference URL.");
    } else {
        ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
        [assetsLib assetForURL:referenceUrl
                   resultBlock:^(ALAsset *asset) {
                       NSString *type = [asset valueForProperty:ALAssetPropertyType];
                       CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
                       NSNumber *duration = [asset valueForProperty:ALAssetPropertyDuration];
                       NSNumber *orientation = [asset valueForProperty:ALAssetPropertyOrientation];
                       NSDate *date = [asset valueForProperty:ALAssetPropertyDate];
                       NSArray *representations = [asset valueForProperty:ALAssetPropertyRepresentations];
                       NSDictionary *urls = [asset valueForProperty:ALAssetPropertyURLs];
                       NSURL *assetUrl = [asset valueForProperty:ALAssetPropertyAssetURL];

                       NSLog(@"type=%@", type);
                       NSLog(@"location=%@", location);
                       NSLog(@"duration=%@", duration);
                       NSLog(@"assetUrl=%@", assetUrl);
                       NSLog(@"orientation=%@", orientation);
                       NSLog(@"date=%@", date);
                       NSLog(@"representations=%@", representations);
                       NSLog(@"urls=%@", urls);
                   }
                  failureBlock:^(NSError *error) {
                      NSLog(@"Failed to get asset: %@", error);
                  }];
    }
    [picker dismissViewControllerAnimated:YES
                               completion:nil];
}
@end
So when you select an image, you get output that looks like this (including date!):
mediaType=public.image
originalImage=<UIImage: 0x7fb38e00e870> size {1280, 850} orientation 0 scale 1.000000
editedImage=<UIImage: 0x7fb38e09e1e0> size {640, 424} orientation 0 scale 1.000000
cropRect=NSRect: {{0, 0}, {1280, 848}}
mediaUrl=(null)
referenceUrl=assets-library://asset/asset.JPG?id=AC072879-DA36-4A56-8A04-4D467C878877&ext=JPG
mediaMetadata=(null)
type=ALAssetTypePhoto
location=(null)
duration=ALErrorInvalidProperty
assetUrl=assets-library://asset/asset.JPG?id=AC072879-DA36-4A56-8A04-4D467C878877&ext=JPG
orientation=0
date=2014-07-14 04:28:18 +0000
representations=(
"public.jpeg"
)
urls={
"public.jpeg" = "assets-library://asset/asset.JPG?id=AC072879-DA36-4A56-8A04-4D467C878877&ext=JPG";
}
Anyway, hopefully that saves someone else some time.
I spent a while working on this as well for an application I was contracted to build. Basically, as the API currently stands, it is not possible. The basic problem is that the UIImage class strips out all EXIF data except the orientation. Also, the function to save to the camera roll strips this data out. So basically the only way to grab and maintain any extra EXIF data is to save it in a private "camera roll" in your application. I have filed this bug with Apple as well and emphasized the need to the app reviewer reps we've been in contact with. Hopefully someday they'll add it in... Otherwise it makes GEO tagging completely useless, as it only works in the "stock" camera application.
NOTE: Some applications on the app store hack around this. From what I have found, they directly access the camera roll and SAVE photos straight to it in order to preserve GEO data. However, this only works with the camera roll / saved photos and NOT the rest of the photo library. The photos "synced" to your phone from your computer have all EXIF data except orientation stripped.
I still can't understand why those applications were approved (heck, they even DELETE from the camera roll) and our application, which does none of that, is still being held back.
For iOS 8 and later you can use the Photos framework.
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    let url = info[UIImagePickerControllerReferenceURL] as? URL
    if url != nil {
        let fetchResult = PHAsset.fetchAssets(withALAssetURLs: [url!], options: nil)
        let asset = fetchResult.firstObject
        print(asset?.location?.coordinate.latitude)
        print(asset?.creationDate)
    }
}
This is something that the public API does not provide, but could be useful to many people. Your primary recourse is to file a bug with Apple that describes what you need (and it can be helpful to explain why you need it as well). Hopefully your request could make it into a future release.
After filing a bug, you could also use one of the Developer Technical Support (DTS) incidents that came with your iPhone Developer Program membership. If there is a public way to do this, an Apple engineer will know. Otherwise, it may at least help get your plight a bit more attention within the mothership. Best of luck!
Use the UIImagePickerControllerMediaURL dictionary key to get the file URL to the original file. Despite what the documentation says, you can get the file URL for photos and not only movies.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Try to get the original file.
    NSURL *originalFile = [info objectForKey:UIImagePickerControllerMediaURL];
    if (originalFile) {
        NSData *fileData = [NSData dataWithContentsOfURL:originalFile];
    }
}
You might be able to hash the image data returned by the UIImagePickerController and each of the images in the directory and compare them.
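As a sketch of what that comparison could look like, using CommonCrypto's MD5 (the helper names are made up; note you would have to re-encode the UIImage to JPEG first, so byte-identical matches are not guaranteed):

#import <CommonCrypto/CommonDigest.h>

// Hypothetical helper: hex MD5 digest of arbitrary data.
static NSString *MD5HexForData(NSData *data) {
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    CC_MD5([data bytes], (CC_LONG)[data length], digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}

// Compare the picked image's bytes against a candidate file on disk.
static BOOL ImageMatchesFile(NSData *pickedImageData, NSString *candidatePath) {
    NSData *fileData = [NSData dataWithContentsOfFile:candidatePath];
    return fileData && [MD5HexForData(pickedImageData) isEqualToString:MD5HexForData(fileData)];
}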
Just a thought, but have you tried TTPhotoViewController in the Three20 project on GitHub?
That provides an image picker that can read from multiple sources. You may be able to use it as an alternative to UIImagePickerController, or the source might give you a clue how to work out how to get the info you need.
Is there a specific reason you want to extract the location data from the image? An alternative could be to get the location separately using the CoreLocation framework. If it's only the geodata you need, this might save you some headaches.
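A bare-bones sketch of that alternative (assumes a locationManager property on a controller that conforms to CLLocationManagerDelegate):

#import <CoreLocation/CoreLocation.h>

- (void)startLocationUpdates {
    self.locationManager = [[CLLocationManager alloc] init];
    self.locationManager.delegate = self;
    self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    [self.locationManager startUpdatingLocation];
}

- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations {
    // Pair this location with the photo the user just picked or took.
    CLLocation *current = [locations lastObject];
}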
It seems that photos obtained via UIImagePickerControllerMediaURL don't have EXIF tags at all.
In order to get this metadata you'll have to use the lower-level AVFoundation framework.
Take a look at Apple's Squarecam example (http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html)
Find the method below and add the line I've added to the code. The metadata dictionary returned also contains a diagnostics NSDictionary object.
- (BOOL)writeCGImageToCameraRoll:(CGImageRef)cgImage withMetadata:(NSDictionary *)metadata
{
    NSDictionary *Exif = [metadata objectForKey:@"Exif"]; // Add this line
    // (the rest of the SquareCam method body continues here)
}
I'm using this for camera roll images
- (CLLocation *)locationFromAsset:(ALAsset *)asset
{
    if (!asset)
        return nil;

    NSDictionary *pickedImageMetadata = [[asset defaultRepresentation] metadata];
    NSDictionary *gpsInfo = [pickedImageMetadata objectForKey:(__bridge NSString *)kCGImagePropertyGPSDictionary];
    if (gpsInfo) {
        NSNumber *nLat = [gpsInfo objectForKey:(__bridge NSString *)kCGImagePropertyGPSLatitude];
        NSNumber *nLng = [gpsInfo objectForKey:(__bridge NSString *)kCGImagePropertyGPSLongitude];
        if (nLat && nLng)
            return [[CLLocation alloc] initWithLatitude:[nLat doubleValue] longitude:[nLng doubleValue]];
    }
    return nil;
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    //UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    // create the asset library in the init method of your custom object or view controller
    //self.library = [[ALAssetsLibrary alloc] init];
    //
    [self.library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        // try to retrieve gps metadata coordinates
        CLLocation *myLocation = [self locationFromAsset:asset];
        // Do your stuff....
    } failureBlock:^(NSError *error) {
        NSLog(@"Failed to get asset from library");
    }];
}
Obviously, this works only if the image contains GPS meta information.
Hope it helps
This is in Swift 3 if you still want support for iOS 8:
import AssetsLibrary
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    if picker.sourceType == UIImagePickerControllerSourceType.photoLibrary,
        let url = info[UIImagePickerControllerReferenceURL] as? URL {
        let assetLibrary = ALAssetsLibrary()
        assetLibrary.asset(for: url, resultBlock: { (asset) in
            if let asset = asset {
                let assetRep: ALAssetRepresentation = asset.defaultRepresentation()
                let metaData: NSDictionary = assetRep.metadata() as NSDictionary
                print(metaData)
            }
        }, failureBlock: { (error) in
            print(error!)
        })
    }
}
For iOS 10 - Swift 3
The picker's callback has an info dict where there is a key with metadata: UIImagePickerControllerMediaMetadata
The naughty way to do this is to traverse the UIImagePickerController's views and pick out the selected image in the delegate callback.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    id thumbnailView = [[[[[[[[[[picker.view subviews]
                                objectAtIndex:0] subviews]
                                objectAtIndex:0] subviews]
                                objectAtIndex:0] subviews]
                                objectAtIndex:0] subviews]
                                objectAtIndex:0];

    NSString *fullSizePath = [[[thumbnailView selectedPhoto] fileGroup] pathForFullSizeImage];
    NSString *thumbnailPath = [[[thumbnailView selectedPhoto] fileGroup] pathForThumbnailFile];
    NSLog(@"%@ and %@", fullSizePath, thumbnailPath);
}
That will give you the path to the full size image, which you can then open with an EXIF library of your choice.
But, this calls a Private API and these method names will be detected by Apple if you submit this app. So don't do this, OK?