Is it possible, in an iPhone app, to extract location information (geocode, I suppose it's called) from a photo taken with the iPhone camera?
If there is no API call to do it, is there any known way to parse the bytes of data to extract the information? Something I can roll on my own?
Unfortunately no.
The problem is this:
A JPEG file consists of several parts. For this question, the ones we are interested in are the image data and the EXIF data. The image data is the picture itself; the EXIF data is where things like geocoding, shutter speed, and camera type are stored.
A UIImage (and CGImage) contains only image data, no tags.
When the image picker selects an image (either from the library or the camera) it returns a UIImage, not a JPEG. This UIImage is created from the JPEG's image data, but the EXIF data in the JPEG is discarded.
This means the metadata is not in the UIImage at all, and thus is not accessible.
I think the selected answer is wrong, actually. Well, not wrong. Everything it said is correct, but there is a way around that limitation.
UIImagePickerController passes a dictionary along with the UIImage it returns. One of the keys is UIImagePickerControllerMediaURL, which is "the filesystem URL for the movie". However, as noted here, in newer iOS versions it returns a URL for images as well. Couple that with the EXIF library mentioned by @Jasper and you might be able to pull geotags out of photos.
I haven't tried this method, but as @tomtaylor mentioned, this has to be possible somehow, as there are a few apps that do it (e.g. Lab).
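For example, here is a minimal sketch of that idea using ALAssetsLibrary rather than a third-party EXIF library (an assumption on my part; it also uses UIImagePickerControllerReferenceURL, the asset-URL key on iOS 4+, rather than the MediaURL key mentioned above; info is the dictionary handed to the picker delegate):

#import <AssetsLibrary/AssetsLibrary.h>
#import <CoreLocation/CoreLocation.h>

// Look up the picked photo in the asset library and read its geotag.
NSURL *assetURL = info[UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
    CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
    NSLog(@"Geotag: %@", location);
} failureBlock:^(NSError *error) {
    NSLog(@"Could not load asset: %@", error);
}];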
Related
I have an imagePickerController that is used for importing photos from the library into my app.
In the ALAssetsLibraryAssetForURLResultBlock, I'm trying to find out whether the ALAsset I've got in the block is a screenshot or a "genuine" photo taken by the camera.
I've tried to go through the ALAsset's metadata dictionaries but couldn't find any flag / indication that might fit.
Anyone have any ideas?
A screenshot's UTI is always "public.png" and its dimensions always match the screen (be sure to multiply the screen bounds' width and height by [UIScreen mainScreen].scale). By checking just these two pieces of metadata, you can easily identify a screenshot.
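A minimal sketch of that check, assuming asset is the ALAsset in question (the landscape comparison is my own addition):

ALAssetRepresentation *rep = [asset defaultRepresentation];
CGFloat scale = [UIScreen mainScreen].scale;
CGSize screen = [UIScreen mainScreen].bounds.size;
CGSize pixels = CGSizeMake(screen.width * scale, screen.height * scale);

CGSize dim = [rep dimensions];
BOOL isPNG = [[rep UTI] isEqualToString:@"public.png"];
// Compare against both portrait and landscape pixel sizes.
BOOL matchesScreen = CGSizeEqualToSize(dim, pixels) ||
    CGSizeEqualToSize(dim, CGSizeMake(pixels.height, pixels.width));
BOOL probablyScreenshot = isPNG && matchesScreen;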
Add metadata to the UIImage while saving it to the Photo Library. The same UIImage metadata can then be used to tell whether it is a screenshot or not.
Refer to Save_Photo_to_Album_with_Metadata
Well, I was researching and experimenting, and the closest solution I've found is based on the fact that iPhone screenshots don't contain EXIF records (while all other generated photos do).
Therefore, once a photo is selected in the picker, I check whether the photo's metadata contains an EXIF record; if it doesn't, I conclude that the photo is a screenshot.
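A minimal sketch of that heuristic, assuming rep is the asset's defaultRepresentation ({Exif} is the key ImageIO uses for the EXIF sub-dictionary):

#import <ImageIO/ImageIO.h>

NSDictionary *metadata = [rep metadata];
BOOL hasExif = (metadata[(NSString *)kCGImagePropertyExifDictionary] != nil);
BOOL probablyScreenshot = !hasExif; // no EXIF record -> likely a screenshot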
I find it's the "as good as it gets" solution for now, although it's not an official one.
Cheers.
I'm trying to get the cropped version of an image that's pulled using ALAsset. Specifically, I'm selecting items from the user's Photo Library and then uploading them. The issue is that in the library thumbnail view, iOS is showing us the cropped version. When you select that thumbnail and pull that image's asset using ALAsset, I get the full resolution version.
I did some research and couldn't find anything that exposes a second set of coordinates describing where the crop happened.
To test it, you need iOS 5 to edit the image in your library. Select an image in your image library, select "Edit", and crop it. When you fetch the ALAsset you'll get the full image, and if you sync with iPhoto, iPhoto also pulls the full image. You can also re-edit the image and undo your crop.
This is how I'm getting the image:
UIImage *tmpImage = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullResolutionImage]];
That gives me the full resolution image, obviously. There's a fullScreenImage flag which scales the full resolution image to the size of the screen. That's not what I want.
The ALAssetRepresentation class has a scale field, but that's just a float scale factor, which is not what I want either.
If anyone can tell me where this cropped coordinate system can be found, I'd appreciate it.
Your Options:
Option 1 (ALAssetLibrary)
Use the - (CGImageRef)fullScreenImage method of ALAssetRepresentation.
Pros:
All the hard work is done for you, you get an image that looks just like the one in the Photos app. This includes cropping, and other changes. Easy.
Cons:
The resolution is "screen size", only as big as the device you are using, not the full possible resolution of the cropped image. If this doesn't concern you, then this is the perfect option.
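A one-line sketch of Option 1, assuming asset is your ALAsset:

UIImage *image = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage]];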
Option 2 (ALAssetLibrary)
Extract the cropping data using the AdjustmentXMP key in the image's metadata (what @tom is referring to). Apply the crop yourself.
Pros:
It is possible to get a cropped image at the best possible resolution.
Cons:
You only get the cropping edits, not any other adjustments (like red-eye)
Who knows what Apple will support in the future in "Edit" mode, you may have to apply more edits in the future.
It's complicated: you first have to parse the XMP (XML) data to read the crop rectangle, crop the unrotated image, and then apply the rotation (a sketch of the first step follows).
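A rough sketch of that first step, assuming asset is the edited ALAsset (AdjustmentXMP is the key the Photos app uses to store its edits):

NSDictionary *metadata = [[asset defaultRepresentation] metadata];
NSString *xmpString = metadata[@"AdjustmentXMP"];
NSData *xmpData = [xmpString dataUsingEncoding:NSUTF8StringEncoding];
// Parse xmpData (it is XML, e.g. with NSXMLParser) to find the crop
// rectangle and rotation, then crop the fullResolutionImage yourself.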
Option 3 (Wishful Thinking)
Beg Apple to include a method like fullResolutionEditedImage which gives you the best possible quality photo, with all edits applied.
Pros:
Everything magically solved.
Cons:
Apple may never add this method.
Option 4 (UIImagePickerController)
This option only applies if you are using the image picker; you can't use it directly with the asset library.
In the NSDictionary returned by -(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info, you can extract the full-sized, adjusted image from the UIImagePickerControllerOriginalImage key. Save this image somewhere; then, instead of retrieving the image from the asset library, load the copy you made (see the sketch after the pros and cons below).
Pros:
You get the full size image, with adjustments
This is the only option Apple gives us for getting the full size image with all adjustments (like red-eye, etc), and not just the crop. This is particularly important in iOS 7 with the introduction of filters that can drastically alter the image.
Cons:
Can only be used with the image picker (not ALAssetRepresentation)
You must keep around a full-sized copy of the image. Depending on the number of such images, the disk usage by your app could grow substantially.
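The sketch referenced above; the delegate callback is the real UIImagePickerControllerDelegate method, but the file name and JPEG quality are illustrative choices:

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Full-size image with all edits (crop, red-eye, filters) applied.
    UIImage *edited = info[UIImagePickerControllerOriginalImage];

    // Save a copy in the Documents directory and load that later,
    // instead of going back to the asset library.
    NSString *dir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                        NSUserDomainMask, YES)[0];
    NSString *path = [dir stringByAppendingPathComponent:@"picked-full-size.jpg"];
    [UIImageJPEGRepresentation(edited, 0.9) writeToFile:path atomically:YES];

    [picker dismissViewControllerAnimated:YES completion:nil];
}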
Update for iOS 7: you may wish to consider Option 4, or Option 1, as iOS 7 supports many operations now like filters, and your users will probably notice if they are missing. These two options support filters (and other edits), with Option 4 giving you a higher resolution result.
When a photo has been cropped with the iOS Photos App, the cropping coordinates can be found in the ALAssetRepresentation's metadata dictionary. fullResolutionImage will give you the uncropped photo, you have to perform the cropping yourself.
The AdjustmentXMP metadata contains not only the cropping coordinates but also indicates if auto-enhance or remove-red-eyes has been applied.
As of iOS 6.0, CIFilter provides filterArrayFromSerializedXMP:inputImageExtent:error:. You can probably feed it the ALAssetRepresentation's AdjustmentXMP metadata and apply the resulting filters to the ALAssetRepresentation's fullResolutionImage to recreate the modified image.
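A hedged sketch of that idea, untested per the paragraph above (rep is the edited asset's defaultRepresentation):

#import <CoreImage/CoreImage.h>

NSString *xmpString = [rep metadata][@"AdjustmentXMP"];
CIImage *image = [CIImage imageWithCGImage:[rep fullResolutionImage]];

NSError *error = nil;
NSArray *filters = [CIFilter filterArrayFromSerializedXMP:
                        [xmpString dataUsingEncoding:NSUTF8StringEncoding]
                                         inputImageExtent:image.extent
                                                    error:&error];
// Chain the filters: each one's output becomes the next one's input.
for (CIFilter *filter in filters) {
    [filter setValue:image forKey:kCIInputImageKey];
    image = filter.outputImage;
}
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:image fromRect:image.extent];
UIImage *result = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);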
Be aware that the iOS Photos App handles JPG and RAW images differently. For JPG images a new ALAsset with the XMP metadata is stored in the Camera Roll. For RAW images an ALAssetRepresentation is added to the original ALAsset. I'm not sure if this additional ALAssetRepresentation is the modified image and if it has the AdjustmentXMP metadata. In addition to JPG and RAW images you should also test the behaviour for RAW+JPG images.
I have encountered a similar problem to others on SO regarding the orientation of UIImages taken using the iPad camera. Essentially, I am taking a UIImage using the camera on the iPad. When I then display it, it has rotated through 90 degrees.
From reading other questions and answers, I now understand that when the camera takes a photo it stores an EXIF tag which determines the orientation of the photo. This would normally allow Mac and iOS apps to read the orientation data. However, I am storing my images in the documents directory as NSData (converted via UIImagePNGRepresentation) and saving the URL in Core Data. I am assuming this process loses the orientation EXIF tag, so the image is displayed incorrectly when retrieved.
Can anyone think of a way to correct this?
Thoughts, advice and pointers all welcome.
Many thanks
EXIF is something that lives in image files, not in UIImage. When you take a photo with the camera, UIImagePickerController never saves it to a file, so there is no EXIF yet.
In your case, though, the problem isn't the orientation of the iPad. PNG has no standard orientation tag, so UIImagePNGRepresentation writes the raw, unrotated pixels and discards the UIImage's imageOrientation flag. You just need to rotate the image before generating the PNG.
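A minimal sketch of that workaround: re-render the image so its pixels are the right way up before encoding to PNG (image is the camera photo):

UIImage *normalized = image;
if (image.imageOrientation != UIImageOrientationUp) {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    // drawInRect: applies the imageOrientation transform for us.
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
NSData *pngData = UIImagePNGRepresentation(normalized);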
EXIF metadata is accurate for GPS data, but not for orientation: it only records 90, 180, and 270 degree rotations, and there are error cases. Try this:
iOS UIImagePickerController result image orientation after upload
Good Luck
Take a look at this library: http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/ It includes code to take an image and remove the orientation info by redrawing it with a transform.
Is there any image API available on the iPhone? Using such an API, I could develop functionality to get all the information about a photo on the iPhone.
Thanks in advance.
You need to look at the ImageIO framework
http://developer.apple.com/library/ios/#documentation/GraphicsImaging/Conceptual/ImageIOGuide/imageio_intro/ikpg_intro.html#//apple_ref/doc/uid/TP40005462
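For example, a minimal sketch that reads a photo's metadata from a file URL with ImageIO (fileURL is whatever JPEG you point it at):

#import <ImageIO/ImageIO.h>

CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
if (source) {
    NSDictionary *properties = (__bridge_transfer NSDictionary *)
        CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    NSLog(@"EXIF: %@", properties[(NSString *)kCGImagePropertyExifDictionary]);
    NSLog(@"GPS: %@",  properties[(NSString *)kCGImagePropertyGPSDictionary]);
    CFRelease(source);
}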
Yes, since iOS 4: http://developer.apple.com/library/ios/#documentation/AssetsLibrary/Reference/ALAsset_Class/Reference/Reference.html
Depending on what you want exactly, this deals with the camera output.
The question is very fuzzy: please make it more exact.
If you want information about the image itself (size, orientation, type, colors, pixel size, etc.), you can use the UIImage and CGImage classes.
If you want to get or access pictures in the iPhone's Photos app, ALAssetsLibrary is what you need to read about; you can save and fetch images and videos from the iPhone's native album.
If you want the metadata saved alongside each image (like the location the picture was taken, the lens size, the date taken, etc.), I am unsure how one gets that data.
This is kind of a continuation of this thread:
iphone how to go to the image gallery after taking photo through Xcode
I am following the method suggested by Saurabh (the link given in the comment). I just want to know whether the EXIF data will be retained if I follow this method (storing the image in the application).
The short answer: yes, it will be retained.