I am loading photos from the user's photo library.
PHImageManager.default().requestImage(for: asset,
                                      targetSize: size,
                                      contentMode: .aspectFill,
                                      options: options) { result, info in
    // get orientation???
    guard let image = result else { return }
    self.selectedPhotoUpdateCallback(image)
}
However, sometimes the photos I load are upside down.
I cannot find any information in the metadata about the orientation of these images.
Do you know how I can check the orientation of an asset requested from a PHImageManager?
The picture displays upright when I select it from the photo library, but when I load the full image into another view it is rotated (and my code does not apply any rotation).
You can get the image orientation via image.imageOrientation. Also you can check out this link.
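As a minimal sketch of checking that property inside the completion handler (modern Swift; `asset`, `size`, and `options` are assumed to come from your own code):

```swift
import Photos
import UIKit

// Sketch: inspect the orientation of the image PHImageManager hands back.
func loadImage(for asset: PHAsset, size: CGSize, options: PHImageRequestOptions?) {
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: size,
                                          contentMode: .aspectFill,
                                          options: options) { result, _ in
        guard let image = result else { return }
        // .up means the pixel buffer is already upright; any other value
        // means the raw pixels must be rotated/mirrored for display.
        if image.imageOrientation != .up {
            print("image needs rotation, orientation raw value: \(image.imageOrientation.rawValue)")
        }
    }
}
```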
I need to determine the image orientation of the image to be displayed in an imageView. My app stores images in Firebase Storage and uses the SDWebImage integration in Firebase like so:
imageView.sd_setImage(with: imageRef)
Thought that the metadata would hold enough information to determine image orientation, so I tried this:
imageRef.getMetadata(completion: (StorageMetadata?, Error?) -> Void)
While I'm able to retrieve the metadata, I can't seem to figure out the orientation from it. According to the docs (https://firebase.google.com/docs/storage/ios/file-metadata#file_metadata_properties), image height and width cannot be obtained from the metadata.
I know I could store image orientation info in my realtime database upon uploading the image, but I'd rather not go that way.
Earlier I used this
imageRef.getData(maxSize: Int64, completion: (Data?, Error?) -> Void)
which can give you a UIImage to play around with. While I can get the image orientation this way, it's too slow for my (client's) taste.
Is it possible to determine image orientation from Firebase Storage info only?
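If downloading the encoded bytes is acceptable, one possible middle ground (my own suggestion, not part of Firebase's API) is to let ImageIO read the EXIF orientation tag from the data without decoding the pixels, which is far cheaper than constructing a full UIImage:

```swift
import Foundation
import ImageIO

// Reads the EXIF orientation tag (1-8) from encoded image data
// without decoding the pixel data itself.
func exifOrientation(of data: Data) -> CGImagePropertyOrientation? {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let raw = props[kCGImagePropertyOrientation] as? UInt32 else {
        return nil
    }
    return CGImagePropertyOrientation(rawValue: raw)
}
```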
I have an augmented reality app that plays a video on top of images. It was working very well, but as soon as I neared 50+ images, I started getting an error on some of the images: "AR reference image 'x' is too similar to 'y'." I am panicking because I need this done quickly, and this error appears at random for no apparent reason. In the linked picture, the reference images are clearly not similar in any way, and when I even change the name of one of the pictures, it resolves itself at first until more of the same errors come up on different reference images. If anyone has any theories or questions, please post them here! Thank you so much to anyone who can shine some light on this issue!
Image of AR Reference Image folder with error on pictures:
https://imgur.com/a/U3dlFef
Update: changed every image to be numbered 1-39, and the same images that had errors in the last picture still had errors, so it must be something related to the pictures themselves. Still confused how, though. Tried deleting every reference image and re-uploading the exact same images, and after giving each its physical dimensions, the same error popped up for 2 images.
Is it possible to upload this update to Apple with this error and have them let it through? I did a test build on my device, tried all the images that had errors, and they all work as intended. I currently have no solution to a problem that seems very superficial. Thanks again!
An ARReferenceImage is created from these images, and even though your human eyes see completely different pictures, the structures detected after parsing may still be too similar (detection looks at characteristics of the image, not exact pixels). This is why your problem appears as the number of images grows (there is a bigger chance of similar characteristics). So your best bet might be to use fewer images or, if that is not possible, change the images until all the warnings disappear.
If you don't use all of these images at the same time (in the same AR session), for example if you have some kind of selection in your application, you can use a simple asset catalog for the images and load the reference images from code. I use this method because our application downloads plain images for markers and I create the reference images programmatically.
private func startTracking(withMarkerImage marker: UIImage) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]

    // Map UIImage.Orientation to the CGImagePropertyOrientation that
    // ARReferenceImage expects (the raw values of the two enums differ,
    // so a direct rawValue conversion would be wrong).
    let orientation: CGImagePropertyOrientation
    switch marker.imageOrientation {
    case .up:            orientation = .up
    case .down:          orientation = .down
    case .left:          orientation = .left
    case .right:         orientation = .right
    case .upMirrored:    orientation = .upMirrored
    case .downMirrored:  orientation = .downMirrored
    case .leftMirrored:  orientation = .leftMirrored
    case .rightMirrored: orientation = .rightMirrored
    @unknown default:    orientation = .up
    }

    let referenceImage = ARReferenceImage(marker.cgImage!,
                                          orientation: orientation,
                                          physicalWidth: 0.15) // metres
    configuration.detectionImages = [referenceImage]
    configuration.environmentTexturing = .none
    configuration.isLightEstimationEnabled = true
    _arSession.run(configuration, options: [.removeExistingAnchors, .resetTracking])
}
I'm building a camera effects app and would like to be able to add geo tagging to the screenshots that I capture.
I'm grabbing camera images from an AVCaptureSession frame buffer and am displaying them to the user after some processing. The end result is a PNG screenshot. How can I take a CLLocation object and add it to a PNG image?
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
//how to add geotagging to this method?
[self performImageCaptureFrom:sampleBuffer];
}
Thank you!
You can follow this post to add the EXIF data to the PNG image: How to write exif metadata to an image (not the camera roll, just a UIImage or JPEG)
Note that the user must already have allowed location services for your app. If you are processing the image, or letting the user do so, it's easiest to add the metadata just before the final save: that way you avoid reopening the image (and doubling memory use), and the geotag can't be lost during whatever processing you apply.
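For illustration only (this is my own sketch, not the linked post's exact code), the general ImageIO approach is to re-write the image through a CGImageDestination while merging in a GPS dictionary. The helper name and the JPEG output are assumptions; JPEG is used because PNG has no standard EXIF container:

```swift
import CoreLocation
import Foundation
import ImageIO

// Hypothetical helper: re-writes `imageData` to `url` as a JPEG
// with a GPS EXIF block attached. Returns true on success.
func writeGeotagged(imageData: Data, location: CLLocation, to url: URL) -> Bool {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                            "public.jpeg" as CFString,
                                                            1, nil) else {
        return false
    }

    // EXIF GPS stores unsigned values plus hemisphere reference letters.
    let gps: [CFString: Any] = [
        kCGImagePropertyGPSLatitude: abs(location.coordinate.latitude),
        kCGImagePropertyGPSLatitudeRef: location.coordinate.latitude >= 0 ? "N" : "S",
        kCGImagePropertyGPSLongitude: abs(location.coordinate.longitude),
        kCGImagePropertyGPSLongitudeRef: location.coordinate.longitude >= 0 ? "E" : "W",
    ]

    // Copy the image across unchanged, merging in the GPS dictionary.
    CGImageDestinationAddImageFromSource(destination, source, 0,
                                         [kCGImagePropertyGPSDictionary: gps] as CFDictionary)
    return CGImageDestinationFinalize(destination)
}
```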
After post-producing the pictures taken with the iPhone camera and saving them to the camera roll, I see the final result is saved without the orientation information.
When the iPhone reads these pictures back, it reads orientation = "0", that is, "taken with the iPhone button on the right", and that messes everything up.
How do I save the orientation metadata on the picture going to the camera roll?
thanks for any help.
Depends on the file format:
If you save with UIImageJPEGRepresentation(), the orientation is saved correctly in the EXIF data. If you load a JPEG file with EXIF orientation data, you get a UIImage with its imageOrientation set and the original (unrotated) CGImage data.
If you save with UIImagePNGRepresentation(), the orientation is not saved in the EXIF data (or anywhere else). If you load a PNG with EXIF orientation data, it pre-rotates the CGImage data and always gives you a UIImage with an imageOrientation of "UIImageOrientationUp".
I figured this out from various posts and experimentation.
So displaying either PNG or JPEG UIImages works the same, but messing with the data is very different.
If I remember correctly, the only way was to render the processed image in the correct orientation yourself and then save it.
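A minimal sketch of that "render it upright, then save" approach (UIGraphicsImageRenderer honors imageOrientation when drawing, so the output has its pixels baked upright and survives a PNG round-trip):

```swift
import UIKit

// Redraws the image so the pixel data itself is upright; the result
// always has an imageOrientation of .up.
func normalizedImage(_ image: UIImage) -> UIImage {
    guard image.imageOrientation != .up else { return image }
    let renderer = UIGraphicsImageRenderer(size: image.size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: image.size))
    }
}
```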
When I use UIImagePickerController to select a photo, either from the Camera Roll or the Photo Library, the image that gets returned to me in the method
'didFinishPickingImage'
does not contain the EXIF data for latitude and longitude. I know the headers are there, because they show up when the photos are imported into iPhoto, and images uploaded straight from the Camera Roll contain the EXIF location headers as well. Is there a way to get UIImagePickerController to deliver that information too?
It's up to you to geotag the image received from the UIImagePickerController; see my answer here:
UIImagePickerController and extracting EXIF data from existing photos
It seems that there is not. The UIImagePickerController incorrectly strips off the location tags, but they are there in the file (which you are not allowed to access, as it's outside your sandbox).
See the discussion on Apple here: link text
And the comments on Flickr here: link text
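For what it's worth, on iOS 11 and later there is an alternative route: assuming the user has granted photo-library access, the picker's info dictionary exposes the underlying PHAsset, which carries the location even though the returned UIImage does not:

```swift
import Photos
import UIKit

// In your UIImagePickerControllerDelegate:
func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    // .phAsset is only populated when the app has photo-library
    // authorization; the returned UIImage itself still has no EXIF.
    if let asset = info[.phAsset] as? PHAsset, let location = asset.location {
        print("photo was taken at \(location.coordinate.latitude), \(location.coordinate.longitude)")
    }
    picker.dismiss(animated: true)
}
```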