Determine the image orientation of an image in Firebase Storage - Swift

I need to determine the orientation of the image to be displayed in an imageView. My app stores images in Firebase Storage and uses the SDWebImage integration in Firebase like so:
imageView.sd_setImage(with: imageRef)
I thought that the metadata would hold enough information to determine the image orientation, so I tried this:
imageRef.getMetadata(completion: (StorageMetadata?, Error?) -> Void)
While I'm able to retrieve the metadata, I can't seem to figure out the orientation from it. According to the docs (https://firebase.google.com/docs/storage/ios/file-metadata#file_metadata_properties), the image height and width cannot be obtained from the metadata.
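For reference, here is roughly what the metadata does give me (a sketch using the Swift Firebase SDK): the content type, byte size, and any customMetadata set at upload time, but nothing about pixel dimensions or orientation.
import FirebaseStorage

func logMetadata(for imageRef: StorageReference) {
    imageRef.getMetadata { metadata, error in
        guard let metadata = metadata else { return }
        // e.g. "image/jpeg", the size in bytes, and any custom key/value pairs
        print(metadata.contentType ?? "unknown content type")
        print(metadata.size)
        print(metadata.customMetadata ?? [:])
    }
}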
I know I could store image orientation info in my realtime database upon uploading the image, but I'd rather not go that way.
Earlier I used this
imageRef.getData(maxSize: Int64, completion: (Data?, Error?) -> Void)
which can give you a UIImage to play around with. While I can get the image orientation this way, it's too slow for my (client's) taste.
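For completeness, the getData route looks roughly like this (a sketch: it still downloads the bytes, so it does not answer the "Storage info only" part, but it reads the EXIF orientation tag with ImageIO instead of decoding a full UIImage):
import FirebaseStorage
import ImageIO

func fetchOrientation(of imageRef: StorageReference,
                      completion: @escaping (CGImagePropertyOrientation?) -> Void) {
    imageRef.getData(maxSize: 10 * 1024 * 1024) { data, error in
        guard
            let data = data,
            let source = CGImageSourceCreateWithData(data as CFData, nil),
            let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
            let rawValue = props[kCGImagePropertyOrientation] as? UInt32
        else {
            completion(nil)
            return
        }
        // 1 = up, 3 = down, 6/8 = rotated 90 degrees, etc. (TIFF/EXIF orientation values)
        completion(CGImagePropertyOrientation(rawValue: rawValue))
    }
}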
Is it possible to determine image orientation from Firebase Storage info only?

Related

Loading a GIF image using Kingfisher in Swift causes laggy scrolling in a UITableView

I have a table view, and each cell loads a GIF file from a server.
I'm using Kingfisher to load the GIF file into the cell's image view like this:
cell.ivPost?.kf.setImage(with: url)
The GIF is loaded successfully; however, the table view is very laggy when scrolling down. I thought that loading, encoding, and decoding the GIF file would be done asynchronously by Kingfisher.
Does anyone have a solution for this?
I'm using Xcode 8 and Swift 3.
Use this method:
public func setImage(with resource: Resource?, placeholder: Image? = default, options: KingfisherOptionsInfo? = default, progressBlock: Kingfisher.DownloadProgressBlock? = default, completionHandler: Kingfisher.CompletionHandler? = default) -> Kingfisher.RetrieveImageTask
You can pass options of type KingfisherOptionsInfo, which is an array of KingfisherOptionsInfoItem. The relevant case here is:
/// Decode the image in background thread before using.
case backgroundDecode
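For example (a sketch; PostCell and ivPost stand in for the cell and outlet from the question):
import Kingfisher
import UIKit

class PostCell: UITableViewCell {
    @IBOutlet weak var ivPost: UIImageView?
}

func configure(_ cell: PostCell, with url: URL) {
    // .backgroundDecode decodes the downloaded image off the main thread
    // before it is handed to the image view, which reduces scroll hitching.
    cell.ivPost?.kf.setImage(with: url, options: [.backgroundDecode])
}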
I ended up using SDWebImage instead, and it works smoothly with SDAnimatedImageView().

Getting iOS 8 UIImage Orientation Property

I am loading photos from the user's photo roll.
PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: size, contentMode: .AspectFill, options: options) { result, info in
    // get orientation???
    var image: iImage = iImage(uiimage: result)
    self.selectedPhotoUpdateCallback(editImage, image)
}
However, sometimes the photos I load are upside down.
I cannot seem to find any information in the metadata about the orientation of these images.
Do you know how I can check the orientation of the requested asset from a PHImageManager?
The picture displays upright when I am selecting it from the photo roll, but when I load the full image into another view it's rotated (and my code does not apply any rotation).
You can get the image orientation via image.imageOrientation. Also you can check out this link.
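A small sketch of that check, written against the current Photos API names (asset, size, and options stand in for the values used in the question):
import Photos
import UIKit

func checkOrientation(of asset: PHAsset, size: CGSize, options: PHImageRequestOptions?) {
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: size,
                                          contentMode: .aspectFill,
                                          options: options) { result, _ in
        guard let image = result else { return }
        // .up means no rotation is needed; .right, .left, .down (and the
        // mirrored variants) tell you how the pixel data must be rotated.
        print("orientation:", image.imageOrientation.rawValue)
    }
}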

Saving UIImage to plist problems (SWIFT)

Simply stated,
I can encode a UIImage into a .plist if that image is selected from a UIImagePickerController (camera or photo library) and then stored into an object's instance variable using NSKeyedArchiver...
func imagePickerController(picker: UIImagePickerController!, didFinishPickingImage image: UIImage!, editingInfo: NSDictionary!) {
    let selectedImage: UIImage = image
    myObject.image = selectedImage
    ...
}
I can NOT, however, encode a UIImage into a .plist if that image is one existing in my app bundle and assigned to a variable like this...
myObject.image = UIImage(named: "thePNGImage")
...where thePNGImage.png lives in my app's bundle. I can display it anytime in my app, but I just can't store it and recover it in a plist!
I want a user to select a profile image from his or her camera or photo library, and assign a default image from my app bundle to their profile should they not choose to select one of their own. The latter is giving me issues.
Any help would be greatly appreciated.
Thanks!
First of all, you don't need any plist. If you want to save the user's preferences, use NSUserDefaults (which is a plist, but it is maintained for you).
Second, you should not be saving an image into a plist. Save a reference to an image, e.g. its URL. That way when you need it again you can find it again. An image is huge; a URL is tiny.
Finally, from the question as I understand it, you want to know whether the user has chosen an image and, if not, you want to use yours as a default. The NSUserDefault could contain a Bool value for this, or you could just use the lack of an image URL to mean "use the default".
But in any case you are certainly right that it is up to you to recover state the next time the app launches based on whatever information you have previously saved. Nothing is going to happen magically by itself. If your image is not magically coming back the next time you launch, that is because you are not bringing it back. Your app has to have code / logic to do that as it launches and creates the interface.
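A minimal sketch of that idea in current Swift naming ("profileImageFile", "profile.png", and "thePNGImage" are placeholder names):
import UIKit

private var documentsURL: URL {
    return FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
}

func saveProfileImage(_ image: UIImage) {
    guard let data = image.pngData() else { return }
    let fileURL = documentsURL.appendingPathComponent("profile.png")
    try? data.write(to: fileURL)
    // Remember only the file name; the image itself stays on disk.
    UserDefaults.standard.set("profile.png", forKey: "profileImageFile")
}

func currentProfileImage() -> UIImage? {
    if let name = UserDefaults.standard.string(forKey: "profileImageFile"),
       let saved = UIImage(contentsOfFile: documentsURL.appendingPathComponent(name).path) {
        return saved
    }
    // No saved reference yet: fall back to the bundled default.
    return UIImage(named: "thePNGImage")
}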
Still having the issue? BTW, I've confirmed your issue. Here's a workaround:
func imageFromXcassets(name: String) -> UIImage {
    // Load image from Xcassets
    let myImage = UIImage(named: name)!
    // Write image to file
    let imagePath = NSHomeDirectory().stringByAppendingPathComponent("Documents/test.png")
    UIImagePNGRepresentation(myImage).writeToFile(imagePath, atomically: true)
    // Get image from file and return
    return UIImage(contentsOfFile: imagePath)!
}
Given the name of an image stored in Images.xcassets, the method returns the UIImage after first writing the image to a file, then reading the image from the file. Now the image can be written successfully to NSUserDefaults just like images obtained from your imagePickerController.
Please note that the example isn't handling optionals like it should. (I'm an optimistic guy.)
ADDITION BELOW
Note on Leonardo’s response:
The originating question asks how to encode an image obtained from Images.xcassets via UIImage(named:). I don’t believe Leonardo’s response provides a solution for this.

iPhone Image Loading in Photos Application

I am not sure if this is the right forum for asking this question, but I googled "questions related to iPhone" and the first page had SO from top to bottom. So here goes.
When I open the Photos application on iPhone (3GS, 4 and 5, all running iOS 5.0) and open an image, first a blurred image appears for a fraction of a second, which then clears up into the actual picture. My question is: does the Photos application keep a low-resolution copy of the high-res images which it displays while the image is being loaded, OR does it generate a low-res image on the fly before going on to load the high-res image?
I am writing an application to browse through the photos, and need to know which is the best approach. That is the purpose behind this question.
The best way is to use the ALAsset thumbnail. If you have concerns about image clarity, then go with ALAsset's fullResolutionImage.
Here are some details that you should read before starting to develop a photos application.
An instance of ALAssetsLibrary provides access to the videos and photos that are under the control of the Photos application.
An ALAsset object represents a photo or a video managed by the Photos application.
ALAsset provides different methods for accessing representations:
1. thumbnail
Returns a thumbnail representation of the asset.
- (CGImageRef)thumbnail
2. aspectRatioThumbnail
Returns an aspect ratio thumbnail of the asset.
- (CGImageRef)aspectRatioThumbnail
3. defaultRepresentation
Returns an asset representation object for the default representation.
- (ALAssetRepresentation *)defaultRepresentation
4. representationForUTI:
Returns an asset representation object for a given representation UTI.
- (ALAssetRepresentation *)representationForUTI:(NSString *)representationUTI
An ALAssetRepresentation object encapsulates one of the representations of a given ALAsset object.
1. CGImageWithOptions:
Returns a full-resolution CGImage of the representation.
- (CGImageRef)CGImageWithOptions:(NSDictionary *)options
2. fullResolutionImage
Returns a CGImage representation of the asset.
- (CGImageRef)fullResolutionImage
3. fullScreenImage
Returns a CGImage of the representation that is appropriate for displaying full screen.
- (CGImageRef)fullScreenImage
Sample Code
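ALAssetsLibrary has since been deprecated; as a rough sketch of the same "cheap image first, sharp image later" idea, the Photos framework's opportunistic delivery mode hands you a degraded preview before the full-quality image (names here are placeholders):
import Photos
import UIKit

func display(_ asset: PHAsset, in imageView: UIImageView, targetSize: CGSize) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .opportunistic   // degraded image first, then final
    options.isNetworkAccessAllowed = true   // allow iCloud downloads

    PHImageManager.default().requestImage(for: asset,
                                          targetSize: targetSize,
                                          contentMode: .aspectFill,
                                          options: options) { image, info in
        guard let image = image else { return }
        // The handler can fire twice: first with a blurry preview,
        // then with the full-quality image, much like the Photos app.
        let degraded = (info?[PHImageResultIsDegradedKey] as? Bool) ?? false
        print(degraded ? "showing fast preview" : "showing final image")
        imageView.image = image
    }
}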

How do I create an AVAsset with a UIImage captured from a camera?

I am a newbie trying to capture camera video images using AVFoundation and want to render the captured frames without using AVCaptureVideoPreviewLayer. I want a slider control to be able to slow down or speed up the rate of display of camera images.
Using other people's code as examples, I can capture images and, using an NSTimer with my slider control, define on the fly how often to display them, but I can't convert the image to something I can display. I want to move these images into a UIView or UIImageView and render them in the timer fire function.
I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer frame rate, but that can't be done on the fly.)
I have looked at the AVFoundation programming guide, which talks about AVAssets and AVPlayer, etc., but I can't see how a camera image can be turned into an AVAsset. When I look at the AVFoundation guide, and other demos which show how to define an AVAsset, it only gives me the choice of using HTTP stream data to create the asset, or a URL to define an asset using an existing file. I can't figure out how to make my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems and AVAssetTracks to show the image, with an observeValueForKeyPath function checking status and doing [myPlayer play]. (I also studied the WWDC session 405, "Exploring AV Foundation", to see how that is done.)
I have tried code similar to that in the WWDC session 409, "Using the Camera on iPhone." Like that myCone demo, I can set up the device, the input, the capture session, the output, and the callback function receiving a CMSampleBuffer, and I can collect UIImages and size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer). This wasn't explained, and I guess it's just assuming knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to stuff the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?
I have looked at UIImagePickerController as an alternate method of controlling how often I display captured camera images, and I don't see that I can change the display timing on the fly using that controller either.
So, as you can see, I am learning this stuff with the Apple developer forum and their documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of doing camera-to-screen without using AVCaptureVideoPreviewLayer, UIImagePickerController, or an AVAsset that isn't already a file or HTTP stream.
Can anybody make a suggestion? Thanks in advance.
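As a rough sketch of the sample-buffer-to-UIImageView step described above, using the current Swift AVFoundation API (FrameRenderer and previewImageView are hypothetical names):
import AVFoundation
import CoreImage
import UIKit

class FrameRenderer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    weak var previewImageView: UIImageView?
    private let ciContext = CIContext()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Convert the camera frame: CMSampleBuffer -> CVPixelBuffer -> CIImage -> CGImage -> UIImage.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        let uiImage = UIImage(cgImage: cgImage)

        // UIKit must be updated on the main thread; no AVPlayerLayer is needed
        // just to show a frame in a UIImageView.
        DispatchQueue.main.async {
            self.previewImageView?.image = uiImage
        }
    }
}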