Orientation does not behave correctly with photo in ALAsset - iPhone

I currently have an app that uses ALAssetsLibrary to fetch the photos. I place each photo into an image view and I am able to upload it to the server. When I tested on a real device after taking some photos, I found that photos that were supposed to be portrait come out as landscape.
At first, I used this:
UIImage *image = [UIImage imageWithCGImage:[representation fullResolutionImage]];
Then I called a different method to get the CGImage, passing the scale and orientation, like this:
UIImage *image = [UIImage imageWithCGImage:[representation fullResolutionImage] scale:1.0 orientation:(UIImageOrientation)[representation orientation]];
I thought the version with scale and orientation would give me the orientation the photo was taken in, but it didn't.
Am I missing anything needed to get the photo's correct orientation?

The correct orientation handling depends on the iOS version you are targeting.
On iOS 4 and iOS 5 the thumbnail is already correctly rotated, so you can initialize your UIImage without specifying any rotation parameters.
For fullScreenImage, however, the behaviour differs between versions: on iOS 5 the image is already rotated, on iOS 4 it is not.
So on iOS 4 you should use:
ALAssetRepresentation *defaultRep = [asset defaultRepresentation];
UIImage *_image = [UIImage imageWithCGImage:[defaultRep fullScreenImage]
                                      scale:[defaultRep scale]
                                orientation:(UIImageOrientation)[defaultRep orientation]];
On iOS 5 the following code should work correctly:
ALAssetRepresentation *defaultRep = [asset defaultRepresentation];
UIImage *_image = [UIImage imageWithCGImage:[defaultRep fullScreenImage]
                                      scale:[defaultRep scale]
                                orientation:UIImageOrientationUp];
Cheers,
Hendrik
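If you need to support both iOS 4 and iOS 5 in one code path, a minimal sketch of a runtime branch might look like the following (the systemVersion comparison is just one common way to branch and is my own addition, not part of ALAssetsLibrary):
ALAssetRepresentation *defaultRep = [asset defaultRepresentation];
UIImage *_image;
if ([[[UIDevice currentDevice] systemVersion] compare:@"5.0" options:NSNumericSearch] != NSOrderedAscending) {
    // iOS 5 and later: fullScreenImage is already rotated.
    _image = [UIImage imageWithCGImage:[defaultRep fullScreenImage]
                                 scale:[defaultRep scale]
                           orientation:UIImageOrientationUp];
} else {
    // iOS 4: apply the asset's orientation ourselves.
    _image = [UIImage imageWithCGImage:[defaultRep fullScreenImage]
                                 scale:[defaultRep scale]
                           orientation:(UIImageOrientation)[defaultRep orientation]];
}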

Try this code:
UIImage* img = [UIImage imageWithCGImage:asset.thumbnail];
img = [UIImage imageWithCGImage:img.CGImage scale:1.0 orientation:UIImageOrientationUp];
This may help you.
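If the square thumbnail is too small or too tightly cropped for your layout, ALAsset also offers aspectRatioThumbnail (iOS 5 and later), which as far as I know is likewise returned already rotated; a one-line sketch:
UIImage *img = [UIImage imageWithCGImage:[asset aspectRatioThumbnail]];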

My experience is limited to iOS 5.x, but I can tell you that the thumbnail and fullScreenImage are oriented properly. It's the fullResolutionImage that comes out horizontal when shot vertically. My solution is to use a category on UIImage that I got from here:
http://www.catamount.com/forums/viewtopic.php?f=21&t=967&start=0
It provides a nice rotation method on UIImage, used like this:
UIImage *tmp = [UIImage imageWithCGImage:startingFullResolutionImage];
startingFullResolutionImage = [[tmp imageRotatedByDegrees:-90.0f] CGImage];
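In case that link ever goes away, here is a minimal sketch of what such a rotation category might look like (a reconstruction from memory under that method name, not the code from that thread):
@interface UIImage (Rotation)
- (UIImage *)imageRotatedByDegrees:(CGFloat)degrees;
@end

@implementation UIImage (Rotation)
- (UIImage *)imageRotatedByDegrees:(CGFloat)degrees
{
    CGFloat radians = degrees * M_PI / 180.0f;
    // Bounding box of the rotated image.
    CGRect rotatedRect = CGRectApplyAffineTransform(CGRectMake(0, 0, self.size.width, self.size.height),
                                                    CGAffineTransformMakeRotation(radians));
    CGSize rotatedSize = rotatedRect.size;

    UIGraphicsBeginImageContext(rotatedSize);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Rotate around the center of the new canvas.
    CGContextTranslateCTM(context, rotatedSize.width / 2, rotatedSize.height / 2);
    CGContextRotateCTM(context, radians);
    // CGContextDrawImage draws flipped relative to UIKit, so flip the y axis back.
    CGContextScaleCTM(context, 1.0, -1.0);
    CGContextDrawImage(context,
                       CGRectMake(-self.size.width / 2, -self.size.height / 2, self.size.width, self.size.height),
                       self.CGImage);
    UIImage *rotatedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return rotatedImage;
}
@end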

For fullResolutionImage, I'd like to provide a solution as follows,
ALAssetRepresentation *rep = [asset defaultRepresentation];
// First, attach the asset's orientation to the UIImage (this is what the EXIF orientation flag conveys).
UIImage *image = [UIImage imageWithCGImage:[rep fullResolutionImage] scale:rep.scale orientation:(UIImageOrientation)rep.orientation];
// Second, bake the orientation into the pixels and drop the flag by redrawing.
if (image.imageOrientation != UIImageOrientationUp) {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:(CGRect){CGPointZero, image.size}];
    UIImage *normalizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    image = normalizedImage;
}
// Third, compress to JPEG.
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
imageData is what you want; just upload it to your photo server.
By the way, if you think the EXIF data is useful, you can add it back to normalizedImage as you wish.
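For that last point, a hedged sketch of carrying the asset's metadata over with ImageIO (this assumes ARC for the __bridge casts, and that the representation's metadata dictionary is what you want to re-attach):
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Re-attach the representation's EXIF/TIFF/GPS dictionaries to the normalized JPEG.
NSDictionary *metadata = rep.metadata;
NSMutableData *outputData = [NSMutableData data];
CGImageDestinationRef dest = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)outputData,
                                                              kUTTypeJPEG, 1, NULL);
if (dest) {
    CGImageDestinationAddImage(dest, image.CGImage, (__bridge CFDictionaryRef)metadata);
    CGImageDestinationFinalize(dest);
    CFRelease(dest);
    imageData = outputData; // upload this instead of the plain JPEG data
}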

Related

Rotating portrait UIImage that somehow has a UIImageOrientationRight orientation

Hi everyone,
I'm having an issue rotating a portrait UIImage (i.e. WIDTH < HEIGHT) that somehow has an orientation equal to UIImageOrientationRight. I'm not sure how this comes to be but it happens with a few images in my library captured with an iPhone 4.
This is my code (that does work but only if the orientation is equal to UIImageOrientationUp):
UIGraphicsBeginImageContextWithOptions(rotatedScaledRect.size, NO, 0.0);
CGContextRef myContext = UIGraphicsGetCurrentContext();
CGContextSetInterpolationQuality(myContext, kCGInterpolationHigh);
CGContextTranslateCTM(myContext,
(rotatedScaledRect.size.width/2),
(rotatedScaledRect.size.height/2));
CGContextConcatCTM(myContext, transformation);
CGContextScaleCTM(myContext, 1.0, -1.0);
CGContextDrawImage(myContext, CGRectMake(-roundf(newImage.size.width / 2), -roundf(newImage.size.height / 2), newImage.size.width, newImage.size.height), [newImage CGImage]);
newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
I need to apply CGContextConcatCTM because I have several concatenated transformations to the image.
I've tried several different approaches to no avail.
Thanks in advance for your help.
First of all, excuse my English; it's not my native language. I hope it's not too late to give this answer!
I had a similar problem loading photos from the library taken with an iPhone or iPod camera because of this. If that's your case, you can find some info here: https://discussions.apple.com/thread/2495567?start=30&tstart=0, specifically an answer that says "Apple chose to use the Orientation flag in the EXIF data to inform the program displaying the image as to how to rotate it so the image is presented correctly."
So, to load a photo taken with the iPhone camera from the library, you have to do this:
ALAssetRepresentation *assetRep = [photo defaultRepresentation];
CGImageRef imageRef = [assetRep fullResolutionImage];
UIImage *image = [UIImage imageWithCGImage:imageRef scale:[assetRep scale] orientation:(UIImageOrientation)[assetRep orientation]];
where photo is an ALAsset object taken from library.
Well, I hope this helps!
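If the orientation still gets in the way of the CGContextConcatCTM drawing from the question, another hedged option is to first redraw the image into a UIImageOrientationUp copy (UIKit's drawInRect: applies imageOrientation for you) and only then apply your own transforms. A minimal sketch, where newImage stands for the image feeding the transform code above:
UIImage *normalized = newImage;
if (newImage.imageOrientation != UIImageOrientationUp) {
    UIGraphicsBeginImageContextWithOptions(newImage.size, NO, newImage.scale);
    [newImage drawInRect:CGRectMake(0, 0, newImage.size.width, newImage.size.height)];
    normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
// Feed `normalized` (now UIImageOrientationUp) into the CGContextConcatCTM code.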

Is there a way to create a UIImage without orientation?

I am about to crop images, but I'm facing an orientation issue when creating the image using CGImageCreateWithImageInRect.
CGImageCreateWithImageInRect crops the image based on the UIImage's orientation, so I cannot get the right images that I want.
I want the plain image from the UIImage (possibly a camera image) without any orientation metadata.
Is there a way to achieve this?
EDIT:
I have a selected rect of the UIImage. If I apply the crop to the UIImage, it gives a different output with some other orientation.
Try this:
CGImageRef imageRef = CGImageCreateWithImageInRect(src.CGImage, croppingRect);
UIImage *result = [UIImage imageWithCGImage:imageRef scale:1.0f orientation:src.imageOrientation];
CGImageRelease(imageRef); // the CGImage comes from a Create function, so it must be released
This could help someone!
- (UIImage *)removeImageOrientation:(UIImage *)imgTakenByUser
{
    UIGraphicsBeginImageContext(imgTakenByUser.size);
    [imgTakenByUser drawInRect:CGRectMake(0, 0, imgTakenByUser.size.width, imgTakenByUser.size.height)];
    UIImage *_image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return _image;
}
Thanks!
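One caveat with removeImageOrientation: above: UIGraphicsBeginImageContext always uses a scale of 1.0, so a Retina-scale image will come back at half its pixel size. If that matters, a hedged variant that preserves the scale would be:
- (UIImage *)removeImageOrientation:(UIImage *)imgTakenByUser
{
    // Preserve the original scale; pass YES for opaque if the image has no alpha.
    UIGraphicsBeginImageContextWithOptions(imgTakenByUser.size, NO, imgTakenByUser.scale);
    [imgTakenByUser drawInRect:CGRectMake(0, 0, imgTakenByUser.size.width, imgTakenByUser.size.height)];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}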

Resizing an ALAsset photo takes a long time. Any way around this?

I have a blog application that I'm making. To compose a new entry, there is a "Compose Entry" view where the user can select a photo and input text. For the photo, there is a UIImageView placeholder and upon clicking this, a custom ImagePicker comes up where the user can select up to 3 photos.
This is where the problem comes in. I don't need the full resolution photo from the ALAsset, but at the same time, the thumbnail is too low resolution for me to use.
So what I'm doing at this point is resizing the full-resolution photos to a smaller size. However, this takes some time, especially when resizing up to 3 photos.
Here is a code snippet showing what I'm doing:
ALAssetRepresentation *rep = [[dict objectForKey:@"assetObject"] defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref)
{
    CGRect screenBounds = [[UIScreen mainScreen] bounds];
    UIImage *previewImage;
    UIImage *largeImage;

    if ([rep orientation] == ALAssetOrientationUp) // landscape image
    {
        largeImage = [[UIImage imageWithCGImage:iref] scaledToWidth:screenBounds.size.width];
        previewImage = [[UIImage imageWithCGImage:iref] scaledToWidth:300];
    }
    else // portrait image
    {
        previewImage = [[[UIImage imageWithCGImage:iref] scaledToHeight:300] imageRotatedByDegrees:90];
        largeImage = [[[UIImage imageWithCGImage:iref] scaledToHeight:screenBounds.size.height] imageRotatedByDegrees:90];
    }
}
Here, from the full-resolution image, I create two images: a preview image (max 300px on the long end) and a large image (max 960px or 640px on the long end). The preview image is what is shown in the app itself in the "new entry" preview. The large image is what will be used when uploading to the server.
The actual resizing code, which I grabbed from somewhere on here, is:
- (UIImage *)scaledToWidth:(float)i_width
{
    float oldWidth = self.size.width;
    float scaleFactor = i_width / oldWidth;

    float newHeight = self.size.height * scaleFactor;
    float newWidth = oldWidth * scaleFactor;

    UIGraphicsBeginImageContext(CGSizeMake(newWidth, newHeight));
    [self drawInRect:CGRectMake(0, 0, newWidth, newHeight)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
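(Side note: the snippets above also call scaledToHeight: and imageRotatedByDegrees:, which aren't quoted here; scaledToHeight: is presumably just the analogue of scaledToWidth:, along these lines — my reconstruction, not the original category:)
- (UIImage *)scaledToHeight:(float)i_height
{
    float oldHeight = self.size.height;
    float scaleFactor = i_height / oldHeight;

    float newWidth = self.size.width * scaleFactor;
    float newHeight = oldHeight * scaleFactor;

    UIGraphicsBeginImageContext(CGSizeMake(newWidth, newHeight));
    [self drawInRect:CGRectMake(0, 0, newWidth, newHeight)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}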
Am I doing things wrong here? As it stands, the ALAsset thumbnail is too low quality, and at the same time, I don't need the entire full resolution. It's all working now, but the resizing takes some time. Is this just a necessary consequence?
Thanks!
It is a necessary consequence of resizing your image that it will take some amount of time. How much depends on the device, the resolution of the asset and the format of the asset. But you don't have any control over that. But you do have control over where the resizing takes place. I suspect that right now you are resizing the image in your main thread, which will cause the UI to grind to a halt while you are doing the resizing. Do enough images, and your app will appear hung for long enough that the user will just go off and do something else (perhaps check out competing apps in the App Store).
What you should be doing is performing the resizing off the main thread. With iOS 4 and later, this has become much simpler because you can use Grand Central Dispatch to do the resizing. You can take your original block of code from above and wrap it in a block like this:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
    ALAssetRepresentation *rep = [[dict objectForKey:@"assetObject"] defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    if (iref)
    {
        CGRect screenBounds = [[UIScreen mainScreen] bounds];
        __block UIImage *previewImage;
        __block UIImage *largeImage;

        if ([rep orientation] == ALAssetOrientationUp) // landscape image
        {
            largeImage = [[UIImage imageWithCGImage:iref] scaledToWidth:screenBounds.size.width];
            previewImage = [[UIImage imageWithCGImage:iref] scaledToWidth:300];
        }
        else // portrait image
        {
            previewImage = [[[UIImage imageWithCGImage:iref] scaledToHeight:300] imageRotatedByDegrees:90];
            largeImage = [[[UIImage imageWithCGImage:iref] scaledToHeight:screenBounds.size.height] imageRotatedByDegrees:90];
        }

        dispatch_async(dispatch_get_main_queue(), ^{
            // Do whatever you need to do on the main thread here once your images are resized.
            // This is going to be things like setting the UIImageViews to show your new images
            // or adding new views to your view hierarchy.
        });
    }
});
You'll have to think about things a little differently this way. For example, you've now broken what used to be a single step into multiple steps. Code that ran after this will now run before the image resize is complete, or before you actually do anything with the images, so you need to make sure you don't have any dependencies on those images or you'll likely crash.
A late answer, but for those stumbling on this question, you might want to consider using the fullScreenImage rather than the fullResolutionImage of the defaultRepresentation. It's usually much smaller, but still large enough to maintain good quality for larger thumbnails.
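Building on that suggestion, a hedged sketch of the earlier resize snippet starting from fullScreenImage instead (on iOS 5 fullScreenImage is already rotated, so the orientation branch can usually be dropped):
ALAssetRepresentation *rep = [[dict objectForKey:@"assetObject"] defaultRepresentation];
CGImageRef iref = [rep fullScreenImage]; // screen-sized and already rotated on iOS 5
if (iref)
{
    UIImage *base = [UIImage imageWithCGImage:iref];
    // Scale so the longer side ends up at 300px for the preview.
    UIImage *previewImage = (base.size.width >= base.size.height)
        ? [base scaledToWidth:300]
        : [base scaledToHeight:300];
    // fullScreenImage is already roughly screen-sized, so it can often serve
    // directly as the "large" upload image.
    UIImage *largeImage = base;
}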

Scaling issue in saving UIImage as NSData

I would like to save an image as NSData into a plist and retrieve it later.
But I have a problem.
The problem is that if the UIImage has a scale of 2.0, when I load it again later with
[UIImage imageWithData:]
the image shows at 2x size.
What I want is behaviour like
[UIImage imageNamed:]
which loads the image according to the screen scale.
How can I do it?
I finally solved it with this code:
UIImage *image = [UIImage imageWithData:imageData];
if (isRetinaDisplay) {
    image = [UIImage imageWithCGImage:[image CGImage] scale:2.0f orientation:UIImageOrientationUp];
}
You could take into account the possibility that the screen is Retina like so:
CGFloat screenScale = [UIScreen mainScreen].scale;
UIImage *image = [UIImage imageWithData:data scale:screenScale];
This code covers both cases (Retina / non-Retina) in one line.
Since iOS 6 you can also use
+ (UIImage *)imageWithData:(NSData *)data scale:(CGFloat)scale NS_AVAILABLE_IOS(6_0);
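On iOS 5 and earlier, where imageWithData:scale: is not available, a hedged equivalent is to go through the CGImage:
CGFloat screenScale = [UIScreen mainScreen].scale;
UIImage *raw = [UIImage imageWithData:data];
UIImage *image = [UIImage imageWithCGImage:raw.CGImage
                                     scale:screenScale
                               orientation:raw.imageOrientation];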
The scale property is read-only, but what you can do is subclass UIImageView to make it writable (or make it automatic according to what device it's running on).
// ImageView.h
@interface ImageView : UIImageView
@end

// ImageView.m
CGFloat scaleProperty = 1.0;

@implementation ImageView

- (void)setScale:(CGFloat)scale
{
    scaleProperty = scale;
}

- (CGFloat)scale
{
    return scaleProperty;
}

@end

Rotating an image prior to saving in iOS

In my app, I need to save an image. I need the image to always be saved as a portrait, even if the device is in landscape mode. I am checking to see if the device is in landscape mode and if it is, I would like to rotate my image before it's saved as a PNG. Can anyone help me figure this out?
- (void)saveImage {
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext(); // balance the context begun above

    if (UIInterfaceOrientationIsLandscape([[UIDevice currentDevice] orientation])) {
        //// need to rotate it here
    }

    NSData *data = UIImagePNGRepresentation(viewImage);
    [data writeToFile:savePath atomically:YES];
}
This thread may help you. It shows how to use the imageOrientation property of a UIImage in order to switch the orientation. Hope that helps!
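For the "//// need to rotate it here" part, one hedged option is to redraw viewImage rotated by a quarter turn before producing the PNG (the sign of the angle depends on which landscape orientation the device is in):
CGSize rotatedSize = CGSizeMake(viewImage.size.height, viewImage.size.width);
UIGraphicsBeginImageContextWithOptions(rotatedSize, NO, viewImage.scale);
CGContextRef ctx = UIGraphicsGetCurrentContext();
// Rotate around the center of the new, swapped-dimension canvas.
CGContextTranslateCTM(ctx, rotatedSize.width / 2, rotatedSize.height / 2);
CGContextRotateCTM(ctx, M_PI_2); // use -M_PI_2 for the other landscape direction
[viewImage drawInRect:CGRectMake(-viewImage.size.width / 2,
                                 -viewImage.size.height / 2,
                                 viewImage.size.width,
                                 viewImage.size.height)];
viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();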