MFMailComposeViewController image orientation - iPhone

I'm developing a Universal app and I'm coding for iOS6.
I'm using the imagePickerController to take a photo and then I am sending it as an attachment using MFMailComposeViewController. All of that is working.
My problem is that when I shoot a picture in portrait mode, it is displayed by the MFMailComposeViewController in landscape mode. Also, when it arrives at the destination E-Mail address, it is displayed in landscape mode.
If I shoot the picture in landscape mode, it is displayed by the MFMailComposeViewController in landscape mode and when it arrives at the destination E-Mail address, it is displayed in landscape mode. So that's all OK.
I have the same issue on both of my test devices: an iPhone 5 and an iPad 2.
How can I make a picture shot in portrait mode arrive at the E-Mail destination in portrait mode?
Here's how I am adding the image to the E-Mail:
if ( [MFMailComposeViewController canSendMail] )
{
MFMailComposeViewController * mailVC = [MFMailComposeViewController new];
NSArray * aAddr = [NSArray arrayWithObjects: gAddr, nil];
NSData * imageAsNSData = UIImagePNGRepresentation( gImag );
[mailVC setMailComposeDelegate: self];
[mailVC setToRecipients: aAddr];
[mailVC setSubject: gSubj];
[mailVC addAttachmentData: imageAsNSData
mimeType: #"image/png"
fileName: #"myPhoto.png"];
[mailVC setMessageBody: #"Blah blah"
isHTML: NO];
[self presentViewController: mailVC
animated: YES
completion: nil];
}
else
{
NSLog( #"Device is unable to send email in its current state." );
}

I've spent a number of hours working on this issue and I am now clear about what's going on and how to fix it.
To repeat, the problem I ran into is:
When I shoot an image in portrait mode on the camera using the imagePickerController and pass that image to the MFMailComposeViewController and e-mail it, it arrives at the destination E-Mail address and is displayed there incorrectly in landscape mode.
However, if I shoot the picture in landscape mode and then send it, it is displayed at the E-Mail's destination correctly in landscape mode.
So, how can I make a picture shot in portrait mode arrive at the E-Mail destination in portrait mode? That was my original question.
Here's the code, as I showed it in the original question, except that I am now sending the image as a JPEG rather than a PNG but this has made no difference.
This is how I'm capturing the image with the imagePickerController and placing it into a global called gImag:
gImag = (UIImage *)[info valueForKey: UIImagePickerControllerOriginalImage];
[[self imageView] setImage: gImag]; // send image to screen
[self imagePickerControllerRelease: picker ]; // free the picker object
And this is how I E-Mail it using the MFMailComposeViewController:
if ( [MFMailComposeViewController canSendMail] )
{
MFMailComposeViewController * mailVC = [MFMailComposeViewController new];
NSArray * aAddr = [NSArray arrayWithObjects: gAddr, nil];
NSData * imageAsNSData = UIImageJPEGRepresentation( gImag, 0.9f );
[mailVC setMailComposeDelegate: self];
[mailVC setToRecipients: aAddr];
[mailVC setSubject: gSubj];
[mailVC addAttachmentData: imageAsNSData
mimeType: #"image/jpg"
fileName: #"myPhoto.jpg"];
[mailVC setMessageBody: #"Blah blah"
isHTML: NO];
[self presentViewController: mailVC
animated: YES
completion: nil];
}
else NSLog( #"Device is unable to send email in its current state." );
When I describe how to solve this, I’m going to focus on working with an iPhone 5 and using its main camera which shoots at 3264x2448 just to keep things simple. This problem does, however, affect other devices and resolutions.
The key to unlocking this problem is to realize that when you shoot an image, in either portrait or landscape, the iPhone always stores the UIImage the same way: as 3264 wide and 2448 high.
A UIImage has a property which describes its orientation at the time the image was captured and you can get it like this:
UIImageOrientation orient = image.imageOrientation;
Note that the orientation property does not describe how the data in the UIImage is physically configured (as 3264w x 2448h); it only describes its orientation at the time the image was captured.
The property’s use is to tell software, which is about to display the image, how to rotate it so it will appear correctly.
If you capture an image in portrait mode, image.imageOrientation will return UIImageOrientationRight. This tells displaying software that it needs to rotate the image 90 degrees so it will display correctly. Be clear that this ‘rotation’ does not affect the underlying storage of the UIImage, which remains 3264w x 2448h.
If you capture an image in landscape mode, image.imageOrientation will return UIImageOrientationUp. UIImageOrientationUp tells displaying software that the image is fine to display as it is; no rotation is necessary. Again, the underlying storage of the UIImage is 3264w x 2448h.
Once you are clear about the difference between how the data is physically stored vs. how the orientation property is used to describe its orientation at the time it was captured, things begin to get clearer.
I created a few lines of debugging code to ‘see’ all of this.
Here’s the imagePickerController code again with the debugging code added:
gImag = (PIMG)[info valueForKey: UIImagePickerControllerOriginalImage];
UIImageOrientation orient = gImag.imageOrientation;
CGFloat width = CGImageGetWidth(gImag.CGImage);
CGFloat height = CGImageGetHeight(gImag.CGImage);
[[self imageView] setImage: gImag]; // send image to screen
[self imagePickerControllerRelease: picker ]; // free the picker object
If we shoot portrait, gImag arrives with UIImageOrientationRight and width = 3264 and height = 2448.
If we shoot landscape, gImag arrives with UIImageOrientationUp and width = 3264 and height = 2448.
If we continue on to the E-Mailing MFMailComposeViewController code, I’ve added the debugging code in there as well:
if ( [MFMailComposeViewController canSendMail] )
{
MFMailComposeViewController * mailVC = [MFMailComposeViewController new];
NSArray * aAddr = [NSArray arrayWithObjects: gAddr, nil];
UIImageOrientation orient = gImag.imageOrientation;
CGFloat width = CGImageGetWidth(gImag.CGImage);
CGFloat height = CGImageGetHeight(gImag.CGImage);
NSData * imageAsNSData = UIImageJPEGRepresentation( gImag, 0.9f );
[mailVC setMailComposeDelegate: self];
[mailVC setToRecipients: aAddr];
[mailVC setSubject: gSubj];
[mailVC addAttachmentData: imageAsNSData
mimeType: #"image/jpg"
fileName: #"myPhoto.jpg"];
[mailVC setMessageBody: #"Blah blah"
isHTML: NO];
[self presentViewController: mailVC
animated: YES
completion: nil];
}
else NSLog( #"Device is unable to send email in its current state." );
Nothing amazing to see here. We get exactly the same values as we did back in the imagePickerController code.
Let's see exactly how the problem manifests:
To begin, the camera takes a portrait shot and it is displayed correctly in portrait mode by the line:
[[self imageView] setImage: gImag]; // send image to screen
It is displayed correctly because this line of code sees the orientation property and rotates the image appropriately (while NOT touching the underlying storage at 3264x2448).
Flow-of-control goes to the E-Mailer code now and the orientation property is still present in gImag so when the MFMailComposeViewController code displays the image in the outgoing E-Mail, it is correctly oriented. The physical image is still stored as 3264x2448.
The E-Mail is sent and, on the receiving end, knowledge of the orientation property has been lost so the receiving software displays the image as it is physically laid out as 3264x2448, i.e. landscape.
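As an aside, and purely as an illustrative sketch (this is not code from my app): if you want to see for yourself what orientation metadata the attachment bytes actually carry, ImageIO can inspect the NSData before you hand it to addAttachmentData: (you'll need to link ImageIO.framework):
#import <ImageIO/ImageIO.h>
// Hypothetical debugging helper: logs the orientation tag (if any) carried by
// the attachment data. PNG data carries no orientation tag at all; JPEG data
// may or may not, depending on how it was produced.
static void LogAttachmentOrientation( NSData * imageAsNSData )
{
CGImageSourceRef src = CGImageSourceCreateWithData( (__bridge CFDataRef)imageAsNSData, NULL );
if ( src == NULL ) return;
NSDictionary * props = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex( src, 0, NULL );
CFRelease( src );
NSLog( @"attachment orientation tag: %@", props[(NSString *)kCGImagePropertyOrientation] ?: @"none" );
}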
In debugging this, I ran into an additional confusion. And that is that the orientation property can be stripped from the UIImage, if you make a copy of it incorrectly.
This code shows the problem:
if ( [MFMailComposeViewController canSendMail] )
{
MFMailComposeViewController * mailVC = [MFMailComposeViewController new];
NSArray * aAddr = [NSArray arrayWithObjects: gAddr, nil];
UIImageOrientation orient = gImag.imageOrientation;
CGFloat width = CGImageGetWidth(gImag.CGImage);
CGFloat height = CGImageGetHeight(gImag.CGImage);
UIImage * tmp = [[UIImage alloc] initWithCGImage: gImag.CGImage];
orient = tmp.imageOrientation;
width = CGImageGetWidth(tmp.CGImage);
height = CGImageGetHeight(tmp.CGImage);
NSData * imageAsNSData = UIImageJPEGRepresentation( tmp, 0.9f );
[mailVC setMailComposeDelegate: self];
[mailVC setToRecipients: aAddr];
[mailVC setSubject: gSubj];
[mailVC addAttachmentData: imageAsNSData
mimeType: #"image/jpg"
fileName: #"myPhoto.jpg"];
[mailVC setMessageBody: #"Blah blah"
isHTML: NO];
[self presentViewController: mailVC
animated: YES
completion: nil];
}
else NSLog( #"Device is unable to send email in its current state." );
When we look at the debugging data for the new UIImage tmp, we get UIImageOrientationUp and width = 3264 and height = 2448.
The orientation property was stripped and the default orientation is Up. If you do not know the stripping is going on, it can really confuse things.
If I run this code, I now get the following results:
Things are unchanged in the imagePickerController code; the image is captured as before.
Flow-of-control goes on to the E-Mailer code but now the orientation property has been stripped from the tmp image so when the MFMailComposeViewController code displays the tmp image in the outgoing E-Mail, it is shown in landscape mode (because the default orientation is UIImageOrientationUp so there is no rotation done of the 3264x2448 image).
The E-Mail is sent and on the receiving end, knowledge of the orientation property is missing as well so the receiving software displays the image as it is physically laid out as 3264x2448, i.e. landscape.
This ‘stripping’ of the orientation property when making copies of the UIImage can be avoided, if one is aware it is going on, by using the following code to make the UIImage copy:
UIImage * tmp = [UIImage imageWithCGImage: gImag.CGImage
scale: gImag.scale
orientation: gImag.imageOrientation];
That would allow you to avoid losing the orientation property along the way but it would still not deal with its loss on the far end when you e-mail the image.
There’s a better way than all this messing about and worrying about the orientation property.
I found some code here that I’ve integrated into my program. This code will physically rotate the underlying stored image according to its orientation property.
For a UIImage with an orientation of UIImageOrientationRight, it will physically rotate the UIImage so it ends up as 2448x3264 and it will strip away the orientation property so it is seen thereafter as the default UIImageOrientationUp.
For a UIImage with an orientation of UIImageOrientationUp, it does nothing. It lets the sleeping landscape dogs lie.
If you do this then I think (based on what I’ve seen so far) that the orientation property of the UIImage is superfluous thereafter. So long as it remains missing/stripped or set to UIImageOrientationUp, your images should be displayed correctly at each step along the way and on the distant end when the image embedded in your E-Mail is displayed.
Everything I’ve discussed in this answer I have personally single-stepped through and watched happen.
So, here’s my final code that works:
gImag = (PIMG)[info valueForKey: UIImagePickerControllerOriginalImage];
[[self imageView] setImage: gImag]; // send image to screen
[self imagePickerControllerRelease: picker ]; // free the picker object
and
if ( [MFMailComposeViewController canSendMail] )
{
MFMailComposeViewController * mailVC = [MFMailComposeViewController new];
NSArray * aAddr = [NSArray arrayWithObjects: gAddr, nil];
//...let's not touch the original UIImage
UIImage * tmpImag = [UIImage imageWithCGImage: gImag.CGImage
scale: gImag.scale
orientation: gImag.imageOrientation];
//...do physical rotation, if needed
PIMG ImgOut = [gU scaleAndRotateImage: tmpImag];
//...note orientation is UIImageOrientationUp now
NSData * imageAsNSData = UIImageJPEGRepresentation( ImgOut, 0.9f );
[mailVC setMailComposeDelegate: self];
[mailVC setToRecipients: aAddr];
[mailVC setSubject: gSubj];
[mailVC addAttachmentData: imageAsNSData
mimeType: #"image/jpg"
fileName: #"myPhoto.jpg"];
[mailVC setMessageBody: #"Blah blah"
isHTML: NO];
[self presentViewController: mailVC
animated: YES
completion: nil];
}
else NSLog( #"Device is unable to send email in its current state." );
And, finally, here's the code I grabbed from here that does the physical rotation, if necessary:
- (UIImage *) scaleAndRotateImage: (UIImage *) imageIn
//...thx: http://blog.logichigh.com/2008/06/05/uiimage-fix/
{
int kMaxResolution = 3264; // Or whatever
CGImageRef imgRef = imageIn.CGImage;
CGFloat width = CGImageGetWidth(imgRef);
CGFloat height = CGImageGetHeight(imgRef);
CGAffineTransform transform = CGAffineTransformIdentity;
CGRect bounds = CGRectMake( 0, 0, width, height );
if ( width > kMaxResolution || height > kMaxResolution )
{
CGFloat ratio = width/height;
if (ratio > 1)
{
bounds.size.width = kMaxResolution;
bounds.size.height = bounds.size.width / ratio;
}
else
{
bounds.size.height = kMaxResolution;
bounds.size.width = bounds.size.height * ratio;
}
}
CGFloat scaleRatio = bounds.size.width / width;
CGSize imageSize = CGSizeMake( CGImageGetWidth(imgRef), CGImageGetHeight(imgRef) );
UIImageOrientation orient = imageIn.imageOrientation;
CGFloat boundHeight;
switch(orient)
{
case UIImageOrientationUp: //EXIF = 1
transform = CGAffineTransformIdentity;
break;
case UIImageOrientationUpMirrored: //EXIF = 2
transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
break;
case UIImageOrientationDown: //EXIF = 3
transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
transform = CGAffineTransformRotate(transform, M_PI);
break;
case UIImageOrientationDownMirrored: //EXIF = 4
transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
transform = CGAffineTransformScale(transform, 1.0, -1.0);
break;
case UIImageOrientationLeftMirrored: //EXIF = 5
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationLeft: //EXIF = 6
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationRightMirrored: //EXIF = 7
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeScale(-1.0, 1.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
case UIImageOrientationRight: //EXIF = 8
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
default:
[NSException raise: NSInternalInconsistencyException
format: #"Invalid image orientation"];
}
UIGraphicsBeginImageContext( bounds.size );
CGContextRef context = UIGraphicsGetCurrentContext();
if ( orient == UIImageOrientationRight || orient == UIImageOrientationLeft )
{
CGContextScaleCTM(context, -scaleRatio, scaleRatio);
CGContextTranslateCTM(context, -height, 0);
}
else
{
CGContextScaleCTM(context, scaleRatio, -scaleRatio);
CGContextTranslateCTM(context, 0, -height);
}
CGContextConcatCTM( context, transform );
CGContextDrawImage( UIGraphicsGetCurrentContext(), CGRectMake( 0, 0, width, height ), imgRef );
UIImage *imageCopy = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return( imageCopy );
}
Cheers, from New Zealand.

I have struggled with image orientations in iOS SDK 7 myself and I want to add something to this discussion in the hope it is useful to others. UIImageJPEGRepresentation works fine, but be aware that using UIImagePNGRepresentation might result in an image with incorrect orientation.
I'm developing an app in which a picture is uploaded after being taken with the device camera. I do not store the image temporally to disk, but rather send it directly to the server.
When rendering a UIImage to NSData, you can choose between UIImagePNGRepresentation and UIImageJPEGRepresentation.
When using UIImagePNGRepresentation, both a picture shot in portrait and one shot in landscape orientation result in a landscape image. In this case the portrait image is rotated 90 degrees counterclockwise.
When using UIImageJPEGRepresentation, both a portrait and a landscape image come out with the correct orientation.
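To make the difference concrete, here's a minimal sketch (my own illustration; cameraImage is just a stand-in for the UIImage returned by the picker or camera):
NSData *jpegData = UIImageJPEGRepresentation(cameraImage, 0.9f); // JPEG keeps the orientation flag
NSData *pngData = UIImagePNGRepresentation(cameraImage); // PNG has no orientation flag, so a portrait shot comes back landscape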

Swift version of the scale part:
class func scaleAndRotateImageUsingOrientation(imageIn : UIImage) -> UIImage
{
//takes into account the stored rotation on the image from the camera and makes it upright
let kMaxResolution = 3264; // Or whatever
let imgRef = imageIn.CGImage
let width = CGImageGetWidth(imgRef)
let height = CGImageGetHeight(imgRef)
var transform = CGAffineTransformIdentity
var bounds = CGRectMake( 0.0, 0.0, CGFloat(width), CGFloat(height) )
if ( width > kMaxResolution || height > kMaxResolution )
{
let ratio : CGFloat = CGFloat(width) / CGFloat(height);
if (ratio > 1)
{
bounds.size.width = CGFloat(kMaxResolution)
bounds.size.height = bounds.size.width / ratio;
}
else
{
bounds.size.height = CGFloat(kMaxResolution)
bounds.size.width = bounds.size.height * ratio
}
}
let scaleRatio : CGFloat = bounds.size.width / CGFloat(width)
var imageSize = CGSizeMake( CGFloat(CGImageGetWidth(imgRef)), CGFloat(CGImageGetHeight(imgRef)) )
let orient = imageIn.imageOrientation;
var boundHeight : CGFloat = 0.0
switch(orient)
{
case UIImageOrientation.Up: //EXIF = 1
transform = CGAffineTransformIdentity;
break;
case UIImageOrientation.UpMirrored: //EXIF = 2
transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
break;
case UIImageOrientation.Down: //EXIF = 3
transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
transform = CGAffineTransformRotate(transform, CGFloat(M_PI));
break;
case UIImageOrientation.DownMirrored: //EXIF = 4
transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
transform = CGAffineTransformScale(transform, 1.0, -1.0);
break;
case UIImageOrientation.LeftMirrored: //EXIF = 5
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
transform = CGAffineTransformRotate(transform, 3.0 * CGFloat(M_PI) / 2.0);
break;
case UIImageOrientation.Left: //EXIF = 6
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
transform = CGAffineTransformRotate(transform, 3.0 * CGFloat(M_PI) / 2.0);
break;
case UIImageOrientation.RightMirrored: //EXIF = 7
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeScale(-1.0, 1.0);
transform = CGAffineTransformRotate(transform, CGFloat(M_PI) / 2.0);
break;
case UIImageOrientation.Right: //EXIF = 8
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
transform = CGAffineTransformRotate(transform, CGFloat(M_PI) / 2.0);
break;
default:
break
}
UIGraphicsBeginImageContext( bounds.size );
var context = UIGraphicsGetCurrentContext();
if ( orient == UIImageOrientation.Right || orient == UIImageOrientation.Left )
{
CGContextScaleCTM(context, -scaleRatio, scaleRatio);
CGContextTranslateCTM(context, CGFloat(-height), 0);
}
else
{
CGContextScaleCTM(context, scaleRatio, -scaleRatio);
CGContextTranslateCTM(context, 0, CGFloat(-height));
}
CGContextConcatCTM( context, transform );
CGContextDrawImage( UIGraphicsGetCurrentContext(), CGRectMake( 0, 0, CGFloat(width), CGFloat(height) ), imgRef );
let imageCopy = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return( imageCopy );
}

I think you can get the UIImage API to handle the rotation for you automatically, without needing to manually do the transforms.
The documentation for the UIImage methods size and drawInRect: says they both take the orientation into account, so I draw the UIImage into a new context with drawInRect: and grab the resultant (auto-rotated) image from there. Here's the code in a category:
@interface UIImage (Orientation)
- (UIImage*)imageAdjustedForOrientation;
@end
@implementation UIImage (Orientation)
- (UIImage*)imageAdjustedForOrientation
{
// The UIImage methods size and drawInRect take into account
// the value of its imageOrientation property
// so the rendered image is rotated as necessary.
UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
[self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
UIImage *orientedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return orientedImage;
}
@end
I just found this better answer by an0, which is virtually the same but also includes the optimisation of not redrawing the image if the orientation is already correct.
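For reference, in case that link ever goes away: the optimisation is simply an early return when the orientation is already up. Here is a sketch of the same category method with that check added (my paraphrase, not an0's exact code):
- (UIImage*)imageAdjustedForOrientation
{
// Skip the redraw entirely if the bitmap is already upright.
if (self.imageOrientation == UIImageOrientationUp)
return self;
UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
[self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
UIImage *orientedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return orientedImage;
}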

You need to rotate the image according to its orientation. Here's a trick how to do it easily:
NSData *data = UIImagePNGRepresentation(sourceImage);
UIImage *tmp = [UIImage imageWithData:data];
UIImage *fixed = [UIImage imageWithCGImage:tmp.CGImage
scale:sourceImage.scale
orientation:sourceImage.imageOrientation];

In case anyone ends up here and requires a Swift 4 solution:
// UIImage+orientedUp.swift
// ID Fusion Software Inc
// Created by Dave Poirier on 2017-12-27
// Unlicensed
//
import UIKit
extension UIImage {
func orientedUp() -> UIImage {
guard self.imageOrientation != UIImageOrientation.up,
let cgImage = self.cgImage,
let colorSpace = cgImage.colorSpace
else { return self }
guard let ctx: CGContext = CGContext(data: nil,
width: Int(self.size.width), height: Int(self.size.height),
bitsPerComponent: cgImage.bitsPerComponent, bytesPerRow: 0, space: colorSpace,
bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else { return self }
ctx.concatenate(self.orientedUpTransform())
switch self.imageOrientation {
case UIImageOrientation.left, UIImageOrientation.leftMirrored, UIImageOrientation.right, UIImageOrientation.rightMirrored:
ctx.draw(cgImage, in: CGRect(x: 0, y: 0, width: self.size.height, height: self.size.width))
default:
ctx.draw(cgImage, in: CGRect(x: 0, y: 0, width: self.size.width, height: self.size.height))
}
guard let orientedCGImage: CGImage = ctx.makeImage()
else { return self }
return UIImage(cgImage: orientedCGImage)
}
func orientedUpTransform() -> CGAffineTransform {
var transform: CGAffineTransform = CGAffineTransform.identity
switch self.imageOrientation {
case UIImageOrientation.down, UIImageOrientation.downMirrored:
transform = transform.translatedBy(x: self.size.width, y: self.size.height)
transform = transform.rotated(by: CGFloat(Double.pi))
case UIImageOrientation.left, UIImageOrientation.leftMirrored:
transform = transform.translatedBy(x: self.size.width, y: 0)
transform = transform.rotated(by: CGFloat(Double.pi / 2.0))
case UIImageOrientation.right, UIImageOrientation.rightMirrored:
transform = transform.translatedBy(x: 0, y: self.size.height)
transform = transform.rotated(by: CGFloat(-Double.pi / 2.0))
default:
break
}
switch self.imageOrientation {
case UIImageOrientation.upMirrored, UIImageOrientation.downMirrored:
transform = transform.translatedBy(x: self.size.width, y: 0)
transform = transform.scaledBy(x: -1, y: 1)
case UIImageOrientation.leftMirrored, UIImageOrientation.rightMirrored:
transform = transform.translatedBy(x: self.size.height, y: 0)
transform = transform.scaledBy(x: -1, y: 1)
default:
break
}
return transform
}
}
You can then use it like this:
let image = UIImage(data: imageData)?.orientedUp()

Related

Not able to show the Image in proper format from camera in landscape mode

I am working on an iPad app in iOS 6. When we click the right bar button, I am giving an action like below:
-(IBAction)camerabuttonAction:(id)sender
{
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.delegate = self;
self.popoverController = [[UIPopoverController alloc] initWithContentViewController:picker];
[self.popoverController presentPopoverFromRect:CGRectMake(50, -250, 500, 300) inView:appDelegate.splitview.view permittedArrowDirections:UIPopoverArrowDirectionUp animated:YES];
}
My problem is, when I am in landscape mode and I click the button, the camera shows in portrait mode (the image appears reversed) after every second use. But if I shake the iPad, then it shows in landscape, i.e. in the correct direction.
See the images below.
When I am in landscape mode and I click the button, the camera shows the image like below:
If I shake the iPad, then the camera shows the image like below:
I have tried a lot and googled, but I did not find any solution. It's killing my time, so if anyone has worked on it please guide me and post sample code.
I faced the same issue in my project. I tried the below and it works for me.
My code:
- (UIImage *)scaleAndRotateImage:(UIImage *)image {
int kMaxResolution = 640; // Or whatever
CGImageRef imgRef = image.CGImage;
CGFloat width = CGImageGetWidth(imgRef);
CGFloat height = CGImageGetHeight(imgRef);
CGAffineTransform transform = CGAffineTransformIdentity;
CGRect bounds = CGRectMake(0, 0, width, height);
if (width > kMaxResolution || height > kMaxResolution) {
CGFloat ratio = width/height;
if (ratio > 1) {
bounds.size.width = kMaxResolution;
bounds.size.height = roundf(bounds.size.width / ratio);
}
else {
bounds.size.height = kMaxResolution;
bounds.size.width = roundf(bounds.size.height * ratio);
}
}
CGFloat scaleRatio = bounds.size.width / width;
CGSize imageSize = CGSizeMake(CGImageGetWidth(imgRef), CGImageGetHeight(imgRef));
CGFloat boundHeight;
UIImageOrientation orient = image.imageOrientation;
switch(orient) {
case UIImageOrientationUp: //EXIF = 1
transform = CGAffineTransformIdentity;
break;
case UIImageOrientationUpMirrored: //EXIF = 2
transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
break;
case UIImageOrientationDown: //EXIF = 3
transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
transform = CGAffineTransformRotate(transform, M_PI);
break;
case UIImageOrientationDownMirrored: //EXIF = 4
transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
transform = CGAffineTransformScale(transform, 1.0, -1.0);
break;
case UIImageOrientationLeftMirrored: //EXIF = 5
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationLeft: //EXIF = 6
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationRightMirrored: //EXIF = 7
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeScale(-1.0, 1.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
case UIImageOrientationRight: //EXIF = 8
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
default:
[NSException raise:NSInternalInconsistencyException format:@"Invalid image orientation"];
}
UIGraphicsBeginImageContext(bounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
if (orient == UIImageOrientationRight || orient == UIImageOrientationLeft) {
CGContextScaleCTM(context, -scaleRatio, scaleRatio);
CGContextTranslateCTM(context, -height, 0);
}
else {
CGContextScaleCTM(context, scaleRatio, -scaleRatio);
CGContextTranslateCTM(context, 0, -height);
}
CGContextConcatCTM(context, transform);
CGContextDrawImage(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, width, height), imgRef);
UIImage *imageCopy = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return imageCopy;
}
You can try giving a transform to your imagePickerController:
imagePickerController.view.transform = CGAffineTransformMakeRotation(-M_PI/2);
I researched your issue on Google and found some results and possibilities, listed below:
Result 1
Some answers say that it's an iOS 6 bug, so you cannot fix it; see the question below:
iPad camera popover preview wrong rotation and scale
Result 2
I don't think we can control the orientation of the camera. The camera orientation property is built in, and it changes with the orientation of the device.
iPad camera orientation portrait mode?
Result 3
You can manage the scaled live iPhone camera view in the center by physically moving the picker frame, with the code below:
[picker.view setFrame:CGRectMake(xOffset,yOffset,picker.view.frame.size.width,picker.view.frame.size.height)];
Hope you find an actual solution to this bug or issue.
I'm going to try a good guess here because I haven't done this myself...
Have you tried creating a subclass of UIImagePickerController and implementing the interface orientation methods from UIViewController (which is the superclass)? A rough sketch of that idea is below.
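Something along these lines (untested, and LandscapePickerController is just a hypothetical name):
// Untested sketch of the subclassing guess above, using the iOS 6 rotation API.
@interface LandscapePickerController : UIImagePickerController
@end
@implementation LandscapePickerController
- (BOOL)shouldAutorotate
{
return YES;
}
- (NSUInteger)supportedInterfaceOrientations
{
return UIInterfaceOrientationMaskLandscape; // or whatever mask fits your UI
}
@end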
Please try this
[popController presentPopoverFromRect:CGRectMake(1024,
self.view.bounds.origin.y + (self.view.bounds.size.height / 2), 1, 1)
inView:self.view
permittedArrowDirections:UIPopoverArrowDirectionRight
animated:YES];
I have an iPad app that is landscape-only. I was freaking out because I read all these posts, but I used this very simple code to take a picture and it works perfectly for my landscape-only app. Important: I don't choose an image from the library, I just take the picture and use it to put it over another image; it works for me perfectly.
if ([UIImagePickerController isSourceTypeAvailable:
UIImagePickerControllerSourceTypeCamera])
{
UIImagePickerController *imagePicker =
[[UIImagePickerController alloc] init];
imagePicker.delegate = self;
imagePicker.sourceType =
UIImagePickerControllerSourceTypeCamera;
imagePicker.mediaTypes = [NSArray arrayWithObjects:
(NSString *) kUTTypeImage,
nil];
imagePicker.allowsEditing = NO;
[self presentViewController:imagePicker
animated:YES completion:nil];
// [imagePicker release];
newMedia = YES;
}
-(void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
[self.popoverController dismissPopoverAnimated:true];
// [popoverController release];
NSString *mediaType = [info
objectForKey:UIImagePickerControllerMediaType];
[self dismissViewControllerAnimated:YES completion:nil];
if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
UIImage *image = [info
objectForKey:UIImagePickerControllerOriginalImage];
imageView.image = image;
if (newMedia)
UIImageWriteToSavedPhotosAlbum(image,
self,
@selector(image:finishedSavingWithError:contextInfo:),
nil);
}
else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie])
{
// Code here to support video if enabled
}
}

Setting UIImageView content mode after applying a CIFilter

Thanks for looking.
Here's my code
CIImage *result = _vignette.outputImage;
self.mainImageView.image = nil;
//self.mainImageView.contentMode = UIViewContentModeScaleAspectFit;
self.mainImageView.image = [UIImage imageWithCIImage:result];
self.mainImageView.contentMode = UIViewContentModeScaleAspectFit;
Here _vignette is a correctly set-up filter, and the image effect is applied to the image correctly.
I'm using a source image with a resolution of 500x375. My imageView has almost the iPhone screen's resolution, so to avoid stretching I'm using AspectFit.
But after applying the effect, when I assign the result image back to my imageView, it stretches no matter which UIViewContentMode I use; it doesn't work. It seems it always applies ScaleToFill regardless of the filter I've given.
Any idea why is this happening? Any suggestion is highly appreciated.
(1) Aspect Fit does stretch the image - to fit. If you don't want the image stretched at all, use Center (for example).
(2) imageWithCIImage gives you a very weird beast, a UIImage not based on CGImage, and so not susceptible to the normal rules of layer display. It is really nothing but a thin wrapper around CIImage, which is not what you want. You must convert (render) the CIFilter output thru CGImage to UIImage, thus giving you a UIImage that actually has some bits (CGImage, a bitmap). My discussion here gives you code that demonstrates:
http://www.apeth.com/iOSBook/ch15.html#_cifilter_and_ciimage
In other words, at some point you must call CIContext createCGImage:fromRect: to generate a CGImageRef from the output of your CIFilter, and pass that on into a UIImage. Until you do that, you don't have the output of your filter operations as a real UIImage.
Alternatively, you can draw the image from imageWithCIImage into a graphics context. For example, you can draw it into an image graphics context and then use that image.
What you can't do is display the image from imageWithCIImage directly. That's because it isn't an image! It has no underlying bitmap (CGImage). There's no there there. All it is is a set of CIFilter instructions for deriving the image.
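For the "draw it into an image graphics context" alternative mentioned above, a minimal sketch (my illustration, reusing the result variable from the question) would be:
UIImage *wrapper = [UIImage imageWithCIImage:result];
UIGraphicsBeginImageContextWithOptions(wrapper.size, NO, 0.0);
[wrapper drawInRect:CGRectMake(0, 0, wrapper.size.width, wrapper.size.height)];
UIImage *bitmapBacked = UIGraphicsGetImageFromCurrentImageContext(); // now backed by a real CGImage
UIGraphicsEndImageContext();
self.mainImageView.image = bitmapBacked; // contentMode behaves normally from here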
This is an answer for Swift, inspired by this question and the solution by user1951992.
let imageView = UIImageView(frame: CGRect(x: 100, y: 200, width: 100, height: 50))
imageView.contentMode = .scaleAspectFit
//just an example for a CIFilter
//NOTE: we're using implicit unwrapping here because we're sure there is a filter
//(and an image) named exactly this
let filter = CIFilter(name: "CISepiaTone")!
let image = UIImage(named: "image")!
filter.setValue(CIImage(image: image), forKey: kCIInputImageKey)
filter.setValue(0.5, forKey: kCIInputIntensityKey)
guard let outputCIImage = filter.outputImage,
let outputCGImage = CIContext().createCGImage(outputCIImage, from: outputCIImage.extent) else {
return
}
imageView.image = UIImage(cgImage: outputCGImage)
I just spent all day on this. I was getting orientation problems followed by poor quality output.
After snapping an image using the camera, I do this.
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
This causes a rotation problem so I needed to do this to it.
UIImage *scaleAndRotateImage(UIImage *image)
{
int kMaxResolution = image.size.height; // Or whatever
CGImageRef imgRef = image.CGImage;
CGFloat width = CGImageGetWidth(imgRef);
CGFloat height = CGImageGetHeight(imgRef);
CGAffineTransform transform = CGAffineTransformIdentity;
CGRect bounds = CGRectMake(0, 0, width, height);
if (width > kMaxResolution || height > kMaxResolution) {
CGFloat ratio = width/height;
if (ratio > 1) {
bounds.size.width = kMaxResolution;
bounds.size.height = bounds.size.width / ratio;
}
else {
bounds.size.height = kMaxResolution;
bounds.size.width = bounds.size.height * ratio;
}
}
CGFloat scaleRatio = bounds.size.width / width;
CGSize imageSize = CGSizeMake(CGImageGetWidth(imgRef), CGImageGetHeight(imgRef));
CGFloat boundHeight;
UIImageOrientation orient = image.imageOrientation;
switch(orient) {
case UIImageOrientationUp: //EXIF = 1
transform = CGAffineTransformIdentity;
break;
case UIImageOrientationUpMirrored: //EXIF = 2
transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
break;
case UIImageOrientationDown: //EXIF = 3
transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
transform = CGAffineTransformRotate(transform, M_PI);
break;
case UIImageOrientationDownMirrored: //EXIF = 4
transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
transform = CGAffineTransformScale(transform, 1.0, -1.0);
break;
case UIImageOrientationLeftMirrored: //EXIF = 5
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationLeft: //EXIF = 6
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationRightMirrored: //EXIF = 7
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeScale(-1.0, 1.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
case UIImageOrientationRight: //EXIF = 8
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
default:
[NSException raise:NSInternalInconsistencyException format:@"Invalid image orientation"];
}
UIGraphicsBeginImageContext(bounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
if (orient == UIImageOrientationRight || orient == UIImageOrientationLeft) {
CGContextScaleCTM(context, -scaleRatio, scaleRatio);
CGContextTranslateCTM(context, -height, 0);
}
else {
CGContextScaleCTM(context, scaleRatio, -scaleRatio);
CGContextTranslateCTM(context, 0, -height);
}
CGContextConcatCTM(context, transform);
CGContextDrawImage(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, width, height), imgRef);
UIImage *imageCopy = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return imageCopy;
}
Then when I apply my filter, I do this (with special thanks to this thread - specifically matt and Wex)
-(UIImage*)processImage:(UIImage*)image {
CIImage *inImage = [CIImage imageWithCGImage:image.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIColorControls" keysAndValues:
kCIInputImageKey, inImage,
@"inputContrast", [NSNumber numberWithFloat:1.0],
nil];
CIImage *outImage = [filter outputImage];
//Juicy bit
CGImageRef cgimageref = [[CIContext contextWithOptions:nil] createCGImage:outImage fromRect:[outImage extent]];
return [UIImage imageWithCGImage:cgimageref];
}

iOS application crashes during consecutive photo capturing

In my application, I am capturing five photos sequentially on a button click. But when I press the home button on the device in the midst of this photo shoot, the app enters the background as usual, but when I try to relaunch, the app crashes. Can you please suggest any solution for this problem?
Thanks in advance
Hi, please refer to these three methods; they may resolve your issue. The crash is due to memory leaks, because when we take several high-resolution photos and use them in our application, the iPhone is not able to handle that much memory.
One more thing: I need to say sorry because I don't know how to format code.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
[picker dismissModalViewControllerAnimated:YES];
[picker release];
[popover dismissPopoverAnimated:YES];
[popover release];
UIImage *capturedImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
[self performSelector:@selector(waitUntillImageCaptured:) withObject:capturedImage afterDelay:0.2];
}
-(void)waitUntillImageCaptured:(UIImage *)originalImage
{
UIImage *tempimg;
tempimg = [self scaleAndRotateImage:originalImage];
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
NSString *imagePath = [[appDelegate applicationDocumentDirectoryString] stringByAppendingPathComponent:@"ClientTempthumb.png"];
//UIImageView *tempView = [[UIImageView alloc] initWithImage:tempimg];
NSData *data = [NSData dataWithData:UIImagePNGRepresentation(tempimg)];
if(data != nil)
{
[data writeToFile:imagePath atomically:YES];
}
else
{
[[NSFileManager defaultManager] removeItemAtPath:imagePath error:NULL];
}
[btnPhoto setImage:tempimg forState:UIControlStateNormal];
[pool release];
// [appDelegate hideLoadingView];
}
- (UIImage *)scaleAndRotateImage:(UIImage *)image {
int kMaxResolution = 550; // Or whatever
CGImageRef imgRef = image.CGImage;
CGFloat width = CGImageGetWidth(imgRef);
CGFloat height = CGImageGetHeight(imgRef);
CGAffineTransform transform = CGAffineTransformIdentity;
CGRect bounds = CGRectMake(0, 0, width, height);
if (width > kMaxResolution || height > kMaxResolution) {
CGFloat ratio = width/height;
if (ratio > 1)
{
bounds.size.width = kMaxResolution;
bounds.size.height = roundf(bounds.size.width / ratio);
}
else
{
bounds.size.height = kMaxResolution;
bounds.size.width = roundf(bounds.size.height * ratio);
}
}
CGFloat scaleRatio = bounds.size.width / width;
CGSize imageSize = CGSizeMake(CGImageGetWidth(imgRef), CGImageGetHeight(imgRef));
CGFloat boundHeight;
UIImageOrientation orient = image.imageOrientation;
switch(orient)
{
case UIImageOrientationUp: //EXIF = 1
// landscape right
transform = CGAffineTransformIdentity;
break;
case UIImageOrientationUpMirrored: //EXIF = 2
transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
break;
case UIImageOrientationDown: //EXIF = 3
// landscape left
transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
transform = CGAffineTransformRotate(transform, M_PI);
break;
case UIImageOrientationDownMirrored: //EXIF = 4
transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
transform = CGAffineTransformScale(transform, 1.0, -1.0);
break;
case UIImageOrientationLeftMirrored: //EXIF = 5
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationLeft: //EXIF = 6
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationRightMirrored: //EXIF = 7
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeScale(-1.0, 1.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
case UIImageOrientationRight: //EXIF = 8
// Portrait Mode
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
default:
[NSException raise:NSInternalInconsistencyException format:@"Invalid image orientation"];
}
UIGraphicsBeginImageContext(bounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
if (orient == UIImageOrientationRight || orient == UIImageOrientationLeft) {
CGContextScaleCTM(context, -scaleRatio, scaleRatio);
CGContextTranslateCTM(context, -height, 0);
}
else {
CGContextScaleCTM(context, scaleRatio, -scaleRatio);
CGContextTranslateCTM(context, 0, -height);
}
CGContextConcatCTM(context, transform);
CGContextDrawImage(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, width, height), imgRef);
UIImage *imageCopy = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return imageCopy;
}

iOS: Image get rotated 90 degree after saved as PNG representation data

I have researched enough that this should be working, but I'm not able to fix it. After taking a picture with the camera, as long as I have the image stored as a UIImage it's fine, but as soon as I store this image as a PNG representation, it gets rotated 90 degrees.
Following is my code and all things I tried:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
NSString *mediaType = [info valueForKey:UIImagePickerControllerMediaType];
if([mediaType isEqualToString:(NSString*)kUTTypeImage])
{
AppDelegate *delegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
delegate.originalPhoto = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
NSLog(@"Saving photo");
[self saveImage];
NSLog(@"Fixing orientation");
delegate.fixOrientationPhoto = [self fixOrientation:[UIImage imageWithContentsOfFile:[delegate filePath:imageName]]];
NSLog(@"Scaling photo");
delegate.scaledAndRotatedPhoto = [self scaleAndRotateImage:[UIImage imageWithContentsOfFile:[delegate filePath:imageName]]];
}
[picker dismissModalViewControllerAnimated:YES];
[picker release];
}
- (void)saveImage
{
AppDelegate *delegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
NSData *imageData = UIImagePNGRepresentation(delegate.originalPhoto);
[imageData writeToFile:[delegate filePath:imageName] atomically:YES];
}
Here the fixOrientation and scaleAndRotateImage functions are taken from here and here respectively. They work fine and rotate the image when I apply them to a UIImage, but they don't work if I save the image as a PNG representation and then apply them.
Please refer to the following picture after executing the above functions:
Starting with iOS 4.0, when the camera takes a photo it does not rotate it before saving; it simply sets a rotation flag in the EXIF data of the JPEG. If you save a UIImage as a JPEG, it will set the rotation flag. PNGs do not support a rotation flag, so if you save a UIImage as a PNG, it will be rotated incorrectly and not have a flag set to fix it. So if you want PNG images you must rotate them yourself; for that, check this link.
Swift 4.2
Add the following as a UIImage extension:
extension UIImage {
func fixOrientation() -> UIImage? {
if self.imageOrientation == UIImage.Orientation.up {
return self
}
UIGraphicsBeginImageContext(self.size)
self.draw(in: CGRect(origin: .zero, size: self.size))
let normalizedImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return normalizedImage
}
}
Example usage:
let cameraImage = //image captured from camera
let orientationFixedImage = cameraImage.fixOrientation()
Explanation:
The magic happens when you call UIImage's draw(in:) function, which redraws the image respecting originally captured orientation settings. [link to docs]
Make sure to call UIGraphicsBeginImageContext with image's size before calling draw to let draw rewrite the UIImage in current context's space.
UIGraphicsGetImageFromCurrentImageContext lets you capture whatever is the result of the redrawn image, & UIGraphicsEndImageContext ends and frees up the graphics context.
Swift 3.1 version of the UIImage extension posted by Rao:
extension UIImage {
func fixOrientation() -> UIImage {
if self.imageOrientation == UIImageOrientation.up {
return self
}
UIGraphicsBeginImageContextWithOptions(self.size, false, self.scale)
self.draw(in: CGRect(x: 0, y: 0, width: self.size.width, height: self.size.height))
if let normalizedImage: UIImage = UIGraphicsGetImageFromCurrentImageContext() {
UIGraphicsEndImageContext()
return normalizedImage
} else {
return self
}
}
}
Usage:
let cameraImage = //image captured from camera
let orientationFixedImage = cameraImage.fixOrientation()
Swift 4/5:
UIImageOrientation.up has been renamed to UIImage.Orientation.up
I have found the following tips to be hugely useful:
1. natural output is landscape
2. .width / .height ARE affected by .imageOrientation
3. use the short dimension rather than .width
(1) the 'natural' output of the camera for stills IS LANDSCAPE.
This is counterintuitive. Portrait is the only way offered by UIImagePickerController etc., but you will get UIImageOrientationRight as the "normal" orientation when using the "normal" portrait camera.
(The way I remember this - the natural output for video is (of course) landscape; so stills are the same - even though the iPhone is all about portrait.)
(2) .width and .height are indeed affected by the .imageOrientation!!!!!!!!!
Be sure to do this, and try it both ways on your iPhone,
NSLog(#"fromImage.imageOrientation is %d", fromImage.imageOrientation);
NSLog(#"fromImage.size.width %f fromImage.size.height %f",
fromImage.size.width, fromImage.size.height);
you'll see that the .height and .width swap, "even though" the real pixels are landscape.
(3) simply using the "short dimension" rather than .width, can often solve many problems
I found this to be incredibly helpful. Say you want maybe the top square of the image:
CGRect topRectOfOriginal = CGRectMake(0,0, im.size.width,im.size.width);
that actually won't work, you'll get a squished image, when the camera is ("really") being held landscape.
however if you very simply do this
float shortDimension = fminf(im.size.width, im.size.height);
CGRect topRectOfOriginal = CGRectMake(0,0, shortDimension,shortDimension);
then "everything is fixed" and you actually "do not need to worry about" the orientation flag. Again point (3) is not a cure-all, but it very often does solve all problems.
Hope it helps someone save some time.
I found this code here, which actually fixed it for me. For my app, I took a picture and saved it, and every time I loaded it, it would have the annoying rotation attached (I looked it up, and it's apparently something to do with the EXIF data and the way the iPhone takes and stores images). This code fixed it for me. I have to say, it was originally an addition to a class / an extension / category (you can find the original at the link). I used it as a simple method, as shown below, since I didn't really want to make a whole class or category just for this. I only used portrait, but I think the code works for any orientation; I'm not sure though.
Rant over, here's the code:
- (UIImage *)fixOrientationForImage:(UIImage*)neededImage {
// No-op if the orientation is already correct
if (neededImage.imageOrientation == UIImageOrientationUp) return neededImage;
// We need to calculate the proper transformation to make the image upright.
// We do it in 2 steps: Rotate if Left/Right/Down, and then flip if Mirrored.
CGAffineTransform transform = CGAffineTransformIdentity;
switch (neededImage.imageOrientation) {
case UIImageOrientationDown:
case UIImageOrientationDownMirrored:
transform = CGAffineTransformTranslate(transform, neededImage.size.width, neededImage.size.height);
transform = CGAffineTransformRotate(transform, M_PI);
break;
case UIImageOrientationLeft:
case UIImageOrientationLeftMirrored:
transform = CGAffineTransformTranslate(transform, neededImage.size.width, 0);
transform = CGAffineTransformRotate(transform, M_PI_2);
break;
case UIImageOrientationRight:
case UIImageOrientationRightMirrored:
transform = CGAffineTransformTranslate(transform, 0, neededImage.size.height);
transform = CGAffineTransformRotate(transform, -M_PI_2);
break;
case UIImageOrientationUp:
case UIImageOrientationUpMirrored:
break;
}
switch (neededImage.imageOrientation) {
case UIImageOrientationUpMirrored:
case UIImageOrientationDownMirrored:
transform = CGAffineTransformTranslate(transform, neededImage.size.width, 0);
transform = CGAffineTransformScale(transform, -1, 1);
break;
case UIImageOrientationLeftMirrored:
case UIImageOrientationRightMirrored:
transform = CGAffineTransformTranslate(transform, neededImage.size.height, 0);
transform = CGAffineTransformScale(transform, -1, 1);
break;
case UIImageOrientationUp:
case UIImageOrientationDown:
case UIImageOrientationLeft:
case UIImageOrientationRight:
break;
}
// Now we draw the underlying CGImage into a new context, applying the transform
// calculated above.
CGContextRef ctx = CGBitmapContextCreate(NULL, neededImage.size.width, neededImage.size.height,
CGImageGetBitsPerComponent(neededImage.CGImage), 0,
CGImageGetColorSpace(neededImage.CGImage),
CGImageGetBitmapInfo(neededImage.CGImage));
CGContextConcatCTM(ctx, transform);
switch (neededImage.imageOrientation) {
case UIImageOrientationLeft:
case UIImageOrientationLeftMirrored:
case UIImageOrientationRight:
case UIImageOrientationRightMirrored:
// Grr...
CGContextDrawImage(ctx, CGRectMake(0,0,neededImage.size.height,neededImage.size.width), neededImage.CGImage);
break;
default:
CGContextDrawImage(ctx, CGRectMake(0,0,neededImage.size.width,neededImage.size.height), neededImage.CGImage);
break;
}
// And now we just create a new UIImage from the drawing context
CGImageRef cgimg = CGBitmapContextCreateImage(ctx);
UIImage *img = [UIImage imageWithCGImage:cgimg];
CGContextRelease(ctx);
CGImageRelease(cgimg);
return img;
}
I'm not sure how useful this will be for you, but I hope it helped :)
Try this,
You can use
NSData *somenewImageData = UIImageJPEGRepresentation(newimg,1.0);
instead of
NSData *somenewImageData = UIImagePNGRepresentation(newimg);
Please try the following code:
UIImage *sourceImage = ... // Our image
CGRect selectionRect = CGRectMake(100.0, 100.0, 300.0, 400.0);
CGImageRef resultImageRef = CGImageCreateWithImageInRect(sourceImage.CGImage,
selectionRect);
UIImage *resultImage = [[UIImage alloc] initWithCGImage:resultImageRef];
And
CGRect TransformCGRectForUIImageOrientation(CGRect source, UIImageOrientation orientation, CGSize imageSize) {
switch (orientation) {
case UIImageOrientationLeft: { // EXIF #8
CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate,M_PI_2);
return CGRectApplyAffineTransform(source, txCompound);
}
case UIImageOrientationDown: { // EXIF #3
CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate,M_PI);
return CGRectApplyAffineTransform(source, txCompound);
}
case UIImageOrientationRight: { // EXIF #6
CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(0.0, imageSize.width);
CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate,M_PI + M_PI_2);
return CGRectApplyAffineTransform(source, txCompound);
}
case UIImageOrientationUp: // EXIF #1 - do nothing
default: // EXIF 2,4,5,7 - ignore
return source;
}
}
...
UIImage *sourceImage = ... // Our image
CGRect selectionRect = CGRectMake(100.0, 100.0, 300.0, 400.0);
CGRect transformedRect = TransformCGRectForUIImageOrientation(selectionRect, sourceImage.imageOrientation, sourceImage.size);
CGImageRef resultImageRef = CGImageCreateWithImageInRect(sourceImage.CGImage, transformedRect);
UIImage *resultImage = [[UIImage alloc] initWithCGImage:resultImageRef];
I have referenced this from the following link; have a look for more detail.
Best Regards :-)
Try this code:
NSData *imageData = UIImageJPEGRepresentation(delegate.originalPhoto, 1.0);

How to set the resolution of a saved image from the UIImagePickerController

I'm using a UIImagePickerController to set a picture from my camera roll to a UIImageView. However, the image is getting scaled automatically inside this UIImageView because I specified 'scale to fit' inside IB. I'd like to save my chosen image with a resolution of 80x80 pixels so that I won't have to scale. (My app is getting really slow because of this scaling issue.)
Here's a snippit from my code:
-(void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingImage : (UIImage *)image
editingInfo:(NSDictionary *)editingInfo
{
[picker dismissModalViewControllerAnimated:YES];
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
photoView.image = image;
NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *pngFilePath = [NSString stringWithFormat:@"%@/foto1.png",docDir];
NSData *data = [NSData dataWithData:UIImagePNGRepresentation(image)];
[data writeToFile:pngFilePath atomically:YES];
}
Help is greatly appreciated!
I found this code here: Link
// ==============================================================
// resizedImage
// ==============================================================
// Return a scaled down copy of the image.
UIImage* resizedImage(UIImage *inImage, CGRect thumbRect)
{
CGImageRef imageRef = [inImage CGImage];
CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
// There's a weirdness with kCGImageAlphaNone and CGBitmapContextCreate
// see Supported Pixel Formats in the Quartz 2D Programming Guide
// Creating a Bitmap Graphics Context section
// only RGB 8 bit images with alpha of kCGImageAlphaNoneSkipFirst, kCGImageAlphaNoneSkipLast, kCGImageAlphaPremultipliedFirst,
// and kCGImageAlphaPremultipliedLast, with a few other oddball image kinds are supported
// The images on input here are likely to be png or jpeg files
if (alphaInfo == kCGImageAlphaNone)
alphaInfo = kCGImageAlphaNoneSkipLast;
// Build a bitmap context that's the size of the thumbRect
CGContextRef bitmap = CGBitmapContextCreate(
NULL,
thumbRect.size.width, // width
thumbRect.size.height, // height
CGImageGetBitsPerComponent(imageRef), // really needs to always be 8
4 * thumbRect.size.width, // rowbytes
CGImageGetColorSpace(imageRef),
alphaInfo
);
// Draw into the context, this scales the image
CGContextDrawImage(bitmap, thumbRect, imageRef);
// Get an image from the context and a UIImage
CGImageRef ref = CGBitmapContextCreateImage(bitmap);
UIImage* result = [UIImage imageWithCGImage:ref];
CGContextRelease(bitmap); // ok if NULL
CGImageRelease(ref);
return result;
}
The problem with Sabobin's answer is that it screws up the image orientation.
I've found this solution that solves the orientation problem and has faster performance:
- (UIImage *)scaleAndRotateImage:(UIImage *)image {
int kMaxResolution = 320; // Or whatever
CGImageRef imgRef = image.CGImage;
CGFloat width = CGImageGetWidth(imgRef);
CGFloat height = CGImageGetHeight(imgRef);
CGAffineTransform transform = CGAffineTransformIdentity;
CGRect bounds = CGRectMake(0, 0, width, height);
if (width > kMaxResolution || height > kMaxResolution) {
CGFloat ratio = width/height;
if (ratio > 1) {
bounds.size.width = kMaxResolution;
bounds.size.height = bounds.size.width / ratio;
}
else {
bounds.size.height = kMaxResolution;
bounds.size.width = bounds.size.height * ratio;
}
}
CGFloat scaleRatio = bounds.size.width / width;
CGSize imageSize = CGSizeMake(CGImageGetWidth(imgRef), CGImageGetHeight(imgRef));
CGFloat boundHeight;
UIImageOrientation orient = image.imageOrientation;
switch(orient) {
case UIImageOrientationUp: //EXIF = 1
transform = CGAffineTransformIdentity;
break;
case UIImageOrientationUpMirrored: //EXIF = 2
transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
break;
case UIImageOrientationDown: //EXIF = 3
transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
transform = CGAffineTransformRotate(transform, M_PI);
break;
case UIImageOrientationDownMirrored: //EXIF = 4
transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
transform = CGAffineTransformScale(transform, 1.0, -1.0);
break;
case UIImageOrientationLeftMirrored: //EXIF = 5
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationLeft: //EXIF = 6
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
break;
case UIImageOrientationRightMirrored: //EXIF = 7
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeScale(-1.0, 1.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
case UIImageOrientationRight: //EXIF = 8
boundHeight = bounds.size.height;
bounds.size.height = bounds.size.width;
bounds.size.width = boundHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
transform = CGAffineTransformRotate(transform, M_PI / 2.0);
break;
default:
[NSException raise:NSInternalInconsistencyException format:@"Invalid image orientation"];
}
UIGraphicsBeginImageContext(bounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
if (orient == UIImageOrientationRight || orient == UIImageOrientationLeft) {
CGContextScaleCTM(context, -scaleRatio, scaleRatio);
CGContextTranslateCTM(context, -height, 0);
}
else {
CGContextScaleCTM(context, scaleRatio, -scaleRatio);
CGContextTranslateCTM(context, 0, -height);
}
CGContextConcatCTM(context, transform);
CGContextDrawImage(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, width, height), imgRef);
UIImage *imageCopy = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return imageCopy;
}
You can resize the selected image using ImageHelper class.
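If you'd rather not pull in a helper class, a plain UIKit sketch (my illustration, not ImageHelper's API; it assumes you just want the 80x80 thumbnail from the question, where image is the picker callback's parameter) looks like this:
CGSize thumbSize = CGSizeMake(80.0, 80.0);
UIGraphicsBeginImageContextWithOptions(thumbSize, NO, 0.0);
[image drawInRect:CGRectMake(0, 0, thumbSize.width, thumbSize.height)]; // drawInRect: also respects imageOrientation
UIImage *thumb = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = [NSData dataWithData:UIImagePNGRepresentation(thumb)]; // write this instead of the full-size image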