Get the size of an image after resizing in iPhone SDK

I have an image view on which I display an image selected from the library. To display a fine-quality picture I rescale it using the method below. The image quality I get is perfect, but I need to set the image view's frame according to the size of the newly created image. However, when I use newImage.size.width it gives me the width of the original image view. Please help me set the image view frame according to the displayed image size. Thanks in advance.
-(UIImage *)scaleImage:(UIImage *)img toRectSize:(CGRect)screenRect
{
UIGraphicsBeginImageContext(screenRect.size);
float hfactor = img.size.width / screenRect.size.width;
float vfactor = img.size.height / screenRect.size.height;
float factor = MAX(hfactor, vfactor);
float newWidth = img.size.width / factor;
float newHeight = img.size.height / factor;
float leftOffset = (screenRect.size.width - newWidth) / 2;
float topOffset = (screenRect.size.height - newHeight) / 2;
CGRect newRect = CGRectMake(leftOffset, topOffset, newWidth, newHeight);
[img drawInRect:newRect blendMode:kCGBlendModePlusDarker alpha:1];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
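As an aside, the size that actually gets drawn can be recomputed by the caller with the same factor math the method uses, for example (a sketch; originalImage, screenRect and imageView are assumed names):
// Sketch: recompute the drawn size with the same factor math the method uses
float factor = MAX(originalImage.size.width / screenRect.size.width,
                   originalImage.size.height / screenRect.size.height);
CGSize drawnSize = CGSizeMake(originalImage.size.width / factor,
                              originalImage.size.height / factor);
CGRect frame = self.imageView.frame;
frame.size = drawnSize; // match the image view to the size that was actually drawn
self.imageView.frame = frame;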

Try this code that I used for resizing an image; you will get the new frame as well. The ratio seems to be fixed, but you can change it as per your requirement.
-(UIImage*)ImageResize:(UIImage*)image
{
if(image==NULL)
{
return NULL;
}
else
{
float actualHeight = image.size.height;
float actualWidth = image.size.width;
float imgRatio = actualWidth/actualHeight;
float maxRatio = 130.0/160.0;
if(imgRatio!=maxRatio)
{
if(imgRatio < maxRatio)
{
imgRatio = 160.0 / actualHeight;
actualWidth = imgRatio * actualWidth;
actualHeight = 160.0;
}
else
{
imgRatio = 130.0 / actualWidth;
actualHeight = imgRatio * actualHeight;
actualWidth = 130.0;
}
}
CGRect rect = CGRectMake(0.0, 0.0, actualWidth, actualHeight);
UIGraphicsBeginImageContext(rect.size);
[image drawInRect:rect];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return img;
}
}
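For example, the frame can then be taken from the resized image (a sketch; pickedImage and imageView are assumed names):
UIImage *resized = [self ImageResize:pickedImage];
self.imageView.image = resized;
CGRect frame = self.imageView.frame;
frame.size = resized.size; // the returned image's size is the new frame size
self.imageView.frame = frame;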
The code below can be used for a specific image size that you pass in.
-(UIImage *)thumbnailWithImageWithoutScale:(UIImage *)image size:(CGSize)wantSize
{
UIImage * targetImage;
if (nil == image) {
targetImage = nil;
}else{
CGSize size = image.size;
CGRect rect;
if (wantSize.width/wantSize.height > size.width/size.height) {
rect.size.width = wantSize.height*size.width/size.height;
rect.size.height = wantSize.height;
rect.origin.x = (wantSize.width - rect.size.width)/2;
rect.origin.y = 0;
} else{
rect.size.width = wantSize.width;
rect.size.height = wantSize.width*size.height/size.width;
rect.origin.x = 0;
rect.origin.y = (wantSize.height - rect.size.height)/2;
}
UIGraphicsBeginImageContext(wantSize);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetFillColorWithColor(context, [[UIColor clearColor] CGColor]);
UIRectFill(CGRectMake(0, 0, wantSize.width, wantSize.height));//clear background
[image drawInRect:rect];
targetImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
return targetImage;
}
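A possible call (a sketch; the 100 x 100 size is just an example) — the result is padded to exactly wantSize with a clear background:
UIImage *thumb = [self thumbnailWithImageWithoutScale:pickedImage size:CGSizeMake(100, 100)];
self.imageView.image = thumb; // thumb is exactly 100 x 100, letterboxed with a clear background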

Try this,
resizedImage = [self imageWithImage:originalImage scaledToSize:CGSizeMake(45,45)];
self.imageView.image = resizedImage;
- (UIImage*)imageWithImage:(UIImage*)image scaledToSize:(CGSize)newSize
{
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}

Are you using Auto Layout?
If yes, you have to change the width and height constraints of your image view, not the frame (you can create outlets for them in the same way as you do for other controls).
If you do change the frame directly under Auto Layout, do it in the viewDidAppear method (in viewWillAppear or viewDidLoad it may not work).
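A minimal sketch of the constraint approach (imageWidthConstraint and imageHeightConstraint are assumed outlet names for the image view's width and height constraints):
self.imageView.image = newImage;                          // your resized image
self.imageWidthConstraint.constant = newImage.size.width; // update constraints, not the frame
self.imageHeightConstraint.constant = newImage.size.height;
[self.view layoutIfNeeded];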


iOS : Save image with custom resolution

Hi, I am trying to capture a view and then save it as an image to the Photo Library, but I need to create a custom resolution for the captured image. Here is my code, but when the app saves the image the resolution is low!
UIGraphicsBeginImageContextWithOptions(self.captureView.bounds.size, self.captureView.opaque, 0.0);
[self.captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshot = UIGraphicsGetImageFromCurrentImageContext();
CGRect cropRect = CGRectMake(0 ,0 ,1435 ,1435);
CGImageRef imageRef = CGImageCreateWithImageInRect([screenshot CGImage], cropRect);
CGImageRelease(imageRef);
UIImageWriteToSavedPhotosAlbum(screenshot , nil, nil, nil);
UIGraphicsEndImageContext();
But the resolution on iPhone is 320 x 320 and on Retina it is 640 x 640.
I would be grateful if you could help me fix this issue.
Your code is pretty close. What you need to do is re-render the screenshot at the custom resolution. I modified your code to do this:
UIView* captureView = self.view;
/* Capture the screen shoot at native resolution */
UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, captureView.opaque, 0.0);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
/* Render the screen shot at custom resolution */
CGRect cropRect = CGRectMake(0 ,0 ,1435 ,1435);
UIGraphicsBeginImageContextWithOptions(cropRect.size, captureView.opaque, 1.0f);
[screenshot drawInRect:cropRect];
UIImage * customScreenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
/* Save to the photo album */
UIImageWriteToSavedPhotosAlbum(customScreenShot , nil, nil, nil);
Note that if the capture view is not square, the image will be distorted. The saved image will always be square and 1435x1435 pixels.
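If the distortion matters, one option (a sketch, not part of the original answer) is to aspect-fill the screenshot into the square context instead of stretching it:
/* Aspect-fill the screenshot into a 1435 x 1435 context (sketch) */
CGFloat side = 1435.0;
CGFloat fillScale = MAX(side / screenshot.size.width, side / screenshot.size.height);
CGSize scaledSize = CGSizeMake(screenshot.size.width * fillScale,
                               screenshot.size.height * fillScale);
UIGraphicsBeginImageContextWithOptions(CGSizeMake(side, side), captureView.opaque, 1.0f);
[screenshot drawInRect:CGRectMake((side - scaledSize.width) / 2.0,
                                  (side - scaledSize.height) / 2.0,
                                  scaledSize.width, scaledSize.height)];
UIImage *squareShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(squareShot, nil, nil, nil);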
Have a look at this answer. The code includes rotation, but nonetheless the questioner asked the same question: "How to get a […] image from an UIImageView at its full resolution?"
Copied content (in case of deletion or whatever):
- (UIImage *)capturedView
{
float imageScale = sqrtf(powf(self.captureView.transform.a, 2.f) + powf(self.captureView.transform.c, 2.f));
CGFloat widthScale = self.captureView.bounds.size.width / self.captureView.image.size.width;
CGFloat heightScale = self.captureView.bounds.size.height / self.captureView.image.size.height;
float contentScale = MIN(widthScale, heightScale);
float effectiveScale = imageScale * contentScale;
CGSize captureSize = CGSizeMake(enclosingView.bounds.size.width / effectiveScale, enclosingView.bounds.size.height / effectiveScale);
NSLog(#"effectiveScale = %0.2f, captureSize = %#", effectiveScale, NSStringFromCGSize(captureSize));
UIGraphicsBeginImageContextWithOptions(captureSize, YES, 0.0);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextScaleCTM(context, 1/effectiveScale, 1/effectiveScale);
[enclosingView.layer renderInContext:context];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return img;
}
First get your image into a UIImage object. Create whatever size you want and use the following:
UIImage *image = // you image;
CGSize size;
if ([[UIScreen mainScreen] respondsToSelector:@selector(displayLinkWithTarget:selector:)] &&
([UIScreen mainScreen].scale == 2.0)) {
// RETINA DISPLAY
size = CGSizeMake(640, 640);
}
else {
// Non-Retina device
size = CGSizeMake(320, 320);
}
UIGraphicsBeginImageContext(size);
[image drawInRect:CGRectMake(0, 0, size.width, size.height)];
UIImage *destImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Now you will get destImage at the new resolution.
Hope this is what you are looking for :)
-(UIImage*)processImageRect:(UIImage*)image:(CGSize)sizeToForm {
// Draw image1
UIGraphicsBeginImageContext(CGSizeMake(sizeToForm.width, sizeToForm.height));
[image drawInRect:CGRectMake(0.0, 0.0, sizeToForm.width, sizeToForm.height)];
UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext(); // close the context before returning
return resultingImage;
}
Going with this may solve your issue.
You can use this:
UIImageExtras.h
@interface UIImage (Extras)
-(UIImage*)imageByScalingAndCroppingForSize:(CGSize)targetSize;
@end
UIImageExtras.m
#import "UIImageExtras.h"
@implementation UIImage (Extras)
- (UIImage*)imageByScalingAndCroppingForSize:(CGSize)targetSize
{
// Base image
UIImage *sourceImage = self;
// Resized image
UIImage *newImage = nil;
// Size of the base image
CGSize imageSize = sourceImage.size;
// Width and height
CGFloat width = imageSize.width;
CGFloat height = imageSize.height;
// Desired dimensions
CGFloat targetWidth = targetSize.width;
CGFloat targetHeight = targetSize.height;
// Scale factor...
CGFloat scaleFactor = 0.0;
CGFloat scaledWidth = targetWidth;
CGFloat scaledHeight = targetHeight;
CGPoint thumbnailPoint = CGPointMake(0.0,0.0);
// If the sizes differ, resize proportionally
if (CGSizeEqualToSize(imageSize, targetSize) == NO)
{
CGFloat widthFactor = targetWidth / width;
CGFloat heightFactor = targetHeight / height;
if (widthFactor > heightFactor)
scaleFactor = widthFactor; // scale to fit height
else
scaleFactor = heightFactor; // scale to fit width
scaledWidth = width * scaleFactor;
scaledHeight = height * scaleFactor;
// Center the image
if (widthFactor > heightFactor)
{
thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
}
else if (widthFactor < heightFactor)
{
thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
}
}
UIGraphicsBeginImageContext(targetSize);
CGRect thumbnailRect = CGRectZero;
thumbnailRect.origin = thumbnailPoint;
thumbnailRect.size.width = scaledWidth;
thumbnailRect.size.height = scaledHeight;
[sourceImage drawInRect:thumbnailRect];
newImage = UIGraphicsGetImageFromCurrentImageContext();
if(newImage == nil)
NSLog(#"could not scale image");
UIGraphicsEndImageContext();
return newImage;
}
@end
This worked for me. I tried it with an image that I got from the Facebook SDK.
http://pulkitgoyal.in/resizing-high-resolution-images-on-ios-without-memory-issues/
I think ALAssetRepresentation can help you.
CGSize sizePic = CGSizeMake(320, 460);
UIGraphicsBeginImageContext(sizePic);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *imagePic = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(imagePic, nil, nil, nil);
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>
+ (UIImage *)resizeImage:(UIImage *)image toResolution:(int)resolution {
NSData *imageData = UIImagePNGRepresentation(image);
CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
CFDictionaryRef options = (__bridge CFDictionaryRef) @{
(id) kCGImageSourceCreateThumbnailWithTransform : @YES,
(id) kCGImageSourceCreateThumbnailFromImageAlways : @YES,
(id) kCGImageSourceThumbnailMaxPixelSize : @(resolution)
};
CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(src, 0, options);
CFRelease(src);
UIImage *img = [[UIImage alloc] initWithCGImage:thumbnail];
CGImageRelease(thumbnail); // balance the create; UIImage keeps its own reference
return img;
}
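Possible usage, assuming the method is added to a UIImage category (the 640 px limit is just an example):
UIImage *resized = [UIImage resizeImage:pickedImage toResolution:640];
// the longer side of 'resized' is now at most 640 pixels; the aspect ratio is preserved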

How to show black empty space in CGContext for iOS

I have an image that is 640x480 pixels and I need to crop and center it into a 596x596 px UIImage. Any empty space should be black (it should be black above and below the image). Right now I'm cropping it like this...
-(UIImage*)cropImage:(UIImage *)theImage toFitSize:(CGSize)theSize
{
CGFloat CROP_X = floorf((theImage.size.width-theSize.width)/2);
CGFloat CROP_Y = floorf((theImage.size.height-theSize.height)/2);
CGRect imageRect = CGRectMake(CROP_X, CROP_Y, theSize.width, theSize.height);
UIGraphicsBeginImageContext(imageRect.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetRGBFillColor(context, 0, 0, 0, 1);
CGRect drawRect = CGRectMake(-imageRect.origin.x, -imageRect.origin.y, theImage.size.width, theImage.size.height);
CGContextClipToRect(context, CGRectMake(0, 0, imageRect.size.width, imageRect.size.height));
[theImage drawInRect:drawRect];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return finalImage;
}
And I also tried
- (UIImage *)croppedImage:(CGRect)bounds
{
CGImageRef imageRef = CGImageCreateWithImageInRect([self CGImage], bounds);
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return croppedImage;
}
But the empty space comes out as transparent. How do I do this?
Thanks
Do you really need to alter the image? If you're just going to present the clipped image, you can configure the UIImageView backgroundColor property to get the desired effect.
CGRect largerRect = CGRectMake(/* larger rect */);
CGRect smallerRect = CGRectMake(/* smaller rect */);
UIImage *croppedImage = [self cropImage:largerImage toFitSize:smallerRect];
// make the view the original size
UIImageView *imageView = [[UIImageView alloc] initWithFrame:largerRect];
imageView.image = croppedImage;
// center the cropped image and give it a loud background color
imageView.contentMode = UIViewContentModeCenter;
imageView.backgroundColor = [UIColor redColor];
This is working image-resize code. You only need to call the resizeImage function and the image will be resized to the width and height you want.
-(UIImage *)resizeImage:(UIImage *)image{
float width = image.size.width;
float height = image.size.height;
float maxSide = 310;
if (width >= height && width > maxSide)
{
width = maxSide;
height = (height*(width/image.size.width));
}
else{
if (height > maxSide)
{
height = maxSide;
width = (width * (height/image.size.height));
}
}
if ((int)width % 2 != 0)
{
width-- ;
}
if ((int)height %2 !=0)
{
height-- ;
}
UIImage *newImage;
if (width != image.size.width)
newImage = [self scaleImage:image ToSize:CGSizeMake(width,height)];
else
newImage = image;
return newImage;
}

- (UIImage*)scaleImage:(UIImage*)image ToSize:(CGSize)targetSize{
if (image == nil)
return nil;
UIImage *sourceImage = image;
UIImage *newImage = nil;
CGSize imageSize = sourceImage.size;
CGFloat width = imageSize.width;
CGFloat height = imageSize.height;
CGFloat targetWidth = targetSize.width;
CGFloat targetHeight = targetSize.height;
CGFloat scaleFactor = 0.0;
CGFloat scaledWidth = targetWidth;
CGFloat scaledHeight = targetHeight;
CGPoint thumbnailPoint = CGPointMake(0.0,0.0);
if (CGSizeEqualToSize(imageSize, targetSize) == NO)
{
CGFloat widthFactor = targetWidth / width;
CGFloat heightFactor = targetHeight / height;
if (widthFactor > heightFactor)
scaleFactor = widthFactor; // scale to fit height
else
scaleFactor = heightFactor; // scale to fit width
scaledWidth = width * scaleFactor;
scaledHeight = height * scaleFactor;
// center the image
if (widthFactor > heightFactor)
{
thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
}
else
if (widthFactor < heightFactor)
{
thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
}
}
UIGraphicsBeginImageContext(targetSize); // this will crop
CGRect thumbnailRect = CGRectZero;
thumbnailRect.origin = thumbnailPoint;
thumbnailRect.size.width = scaledWidth;
thumbnailRect.size.height = scaledHeight;
[sourceImage drawInRect:thumbnailRect];
newImage = UIGraphicsGetImageFromCurrentImageContext();
if(newImage == nil)
{
}
UIGraphicsEndImageContext();
return newImage;
}

UIImageOrientation issue while uploading to server

I don't know how this happens, but when I take an image from the camera in portrait mode and upload it to the server it displays fine; when I take the same image from the Photo Library and upload it to the server, it displays in landscape mode.
I don't know how this happens, and I have been stuck for the last 5 hours.
I have gone through this and this but haven't had success yet.
Can anyone help me with this problem?
Thanks in advance.
UPDATE
- (UIImage *)imageToFitSize:(CGSize)fitSize method:(MGImageResizingMethod)resizeMethod
{
float imageScaleFactor = 1.0;
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 40000
if ([self respondsToSelector:@selector(scale)]) {
imageScaleFactor = [self scale];
}
#endif
float sourceWidth = [self size].width * imageScaleFactor;
float sourceHeight = [self size].height * imageScaleFactor;
float targetWidth = fitSize.width;
float targetHeight = fitSize.height;
BOOL cropping = !(resizeMethod == MGImageResizeScale);
// Calculate aspect ratios
float sourceRatio = sourceWidth / sourceHeight;
float targetRatio = targetWidth / targetHeight;
// Determine what side of the source image to use for proportional scaling
BOOL scaleWidth = (sourceRatio <= targetRatio);
// Deal with the case of just scaling proportionally to fit, without cropping
scaleWidth = (cropping) ? scaleWidth : !scaleWidth;
// Proportionally scale source image
float scalingFactor, scaledWidth, scaledHeight;
if (scaleWidth) {
scalingFactor = 1.0 / sourceRatio;
scaledWidth = targetWidth;
scaledHeight = round(targetWidth * scalingFactor);
} else {
scalingFactor = sourceRatio;
scaledWidth = round(targetHeight * scalingFactor);
scaledHeight = targetHeight;
}
float scaleFactor = scaledHeight / sourceHeight;
// Calculate compositing rectangles
CGRect sourceRect, destRect;
if (cropping) {
destRect = CGRectMake(0, 0, targetWidth, targetHeight);
float destX, destY;
if (resizeMethod == MGImageResizeCrop) {
// Crop center
destX = round((scaledWidth - targetWidth) / 2.0);
destY = round((scaledHeight - targetHeight) / 2.0);
} else if (resizeMethod == MGImageResizeCropStart) {
// Crop top or left (prefer top)
if (scaleWidth) {
// Crop top
destX = 0.0;
destY = 0.0;
} else {
// Crop left
destX = 0.0;
destY = round((scaledHeight - targetHeight) / 2.0);
}
} else if (resizeMethod == MGImageResizeCropEnd) {
// Crop bottom or right
if (scaleWidth) {
// Crop bottom
destX = round((scaledWidth - targetWidth) / 2.0);
destY = round(scaledHeight - targetHeight);
} else {
// Crop right
destX = round(scaledWidth - targetWidth);
destY = round((scaledHeight - targetHeight) / 2.0);
}
}
sourceRect = CGRectMake(destX / scaleFactor, destY / scaleFactor,
targetWidth / scaleFactor, targetHeight / scaleFactor);
} else {
sourceRect = CGRectMake(0, 0, sourceWidth, sourceHeight);
destRect = CGRectMake(0, 0, scaledWidth, scaledHeight);
}
// Create appropriately modified image.
UIImage *image = nil;
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 40000
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 4.0) {
UIGraphicsBeginImageContextWithOptions(destRect.size, NO, 0.0); // 0.0 for scale means "correct scale for device's main screen".
CGImageRef sourceImg;
if(resizeMethod == MGImageResizeCrop)
sourceImg = CGImageCreateWithImageInRect([self CGImage], sourceRect); // cropping happens here.
else
sourceImg = CGImageRetain([self CGImage]); // scaling happens here.
image = [UIImage imageWithCGImage:sourceImg scale:0.0 orientation:self.imageOrientation]; // create cropped UIImage.
[image drawInRect:destRect]; // the actual scaling happens here, and orientation is taken care of automatically.
CGImageRelease(sourceImg);
image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
#endif
if (!image) {
// Try older method.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, fitSize.width, fitSize.height, 8, (fitSize.width * 4),
colorSpace, kCGImageAlphaPremultipliedLast);
CGImageRef sourceImg;
if(resizeMethod == MGImageResizeCrop)
sourceImg = CGImageCreateWithImageInRect([self CGImage], sourceRect); // cropping happens here.
else
sourceImg = CGImageRetain([self CGImage]); // scaling happens here.
//CGImageRef sourceImg = CGImageCreateWithImageInRect([self CGImage], sourceRect);
CGContextDrawImage(context, destRect, sourceImg);
CGImageRelease(sourceImg);
CGImageRef finalImage = CGBitmapContextCreateImage(context);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
image = [UIImage imageWithCGImage:finalImage];
CGImageRelease(finalImage);
}
return image;
}
Where MGImageResizingMethod is an enum that I defined, and I pass MGImageResizeScale as the argument to the function.
Try this; it may work. You have to set the condition according to your image, like:
if(image == fromCamera){
[image fixOrientation];
}
else{
// do not apply the orientation fix in this case
}
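The fixOrientation helper used above is not shown in the answer; a minimal sketch of one common version (assumed to live in a UIImage category) simply redraws the image so the orientation is baked into the pixels:
- (UIImage *)fixOrientation {
    if (self.imageOrientation == UIImageOrientationUp) return self; // nothing to do
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    // drawInRect: honors imageOrientation, so the redrawn pixels come out upright
    [self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}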
I suggest you visit these two references and you will get a solution:
Camera image changes orientation
Image became horizontal after successfully uploaded on server using Http Post
Hope this will help you.

How to resize an image programmatically in Objective-C on iPhone

I have an application where I am displaying large images in a small space.
The images are quite large, but I am only displaying them in 100x100 pixel frames.
My app is responding slowly because of the size of the images I am using.
To improve performance, how can I resize the images programmatically using Objective-C?
Please find the following code.
- (UIImage *)imageWithImage:(UIImage *)image convertToSize:(CGSize)size {
UIGraphicsBeginImageContext(size);
[image drawInRect:CGRectMake(0, 0, size.width, size.height)];
UIImage *destImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return destImage;
}
That code just changes the image's scale without preserving the aspect ratio. Set the CGSize according to your image's width and height so the image will not stretch and is arranged in the middle:
- (UIImage *)imageWithImage:(UIImage *)image scaledToFillSize:(CGSize)size
{
CGFloat scale = MAX(size.width/image.size.width, size.height/image.size.height);
CGFloat width = image.size.width * scale;
CGFloat height = image.size.height * scale;
CGRect imageRect = CGRectMake((size.width - width)/2.0f,
(size.height - height)/2.0f,
width,
height);
UIGraphicsBeginImageContextWithOptions(size, NO, 0);
[image drawInRect:imageRect];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
My favorite way to do this is with CGImageSourceCreateThumbnailAtIndex (in the ImageIO framework). The name is a bit misleading.
Here's an excerpt of some code from a recent app of mine.
CGFloat maxw = 100; // whatever max width you need (100 is just a placeholder)
CGFloat maxh = 100; // whatever max height you need (100 is just a placeholder)
CGImageSourceRef src = NULL;
// imageSource is assumed to be either an NSURL or an NSData instance
if ([imageSource isKindOfClass:[NSURL class]])
src = CGImageSourceCreateWithURL((__bridge CFURLRef)imageSource, nil);
else if ([imageSource isKindOfClass:[NSData class]])
src = CGImageSourceCreateWithData((__bridge CFDataRef)imageSource, nil);
// if at double resolution, double the thumbnail size and use double-resolution image
CGFloat scale = 1;
if ([[UIScreen mainScreen] scale] > 1.0) {
scale = 2;
maxw *= 2;
maxh *= 2;
}
// load the image at the desired size
NSDictionary* d = @{
(id)kCGImageSourceShouldAllowFloat: (id)kCFBooleanTrue,
(id)kCGImageSourceCreateThumbnailWithTransform: (id)kCFBooleanTrue,
(id)kCGImageSourceCreateThumbnailFromImageAlways: (id)kCFBooleanTrue,
(id)kCGImageSourceThumbnailMaxPixelSize: @((int)(maxw > maxh ? maxw : maxh))
};
CGImageRef imref = CGImageSourceCreateThumbnailAtIndex(src, 0, (__bridge CFDictionaryRef)d);
if (NULL != src)
CFRelease(src);
UIImage* im = [UIImage imageWithCGImage:imref scale:scale orientation:UIImageOrientationUp];
if (NULL != imref)
CFRelease(imref);
If you are using an image at different sizes and resizing it each time, it will degrade your app's performance. The solution is: don't resize the images; just use a button in place of the image view and set the image on the button. It will resize automatically and you will get great performance.
I was also resizing images while setting them on a cell and my app got slow, so I used a button in place of the image view (the button does the resizing, not my code) and it is working perfectly fine.
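Roughly, the button-in-place-of-image-view idea looks like this (a sketch; largeImage, the cell, and the 100 x 100 frame are illustrative):
UIButton *thumbButton = [UIButton buttonWithType:UIButtonTypeCustom];
thumbButton.frame = CGRectMake(0, 0, 100, 100);
thumbButton.imageView.contentMode = UIViewContentModeScaleAspectFill;
thumbButton.clipsToBounds = YES;
thumbButton.userInteractionEnabled = NO; // display only, no tap handling
[thumbButton setImage:largeImage forState:UIControlStateNormal]; // the button scales it for display
[cell.contentView addSubview:thumbButton];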
-(UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)targetSize
{
//If scaleFactor is not touched, no scaling will occur
CGFloat scaleFactor = 1.0;
//Deciding which factor to use to scale the image (factor = targetSize / imageSize)
if (image.size.width > targetSize.width ||
image.size.height > targetSize.height || image.size.width == image.size.height)
if (!((scaleFactor = (targetSize.width /
image.size.width)) > (targetSize.height /
image.size.height))) //scale to fit width, or
scaleFactor = targetSize.height / image.size.height; // scale to fit height.
UIGraphicsBeginImageContext(targetSize);
//Draw the scaled image centered in the target rect
CGRect rect = CGRectMake((targetSize.width - image.size.width * scaleFactor) / 2, (targetSize.height - image.size.height * scaleFactor) / 2, image.size.width * scaleFactor, image.size.height * scaleFactor);
[image drawInRect:rect];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return scaledImage;
}
Since the code ran perfectly fine in iOS 4, for backwards compatibility I added a check for OS version and for anything below 5.0 the old code would work.
- (UIImage *)resizedImage:(CGSize)newSize interpolationQuality:(CGInterpolationQuality)quality {
BOOL drawTransposed;
CGAffineTransform transform = CGAffineTransformIdentity;
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 5.0) {
// Apparently in iOS 5 the image is already correctly rotated, so we don't need to rotate it manually
drawTransposed = NO;
} else {
switch (self.imageOrientation) {
case UIImageOrientationLeft:
case UIImageOrientationLeftMirrored:
case UIImageOrientationRight:
case UIImageOrientationRightMirrored:
drawTransposed = YES;
break;
default:
drawTransposed = NO;
}
transform = [self transformForOrientation:newSize];
}
return [self resizedImage:newSize
transform:transform
drawTransposed:drawTransposed
interpolationQuality:quality];
}
You can use this.
[m_Image.layer setMinificationFilter:kCAFilterTrilinear];
This thread is old, but it is what I pulled up when trying to solve this problem. Once the image was scaled, it was not displaying well in my container even though I turned Auto Layout off. The easiest way for me to solve this for display in a table row was to paint the image on a white background of a fixed size.
Helper function
+(UIImage*)scaleMaintainAspectRatio:(UIImage*)sourceImage :(float)i_width :(float)i_height
{
float newHeight = 0.0;
float newWidth = 0.0;
float oldWidth = sourceImage.size.width;
float widthScaleFactor = i_width / oldWidth;
float oldHeight = sourceImage.size.height;
float heightScaleFactor = i_height / oldHeight;
if (heightScaleFactor > widthScaleFactor) {
newHeight = oldHeight * widthScaleFactor;
newWidth = sourceImage.size.width * widthScaleFactor;
} else {
newHeight = sourceImage.size.height * heightScaleFactor;
newWidth = oldWidth * heightScaleFactor;
}
// return image in white rect
float cxPad = i_width - newWidth;
float cyPad = i_height - newHeight;
if (cyPad > 0) {
cyPad = cyPad / 2.0;
}
if (cxPad > 0) {
cxPad = cxPad / 2.0;
}
CGSize size = CGSizeMake(i_width, i_height);
UIGraphicsBeginImageContextWithOptions(CGSizeMake(size.width, size.height), YES, 0.0);
[[UIColor whiteColor] setFill];
UIRectFill(CGRectMake(0, 0, size.width, size.height));
[sourceImage drawInRect:CGRectMake((int)cxPad, (int)cyPad, newWidth, newHeight)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
// will return scaled image at actual size, not in white rect
// UIGraphicsBeginImageContext(CGSizeMake(newWidth, newHeight));
// [sourceImage drawInRect:CGRectMake(0, 0, newWidth, newHeight)];
// UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
// UIGraphicsEndImageContext();
// return newImage;
}
I called this like this from my table view cellForRowAtIndexPath
PFFile *childsPicture = [object objectForKey:@"picture"];
[childsPicture getDataInBackgroundWithBlock:^(NSData *imageData, NSError *error) {
if (!error) {
UIImage *largePicture = [UIImage imageWithData:imageData];
UIImage *scaledPicture = [Utility scaleMaintainAspectRatio:largePicture :70.0 :70.0 ];
PFImageView *thumbnailImageView = (PFImageView*)[cell viewWithTag:100];
thumbnailImageView.image = scaledPicture;
[self.tableView reloadData];
}
}];
Hello from the end of 2018.
Solved with the following solution (you only need the last line; the first and second are just for explanation):
NSURL *url = [NSURL URLWithString:response.json[0][@"photo_50"]];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *image = [UIImage imageWithData:data scale:customScale];
'customScale' is the scale you want (>1 if the image must be smaller, <1 if it must be bigger).
This C function will resize your image with a corner radius without affecting the image's quality:
UIImage *Resize_Image(UIImage *iImage, CGFloat iSize, CGFloat icornerRadius) {
CGFloat scale = MAX(CGSizeMake(iSize ,iSize).width/iImage.size.width, CGSizeMake(iSize ,iSize).height/iImage.size.height);
CGFloat width = iImage.size.width * scale;
CGFloat height = iImage.size.height * scale;
CGRect imageRect = CGRectMake((CGSizeMake(iSize ,iSize).width - width)/2.0f,(CGSizeMake(iSize ,iSize).height - height)/2.0f,width,height);
UIGraphicsBeginImageContextWithOptions(CGSizeMake(iSize ,iSize), NO, 0);
[[UIBezierPath bezierPathWithRoundedRect:imageRect cornerRadius:icornerRadius] addClip];
[iImage drawInRect:imageRect];
UIImage *ResizedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return ResizedImage;
}
This is how to use it:
UIImage *ResizedImage = Resize_Image([UIImage imageNamed:@"image.png"], 64, 14.4);
I do not remember where I took the first 4 lines from.

UIImage: Resize, then Crop

I've been bashing my face into this one for literally days now and even though I feel constantly that I am right on the edge of revelation, I simply cannot achieve my goal.
I thought, ahead of time in the conceptual phases of my design, that it would be a trivial matter to grab a image from the iPhone's camera or library, scale it down to a specified height, using a function equivalent to the Aspect Fill option of UIImageView (entirely in code), and then crop off anything that did not fit within a passed CGRect.
Getting the original image from camera or library, was trivial. I am shocked at how difficult the other two steps have proved to be.
The attached image shows what I am trying to achieve. Would someone please be kind enough to hold my hand? Every code example I have found so far seems to smash the image, be upside down, look like crap, draw out of bounds, or otherwise just not work correctly.
I needed the same thing - in my case, to pick the dimension that fits once scaled, and then crop each end to fit the rest to the width. (I'm working in landscape, so might not have noticed any deficiencies in portrait mode.) Here's my code - it's part of a category on UIImage. Target size in my code is always set to the full screen size of the device.
@implementation UIImage (Extras)
#pragma mark -
#pragma mark Scale and crop image
- (UIImage*)imageByScalingAndCroppingForSize:(CGSize)targetSize
{
UIImage *sourceImage = self;
UIImage *newImage = nil;
CGSize imageSize = sourceImage.size;
CGFloat width = imageSize.width;
CGFloat height = imageSize.height;
CGFloat targetWidth = targetSize.width;
CGFloat targetHeight = targetSize.height;
CGFloat scaleFactor = 0.0;
CGFloat scaledWidth = targetWidth;
CGFloat scaledHeight = targetHeight;
CGPoint thumbnailPoint = CGPointMake(0.0,0.0);
if (CGSizeEqualToSize(imageSize, targetSize) == NO)
{
CGFloat widthFactor = targetWidth / width;
CGFloat heightFactor = targetHeight / height;
if (widthFactor > heightFactor)
{
scaleFactor = widthFactor; // scale to fit height
}
else
{
scaleFactor = heightFactor; // scale to fit width
}
scaledWidth = width * scaleFactor;
scaledHeight = height * scaleFactor;
// center the image
if (widthFactor > heightFactor)
{
thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
}
else
{
if (widthFactor < heightFactor)
{
thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
}
}
}
UIGraphicsBeginImageContext(targetSize); // this will crop
CGRect thumbnailRect = CGRectZero;
thumbnailRect.origin = thumbnailPoint;
thumbnailRect.size.width = scaledWidth;
thumbnailRect.size.height = scaledHeight;
[sourceImage drawInRect:thumbnailRect];
newImage = UIGraphicsGetImageFromCurrentImageContext();
if(newImage == nil)
{
NSLog(#"could not scale image");
}
//pop the context to get back to the default
UIGraphicsEndImageContext();
return newImage;
}
An older post contains code for a method to resize your UIImage. The relevant portion is as follows:
+ (UIImage*)imageWithImage:(UIImage*)image
scaledToSize:(CGSize)newSize;
{
UIGraphicsBeginImageContext( newSize );
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
As far as cropping goes, I believe that if you alter the method to use a different size for the scaling than for the context, your resulting image should be clipped to the bounds of the context.
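A rough sketch of that clipping idea (the method name and parameters are illustrative, not from the original answer): the context is created at the crop size while the image is drawn at the larger scaled size, so the overflow is clipped.
+ (UIImage *)imageWithImage:(UIImage *)image
               scaledToSize:(CGSize)scaledSize
              croppedToSize:(CGSize)cropSize
{
    UIGraphicsBeginImageContext(cropSize); // context = final (cropped) size
    // Center the oversized draw rect so the crop is taken from the middle
    CGFloat xOffset = (cropSize.width - scaledSize.width) / 2.0;
    CGFloat yOffset = (cropSize.height - scaledSize.height) / 2.0;
    [image drawInRect:CGRectMake(xOffset, yOffset, scaledSize.width, scaledSize.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}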
+ (UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)targetSize {
//If scaleFactor is not touched, no scaling will occur
CGFloat scaleFactor = 1.0;
//Deciding which factor to use to scale the image (factor = targetSize / imageSize)
if (image.size.width > targetSize.width || image.size.height > targetSize.height)
if (!((scaleFactor = (targetSize.width / image.size.width)) > (targetSize.height / image.size.height))) //scale to fit width, or
scaleFactor = targetSize.height / image.size.height; // scale to fit height.
UIGraphicsBeginImageContext(targetSize);
//Creating the rect where the scaled image is drawn in
CGRect rect = CGRectMake((targetSize.width - image.size.width * scaleFactor) / 2,
(targetSize.height - image.size.height * scaleFactor) / 2,
image.size.width * scaleFactor, image.size.height * scaleFactor);
//Draw the image into the rect
[image drawInRect:rect];
//Saving the image, ending image context
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return scaledImage;
}
I propose this one. Isn't she a beauty? ;)
There's a great piece of code related to the resizing of images + several other operations. I came around this one when trying to figure ou how to resize images...
http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
This is a version of Jane Sales' answer in Swift. Cheers!
public func resizeImage(image: UIImage, size: CGSize) -> UIImage? {
var returnImage: UIImage?
var scaleFactor: CGFloat = 1.0
var scaledWidth = size.width
var scaledHeight = size.height
var thumbnailPoint = CGPointMake(0, 0)
if !CGSizeEqualToSize(image.size, size) {
let widthFactor = size.width / image.size.width
let heightFactor = size.height / image.size.height
if widthFactor > heightFactor {
scaleFactor = widthFactor
} else {
scaleFactor = heightFactor
}
scaledWidth = image.size.width * scaleFactor
scaledHeight = image.size.height * scaleFactor
if widthFactor > heightFactor {
thumbnailPoint.y = (size.height - scaledHeight) * 0.5
} else if widthFactor < heightFactor {
thumbnailPoint.x = (size.width - scaledWidth) * 0.5
}
}
UIGraphicsBeginImageContextWithOptions(size, true, 0)
var thumbnailRect = CGRectZero
thumbnailRect.origin = thumbnailPoint
thumbnailRect.size.width = scaledWidth
thumbnailRect.size.height = scaledHeight
image.drawInRect(thumbnailRect)
returnImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return returnImage
}
Here you go. This one is perfect ;-)
EDIT: see below comment - "Does not work with certain images, fails with: CGContextSetInterpolationQuality: invalid context 0x0 error"
// Resizes the image according to the given content mode, taking into account the image's orientation
- (UIImage *)resizedImageWithContentMode:(UIViewContentMode)contentMode imageToScale:(UIImage*)imageToScale bounds:(CGSize)bounds interpolationQuality:(CGInterpolationQuality)quality {
//Get the size we want to scale it to
CGFloat horizontalRatio = bounds.width / imageToScale.size.width;
CGFloat verticalRatio = bounds.height / imageToScale.size.height;
CGFloat ratio;
switch (contentMode) {
case UIViewContentModeScaleAspectFill:
ratio = MAX(horizontalRatio, verticalRatio);
break;
case UIViewContentModeScaleAspectFit:
ratio = MIN(horizontalRatio, verticalRatio);
break;
default:
[NSException raise:NSInvalidArgumentException format:@"Unsupported content mode: %d", contentMode];
}
//...and here it is
CGSize newSize = CGSizeMake(imageToScale.size.width * ratio, imageToScale.size.height * ratio);
//start scaling it
CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
CGImageRef imageRef = imageToScale.CGImage;
CGContextRef bitmap = CGBitmapContextCreate(NULL,
newRect.size.width,
newRect.size.height,
CGImageGetBitsPerComponent(imageRef),
0,
CGImageGetColorSpace(imageRef),
CGImageGetBitmapInfo(imageRef));
CGContextSetInterpolationQuality(bitmap, quality);
// Draw into the context; this scales the image
CGContextDrawImage(bitmap, newRect, imageRef);
// Get the resized image from the context and a UIImage
CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
// Clean up
CGContextRelease(bitmap);
CGImageRelease(newImageRef);
return newImage;
}
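Possible usage (a sketch; sourceImage and the 300 x 300 bounds are illustrative):
UIImage *fitted = [self resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                       imageToScale:sourceImage
                                             bounds:CGSizeMake(300, 300)
                               interpolationQuality:kCGInterpolationHigh];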
I found that the Swift 3 version posted by Evgenii Kanvets does not uniformly scale the image.
Here is my Swift 4 version of the function that does not squish the image:
static func resizedCroppedImage(image: UIImage, newSize:CGSize) -> UIImage? {
// This function returns a newImage, based on image
// - image is scaled uniformaly to fit into a rect of size newSize
// - if the newSize rect is of a different aspect ratio from the source image
// the new image is cropped to be in the center of the source image
// (the excess source image is removed)
var ratio: CGFloat = 0
var delta: CGFloat = 0
var drawRect = CGRect()
if newSize.width > newSize.height {
ratio = newSize.width / image.size.width
delta = (ratio * image.size.height) - newSize.height
drawRect = CGRect(x: 0, y: -delta / 2, width: newSize.width, height: newSize.height + delta)
} else {
ratio = newSize.height / image.size.height
delta = (ratio * image.size.width) - newSize.width
drawRect = CGRect(x: -delta / 2, y: 0, width: newSize.width + delta, height: newSize.height)
}
UIGraphicsBeginImageContextWithOptions(newSize, true, 0.0)
image.draw(in: drawRect)
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return newImage
}
I modified Brad Larson's code. It will aspect-fill the image into the given rect.
-(UIImage*) scaleAndCropToSize:(CGSize)newSize;
{
float ratio = self.size.width / self.size.height;
UIGraphicsBeginImageContext(newSize);
if (ratio > 1) {
CGFloat newWidth = ratio * newSize.width;
CGFloat newHeight = newSize.height;
CGFloat leftMargin = (newWidth - newHeight) / 2;
[self drawInRect:CGRectMake(-leftMargin, 0, newWidth, newHeight)];
}
else {
CGFloat newWidth = newSize.width;
CGFloat newHeight = newSize.height / ratio;
CGFloat topMargin = (newHeight - newWidth) / 2;
[self drawInRect:CGRectMake(0, -topMargin, newSize.width, newSize.height/ratio)];
}
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0.0,0.0,ScreenWidth,ScreenHeigth)];
[scrollView setBackgroundColor:[UIColor blackColor]];
[scrollView setDelegate:self];
[scrollView setShowsHorizontalScrollIndicator:NO];
[scrollView setShowsVerticalScrollIndicator:NO];
[scrollView setMaximumZoomScale:2.0];
image=[image scaleToSize:CGSizeMake(ScreenWidth, ScreenHeigth)];
imageView = [[UIImageView alloc] initWithImage:image];
UIImageView* imageViewBk = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"background.png"]];
[self.view addSubview:imageViewBk];
CGRect rect;
rect.origin.x=0;
rect.origin.y=0;
rect.size.width = image.size.width;
rect.size.height = image.size.height;
[imageView setFrame:rect];
[scrollView setContentSize:[imageView frame].size];
[scrollView setMinimumZoomScale:[scrollView frame].size.width / [imageView frame].size.width];
[scrollView setZoomScale:[scrollView minimumZoomScale]];
[scrollView addSubview:imageView];
[[self view] addSubview:scrollView];
Then you can take a screenshot of your image like this:
float zoomScale = 1.0 / [scrollView zoomScale];
CGRect rect;
rect.origin.x = [scrollView contentOffset].x * zoomScale;
rect.origin.y = [scrollView contentOffset].y * zoomScale;
rect.size.width = [scrollView bounds].size.width * zoomScale;
rect.size.height = [scrollView bounds].size.height * zoomScale;
CGImageRef cr = CGImageCreateWithImageInRect([[imageView image] CGImage], rect);
UIImage *cropped = [UIImage imageWithCGImage:cr];
CGImageRelease(cr);
A Xamarin.iOS version of the accepted answer on how to resize and then crop a UIImage (Aspect Fill) is below.
public static UIImage ScaleAndCropImage(UIImage sourceImage, SizeF targetSize)
{
var imageSize = sourceImage.Size;
UIImage newImage = null;
var width = imageSize.Width;
var height = imageSize.Height;
var targetWidth = targetSize.Width;
var targetHeight = targetSize.Height;
var scaleFactor = 0.0f;
var scaledWidth = targetWidth;
var scaledHeight = targetHeight;
var thumbnailPoint = PointF.Empty;
if (imageSize != targetSize)
{
var widthFactor = targetWidth / width;
var heightFactor = targetHeight / height;
if (widthFactor > heightFactor)
{
scaleFactor = widthFactor;// scale to fit height
}
else
{
scaleFactor = heightFactor;// scale to fit width
}
scaledWidth = width * scaleFactor;
scaledHeight = height * scaleFactor;
// center the image
if (widthFactor > heightFactor)
{
thumbnailPoint.Y = (targetHeight - scaledHeight) * 0.5f;
}
else
{
if (widthFactor < heightFactor)
{
thumbnailPoint.X = (targetWidth - scaledWidth) * 0.5f;
}
}
}
UIGraphics.BeginImageContextWithOptions(targetSize, false, 0.0f);
var thumbnailRect = new RectangleF(thumbnailPoint, new SizeF(scaledWidth, scaledHeight));
sourceImage.Draw(thumbnailRect);
newImage = UIGraphics.GetImageFromCurrentImageContext();
if (newImage == null)
{
Console.WriteLine("could not scale image");
}
//pop the context to get back to the default
UIGraphics.EndImageContext();
return newImage;
}
I converted Sam Wirch's guide to swift and it worked well for me, although there's some very slight "squishing" in the final image that I couldn't resolve.
func resizedCroppedImage(image: UIImage, newSize:CGSize) -> UIImage {
var ratio: CGFloat = 0
var delta: CGFloat = 0
var offset = CGPointZero
if image.size.width > image.size.height {
ratio = newSize.width / image.size.width
delta = (ratio * image.size.width) - (ratio * image.size.height)
offset = CGPointMake(delta / 2, 0)
} else {
ratio = newSize.width / image.size.height
delta = (ratio * image.size.height) - (ratio * image.size.width)
offset = CGPointMake(0, delta / 2)
}
let clipRect = CGRectMake(-offset.x, -offset.y, (ratio * image.size.width) + delta, (ratio * image.size.height) + delta)
UIGraphicsBeginImageContextWithOptions(newSize, true, 0.0)
UIRectClip(clipRect)
image.drawInRect(clipRect)
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return newImage
}
If anyone wants the objective c version, it's on his website.
Here is a Swift 3 version of Sam Wirch's guide to swift posted by William T.
extension UIImage {
static func resizedCroppedImage(image: UIImage, newSize:CGSize) -> UIImage? {
var ratio: CGFloat = 0
var delta: CGFloat = 0
var offset = CGPoint.zero
if image.size.width > image.size.height {
ratio = newSize.width / image.size.width
delta = (ratio * image.size.width) - (ratio * image.size.height)
offset = CGPoint(x: delta / 2, y: 0)
} else {
ratio = newSize.width / image.size.height
delta = (ratio * image.size.height) - (ratio * image.size.width)
offset = CGPoint(x: 0, y: delta / 2)
}
let clipRect = CGRect(x: -offset.x, y: -offset.y, width: (ratio * image.size.width) + delta, height: (ratio * image.size.height) + delta)
UIGraphicsBeginImageContextWithOptions(newSize, true, 0.0)
UIRectClip(clipRect)
image.draw(in: clipRect)
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return newImage
}
}
The following simple code worked for me.
[imageView setContentMode:UIViewContentModeScaleAspectFill];
[imageView setClipsToBounds:YES];
- (UIImage*)imageScale:(CGFloat)scaleFactor cropForSize:(CGSize)targetSize
{
targetSize = !targetSize.width?self.size:targetSize;
UIGraphicsBeginImageContext(targetSize); // this will crop
CGRect thumbnailRect = CGRectZero;
thumbnailRect.size.width = targetSize.width*scaleFactor;
thumbnailRect.size.height = targetSize.height*scaleFactor;
CGFloat xOffset = (targetSize.width- thumbnailRect.size.width)/2;
CGFloat yOffset = (targetSize.height- thumbnailRect.size.height)/2;
thumbnailRect.origin = CGPointMake(xOffset,yOffset);
[self drawInRect:thumbnailRect];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
if(newImage == nil)
{
NSLog(#"could not scale image");
}
UIGraphicsEndImageContext();
return newImage;
}
Below is an example of how it works:
Left image - (original image); right image with scale x2.
If you want to scale image but retain its frame(proportions), call method this way:
[yourImage imageScale:2.0f cropForSize:CGSizeZero];
This question seems to have been put to rest, but in my quest for a solution that I could more easily understand (and written in Swift), I arrived at this (also posted to: How to crop the UIImage?)
I wanted to be able to crop from a region based on an aspect ratio, and scale to a size based on a outer bounding extent. Here is my variation:
import AVFoundation
import ImageIO
class Image {
class func crop(image:UIImage, crop source:CGRect, aspect:CGSize, outputExtent:CGSize) -> UIImage {
let sourceRect = AVMakeRectWithAspectRatioInsideRect(aspect, source)
let targetRect = AVMakeRectWithAspectRatioInsideRect(aspect, CGRect(origin: CGPointZero, size: outputExtent))
let opaque = true, deviceScale:CGFloat = 0.0 // use scale of device's main screen
UIGraphicsBeginImageContextWithOptions(targetRect.size, opaque, deviceScale)
let scale = max(
targetRect.size.width / sourceRect.size.width,
targetRect.size.height / sourceRect.size.height)
// Note: negating/multiplying CGPoint and CGSize here assumes custom operator overloads not shown in this excerpt
let drawRect = CGRect(origin: -sourceRect.origin * scale, size: image.size * scale)
image.drawInRect(drawRect)
let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return scaledImage
}
}
There are a couple of things that I found confusing: the separate concerns of cropping and resizing. Cropping is handled with the origin of the rect that you pass to drawInRect, and scaling is handled by the size portion. In my case, I needed to relate the size of the cropping rect on the source to my output rect of the same aspect ratio. The scale factor is then output / input, and this needs to be applied to the drawRect (passed to drawInRect).
One caveat is that this approach effectively assumes that the image you are drawing is larger than the image context. I have not tested this, but I think you can use this code to handle cropping/zooming by explicitly defining the scale parameter to be the aforementioned scale parameter. By default, UIKit applies a multiplier based on the screen resolution.
Finally, it should be noted that this UIKit approach is higher level than CoreGraphics / Quartz and Core Image approaches, and seems to handle image orientation issues. It is also worth mentioning that it is pretty fast, second to ImageIO, according to this post here: http://nshipster.com/image-resizing/
Swift version:
static func imageWithImage(image:UIImage, newSize:CGSize) ->UIImage {
UIGraphicsBeginImageContextWithOptions(newSize, true, UIScreen.mainScreen().scale);
image.drawInRect(CGRectMake(0, 0, newSize.width, newSize.height))
let newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage
}