A thin white line is added when resizing an image - iPhone

We resize the image (after downloading it and before storing it in the Documents directory) with the following code:
-(UIImage *)resizeImage:(UIImage *)image withSize:(CGSize)newSize
{
    float actualHeight = image.size.height;
    float actualWidth = image.size.width;
    float imgRatio = actualWidth / actualHeight;
    float maxRatio = newSize.width / newSize.height;
    if (imgRatio != maxRatio) {
        if (imgRatio < maxRatio) {
            imgRatio = newSize.width / actualHeight;
            actualWidth = imgRatio * actualWidth;
            actualHeight = newSize.width;
        }
        else {
            imgRatio = newSize.height / actualWidth;
            actualHeight = imgRatio * actualHeight;
            actualWidth = newSize.height;
        }
    }
    CGRect rect = CGRectMake(0.0, 0.0, actualWidth, actualHeight);
    UIGraphicsBeginImageContext(rect.size);
    [image drawInRect:rect];
    UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    //[resizedImage release];
    return [resizedImage autorelease];
}
This produces a resized image with a thin white line added along one edge, depending on the image's orientation: if the image is landscape, the white line appears at the bottom; if it is portrait, the white line appears on the right-hand side.
How can I get rid of that white line?
Thank you.

Although the size is specified in floating point units, the actual image is always an integral number of pixels.
When you calculate the new size to preserve the aspect ratio, you will typically have only one of the sides as a whole number of pixels, while the other scales to have some fractional part. When you then draw the old image into that rect, it doesn't quite fill the new image. So what you see as a white line is the graphics system's way of rendering the pixels that are part image, part background.
In essence, what you want to do is not quite possible, so you need to fudge it somehow. There are several possibilities:
Scale the image such that the aspect ratio is not perfectly preserved but you have integral values, for example by rounding:
actualWidth = round(imgRatio * actualWidth);
Maintain the aspect ratio but clip the fractional edge. The easiest way to do this is probably to make the image context a little smaller:
UIGraphicsBeginImageContext(CGSizeMake(floor(actualWidth), floor(actualHeight)));
[image drawInRect:rect];
Just fill the background first with some colour that's less obvious than white. This is a dreadful kludge, obviously, but could be effective in the right circumstances, for example if you're always drawing the image against a black background.
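For the third option, a minimal sketch of what that fill might look like, inserted just before the drawInRect: call in the question's code (black is only an example colour -- use whatever matches your background):
    UIGraphicsBeginImageContext(rect.size);
    // Fill the whole context first, so any fractional edge pixels blend
    // into this colour instead of white.
    [[UIColor blackColor] setFill];
    UIRectFill(CGRectMake(0.0, 0.0, rect.size.width, rect.size.height));
    [image drawInRect:rect];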
On a separate note, you can't call anything after return, so your final release line isn't doing anything. This is just as well because the image returned from UIGraphicsGetImageFromCurrentImageContext is autoreleased -- you should not be releasing it anyway.

This code will fix your problem:
+ (UIImage *)scaleImageProportionally:(UIImage *)image {
    if (MAX(image.size.height, image.size.width) <= DEFAULT_PHOTO_MAX_SIZE) {
        return image;
    }
    else {
        CGFloat targetWidth = 0;
        CGFloat targetHeight = 0;
        if (image.size.height > image.size.width) {
            CGFloat ratio = image.size.height / image.size.width;
            targetHeight = DEFAULT_PHOTO_MAX_SIZE;
            targetWidth = roundf(DEFAULT_PHOTO_MAX_SIZE / ratio);
        }
        else {
            CGFloat ratio = image.size.width / image.size.height;
            targetWidth = DEFAULT_PHOTO_MAX_SIZE;
            targetHeight = roundf(DEFAULT_PHOTO_MAX_SIZE / ratio);
        }
        CGSize targetSize = CGSizeMake(targetWidth, targetHeight);
        UIImage *sourceImage = image;
        UIImage *newImage = nil;
        CGSize imageSize = sourceImage.size;
        CGFloat width = imageSize.width;
        CGFloat height = imageSize.height;
        targetWidth = targetSize.width;
        targetHeight = targetSize.height;
        CGFloat scaleFactor = 0.0;
        CGFloat scaledWidth = targetWidth;
        CGFloat scaledHeight = targetHeight;
        CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);
        if (!CGSizeEqualToSize(imageSize, targetSize)) {
            CGFloat widthFactor = targetWidth / width;
            CGFloat heightFactor = targetHeight / height;
            if (widthFactor < heightFactor)
                scaleFactor = widthFactor;
            else
                scaleFactor = heightFactor;
            scaledWidth = roundf(width * scaleFactor);
            scaledHeight = roundf(height * scaleFactor);
            // center the image
            if (widthFactor < heightFactor) {
                thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
            } else if (widthFactor > heightFactor) {
                thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
            }
        }
        UIGraphicsBeginImageContext(targetSize);
        CGRect thumbnailRect = CGRectZero;
        thumbnailRect.origin = thumbnailPoint;
        thumbnailRect.size.width = scaledWidth;
        thumbnailRect.size.height = scaledHeight;
        [sourceImage drawInRect:thumbnailRect];
        newImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        if (newImage == nil) NSLog(@"could not scale image");
        return newImage;
    }
}

Related

Adjust UIImage size as per UIImageView size without losing its quality in iOS

I have one image of size 3264 × 2448 and an image view of size 768 × 1024. I want to get an image size equal to the view's size without losing image quality.
I googled for it; AspectFit is not giving me the proper output. I tried the following code:
- (UIImage *)imageByScalingProportionallyToSize:(CGSize)targetSize {
    UIImage *sourceImage = self;
    UIImage *newImage = nil;
    CGSize imageSize = sourceImage.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;
    CGFloat targetWidth = targetSize.width;
    CGFloat targetHeight = targetSize.height;
    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetWidth;
    CGFloat scaledHeight = targetHeight;
    CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);
    if (CGSizeEqualToSize(imageSize, targetSize) == NO) {
        CGFloat widthFactor = targetWidth / width;
        CGFloat heightFactor = targetHeight / height;
        if (widthFactor < heightFactor)
            scaleFactor = widthFactor;
        else
            scaleFactor = heightFactor;
        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;
        // center the image
        if (widthFactor < heightFactor) {
            thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
        } else if (widthFactor > heightFactor) {
            thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
        }
    }
    // this is actually the interesting part:
    UIGraphicsBeginImageContext(targetSize);
    CGRect thumbnailRect = CGRectZero;
    thumbnailRect.origin = thumbnailPoint;
    thumbnailRect.size.width = scaledWidth;
    thumbnailRect.size.height = scaledHeight;
    [sourceImage drawInRect:thumbnailRect];
    newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    if (newImage == nil) NSLog(@"could not scale image");
    return newImage;
}
Please try the code below:
+ (UIImage*)imageWithImage:(UIImage*)image
              scaledToSize:(CGSize)newSize;
{
    UIGraphicsBeginImageContext( newSize );
    [image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
    UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Specify your new size as (768, 1024) and pass in your original image, but make sure the width-to-height ratio of the original image is the same as the ratio of the converted size.
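If you need to compute such a size first, here is a minimal sketch that fits the original image inside the view bounds while preserving the ratio (ImageUtils is just an example name for whichever class you put the method above into, and originalImage stands for your 3264 x 2448 image):
    CGSize boundingSize = CGSizeMake(768, 1024);
    CGSize imageSize = originalImage.size;
    // Use the smaller scale factor so the whole image fits inside the bounds.
    CGFloat scale = MIN(boundingSize.width / imageSize.width,
                        boundingSize.height / imageSize.height);
    CGSize newSize = CGSizeMake(roundf(imageSize.width * scale),
                                roundf(imageSize.height * scale));
    UIImage *scaled = [ImageUtils imageWithImage:originalImage scaledToSize:newSize];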
I used this article as a reference for some of my projects that include image resizing. The UIImage+Resize category works pretty well, apart from a significant performance hit while the resized version of the image is being created.
You may cache the resized version on disk to avoid resizing it again in the future. Also, although I haven't tried it yet, you could resize the image on a background thread to avoid blocking the UI.
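A rough sketch of the background-thread idea (bigImage, imageView and the ImageUtils helper are placeholders for your own objects and for the scaling method shown earlier):
    // Do the expensive resize off the main thread...
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        UIImage *resized = [ImageUtils imageWithImage:bigImage scaledToSize:CGSizeMake(768, 576)];
        // ...and come back to the main thread before touching UIKit.
        dispatch_async(dispatch_get_main_queue(), ^{
            imageView.image = resized;
        });
    });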
The dimensions of the input image aren't a multiple of your UIImageView's dimensions, and that's why aspect fit won't work here (it will most likely leave gaps on the sides). If you use scale-to-fill, your image will warp because, again, the dimensions don't match. I think you will have to do something about the dimensions of one of the two to preserve the image properly.

How to crop UIImage with fixed dimensions?

I have a UIImage with dimensions (0, 0, 320, 460).
How do I crop this image to the rect (10, 30, 300, 300)?
Target size in my code is always set to the full screen size of the device (so you have to change it).
@implementation UIImage (Extras)
#pragma mark -
#pragma mark Scale and crop image
- (UIImage *)imageByScalingAndCroppingForSize:(CGSize)targetSize
{
    UIImage *sourceImage = self;
    UIImage *newImage = nil;
    CGSize imageSize = sourceImage.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;
    CGFloat targetWidth = targetSize.width;
    CGFloat targetHeight = targetSize.height;
    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetWidth;
    CGFloat scaledHeight = targetHeight;
    CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);
    if (CGSizeEqualToSize(imageSize, targetSize) == NO)
    {
        CGFloat widthFactor = targetWidth / width;
        CGFloat heightFactor = targetHeight / height;
        if (widthFactor > heightFactor)
        {
            scaleFactor = widthFactor; // scale to fit height
        }
        else
        {
            scaleFactor = heightFactor; // scale to fit width
        }
        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;
        // center the image
        if (widthFactor > heightFactor)
        {
            thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
        }
        else
        {
            if (widthFactor < heightFactor)
            {
                thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
            }
        }
    }
    UIGraphicsBeginImageContext(targetSize); // this will crop
    CGRect thumbnailRect = CGRectZero;
    thumbnailRect.origin = thumbnailPoint;
    thumbnailRect.size.width = scaledWidth;
    thumbnailRect.size.height = scaledHeight;
    [sourceImage drawInRect:thumbnailRect];
    newImage = UIGraphicsGetImageFromCurrentImageContext();
    if (newImage == nil)
    {
        NSLog(@"could not scale image");
    }
    //pop the context to get back to the default
    UIGraphicsEndImageContext();
    return newImage;
}
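If all you need is the fixed rect from the question, rather than scale-and-crop, a minimal sketch using CGImageCreateWithImageInRect may be enough (note this works in pixel coordinates of the backing CGImage and ignores the image's scale and orientation, which you may need to account for):
    // sourceImage is the 320x460 image from the question.
    CGRect cropRect = CGRectMake(10, 30, 300, 300);
    CGImageRef croppedRef = CGImageCreateWithImageInRect(sourceImage.CGImage, cropRect);
    UIImage *croppedImage = [UIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef);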

UIImageView not resizing when bounds or frame is changed?

So I have a custom view (it's a custom text view that sits on top of the keyboard); it can resize, however, so I would like the background image (a UIImageView with a UIImage inside it) to scale with it. The following code does nothing:
//size of entire custom view
CGRect bFrame = self.keyboard.frame;
bFrame.origin.y += heightDifference;
bFrame.size.height += heightDifference;
//lets set frame and bounds for the uiimageview
self.keyboard.background.frame = bFrame;
self.keyboard.background.bounds = bFrame;
//I thought the lines above would work, but they didn't; trying to reset the image and change its content mode as a hack... still no luck.
self.keyboard.background.image = [UIImage imageNamed:@"keyboard_backgroundv1_5.png"];
self.keyboard.background.contentMode = UIViewContentModeScaleToFill;
This function may help you:
CGSize targetSize = CGSizeMake(self.keyboard.frame.size.width, self.keyboard.frame.size.height);
self.keyboard.background.image = [self imageByScalingProportionallyToSize:targetSize ImageForConvert:[UIImage imageNamed:@"keyboard_backgroundv1_5.png"]];
- (UIImage *)imageByScalingProportionallyToSize:(CGSize)targetSize ImageForConvert:(UIImage *)sourceImage
{
    UIImage *newImage = nil;
    CGSize imageSize = sourceImage.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;
    CGFloat targetWidth = targetSize.width;
    CGFloat targetHeight = targetSize.height;
    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetWidth;
    CGFloat scaledHeight = targetHeight;
    CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);
    if (CGSizeEqualToSize(imageSize, targetSize) == NO)
    {
        CGFloat widthFactor = targetWidth / width;
        CGFloat heightFactor = targetHeight / height;
        if (widthFactor < heightFactor)
            scaleFactor = widthFactor;
        else
            scaleFactor = heightFactor;
        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;
        // center the image
        if (widthFactor < heightFactor)
        {
            thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
        } else if (widthFactor > heightFactor) {
            thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
        }
    }
    UIGraphicsBeginImageContext(targetSize);
    CGRect thumbnailRect = CGRectZero;
    thumbnailRect.origin = thumbnailPoint;
    thumbnailRect.size.width = scaledWidth;
    thumbnailRect.size.height = scaledHeight;
    [sourceImage drawInRect:thumbnailRect];
    newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}

How can I make a large image small with quality as good as the original image?

I am using an image view of size 80x80 to display a large image (1024 x 780).
When the large image is placed into the image view, it looks squeezed and compressed, not like the original quality.
My question is: how can I make the large image small while keeping the quality as good as the original?
In the image view, set the content mode to centre, not scale-to-fit or anything else.
I believe this will show any scaled image at its finest quality within your 80x80 dimensions.
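In code, that suggestion amounts to something like this (imageView being your 80x80 image view, assumed to already contain the scaled image):
imageView.contentMode = UIViewContentModeCenter; // show the image without stretching it
imageView.clipsToBounds = YES;                   // clip anything that falls outside the 80x80 frame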
In this case, you should do proper scaling of the image that you want to show in the image view:
- (UIImage *)imageByScalingProportionallyToSize:(CGSize)targetSize
{
    UIImage *sourceImage = chosenImage; // chosenImage: the original image, assumed to be set elsewhere in this class
    UIImage *newImage = nil;
    CGSize imageSize = sourceImage.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;
    CGFloat targetWidth = targetSize.width;
    CGFloat targetHeight = targetSize.height;
    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetWidth;
    CGFloat scaledHeight = targetHeight;
    CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);
    if (CGSizeEqualToSize(imageSize, targetSize) == NO) {
        CGFloat widthFactor = targetWidth / width;
        CGFloat heightFactor = targetHeight / height;
        if (widthFactor < heightFactor)
            scaleFactor = widthFactor;
        else
            scaleFactor = heightFactor;
        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;
        // center the image
        if (widthFactor < heightFactor) {
            thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
        } else if (widthFactor > heightFactor) {
            thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
        }
    }
    UIGraphicsBeginImageContext(targetSize);
    CGRect thumbnailRect = CGRectZero;
    thumbnailRect.origin = thumbnailPoint;
    thumbnailRect.size.width = scaledWidth;
    thumbnailRect.size.height = scaledHeight;
    [sourceImage drawInRect:thumbnailRect];
    newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
[imageView setContentMode:UIViewContentModeScaleAspectFill];
You should, however, consider resizing the image and saving it to disk if you want to use it more than once (see the sketch after the links below).
To resize the image:
The simplest way to resize an UIImage?
UIImage resize (Scale proportion)
UIImage: Resize, then Crop
etc..
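If you go that route, a minimal sketch for the saving part (resizedImage and the file name are placeholders):
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [docsDir stringByAppendingPathComponent:@"thumbnail80.jpg"];
// Write the resized image out once...
[UIImageJPEGRepresentation(resizedImage, 0.9) writeToFile:path atomically:YES];
// ...and reload it cheaply later.
UIImage *cached = [UIImage imageWithContentsOfFile:path];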
Going from 1024x780 down to 80x80 while keeping the quality is virtually impossible, as there isn't enough space to capture all the details.
You can try to scale it:
+ (UIImage*)imageWithImage:(UIImage*)image scaledToSize:(CGSize)newSize;
{
    UIGraphicsBeginImageContext( newSize );
    [image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
    UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
You can also set the image view's content mode and clipping:
[imageView setContentMode:UIViewContentModeScaleAspectFill];
imageView.autoresizingMask = (UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight);
[imageView setClipsToBounds:YES];

UIImage: Resize, then Crop

I've been bashing my face into this one for literally days now and even though I feel constantly that I am right on the edge of revelation, I simply cannot achieve my goal.
I thought, ahead of time in the conceptual phases of my design, that it would be a trivial matter to grab an image from the iPhone's camera or library, scale it down to a specified height using a function equivalent to the Aspect Fill option of UIImageView (entirely in code), and then crop off anything that did not fit within a passed CGRect.
Getting the original image from the camera or library was trivial. I am shocked at how difficult the other two steps have proved to be.
The attached image shows what I am trying to achieve. Would someone please be kind enough to hold my hand? Every code example I have found so far seems to smash the image, be upside down, look like crap, draw out of bounds, or otherwise just not work correctly.
I needed the same thing - in my case, to pick the dimension that fits once scaled, and then crop each end to fit the rest to the width. (I'm working in landscape, so might not have noticed any deficiencies in portrait mode.) Here's my code - it's part of a category on UIImage. Target size in my code is always set to the full screen size of the device.
@implementation UIImage (Extras)
#pragma mark -
#pragma mark Scale and crop image
- (UIImage *)imageByScalingAndCroppingForSize:(CGSize)targetSize
{
    UIImage *sourceImage = self;
    UIImage *newImage = nil;
    CGSize imageSize = sourceImage.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;
    CGFloat targetWidth = targetSize.width;
    CGFloat targetHeight = targetSize.height;
    CGFloat scaleFactor = 0.0;
    CGFloat scaledWidth = targetWidth;
    CGFloat scaledHeight = targetHeight;
    CGPoint thumbnailPoint = CGPointMake(0.0, 0.0);
    if (CGSizeEqualToSize(imageSize, targetSize) == NO)
    {
        CGFloat widthFactor = targetWidth / width;
        CGFloat heightFactor = targetHeight / height;
        if (widthFactor > heightFactor)
        {
            scaleFactor = widthFactor; // scale to fit height
        }
        else
        {
            scaleFactor = heightFactor; // scale to fit width
        }
        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;
        // center the image
        if (widthFactor > heightFactor)
        {
            thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
        }
        else
        {
            if (widthFactor < heightFactor)
            {
                thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
            }
        }
    }
    UIGraphicsBeginImageContext(targetSize); // this will crop
    CGRect thumbnailRect = CGRectZero;
    thumbnailRect.origin = thumbnailPoint;
    thumbnailRect.size.width = scaledWidth;
    thumbnailRect.size.height = scaledHeight;
    [sourceImage drawInRect:thumbnailRect];
    newImage = UIGraphicsGetImageFromCurrentImageContext();
    if (newImage == nil)
    {
        NSLog(@"could not scale image");
    }
    //pop the context to get back to the default
    UIGraphicsEndImageContext();
    return newImage;
}
An older post contains code for a method to resize your UIImage. The relevant portion is as follows:
+ (UIImage*)imageWithImage:(UIImage*)image
              scaledToSize:(CGSize)newSize;
{
    UIGraphicsBeginImageContext( newSize );
    [image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
    UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
As far as cropping goes, I believe that if you alter the method to use a different size for the scaling than for the context, your resulting image should be clipped to the bounds of the context.
+ (UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)targetSize {
    //If scaleFactor is not touched, no scaling will occur
    CGFloat scaleFactor = 1.0;
    //Deciding which factor to use to scale the image (factor = targetSize / imageSize)
    if (image.size.width > targetSize.width || image.size.height > targetSize.height)
        if (!((scaleFactor = (targetSize.width / image.size.width)) > (targetSize.height / image.size.height))) //scale to fit width, or
            scaleFactor = targetSize.height / image.size.height; // scale to fit height.
    UIGraphicsBeginImageContext(targetSize);
    //Creating the rect where the scaled image is drawn in
    CGRect rect = CGRectMake((targetSize.width - image.size.width * scaleFactor) / 2,
                             (targetSize.height - image.size.height * scaleFactor) / 2,
                             image.size.width * scaleFactor, image.size.height * scaleFactor);
    //Draw the image into the rect
    [image drawInRect:rect];
    //Saving the image, ending image context
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
I propose this one. Isn't she a beauty? ;)
There's a great piece of code related to resizing images plus several other operations. I came across this one when trying to figure out how to resize images...
http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
This is a version of Jane Sales' answer in Swift. Cheers!
public func resizeImage(image: UIImage, size: CGSize) -> UIImage? {
    var returnImage: UIImage?
    var scaleFactor: CGFloat = 1.0
    var scaledWidth = size.width
    var scaledHeight = size.height
    var thumbnailPoint = CGPointMake(0, 0)
    if !CGSizeEqualToSize(image.size, size) {
        let widthFactor = size.width / image.size.width
        let heightFactor = size.height / image.size.height
        if widthFactor > heightFactor {
            scaleFactor = widthFactor
        } else {
            scaleFactor = heightFactor
        }
        scaledWidth = image.size.width * scaleFactor
        scaledHeight = image.size.height * scaleFactor
        if widthFactor > heightFactor {
            thumbnailPoint.y = (size.height - scaledHeight) * 0.5
        } else if widthFactor < heightFactor {
            thumbnailPoint.x = (size.width - scaledWidth) * 0.5
        }
    }
    UIGraphicsBeginImageContextWithOptions(size, true, 0)
    var thumbnailRect = CGRectZero
    thumbnailRect.origin = thumbnailPoint
    thumbnailRect.size.width = scaledWidth
    thumbnailRect.size.height = scaledHeight
    image.drawInRect(thumbnailRect)
    returnImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return returnImage
}
Here you go. This one is perfect ;-)
EDIT: see below comment - "Does not work with certain images, fails with: CGContextSetInterpolationQuality: invalid context 0x0 error"
// Resizes the image according to the given content mode, taking into account the image's orientation
- (UIImage *)resizedImageWithContentMode:(UIViewContentMode)contentMode imageToScale:(UIImage *)imageToScale bounds:(CGSize)bounds interpolationQuality:(CGInterpolationQuality)quality {
    //Get the size we want to scale it to
    CGFloat horizontalRatio = bounds.width / imageToScale.size.width;
    CGFloat verticalRatio = bounds.height / imageToScale.size.height;
    CGFloat ratio;
    switch (contentMode) {
        case UIViewContentModeScaleAspectFill:
            ratio = MAX(horizontalRatio, verticalRatio);
            break;
        case UIViewContentModeScaleAspectFit:
            ratio = MIN(horizontalRatio, verticalRatio);
            break;
        default:
            [NSException raise:NSInvalidArgumentException format:@"Unsupported content mode: %d", contentMode];
    }
    //...and here it is
    CGSize newSize = CGSizeMake(imageToScale.size.width * ratio, imageToScale.size.height * ratio);
    //start scaling it
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGImageRef imageRef = imageToScale.CGImage;
    CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                                newRect.size.width,
                                                newRect.size.height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                0,
                                                CGImageGetColorSpace(imageRef),
                                                CGImageGetBitmapInfo(imageRef));
    CGContextSetInterpolationQuality(bitmap, quality);
    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, newRect, imageRef);
    // Get the resized image from the context and a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
    // Clean up
    CGContextRelease(bitmap);
    CGImageRelease(newImageRef);
    return newImage;
}
I found that the Swift 3 version posted by Evgenii Kanvets does not uniformly scale the image.
Here is my Swift 4 version of the function that does not squish the image:
static func resizedCroppedImage(image: UIImage, newSize: CGSize) -> UIImage? {
    // This function returns a newImage, based on image
    // - image is scaled uniformly to fit into a rect of size newSize
    // - if the newSize rect is of a different aspect ratio from the source image,
    //   the new image is cropped to be in the center of the source image
    //   (the excess source image is removed)
    var ratio: CGFloat = 0
    var delta: CGFloat = 0
    var drawRect = CGRect()
    if newSize.width > newSize.height {
        ratio = newSize.width / image.size.width
        delta = (ratio * image.size.height) - newSize.height
        drawRect = CGRect(x: 0, y: -delta / 2, width: newSize.width, height: newSize.height + delta)
    } else {
        ratio = newSize.height / image.size.height
        delta = (ratio * image.size.width) - newSize.width
        drawRect = CGRect(x: -delta / 2, y: 0, width: newSize.width + delta, height: newSize.height)
    }
    UIGraphicsBeginImageContextWithOptions(newSize, true, 0.0)
    image.draw(in: drawRect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}
I modified Brad Larson's code. It will aspect-fill the image in the given rect.
- (UIImage *)scaleAndCropToSize:(CGSize)newSize;
{
    float ratio = self.size.width / self.size.height;
    UIGraphicsBeginImageContext(newSize);
    if (ratio > 1) {
        CGFloat newWidth = ratio * newSize.width;
        CGFloat newHeight = newSize.height;
        CGFloat leftMargin = (newWidth - newHeight) / 2;
        [self drawInRect:CGRectMake(-leftMargin, 0, newWidth, newHeight)];
    }
    else {
        CGFloat newWidth = newSize.width;
        CGFloat newHeight = newSize.height / ratio;
        CGFloat topMargin = (newHeight - newWidth) / 2;
        [self drawInRect:CGRectMake(0, -topMargin, newSize.width, newSize.height / ratio)];
    }
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0.0,0.0,ScreenWidth,ScreenHeigth)];
[scrollView setBackgroundColor:[UIColor blackColor]];
[scrollView setDelegate:self];
[scrollView setShowsHorizontalScrollIndicator:NO];
[scrollView setShowsVerticalScrollIndicator:NO];
[scrollView setMaximumZoomScale:2.0];
image=[image scaleToSize:CGSizeMake(ScreenWidth, ScreenHeigth)];
imageView = [[UIImageView alloc] initWithImage:image];
UIImageView* imageViewBk = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"background.png"]];
[self.view addSubview:imageViewBk];
CGRect rect;
rect.origin.x=0;
rect.origin.y=0;
rect.size.width = image.size.width;
rect.size.height = image.size.height;
[imageView setFrame:rect];
[scrollView setContentSize:[imageView frame].size];
[scrollView setMinimumZoomScale:[scrollView frame].size.width / [imageView frame].size.width];
[scrollView setZoomScale:[scrollView minimumZoomScale]];
[scrollView addSubview:imageView];
[[self view] addSubview:scrollView];
Then you can take a cropped snapshot of your image like this:
float zoomScale = 1.0 / [scrollView zoomScale];
CGRect rect;
rect.origin.x = [scrollView contentOffset].x * zoomScale;
rect.origin.y = [scrollView contentOffset].y * zoomScale;
rect.size.width = [scrollView bounds].size.width * zoomScale;
rect.size.height = [scrollView bounds].size.height * zoomScale;
CGImageRef cr = CGImageCreateWithImageInRect([[imageView image] CGImage], rect);
UIImage *cropped = [UIImage imageWithCGImage:cr];
CGImageRelease(cr);
The Xamarin.iOS version of the accepted answer on how to resize and then crop a UIImage (aspect fill) is below:
public static UIImage ScaleAndCropImage(UIImage sourceImage, SizeF targetSize)
{
    var imageSize = sourceImage.Size;
    UIImage newImage = null;
    var width = imageSize.Width;
    var height = imageSize.Height;
    var targetWidth = targetSize.Width;
    var targetHeight = targetSize.Height;
    var scaleFactor = 0.0f;
    var scaledWidth = targetWidth;
    var scaledHeight = targetHeight;
    var thumbnailPoint = PointF.Empty;
    if (imageSize != targetSize)
    {
        var widthFactor = targetWidth / width;
        var heightFactor = targetHeight / height;
        if (widthFactor > heightFactor)
        {
            scaleFactor = widthFactor; // scale to fit height
        }
        else
        {
            scaleFactor = heightFactor; // scale to fit width
        }
        scaledWidth = width * scaleFactor;
        scaledHeight = height * scaleFactor;
        // center the image
        if (widthFactor > heightFactor)
        {
            thumbnailPoint.Y = (targetHeight - scaledHeight) * 0.5f;
        }
        else
        {
            if (widthFactor < heightFactor)
            {
                thumbnailPoint.X = (targetWidth - scaledWidth) * 0.5f;
            }
        }
    }
    UIGraphics.BeginImageContextWithOptions(targetSize, false, 0.0f);
    var thumbnailRect = new RectangleF(thumbnailPoint, new SizeF(scaledWidth, scaledHeight));
    sourceImage.Draw(thumbnailRect);
    newImage = UIGraphics.GetImageFromCurrentImageContext();
    if (newImage == null)
    {
        Console.WriteLine("could not scale image");
    }
    //pop the context to get back to the default
    UIGraphics.EndImageContext();
    return newImage;
}
I converted Sam Wirch's guide to Swift and it worked well for me, although there's some very slight "squishing" in the final image that I couldn't resolve.
func resizedCroppedImage(image: UIImage, newSize: CGSize) -> UIImage {
    var ratio: CGFloat = 0
    var delta: CGFloat = 0
    var offset = CGPointZero
    if image.size.width > image.size.height {
        ratio = newSize.width / image.size.width
        delta = (ratio * image.size.width) - (ratio * image.size.height)
        offset = CGPointMake(delta / 2, 0)
    } else {
        ratio = newSize.width / image.size.height
        delta = (ratio * image.size.height) - (ratio * image.size.width)
        offset = CGPointMake(0, delta / 2)
    }
    let clipRect = CGRectMake(-offset.x, -offset.y, (ratio * image.size.width) + delta, (ratio * image.size.height) + delta)
    UIGraphicsBeginImageContextWithOptions(newSize, true, 0.0)
    UIRectClip(clipRect)
    image.drawInRect(clipRect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}
If anyone wants the Objective-C version, it's on his website.
Here is a Swift 3 version of Sam Wirch's Swift guide, as posted by William T.:
extension UIImage {
    static func resizedCroppedImage(image: UIImage, newSize: CGSize) -> UIImage? {
        var ratio: CGFloat = 0
        var delta: CGFloat = 0
        var offset = CGPoint.zero
        if image.size.width > image.size.height {
            ratio = newSize.width / image.size.width
            delta = (ratio * image.size.width) - (ratio * image.size.height)
            offset = CGPoint(x: delta / 2, y: 0)
        } else {
            ratio = newSize.width / image.size.height
            delta = (ratio * image.size.height) - (ratio * image.size.width)
            offset = CGPoint(x: 0, y: delta / 2)
        }
        let clipRect = CGRect(x: -offset.x, y: -offset.y, width: (ratio * image.size.width) + delta, height: (ratio * image.size.height) + delta)
        UIGraphicsBeginImageContextWithOptions(newSize, true, 0.0)
        UIRectClip(clipRect)
        image.draw(in: clipRect)
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage
    }
}
The following simple code worked for me.
[imageView setContentMode:UIViewContentModeScaleAspectFill];
[imageView setClipsToBounds:YES];
- (UIImage *)imageScale:(CGFloat)scaleFactor cropForSize:(CGSize)targetSize
{
    targetSize = !targetSize.width ? self.size : targetSize;
    UIGraphicsBeginImageContext(targetSize); // this will crop
    CGRect thumbnailRect = CGRectZero;
    thumbnailRect.size.width = targetSize.width * scaleFactor;
    thumbnailRect.size.height = targetSize.height * scaleFactor;
    CGFloat xOffset = (targetSize.width - thumbnailRect.size.width) / 2;
    CGFloat yOffset = (targetSize.height - thumbnailRect.size.height) / 2;
    thumbnailRect.origin = CGPointMake(xOffset, yOffset);
    [self drawInRect:thumbnailRect];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    if (newImage == nil)
    {
        NSLog(@"could not scale image");
    }
    UIGraphicsEndImageContext();
    return newImage;
}
Below is an example of the result: the left image is the original, and the right image is scaled by x2.
If you want to scale the image but retain its frame (proportions), call the method this way:
[yourImage imageScale:2.0f cropForSize:CGSizeZero];
This question seems to have been put to rest, but in my quest for a solution that I could more easily understand (and written in Swift), I arrived at this (also posted to: How to crop the UIImage?)
I wanted to be able to crop from a region based on an aspect ratio, and scale to a size based on an outer bounding extent. Here is my variation:
import AVFoundation
import ImageIO

class Image {
    class func crop(image: UIImage, crop source: CGRect, aspect: CGSize, outputExtent: CGSize) -> UIImage {
        let sourceRect = AVMakeRectWithAspectRatioInsideRect(aspect, source)
        let targetRect = AVMakeRectWithAspectRatioInsideRect(aspect, CGRect(origin: CGPointZero, size: outputExtent))
        let opaque = true, deviceScale: CGFloat = 0.0 // use scale of device's main screen
        UIGraphicsBeginImageContextWithOptions(targetRect.size, opaque, deviceScale)
        let scale = max(
            targetRect.size.width / sourceRect.size.width,
            targetRect.size.height / sourceRect.size.height)
        // Note: this relies on custom * and prefix - operators for scaling CGPoint/CGSize (not shown here).
        let drawRect = CGRect(origin: -sourceRect.origin * scale, size: image.size * scale)
        image.drawInRect(drawRect)
        let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return scaledImage
    }
}
There are a couple of things that I found confusing: the separate concerns of cropping and resizing. Cropping is handled with the origin of the rect that you pass to drawInRect, and scaling is handled by the size portion. In my case, I needed to relate the size of the cropping rect on the source to my output rect of the same aspect ratio. The scale factor is then output / input, and this needs to be applied to the drawRect (passed to drawInRect).
One caveat is that this approach effectively assumes that the image you are drawing is larger than the image context. I have not tested this, but I think you can use this code to handle cropping and zooming by explicitly setting the scale parameter to the aforementioned scale factor. By default, UIKit applies a multiplier based on the screen resolution.
Finally, it should be noted that this UIKit approach is higher level than CoreGraphics / Quartz and Core Image approaches, and seems to handle image orientation issues. It is also worth mentioning that it is pretty fast, second to ImageIO, according to this post here: http://nshipster.com/image-resizing/
Swift version:
static func imageWithImage(image: UIImage, newSize: CGSize) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(newSize, true, UIScreen.mainScreen().scale)
    image.drawInRect(CGRectMake(0, 0, newSize.width, newSize.height))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}