Resize image to fit proportions - iPhone

In the design of my app, I have a square space for an image that comes from a remote server. However, occasionally the image is a landscape rectangle instead of a square.
I don't want to crop the image, but instead scale it down far enough to fit inside the square constraints, then fill in the remaining space with some background color, white maybe.

Set the contentMode of your UIImageView to UIViewContentModeScaleAspectFit.
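A minimal sketch (assuming `imageView` is the square image view from your layout and `serverImage` is the downloaded image):
// The image is scaled down to fit inside the square without cropping;
// the white background shows through in the leftover space.
imageView.contentMode = UIViewContentModeScaleAspectFit;
imageView.backgroundColor = [UIColor whiteColor];
imageView.image = serverImage;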

I don't know if this helps or not, but I use this to fit the size of the view to match the image. It takes the provided rect and trims it to match the image; the returned CGRect can then be applied to the view. I used this so I could add a shadow to the image (which looks wrong if the view doesn't match the image perfectly).
- (CGRect)resizeCGRect:(CGRect)rect toImage:(UIImage *)image
{
    CGSize size = rect.size;
    CGSize iSize = image.size;
    if (iSize.width > iSize.height) {
        if (iSize.width / iSize.height > size.width / size.height)
            size.height = size.width * (iSize.height / iSize.width);
        else
            size.width = size.height * (iSize.width / iSize.height);
    } else {
        if (iSize.height / iSize.width > size.height / size.width)
            size.width = size.height * (iSize.width / iSize.height);
        else
            size.height = size.width * (iSize.height / iSize.width);
    }
    rect.size = size;
    return rect;
}
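A rough usage sketch (assuming the method lives on the class doing the layout, and `remoteImage` is the image you downloaded):
// Shrink the image view so it hugs the aspect-fitted image exactly,
// e.g. so a shadow on the view lines up with the visible picture.
imageView.frame = [self resizeCGRect:imageView.frame toImage:remoteImage];
imageView.image = remoteImage;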

Create a UIImageView with the background color you want. Create the image from your server data, set it on the image view, and then set contentMode = UIViewContentModeScaleAspectFit:
UIImageView *backgroundColorWhite = [[UIImageView alloc] initWithFrame:someObject.frame];
backgroundColorWhite.backgroundColor = [UIColor whiteColor];
UIImage *serverImage = [UIImage imageWithData:serverData];
backgroundColorWhite.contentMode = UIViewContentModeScaleAspectFit;
[backgroundColorWhite setImage:serverImage];

Related

Prevent stretching image from UIImage in iPhone SDK

Code snippet
self.newsImage = [[UIImageView alloc] initWithFrame:CGRectMake(0,0,300,130)];
//set placeholder image or cell won't update when image is loaded
self.newsImage.image = [UIImage imageNamed:@"newsDetail.png"];
//load the image
self.newsImage.imageURL = [NSURL URLWithString:imageBig];
[imageBack addSubview:self.newsImage];
I have an image that is 40*40, but the image view is 300*130. How can I avoid stretching the image? I want it centered in the UIImageView.
Thanks in advance!
Just center the content:
self.newsImage.contentMode = UIViewContentModeCenter;
You have to pass a CGSize with the width and height you want; the image is scaled to fill that size without stretching and ends up centered:
- (UIImage *)imageWithImage:(UIImage *)image scaledToFillSize:(CGSize)size
{
    CGFloat scale = MAX(size.width / image.size.width, size.height / image.size.height);
    CGFloat width = image.size.width * scale;
    CGFloat height = image.size.height * scale;
    CGRect imageRect = CGRectMake((size.width - width) / 2.0f,
                                  (size.height - height) / 2.0f,
                                  width,
                                  height);
    UIGraphicsBeginImageContextWithOptions(size, NO, 0);
    [image drawInRect:imageRect];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
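For example, one way to apply it to the news image above is to render the picture at the image view's own 300*130 size, so nothing gets stretched when it is displayed (a sketch; it assumes the method lives on the same class):
UIImage *fittedImage = [self imageWithImage:self.newsImage.image scaledToFillSize:CGSizeMake(300, 130)];
self.newsImage.contentMode = UIViewContentModeCenter; // draw the pre-sized image 1:1, centered
self.newsImage.image = fittedImage;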

Get width of a resized image after UIViewContentModeScaleAspectFit

I have my UIImageView and I put an image into it that I resize like this:
UIImageView *attachmentImageNew = [[UIImageView alloc] initWithFrame:CGRectMake(5.5, 6.5, 245, 134)];
attachmentImageNew.image = actualImage;
attachmentImageNew.backgroundColor = [UIColor redColor];
attachmentImageNew.contentMode = UIViewContentModeScaleAspectFit;
I tried getting the width of the resized picture in my UIImageView by doing this:
NSLog(@"Size of pic is %f", attachmentImageNew.image.size.width);
But it actually returns me the width of the original picture. Any ideas on how do I get the frame of the picture that I see on screen?
EDIT: Here's how my UIImageView looks, red area is its backgroundColor
I don't know if there is a cleaner solution, but this works:
float widthRatio = imageView.bounds.size.width / imageView.image.size.width;
float heightRatio = imageView.bounds.size.height / imageView.image.size.height;
float scale = MIN(widthRatio, heightRatio);
float imageWidth = scale * imageView.image.size.width;
float imageHeight = scale * imageView.image.size.height;
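If you also need the on-screen frame rather than just the size, the same numbers can be centered inside the image view's bounds; a sketch continuing from the snippet above:
// The rect the picture actually occupies on screen, centered in the view.
CGRect fittedFrame = CGRectMake((imageView.bounds.size.width - imageWidth) / 2.0f,
                                (imageView.bounds.size.height - imageHeight) / 2.0f,
                                imageWidth,
                                imageHeight);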
A solution in Swift:
let currentHeight = imageView.bounds.size.height
let currentWidth = imageView.bounds.size.width
let newWidth = UIScreen.mainScreen().bounds.width
let newHeight = (newWidth * currentHeight) / currentWidth
println(newHeight)
The image property holds a reference to your original image, so it always gives you the original image's dimensions.
To get the dimensions of the displayed image you have to take the aspect ratio into account. I derived a formula for my needs by checking in Preview, with images of different sizes, how an image is resized according to its aspect ratio.
According to Apple's documentation for UIViewContentModeScaleAspectFit:
It scales the content (in your case actualImage) to fit the size of the view (in your case attachmentImageNew) by maintaining the aspect ratio.
That means the scaled image fits inside your UIImageView: it matches the view in one dimension, while the other may be smaller, depending on the aspect ratios.
UPDATE:
A further suggestion: if you don't want to use UIViewContentModeScaleAspectFit and you can fix the size of the scaled image, you can use the code below to scale the image to a fixed newSize and then use its width in your code.
CGSize newSize = CGSizeMake(100.0,50.0);
UIGraphicsBeginImageContext( newSize );
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
In your case:
float yourScaledImageWidth = attachmentImageNew.frame.size.height * attachmentImageNew.image.size.width / attachmentImageNew.image.size.height;
NSLog(@"width of resized pic is %f", yourScaledImageWidth);
But you should first compare the original image's proportions with the image view's: this line is only correct when the red area appears on the sides (horizontally), not when the original image is proportionally wider than the image view's frame.
For Swift 2, you can use this code snippet to align an aspect-fitted image to the bottom of the screen (sampled from earlier answers to this question; thanks to you, guys):
let screenSize: CGRect = UIScreen.mainScreen().bounds
let screenWidth = CGFloat(screenSize.width)
let screenHeight = CGFloat(screenSize.height)
imageView.frame = CGRectMake(0, screenHeight - (imageView.image?.size.height)! , screenWidth, (imageView.image?.size.height)!)
let newSize:CGSize = getScaledSizeOfImage(imageView.image!, toSize: self.view.frame.size)
imageView.frame = CGRectMake(0, screenHeight - (newSize.height) , screenWidth, newSize.height)
func getScaledSizeOfImage(image: UIImage, toSize: CGSize) -> CGSize
{
    let widthRatio = toSize.width / image.size.width
    let heightRatio = toSize.height / image.size.height
    let scale = min(widthRatio, heightRatio)
    let imageWidth = scale * image.size.width
    let imageHeight = scale * image.size.height
    return CGSizeMake(imageWidth, imageHeight)
}

iOS Resizing images for thumbs?

I am kind of new to handling images; there are many things I do not know, so bear with me. Basically I take a picture with the camera and present it inside a UIImageView, which is a small 60:80 view. The image is automatically resized to fit the UIImageView, and everything looks fine.
My question is: do I need to do any further image processing (in order to maximize efficiency), or is that all?
Please use the following code, which will give you better thumbnails:
- (UIImage *)generatePhotoThumbnail:(UIImage *)image
{
    CGSize size = image.size;
    CGSize croppedSize;
    CGFloat ratio = 120.0;
    CGFloat offsetX = 0.0;
    CGFloat offsetY = 0.0;
    // Crop to a centered square first.
    if (size.width > size.height) {
        offsetX = (size.height - size.width) / 2;
        croppedSize = CGSizeMake(size.height, size.height);
    } else {
        offsetY = (size.width - size.height) / 2;
        croppedSize = CGSizeMake(size.width, size.width);
    }
    CGRect clippedRect = CGRectMake(offsetX * -1, offsetY * -1, croppedSize.width, croppedSize.height);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], clippedRect);
    // Then scale the square down to the 120x120 thumbnail size.
    CGRect rect = CGRectMake(0.0, 0.0, ratio, ratio);
    UIGraphicsBeginImageContext(rect.size);
    [[UIImage imageWithCGImage:imageRef] drawInRect:rect];
    UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGImageRelease(imageRef);
    return thumbnail;
}
A resize image function would be:
- (UIImage *)resizeImage:(UIImage *)image toSize:(CGSize)imgSize
{
    UIGraphicsBeginImageContext(imgSize);
    [image drawInRect:CGRectMake(0, 0, imgSize.width, imgSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Save the resized images in the Documents directory under unique names, and later load the thumbnails back from there.
If there are many images, use lazy loading to display them.
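A minimal sketch of saving and reloading a thumbnail that way (the PNG format, the file-name scheme, and the thumbnail variable from the method above are assumptions; adjust to your app):
// Write the thumbnail to the Documents directory under a unique name...
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *fileName = [NSString stringWithFormat:@"thumb-%@.png", [[NSUUID UUID] UUIDString]];
NSString *path = [documentsDir stringByAppendingPathComponent:fileName];
[UIImagePNGRepresentation(thumbnail) writeToFile:path atomically:YES];
// ...and read it back later, e.g. when a cell needs it.
UIImage *cachedThumbnail = [UIImage imageWithContentsOfFile:path];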
If you are happy with the way the picture is displayed, then you don't need any additional code. Image views have no problems rescaling images for you and they are quite efficient at that, so don't worry about it.

How to make a border image fit an aspect-fit UIImageView's size?

In my application I have two UIImageViews. One contains the image that the user selects, either from the photo library or by taking a picture with the camera; its contentMode is aspect fit, so the image is not stretched if it is smaller. The other image view contains an image that acts as a border for the first one. When the user selects an image of a similar size to the image view, the border sits in the right place, but when the image is smaller the border does not line up: there is a gap between the border image and the selected photo. I don't want that gap. How can I fix this?
The issue got solved by using the following code:
- (CGRect)frameForImage:(UIImage *)image inImageViewAspectFit:(UIImageView *)imageView
{
    float imageRatio = image.size.width / image.size.height;
    float viewRatio = imageView.frame.size.width / imageView.frame.size.height;
    if (imageRatio < viewRatio)
    {
        float scale = imageView.frame.size.height / image.size.height;
        float width = scale * image.size.width;
        float topLeftX = (imageView.frame.size.width - width) * 0.5;
        return CGRectMake(topLeftX, 0, width, imageView.frame.size.height);
    }
    else
    {
        float scale = imageView.frame.size.width / image.size.width;
        float height = scale * image.size.height;
        float topLeftY = (imageView.frame.size.height - height) * 0.5;
        return CGRectMake(0, topLeftY, imageView.frame.size.width, height);
    }
}
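For example, the border view can then be laid directly over the visible photo (the view names are illustrative; add an inset if the border has thickness):
// Match the border view to the rect the aspect-fitted photo actually occupies.
CGRect photoFrame = [self frameForImage:photoImageView.image inImageViewAspectFit:photoImageView];
borderImageView.frame = photoFrame;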

How to clip image in scale in iPhone?

I have a large image (2048*2048 px) that is shown as 320*320 on the iPhone screen. I want to do this:
In my app, the user can open a large image (e.g. 2048*2048), which is shown as 320*320 on the iPhone screen. There is a rectangle over the image that the user can move anywhere within the on-screen image, e.g. rectangle (100, 100, 300, 200). I then want to clip the corresponding area of the original, full-sized image, in scale.
I tried many ways,
UIImageView *originalImageView = [[UIImageView alloc] initWithImage:originalImage];
CGRect rect = CGRectMake(100, 100, 300, 200);
UIImage *cropImage = [UIImage imageWithCGImage:CGImageCreateWithImageInRect([originalImageView.image CGImage], rect)];
But cropImage is just a 300*200 image; it isn't scaled properly.
How about doing this? It will preserve the original image quality:
CGSize bounds = CGSizeMake(320, 320); // the image is shown at 320*320 on screen
CGRect rect = CGRectMake(100, 100, 220, 200); // rectangle area to be cropped, in screen coordinates
float widthFactor = rect.size.width * (originalImage.size.width / bounds.width);
float heightFactor = rect.size.height * (originalImage.size.height / bounds.height);
float factorX = rect.origin.x * (originalImage.size.width / bounds.width);
float factorY = rect.origin.y * (originalImage.size.height / bounds.height);
CGRect factoredRect = CGRectMake(factorX, factorY, widthFactor, heightFactor);
CGImageRef croppedRef = CGImageCreateWithImageInRect([originalImage CGImage], factoredRect);
UIImage *cropImage = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);
And most importantly, if you want to crop the image that the imagePickerController returns, that can be done with the built-in editing support:
imagePickerController.allowsEditing = YES;
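With that flag set, the cropped image is handed to your delegate; a rough sketch (assuming your controller already adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate):
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // With allowsEditing = YES, the user's crop comes back under this key.
    UIImage *croppedImage = info[UIImagePickerControllerEditedImage];
    if (!croppedImage) {
        // Fall back to the untouched photo if no editing happened.
        croppedImage = info[UIImagePickerControllerOriginalImage];
    }
    // Use croppedImage here (e.g. assign it to your image view).
    [picker dismissViewControllerAnimated:YES completion:nil];
}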
First, resize the image to 320*320 using this method:
+ (UIImage *)resizeImage:(UIImage *)image width:(float)width height:(float)height
{
    CGSize newSize;
    newSize.width = width;
    newSize.height = height;
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Now set the resized image on the image view:
UIImage *resizeImage = [YourControllerName resizeImage:originalImage width:320 height:320];
UIImageView *originalImageView = [[UIImageView alloc] initWithImage:resizeImage];
You can now crop:
CGRect rect = CGRectMake(100, 100, 300, 200);
CGImageRef croppedRef = CGImageCreateWithImageInRect([originalImageView.image CGImage], rect);
UIImage *cropImage = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);
Why not calculate the scale factor (e.g. originalImageWidth / smallImageWidth)?
If the rectangle is (100, 100, 300, 200) in your small image, you should clip your large image at (100*factor, 100*factor, 300*factor, 200*factor).
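A minimal sketch of that calculation (variable names are illustrative, and it assumes the 2048*2048 image is displayed at 320*320 as in the question):
// Map the selection rectangle from screen coordinates back into the full-size image.
CGFloat factor = originalImage.size.width / 320.0f; // e.g. 2048 / 320 = 6.4
CGRect scaledRect = CGRectMake(100 * factor, 100 * factor, 300 * factor, 200 * factor);
CGImageRef clippedRef = CGImageCreateWithImageInRect(originalImage.CGImage, scaledRect);
UIImage *clippedImage = [UIImage imageWithCGImage:clippedRef];
CGImageRelease(clippedRef);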