I am working on an application where I have to load images from a server.
I am trying to load application screenshots from an App Store link.
I am getting the image, but it is not sharp and clear. I fetch the image in the background and everything works fine, but the resulting image looks a little blurry. I am testing this on a retina display. Does anyone have an idea why this is happening? Any solution would be helpful.
Thanks.
Here is my code for image loading:
// This will create the imageview with required frame & use the url to load the image
-(void)loadAppsScreenShots:(int)i Frame:(CGRect)frame withImageUrl:(NSString *)urlStr
{
UIImageView *appImageView = [[UIImageView alloc] init];
frame.origin.x = 0;
appImageView.frame = frame;
appImageView.tag = i;
sharedImageCache = [ImageCache sharedImageCacheInstance];
UIImage *image1 = [sharedImageCache getCachedImage:[NSString stringWithFormat:@"%@",urlStr]];
if (image1==nil)
{
// Show indicator till image loads
UIActivityIndicatorView *indiView = [[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleWhite];
indiView.center = CGPointMake(appImageView.frame.size.width/2, appImageView.frame.size.height/2);
[appImageView addSubview:indiView];
[indiView startAnimating];
indiView.hidden = FALSE;
// Show label indicating image loading process
UILabel *loadingLbl = [[UILabel alloc] initWithFrame:CGRectMake(0, 0, 200, 25)];
loadingLbl.text = @"";//@"Please wait...";
loadingLbl.center = CGPointMake(appImageView.frame.size.width/2 + 5, appImageView.frame.size.height/2 + 23);
loadingLbl.font = [UIFont fontWithName:@"Helvetica-Bold" size:15.0f];
loadingLbl.textAlignment = UITextAlignmentCenter;
loadingLbl.backgroundColor = [UIColor clearColor];
loadingLbl.textColor = [UIColor whiteColor];
[appImageView addSubview:loadingLbl];
[appImageView sendSubviewToBack:loadingLbl];
loadingLbl.hidden = FALSE;
// Dictionary to collect all objects & pass them to the method where we load the data
NSMutableDictionary *dict = [[NSMutableDictionary alloc] init];
[dict setObject:appImageView forKey:@"imageView"];
if (urlStr != nil) {
[dict setObject:urlStr forKey:@"url"];
}
[dict setObject:indiView forKey:@"indi"];
[dict setObject:loadingLbl forKey:@"loadingLbl"];
[self performSelectorInBackground:@selector(loadImageFromURLAndSaveInDocDir:) withObject:dict];
}
else
{
appImageView.image = image1;
}
[[appView viewWithTag:i] addSubview:appImageView];
[appView bringSubviewToFront:appImageView];
appImageView.contentMode = UIViewContentModeScaleAspectFit;
appImageView=nil;
}
-(void)loadImageFromURLAndSaveInDocDir:(NSMutableDictionary *)dict
{
@autoreleasepool
{
UIImageView *cellImageViewObj = [dict objectForKey:@"imageView"];
NSString *url;
UIActivityIndicatorView *indiview = [dict objectForKey:@"indi"];
UILabel *Lbl = [dict objectForKey:@"loadingLbl"];
if ([dict objectForKey:@"url"])
{
url = [dict objectForKey:@"url"];
// fetch the data
NSURL *imgURL = [NSURL URLWithString:url];
NSData *imgData = [NSData dataWithContentsOfURL:imgURL];
NSString *filename = [Utils getFileNameFromURL:url];
// Cache the image
[sharedImageCache cacheImage:[NSString stringWithFormat:@"%@",filename] :imgData];
UIImage *image1 = [[UIImage alloc] initWithData:imgData];
cellImageViewObj.image = image1;
image1=nil;
}
else {
url = @"";
}
// set the content mode & hide the indicator & label
cellImageViewObj.contentMode = UIViewContentModeScaleAspectFit;
[indiview stopAnimating];
indiview.hidden = TRUE;
Lbl.hidden = TRUE;
dict = nil;
}
}
What am I doing wrong?
The problem is that you are showing the image at its natural size. On a retina device, you need images which are twice as wide and twice as tall as the view they get drawn into.
Say the image is 200x200 and you are going to show it in a 100x100 view. The proper way to do this is:
get a CGImageRef of the data
create a UIImage using the method below and a scale of 2 (for retina)
+ (UIImage *)imageWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation
The result is an image sized 100x100 points but with a scale of two.
That said, since you specify UIViewContentModeScaleAspectFit, you may be able to just take the 200x200 image and hand it off to the UIImageView, but in this case you must force the imageView to have a frame.size of 100x100.
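As a rough sketch of that approach (this assumes the imgData and appImageView names from the question and a 100x100-point target view):
UIImage *rawImage = [UIImage imageWithData:imgData];
// Wrap the same pixels in a UIImage that reports half the point size (scale 2 for retina),
// so a 200x200-pixel screenshot renders crisply into a 100x100-point view
UIImage *retinaImage = [UIImage imageWithCGImage:rawImage.CGImage
                                           scale:2.0
                                     orientation:UIImageOrientationUp];
appImageView.image = retinaImage;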
If the image size is different from the image view's size, you can proportionally scale the server image to fit your view.
- (UIImage *)imageByScalingProportionallyToSize:(CGSize)targetSize;
- (UIImage *)imageByScalingProportionallyToSize:(CGSize)targetSize {
UIImage *sourceImage = self;
UIImage *newImage = nil;
CGSize imageSize = sourceImage.size;
CGFloat width = imageSize.width;
CGFloat height = imageSize.height;
CGFloat targetWidth = targetSize.width;
CGFloat targetHeight = targetSize.height;
CGFloat scaleFactor = 0.0;
CGFloat scaledWidth = targetWidth;
CGFloat scaledHeight = targetHeight;
CGPoint thumbnailPoint = CGPointMake(0.0,0.0);
if (CGSizeEqualToSize(imageSize, targetSize) == NO) {
CGFloat widthFactor = targetWidth / width;
CGFloat heightFactor = targetHeight / height;
if (widthFactor < heightFactor)
scaleFactor = widthFactor;
else
scaleFactor = heightFactor;
scaledWidth = width * scaleFactor;
scaledHeight = height * scaleFactor;
// center the image
if (widthFactor < heightFactor) {
thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
} else if (widthFactor > heightFactor) {
thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
}
}
// this is actually the interesting part:
UIGraphicsBeginImageContext(targetSize);
CGRect thumbnailRect = CGRectZero;
thumbnailRect.origin = thumbnailPoint;
thumbnailRect.size.width = scaledWidth;
thumbnailRect.size.height = scaledHeight;
[sourceImage drawInRect:thumbnailRect];
newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
if(newImage == nil) NSLog(@"could not scale image");
return newImage ;
}
I have a UIImageView and I have the values of its maximum height and maximum width. What I want to achieve is to take an image (with any aspect ratio and any resolution) and make it fit within those borders (marked red in the picture), so the picture never exceeds them but can shrink as needed.
Right now the image fits the necessary size properly, but I have two concerns:
1. The UIImageView is not equal to the size of the resized image, thus leaving red background (and I don't want that).
2. If the image is smaller than the height of my UIImageView, it is not resized to be smaller; it stays the same height.
Here's my code, and I know it's wrong:
UIImage *actualImage = [attachmentsArray lastObject];
UIImageView *attachmentImageNew = [[UIImageView alloc] initWithFrame:CGRectMake(5.5, 6.5, 245, 134)];
attachmentImageNew.image = actualImage;
attachmentImageNew.backgroundColor = [UIColor redColor];
attachmentImageNew.contentMode = UIViewContentModeScaleAspectFit;
So how do I dynamically change the size not only of the UIImageView.image, but of the whole UIImageView, thus making its size totally adjustable to its content? Any help would be much appreciated, thanks!
Once you get the width and height of the resized image (see "Get width of a resized image after UIViewContentModeScaleAspectFit"), you can resize your imageView:
imageView.frame = CGRectMake(0, 0, resizedWidth, resizedHeight);
imageView.center = imageView.superview.center;
I haven't checked whether it works, but I think it should be OK.
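As a rough sketch of how resizedWidth and resizedHeight could be computed (this is the standard aspect-fit math; actualImage is the image from the question and imageView is the answer's image view):
CGSize imageSize = actualImage.size;
CGSize boundsSize = imageView.bounds.size;
// Aspect fit keeps the whole image visible, so use the smaller scale factor
CGFloat fitScale = MIN(boundsSize.width / imageSize.width,
                       boundsSize.height / imageSize.height);
CGFloat resizedWidth = imageSize.width * fitScale;
CGFloat resizedHeight = imageSize.height * fitScale;
// Shrink the image view itself to the fitted size so no background shows through
imageView.frame = CGRectMake(0, 0, resizedWidth, resizedHeight);
imageView.center = imageView.superview.center;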
- (UIImage *)image:(UIImage*)originalImage scaledToSize:(CGSize)size
{
//avoid redundant drawing
if (CGSizeEqualToSize(originalImage.size, size))
{
return originalImage;
}
//create drawing context
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0f);
//draw
[originalImage drawInRect:CGRectMake(0.0f, 0.0f, size.width, size.height)];
//capture resultant image
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//return image
return image;
}
This is the Swift equivalent of Rajneesh071's answer, using an extension:
extension UIImage {
func scaleToSize(aSize :CGSize) -> UIImage {
if (CGSizeEqualToSize(self.size, aSize)) {
return self
}
UIGraphicsBeginImageContextWithOptions(aSize, false, 0.0)
self.drawInRect(CGRectMake(0.0, 0.0, aSize.width, aSize.height))
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return image
}
}
Usage:
let image = UIImage(named: "Icon")
item.icon = image?.scaleToSize(CGSize(width: 30.0, height: 30.0))
Use the category below, and then apply a border from Quartz to your image:
[yourimage.layer setBorderColor:[[UIColor whiteColor] CGColor]];
[yourimage.layer setBorderWidth:2];
The category:
UIImage+AutoScaleResize.h
#import <Foundation/Foundation.h>
@interface UIImage (AutoScaleResize)
- (UIImage *)imageByScalingAndCroppingForSize:(CGSize)targetSize;
@end
UIImage+AutoScaleResize.m
#import "UIImage+AutoScaleResize.h"
@implementation UIImage (AutoScaleResize)
- (UIImage *)imageByScalingAndCroppingForSize:(CGSize)targetSize
{
UIImage *sourceImage = self;
UIImage *newImage = nil;
CGSize imageSize = sourceImage.size;
CGFloat width = imageSize.width;
CGFloat height = imageSize.height;
CGFloat targetWidth = targetSize.width;
CGFloat targetHeight = targetSize.height;
CGFloat scaleFactor = 0.0;
CGFloat scaledWidth = targetWidth;
CGFloat scaledHeight = targetHeight;
CGPoint thumbnailPoint = CGPointMake(0.0,0.0);
if (CGSizeEqualToSize(imageSize, targetSize) == NO)
{
CGFloat widthFactor = targetWidth / width;
CGFloat heightFactor = targetHeight / height;
if (widthFactor > heightFactor)
{
scaleFactor = widthFactor; // scale to fit height
}
else
{
scaleFactor = heightFactor; // scale to fit width
}
scaledWidth = width * scaleFactor;
scaledHeight = height * scaleFactor;
// center the image
if (widthFactor > heightFactor)
{
thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
}
else
{
if (widthFactor < heightFactor)
{
thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
}
}
}
UIGraphicsBeginImageContext(targetSize); // this will crop
CGRect thumbnailRect = CGRectZero;
thumbnailRect.origin = thumbnailPoint;
thumbnailRect.size.width = scaledWidth;
thumbnailRect.size.height = scaledHeight;
[sourceImage drawInRect:thumbnailRect];
newImage = UIGraphicsGetImageFromCurrentImageContext();
if(newImage == nil)
{
NSLog(@"could not scale image");
}
//pop the context to get back to the default
UIGraphicsEndImageContext();
return newImage;
}
@end
If you have the size of the image, why don't you set the frame.size of the image view to be of this size?
EDIT----
Ok, so seeing your comment I propose this:
UIImageView *imageView;
//so let's say you're image view size is set to the maximum size you want
CGFloat maxWidth = imageView.frame.size.width;
CGFloat maxHeight = imageView.frame.size.height;
CGFloat viewRatio = maxWidth / maxHeight;
CGFloat imageRatio = image.size.height / image.size.width;
if (imageRatio > viewRatio) {
CGFloat imageViewHeight = round(maxWidth * imageRatio);
imageView.frame = CGRectMake(0, ceil((self.bounds.size.height - imageViewHeight) / 2.f), maxWidth, imageViewHeight);
}
else if (imageRatio < viewRatio) {
CGFloat imageViewWidth = roundf(maxHeight / imageRatio);
imageView.frame = CGRectMake(ceil((maxWidth - imageViewWidth) / 2.f), 0, imageViewWidth, maxHeight);
} else {
//your image view is already at the good size
}
This code will resize your image view to its image ratio, and also position the image view to the same centre as your "default" position.
PS: if you're using a CALayer shadow effect, I hope you're setting imageView.layer.shouldRasterize = YES and imageView.layer.rasterizationScale = [UIScreen mainScreen].scale; it will greatly improve the performance of your UI. ;)
I think what you want is a different content mode. Try using UIViewContentModeScaleToFill. This will scale the content to fit the size of your UIImageView, changing the aspect ratio of the content if necessary.
Have a look at the content mode section of the official documentation to get a better idea of the different content modes available (it is illustrated with images).
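Applied to the question's code, that is just one line (attachmentImageNew is the image view from the question; note that ScaleToFill can distort the aspect ratio):
// Stretch the image so it exactly fills the image view's frame
attachmentImageNew.contentMode = UIViewContentModeScaleToFill;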
if([[SDWebImageManager sharedManager] diskImageExistsForURL:[NSURL URLWithString:@"URL STRING1"]])
{
NSString *key = [[SDWebImageManager sharedManager] cacheKeyForURL:[NSURL URLWithString:@"URL STRING1"]];
UIImage *tempImage=[self imageWithImage:[[SDImageCache sharedImageCache] imageFromDiskCacheForKey:key] scaledToWidth:cell.imgview.bounds.size.width];
cell.imgview.image=tempImage;
}
else
{
[cell.imgview sd_setImageWithURL:[NSURL URLWithString:@"URL STRING1"] placeholderImage:nil completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, NSURL *imageURL)
{
UIImage *tempImage=[self imageWithImage:image scaledToWidth:cell.imgview.bounds.size.width];
cell.imgview.image=tempImage;
// [tableView beginUpdates];
// [tableView endUpdates];
}];
}
I have already put a lot of effort in from my side; finally I need help. Thanks.
Goal:
1) How do I fit the imageView inside the ScrollView?
2) How do I crop a zoomed image to what is displayed inside the scrollView?
I have an imageView inside a Scroll View. I want to crop the image after zooming, so the result matches what is displayed inside the scrollView's boundary. I have already cropped the image, but it is not exactly what I want.
Here I set the backgroundColor of my scrollView to black. And when I place the imageView inside it, it does not fit.
After zooming, the image inside the scroll view is
and after cropping, the image is
and my code is here:
- (void)viewDidLoad
{
[super viewDidLoad];
CGRect frame1 = CGRectMake(50,50,200,200);
CGRect frame2 = CGRectMake(50,50,200,200);
imageView1=[[UIImageView alloc]initWithFrame:frame1];
imageView1.image= [UIImage imageNamed:@"back.jpeg"];
imageView1.backgroundColor=[UIColor brownColor];
imageView1.contentMode = UIViewContentModeScaleAspectFit;
scroll=[[UIScrollView alloc]initWithFrame:frame2];
scroll.backgroundColor=[UIColor blackColor];
[scroll addSubview:imageView1];
scroll.delegate=self;
[self.view addSubview:scroll];
[self setContentSizeForScrollView];
self.imageView1.userInteractionEnabled = YES;
}
Setting the scroll view's contentSize:
-(void)setContentSizeForScrollView
{
// scroll.contentSize = CGSizeMake(imageView1.frame.size.width,imageView1.frame.size.height);
scroll.contentSize = CGSizeMake(200, 200);
scroll.minimumZoomScale = .50;
scroll.maximumZoomScale = 1.5;
}
and my crop logic is
-(IBAction)cropButtonClicked
{
//Calculate the required area from the scrollview
CGRect rect = CGRectMake(50, 50, 200,200);
UIImage *image = [self imageByCropping:imageView1.image toRect:rect];
imageView1.image=image;
imageView1.contentMode = UIViewContentModeScaleAspectFit;
}
And this method crops the image:
- (UIImage*)imageByCropping:(UIImage *)myImage toRect:(CGRect)cropToArea{
CGImageRef cropImageRef = CGImageCreateWithImageInRect(myImage.CGImage, cropToArea);
UIImage* cropped = [UIImage imageWithCGImage:cropImageRef];
CGImageRelease(cropImageRef);
return cropped;
}
Answer to my own question: after many efforts I found the answer to both of my questions,
and it works well for me. I share it here; maybe it will help someone. :)
1) Fit the image view inside the Scroll View. I used this link:
- (void)centerScrollViewContents {
CGSize boundsSize = scroll.bounds.size;
CGRect contentsFrame = self.imageView1.frame;
if (contentsFrame.size.width < boundsSize.width) {
contentsFrame.origin.x = (boundsSize.width - contentsFrame.size.width) / 2.0f;
}
else {
contentsFrame.origin.x = 0.0f;
}
if (contentsFrame.size.height < boundsSize.height) {
contentsFrame.origin.y = (boundsSize.height - contentsFrame.size.height) / 2.0f;
}
else {
contentsFrame.origin.y = 0.0f;
}
self.imageView1.frame = contentsFrame;
}
2) Crop a zoomed image inside the scrollView. I used this link:
UIGraphicsBeginImageContext(CGSizeMake(200, 200));
[scroll.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *fullScreenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
imageView1.contentMode = UIViewContentModeScaleAspectFill;
UIImageWriteToSavedPhotosAlbum(fullScreenshot, nil, nil, nil);
return fullScreenshot;
I had a similar problem recently; I got the solution using the relative-coordinates idea (X(a,b) = X(a,c) - X(b,c)).
-(IBAction)crop:(id)sender{
// F = frame
// i = image
// iv = imageview
// sv = self.view
// of = offset
//Frame Image in imageView coordinates
CGRect Fi_iv = [self frameForImage:self.image inImageViewAspectFit:self.imageView];
//Frame ImageView in self.view coordinates
CGRect Fiv_sv = self.imageView.frame;
//Frame Image in self.view coordinates
CGRect Fi_sv = CGRectMake(Fi_iv.origin.x + Fiv_sv.origin.x
,Fi_iv.origin.y + Fiv_sv.origin.y,
Fi_iv.size.width, Fi_iv.size.height);
//ScrollView offset
CGPoint offset = self.scrollView.contentOffset;
//Frame Image in offset coordinates
CGRect Fi_of = CGRectMake(Fi_sv.origin.x - offset.x,
Fi_sv.origin.y - offset.y,
Fi_sv.size.width,
Fi_sv.size.height);
CGFloat scale = self.imageView.image.size.width/Fi_of.size.width;
//the crop frame in image offset coordinates
CGRect Fcrop_iof = CGRectMake((self.cropView.frame.origin.x - Fi_of.origin.x)*scale,
(self.cropView.frame.origin.y - Fi_of.origin.y)*scale,
self.cropView.frame.size.width*scale,
self.cropView.frame.size.height*scale);
UIImage *image = [self image:self.imageView.image cropRect:Fcrop_iof];
// Use this crop image...
}
You can crop the image using:
//Use this code to crop when you have the right frame
-(UIImage*)image:(UIImage *)image cropRect:(CGRect)frame
{
// Create a new UIImage
CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, frame);
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return croppedImage;
}
Since you are using the ImageView in Aspect Fit, you need to calculate the image frame inside the imageView
-(CGRect)frameForImage:(UIImage*)image inImageViewAspectFit:(UIImageView*)imageView
{
float imageRatio = image.size.width / image.size.height;
float viewRatio = imageView.frame.size.width / imageView.frame.size.height;
if(imageRatio < viewRatio)
{
float scale = imageView.frame.size.height / image.size.height;
float width = scale * image.size.width;
float topLeftX = (imageView.frame.size.width - width) * 0.5;
return CGRectMake(topLeftX, 0, width, imageView.frame.size.height);
}
else
{
float scale = imageView.frame.size.width / image.size.width;
float height = scale * image.size.height;
float topLeftY = (imageView.frame.size.height - height) * 0.5;
return CGRectMake(0, topLeftY, imageView.frame.size.width, height);
}
}
I hope this code can work for you as it worked for me.
Best,
Rafael.
I'm trying to take a UIImageView, resize its image, and display it. Then I want to break that image up into smaller pieces and display them.
The resized image displays correctly, but it appears that the "split up" images are too big; I think they are coming from the original (slightly bigger) image. The following screenshot shows the resized image on the left and a column of split-up images from its left-hand side.
The fact that the resized image displays correctly while the smaller ones don't has me confused. Any ideas would be appreciated, or even an alternative. Here is my code:
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
width = imgOriginal.frame.size.width;
height = imgOriginal.frame.size.height;
whratio = width/height;
[self getOriginalImageInfo];
[self resizeImage];
resizedImage = [self resizeImage];
}
-(void) getOriginalImageInfo {
lblWidth.text = [NSString stringWithFormat:@"%0.2f", width];
lblHeight.text = [NSString stringWithFormat:@"%0.2f", height];
lblWHRatio.text = [NSString stringWithFormat:@"%0.2f", whratio];
}
-(UIImageView*) resizeImage {
if (whratio >= 0.7 && whratio <=0.89) {
imgOriginal.frame = CGRectMake(20, 20, 500, 600);
imgOriginal.autoresizingMask = NO;
float resizedWHRatio = (imgOriginal.frame.size.width)/(imgOriginal.frame.size.height);
lblResizedWidth.text = [NSString stringWithFormat:@"%0.2f", imgOriginal.frame.size.width];
lblResizedHeight.text = [NSString stringWithFormat:@"%0.2f", imgOriginal.frame.size.height];
lblResizedWHRatio.text = [NSString stringWithFormat:@"%0.2f", resizedWHRatio];
return imgOriginal;
}
return nil;
}
- (IBAction)easyPressed:(id)sender {
NSLog(#"Easy button pressed");
CGImageRef original = [resizedImage.image CGImage];
float pieceWidth = (resizedImage.frame.size.width) / 5;
float pieceHeight = (resizedImage.frame.size.height) / 6;
for (int i =0; i<6; i++) {
CGRect rectangle = CGRectMake(0, 0+((float)i*pieceHeight), pieceWidth, pieceHeight);
CGImageRef newTile = CGImageCreateWithImageInRect(original, rectangle);
UIImageView *puzzlePiece = [[UIImageView alloc] initWithFrame:CGRectMake(611, 20+((float)i*pieceHeight), pieceWidth, pieceHeight)];
puzzlePiece.tag = i+1;
puzzlePiece.image = [UIImage imageWithCGImage:newTile];
puzzlePiece.alpha = 0.5;
[puzzlePiece.layer setBorderColor: [[UIColor blackColor] CGColor]];
[puzzlePiece.layer setBorderWidth: 1.0];
[self.view addSubview:puzzlePiece];
}
}
Try applying the same scaling factor to the partial images as you are to the main image!!!
OR
Take the screenshot of the UIImageView being displayed and split that up instead.
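As a rough sketch of that second suggestion (using the resizedImage image view and the pieceWidth/pieceHeight values from the question): render the resized image view into a bitmap first, then slice that bitmap, so the tile rectangles and the pixel data share the same coordinate space.
// Snapshot the on-screen (already resized) image view
UIGraphicsBeginImageContextWithOptions(resizedImage.bounds.size, NO, 0.0);
[resizedImage.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// CGImageCreateWithImageInRect works in pixels, so convert the point-based
// tile rectangle using the snapshot's scale before slicing
CGFloat s = snapshot.scale;
CGRect pieceRect = CGRectMake(0, i * pieceHeight * s, pieceWidth * s, pieceHeight * s);
CGImageRef tileRef = CGImageCreateWithImageInRect(snapshot.CGImage, pieceRect);
UIImage *pieceImage = [UIImage imageWithCGImage:tileRef scale:s orientation:UIImageOrientationUp];
CGImageRelease(tileRef);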
I'm currently working on a feature to print the content of a view via AirPrint.
For this feature I'm creating a UIImage from the view and sending it to UIPrintInteractionController.
The problem is that the image is resized to the full resolution of the paper and not its original size (approx. 300x500 px). Does anybody know how to create a proper page from my image?
Here is the code:
/** Create UIImage from UIScrollView**/
-(UIImage*)printScreen{
UIImage* img = nil;
UIGraphicsBeginImageContext(scrollView.contentSize);
{
CGPoint savedContentOffset = scrollView.contentOffset;
CGRect savedFrame = scrollView.frame;
scrollView.contentOffset = CGPointZero;
scrollView.frame = CGRectMake(0, 0, scrollView.contentSize.width, scrollView.contentSize.height);
scrollView.backgroundColor = [UIColor whiteColor];
[scrollView.layer renderInContext: UIGraphicsGetCurrentContext()];
img = UIGraphicsGetImageFromCurrentImageContext();
scrollView.contentOffset = savedContentOffset;
scrollView.frame = savedFrame;
scrollView.backgroundColor = [UIColor clearColor];
}
UIGraphicsEndImageContext();
return img;
}
/** Print view content via AirPrint **/
-(void)doPrint{
if ([UIPrintInteractionController isPrintingAvailable])
{
UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
UIImage *image = [(ReservationOverView*)self.view printScreen];
NSData *myData = [NSData dataWithData:UIImagePNGRepresentation(image)];
if(pic && [UIPrintInteractionController canPrintData: myData] ) {
pic.delegate =(id<UIPrintInteractionControllerDelegate>) self;
UIPrintInfo *printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputPhoto;
printInfo.jobName = [NSString stringWithFormat:@"Reservation-%@",self.reservation.reservationID];
printInfo.duplex = UIPrintInfoDuplexNone;
pic.printInfo = printInfo;
pic.showsPageRange = YES;
pic.printingItem = myData;
//pic.delegate = self;
void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) = ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
if (!completed && error) {
NSLog(#"FAILED! due to error in domain %# with error code %u", error.domain, error.code);
}
};
[pic presentAnimated:YES completionHandler:completionHandler];
}
}
}
I've tried to resize the image manually, but this does not work properly.
I've found this sample code on Apple:
https://developer.apple.com/library/ios/samplecode/PrintPhoto/Listings/Classes_PrintPhotoPageRenderer_m.html#//apple_ref/doc/uid/DTS40010366-Classes_PrintPhotoPageRenderer_m-DontLinkElementID_6
And it looks like the proper way to size an image for printing (so it doesn't fill the entire page) is to implement your own UIPrintPageRenderer and implement:
- (void)drawPageAtIndex:(NSInteger)pageIndex inRect:(CGRect)printableRect
The printableRect will tell you the size of the paper and you can scale it down to however much you want (presumably by calculating some DPI).
Update: I ended up implementing my own ImagePageRenderer:
- (void)drawPageAtIndex:(NSInteger)pageIndex inRect:(CGRect)printableRect
{
if( self.image )
{
CGSize printableAreaSize = printableRect.size;
// Apple uses 72dpi by default for printing images. This
// renders out the image to be giant. Instead, we should
// resize our image to our desired dpi.
CGFloat dpiScale = kAppleDPI / self.dpi;
CGFloat imageWidth = self.image.size.width * dpiScale;
CGFloat imageHeight = self.image.size.height * dpiScale;
// scale image if paper is too small
BOOL scaleImage = printableAreaSize.width < imageWidth || printableAreaSize.height < imageHeight;
if( scaleImage )
{
CGFloat widthScale = (CGFloat)printableAreaSize.width / imageWidth;
CGFloat heightScale = (CGFloat)printableAreaSize.height / imageHeight;
// Choose smaller scale so there's no clipping
CGFloat scale = widthScale < heightScale ? widthScale : heightScale;
imageWidth *= scale;
imageHeight *= scale;
}
// If you want to center vertically, horizontally, or both,
// modify the origin below.
CGRect destRect = CGRectMake( printableRect.origin.x,
printableRect.origin.y,
imageWidth,
imageHeight );
// Use UIKit to draw the image to destRect.
[self.image drawInRect:destRect];
}
else
{
NSLog( #"no image to print" );
}
}
// Assumed completion: a drawing context is needed before drawInRect, sized to destinationRect
UIGraphicsBeginImageContextWithOptions(destinationRect.size, NO, 0.0);
UIImage *image = [UIImage imageNamed:@"myImage"];
[image drawInRect: destinationRect];
UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(thumbnail,nil,nil,nil);
The destinationRect will be sized according to the dimensions of the downsized version.
I have an application where I am displaying large images in a small space.
The images are quite large, but I am only displaying them in 100x100 pixel frames.
My app is responding slowly because of the size of the images I am using.
To improve performance, how can I resize the images programmatically using Objective-C?
Please find the following code.
- (UIImage *)imageWithImage:(UIImage *)image convertToSize:(CGSize)size {
UIGraphicsBeginImageContext(size);
[image drawInRect:CGRectMake(0, 0, size.width, size.height)];
UIImage *destImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return destImage;
}
The code above just redraws the image at a new scale; it is not a proportional resize. Set the CGSize according to your image's width and height so the image will not stretch, and it will be arranged in the middle:
- (UIImage *)imageWithImage:(UIImage *)image scaledToFillSize:(CGSize)size
{
CGFloat scale = MAX(size.width/image.size.width, size.height/image.size.height);
CGFloat width = image.size.width * scale;
CGFloat height = image.size.height * scale;
CGRect imageRect = CGRectMake((size.width - width)/2.0f,
(size.height - height)/2.0f,
width,
height);
UIGraphicsBeginImageContextWithOptions(size, NO, 0);
[image drawInRect:imageRect];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
My favorite way to do this is with CGImageSourceCreateThumbnailAtIndex (in the ImageIO framework). The name is a bit misleading.
Here's an excerpt of some code from a recent app of mine.
CGFloat maxw = // whatever;
CGFloat maxh = // whatever;
CGImageSourceRef src = NULL;
if ([imageSource isKindOfClass:[NSURL class]])
src = CGImageSourceCreateWithURL((__bridge CFURLRef)imageSource, nil);
else if ([imageSource isKindOfClass:[NSData class]])
src = CGImageSourceCreateWithData((__bridge CFDataRef)imageSource, nil);
// if at double resolution, double the thumbnail size and use double-resolution image
CGFloat scale = 1;
if ([[UIScreen mainScreen] scale] > 1.0) {
scale = 2;
maxw *= 2;
maxh *= 2;
}
// load the image at the desired size
NSDictionary* d = @{
(id)kCGImageSourceShouldAllowFloat: (id)kCFBooleanTrue,
(id)kCGImageSourceCreateThumbnailWithTransform: (id)kCFBooleanTrue,
(id)kCGImageSourceCreateThumbnailFromImageAlways: (id)kCFBooleanTrue,
(id)kCGImageSourceThumbnailMaxPixelSize: @((int)(maxw > maxh ? maxw : maxh))
};
CGImageRef imref = CGImageSourceCreateThumbnailAtIndex(src, 0, (__bridge CFDictionaryRef)d);
if (NULL != src)
CFRelease(src);
UIImage* im = [UIImage imageWithCGImage:imref scale:scale orientation:UIImageOrientationUp];
if (NULL != imref)
CFRelease(imref);
If you use an image at different sizes and resize it each time, it will degrade your app's performance. The solution is: don't resize the images; just use a button in place of the image view. Set the image on the button and it will be resized automatically, and you will get great performance.
I was also resizing images while setting them on a cell, and my app got slow. So I used a button in place of the image view (the button does the resizing job instead of resizing images programmatically), and it works perfectly fine.
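A minimal sketch of that idea (the frame, image name, and cell are placeholders):
UIButton *imageButton = [UIButton buttonWithType:UIButtonTypeCustom];
imageButton.frame = CGRectMake(0, 0, 100, 100);
// Let the button's built-in image view handle the scaling
imageButton.contentHorizontalAlignment = UIControlContentHorizontalAlignmentFill;
imageButton.contentVerticalAlignment = UIControlContentVerticalAlignmentFill;
imageButton.imageView.contentMode = UIViewContentModeScaleAspectFit;
[imageButton setImage:[UIImage imageNamed:@"largePhoto"] forState:UIControlStateNormal];
imageButton.userInteractionEnabled = NO; // behaves like a plain image view
[cell.contentView addSubview:imageButton];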
-(UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)targetSize
{
//If scaleFactor is not touched, no scaling will occur
CGFloat scaleFactor = 1.0;
//Deciding which factor to use to scale the image (factor = targetSize / imageSize)
if (image.size.width > targetSize.width ||
image.size.height > targetSize.height || image.size.width == image.size.height)
if (!((scaleFactor = (targetSize.width /
image.size.width)) > (targetSize.height /
image.size.height))) //scale to fit width, or
scaleFactor = targetSize.height / image.size.height; // scale to fit height.
// Assumed completion of the truncated snippet: render at the computed scale and return the result
CGSize scaledSize = CGSizeMake(image.size.width * scaleFactor, image.size.height * scaleFactor);
UIGraphicsBeginImageContextWithOptions(scaledSize, NO, 0.0);
[image drawInRect:CGRectMake(0, 0, scaledSize.width, scaledSize.height)];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return scaledImage;
}
Since the code ran perfectly fine in iOS 4, for backwards compatibility I added a check for the OS version; for anything below 5.0 the old code is used.
- (UIImage *)resizedImage:(CGSize)newSize interpolationQuality:(CGInterpolationQuality)quality {
BOOL drawTransposed;
CGAffineTransform transform = CGAffineTransformIdentity;
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 5.0) {
// Apprently in iOS 5 the image is already correctly rotated, so we don't need to rotate it manually
drawTransposed = NO;
} else {
switch (self.imageOrientation) {
case UIImageOrientationLeft:
case UIImageOrientationLeftMirrored:
case UIImageOrientationRight:
case UIImageOrientationRightMirrored:
drawTransposed = YES;
break;
default:
drawTransposed = NO;
}
transform = [self transformForOrientation:newSize];
}
return [self resizedImage:newSize
transform:transform
drawTransposed:drawTransposed
interpolationQuality:quality];
}
You can use this.
[m_Image.layer setMinificationFilter:kCAFilterTrilinear];
This thread is old, but it is what came up when I was trying to solve this problem. Once the image was scaled, it was not displaying well in my container, even though I had turned Auto Layout off. The easiest way for me to solve this, for display in a table row, was to paint the image onto a white background of a fixed size.
Helper function
+(UIImage*)scaleMaintainAspectRatio:(UIImage*)sourceImage :(float)i_width :(float)i_height
{
float newHeight = 0.0;
float newWidth = 0.0;
float oldWidth = sourceImage.size.width;
float widthScaleFactor = i_width / oldWidth;
float oldHeight = sourceImage.size.height;
float heightScaleFactor = i_height / oldHeight;
if (heightScaleFactor > widthScaleFactor) {
newHeight = oldHeight * widthScaleFactor;
newWidth = sourceImage.size.width * widthScaleFactor;
} else {
newHeight = sourceImage.size.height * heightScaleFactor;
newWidth = oldWidth * heightScaleFactor;
}
// return image in white rect
float cxPad = i_width - newWidth;
float cyPad = i_height - newHeight;
if (cyPad > 0) {
cyPad = cyPad / 2.0;
}
if (cxPad > 0) {
cxPad = cxPad / 2.0;
}
CGSize size = CGSizeMake(i_width, i_height);
UIGraphicsBeginImageContextWithOptions(CGSizeMake(size.width, size.height), YES, 0.0);
[[UIColor whiteColor] setFill];
UIRectFill(CGRectMake(0, 0, size.width, size.height));
[sourceImage drawInRect:CGRectMake((int)cxPad, (int)cyPad, newWidth, newHeight)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
// will return scaled image at actual size, not in white rect
// UIGraphicsBeginImageContext(CGSizeMake(newWidth, newHeight));
// [sourceImage drawInRect:CGRectMake(0, 0, newWidth, newHeight)];
// UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
// UIGraphicsEndImageContext();
// return newImage;
}
I called it like this from my table view's cellForRowAtIndexPath:
PFFile *childsPicture = [object objectForKey:@"picture"];
[childsPicture getDataInBackgroundWithBlock:^(NSData *imageData, NSError *error) {
if (!error) {
UIImage *largePicture = [UIImage imageWithData:imageData];
UIImage *scaledPicture = [Utility scaleMaintainAspectRatio:largePicture :70.0 :70.0 ];
PFImageView *thumbnailImageView = (PFImageView*)[cell viewWithTag:100];
thumbnailImageView.image = scaledPicture;
[self.tableView reloadData];
}
}];
Hello from the end of 2018.
Solved with the following solution (you need only the last line; the first and second are just for explanation):
NSURL *url = [NSURL URLWithString:response.json[0][@"photo_50"]];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *image = [UIImage imageWithData:data scale:customScale];
'customScale' is the scale you want (>1 if the image must be smaller, <1 if the image must be bigger).
This C function will resize your image with a corner radius, without affecting the image's quality:
UIImage *Resize_Image(UIImage *iImage, CGFloat iSize, CGFloat icornerRadius) {
CGFloat scale = MAX(CGSizeMake(iSize ,iSize).width/iImage.size.width, CGSizeMake(iSize ,iSize).height/iImage.size.height);
CGFloat width = iImage.size.width * scale;
CGFloat height = iImage.size.height * scale;
CGRect imageRect = CGRectMake((CGSizeMake(iSize ,iSize).width - width)/2.0f,(CGSizeMake(iSize ,iSize).height - height)/2.0f,width,height);
UIGraphicsBeginImageContextWithOptions(CGSizeMake(iSize ,iSize), NO, 0);
[[UIBezierPath bezierPathWithRoundedRect:imageRect cornerRadius:icornerRadius] addClip];
[iImage drawInRect:imageRect];
UIImage *ResizedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return ResizedImage;
}
This is how to use it:
UIImage *ResizedImage = Resize_Image([UIImage imageNamed:@"image.png"], 64, 14.4);
I do not remember where I took the first four lines from.