Objective-C - UIImage resizing issue - iPhone

I have a resource (a .png file) that shows a picture frame (border).
The .png file is 100x100px, and the border is 10px wide.
My question:
How can I create another UIImage from this image, at a different size, without ruining the border's width?
The problem:
When I draw the new image from the original with CGContextDrawImage, I get an image at the new size, but the border proportions are ruined.
CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newWidth, newHeight));
CGImageRef imageRef = //... the image
// Build a context that's the same dimensions as the new size
CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                            newRect.size.width,
                                            newRect.size.height,
                                            CGImageGetBitsPerComponent(imageRef),
                                            0,
                                            CGImageGetColorSpace(imageRef),
                                            CGImageGetBitmapInfo(imageRef));
// Set the quality level to use when rescaling
CGContextSetInterpolationQuality(bitmap, kCGInterpolationHigh);
// Draw into the context; this scales the image
CGContextDrawImage(bitmap, newRect, imageRef);
// Get the resized image from the context and a UIImage
CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
// Clean up
CGContextRelease(bitmap);
CGImageRelease(newImageRef);
For example, when I tried to create an 800x100px image, I got an image with very thin top and bottom borders.
What I need is for the border to stay the same width.
Note:
Using resizableImageWithCapInsets: won't help me, because I need a new image at the new size to save to disk.

You can use resizableImageWithCapInsets:
UIImage *img = [UIImage imageNamed:@"myResource"];
img = [img resizableImageWithCapInsets:UIEdgeInsetsMake(10, 10, 10, 10)];
I've never used this approach with CGContextDrawImage, but it should work.
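If you need a flattened UIImage at the new size (for example, to save to disk, as the question requires), one option is to draw the cap-inset image into an image context; drawInRect: honors the insets, so the 10px border is preserved. A sketch, with the 800x100 target size taken from the question:

```objectivec
UIImage *img = [UIImage imageNamed:@"myResource"];
img = [img resizableImageWithCapInsets:UIEdgeInsetsMake(10, 10, 10, 10)];

// Render the stretched image into a bitmap context to get a plain UIImage
CGSize newSize = CGSizeMake(800, 100);
UIGraphicsBeginImageContextWithOptions(newSize, NO, img.scale);
[img drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *flattened = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// flattened can now be written out, e.g. with UIImagePNGRepresentation(flattened)
```

The resulting image is an ordinary UIImage with the borders at their original width and the middle stretched.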

Related

How to know if a UIImage is representable in PNG or JPG?

I got a UIImage from UIImagePickerController, and I am using the code from this site to resize the image:
- (UIImage *)resizedImage:(CGSize)newSize
                transform:(CGAffineTransform)transform
           drawTransposed:(BOOL)transpose
     interpolationQuality:(CGInterpolationQuality)quality {
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGRect transposedRect = CGRectMake(0, 0, newRect.size.height, newRect.size.width);
    CGImageRef imageRef = self.CGImage;
    // Build a context that's the same dimensions as the new size
    CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                                newRect.size.width,
                                                newRect.size.height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                0,
                                                CGImageGetColorSpace(imageRef),
                                                CGImageGetBitmapInfo(imageRef));
    // Rotate and/or flip the image if required by its orientation
    CGContextConcatCTM(bitmap, transform);
    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(bitmap, quality);
    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, transpose ? transposedRect : newRect, imageRef);
    // Get the resized image from the context and a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
    // Clean up
    CGContextRelease(bitmap);
    CGImageRelease(newImageRef);
    return newImage;
}
UIImagePNGRepresentation() failed to return NSData for the resized image, but UIImageJPEGRepresentation() succeeded.
How do we know whether a UIImage is representable as PNG or JPEG? What is missing from the above code that makes the resized image unrepresentable as PNG?
According to Apple's documentation: "This function may return nil if the image has no data or if the underlying CGImageRef contains data in an unsupported bitmap format."
Which bitmap formats does the PNG representation support? How do I get a UIImage into a PNG-supported format?
It turned out to be a mistake: in another part of the code, the image was rescaled with the following
CGContextRef context = CGBitmapContextCreate(NULL,
                                             size.width,
                                             size.height,
                                             8,
                                             0,
                                             CGImageGetColorSpace(source),
                                             kCGImageAlphaNoneSkipFirst);
Changing kCGImageAlphaNoneSkipFirst to CGImageGetBitmapInfo(source) fixed the problem.
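For reference, a context created with parameters known to be in the supported set could look like this (a sketch; kCGImageAlphaPremultipliedLast is one of the 8-bit RGBA combinations that Quartz bitmap contexts and UIImagePNGRepresentation() handle):

```objectivec
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL,
                                             size.width,
                                             size.height,
                                             8,   // bits per component
                                             0,   // let Quartz compute bytes per row
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);
```

The full list of supported pixel format combinations is in the "Supported Pixel Formats" table of the Quartz 2D Programming Guide.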
See the following question; it may help you:
How to check if downloaded PNG image is corrupt?

How can we set a large UIImage on a small UIImageView

I am using an image view of size 320x320 to display a large image (350x783).
When I place the large image into the image view, it looks squeezed and compressed, nothing like the original quality.
My question is: how can I shrink the large image while keeping quality as close to the original as possible?
You can set a proper content mode on your image view, like this:
imageView.contentMode = UIViewContentModeScaleAspectFit;
- (UIImage *)scaleToSize:(CGSize)size {
    // Create a bitmap graphics context
    // This will also set it as the current context
    UIGraphicsBeginImageContext(size);
    // Draw the scaled image in the current context
    [self drawInRect:CGRectMake(0, 0, size.width, size.height)];
    // Create a new image from the current context
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    // Pop the current context from the stack
    UIGraphicsEndImageContext();
    // Return our new scaled image
    return scaledImage;
}
You can do it like this:
UIImage *sourceImage = yourImage; // your original image
CGSize newSize = CGSizeMake(80, 80);
UIGraphicsBeginImageContext(newSize);
[sourceImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
yourImageView.image = newImage;

Image is getting stretched and blurred after crop

I have a UIImageView, and there is another canvas view over it. I want to crop the UIImageView's image to the canvas view's frame. But after the crop, the image comes out stretched and blurred. Below is my code.
UIImage *images = [self captureScreenInRect1:canvas.frame];
self.imgViewCurrent.contentMode = UIViewContentModeScaleToFill;
self.imgViewCurrent.image = images;
- (UIImage *)captureScreenInRect1:(CGRect)captureFrame {
    CALayer *layer = self.view.layer;
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
    CGContextClipToRect(UIGraphicsGetCurrentContext(), captureFrame);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGImageRef imageRef = CGImageCreateWithImageInRect([screenImage CGImage], captureFrame);
    UIImage *img = [UIImage imageWithCGImage:imageRef scale:0.0 orientation:UIImageOrientationUp];
    CGImageRelease(imageRef);
    return img;
}
Change this line:
self.imgViewCurrent.contentMode = UIViewContentModeScaleToFill;
to this:
self.imgViewCurrent.contentMode = UIViewContentModeScaleAspectFit;
UIViewContentModeScaleToFill basically stretches the image.
After cropping your image, you can resize it to a custom size. It may help you.
- (UIImage *)resizedImage:(UIImage *)inImage thumbRect:(CGRect)thumbRect
{
    CGImageRef imageRef = [inImage CGImage];
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    // There's a weirdness with kCGImageAlphaNone and CGBitmapContextCreate;
    // see "Supported Pixel Formats" in the Quartz 2D Programming Guide,
    // "Creating a Bitmap Graphics Context" section.
    // Only 8-bit RGB images with alpha of kCGImageAlphaNoneSkipFirst, kCGImageAlphaNoneSkipLast,
    // kCGImageAlphaPremultipliedFirst, kCGImageAlphaPremultipliedLast,
    // and a few other oddball image kinds are supported.
    // The images on input here are likely to be PNG or JPEG files.
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;
    // Build a bitmap context that's the size of the thumbRect
    CGContextRef bitmap = CGBitmapContextCreate(
        NULL,
        thumbRect.size.width,                 // width
        thumbRect.size.height,                // height
        CGImageGetBitsPerComponent(imageRef), // really needs to always be 8
        4 * thumbRect.size.width,             // row bytes
        CGImageGetColorSpace(imageRef),
        alphaInfo
    );
    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, thumbRect, imageRef);
    // Get an image from the context and a UIImage
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap); // ok if NULL
    CGImageRelease(ref);
    return result;
}

Is it possible to isolate a single color in a UIImage/CGImageRef

I'm wondering if there is a way to isolate a single color in an image, either using masks or perhaps a custom color space. I'm ultimately looking for a fast way to isolate 14 colors in an image; I figured that if there were a masking method, it might be faster than walking through the pixels.
Any help is appreciated!
You could create a custom color space (documentation here) and substitute it for CGColorSpaceCreateDeviceGray() in the following code:
- (UIImage *)convertImageToGrayScale:(UIImage *)image
{
    // Create an image rectangle with the current image width/height
    CGRect imageRect = CGRectMake(0, 0, image.size.width, image.size.height);
    // Grayscale color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray(); // <- SUBSTITUTE HERE
    // Create a bitmap context with the current image size and the grayscale color space
    CGContextRef context = CGBitmapContextCreate(nil, image.size.width, image.size.height, 8, 0, colorSpace, kCGImageAlphaNone);
    // Draw the image into the context, within the specified rectangle
    // (using the previously defined context with the grayscale color space)
    CGContextDrawImage(context, imageRect, [image CGImage]);
    // Create a bitmap image from the pixel data in the current context
    CGImageRef imageRef = CGBitmapContextCreateImage(context);
    // Create a new UIImage object
    UIImage *newImage = [UIImage imageWithCGImage:imageRef];
    // Release the color space, context, and bitmap image
    CGColorSpaceRelease(colorSpace);
    CGContextRelease(context);
    CFRelease(imageRef);
    // Return the new grayscale image
    return newImage;
}
This code is from this blog, which is worth a look for removing colors from images.
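If a custom color space proves awkward, walking the pixels directly is still reasonably fast for a handful of target colors. A rough sketch, assuming an RGBA8888 buffer and a simple per-channel tolerance (the method name and tolerance parameter are mine, not from the question; pixels outside the tolerance are grayed out):

```objectivec
- (UIImage *)isolateColorInImage:(UIImage *)image
                             red:(int)r green:(int)g blue:(int)b
                       tolerance:(int)tol {
    CGImageRef imageRef = image.CGImage;
    size_t width = CGImageGetWidth(imageRef);
    size_t height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Draw into a buffer we own so the byte layout (RGBA, 8 bits each) is known
    uint8_t *pixels = calloc(width * height * 4, 1);
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, width * 4,
                                             colorSpace, kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), imageRef);
    for (size_t i = 0; i < width * height * 4; i += 4) {
        if (abs(pixels[i]     - r) > tol ||
            abs(pixels[i + 1] - g) > tol ||
            abs(pixels[i + 2] - b) > tol) {
            // Not the target color: replace with its gray average
            uint8_t gray = (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
            pixels[i] = pixels[i + 1] = pixels[i + 2] = gray;
        }
    }
    CGImageRef resultRef = CGBitmapContextCreateImage(ctx);
    UIImage *result = [UIImage imageWithCGImage:resultRef];
    CGImageRelease(resultRef);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    free(pixels);
    return result;
}
```

For 14 colors, you would run the tolerance test against each target inside the same single pass, so the cost stays one pixel walk regardless of the number of colors.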

iPhone: How to Maintain Original Image Size Throughout Image Edits

I am developing an iPhone app that resizes and merges images.
I want to select two photos of size 1600x1200 from the photo library, merge them into a single image, and save the new image back to the photo library.
However, I can't get the right size for the merged image.
I use two image views with 320x480 frames and set each view's image to one of my imported images. After manipulating the images (zooming, cropping, rotating), I save the result to the album. When I check the saved image, its size is 600x800. How do I keep the original size of 1600x1200?
I've been stuck on this problem for two weeks!
Thanks in advance.
The frame of the UIImageView has nothing to do with the size of the image it displays. If you display a 1200x1600 pixel image in a 75x75 image view, the image in memory is still 1200x1600 pixels. Somewhere in your processing you are resetting its size.
You need to resize the images programmatically behind the scenes and ignore how they are displayed. For highest fidelity, perform all processing on the image at full size and resize only the final result. For speed and low memory use, resize smaller first, process, and then resize again as needed.
I use Trevor Harmon's UIImage+Resize to resize images.
His core method looks like this:
- (UIImage *)resizedImage:(CGSize)newSize
                transform:(CGAffineTransform)transform
           drawTransposed:(BOOL)transpose
     interpolationQuality:(CGInterpolationQuality)quality
{
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGRect transposedRect = CGRectMake(0, 0, newRect.size.height, newRect.size.width);
    CGImageRef imageRef = self.CGImage;
    // Build a context that's the same dimensions as the new size
    CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                                newRect.size.width,
                                                newRect.size.height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                0,
                                                CGImageGetColorSpace(imageRef),
                                                CGImageGetBitmapInfo(imageRef));
    // Rotate and/or flip the image if required by its orientation
    CGContextConcatCTM(bitmap, transform);
    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(bitmap, quality);
    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, transpose ? transposedRect : newRect, imageRef);
    // Get the resized image from the context and a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
    // Clean up
    CGContextRelease(bitmap);
    CGImageRelease(newImageRef);
    return newImage;
}
Harmon saved me dozens of man hours trying to get resizing done correctly.
Solved as follows:
UIView *bgView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 1600, 1200)];
UIGraphicsBeginImageContext(bgView.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, self, nil, nil);
Thanks for all your support in solving the issue.