Convert any uploaded image to 72 DPI - iPhone

In my application I have an image upload feature, and I want to convert any uploaded image to 72 DPI if its DPI is higher or lower.
Can we do this using CGImageCreate?
Please suggest an approach.
Thanks.

We can get an image at 72 DPI with the method below; essentially we just need to redraw the image into a new context and grab the result.
- (UIImage *)resizeImageFor72DPI:(UIImage *)image newSize:(CGSize)newSize
{
    CGRect newRect = CGRectZero;
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]
        && [[UIScreen mainScreen] scale] == 2.0)
    {
        // For Retina: halve the point size so the pixel size stays the same
        newSize = CGSizeMake(newSize.width / 2.0, newSize.height / 2.0);
    }
    newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGImageRef imageRef = image.CGImage;

    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, newSize.height);
    CGContextConcatCTM(context, flipVertical);

    // Draw into the context; this scales the image
    CGContextDrawImage(context, newRect, imageRef);

    // Get the resized image from the context as a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(context);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];

    CGImageRelease(newImageRef);
    UIGraphicsEndImageContext();

    return newImage;
}
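Worth adding, beyond the question's scope: redrawing changes the pixel dimensions, but the DPI number itself lives in the image file's metadata, not in the bitmap. If whatever consumes the upload actually inspects the DPI field, you can stamp 72 DPI explicitly when saving by writing the image with ImageIO. A sketch, assuming you link ImageIO and MobileCoreServices; the output path is just an example:

#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Write newImage as a JPEG whose metadata declares 72 DPI.
NSDictionary *properties = @{ (id)kCGImagePropertyDPIWidth  : @72,
                              (id)kCGImagePropertyDPIHeight : @72 };
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"output.jpg"]; // example path
NSURL *url = [NSURL fileURLWithPath:path];
CGImageDestinationRef destination =
    CGImageDestinationCreateWithURL((__bridge CFURLRef)url, kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(destination, newImage.CGImage,
                           (__bridge CFDictionaryRef)properties);
CGImageDestinationFinalize(destination);
CFRelease(destination);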

UIImage Black and White with Transparency

Below is my code for converting an image to black and white. It works fine unless the image has transparency: the transparent area is converted to black. Can anyone help with what is wrong here?
+ (UIImage *)getBlackAndWhiteVersionOfImage:(UIImage *)anImage
{
    UIImage *newImage;
    UIImage *imageToDisplay;
    UIImageOrientation orientation = anImage.imageOrientation;
    if (anImage) {
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
        CGContextRef context = CGBitmapContextCreate(nil,
                                                     anImage.size.width * anImage.scale,
                                                     anImage.size.height * anImage.scale,
                                                     8,
                                                     anImage.size.width * anImage.scale,
                                                     colorSpace,
                                                     kCGImageAlphaNone);
        CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
        CGContextSetShouldAntialias(context, NO);
        CGContextDrawImage(context,
                           CGRectMake(0, 0, anImage.size.width, anImage.size.height),
                           [anImage CGImage]);
        CGImageRef bwImage = CGBitmapContextCreateImage(context);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        UIImage *resultImage = [UIImage imageWithCGImage:bwImage];
        CGImageRelease(bwImage);

        UIGraphicsBeginImageContextWithOptions(anImage.size, NO, anImage.scale);
        [resultImage drawInRect:CGRectMake(0.0, 0.0, anImage.size.width, anImage.size.height)];
        newImage = UIGraphicsGetImageFromCurrentImageContext();
        imageToDisplay = [UIImage imageWithCGImage:[newImage CGImage]
                                             scale:1.0
                                       orientation:orientation];
        UIGraphicsEndImageContext();
    }
    return imageToDisplay;
}
The gray color space doesn't have an alpha component. With kCGImageAlphaNone the source alpha is discarded, and the transparent pixels come out black.
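One way to keep the transparency is to re-apply the original image's alpha channel after the grayscale pass: clip a normal bitmap context to the original image (CGContextClipToMask uses its alpha channel), then draw the grayscale version inside the clip. A sketch building on the method above; the method name is mine:

+ (UIImage *)blackAndWhiteImagePreservingAlpha:(UIImage *)anImage
{
    // Grayscale render first, using the method from the question.
    UIImage *grayImage = [self getBlackAndWhiteVersionOfImage:anImage];
    CGRect rect = CGRectMake(0, 0, anImage.size.width, anImage.size.height);

    UIGraphicsBeginImageContextWithOptions(anImage.size, NO, anImage.scale);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Core Graphics is flipped vertically relative to UIKit.
    CGContextTranslateCTM(context, 0, anImage.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);
    // Clip to the original alpha channel, then draw the gray pixels inside it.
    CGContextClipToMask(context, rect, anImage.CGImage);
    CGContextDrawImage(context, rect, grayImage.CGImage);
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}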

Capture the screen as it looks

I am working on an application that shows an image view, with another view on top that has some clear holes in it, so the background image is visible through the holes. My problem is that I want to capture the whole screen (the image view together with the holes view). I am using the code below, but it's not working.
- (UIImage *)captureView:(UIView *)yourView {
    CGRect rect = [[UIScreen mainScreen] bounds];
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [yourView.layer renderInContext:context];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
This is how we do it:
UIGraphicsBeginImageContextWithOptions(yourView.bounds.size, yourView.opaque, 0.0);
[yourView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *lastImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
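Note that renderInContext: only captures the layer tree of the view you pass in. If the image view and the overlay with the holes live in different branches of the view hierarchy, render a common ancestor instead, or simply the window. A sketch, assuming everything is on screen in the key window:

UIGraphicsBeginImageContextWithOptions([UIScreen mainScreen].bounds.size, NO, 0.0);
[[UIApplication sharedApplication].keyWindow.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();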
Try this to take your screenshot:
- (UIImage *)getScreenShot
{
    CGSize screenSize = [[UIScreen mainScreen] applicationFrame].size;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(nil,
                                             screenSize.width,
                                             screenSize.height,
                                             8,
                                             4 * (int)screenSize.width,
                                             colorSpaceRef,
                                             kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpaceRef);
    // Flip the context: Core Graphics and UIKit use opposite vertical axes
    CGContextTranslateCTM(ctx, 0.0, screenSize.height);
    CGContextScaleCTM(ctx, 1.0, -1.0);
    [self.yourView.layer renderInContext:ctx];
    CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(ctx);
    return image;
}

UIGraphicsBeginImageContextWithOptions and UIImageJPEGRepresentation not working well together

So I have this code to create a UIImage:
UIGraphicsBeginImageContextWithOptions(border.frame.size, YES, 0);
[border.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *thumbnailImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
At this point, the size of the image is correct: 80x100.
Then it runs this code:
NSData *fullImageData = UIImageJPEGRepresentation(image, 1.0f);
And the NSData of the image comes back at 160x200, twice the size it should be.
It became clear that the reason for this is the line:
UIGraphicsBeginImageContextWithOptions(border.frame.size, YES, 0);
The 0 at the end is the scale; because it's 0, the context uses the device's scale factor. I keep it this way to get a sharp image. However, when I set the scale to 1, the image stays the size it should be, but it no longer comes out in Retina quality. What I want is to keep Retina quality but also keep the right size. Is there a way to do this?
A UIImage with scale 2.0 is 160x200 pixels displayed as 80x100 points, and UIImageJPEGRepresentation encodes the raw pixels: JPEG has no notion of a scale factor. Try resizing the UIImage before calling UIImageJPEGRepresentation:
- (UIImage *)resizeImage:(UIImage *)image newSize:(CGSize)newSize {
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGImageRef imageRef = image.CGImage;

    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, newSize.height);
    CGContextConcatCTM(context, flipVertical);

    // Draw into the context; this scales the image
    CGContextDrawImage(context, newRect, imageRef);

    // Get the resized image from the context as a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(context);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];

    CGImageRelease(newImageRef);
    UIGraphicsEndImageContext();

    return newImage;
}
if ([UIScreen mainScreen].scale > 1)
{
    thumbnailImage = [self resizeImage:thumbnailImage
                               newSize:CGSizeMake(thumbnailImage.size.width / [UIScreen mainScreen].scale,
                                                  thumbnailImage.size.height / [UIScreen mainScreen].scale)];
}
Alternatively, redraw the image at an explicit size:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
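Putting the first suggestion together (a sketch; thumbnailImage is the 80x100-point, scale-2 image rendered in the question):

CGFloat screenScale = [UIScreen mainScreen].scale;
// 80x100 points at scale 2 becomes 40x50 points in a scale-2 context,
// i.e. an 80x100-pixel bitmap, which is exactly what the JPEG encoder sees.
UIImage *resized = [self resizeImage:thumbnailImage
                             newSize:CGSizeMake(thumbnailImage.size.width / screenScale,
                                                thumbnailImage.size.height / screenScale)];
NSData *fullImageData = UIImageJPEGRepresentation(resized, 1.0f); // now 80x100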

How to crop a particular image on iPhone

I am working on a camera application. The user takes a picture, which works fine, but I want to crop anywhere in that image and send the result to a server. How can I do this?
Check out this link for details:
http://www.hive05.com/2008/11/crop-an-image-using-the-iphone-sdk/
Basic code:
- (UIImage *)imageByCropping:(UIImage *)imageToCrop toRect:(CGRect)rect
{
    // Create a context to do our clipping in
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef currentContext = UIGraphicsGetCurrentContext();

    // Create a rect with the size we want to crop the image to;
    // the X and Y here are zero so we start at the beginning of our
    // newly created context
    CGRect clippedRect = CGRectMake(0, 0, rect.size.width, rect.size.height);
    CGContextClipToRect(currentContext, clippedRect);

    // Create a rect equivalent to the full size of the image,
    // offset by the X and Y we want to start the crop from,
    // in order to cut off anything before them
    CGRect drawRect = CGRectMake(rect.origin.x * -1,
                                 rect.origin.y * -1,
                                 imageToCrop.size.width,
                                 imageToCrop.size.height);

    // Draw the image into our clipped context using our offset rect
    CGContextDrawImage(currentContext, drawRect, imageToCrop.CGImage);

    // Pull the image from our cropped context
    UIImage *cropped = UIGraphicsGetImageFromCurrentImageContext();

    // Pop the context to get back to the default
    UIGraphicsEndImageContext();

    // Note: this is autoreleased
    return cropped;
}
I think I can provide a simpler solution than that large amount of code:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // do something......
    UIImage *croppedImage = [self imageByCropping:[UIImage imageNamed:@"SomeImage.png"]
                                           toRect:CGRectMake(10, 10, 100, 100)];
}

- (UIImage *)imageByCropping:(UIImage *)imageToCrop toRect:(CGRect)rect
{
    CGImageRef imageRef = CGImageCreateWithImageInRect([imageToCrop CGImage], rect);
    UIImage *cropped = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef); // CGImageCreateWithImageInRect returns a +1 reference
    return cropped;
}
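One caveat: CGImageCreateWithImageInRect works in pixels and ignores the UIImage's scale and orientation, so for a Retina image or a rotated camera photo the rect may need converting first. A sketch; the method name and parameter label are mine:

- (UIImage *)imageByCropping:(UIImage *)imageToCrop toPointRect:(CGRect)rect
{
    // Convert the point-based rect to pixel coordinates before cropping.
    CGFloat scale = imageToCrop.scale;
    CGRect pixelRect = CGRectMake(rect.origin.x * scale,
                                  rect.origin.y * scale,
                                  rect.size.width * scale,
                                  rect.size.height * scale);
    CGImageRef imageRef = CGImageCreateWithImageInRect(imageToCrop.CGImage, pixelRect);
    // Carry the scale and orientation over to the cropped image.
    UIImage *cropped = [UIImage imageWithCGImage:imageRef
                                           scale:scale
                                     orientation:imageToCrop.imageOrientation];
    CGImageRelease(imageRef);
    return cropped;
}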

Problem in cropping the UIImage using CGContext?

I am developing a simple UIApplication in which I want to crop a UIImage (in .jpg format) with the help of CGContext. The code developed so far is as follows:
CGImageRef graphicOriginalImage = [originalImage.image CGImage];
UIGraphicsBeginImageContext(originalImage.image.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGBitmapContextCreateImage(graphicOriginalImage);
CGFloat fltW = originalImage.image.size.width;
CGFloat fltH = originalImage.image.size.height;
CGFloat X = round(fltW / 4);
CGFloat Y = round(fltH / 4);
CGFloat width = round(X + (fltW / 2));
CGFloat height = round(Y + (fltH / 2));
CGContextTranslateCTM(ctx, 0, originalImage.image.size.height);
CGContextScaleCTM(ctx, 1.0, -1.0);
CGRect rect = CGRectMake(X, Y, width, height);
CGContextDrawImage(ctx, rect, graphicOriginalImage);
croppedImage = UIGraphicsGetImageFromCurrentImageContext();
return croppedImage;
The above code runs fine, but it doesn't actually crop the image: the original image and the cropped image come out the same size in memory. Is the above code right for cropping an image?
A good way to crop an image to a CGRect is the context-clipping imageByCropping:toRect: method shown in the previous question: clip the context to the target size, then draw the image offset by the negated crop origin.
Or another way:
- (UIImage *)imageByCropping:(UIImage *)imageToCrop toRect:(CGRect)rect
{
    CGImageRef imageRef = CGImageCreateWithImageInRect([imageToCrop CGImage], rect);
    UIImage *cropped = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return cropped;
}
From http://www.hive05.com/2008/11/crop-an-image-using-the-iphone-sdk/.
The context you create to draw the image has the same size as the original image; that's why the output comes out the same size. Create the context with the size of the crop rect instead, and offset the drawing so the region you want lands inside it.
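In other words, a minimal fix (a sketch reusing the asker's variable names X, Y, fltW, fltH) is to begin the context with the crop size and shift the drawing so the desired region lands at the origin:

// Crop a (fltW/2) x (fltH/2) region starting at (X, Y).
CGRect cropRect = CGRectMake(X, Y, fltW / 2, fltH / 2);
UIGraphicsBeginImageContext(cropRect.size); // context is crop-sized, not image-sized
// Shift the image so (X, Y) maps to the context origin; drawAtPoint:
// handles the UIKit/Core Graphics coordinate flip for us.
[originalImage.image drawAtPoint:CGPointMake(-cropRect.origin.x, -cropRect.origin.y)];
UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();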
If you don't want to re-invent the wheel, take a look at the TouchCode project on Google Code. You will find UIImage categories that do the job (see UIImage_ThumbnailExtensions.m).