How do I perform a square crop on photos in the camera roll? - iPhone

I would like to try some image filter features on iPhone, like Instagram does.
I use an image picker controller to get a photo from the camera roll. I understand that the image returned by the picker is reduced to save memory, and that it is not wise to load the original image into a UIImage. But how can I process the image and then save it back at the original pixel dimensions?
I use an iPhone 4S as my development device.
The original photo in the camera roll is 3264 * 2448.
The image returned by UIImagePickerControllerOriginalImage is 1920 * 1440.
The image returned by UIImagePickerControllerEditedImage is 640 * 640.
imageViewOld (the image returned by UIImagePickerControllerOriginalImage, cropped with the UIImagePickerControllerCropRect [80, 216, 1280, 1280]) is 1280 * 1224.
imageViewNew (the same image cropped with the crop rect doubled to [80, 216, 2560, 2560]) is 1840 * 1224.
I checked, and the same photo processed by Instagram is 1280 * 1280.
My questions are:
Why does UIImagePickerControllerOriginalImage not return the "original" photo? Why is it reduced to 1920 * 1440?
Why does UIImagePickerControllerEditedImage not return a 1280 * 1280 image, when UIImagePickerControllerCropRect shows the crop is a 1280 * 1280 square?
How can I apply a square crop to the original photo to get a 2448 * 2448 image?
Thanks in advance.
Below is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
if ([mediaType isEqualToString:@"public.image"])
{
UIImage *imageEdited = [info objectForKey:UIImagePickerControllerEditedImage];
UIImage *imagePicked = [info objectForKey:UIImagePickerControllerOriginalImage];
CGRect cropRect;
cropRect = [[info valueForKey:@"UIImagePickerControllerCropRect"] CGRectValue];
NSLog(@"Original width = %f height= %f ",imagePicked.size.width, imagePicked.size.height);
//Original width = 1440.000000 height= 1920.000000
NSLog(#"imageEdited width = %f height = %f",imageEdited.size.width, imageEdited.size.height);
//imageEdited width = 640.000000 height = 640.000000
NSLog(#"corpRect %f %f %f %f", cropRect.origin.x, cropRect.origin.y , cropRect.size.width, cropRect.size.height);
//corpRect 80.000000 216.000000 1280.000000 1280.000000
CGRect rectNew = CGRectMake(cropRect.origin.x, cropRect.origin.y , cropRect.size.width*2, cropRect.size.height*2);
CGRect rectOld = CGRectMake(cropRect.origin.x, cropRect.origin.y , cropRect.size.width, cropRect.size.height);
CGImageRef imageRefNew = CGImageCreateWithImageInRect([imagePicked CGImage], rectNew);
CGImageRef imageRefOld = CGImageCreateWithImageInRect([imagePicked CGImage], rectOld);
UIImageView *imageViewNew = [[UIImageView alloc] initWithImage:[UIImage imageWithCGImage:imageRefNew]];
CGImageRelease(imageRefNew);
UIImageView *imageViewOld = [[UIImageView alloc] initWithImage:[UIImage imageWithCGImage:imageRefOld]];
CGImageRelease(imageRefOld);
NSLog(#"imageViewNew width = %f height = %f",imageViewNew.image.size.width, imageViewNew.image.size.height);
//imageViewNew width = 1840.000000 height = 1224.000000
NSLog(#"imageViewOld width = %f height = %f",imageViewOld.image.size.width, imageViewOld.image.size.height);
//imageViewOld width = 1280.000000 height = 1224.000000
UIImageWriteToSavedPhotosAlbum(imageEdited, nil, nil, NULL);
UIImageWriteToSavedPhotosAlbum([imageViewNew.image imageRotatedByDegrees:90.0], nil, nil, NULL);
UIImageWriteToSavedPhotosAlbum([imageViewOld.image imageRotatedByDegrees:90.0], nil, nil, NULL);
//assign the image to an UIImage Control
self.imageV.contentMode = UIViewContentModeScaleAspectFit;
self.imageV.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.width);
self.imageV.image = imageEdited;
}
[self dismissModalViewControllerAnimated:YES];
}

As you have observed, UIImagePickerController returns a scaled-down edited image, sometimes 640x640 and sometimes 320x320 (device dependent).
Your question:
How can I do a square cut to original photo to be a 2448 * 2448 image?
To do this, you first need to use the UIImagePickerControllerCropRect to create a new image from the original image obtained with the UIImagePickerControllerOriginalImage key of the info dictionary. Using the Quartz function CGImageCreateWithImageInRect, you can create a new image that contains only the pixels bounded by the passed rect, in this case the crop rect. You will need to take orientation into account for this to work properly. Then you only need to scale the image to your desired size. It is important to note that the crop rect is relative to the original image after it has been oriented correctly, not as it comes out of the camera or photo library. This is why we need to transform the crop rect to match the orientation before we start using Quartz to create new images.
I took your code above and set it up to create a 1280x1280 image from the original image based on the crop rect. There are still some edge cases that have not been addressed, e.g. the crop rect can sometimes have negative values (the code assumes a square cropping rect).
1. Transform the crop rect to take into account the orientation and size of the incoming image. This transformCGRectForUIImageOrientation function is from NiftyBean.
2. Create an image that is cropped to the transformed cropping rect.
3. Scale (and rotate) the image to the desired size, i.e. 1280x1280.
4. Create a UIImage from the CGImage with the correct scale and orientation.
Here is your code with the changes. UPDATE: new code has been added below this that should take care of the missing cases.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
if ([mediaType isEqualToString:@"public.image"])
{
UIImage *imageEdited = [info objectForKey:UIImagePickerControllerEditedImage];
UIImage *imagePicked = [info objectForKey:UIImagePickerControllerOriginalImage];
CGRect cropRect;
cropRect = [[info valueForKey:@"UIImagePickerControllerCropRect"] CGRectValue];
NSLog(@"Original width = %f height= %f ",imagePicked.size.width, imagePicked.size.height);
//Original width = 1440.000000 height= 1920.000000
NSLog(#"imageEdited width = %f height = %f",imageEdited.size.width, imageEdited.size.height);
//imageEdited width = 640.000000 height = 640.000000
NSLog(#"corpRect %#", NSStringFromCGRect(cropRect));
//corpRect 80.000000 216.000000 1280.000000 1280.000000
CGSize finalSize = CGSizeMake(1280,1280);
CGImageRef imagePickedRef = imagePicked.CGImage;
CGRect transformedRect = transformCGRectForUIImageOrientation(cropRect, imagePicked.imageOrientation, imagePicked.size);
CGImageRef cropRectImage = CGImageCreateWithImageInRect(imagePickedRef, transformedRect);
CGColorSpaceRef colorspace = CGImageGetColorSpace(imagePickedRef);
CGContextRef context = CGBitmapContextCreate(NULL,
finalSize.width,
finalSize.height,
CGImageGetBitsPerComponent(imagePickedRef),
CGImageGetBytesPerRow(imagePickedRef),
colorspace,
CGImageGetAlphaInfo(imagePickedRef));
CGContextSetInterpolationQuality(context, kCGInterpolationHigh); //Give the context a hint that we want high quality during the scale
CGContextDrawImage(context, CGRectMake(0, 0, finalSize.width, finalSize.height), cropRectImage);
CGImageRelease(cropRectImage);
CGImageRef instaImage = CGBitmapContextCreateImage(context);
CGContextRelease(context);
//assign the image to an UIImage Control
UIImage *image = [UIImage imageWithCGImage:instaImage scale:imagePicked.scale orientation:imagePicked.imageOrientation];
self.imageView.contentMode = UIViewContentModeScaleAspectFit;
self.imageView.image = image;
CGImageRelease(instaImage);
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}
[self dismissModalViewControllerAnimated:YES];
}
CGRect transformCGRectForUIImageOrientation(CGRect source, UIImageOrientation orientation, CGSize imageSize) {
switch (orientation) {
case UIImageOrientationLeft: { // EXIF #8
CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate,M_PI_2);
return CGRectApplyAffineTransform(source, txCompound);
}
case UIImageOrientationDown: { // EXIF #3
CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate,M_PI);
return CGRectApplyAffineTransform(source, txCompound);
}
case UIImageOrientationRight: { // EXIF #6
CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(0.0, imageSize.width);
CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate,M_PI + M_PI_2);
return CGRectApplyAffineTransform(source, txCompound);
}
case UIImageOrientationUp: // EXIF #1 - do nothing
default: // EXIF 2,4,5,7 - ignore
return source;
}
}
UPDATE: I have made a couple of methods that will take care of the rest of the cases. The steps are basically the same, with a couple of modifications. The first modification is to correctly transform and scale the context to handle the orientation of the incoming image, and the second is to support the non-square crops you can get from the UIImagePickerController. In those cases the square image is filled with a color of your choosing.
New Code
// The cropRect is assumed to be in UIImageOrientationUp, as it is delivered this way from UIImagePickerController when AllowsImageEditing is on.
// The sourceImage can be in any orientation; the crop rect will be transformed to match.
// The output bounds define the final size of the image; the image will be scaled to fit (aspect fit) those bounds, and the fill color will be
// used for areas that are not covered by the scaled image.
-(UIImage *)cropImage:(UIImage *)sourceImage cropRect:(CGRect)cropRect aspectFitBounds:(CGSize)finalImageSize fillColor:(UIColor *)fillColor {
CGImageRef sourceImageRef = sourceImage.CGImage;
//Since the crop rect is in UIImageOrientationUp we need to transform it to match the source image.
CGAffineTransform rectTransform = [self transformSize:sourceImage.size orientation:sourceImage.imageOrientation];
CGRect transformedRect = CGRectApplyAffineTransform(cropRect, rectTransform);
//Now we get just the region of the source image that we are interested in.
CGImageRef cropRectImage = CGImageCreateWithImageInRect(sourceImageRef, transformedRect);
//Figure out which dimension fits within our final size and calculate the aspect correct rect that will fit in our new bounds
CGFloat horizontalRatio = finalImageSize.width / CGImageGetWidth(cropRectImage);
CGFloat verticalRatio = finalImageSize.height / CGImageGetHeight(cropRectImage);
CGFloat ratio = MIN(horizontalRatio, verticalRatio); //Aspect Fit
CGSize aspectFitSize = CGSizeMake(CGImageGetWidth(cropRectImage) * ratio, CGImageGetHeight(cropRectImage) * ratio);
CGContextRef context = CGBitmapContextCreate(NULL,
finalImageSize.width,
finalImageSize.height,
CGImageGetBitsPerComponent(cropRectImage),
0,
CGImageGetColorSpace(cropRectImage),
CGImageGetBitmapInfo(cropRectImage));
if (context == NULL) {
NSLog(#"NULL CONTEXT!");
}
//Fill with our background color
CGContextSetFillColorWithColor(context, fillColor.CGColor);
CGContextFillRect(context, CGRectMake(0, 0, finalImageSize.width, finalImageSize.height));
//We need to rotate and transform the context based on the orientation of the source image.
CGAffineTransform contextTransform = [self transformSize:finalImageSize orientation:sourceImage.imageOrientation];
CGContextConcatCTM(context, contextTransform);
//Give the context a hint that we want high quality during the scale
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
//Draw our image centered vertically and horizontally in our context.
CGContextDrawImage(context, CGRectMake((finalImageSize.width-aspectFitSize.width)/2, (finalImageSize.height-aspectFitSize.height)/2, aspectFitSize.width, aspectFitSize.height), cropRectImage);
//Start cleaning up..
CGImageRelease(cropRectImage);
CGImageRef finalImageRef = CGBitmapContextCreateImage(context);
UIImage *finalImage = [UIImage imageWithCGImage:finalImageRef];
CGContextRelease(context);
CGImageRelease(finalImageRef);
return finalImage;
}
//Creates a transform that will correctly rotate and translate for the passed orientation.
//Based on code from niftyBean.com
- (CGAffineTransform) transformSize:(CGSize)imageSize orientation:(UIImageOrientation)orientation {
CGAffineTransform transform = CGAffineTransformIdentity;
switch (orientation) {
case UIImageOrientationLeft: { // EXIF #8
CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate,M_PI_2);
transform = txCompound;
break;
}
case UIImageOrientationDown: { // EXIF #3
CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate,M_PI);
transform = txCompound;
break;
}
case UIImageOrientationRight: { // EXIF #6
CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(0.0, imageSize.width);
CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate,-M_PI_2);
transform = txCompound;
break;
}
case UIImageOrientationUp: // EXIF #1 - do nothing
default: // EXIF 2,4,5,7 - ignore
break;
}
return transform;
}
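For illustration, here is a hedged sketch of how the picker delegate might call the method above; the 1280x1280 target size, the white fill color, and the imageView property are assumptions rather than part of the original answer:
// A minimal sketch (assumed property names) of wiring the crop helper into the picker delegate.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
UIImage *imagePicked = [info objectForKey:UIImagePickerControllerOriginalImage];
CGRect cropRect = [[info objectForKey:UIImagePickerControllerCropRect] CGRectValue];
// Crop to the user-selected rect and aspect-fit it into a 1280x1280 square,
// padding any uncovered area with white.
UIImage *squareImage = [self cropImage:imagePicked
                              cropRect:cropRect
                       aspectFitBounds:CGSizeMake(1280, 1280)
                             fillColor:[UIColor whiteColor]];
self.imageView.image = squareImage;
UIImageWriteToSavedPhotosAlbum(squareImage, nil, nil, NULL);
[self dismissModalViewControllerAnimated:YES];
}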

Related

Crop image using border frame

I am trying to crop an image using a rectangular frame, but somehow I am not able to do it the way it is required.
Here is what I am trying:
Here is the result I want:
What I need is: when the user taps Done, the image should be cropped to exactly the rectangle placed over it. I have tried a few things, like masking and drawing the image using the mask image rect, but no success yet.
Here is my code, which is not working:
CALayer *mask = [CALayer layer];
mask.contents = (id)[imgMaskImage.image CGImage];
mask.frame = imgMaskImage.frame;
imgEditedImageView.layer.mask = mask;
imgEditedImageView.layer.masksToBounds = YES;
Can anyone suggest a better way to implement this?
I have tried so many other things and wasted a lot of time, so any help would be greatly appreciated.
Thanks.
- (UIImage *)croppedPhoto {
// For dealing with Retina displays as well as non-Retina, we need to check
// the scale factor, if it is available. Note that we use the size of the cropping rect
// passed in, and not the size of the view we are taking a screenshot of.
CGRect croppingRect = CGRectMake(imgMaskImage.frame.origin.x,
imgMaskImage.frame.origin.y, imgMaskImage.frame.size.width,
imgMaskImage.frame.size.height);
imgMaskImage.hidden=YES;
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
UIGraphicsBeginImageContextWithOptions(croppingRect.size, YES,
[UIScreen mainScreen].scale);
} else {
UIGraphicsBeginImageContext(croppingRect.size);
}
// Create a graphics context and translate it the view we want to crop so
// that even in grabbing (0,0), that origin point now represents the actual
// cropping origin desired:
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(ctx, -croppingRect.origin.x, -croppingRect.origin.y);
[self.view.layer renderInContext:ctx];
// Retrieve a UIImage from the current image context:
UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Return the image in a UIImageView:
return snapshotImage;
}
Here is the way to do it:
+(UIImage *)maskImage:(UIImage *)image andMaskingImage:(UIImage *)maskingImage{
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGImageRef maskImageRef = [maskingImage CGImage];
CGContextRef mainViewContentContext = CGBitmapContextCreate (NULL, maskingImage.size.width, maskingImage.size.height, 8, 0, colorSpace, kCGImageAlphaPremultipliedLast);
if (mainViewContentContext==NULL)
return NULL;
CGFloat ratio = 0;
ratio = maskingImage.size.width/ image.size.width;
if(ratio * image.size.height < maskingImage.size.height) {
ratio = maskingImage.size.height/ image.size.height;
}
CGRect rect1 = {{0, 0}, {maskingImage.size.width, maskingImage.size.height}};
//// CHANGE THIS RECT ACCORDING TO YOUR NEEDS
CGRect rect2 = {{-((image.size.width*ratio)-maskingImage.size.width)/2 , -((image.size.height*ratio)-maskingImage.size.height)/2}, {image.size.width*ratio, image.size.height*ratio}};
CGContextClipToMask(mainViewContentContext, rect1, maskImageRef);
CGContextDrawImage(mainViewContentContext, rect2, image.CGImage);
CGImageRef newImage = CGBitmapContextCreateImage(mainViewContentContext);
CGContextRelease(mainViewContentContext);
CGColorSpaceRelease(colorSpace);
UIImage *theImage = [UIImage imageWithCGImage:newImage];
CGImageRelease(newImage);
return theImage;
}
You need to have a mask image like this.
Note that the mask image cannot have ANY transparency. Instead, transparent areas must be white or some value between black and white; the closer a pixel is to black, the less transparent it becomes.
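For illustration, a hedged usage sketch of the helper above; the class name ImageHelper and the file names photo.png and mask.png are hypothetical placeholders:
// Hypothetical usage: clip a photo to the shape defined by a grayscale mask image.
// (ImageHelper is a placeholder for whatever class declares +maskImage:andMaskingImage:.)
UIImage *photo = [UIImage imageNamed:@"photo.png"];
UIImage *mask  = [UIImage imageNamed:@"mask.png"];   // white = transparent, black = opaque, per the note above
UIImage *clipped = [ImageHelper maskImage:photo andMaskingImage:mask];
imgEditedImageView.image = clipped;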

Weird behavior when rotating AVFoundation stills on iOS

OK, so the following code works, but I don't get why. I am capturing still images from the Front camera using AVFoundation. I have this code before initiating capture:
if ([connection isVideoOrientationSupported]) {
AVCaptureVideoOrientation orientation;
switch ([UIDevice currentDevice].orientation) {
case UIDeviceOrientationPortraitUpsideDown:
orientation = AVCaptureVideoOrientationPortraitUpsideDown;
break;
case UIDeviceOrientationLandscapeLeft:
orientation = AVCaptureVideoOrientationLandscapeRight;
break;
case UIDeviceOrientationLandscapeRight:
orientation = AVCaptureVideoOrientationLandscapeLeft;
break;
default:
orientation = AVCaptureVideoOrientationPortrait;
break;
}
[connection setVideoOrientation:orientation];
}
and then this in the captureStillImageAsynchronouslyFromConnection:completionHandler: to store the image:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *i = [UIImage imageWithData:imageData];
UIGraphicsBeginImageContext(i.size);
[i drawAtPoint:CGPointMake(0.0, 0.0)];
image.image = UIGraphicsGetImageFromCurrentImageContext();
As you can see, I don't rotate the image or anything, just draw it into the context and save it. But as soon as I try to use i directly, it is always rotated by 90 degrees. If I try to rotate it using
UIImage *rotated = [[UIImage alloc] initWithCGImage:i.CGImage scale:1.0f orientation:i.imageOrientation];
it doesn't work (no change from just using i).
I understand that UIImage might just draw the image into the context using the right orientation automatically, but WTF?
You can check the device orientation like this:
if (deviceOrientation == UIInterfaceOrientationLandscapeLeft && position == AVCaptureDevicePositionBack)
{
If the above condition (or whatever your condition is) is satisfied, you can rotate your image using the code below:
- (UIImage *)rotateImageByDegrees:(CGFloat)degrees {
if (degrees == 0.0) {
return self;
}
// calculate the size of the rotated view's containing box for our drawing space
UIView *rotatedViewBox = [[UIView alloc] initWithFrame:CGRectMake(0,0,self.size.width, self.size.height)];
CGAffineTransform t = CGAffineTransformMakeRotation(DegreesToRadians(degrees));
rotatedViewBox.transform = t;
CGSize rotatedSize = rotatedViewBox.frame.size;
[rotatedViewBox release];
// Create the bitmap context
UIGraphicsBeginImageContext(rotatedSize);
CGContextRef bitmap = UIGraphicsGetCurrentContext();
// Move the origin to the middle of the image so we will rotate and scale around the center.
CGContextTranslateCTM(bitmap, rotatedSize.width/2, rotatedSize.height/2);
// Rotate the image context
CGContextRotateCTM(bitmap, DegreesToRadians(degrees));
// Now, draw the rotated/scaled image into the context
CGContextScaleCTM(bitmap, 1.0, -1.0);
CGContextDrawImage(bitmap, CGRectMake(-self.size.width / 2, -self.size.height / 2, self.size.width, self.size.height), [self CGImage]);
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
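For illustration, a hedged sketch of how the orientation check and the category method above might be combined in the capture completion handler; the deviceOrientation and position variables, the 90-degree value, and rotateImageByDegrees: being a UIImage category are assumptions carried over from the snippets above.
// Assumed to live in the captureStillImageAsynchronouslyFromConnection: completion handler.
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *still = [UIImage imageWithData:imageData];
if (deviceOrientation == UIInterfaceOrientationLandscapeLeft && position == AVCaptureDevicePositionBack) {
    // Rotate by 90 degrees to compensate; the exact angle depends on your orientation handling.
    still = [still rotateImageByDegrees:90];
}
image.image = still;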

How do you crop an image in iOS

I have a photo app where you can add stickers in one section. When you're finished I want to save the image. Here is the code that I have to do that.
if(UIGraphicsBeginImageContextWithOptions != NULL)
{
UIGraphicsBeginImageContextWithOptions(self.view.frame.size, YES, 2.5);
} else {
UIGraphicsBeginImageContext(self.view.frame.size);
}
CGContextRef contextNew=UIGraphicsGetCurrentContext();
[self.view.layer renderInContext:contextNew];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Now the image that gets saved is the full screen of the image, which is fine, but now I need to crop the image and I don't know how. You can see the image at the link below:
http://dl.dropbox.com/u/19130454/Photo%202012-04-09%201%2036%2018%20PM.png
I need to crop:
91px from the left and right
220px from the bottom
Any help would be greatly appreciated. If I haven't explained things clearly, please let me know and I'll do my best to re-explain.
How about something like this
CGRect clippedRect = CGRectMake(self.view.frame.origin.x+91, self.view.frame.origin.y, self.view.frame.size.width-91*2, self.view.frame.size.height-220);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], clippedRect);
UIImage *newImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
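One caveat, offered as an assumption rather than a confirmed fix: CGImageCreateWithImageInRect works in the CGImage's pixel coordinates, while the frame values above are in points, so if the screenshot was rendered at the 2.5 scale from the question, the crop rect likely needs to be multiplied by that scale. A sketch of that adjustment:
CGFloat scale = 2.5; // the scale passed to UIGraphicsBeginImageContextWithOptions in the question
CGRect clippedRect = CGRectMake((self.view.frame.origin.x + 91) * scale,
                                self.view.frame.origin.y * scale,
                                (self.view.frame.size.width - 91 * 2) * scale,
                                (self.view.frame.size.height - 220) * scale);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], clippedRect);
// Passing the scale back keeps the cropped UIImage at the same point size as before.
UIImage *newImage = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationUp];
CGImageRelease(imageRef);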
The following code may help you.
You should first get the correct crop frame using the method below:
-(CGRect)cropRectForFrame:(CGRect)frame
{
// NSAssert(self.contentMode == UIViewContentModeScaleAspectFit, @"content mode must be aspect fit");
CGFloat widthScale = imageview.superview.bounds.size.width / imageview.image.size.width;
CGFloat heightScale = imageview.superview.bounds.size.height / imageview.image.size.height;
float x, y, w, h, offset;
if (widthScale<heightScale) {
offset = (imageview.superview.bounds.size.height - (imageview.image.size.height*widthScale))/2;
x = frame.origin.x / widthScale;
y = (frame.origin.y-offset) / widthScale;
w = frame.size.width / widthScale;
h = frame.size.height / widthScale;
} else {
offset = (imageview.superview.bounds.size.width - (imageview.image.size.width*heightScale))/2;
x = (frame.origin.x-offset) / heightScale;
y = frame.origin.y / heightScale;
w = frame.size.width / heightScale;
h = frame.size.height / heightScale;
}
return CGRectMake(x, y, w, h);
}
Then you need to call this method to get the cropped image:
- (UIImage *)imageByCropping:(UIImage *)image toRect:(CGRect)rect
{
// you need to update the scaling factor value if the device does not have a Retina display
UIGraphicsBeginImageContextWithOptions(rect.size,
/* your view opaque */ NO,
/* scaling factor */ 2.0);
// stick to methods on UIImage so that orientation etc. are automatically
// dealt with for us
[image drawAtPoint:CGPointMake(-rect.origin.x, -rect.origin.y)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return result;
}
- (UIImage*)imageByCropping:(CGRect)rect
{
//create a context to do our clipping in
UIGraphicsBeginImageContext(rect.size);
CGContextRef currentContext = UIGraphicsGetCurrentContext();
//create a rect with the size we want to crop the image to
//the X and Y here are zero so we start at the beginning of our
//newly created context
CGRect clippedRect = CGRectMake(0, 0, rect.size.width, rect.size.height);
CGContextClipToRect( currentContext, clippedRect);
//create a rect equivalent to the full size of the image
//offset the rect by the X and Y we want to start the crop
//from in order to cut off anything before them
CGRect drawRect = CGRectMake(rect.origin.x * -1,
rect.origin.y * -1,
self.size.width,
self.size.height);
//draw the image to our clipped context using our offset rect
// CGContextDrawImage(currentContext, drawRect, self.CGImage);
[self drawInRect:drawRect]; // This will fix getting inverted image from context.
//pull the image from our cropped context
UIImage *cropped = UIGraphicsGetImageFromCurrentImageContext();
//pop the context to get back to the default
UIGraphicsEndImageContext();
//Note: this is autoreleased
return cropped;
}
Refer to the link below for an image crop view:
https://github.com/myang-git/iOS-Image-Crop-View
How to use
Very easy! It is created to be a drop-in component, so there is no static library and no extra dependencies. Just copy ImageCropView.h and ImageCropView.m into your project, and implement the ImageCropViewControllerDelegate protocol.
Use it like UIImagePicker:
- (void)cropImage:(UIImage *)image{
ImageCropViewController *controller = [[ImageCropViewController alloc] initWithImage:image];
controller.delegate = self;
[[self navigationController] pushViewController:controller animated:YES];
}
- (void)ImageCropViewController:(ImageCropViewController *)controller didFinishCroppingImage:(UIImage *)croppedImage{
image = croppedImage;
imageView.image = croppedImage;
[[self navigationController] popViewControllerAnimated:YES];
}
- (void)ImageCropViewControllerDidCancel:(ImageCropViewController *)controller{
imageView.image = image;
[[self navigationController] popViewControllerAnimated:YES];
}

Rotate UIImage: can't make it work

This is part of a big project, but my problem can be simplified as:
I have a simple view with an ImageView and a "Rotate" button. Whenever I press the button, the image inside the ImageView will rotate 90 degrees. From much of what I've found on StackOverflow and other sites, this should work (please note that my image is a square image, with width and height equal to 464):
- (UIImage *) getRotatedImageFromImage:(UIImage*)image:(int)rotate_index
{
UIImage *rotatedImage;
// Get image width, height of the bounding rectangle
CGRect boundingRect = CGRectMake(0, 0, image.size.width, image.size.height);
NSLog(#"width = %f, height = %f",boundingRect.size.width,boundingRect.size.height);
NSLog(#"rotate index = %d",rotate_index);
// Create a graphics context the size of the bounding rectangle
UIGraphicsBeginImageContext(boundingRect.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextRotateCTM(context, rotate_index*M_PI/2);
// Draw the image into the context
[image drawInRect:boundingRect];
// Get an image from the context
rotatedImage = UIGraphicsGetImageFromCurrentImageContext();
// Clean up
UIGraphicsEndImageContext();
NSLog(#"rotatedImage size = (%f, %f)",rotatedImage.size.width,rotatedImage.size.height);
return rotatedImage;
}
- (IBAction) rotateImage
{
NSLog(#"rotate image");
rotateIndex++;
if (rotateIndex >= 4)
rotateIndex = 0;
[imageView setImage: [self getRotatedImageFromImage:imageToSubmit :rotateIndex]];
}
But it doesn't work for some reason. What I have is: when pressing the button, the image only appears when rotateIndex is 0, and the image is the same as the original image (which is expected). When rotateIndex is 1, 2, or 3, the imageView displays nothing, even though the size of rotatedImage printed out is correct (i.e. 464 x 464).
Could anyone tell me what's going wrong?
Thanks.
//create rect
UIImageView *myImageView = [[UIImageView alloc]initWithImage:[UIImage imageNamed:@"image.png"]];
//set point of rotation
myImageView.center = CGPointMake(100.0,100.0);
//rotate rect
myImageView.transform =CGAffineTransformMakeRotation(3.14159265/2); //rotation in radians
I was able to solve my problem. Here is my solution, using CGImageRef and [UIImage imageWithCGImage: scale: orientation:]
CGImageRef cgImageRef = CGBitmapContextCreateImage(context);
UIImageOrientation imageOrientation;
switch (rotate_index) {
case 0: imageOrientation = UIImageOrientationUp; break;
case 1: imageOrientation = UIImageOrientationLeft; break;
//2 more cases for rotate_index = 2 and 3
}
rotatedImage = [UIImage imageWithCGImage:cgImageRef scale:1.0 orientation:imageOrientation];
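For illustration, a hedged, self-contained sketch of the same idea; the function name and the direction of the mapping (whether Left or Right reads as clockwise) are assumptions, the answer above only spells out the first two cases, and the sketch assumes the source image starts out as UIImageOrientationUp:
// Hypothetical helper: rotate in 90-degree steps by retagging the orientation,
// leaving the underlying pixel data untouched. UIImageView applies the tag when drawing.
static UIImage *ImageRotatedBySteps(UIImage *image, int rotate_index)
{
    UIImageOrientation imageOrientation;
    switch (rotate_index % 4) {
        case 1:  imageOrientation = UIImageOrientationLeft;  break;
        case 2:  imageOrientation = UIImageOrientationDown;  break;
        case 3:  imageOrientation = UIImageOrientationRight; break;
        default: imageOrientation = UIImageOrientationUp;    break;
    }
    return [UIImage imageWithCGImage:image.CGImage
                               scale:image.scale
                         orientation:imageOrientation];
}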

Crop a UIImage from the Center outwards?

I am making a camera app which includes digital zoom. I have a slider (zoomSlider) that has a minimum value of 1 and a maximum value of 4. When the user taps the camera button, it takes a picture, and then I crop it for the zooming. I have two problems:
How do I crop the exact middle of the image? (eg. 2x zoom, Rect would be centered with dimensions of 600x800 (for iPhone 2G/3G))
When I do this, it rotates the image. I make up for it by rotating the UIImageView it's in, but this causes a portrait picture to become landscape and vice versa.
Here is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
if ([mediaType isEqualToString:@"public.image"]){
UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
CGRect clippedRect = CGRectMake(600, 450, image.size.width/zoomSlider.value, image.size.height/zoomSlider.value);
UIImage *cropped = [self imageByCropping:image toRect:clippedRect];
CGRect croppedImageSize = CGRectMake(0, 0, image.size.width/zoomSlider.value, image.size.height/zoomSlider.value);
[cropped drawInRect:croppedImageSize];
zoomPhoto.frame = croppedImageSize;
zoomPhoto.image = cropped;
CGAffineTransform rotateTransform = CGAffineTransformRotate(CGAffineTransformIdentity,
RADIANS(90.0));
zoomPhoto.transform = rotateTransform;
}
- (UIImage *)imageByCropping:(UIImage *)imageToCrop toRect:(CGRect)rect
{
CGImageRef imageRef = CGImageCreateWithImageInRect([imageToCrop CGImage], rect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return cropped;
}
CGFloat newWidth = image.size.width/zoomSlider.value;
CGFloat newHeight = image.size.height/zoomSlider.value;
CGRect clippedRect = CGRectMake((image.size.width-newWidth)/2, (image.size.height-newHeight)/2, newWidth, newHeight);
Remove this to not rotate the image:
CGAffineTransform rotateTransform = CGAffineTransformRotate(CGAffineTransformIdentity,RADIANS(90.0));
zoomPhoto.transform = rotateTransform;
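Putting the two pieces together, a hedged sketch of a center-crop helper; the method name is hypothetical, and passing the original scale and orientation back is an assumption intended to avoid the rotation issue mentioned in the question:
// Hypothetical helper: crop the centered region for a given zoom factor (>= 1).
- (UIImage *)centerCroppedImage:(UIImage *)image withZoom:(CGFloat)zoom
{
    CGFloat newWidth  = image.size.width  / zoom;
    CGFloat newHeight = image.size.height / zoom;
    CGRect clippedRect = CGRectMake((image.size.width  - newWidth)  / 2.0,
                                    (image.size.height - newHeight) / 2.0,
                                    newWidth, newHeight);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], clippedRect);
    // Reapplying the source scale and orientation keeps the crop from appearing rotated.
    UIImage *cropped = [UIImage imageWithCGImage:imageRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(imageRef);
    return cropped;
}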