Crop a UIImage from the center outwards?

I am making a camera app which includes digital zoom. I have a slider (zoomSlider) that has a minimum value of 1 and a maximum value of 4. When the user taps the camera button, it takes a picture, and then I crop it for the zooming. I have two problems:
How do I crop the exact middle of the image? (e.g. at 2x zoom the rect would be centered, with dimensions of 600x800 on an iPhone 2G/3G)
When I do this, it rotates the image. I compensate by rotating the UIImageView it's in, but this turns a portrait picture into landscape and vice versa.
Here is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:@"public.image"]) {
        UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
        CGRect clippedRect = CGRectMake(600, 450, image.size.width/zoomSlider.value, image.size.height/zoomSlider.value);
        UIImage *cropped = [self imageByCropping:image toRect:clippedRect];
        CGRect croppedImageSize = CGRectMake(0, 0, image.size.width/zoomSlider.value, image.size.height/zoomSlider.value);
        [cropped drawInRect:croppedImageSize];
        zoomPhoto.frame = croppedImageSize;
        zoomPhoto.image = cropped;
        CGAffineTransform rotateTransform = CGAffineTransformRotate(CGAffineTransformIdentity,
                                                                    RADIANS(90.0));
        zoomPhoto.transform = rotateTransform;
    }
}
- (UIImage *)imageByCropping:(UIImage *)imageToCrop toRect:(CGRect)rect
{
    CGImageRef imageRef = CGImageCreateWithImageInRect([imageToCrop CGImage], rect);
    UIImage *cropped = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return cropped;
}

To crop the exact middle, make the crop rect's origin half the difference between the image size and the crop size:

CGFloat newWidth = image.size.width / zoomSlider.value;
CGFloat newHeight = image.size.height / zoomSlider.value;
CGRect clippedRect = CGRectMake((image.size.width - newWidth) / 2, (image.size.height - newHeight) / 2, newWidth, newHeight);

To stop the image from being rotated, remove these two lines:

CGAffineTransform rotateTransform = CGAffineTransformRotate(CGAffineTransformIdentity, RADIANS(90.0));
zoomPhoto.transform = rotateTransform;
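
If you prefer a single helper, here is a minimal sketch that combines the centered rect with the cropping code from the question. The method name and the zoomFactor parameter are mine, not from the original answer:

- (UIImage *)imageByCroppingCenterOfImage:(UIImage *)image withZoomFactor:(CGFloat)zoomFactor
{
    CGFloat newWidth  = image.size.width  / zoomFactor;
    CGFloat newHeight = image.size.height / zoomFactor;
    // Center the crop rect inside the image.
    CGRect clippedRect = CGRectMake((image.size.width  - newWidth)  / 2.0,
                                    (image.size.height - newHeight) / 2.0,
                                    newWidth,
                                    newHeight);
    CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, clippedRect);
    // Preserve the original scale and orientation so the result is not rotated.
    UIImage *cropped = [UIImage imageWithCGImage:imageRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(imageRef);
    return cropped;
}

Note that for images whose imageOrientation is not Up, the crop rect would additionally need to be transformed into the CGImage's coordinate space; the orientation handling in the answers further down shows one way to do that.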


How to perform square cut to the photos in camera roll?

I would like to try some image filter features on iPhone, like Instagram does.
I use imagePickerController to get a photo from the camera roll. I understand that the image returned by imagePickerController is reduced to save memory, and that it's not wise to load the original image into a UIImage. But how can I process the image and then save it back at its original pixel dimensions?
I use an iPhone 4S as my development device.
The original photo in the camera roll is 3264 * 2448.
The image returned by UIImagePickerControllerOriginalImage is 1920 * 1440.
The image returned by UIImagePickerControllerEditedImage is 640 * 640.
imageViewOld (the image returned by UIImagePickerControllerOriginalImage, cropped with UIImagePickerControllerCropRect [80, 216, 1280, 1280]) is 1280 * 1224.
imageViewNew (the same image cropped with the doubled crop rect [80, 216, 2560, 2560]) is 1840 * 1224.
The same photo processed by Instagram is 1280 * 1280.
My questions are:
Why does UIImagePickerControllerOriginalImage not return the "original" photo? Why is it reduced to 1920 * 1440?
Why does UIImagePickerControllerEditedImage not return a 1280 * 1280 image, when UIImagePickerControllerCropRect shows a 1280 * 1280 square crop?
How can I apply a square cut to the original photo to get a 2448 * 2448 image?
Thanks in advance.
Below is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:@"public.image"])
    {
        UIImage *imageEdited = [info objectForKey:UIImagePickerControllerEditedImage];
        UIImage *imagePicked = [info objectForKey:UIImagePickerControllerOriginalImage];
        CGRect cropRect;
        cropRect = [[info valueForKey:@"UIImagePickerControllerCropRect"] CGRectValue];
        NSLog(@"Original width = %f height = %f", imagePicked.size.width, imagePicked.size.height);
        //Original width = 1440.000000 height = 1920.000000
        NSLog(@"imageEdited width = %f height = %f", imageEdited.size.width, imageEdited.size.height);
        //imageEdited width = 640.000000 height = 640.000000
        NSLog(@"cropRect %f %f %f %f", cropRect.origin.x, cropRect.origin.y, cropRect.size.width, cropRect.size.height);
        //cropRect 80.000000 216.000000 1280.000000 1280.000000
        CGRect rectNew = CGRectMake(cropRect.origin.x, cropRect.origin.y, cropRect.size.width*2, cropRect.size.height*2);
        CGRect rectOld = CGRectMake(cropRect.origin.x, cropRect.origin.y, cropRect.size.width, cropRect.size.height);
        CGImageRef imageRefNew = CGImageCreateWithImageInRect([imagePicked CGImage], rectNew);
        CGImageRef imageRefOld = CGImageCreateWithImageInRect([imagePicked CGImage], rectOld);
        UIImageView *imageViewNew = [[UIImageView alloc] initWithImage:[UIImage imageWithCGImage:imageRefNew]];
        CGImageRelease(imageRefNew);
        UIImageView *imageViewOld = [[UIImageView alloc] initWithImage:[UIImage imageWithCGImage:imageRefOld]];
        CGImageRelease(imageRefOld);
        NSLog(@"imageViewNew width = %f height = %f", imageViewNew.image.size.width, imageViewNew.image.size.height);
        //imageViewNew width = 1840.000000 height = 1224.000000
        NSLog(@"imageViewOld width = %f height = %f", imageViewOld.image.size.width, imageViewOld.image.size.height);
        //imageViewOld width = 1280.000000 height = 1224.000000
        UIImageWriteToSavedPhotosAlbum(imageEdited, nil, nil, NULL);
        UIImageWriteToSavedPhotosAlbum([imageViewNew.image imageRotatedByDegrees:90.0], nil, nil, NULL);
        UIImageWriteToSavedPhotosAlbum([imageViewOld.image imageRotatedByDegrees:90.0], nil, nil, NULL);
        //assign the image to a UIImage control
        self.imageV.contentMode = UIViewContentModeScaleAspectFit;
        self.imageV.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.width);
        self.imageV.image = imageEdited;
    }
    [self dismissModalViewControllerAnimated:YES];
}
As you have observed, UIImagePickerController returns a scaled-down edited image, sometimes 640x640 and sometimes 320x320 (device dependent).
Your question:
How can I do a square cut to original photo to be a 2448 * 2448 image?
To do this you first need to use the UIImagePickerControllerCropRect to create a new image from the original image obtained through the UIImagePickerControllerOriginalImage key of the info dictionary. Using the Quartz function CGImageCreateWithImageInRect you can create a new image that only contains the pixels bounded by the passed rect, in this case the crop rect. You will need to take orientation into account for this to work properly; then you need only scale the image to your desired size. It's important to note that the crop rect is relative to the original image after it has been oriented correctly, not as it comes out of the camera or photo library. This is why we need to transform the crop rect to match the orientation before we start using Quartz functions to create new images.
I took your code above and set it up to create a 1280x1280 image from the original image based on the crop rect. There are still some edge cases here that have not been addressed, e.g. the crop rect can sometimes have negative values, and the code assumes a square cropping rect.
1. Transform the crop rect to take into account the orientation and size of the incoming image. The transformCGRectForUIImageOrientation function below is from NiftyBean.
2. Create an image that is cropped to the transformed cropping rect.
3. Scale (and rotate) the image to the desired size, i.e. 1280x1280.
4. Create a UIImage from the CGImage with the correct scale and orientation.
Here is your code with the changes. (UPDATE: new code that should take care of the missing cases has been added below.)
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:@"public.image"])
    {
        UIImage *imageEdited = [info objectForKey:UIImagePickerControllerEditedImage];
        UIImage *imagePicked = [info objectForKey:UIImagePickerControllerOriginalImage];
        CGRect cropRect;
        cropRect = [[info valueForKey:@"UIImagePickerControllerCropRect"] CGRectValue];
        NSLog(@"Original width = %f height = %f", imagePicked.size.width, imagePicked.size.height);
        //Original width = 1440.000000 height = 1920.000000
        NSLog(@"imageEdited width = %f height = %f", imageEdited.size.width, imageEdited.size.height);
        //imageEdited width = 640.000000 height = 640.000000
        NSLog(@"cropRect %@", NSStringFromCGRect(cropRect));
        //cropRect 80.000000 216.000000 1280.000000 1280.000000
        CGSize finalSize = CGSizeMake(1280, 1280);
        CGImageRef imagePickedRef = imagePicked.CGImage;
        CGRect transformedRect = transformCGRectForUIImageOrientation(cropRect, imagePicked.imageOrientation, imagePicked.size);
        CGImageRef cropRectImage = CGImageCreateWithImageInRect(imagePickedRef, transformedRect);
        CGColorSpaceRef colorspace = CGImageGetColorSpace(imagePickedRef);
        CGContextRef context = CGBitmapContextCreate(NULL,
                                                     finalSize.width,
                                                     finalSize.height,
                                                     CGImageGetBitsPerComponent(imagePickedRef),
                                                     CGImageGetBytesPerRow(imagePickedRef),
                                                     colorspace,
                                                     CGImageGetAlphaInfo(imagePickedRef));
        CGContextSetInterpolationQuality(context, kCGInterpolationHigh); //Give the context a hint that we want high quality during the scale
        CGContextDrawImage(context, CGRectMake(0, 0, finalSize.width, finalSize.height), cropRectImage);
        CGImageRelease(cropRectImage);
        CGImageRef instaImage = CGBitmapContextCreateImage(context);
        CGContextRelease(context);
        //assign the image to a UIImage control
        UIImage *image = [UIImage imageWithCGImage:instaImage scale:imagePicked.scale orientation:imagePicked.imageOrientation];
        self.imageView.contentMode = UIViewContentModeScaleAspectFit;
        self.imageView.image = image;
        CGImageRelease(instaImage);
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    }
    [self dismissModalViewControllerAnimated:YES];
}
CGRect transformCGRectForUIImageOrientation(CGRect source, UIImageOrientation orientation, CGSize imageSize) {
    switch (orientation) {
        case UIImageOrientationLeft: { // EXIF #8
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI_2);
            return CGRectApplyAffineTransform(source, txCompound);
        }
        case UIImageOrientationDown: { // EXIF #3
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI);
            return CGRectApplyAffineTransform(source, txCompound);
        }
        case UIImageOrientationRight: { // EXIF #6
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(0.0, imageSize.width);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI + M_PI_2);
            return CGRectApplyAffineTransform(source, txCompound);
        }
        case UIImageOrientationUp: // EXIF #1 - do nothing
        default: // EXIF 2,4,5,7 - ignore
            return source;
    }
}
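
One of the edge cases mentioned above, a crop rect with negative origin values or one that extends past the image, can be handled by clamping the transformed rect to the image bounds before cropping. This is a small hypothetical helper, not part of the original answer:

static CGRect clampRectToImage(CGRect rect, CGImageRef imageRef) {
    // Intersect the (integral) rect with the full pixel bounds of the image.
    CGRect imageBounds = CGRectMake(0, 0, CGImageGetWidth(imageRef), CGImageGetHeight(imageRef));
    return CGRectIntersection(CGRectIntegral(rect), imageBounds);
}

CGImageCreateWithImageInRect already intersects the rect with the image bounds internally, so clamping up front mainly matters when you need the final rect yourself, for example to keep the output square.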
UPDATE: I have made a couple of methods that will take care of the rest of the cases. The steps are basically the same, with a couple of modifications. The first modification is to correctly transform and scale the context to handle the orientation of the incoming image, and the second is to support the non-square crops you can get from the UIImagePickerController. In those cases the square image is filled with a color of your choosing.
New Code
// CropRect is assumed to be in UIImageOrientationUp, as it is delivered this way from the
// UIImagePickerController when allowsImageEditing is on.
// The sourceImage can be in any orientation; the crop will be transformed to match.
// The output bounds define the final size of the image; the image will be scaled to fit
// (aspect fit) the bounds, and the fill color will be used for areas that are not covered
// by the scaled image.
- (UIImage *)cropImage:(UIImage *)sourceImage cropRect:(CGRect)cropRect aspectFitBounds:(CGSize)finalImageSize fillColor:(UIColor *)fillColor {
    CGImageRef sourceImageRef = sourceImage.CGImage;
    //Since the crop rect is in UIImageOrientationUp we need to transform it to match the source image.
    CGAffineTransform rectTransform = [self transformSize:sourceImage.size orientation:sourceImage.imageOrientation];
    CGRect transformedRect = CGRectApplyAffineTransform(cropRect, rectTransform);
    //Now we get just the region of the source image that we are interested in.
    CGImageRef cropRectImage = CGImageCreateWithImageInRect(sourceImageRef, transformedRect);
    //Figure out which dimension fits within our final size and calculate the aspect-correct rect that will fit in our new bounds.
    CGFloat horizontalRatio = finalImageSize.width / CGImageGetWidth(cropRectImage);
    CGFloat verticalRatio = finalImageSize.height / CGImageGetHeight(cropRectImage);
    CGFloat ratio = MIN(horizontalRatio, verticalRatio); //Aspect fit
    CGSize aspectFitSize = CGSizeMake(CGImageGetWidth(cropRectImage) * ratio, CGImageGetHeight(cropRectImage) * ratio);
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 finalImageSize.width,
                                                 finalImageSize.height,
                                                 CGImageGetBitsPerComponent(cropRectImage),
                                                 0,
                                                 CGImageGetColorSpace(cropRectImage),
                                                 CGImageGetBitmapInfo(cropRectImage));
    if (context == NULL) {
        NSLog(@"NULL CONTEXT!");
    }
    //Fill with our background color.
    CGContextSetFillColorWithColor(context, fillColor.CGColor);
    CGContextFillRect(context, CGRectMake(0, 0, finalImageSize.width, finalImageSize.height));
    //We need to rotate and transform the context based on the orientation of the source image.
    CGAffineTransform contextTransform = [self transformSize:finalImageSize orientation:sourceImage.imageOrientation];
    CGContextConcatCTM(context, contextTransform);
    //Give the context a hint that we want high quality during the scale.
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    //Draw our image centered vertically and horizontally in our context.
    CGContextDrawImage(context, CGRectMake((finalImageSize.width - aspectFitSize.width) / 2,
                                           (finalImageSize.height - aspectFitSize.height) / 2,
                                           aspectFitSize.width,
                                           aspectFitSize.height), cropRectImage);
    //Start cleaning up.
    CGImageRelease(cropRectImage);
    CGImageRef finalImageRef = CGBitmapContextCreateImage(context);
    UIImage *finalImage = [UIImage imageWithCGImage:finalImageRef];
    CGContextRelease(context);
    CGImageRelease(finalImageRef);
    return finalImage;
}
//Creates a transform that will correctly rotate and translate for the passed orientation.
//Based on code from niftyBean.com
- (CGAffineTransform)transformSize:(CGSize)imageSize orientation:(UIImageOrientation)orientation {
    CGAffineTransform transform = CGAffineTransformIdentity;
    switch (orientation) {
        case UIImageOrientationLeft: { // EXIF #8
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI_2);
            transform = txCompound;
            break;
        }
        case UIImageOrientationDown: { // EXIF #3
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI);
            transform = txCompound;
            break;
        }
        case UIImageOrientationRight: { // EXIF #6
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(0.0, imageSize.width);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, -M_PI_2);
            transform = txCompound;
            break;
        }
        case UIImageOrientationUp: // EXIF #1 - do nothing
        default: // EXIF 2,4,5,7 - ignore
            break;
    }
    return transform;
}
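
For reference, a hypothetical usage sketch from the picker delegate, assuming the two methods above are on the same view controller and image editing is enabled so that a crop rect is present in the info dictionary:

// Inside imagePickerController:didFinishPickingMediaWithInfo:
UIImage *imagePicked = [info objectForKey:UIImagePickerControllerOriginalImage];
CGRect cropRect = [[info objectForKey:UIImagePickerControllerCropRect] CGRectValue];
UIImage *squareImage = [self cropImage:imagePicked
                              cropRect:cropRect
                       aspectFitBounds:CGSizeMake(1280, 1280)
                             fillColor:[UIColor blackColor]];
UIImageWriteToSavedPhotosAlbum(squareImage, nil, nil, NULL);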

How to fit image in irregular shape frame

I have a problem regarding irregular shapes. I searched a lot but found nothing useful. I have a number of frames of irregular shape, and each frame is again divided into sub-areas. I want to fit images from the photo library into each sub-area of a frame, but I am unable to get the location of each sub-area, and since the shape is irregular it is another problem to fit the image into that area. Can anyone help me? This is an example of such a frame.
You can never have irregular-shaped frames; frames are always rectangular.
You can get the effect by detecting transparent areas.
Refer to this link; it will give you an idea of how to do that :)
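
The transparent-area idea can be sketched roughly like this. This is a minimal, hypothetical helper of my own: it samples the alpha of a single pixel at a point given in the image's point coordinate space, and ignores pixel density (image.scale) for simplicity:

- (BOOL)isTransparentPoint:(CGPoint)point inImage:(UIImage *)image
{
    unsigned char pixel[4] = {0, 0, 0, 0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // 1x1 RGBA context backed by our 4-byte buffer.
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    // Shift the drawing so the requested point (UIKit coordinates, origin top-left)
    // lands on the single pixel of the context (CG coordinates, origin bottom-left).
    CGContextTranslateCTM(context, -point.x, point.y + 1.0 - image.size.height);
    CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage);
    CGContextRelease(context);
    return pixel[3] == 0; // alpha channel
}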
Do you want to clip the various images by the arc of the circle? For example, here is a screen snapshot with four images (just random images I got from a search for dogs on http://images.google.com):
And here are the same four images cropped by a circle (or more accurately, each of the four images were separately cropped by the same circle path):
Here is the code that does that:
- (UIImage *)cropImage:(UIImage *)image locatedAt:(CGRect)imageFrame byCircleAt:(CGPoint)center withRadius:(float)radius
{
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, imageFrame.size.width, imageFrame.size.height, 8, 4 * imageFrame.size.width, colorSpace, kCGImageAlphaPremultipliedFirst);
    CGContextBeginPath(context);
    CGRect ellipseFrame = CGRectMake(center.x - imageFrame.origin.x - radius, imageFrame.size.height - (center.y - imageFrame.origin.y - radius) - radius * 2.0, radius * 2.0, radius * 2.0);
    CGContextAddEllipseInRect(context, ellipseFrame);
    CGContextClosePath(context);
    CGContextClip(context);
    CGContextDrawImage(context, CGRectMake(0, 0, imageFrame.size.width, imageFrame.size.height), image.CGImage);
    CGImageRef imageMasked = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    UIImage *newImage = [UIImage imageWithCGImage:imageMasked];
    CGImageRelease(imageMasked);
    return newImage;
}

- (void)addSingleCroppedImage:(UIImage *)image at:(CGRect)imageFrame byCircleAt:(CGPoint)center withRadius:(float)radius
{
    UIImage *newImage = [self cropImage:image locatedAt:imageFrame byCircleAt:center withRadius:radius];
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:imageFrame];
    imageView.image = newImage;
    [self.view addSubview:imageView];
}

- (void)addCroppedImages
{
    NSString *bundlePath = [[NSBundle mainBundle] resourcePath];
    CGPoint center = CGPointMake(self.view.frame.size.width / 2.0, self.view.frame.size.width / 2.0);
    float radius = 150.0;
    UIImage *dog1 = [UIImage imageWithContentsOfFile:[bundlePath stringByAppendingPathComponent:@"imgres-1.jpg"]];
    UIImage *dog2 = [UIImage imageWithContentsOfFile:[bundlePath stringByAppendingPathComponent:@"imgres-2.jpg"]];
    UIImage *dog3 = [UIImage imageWithContentsOfFile:[bundlePath stringByAppendingPathComponent:@"imgres-3.jpg"]];
    UIImage *dog4 = [UIImage imageWithContentsOfFile:[bundlePath stringByAppendingPathComponent:@"imgres-4.jpg"]];
    CGRect frame;
    UIImage *currentImage;
    // upper left
    currentImage = dog1;
    frame = CGRectMake(center.x - currentImage.size.width, center.y - currentImage.size.height, currentImage.size.width, currentImage.size.height);
    [self addSingleCroppedImage:currentImage at:frame byCircleAt:center withRadius:radius];
    // upper right
    currentImage = dog2;
    frame = CGRectMake(center.x, center.y - currentImage.size.height, currentImage.size.width, currentImage.size.height);
    [self addSingleCroppedImage:currentImage at:frame byCircleAt:center withRadius:radius];
    // lower left
    currentImage = dog3;
    frame = CGRectMake(center.x - currentImage.size.width, center.y, currentImage.size.width, currentImage.size.height);
    [self addSingleCroppedImage:currentImage at:frame byCircleAt:center withRadius:radius];
    // lower right
    currentImage = dog4;
    frame = CGRectMake(center.x, center.y, currentImage.size.width, currentImage.size.height);
    [self addSingleCroppedImage:currentImage at:frame byCircleAt:center withRadius:radius];
}
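
A hedged variation on the method above, clipping by an arbitrary path instead of a circle, which is closer to the "irregular shape" case in the question. The method name is mine, and the path is assumed to be expressed in the image frame's own coordinate space (UIKit, origin at the top-left):

- (UIImage *)cropImage:(UIImage *)image locatedAt:(CGRect)imageFrame byPath:(UIBezierPath *)path
{
    UIGraphicsBeginImageContextWithOptions(imageFrame.size, NO, image.scale);
    [path addClip];  // restrict all subsequent drawing to the path
    [image drawInRect:CGRectMake(0, 0, imageFrame.size.width, imageFrame.size.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}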

How do you crop an image in iOS

I have a photo app where you can add stickers in one section. When you're finished I want to save the image. Here is the code that I have to do that.
if (UIGraphicsBeginImageContextWithOptions != NULL)
{
    UIGraphicsBeginImageContextWithOptions(self.view.frame.size, YES, 2.5);
} else {
    UIGraphicsBeginImageContext(self.view.frame.size);
}
CGContextRef contextNew = UIGraphicsGetCurrentContext();
[self.view.layer renderInContext:contextNew];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Now the image that gets saved is the full screen of the image, which is fine, but now I need to crop the image and I don't know how. You can see the image at the link below:
http://dl.dropbox.com/u/19130454/Photo%202012-04-09%201%2036%2018%20PM.png
I need to crop:
91px from the left and right
220px from the bottom
Any help would be greatly appreciated. If I haven't explained things clearly, please let me know and I'll do my best to re-explain.
How about something like this:
CGRect clippedRect = CGRectMake(self.view.frame.origin.x+91, self.view.frame.origin.y, self.view.frame.size.width-91*2, self.view.frame.size.height-220);
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], clippedRect);
UIImage *newImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
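
One caveat, based on an assumption about what the 91/220 values mean: CGImageCreateWithImageInRect works in pixel coordinates of the backing CGImage, and the context above was created with a scale of 2.5, so image.CGImage is 2.5x the view's point size. If the crop amounts are meant in points, a sketch that converts them using the image's scale could look like this:

CGFloat scale = image.scale; // 2.5 for the context created above
CGRect clippedRect = CGRectMake(91.0 * scale,
                                0.0,
                                (self.view.frame.size.width - 91.0 * 2.0) * scale,
                                (self.view.frame.size.height - 220.0) * scale);
CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, clippedRect);
// Passing the scale back keeps the cropped UIImage at the same point size per pixel.
UIImage *newImage = [UIImage imageWithCGImage:imageRef scale:scale orientation:image.imageOrientation];
CGImageRelease(imageRef);

If the values are already pixels of the saved image, drop the multiplication and measure against CGImageGetWidth/CGImageGetHeight instead.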
The following code may help you.
First get the correct crop frame with the method below:
- (CGRect)cropRectForFrame:(CGRect)frame
{
    // NSAssert(self.contentMode == UIViewContentModeScaleAspectFit, @"content mode must be aspect fit");
    CGFloat widthScale = imageview.superview.bounds.size.width / imageview.image.size.width;
    CGFloat heightScale = imageview.superview.bounds.size.height / imageview.image.size.height;
    float x, y, w, h, offset;
    if (widthScale < heightScale) {
        offset = (imageview.superview.bounds.size.height - (imageview.image.size.height * widthScale)) / 2;
        x = frame.origin.x / widthScale;
        y = (frame.origin.y - offset) / widthScale;
        w = frame.size.width / widthScale;
        h = frame.size.height / widthScale;
    } else {
        offset = (imageview.superview.bounds.size.width - (imageview.image.size.width * heightScale)) / 2;
        x = (frame.origin.x - offset) / heightScale;
        y = frame.origin.y / heightScale;
        w = frame.size.width / heightScale;
        h = frame.size.height / heightScale;
    }
    return CGRectMake(x, y, w, h);
}
Then call this method to get the cropped image:
- (UIImage *)imageByCropping:(UIImage *)image toRect:(CGRect)rect
{
    // you need to update the scaling factor value if the device does not have a retina display
    UIGraphicsBeginImageContextWithOptions(rect.size,
                                           /* your view opaque */ NO,
                                           /* scaling factor */ 2.0);
    // stick to methods on UIImage so that orientation etc. are automatically
    // dealt with for us
    [image drawAtPoint:CGPointMake(-rect.origin.x, -rect.origin.y)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
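
A hypothetical usage sketch, assuming `imageview` is the aspect-fit image view referenced in cropRectForFrame: and `selectionFrame` is the crop area in its superview's coordinate space:

CGRect selectionFrame = CGRectMake(20.0, 20.0, 200.0, 200.0);
// Map the on-screen selection into the image's own coordinates, then crop.
CGRect imageRect = [self cropRectForFrame:selectionFrame];
UIImage *cropped = [self imageByCropping:imageview.image toRect:imageRect];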
- (UIImage *)imageByCropping:(CGRect)rect
{
    //create a context to do our clipping in
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef currentContext = UIGraphicsGetCurrentContext();
    //create a rect with the size we want to crop the image to
    //the X and Y here are zero so we start at the beginning of our
    //newly created context
    CGRect clippedRect = CGRectMake(0, 0, rect.size.width, rect.size.height);
    CGContextClipToRect(currentContext, clippedRect);
    //create a rect equivalent to the full size of the image
    //offset the rect by the X and Y we want to start the crop
    //from in order to cut off anything before them
    CGRect drawRect = CGRectMake(rect.origin.x * -1,
                                 rect.origin.y * -1,
                                 self.size.width,
                                 self.size.height);
    //draw the image into our clipped context using our offset rect
    // CGContextDrawImage(currentContext, drawRect, self.CGImage);
    [self drawInRect:drawRect]; // This fixes getting an inverted image from the context.
    //pull the image from our cropped context
    UIImage *cropped = UIGraphicsGetImageFromCurrentImageContext();
    //pop the context to get back to the default
    UIGraphicsEndImageContext();
    //Note: this is autoreleased
    return cropped;
}
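
For completeness, a hypothetical usage sketch, assuming the method above is declared in a UIImage category (for example UIImage+Crop):

UIImage *screenshot = image; // the image rendered in the question's snippet
UIImage *cropped = [screenshot imageByCropping:CGRectMake(91.0, 0.0,
                                                          screenshot.size.width - 91.0 * 2.0,
                                                          screenshot.size.height - 220.0)];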
Refer to the link below for a drop-in image crop view:
https://github.com/myang-git/iOS-Image-Crop-View
How to use
Very easy! It is created to be a drop-in component, so no static library and no extra dependencies. Just copy ImageCropView.h and ImageCropView.m to your project, and implement the ImageCropViewControllerDelegate protocol.
Use it like UIImagePicker:
- (void)cropImage:(UIImage *)image {
    ImageCropViewController *controller = [[ImageCropViewController alloc] initWithImage:image];
    controller.delegate = self;
    [[self navigationController] pushViewController:controller animated:YES];
}

- (void)ImageCropViewController:(ImageCropViewController *)controller didFinishCroppingImage:(UIImage *)croppedImage {
    image = croppedImage;
    imageView.image = croppedImage;
    [[self navigationController] popViewControllerAnimated:YES];
}

- (void)ImageCropViewControllerDidCancel:(ImageCropViewController *)controller {
    imageView.image = image;
    [[self navigationController] popViewControllerAnimated:YES];
}

Rotate UIImage: can't make it work

This is part of a big project, but my problem can be simplified as:
I have a simple view with an ImageView and a "Rotate" button. Whenever I press the button, the image inside the ImageView will rotate 90 degree. From much of what I've found on StackOverflow and other sites, this should work (please note that my image is a square image, which has width and height equal to 464):
- (UIImage *)getRotatedImageFromImage:(UIImage *)image:(int)rotate_index
{
    UIImage *rotatedImage;
    // Get image width, height of the bounding rectangle
    CGRect boundingRect = CGRectMake(0, 0, image.size.width, image.size.height);
    NSLog(@"width = %f, height = %f", boundingRect.size.width, boundingRect.size.height);
    NSLog(@"rotate index = %d", rotate_index);
    // Create a graphics context the size of the bounding rectangle
    UIGraphicsBeginImageContext(boundingRect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextRotateCTM(context, rotate_index * M_PI / 2);
    // Draw the image into the context
    [image drawInRect:boundingRect];
    // Get an image from the context
    rotatedImage = UIGraphicsGetImageFromCurrentImageContext();
    // Clean up
    UIGraphicsEndImageContext();
    NSLog(@"rotatedImage size = (%f, %f)", rotatedImage.size.width, rotatedImage.size.height);
    return rotatedImage;
}

- (IBAction)rotateImage
{
    NSLog(@"rotate image");
    rotateIndex++;
    if (rotateIndex >= 4)
        rotateIndex = 0;
    [imageView setImage:[self getRotatedImageFromImage:imageToSubmit :rotateIndex]];
}
But it doesn't work for some reason. What I observe is this: when I press the button, the image only appears when rotateIndex gets back to 0, and that image is the same as the original (which is expected). When rotateIndex is 1, 2 or 3, the image view displays nothing, even though the size of rotatedImage printed out is correct (i.e. 464 x 464).
Could anyone tell me what's going wrong?
Thanks.
//create the image view
UIImageView *myImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"image.png"]];
//set the point of rotation
myImageView.center = CGPointMake(100.0, 100.0);
//rotate the view
myImageView.transform = CGAffineTransformMakeRotation(3.14159265 / 2); //rotation in radians
I was able to solve my problem. Here is my solution, using CGImageRef and [UIImage imageWithCGImage:scale:orientation:]:

CGImageRef cgImageRef = CGBitmapContextCreateImage(context);
UIImageOrientation imageOrientation;
switch (rotate_index) {
    case 0: imageOrientation = UIImageOrientationUp; break;
    case 1: imageOrientation = UIImageOrientationLeft; break;
    //2 more cases for rotate_index = 2 and 3
}
rotatedImage = [UIImage imageWithCGImage:cgImageRef scale:1.0 orientation:imageOrientation];
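
An alternative sketch that draws the rotated pixels instead of relying on imageOrientation. The method name is mine; the key step the original getRotatedImageFromImage: is missing is translating to the center of the context before rotating, since rotating about the origin sweeps most of the image outside the drawable area, which is consistent with nothing appearing for indices 1 to 3:

- (UIImage *)rotatedImage:(UIImage *)image quarterTurns:(NSInteger)turns
{
    CGSize size = image.size; // square image, per the question (464 x 464)
    UIGraphicsBeginImageContextWithOptions(size, NO, image.scale);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Move the origin to the center, rotate, then draw the image centered on the origin.
    CGContextTranslateCTM(context, size.width / 2.0, size.height / 2.0);
    CGContextRotateCTM(context, turns * M_PI_2);
    [image drawInRect:CGRectMake(-size.width / 2.0, -size.height / 2.0, size.width, size.height)];
    UIImage *rotated = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return rotated;
}

For non-square images the context size would need its width and height swapped for odd numbers of quarter turns.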

Why some UIImages don't show up on iPhone

Hi, I am currently developing a small app on iOS 4.3, using Objective-C.
As part of the app I need to manipulate an image that I have downloaded from the web.
The following code produces a missing image:
(The original is in a class, but I put this together as a test scenario so that it can easily be copy-pasted.)
- (void)viewDidLoad
{
    [super viewDidLoad];
    [self loadImage:@"http://www.night-net.net/images/ms/microsoft_vista_home_basic.jpg"];
    [self getCroped:CGRectMake(10, 50, 80, 160)];
    [self getCroped:CGRectMake(90, 50, 80, 80)];
    [self getCroped:CGRectMake(90, 130, 40, 80)];
    [self getCroped:CGRectMake(130, 130, 40, 40)];
    [self getCroped:CGRectMake(130, 170, 40, 40)];
}

- (void)loadImage:(NSString *)url
{
    _data = [NSData dataWithContentsOfURL:[NSURL URLWithString:url]];
}

- (UIImageView *)getCroped:(CGRect)imageSize {
    UIImage *temp = [[UIImage alloc] initWithData:_data];
    UIImage *myImage = [self resizedImage:temp and:CGSizeMake(160, 160) interpolationQuality:kCGInterpolationHigh];
    UIImage *image = [self croppedImage:myImage and:imageSize];
    UIImageView *imageView = [[UIImageView alloc] init];
    imageView.image = image;
    imageView.frame = imageSize;
    [[self view] addSubview:imageView];
    return imageView;
}

- (UIImage *)croppedImage:(UIImage *)image and:(CGRect)bounds {
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], bounds);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return croppedImage;
}
- (UIImage *)resizedImage:(UIImage *)image and:(CGSize)newSize interpolationQuality:(CGInterpolationQuality)quality {
    BOOL drawTransposed = NO;
    return [self resizedImage:image
                          and:newSize
                    transform:[self transformForOrientation:newSize]
               drawTransposed:drawTransposed
         interpolationQuality:quality];
}

// Returns a copy of the image that has been transformed using the given affine transform and scaled to the new size
// The new image's orientation will be UIImageOrientationUp, regardless of the current image's orientation
// If the new size is not integral, it will be rounded up
- (UIImage *)resizedImage:(UIImage *)image and:(CGSize)newSize
                transform:(CGAffineTransform)transform
           drawTransposed:(BOOL)transpose
     interpolationQuality:(CGInterpolationQuality)quality {
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGRect transposedRect = CGRectMake(0, 0, newRect.size.height, newRect.size.width);
    CGImageRef imageRef = image.CGImage;
    // Build a context that's the same dimensions as the new size
    CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                                newRect.size.width,
                                                newRect.size.height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                0,
                                                CGImageGetColorSpace(imageRef),
                                                CGImageGetBitmapInfo(imageRef));
    // Rotate and/or flip the image if required by its orientation
    CGContextConcatCTM(bitmap, transform);
    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(bitmap, quality);
    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, transpose ? transposedRect : newRect, imageRef);
    // Get the resized image from the context as a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
    // Clean up
    CGContextRelease(bitmap);
    CGImageRelease(newImageRef);
    return newImage;
}

// Returns an affine transform that takes into account the image orientation when drawing a scaled image
- (CGAffineTransform)transformForOrientation:(CGSize)newSize {
    CGAffineTransform transform = CGAffineTransformIdentity;
    transform = CGAffineTransformTranslate(transform, newSize.width, 0);
    transform = CGAffineTransformScale(transform, -1, 1);
    return transform;
}
At first I thought this was caused by a lack of memory, but I have tested for that and it doesn't seem to be the problem. Thanks in advance, ofir.
I've had issues in the past with images not appearing within UIWebViews if they contain unicode characters in the filename. I wonder if this might be the same thing. Try renaming your image?
Doing this should be possible and low on memory cost; I did the same test using Flash to create an iPhone app that does the same thing, and it works.
But I would much prefer using Objective-C, so the question still stands.