How to crop an image on the iPhone?

In my application I'm using the following code to crop the captured image:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
#ifdef _DEBUG
    NSLog(@"frmSkinImage-imagePickerController-Start");
#endif
    imageView.image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    //=======================================
    UIImage *image = imageView.image;
    CGRect cropRect = CGRectMake(100, 100, 125, 128);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);
    [imageView setImage:[UIImage imageWithCGImage:imageRef]];
    CGImageRelease(imageRef);
    //===================================================
    //imgglobal = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    // for saving image to photo album
    //UIImageWriteToSavedPhotosAlbum(imageView.image, self, @selector(image:didFinishSavingWithError:contextInfo:), self);
    [picker dismissModalViewControllerAnimated:YES];
#ifdef _DEBUG
    NSLog(@"frmSkinImage-imagePickerController-End");
#endif
}
But my problem is that when I use the camera to take the photo, the cropped image comes out rotated 90 degrees to the right, whereas when I pick from the photo library it works perfectly.
Can you look over the code above and tell me where I am going wrong?

A CGImage is the raw bitmap without the UIImage's metadata, so the orientation information is lost when you re-wrap it. I'd suggest reading the orientation of the original image ([UIImage imageOrientation]), storing it, and then applying it to the final image.
If that doesn't work, try applying a CGAffineTransformMakeRotation(90.0 * 0.0174532925) (i.e. 90 degrees in radians) to the final image, according to the orientation of the original.
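For the first approach, a minimal sketch of re-wrapping the cropped CGImage with the original image's scale and orientation (reusing the variable names from the question):

    UIImage *image = imageView.image;   // original image from the picker
    CGRect cropRect = CGRectMake(100, 100, 125, 128);
    CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
    // Carry the original scale and orientation across so camera shots are not rotated.
    // Note that cropRect is applied to the raw bitmap, which the camera stores un-rotated.
    UIImage *cropped = [UIImage imageWithCGImage:imageRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(imageRef);
    [imageView setImage:cropped];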

The fastest and easiest way is to have a look at NYXImagesKit. With its UIImage+Resizing category you can crop images. From the documentation:
UIImage+Resizing
This category can be used to crop or to scale images.
Cropping
You can crop your image in 9 different ways: top left, top center, top right, bottom left, bottom center, bottom right, left center, right center, or center.
    UIImage *cropped = [myImage cropToSize:(CGSize){width, height}
                                 usingMode:NYXCropModeCenter];
NYXCropMode is an enum type which can be found in the header file; it is used to represent the 9 modes above.

Related

How do I take a screenshot of a view which has transform?

I'm able to crop a view with this code
- (UIImage *)captureScreenInRect:(CGRect)captureFrame {
    CALayer *layer = self.view.layer;
    UIGraphicsBeginImageContext(self.view.bounds.size);
    CGContextClipToRect(UIGraphicsGetCurrentContext(), captureFrame);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenImage;
}
But I have an image view zoomed in with a transform, and it isn't captured to scale.
How do I capture EXACTLY what the user sees on the screen?
The Stack Overflow question "renderInContext:" and CATransform3D has more info, but the gist is:
QCCompositionLayer, CAOpenGLLayer, and QTMovieLayer layers are not rendered. Additionally, layers that use 3D transforms are not rendered, nor are layers that specify backgroundFilters, filters, compositingFilter, or mask values.
(from the CALayer docs).
More info is also available in this technical Q&A: http://developer.apple.com/library/ios/#qa/qa1703/_index.html
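For what it's worth, here is a small variation on the capture method above, sketched under the assumption that the zoom is a plain 2D (affine) transform. Opening the context with UIGraphicsBeginImageContextWithOptions at the screen's scale keeps the capture sharp on Retina displays, but layers with 3D transforms are still skipped, per the documentation quoted above:

    - (UIImage *)captureScreenInRect:(CGRect)captureFrame {
        // Scale 0.0 means "use the screen's scale", so the capture is not blurry on Retina
        UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0.0);
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextClipToRect(context, captureFrame);
        // Renders sublayers with 2D (affine) transforms; 3D-transformed layers are skipped
        [self.view.layer renderInContext:context];
        UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return screenImage;
    }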
If your app is not going to the App Store, you can use the undocumented UIGetScreenImage API:
// Define at top of implementation file
CGImageRef UIGetScreenImage(void);
...
- (void)buttonPressed:(UIButton *)button
{
    // Capture screen here...
    CGImageRef screen = UIGetScreenImage();
    UIImage *image = [UIImage imageWithCGImage:screen];
    CGImageRelease(screen);
    // Save the captured image to photo album
    UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}
(from John Muchow)
However, using this API will cause your app to be rejected.
I have been unable to find any other workarounds.

Cropping Image using CGImageCreateWithImageInRect

I'm attempting to implement an iOS camera view that takes pictures that are square in shape (similar to Instagram). My code appears below. The first part, where the frame height is set to be equal to the frame width, is working as expected and the user is given a view that is square. The problem occurs later when I attempt to apply the frame (which is a CGRect property) to the image data using CGImageCreateWithImageInRect. I pass the frame rect to this method with the image. But the results are not cropped to be square. Instead the image retains the original default dimensions from the iOS camera. Can someone please tell me what I've done wrong? My understanding from the Apple documentation is that CGImageCreateWithImageInRect should select an image area of shape Rect from some starting x/y coordinate. But that doesn't seem to be happening.
//Set the frame size to be square shaped
UIView *view = imagePicker.view;
frame = view.frame;
frame.size.height = frame.size.width;
view.frame = frame;

//Crop the image to the frame dimensions using CGImageCreateWithImageInRect
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self.popoverController dismissPopoverAnimated:true];
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    [self dismissModalViewControllerAnimated:YES];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        croppedImage = (__bridge UIImage *)(CGImageCreateWithImageInRect((__bridge CGImageRef)(image), frame));
        imageView.image = croppedImage;
    }
    else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie])
    {
        // Code here to support video if enabled
    }
}
You are doing it right. The only thing is that I think you are setting the frame property to the same size as the picker view, so the final size is the same as the original size.
Try setting the frame smaller than pickerView.view.frame, not equal to it.
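One more thing worth checking: CGImageCreateWithImageInRect works in the image's pixel coordinate space, while the picker view's frame is in screen points, so passing the view frame straight through rarely produces the crop you expect. The cast in the question also passes the UIImage itself where a CGImageRef is expected; image.CGImage is the usual way to get one. A rough sketch of building a centered square crop rect from the image's own dimensions (variable names are illustrative):

    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    CGImageRef cgImage = image.CGImage;

    // Work in the bitmap's pixel space, not in view points
    size_t pixelWidth  = CGImageGetWidth(cgImage);
    size_t pixelHeight = CGImageGetHeight(cgImage);
    CGFloat side = MIN(pixelWidth, pixelHeight);

    // Centered square
    CGRect cropRect = CGRectMake((pixelWidth  - side) / 2.0,
                                 (pixelHeight - side) / 2.0,
                                 side, side);

    CGImageRef croppedRef = CGImageCreateWithImageInRect(cgImage, cropRect);
    // Preserve scale and orientation so camera images are not rotated
    UIImage *squareImage = [UIImage imageWithCGImage:croppedRef
                                               scale:image.scale
                                         orientation:image.imageOrientation];
    CGImageRelease(croppedRef);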
Check this out: Cropping an UIImage
You are setting the frame wrong.
I suggest you take a look at this sample code from Apple on how to create what you are trying to:
https://developer.apple.com/library/mac/#samplecode/VirtualScanner/Listings/Sources_VirtualScanner_m.html#//apple_ref/doc/uid/DTS40011006-Sources_VirtualScanner_m-DontLinkElementID_9
Look at the - (ICAError)startScanningWithParams:(ICD_ScannerStartPB *)pb function.

Save zoomed image from camera

I am developing a camera application for iPhone/iPad.
We are using an overlay view displayed on top of the camera viewfinder.
Currently I am trying to save the zoomed image. We are able to zoom the image in the viewfinder, but when we save the image it gets saved at the original size.
To solve this we are scaling the zoomed image using the following code:
UIImageView *v = [[UIImageView alloc] initWithImage:image];
UIGraphicsBeginImageContext(v.bounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSaveGState(context);
CGContextScaleCTM(context, zoomvalue, zoomvalue);
[v drawRect:CGRectMake(0, 0, 320, 480)];
CGContextRestoreGState(context);
image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Using the above code we are able to get the zoomed image.
However, we need to crop the center portion of the scaled image and save that, which we have not managed to do.
Kindly help.
If you are using UIImagePickerController to capture the image and zoom it there, then you can get the edited image by reading UIImagePickerControllerEditedImage from the picker's info NSDictionary, something like this:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    img = [[info objectForKey:UIImagePickerControllerEditedImage] retain];
    [picker dismissModalViewControllerAnimated:YES];
}
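Note that UIImagePickerControllerEditedImage is only present when the picker's allowsEditing property is set to YES. If the built-in crop UI is not an option and you need to cut the centre out of the scaled image yourself, here is a rough sketch; the sizes are illustrative and assume the scale-1 context used above, where points and pixels match:

    // 'image' is the zoomed image produced by the scaling code above
    CGFloat cropWidth  = 320.0;   // illustrative output size
    CGFloat cropHeight = 480.0;
    CGRect centerRect = CGRectMake((image.size.width  - cropWidth)  / 2.0,
                                   (image.size.height - cropHeight) / 2.0,
                                   cropWidth, cropHeight);
    CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, centerRect);
    UIImage *centerCrop = [UIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef);
    UIImageWriteToSavedPhotosAlbum(centerCrop, nil, nil, nil);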

Overlaying UIImageview over UIImageview save

I'm trying to merge two UIImageViews. The first UIImageView (theimageView) is the background, and the second UIImageView (Birdie) is an image overlaying the first UIImageView. You can load the first UIImageView from a map or take a picture. After this you can drag, rotate and scale the second UIImageView over the first one. I want the output (saved image) to look the same as what I see on the screen.
I got that working, but I get borders and the quality and size are bad. I want the size to be the same as that of the image which is chosen, and the quality to be good. Also I get a crash if I save it a second time, right after the first time.
Here is my current code:
//save actual design in photo library
- (void)captureScreen {
    UIImage *myImage = [self addImage:theImageView ToImage:Birdie];
    [myImage retain];
    UIImageWriteToSavedPhotosAlbum(myImage, self, @selector(imageSavedToPhotosAlbum:didFinishSavingWithError:contextInfo:), self);
}

- (UIImage *)addImage:(UIImage *)theimageView toImage:(UIImage *)Birdie {
    CGSize size = CGSizeMake(theimageView.size.height, theimageView.size.width);
    UIGraphicsBeginImageContext(size);
    CGPoint pointImg1 = CGPointMake(0, 0);
    [theimageView drawAtPoint:pointImg1];
    CGPoint pointImage2 = CGPointMake(0, 0);
    [Birdie drawAtPoint:pointImage2];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
But I only get errors with this code!
Thanks in advance!
Take a look at Drawing a PNG Image Into a Graphics Context for Blending Mode Manipulation
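Two things stand out in the code above: the helper is declared as addImage:toImage: but called as addImage:ToImage:, and it is handed the UIImageView objects themselves rather than their UIImage contents, which UIImage methods such as drawAtPoint: cannot handle; that is likely where the errors come from. For reference, a rough sketch of compositing an overlay over a background image at the background's full size (the method name and the overlayFrame parameter are illustrative; the overlay's on-screen frame would still need to be mapped into the background image's coordinates):

    - (UIImage *)composeOverlay:(UIImage *)overlayImage
                      overImage:(UIImage *)backgroundImage
                        atFrame:(CGRect)overlayFrame   // in the background image's coordinate space
    {
        // Draw at the background's own size and scale so no quality is lost
        UIGraphicsBeginImageContextWithOptions(backgroundImage.size, NO, backgroundImage.scale);
        [backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
        [overlayImage drawInRect:overlayFrame];
        UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return result;
    }

The result could then be passed to UIImageWriteToSavedPhotosAlbum as before, using the image views' image properties (e.g. theImageView.image and Birdie.image) as the arguments.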

incorrect values for UIImagePickerControllerCropRect rectangle

I am working on an application where I am picking images from the iPhone photo library using a UIImagePickerController. I am using the following code to pick an image.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSLog(@"image picked:");
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    //UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
    CGRect cropRect;
    cropRect = [[info valueForKey:@"UIImagePickerControllerCropRect"] CGRectValue];
}
The above method works perfectly when I capture an image with the camera, but when I pick an image from the iPhone library, the cropRect gives me incorrect values. Its x is always 43 or greater, even if I drag the crop rectangle to the extreme left of the screen, so as a result I get a vertical black strip on the left side of the image.
Thanks in advance
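(No answer is shown for this one, but for reference, UIImagePickerControllerCropRect is reported relative to the original image, and applying it yourself usually looks something like the sketch below. This does not by itself explain the offset reported above; clamping the rect to the image bounds is just a defensive step.)

    UIImage *original = [info objectForKey:UIImagePickerControllerOriginalImage];
    CGRect cropRect = [[info objectForKey:UIImagePickerControllerCropRect] CGRectValue];

    // Keep the crop rect inside the bitmap before handing it to Core Graphics
    CGRect imageBounds = CGRectMake(0, 0, original.size.width, original.size.height);
    CGRect clamped = CGRectIntersection(CGRectIntegral(cropRect), imageBounds);

    CGImageRef croppedRef = CGImageCreateWithImageInRect(original.CGImage, clamped);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:original.scale
                                     orientation:original.imageOrientation];
    CGImageRelease(croppedRef);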