My basic requirement is to change the frame of the crop rect in UIImagePickerController when using the camera.
I just realized that it is not possible to change the frame of the built-in crop rect. That leaves me with only one option: creating my own camera overlay in which I can set the frame of the crop rect. I searched a lot but found nothing, and a previous question of mine went unanswered. I don't even know whether it is possible, and if it is, how to create it and move and scale the crop box the way the default UIImagePickerController crop rect does.
You have to implement your own crop rect. First, disable the built-in editing:
[picker setAllowsEditing:NO];
Then, in the didFinishPickingMediaWithInfo: delegate method, push your own crop-rect view controller:
CustomImageEditor *custom = [[CustomImageEditor alloc] initWithNibName:@"CustomImageEditor" bundle:nil];
[picker pushViewController:custom animated:YES];
[custom release];
While pushing the view, pass the image to the custom view like this:
UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
custom.pickedImage = image;
In that custom view you crop the image.
For cropping the image, try something like this:
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
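Putting the pieces together, a minimal sketch of that delegate method might look like this (CustomImageEditor and its pickedImage property are the custom classes assumed above, not part of UIKit):

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Grab the unedited photo, since allowsEditing is NO.
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    // Push the custom crop screen and hand it the picked image.
    CustomImageEditor *custom = [[CustomImageEditor alloc] initWithNibName:@"CustomImageEditor" bundle:nil];
    custom.pickedImage = image;
    [picker pushViewController:custom animated:YES];
    [custom release];
}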
In my app, the user is able to put stickers on top of a photo. When they go to save their creation, I do a screen grab and store it in a UIImage:
UIGraphicsBeginImageContextWithOptions(self.mainView.bounds.size, NO, [UIScreen mainScreen].scale);
[self.mainView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *resultImage = [UIGraphicsGetImageFromCurrentImageContext() retain];
UIGraphicsEndImageContext();
(where self.mainView has a subview UIImageView which holds the photo, and another subview UIView which holds the stickers).
I am wondering: is it possible to take a screenshot in this manner and still maintain the resolution of the aforementioned photo?
The following will 'flatten' two UIImages into one while maintaining the resolution of the original image(s):
CGSize photoSize = photoImage.size;
UIGraphicsBeginImageContextWithOptions(photoSize, NO, 0.0);
CGRect photoRect = CGRectMake(0, 0, photoSize.width, photoSize.height);
// Add the original photo into the context
[photoImage drawInRect:photoRect];
// Add the sticker image with its upper left corner set to where the user placed it
[stickerImage drawAtPoint:stickerView.frame.origin];
// Get the resulting 'flattened' image
UIImage *flattenedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The above assumes photoImage and stickerImage are both instances of UIImage and that stickerView is a UIView containing the sticker image, so the stickerView frame can be used to determine its origin.
If you have multiple stickers, just iterate through the collection.
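A minimal sketch of that loop, assuming the sticker views are kept in an NSArray named stickerViews and each one is a UIImageView (both names are assumptions for illustration):

// Sketch only: stickerViews is an assumed NSArray of UIImageViews positioned over the photo.
CGSize photoSize = photoImage.size;
UIGraphicsBeginImageContextWithOptions(photoSize, NO, 0.0);
[photoImage drawInRect:CGRectMake(0, 0, photoSize.width, photoSize.height)];

for (UIImageView *stickerView in stickerViews) {
    // Draw each sticker at the point where the user placed it.
    [stickerView.image drawAtPoint:stickerView.frame.origin];
}

UIImage *flattenedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();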
If you are looking to save an image of your current view, then this might help you:
UIGraphicsBeginImageContext(self.scrollView.contentSize);
[self.scrollView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGImageRef imageRef = CGImageCreateWithImageInRect(finalImage.CGImage,
                                                   CGRectMake(self.scrollView.contentOffset.x,
                                                              self.scrollView.contentOffset.y,
                                                              self.scrollView.frame.size.width,
                                                              self.scrollView.frame.size.height));
UIImage *screenImage = [UIImage imageWithCGImage:imageRef
                                           scale:[UIScreen mainScreen].scale
                                     orientation:UIImageOrientationUp];
CGImageRelease(imageRef);
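If you need the capture at full Retina resolution, one variant (just a sketch, using the same self.scrollView) is to open the context with the screen scale and express the crop rect in pixels:

// Sketch only: render at screen scale, then crop using pixel coordinates.
UIGraphicsBeginImageContextWithOptions(self.scrollView.contentSize, NO, [UIScreen mainScreen].scale);
[self.scrollView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

CGFloat scale = finalImage.scale;
CGRect visibleRect = CGRectMake(self.scrollView.contentOffset.x * scale,
                                self.scrollView.contentOffset.y * scale,
                                self.scrollView.frame.size.width * scale,
                                self.scrollView.frame.size.height * scale);
CGImageRef imageRef = CGImageCreateWithImageInRect(finalImage.CGImage, visibleRect);
UIImage *screenImage = [UIImage imageWithCGImage:imageRef
                                           scale:scale
                                     orientation:UIImageOrientationUp];
CGImageRelease(imageRef);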
There are two view controllers: in one I am taking a snapshot, and in the other I am showing the snapshot that was taken.
I am working with images and need to take a snapshot, but when I do, I get a white background and I do not know why. (There are three image views: a transparent image view on top, then a background body image view, and a tattoo image view.)
When I take the screenshot I hide the background image view, set the view's alpha to 1.0f, and set the view's background color to clear color.
The output screenshot still shows a white background. My code is:
self.backgroundImgView.hidden=YES;
self.view.backgroundColor=[UIColor clearColor];
self.view.alpha=1.0f;
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGRect rect;
rect = CGRectMake(appDelegate.xFloat, appDelegate.yFloat, appDelegate.widthFloat, appDelegate.heightFloat);
CGImageRef imageRef = CGImageCreateWithImageInRect([viewImage CGImage], rect);
UIImage *img = [UIImage imageWithCGImage:imageRef];
appDelegate.tattooImg=img;
CGImageRelease(imageRef);
Make a UIView as a holder view on self.view, then add both image views to the holder view rather than to self.view. When you save the screenshot, render the holder view instead, as in the sketch below.
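A minimal sketch of that approach, assuming the body and tattoo image views have been added to a holder view exposed as self.holderView (that property name is my assumption):

// Sketch only: self.holderView contains just the body and tattoo image views, so no white
// self.view background ends up in the capture.
UIGraphicsBeginImageContext(self.holderView.bounds.size);
[self.holderView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *holderImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Crop to the tattoo image view's frame, as in the original code.
CGRect rect = CGRectMake(appDelegate.xFloat, appDelegate.yFloat,
                         appDelegate.widthFloat, appDelegate.heightFloat);
CGImageRef imageRef = CGImageCreateWithImageInRect([holderImage CGImage], rect);
appDelegate.tattooImg = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);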
In the image below there are two image views: one is the body image view and the other is the black tattoo image view.
Now I am getting the tattoo image view's position with the code below:
appDelegate.xFloat= self.imgView.frame.origin.x;
appDelegate.yFloat= self.imgView.frame.origin.y;
appDelegate.widthFloat= self.imgView.frame.size.width;
appDelegate.heightFloat= self.imgView.frame.size.height;
Now I need to put the tattoo image into another view controller as seen in the image (here the car is in a reversed position). I set the tattoo image view's frame with the help of appDelegate.xFloat, appDelegate.yFloat, appDelegate.widthFloat, and appDelegate.heightFloat, but I get the image as shown below in the other view.
I need to place the car image reversed, as seen in the first image.
Please guide me.
My requirement is not only rotation; the image may be in any position, as shown below.
It is strange that one image rotates and the other does not. I am assuming that one of the images is a background image, but you can use the transform property to change the rotation. Something similar to the code below should show you the same image in two orientations:
UIImage *image = [UIImage imageNamed:@"image.png"];
UIImageView *iv1 =[[UIImageView alloc] initWithImage:image];
[self.view addSubview:iv1];
UIImageView *iv2 = [[UIImageView alloc] initWithImage:image];
[iv2 setFrame:CGRectMake(100,100,image.size.width,image.size.height)];
[iv2 setTransform:CGAffineTransformMakeRotation(-M_PI)];
[self.view addSubview:iv2];
You must construct the appDelegate reference before using it:
appDelegate = (AppDelegate *) [[UIApplication sharedApplication] delegate];
Babul, it looks like you are trying to create an image. For that you need to use something called an image context. Below is some code where you draw an image into a context and then get the resulting image out:
UIGraphicsBeginImageContext(CGSizeMake(300,300));
UIImage *image = [UIImage imageNamed:@"breakpoint place.png"];
[image drawAtPoint:CGPointMake(0,0)];
UIImage *image2 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageView *iv = [[UIImageView alloc] initWithImage:image2];
[iv setFrame:CGRectMake(100,100,image2.size.width,image2.size.height)];
[self.view addSubview:iv];
I am developing an iPhone app in which I am capturing the live video.
I have added the AVCaptureVideoPreviewLayer to self.view.layer as a sublayer:
[self.view.layer addSublayer: self.prevLayer];
On the same self.view I have added an image view as a subview:
[self.view addSubview:image];
Now what I want is to capture a still from the live video, but the overlay image should also appear in that picture, just as it looks on the screen.
I have tried taking a screenshot, but it captures only the overlay image and not the live video frame.
Can anybody please help me with this?
Thanks in advance.
Hi, this will help you, as it works for me:
- (void)didTakePicture:(UIImage *)picture
{
    UIImage *frm = self.overlayViewController.imgOverlay.image;

    NSLog(@"width %f height %f", picture.size.width, picture.size.height);

    UIGraphicsBeginImageContext(picture.size);
    [picture drawAtPoint:CGPointMake(0, 0)];
    [frm drawInRect:CGRectMake(0, 0, picture.size.width, picture.size.height)];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSLog(@"view %@", viewImage.debugDescription);

    [self.capturedImages addObject:viewImage];
}
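For completeness, here is a rough sketch of how the picture passed to didTakePicture: could be captured from the capture session; it assumes an already-configured AVCaptureStillImageOutput stored in a property I am calling self.stillImageOutput (that name is not from the original code):

// Sketch only: self.stillImageOutput is an assumed AVCaptureStillImageOutput attached to the
// same session that feeds self.prevLayer.
AVCaptureConnection *connection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                    completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error)
{
    if (sampleBuffer != NULL) {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        UIImage *picture = [UIImage imageWithData:jpegData];
        // Composite the overlay on top of the captured frame, as in didTakePicture: above.
        [self didTakePicture:picture];
    }
}];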
In brief, the app is meant to do the following:
The user will select from the photos within the iPhone (via ALAssets). This part is fine.
The selected photo will appear on another view, within a smaller, rectangular subview.
When using the ALAsset thumbnail, the image shows up correctly within the subview. However, the resolution is poor, so I'm trying to use a higher resolution image.
When using the full resolution image, and then placing it into the view, any portrait photo will be rotated 90 degrees counter-clockwise.
Code-wise, it looks like this:
//ImageView is the UIImageView subview that will hold the selected photo image.
//This code works ok, but I'd rather have a higher resolution photo.
[imageView setClipsToBounds:YES];
[imageView setContentMode:UIViewContentModeScaleAspectFill];
[imageView setImage:[UIImage imageWithCGImage:[selectedPhoto thumbnail]]];
Now, this is the code where I try to use a higher resolution image, but then the image within imageView is rotated -90 degrees:
[imageView setClipsToBounds:YES];
[imageView setContentMode:UIViewContentModeScaleAspectFill];
//HIGH RESOLUTION IMAGE
ALAssetRepresentation *rep = [selectedPhoto defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref)
{
    UIImage *largeImage = [Utilities imageWithImage:[UIImage imageWithCGImage:iref]
                                       scaledToSize:CGSizeMake(CGImageGetWidth(iref) / 4,
                                                               CGImageGetHeight(iref) / 4)];
    [imageView setImage:largeImage];
}
And the imageWithImage:scaledToSize: method called there looks like this:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
As mentioned, photos taken in landscape orientation by the iPhone camera come out OK. However, photos taken in portrait camera orientation come out rotated -90 degrees somehow.
How can I make it so that each photo, whether taken in portrait or landscape, will come out oriented correctly on my ImageView?
Any help would be appreciated! Thank you!
I found the answer here!
ALAssetRepresentation fullResolutionImage's UIImageOrientation is wrong
I am doing this now:
UIImage *largeImage = [[UIImage alloc] initWithCGImage:iref scale:4 orientation:(UIImageOrientation)rep.orientation];
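Putting it together, a sketch of how the earlier block looks with this change (the scale of 4 corresponds to the earlier 1/4 downscale, though it only changes the image's point size rather than resampling; rep.orientation is an ALAssetOrientation, which maps directly onto UIImageOrientation):

ALAssetRepresentation *rep = [selectedPhoto defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref)
{
    // Building the UIImage with the asset's orientation keeps portrait photos upright.
    UIImage *largeImage = [[UIImage alloc] initWithCGImage:iref
                                                     scale:4
                                               orientation:(UIImageOrientation)rep.orientation];
    [imageView setImage:largeImage];
    [largeImage release];
}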