Not able to merge two images into one - iPhone

I'm trying to merge two images into one and save the result to the camera roll, but it just shows a blank image. Can anyone help?
My code:
- (void)SaveFinalImage {
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *savedImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(savedImg, nil, nil, nil);
}

I have used this in my app.
UIImage *bottomImage = [UIImage imageNamed:@"bottom.png"]; // background image
UIImage *image = [UIImage imageNamed:@"top.png"]; // foreground image
CGSize newSize = CGSizeMake(width, height);
UIGraphicsBeginImageContext(newSize);
// Use the existing opacity as is
[bottomImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
// Apply the supplied opacity
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height) blendMode:kCGBlendModeNormal alpha:0.8];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
For more, see my related answer on the same subject: iOS - Merging two images of different size.
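To tie this back to the original question (saving the merged result), here is a minimal sketch, assuming the two images are the same size; the helper name and the completion-selector wiring are illustrative, and passing a callback instead of nil means a failed save is actually reported:

// Hypothetical helper: merge two same-size images, then save to the camera roll
- (void)mergeAndSaveImage:(UIImage *)bottomImage withImage:(UIImage *)topImage {
    UIGraphicsBeginImageContextWithOptions(bottomImage.size, NO, 0.0);
    [bottomImage drawInRect:CGRectMake(0, 0, bottomImage.size.width, bottomImage.size.height)];
    [topImage drawInRect:CGRectMake(0, 0, bottomImage.size.width, bottomImage.size.height)
               blendMode:kCGBlendModeNormal
                   alpha:0.8];
    UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // A callback target/selector (instead of nil, nil) surfaces save errors
    UIImageWriteToSavedPhotosAlbum(merged, self,
                                   @selector(image:didFinishSavingWithError:contextInfo:), NULL);
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) NSLog(@"Save failed: %@", error);
}

If the result still comes out blank, a common cause is a zero-size source (e.g. drawing before layout has happened), so checking bottomImage.size before drawing is worthwhile.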

Blending UIImage with alpha value

How can I blend two images by changing the alpha of only one, so that the upper image is slightly transparent and appears as if drawn onto the background image?
You can do it through the image view's alpha property:
imageView.alpha = 0.5f; // 0.0 is fully transparent, 1.0 is fully opaque
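For example, a minimal sketch of the view-based approach, assuming two image views stacked in the same superview (the file names are illustrative):

UIImageView *bottomView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"bottomImage.png"]];
UIImageView *topView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"topImage.png"]];
topView.frame = bottomView.frame;
topView.alpha = 0.5f; // the upper image is semi-transparent, so the lower one shows through
[self.view addSubview:bottomView];
[self.view addSubview:topView];

Note this only changes what is displayed; it does not produce a merged UIImage.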
To get the blend as a single UIImage instead, detailed code:
UIImage *bottomImage = [UIImage imageNamed:@"bottomImage.png"];
UIImage *topImage = [UIImage imageNamed:@"topImage.png"];
CGSize newSize = CGSizeMake(width, height);
UIGraphicsBeginImageContext(newSize);
// Use the existing opacity as is
[bottomImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
// Apply the supplied opacity
[topImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height) blendMode:kCGBlendModeNormal alpha:0.8];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
See more details here.

How to clip image with another image in CGContext?

I want to get the following result with two images.
Please help me.
To combine two images in an image view, try this:
UIImage *bottomImage = [UIImage imageNamed:@"bottom.png"]; // background image
UIImage *image = [UIImage imageNamed:@"top.png"]; // foreground image
CGSize newSize = CGSizeMake(width, height);
UIGraphicsBeginImageContext(newSize);
// Use the existing opacity as is
[bottomImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
// Apply the supplied opacity
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height) blendMode:kCGBlendModeNormal alpha:0.8];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Then set newImage on your UIImageView.
After @Sumanth's code combines the two images, you need to mask the final image, as described in the how-to-mask-an-image link.
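For the masking step itself, a common sketch (assuming the mask is a grayscale image without an alpha channel; with a CGImageMaskCreate mask, black areas keep pixels and white areas clip them):

- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    // Build a Core Graphics image mask from the grayscale mask image
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef maskedRef = CGImageCreateWithMask(image.CGImage, mask);
    UIImage *result = [UIImage imageWithCGImage:maskedRef];
    CGImageRelease(mask);
    CGImageRelease(maskedRef);
    return result;
}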

How to take a screenshot of a part of an app's window?

I'm trying to save a screenshot of a part of the screen with all views and layers merged. The only way I know how to do this is with the code below.
The code below creates an 80x80 square, but with its top-left corner at (0,0).
How can I cut a smaller rectangle out of a full-window screenshot? For example, I have a 320x480 full-window screenshot, but I only need an 80x80 square centered at (160,240).
UIGraphicsBeginImageContext(smallView.frame.size);
[appDelegate.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Thank you!
After getting the screenshot image, crop out the required part using this:
CGImageRef imageRef = CGImageCreateWithImageInRect([screenshot CGImage], cropRect);
UIImage *result = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
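One caveat worth hedging: CGImageCreateWithImageInRect works in pixel coordinates, while frames are in points. With plain UIGraphicsBeginImageContext the two coincide (scale 1.0), but if the screenshot was captured with UIGraphicsBeginImageContextWithOptions at screen scale, the rect should be scaled first. A sketch, assuming the screenshot and cropRect from above:

CGFloat scale = screenshot.scale;
// Convert the point-based crop rect into pixel coordinates
CGRect pixelRect = CGRectMake(cropRect.origin.x * scale,
                              cropRect.origin.y * scale,
                              cropRect.size.width * scale,
                              cropRect.size.height * scale);
CGImageRef imageRef = CGImageCreateWithImageInRect(screenshot.CGImage, pixelRect);
UIImage *result = [UIImage imageWithCGImage:imageRef
                                      scale:scale
                                orientation:screenshot.imageOrientation];
CGImageRelease(imageRef);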
Untested suggestion...
// Render the full window first, then crop out the square of interest
UIGraphicsBeginImageContext(appDelegate.window.bounds.size);
[appDelegate.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Offset the origin by half the size so the square is centered at (160,240)
CGRect cropRect = CGRectMake(160.0 - smallView.frame.size.width / 2.0,
                             240.0 - smallView.frame.size.height / 2.0,
                             smallView.frame.size.width,
                             smallView.frame.size.height);
CGImageRef croppedImageRef = CGImageCreateWithImageInRect(screenshot.CGImage, cropRect);
UIImage *croppedScreenshot = [UIImage imageWithCGImage:croppedImageRef];
CGImageRelease(croppedImageRef);

UIGraphicsBeginImageContext how to set position?

I am taking a screenshot in my application. I am able to take the screenshot, but now I want to take it by specifying the x and y coordinates, i.e. taking the screenshot starting from y = 40. Is that possible?
UIGraphicsBeginImageContext(screenshot.frame.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Crop the rendered image starting at y = 40, per the question
CGRect cropRect = CGRectMake(0, 40, self.view.bounds.size.width, self.view.bounds.size.height - 40);
UIGraphicsBeginImageContext(cropRect.size);
// Crop from the viewImage rendered above; layer.contents is often nil for
// layers drawn via renderInContext:, so it is not a reliable image source
CGImageRef imageRef = CGImageCreateWithImageInRect(viewImage.CGImage, cropRect);
UIImage *image = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
[image drawInRect:CGRectMake(...)];
UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
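Putting the two steps together, a minimal sketch (the method name and parameter are illustrative; it renders the full view, then crops away everything above the given y offset):

- (UIImage *)screenshotFromY:(CGFloat)yOffset {
    // Step 1: render the whole view into an image
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *full = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Step 2: crop from yOffset downward; plain UIGraphicsBeginImageContext
    // renders at scale 1.0, so points and pixels coincide here
    CGRect cropRect = CGRectMake(0, yOffset, full.size.width, full.size.height - yOffset);
    CGImageRef croppedRef = CGImageCreateWithImageInRect(full.CGImage, cropRect);
    UIImage *result = [UIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef);
    return result;
}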

Merging a stretchable UIImage with a "normal" one

I'd like to combine two UIImages, one stretchable and one "normal" one. The problem is that if I merge the images using a UIGraphics image context, the second image is also stretched (it is on top of the first one, as it should be, but stretched). Does anybody know how to avoid this?
Thanks a lot!
Calls from my view controller:
UIImage *stretchImage = [[UIImage imageNamed:@"stretchableLeft.png"] stretchableImageWithLeftCapWidth:0.0 topCapHeight:16.0];
stretchImage = [self imageWithImage:stretchImage scaledToSize:CGSizeMake(64.0, 64.0)];
stretchImage = [self mergeImageWithImage:stretchImage secondImage:[UIImage imageNamed:@"topImage.png"]]; // only 40x40 px
The two methods are:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}

- (UIImage *)mergeImageWithImage:(UIImage *)image secondImage:(UIImage *)image2
{
    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    [image2 drawInRect:CGRectMake(10, 10, image.size.width, image.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
I think the issue is that you're asking both images to draw into the full rectangle, which is what causes your second image to stretch.
Try using image2.size for image2 when merging the images. You'll have to adjust the placement via the x/y coordinates when building the rectangle.
- (UIImage *)mergeImageWithImage:(UIImage *)image secondImage:(UIImage *)image2
{
    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    // Draw image2 at its own size so it is not stretched
    [image2 drawInRect:CGRectMake(10, 10, image2.size.width, image2.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
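As a follow-up on adjusting placement: if the smaller image should be centered on the larger one rather than pinned at (10,10), a variant of the draw call (offsets computed from the two sizes):

// Center image2 over image instead of drawing at a fixed (10,10)
CGPoint origin = CGPointMake((image.size.width - image2.size.width) / 2.0,
                             (image.size.height - image2.size.height) / 2.0);
[image2 drawInRect:CGRectMake(origin.x, origin.y, image2.size.width, image2.size.height)
         blendMode:kCGBlendModeNormal
             alpha:1.0];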