Received memory warning while stitching images - iPhone

I am developing an app in which I have to stitch multiple images together, one by one, whenever the user takes a picture with the camera.
This is what I am using to merge two images:
[self performSelector:@selector(joinImages:secondImage:) withObject:firstimage withObject:imageCaptured];
- (UIImage *)joinImages:(UIImage *)im1 secondImage:(UIImage *)im2
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Joins 2 UIImages together, stitching them horizontally
    CGSize size = CGSizeMake(im1.size.width + im2.size.width, im2.size.height);
    UIGraphicsBeginImageContext(size);

    CGPoint image1Point = CGPointMake(0, 0);
    [im1 drawAtPoint:image1Point];

    CGPoint image2Point = CGPointMake(im1.size.width, 0);
    [im2 drawAtPoint:image2Point];

    UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    firstimage = finalImage; // final image is updated every time
    [pool release];
    return finalImage;
}
But I receive a memory warning when I run this on an iPhone, while it works fine on an iPod touch.
The images are also cropped on the iPhone.
Is there anything I can do to resolve this problem?
Thanks.

You could do something like this.
- (UIImage *)joinImages:(UIImage *)im1 secondImage:(UIImage *)im2
{
    @autoreleasepool
    {
        // Joins 2 UIImages together, stitching them horizontally
        CGSize size = CGSizeMake(im1.size.width + im2.size.width, im2.size.height);
        UIGraphicsBeginImageContext(size);

        CGPoint image1Point = CGPointMake(0, 0);
        [im1 drawAtPoint:image1Point];

        CGPoint image2Point = CGPointMake(im1.size.width, 0);
        [im2 drawAtPoint:image2Point];

        UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        firstimage = finalImage; // final image is updated every time
        return finalImage;
    }
}
Hope this will work for you.
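If the warnings persist, you can also drain a pool around the whole per-capture block at the call site, so everything autoreleased while stitching one picture is released before the next one arrives. A minimal sketch under ARC, assuming the same firstimage / imageCaptured variables as in the question:

@autoreleasepool {
    // Everything autoreleased while stitching this capture is drained right here.
    firstimage = [self joinImages:firstimage secondImage:imageCaptured];
}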

[self joinImages:firstimage secondImage:secondImage];
- (UIImage *)joinImages:(UIImage *)im1 secondImage:(UIImage *)im2
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Joins 2 UIImages together, stitching them horizontally
    CGSize size = CGSizeMake(im1.size.width + im2.size.width, im2.size.height);
    UIGraphicsBeginImageContext(size);

    CGPoint image1Point = CGPointMake(0, 0);
    [im1 drawAtPoint:image1Point];

    CGPoint image2Point = CGPointMake(im1.size.width, 0);
    [im2 drawAtPoint:image2Point];

    UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    firstimage = finalImage; // final image is updated every time
    [pool release];
    return finalImage;
}

Related

Stuttering when Drawing to Image in GCD Block

I have the following method which takes some CAShapeLayers and converts them into a UIImage. The UIImage is used in a cell for a UITableView (much like the photos app when you select a photo from one of your libraries). This method is called from within a GCD block:
-(UIImage *)imageAtIndex:(NSUInteger)index
{
    Graphic *graphic = [[Graphic graphicWithType:index] retain];
    CALayer *layer = [[CALayer alloc] init];

    layer.bounds = CGRectMake(0, 0, graphic.size.width, graphic.size.height);
    layer.shouldRasterize = YES;
    layer.anchorPoint = CGPointZero;
    layer.position = CGPointMake(0, 0);

    for (int i = 0; i < [graphic.shapeLayers count]; i++)
    {
        [layer addSublayer:[graphic.shapeLayers objectAtIndex:i]];
    }

    CGFloat largestDimension = MAX(graphic.size.width, graphic.size.height);
    CGFloat maxDimension = self.thumbnailDimension;
    CGFloat multiplicationFactor = maxDimension / largestDimension;
    CGSize graphicThumbnailSize = CGSizeMake(multiplicationFactor * graphic.size.width,
                                             multiplicationFactor * graphic.size.height);

    layer.sublayerTransform = CATransform3DScale(layer.sublayerTransform,
                                                 graphicThumbnailSize.width / graphic.size.width,
                                                 graphicThumbnailSize.height / graphic.size.height,
                                                 1);
    layer.bounds = CGRectMake(0, 0, graphicThumbnailSize.width, graphicThumbnailSize.height);

    UIGraphicsBeginImageContextWithOptions(layer.bounds.size, NO, 0);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = [UIGraphicsGetImageFromCurrentImageContext() retain];
    UIGraphicsEndImageContext();

    [layer release];
    [graphic release];

    return [image autorelease];
}
For whatever reason, when I'm scrolling the UITableView and loading the images in, it stutters a little bit. I know the GCD code is fine because it has worked previously, so it appears something in this code is causing the stuttering. Does anyone know what that could be? Is CAAnimation not thread safe? Or does anyone know a better way to take a bunch of CAShapeLayers and convert them into a UIImage?
In the end I believe that:
[layer renderInContext:UIGraphicsGetCurrentContext()];
cannot be done on a separate thread, so I had to do the following:
UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
CGContextRef context = UIGraphicsGetCurrentContext();
//draw the mutable paths of the CAShapeLayers to the context
UIImage *image = [UIGraphicsGetImageFromCurrentImageContext() retain];
UIGraphicsEndImageContext();
There is a great example of this (where I learned to do it) in the WWDC 2012 video "Building Concurrent User Interfaces on iOS".
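For illustration, a minimal sketch of what that "draw the mutable paths" comment could expand into, assuming each object in graphic.shapeLayers is a CAShapeLayer whose path, fillColor, strokeColor, and lineWidth you can read (names taken from the question; adapt to your own model):

UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
CGContextRef context = UIGraphicsGetCurrentContext();

for (CAShapeLayer *shapeLayer in graphic.shapeLayers) {
    // Re-draw each layer's path by hand instead of calling renderInContext: off the main thread.
    CGContextAddPath(context, shapeLayer.path);
    if (shapeLayer.fillColor) {
        CGContextSetFillColorWithColor(context, shapeLayer.fillColor);
    }
    if (shapeLayer.strokeColor) {
        CGContextSetStrokeColorWithColor(context, shapeLayer.strokeColor);
        CGContextSetLineWidth(context, shapeLayer.lineWidth);
    }
    CGContextDrawPath(context, shapeLayer.strokeColor ? kCGPathFillStroke : kCGPathFill);
}

UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

This ignores per-layer transforms and the more exotic CAShapeLayer properties, but it is enough to build the thumbnail entirely on the background queue.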

Trying to overlap two images and show the overlapped result in a third image

I am trying to overlap two local images and show the overlapped result in a third image.
I am using this code, but the simulator shows nothing.
- (void)viewDidLoad
{
    [super viewDidLoad];
    image1 = [[UIImage alloc] init];
    image1 = [UIImage imageNamed:@"iphone.png"];
    imageA = [[UIImageView alloc] initWithImage:image1];
    [self merge];
}

-(void)merge
{
    CGSize size = CGSizeMake(320, 480);
    UIGraphicsBeginImageContext(size);

    CGPoint thumbPoint = CGPointMake(0, 0);
    imageview.image = imageA.image;
    [imageA.image drawAtPoint:thumbPoint];

    imageB = [[UIImage alloc] init];
    imageB = [UIImage imageNamed:@"Favorites.png"];
    CGPoint starredPoint = CGPointMake(0, 0);
    [imageB drawAtPoint:starredPoint];

    UIImage *imageC = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    imageview.image = imageC;
    [self.view addSubview:imageview];
}
I can't figure out where I am making the mistake.
Any help would be appreciated.
Remove all the code from everywhere except the code below in merge:
-(void)merge
{
    CGSize size = CGSizeMake(320, 480);
    UIGraphicsBeginImageContext(size);

    CGPoint point1 = CGPointMake(0, 0);
    // The second point has to be somewhere different from the first point;
    // otherwise the second image will sit exactly on top of the first image,
    // and you won't even be able to tell that two images are there.
    CGPoint point2 = CGPointMake(100, 100);

    UIImage *imageOne = [UIImage imageNamed:@"Image1.png"];
    [imageOne drawAtPoint:point1];

    UIImage *imageTwo = [UIImage imageNamed:@"Image2.png"];
    // If you want the top image to be blended, you can do something like this:
    // [imageTwo drawAtPoint:point2 blendMode:kCGBlendModeMultiply alpha:0.5];
    [imageTwo drawAtPoint:point2];

    UIImage *imageC = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    UIImageView *iv = [[UIImageView alloc] initWithFrame:CGRectMake(100, 100, 200, 200)];
    iv.image = imageC;
    [self.view addSubview:iv];
}
Here's a general-purpose "merge" function written as a UIImage category; it allows image overlay/underlay:
http://saveme-dot-txt.blogspot.com/2011/06/merge-image-function.html
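In case the link goes dead, here is a rough sketch of what such a category can look like (the method name and exact behavior are my own guess, not the code from that post):

@interface UIImage (Merge)
- (UIImage *)imageByOverlayingImage:(UIImage *)overlay atPoint:(CGPoint)point;
@end

@implementation UIImage (Merge)
- (UIImage *)imageByOverlayingImage:(UIImage *)overlay atPoint:(CGPoint)point
{
    // Draw the receiver first (the "underlay"), then the overlay on top of it.
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawAtPoint:CGPointZero];
    [overlay drawAtPoint:point];
    UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return merged;
}
@end

Swapping the two drawAtPoint: calls gives you the underlay behaviour instead.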

Taking a picture from the camera and show it in a UIImageView

I have a view with some fields (name, price, category) and a segmented control, plus a button to take a picture.
If I try this on the simulator (no camera) it works properly: I can select the image from the camera roll, edit it, and go back to the view, which will show all the fields with their contents.
But on my iPhone, when I select the image after editing and go back to the view, all the fields are empty except for the UIImageView. I also tried to save the contents of the fields in variables and put them back in the viewWillAppear method, but the app crashes.
I am starting to think that maybe there is something wrong with the methods below.
EDIT
I found the solution here. I defined a new method on the UIImage class (follow the link for more information). Then I worked on the frame of the UIImageView so that it adapts to the new dimensions, in landscape or portrait.
-(IBAction)takePhoto:(id)sender {
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        self.imgPicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        self.imgPicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;
    } else {
        imgPicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    }
    [self presentModalViewController:self.imgPicker animated:YES];
}

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissModalViewControllerAnimated:YES];

    NSDate *date = [NSDate date];
    NSString *photoName = [dateFormatter stringFromDate:date];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    imagePath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.png", photoName]];

    UIImage *picture = [info objectForKey:UIImagePickerControllerOriginalImage];

    // ---------RESIZE CODE--------- //
    if (picture.size.width == 1936) {
        picture = [picture scaleToSize:CGSizeMake(480.0f, 720.0f)];
    } else {
        picture = [picture scaleToSize:CGSizeMake(720.0f, 480.0f)];
    }
    // --------END RESIZE CODE-------- //

    photoPreview.image = picture;

    // ---------FRAME CODE--------- //
    photoPreview.contentMode = UIViewContentModeScaleAspectFit;
    CGRect frame = photoPreview.frame;
    if (picture.size.width == 480) {
        frame.size.width = 111.3;
        frame.size.height = 167;
    } else {
        frame.size.width = 167;
        frame.size.height = 111.3;
    }
    photoPreview.frame = frame;
    // --------END FRAME CODE-------- //

    NSData *webData = UIImagePNGRepresentation(picture);
    CGImageRelease([picture CGImage]);
    [webData writeToFile:imagePath atomically:YES];
    imgPicker = nil;
}
Now I have a new issue! If I take a picture in landscape and then try to take another one in portrait, the app crashes. Do I have to release something?
I had the same issue; there is no edited image when using the camera, so you must use the original image:
originalimage = [editingInfo objectForKey:UIImagePickerControllerOriginalImage];
if ([editingInfo objectForKey:UIImagePickerControllerMediaMetadata]) {
    // test to check that the camera was used
    // in particular, I found out that you then have to rotate the photo
    ...
If it was cropped when using the album, you have to re-crop it, of course:
if ([editingInfo objectForKey:UIImagePickerControllerCropRect] != nil) {
    CGRect cropRect = [[editingInfo objectForKey:UIImagePickerControllerCropRect] CGRectValue];
    CGImageRef imageRef = CGImageCreateWithImageInRect([originalimage CGImage], cropRect);
    chosenimage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
} else {
    chosenimage = originalimage;
}
The crop rect info is also present in camera mode; you need to decide how you want it to behave.
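Putting those pieces together, a rough sketch of the whole delegate method could look like this (variable names are illustrative, and the rotation handling hinted at above is deliberately left out):

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *originalimage = [info objectForKey:UIImagePickerControllerOriginalImage];
    UIImage *chosenimage = originalimage;

    if ([info objectForKey:UIImagePickerControllerMediaMetadata]) {
        // The camera was used: there is no edited image, and the photo may still
        // need to be rotated according to its imageOrientation before you use it.
    }
    else if ([info objectForKey:UIImagePickerControllerCropRect] != nil) {
        // An edited album image: re-apply the crop rect to the original.
        CGRect cropRect = [[info objectForKey:UIImagePickerControllerCropRect] CGRectValue];
        CGImageRef imageRef = CGImageCreateWithImageInRect([originalimage CGImage], cropRect);
        chosenimage = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
    }

    photoPreview.image = chosenimage; // hypothetical UIImageView outlet
    [picker dismissModalViewControllerAnimated:YES];
}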
To crop the image, I think this may help you:
UIImage *croppedImage = [self imageByCropping:photo.image toRect:tempview.frame];

CGSize size = CGSizeMake(croppedImage.size.height, croppedImage.size.width);
UIGraphicsBeginImageContext(size);

CGPoint pointImg1 = CGPointMake(0, 0);
[croppedImage drawAtPoint:pointImg1];

[[UIImage imageNamed:appDelegete.strImage] drawInRect:CGRectMake(0, 532, 150, 80)];

UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
croppedImage = result;

UIImageView *mainImageView = [[UIImageView alloc] initWithImage:croppedImage];
CGRect clippedRect = CGRectMake(0, 0, croppedImage.size.width, croppedImage.size.height);
CGFloat scaleFactor = 0.5;

UIGraphicsBeginImageContext(CGSizeMake(croppedImage.size.width * scaleFactor, croppedImage.size.height * scaleFactor));
CGContextRef currentContext = UIGraphicsGetCurrentContext();
CGContextClipToRect(currentContext, clippedRect);

// This will automatically scale any CGImage down/up to the required thumbnail side (length)
// when the CGImage gets drawn into the context on the next line of code.
CGContextScaleCTM(currentContext, scaleFactor, scaleFactor);
[mainImageView.layer renderInContext:currentContext];

appDelegete.appphoto = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Why some UIImages don't show up on the iPhone

Hi, I am currently developing a small app on iOS 4.3, using Objective-C.
As part of the app I need to manipulate an image that I have downloaded from the web.
The following code produces a missing image:
(The original is in a class, but I put this together as a test scenario so that it could be easily copy-pasted.)
- (void)viewDidLoad
{
    [super viewDidLoad];
    [self loadImage:@"http://www.night-net.net/images/ms/microsoft_vista_home_basic.jpg"];
    [self getCroped:CGRectMake(10, 50, 80, 160)];
    [self getCroped:CGRectMake(90, 50, 80, 80)];
    [self getCroped:CGRectMake(90, 130, 40, 80)];
    [self getCroped:CGRectMake(130, 130, 40, 40)];
    [self getCroped:CGRectMake(130, 170, 40, 40)];
}

-(void)loadImage:(NSString *)url
{
    _data = [NSData dataWithContentsOfURL:[NSURL URLWithString:url]];
}

-(UIImageView *)getCroped:(CGRect)imageSize {
    UIImage *temp = [[UIImage alloc] initWithData:_data];
    UIImage *myImage = [self resizedImage:temp and:CGSizeMake(160, 160) interpolationQuality:kCGInterpolationHigh];
    UIImage *image = [self croppedImage:myImage and:imageSize];

    UIImageView *imageView = [[UIImageView alloc] init];
    imageView.image = image;
    imageView.frame = imageSize;
    [[self view] addSubview:imageView];
    return imageView;
}

- (UIImage *)croppedImage:(UIImage *)image and:(CGRect)bounds {
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], bounds);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return croppedImage;
}

- (UIImage *)resizedImage:(UIImage *)image and:(CGSize)newSize interpolationQuality:(CGInterpolationQuality)quality {
    BOOL drawTransposed = NO;
    return [self resizedImage:image
                          and:newSize
                    transform:[self transformForOrientation:newSize]
               drawTransposed:drawTransposed
         interpolationQuality:quality];
}

// Returns a copy of the image that has been transformed using the given affine transform and scaled to the new size.
// The new image's orientation will be UIImageOrientationUp, regardless of the current image's orientation.
// If the new size is not integral, it will be rounded up.
- (UIImage *)resizedImage:(UIImage *)image and:(CGSize)newSize
                transform:(CGAffineTransform)transform
           drawTransposed:(BOOL)transpose
     interpolationQuality:(CGInterpolationQuality)quality {
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGRect transposedRect = CGRectMake(0, 0, newRect.size.height, newRect.size.width);
    CGImageRef imageRef = image.CGImage;

    // Build a context that's the same dimensions as the new size
    CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                                newRect.size.width,
                                                newRect.size.height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                0,
                                                CGImageGetColorSpace(imageRef),
                                                CGImageGetBitmapInfo(imageRef));

    // Rotate and/or flip the image if required by its orientation
    CGContextConcatCTM(bitmap, transform);

    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(bitmap, quality);

    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, transpose ? transposedRect : newRect, imageRef);

    // Get the resized image from the context as a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];

    // Clean up
    CGContextRelease(bitmap);
    CGImageRelease(newImageRef);

    return newImage;
}

// Returns an affine transform that takes into account the image orientation when drawing a scaled image
- (CGAffineTransform)transformForOrientation:(CGSize)newSize {
    CGAffineTransform transform = CGAffineTransformIdentity;
    transform = CGAffineTransformTranslate(transform, newSize.width, 0);
    transform = CGAffineTransformScale(transform, -1, 1);
    return transform;
}
At first I thought this was caused by a lack of memory, but I have tested for that and it doesn't seem to be the problem. Thanks in advance, Ofir.
I've had issues in the past with images not appearing within UIWebViews if they contain unicode characters in the filename. I wonder if this might be the same thing. Try renaming your image?
Doing this should be possible and low in memory cost: I ran the same test using Flash to create an iPhone app that does the same thing, and it works.
But I would much prefer using Objective-C, so the question still stands.

Should repeated use of the camera crash an app?

I have an app that builds a slideshow from user images. They can grab from their library or take a picture. I have found that repeated use of grabbing an image from the library is fine. But repeated use of taking a picture causes erratic behavior. I have been getting crashes but mostly what happens seems to be a reloading of the view after "didFinishPickingMediaWithInfo", which messes things up.
I have no leaks, and it seems to be releasing properly after each picture is taken. I am resizing the image and saving it in a database. Is anyone else running into this situation? Was the camera not designed to be called this often?
Sorry for the mix-up. It's not continuous: take a picture, remove and release the camera, then the user selects choice A, B, or C (they are prompted throughout to make selections on many things), then they can take another picture, remove and release the camera, etc. This happens several times while they complete all the data entry.
After they select the camera I call this code. I am not releasing the image picker in the dealloc method.
- (void)openCamera {
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
    {
        imagePicker = [[UIImagePickerController alloc] init];
        imagePicker.delegate = self;
        imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        [self presentModalViewController:imagePicker animated:YES];
        [imagePicker release];
    }
}
After they "USE" a picture taken from the camera, I call the following code. It resizes the image based on the original size of the image.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *tempCameraImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    [picker dismissModalViewControllerAnimated:YES];

    CGFloat originalSize = tempCameraImage.size.width * tempCameraImage.size.height;
    NSLog(@"Original Size %f", originalSize);

    if (originalSize > 2500000.0) {
        CGSize size = tempCameraImage.size;
        CGRect rect = CGRectMake(0.0, 0.0, .23 * size.width, .23 * size.height);
        UIGraphicsBeginImageContext(rect.size);
        [tempCameraImage drawInRect:rect];
        theImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        CGFloat totalSize = theImage.size.width * theImage.size.height;
        NSLog(@"Final Camera Size %f", totalSize);
        [self resizeImageCamera];
        return;
    }
    else {
        CGSize size = tempCameraImage.size;
        CGRect rect = CGRectMake(0.0, 0.0, .27 * size.width, .27 * size.height);
        UIGraphicsBeginImageContext(rect.size);
        [tempCameraImage drawInRect:rect];
        theImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        CGFloat totalSize = theImage.size.width * theImage.size.height;
        NSLog(@"Final Camera Size %f", totalSize);
        [self resizeImageCamera];
        return;
    }
}

-(void)resizeImageCamera {
    if (editingImage1) {
        NSManagedObject *image = [NSEntityDescription insertNewObjectForEntityForName:@"Image" inManagedObjectContext:slideshow.managedObjectContext];
        slideshow.image = image;
        [image setValue:theImage forKey:@"image"];

        CGSize size = theImage.size;
        CGRect rect = CGRectMake(0.0, 0.0, .15 * size.width, .15 * size.height);
        UIGraphicsBeginImageContext(rect.size);
        [theImage drawInRect:rect];
        slideshow.thumbnailImage1 = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        [self finishCameraImage];
        return;
    }
}

-(void)finishCameraImage {
    if (editingImage1) {
        keyInt = @"2";
        [editedObject setValue:keyInt forKey:@"script1"];
        pictureView.alpha = 0;
        self.navigationItem.rightBarButtonItem = nil;

        editedFieldKey = @"line2Int";
        editedFieldName = NSLocalizedString(@"line2Int", @"display name for line2");
        self.title = editedFieldName;
        linePicker.hidden = NO;
    }
}
I realize I am doing a lot to this image. If I put in a couple of delays, would that help?