How can I save a stretched image?
I have a small (50 x 50) image that gets stretched while being displayed in a UIImageView (its content mode is set to scale to fit). Now I need to fetch and save that stretched image.
I have tried the UIImageView's image property, but that gives me the original image, not the stretched one.
So, do you have any idea how to solve this problem? If so, please help me, guide me, or at least give me a hint.
Hoping for your reply.
Thanks
You have to create an image by copying the bitmap of the view's CGContext (Core Graphics context). This has been asked on Stack Overflow before, but I can't find the answer right now.
You need to render your UIImageView into a graphics context that you create:
UIGraphicsBeginImageContext(yourUIImageView.frame.size);
[yourUIImageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//get the PNG representation of the UIImage (but you could do the same with JPEG)
NSData *myData = UIImagePNGRepresentation(screenshot);
//then save it to a file
[myData writeToFile:thePathToWriteYourFileAsNSString atomically:YES];
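Note that -renderInContext: is declared in QuartzCore, so you need that framework imported. As a minimal sketch (the method name and the "stretched.png" file name below are just examples, not anything from your project), you could wrap it up and write the result into the Documents directory:
#import <QuartzCore/QuartzCore.h> // needed for -[CALayer renderInContext:]

// Minimal sketch: capture the image view exactly as it is displayed
// (i.e. stretched) and save it as a PNG in the Documents directory.
- (void)saveStretchedImageFromImageView:(UIImageView *)imageView
{
    UIGraphicsBeginImageContext(imageView.frame.size);
    [imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *filePath = [documentsDirectory stringByAppendingPathComponent:@"stretched.png"];
    [UIImagePNGRepresentation(screenshot) writeToFile:filePath atomically:YES];
}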
Related
In my app I have one transparent overlay image. When the user selects an image from the photo library, that image has to be displayed together with the transparent image and merged into a single UIImage, so that the user can mail or share it. I have used the following code; however, the image is not coming out correctly over the transparent image.
UIImage *backgroundImage = [UIImage imageNamed:@"iPhoneOverLay.png"];
UIGraphicsBeginImageContext(backgroundImage.size);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
[testImage drawInRect:CGRectMake(backgroundImage.size.width - testImage.size.width, backgroundImage.size.height - testImage.size.height, testImage.size.width, testImage.size.height)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Here, testImage is the image selected from the photo library or taken with the camera.
You draw the two images in the wrong order. Transparency only lets whatever was painted previously show through. Therefore you have to draw the (opaque) photo first and the (transparent) overlay second.
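A sketch of the corrected order, keeping the variable names from the snippet above (only the two drawInRect: calls are swapped):
UIImage *backgroundImage = [UIImage imageNamed:@"iPhoneOverLay.png"];
UIGraphicsBeginImageContext(backgroundImage.size);
// draw the (opaque) photo first...
[testImage drawInRect:CGRectMake(backgroundImage.size.width - testImage.size.width, backgroundImage.size.height - testImage.size.height, testImage.size.width, testImage.size.height)];
// ...then the (transparent) overlay on top of it
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();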
I load data in a table view; each row has a simple 500x300 px picture that gets resized to the size of the cell.
Is it possible to resize the images?
tableview.m
NSString *filePath = [[NSBundle mainBundle] pathForResource:receip.image ofType:@"jpg"];
UIImage *image = [UIImage imageWithContentsOfFile:filePath];
cell.imageView.image = image;
I assume you mean resize them before putting them in the cell to improve performance?
You can do this with core graphics as follows:
//define desired size
CGSize size = CGSizeMake(100.0f, 100.0f);
//create drawing context
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0f);
//draw image at new size
[image drawInRect:CGRectMake(0.0f, 0.0f, size.width, size.height)];
//capture resultant image
UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Be warned, though, that this is a computationally expensive task. You'll probably want to resize all your images in advance, possibly on a background thread, and store the resized versions in an array, as sketched below. You definitely do not want to stick the code above in your cellForRowAtIndexPath: method, or your table scrolling performance will be diabolical.
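As a rough sketch of that approach (assuming iOS 4 or later, where the UIGraphics context functions may be called from a background thread), you could pre-resize off the main thread and hand the thumbnail back to the main thread when it is ready; image and the caching/reload step are assumptions about your own code:
CGSize size = CGSizeMake(100.0f, 100.0f);
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // resize on a background queue so the table keeps scrolling smoothly
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0f);
    [image drawInRect:CGRectMake(0.0f, 0.0f, size.width, size.height)];
    UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    dispatch_async(dispatch_get_main_queue(), ^{
        // store resizedImage in your cache (array/dictionary) and
        // reload the affected row here
    });
});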
If those images come from an external source (say, the web), it's better to first cache the original file and then resize it (as explained by @Nick Lockwood); remember that Retina displays need twice the width/height for hi-res display. After that process is complete you can send a signal that the "download" is finished and carry on showing the UITableView.
If you already have the images in your app bundle, I suggest you make a thumbnail version of each one and add it to your bundle. For better performance, of course.
I'm downloading a JPEG picture from a web server and loading it into a UIImage.
If I display the UIImage in a UIImageView directly, I see the picture correctly.
But if I cache the image to a file with:
[UIImageJPEGRepresentation(image, 1.0f) writeToFile:sFilePath atomically:YES];
and load it with:
image = [UIImage imageWithContentsOfFile:sFilePath];
and display it in the same UIImageView, I can see white stripes on the sides of the picture.
Again, it is exactly the same UIImageView object with exactly the same property settings.
Why is that?
You can simply write the NSData you loaded from the web straight to a file, without going through the UIImageJPEGRepresentation routine (which re-encodes the image).
[dataObject writeToURL:fileURL atomically:YES];
and to retrieve
UIImage *image = [[UIImage alloc] initWithContentsOfFile:[fileURL path]];
That works really well in my app.
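For example, a minimal sketch of the whole round trip, assuming the downloaded bytes are already in an NSData called imageData (the cache path, file name and imageView are just placeholders):
// cache the original, untouched JPEG bytes
NSString *cachesDirectory = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSURL *fileURL = [NSURL fileURLWithPath:[cachesDirectory stringByAppendingPathComponent:@"picture.jpg"]];
[imageData writeToURL:fileURL atomically:YES];

// later: load them back without any re-encoding
UIImage *image = [[UIImage alloc] initWithContentsOfFile:[fileURL path]];
imageView.image = image;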
I have an issue with masking images. I am making a puzzle game and have to produce custom-shaped images. I found and tried two ways of custom cropping:
1. Using the CALayer.mask property.
2. Using CGImage masking.
In the first option I create my custom path, assign it to the CAShapeLayer.path property, and then assign the CAShapeLayer to the CALayer.mask property. At the end I have a custom-cropped image.
In the second option I first use the CGImageMaskCreate() function (with previously created black mask images of the puzzle pieces), then CGContextClipToMask().
With either option I have a performance problem (mostly when I crop the image into 16 puzzle pieces and drag them over the screen).
Are there any other approaches to cropping an image in a custom way?
(I don't know how to solve the performance problem.)
Thanks in advance.
There are lots of UIImage categories out there that you can use for this. Give me a moment and I'll post some links here:
Cropping an UIImage (not really a category though, but it'll fit)
UIImage: Resize, then Crop
https://sites.google.com/a/injoit.com/knowledge-base/for-developers/graphics/uiimage-routines-scaling-cropping-rotating-etc
http://www.hive05.com/2008/11/crop-an-image-using-the-iphone-sdk/
http://maybelost.com/2010/11/cropping-a-uiimage-on-iphone/
Try this:
-(UIImage *)imageByCropping:(UIImage *)imageToCrop toRect:(CGRect)rect
{
CGImageRef imageRef = CGImageCreateWithImageInRect([imageToCrop CGImage], rect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return cropped;
}
...
UIImage *temp_image = [self imageByCropping:original_image toRect:clipping_rectangle];
Maybe you should consider drawing the image into a new image with an alpha (transparent) background and then drawing that over the current background. I mean: all pixels inside the jigsaw piece keep their normal colour, and all pixels outside the jigsaw piece are transparent. Then blend that piece onto, or draw it over, the new background.
Just my 2 cents. :)
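A sketch of that idea, assuming you already have a UIBezierPath describing one piece in the image's coordinate space (piecePath and sourceImage below are placeholders): render each piece once into its own transparent UIImage, so dragging only moves a small pre-cropped bitmap instead of re-applying a mask every frame.
// Bake one puzzle piece into its own image with a transparent background.
CGRect pieceBounds = piecePath.bounds;
UIGraphicsBeginImageContextWithOptions(pieceBounds.size, NO, 0.0f); // NO = keep alpha
CGContextRef context = UIGraphicsGetCurrentContext();

// shift the origin so the piece's bounds map onto the new image
CGContextTranslateCTM(context, -pieceBounds.origin.x, -pieceBounds.origin.y);

// everything outside the piece's path stays transparent
[piecePath addClip];
[sourceImage drawAtPoint:CGPointZero];

UIImage *pieceImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();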
I have an app that needs to display a thumbnail list of images (from a folder) in a UITableView.
My approach is to create the thumbnail of an image when the image is added to the folder.
CGSize itemSize = CGSizeMake(100, 100);
UIGraphicsBeginImageContext(itemSize);
[image drawInRect:CGRectMake(0, 0,100, 100)];
image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData1 = [NSData dataWithData:UIImagePNGRepresentation(image)];
NSMutableString *sss1 = [[NSMutableString alloc] initWithString:folderPath];
[sss1 appendString:thumbnailIdString];
[sss1 appendString:fileName];
[imageData1 writeToFile:sss1 atomically:NO];
[sss1 release];
Then the app displays the resized thumbnail image in the UITableView.
It DOES work, but the performance is not perfect.
It has to load the large image and rewrite the thumbnails to the folder.
Is there any better solution? I checked Three20, but I am not sure whether it can do this.
Any comments are welcome.
Thanks
interdev
You can load and resize the image on a worker thread and, when the image is ready, show it on the main thread. To do that you need a thread-safe way to resize the image; UIGraphicsBeginImageContext() and UIGraphicsEndImageContext() should only be run on the main thread.
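One thread-safe way to do the resize off the main thread is to drop down to a plain Core Graphics bitmap context instead of the UIGraphics helpers. A minimal sketch, assuming largeImage is the full-size UIImage and 100x100 is the thumbnail size you want:
// Runs safely on a worker thread: resize with Core Graphics only.
CGImageRef sourceRef = largeImage.CGImage;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(NULL, 100, 100, 8, 0, colorSpace,
                                             (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);

CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
CGContextDrawImage(context, CGRectMake(0, 0, 100, 100), sourceRef);

CGImageRef thumbRef = CGBitmapContextCreateImage(context);
UIImage *thumbnail = [UIImage imageWithCGImage:thumbRef];
CGImageRelease(thumbRef);
CGContextRelease(context);
// hand `thumbnail` back to the main thread (e.g. via performSelectorOnMainThread:) to display it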