On the click event of a button:
[mybutton addTarget:self action:@selector(captureView)
forControlEvents:UIControlEventTouchUpInside];
- (void)captureView {
    UIGraphicsBeginImageContext(CGSizeMake(320, 480));
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.view.layer renderInContext:context];
    UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSLog(@"%@", screenShot);
}
My screenshot prints <UIImage: 0x4b249c0>. Is this the correct code to take a screenshot of a particular area of an iPhone app? Where is this image stored at that point, and how can I see those images?
The UIImage is just an in-memory representation of the screenshot you've taken. You will need to write it out somewhere if you want to save it. For example, the following code will write that image to the user's photo library:
UIImageWriteToSavedPhotosAlbum(screenShot, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);
Remember to retain the image and release it within your -image:didFinishSavingWithError:contextInfo:
callback method.
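A minimal sketch of that callback, assuming you retained the image before starting the save, might look like this:
// Hedged sketch of the save-completion callback. Assumes [screenShot retain]
// (or similar) was called before UIImageWriteToSavedPhotosAlbum.
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Error saving screenshot: %@", error);
    }
    [image release]; // balance the earlier retain
}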
You can also manually save out an image as a PNG or JPEG using code like the following:
NSData *imageData = UIImagePNGRepresentation(screenShot);
[imageData writeToFile:pathToSaveImage atomically:YES];
Here your image is only in memory; you need to save it somewhere, for example in the app's Documents directory (the app bundle itself is read-only at runtime).
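As a hedged sketch, pathToSaveImage from the snippet above could point into the Documents directory; the file name here is purely illustrative:
// Hedged sketch: build a writable path in the app's Documents directory.
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *pathToSaveImage = [documentsDir stringByAppendingPathComponent:@"screenshot.png"]; // example file name
NSData *imageData = UIImagePNGRepresentation(screenShot);
[imageData writeToFile:pathToSaveImage atomically:YES];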
I'm able to crop a view with this code
- (UIImage *)captureScreenInRect:(CGRect)captureFrame {
    CALayer *layer = self.view.layer;
    UIGraphicsBeginImageContext(self.view.bounds.size);
    CGContextClipToRect(UIGraphicsGetCurrentContext(), captureFrame);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenImage;
}
But I have an image view zoomed in with a transform, and the capture isn't shown to scale.
How do I capture EXACTLY what the user sees on the screen?
The Stack Overflow question "renderInContext:" and CATransform3D has more info, but the gist is:
QCCompositionLayer, CAOpenGLLayer, and QTMovieLayer layers are not rendered. Additionally, layers that use 3D transforms are not rendered, nor are layers that specify backgroundFilters, filters, compositingFilter, or a mask values.
(from the CALayer docs).
More info is also available in this technical Q&A: http://developer.apple.com/library/ios/#qa/qa1703/_index.html
If your app is not going to the App Store, you can use the undocumented UIGetScreenImage API:
// Define at top of implementation file
CGImageRef UIGetScreenImage(void);
...
- (void)buttonPressed:(UIButton *)button
{
    // Capture the screen
    CGImageRef screen = UIGetScreenImage();
    UIImage *image = [UIImage imageWithCGImage:screen];
    CGImageRelease(screen);

    // Save the captured image to the photo album
    UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}
(from John Muchow)
However, using this API will get your app rejected during review.
I have been unable to find any other workarounds.
I want to build an app where the user can add a frame (a PNG file with transparency) to another image (maybe from the library) to make a new image, and then save it to the library or upload it to Facebook.
So, my question is: how do I make an image and its frame into a single image? Thanks for your help!
Maybe this is helpful for you. I think you want to merge the images and crop as well; this works fine for me:
CGSize newSize = CGSizeMake(190, 190);
UIGraphicsBeginImageContext(newSize);
// Draw the base image first, then the overlay on top of it
[imageView.image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
// Note: CGRectMake takes (x, y, width, height)
[imageViewTatoo.image drawInRect:CGRectMake(x1, y1, x2, y2) blendMode:kCGBlendModeDarken alpha:0.4];
UIImage *fullScreenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
imageView.contentMode = UIViewContentModeScaleAspectFit;
UIImageWriteToSavedPhotosAlbum(fullScreenshot, nil, nil, nil);
imageView.image = fullScreenshot;
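Since the frame in the original question is a PNG with transparency, a hedged variation is to draw the frame over the photo at full size, with the normal blend mode and full alpha, so the photo only shows through the transparent regions. The photoImage and frameImage variables below are assumed, not taken from the snippet above:
// Hedged sketch: composite a transparent PNG frame on top of a photo.
CGSize newSize = CGSizeMake(190, 190);
UIGraphicsBeginImageContext(newSize);
[photoImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)]; // base photo (assumed variable)
[frameImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height) blendMode:kCGBlendModeNormal alpha:1.0]; // frame on top (assumed variable)
UIImage *framedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(framedImage, nil, nil, nil);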
I have the following code as a UIImage+Scale.h category.
- (UIImage *)scaleToSize:(CGSize)size
{
    UIGraphicsBeginImageContext(size);
    [self drawInRect:CGRectMake(0, 0, size.width, size.height)];
    // is this scaledImage auto-released?
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
I use the image obtained as above as follows:
UIImage* image = [[UIImage alloc] initWithData: myData];
image = [image scaleToSize:size]; // <- wouldn't this create a leak, since the image (before scaling) is lost somewhere?
I guess the code above works fine if the image was originally created autoreleased. But if the image was created using alloc, it would leak, as far as I know.
How should I change scaleToSize: to guard against this?
Thank you
EDIT -
I'd like to use alloc (or retain)/release on UIImage so that I can keep the number of UIImage objects in memory at any point small
(I'm loading many UIImages in a loop and the device can't handle it).
Notice that your code could be rewritten as:
UIImage *image = [[UIImage alloc] initWithData:myData];
UIImage *scaledImage = [image scaleToSize:size];
image = scaledImage;
so let’s see what happens:
image is obtained via alloc, hence you own that object
scaledImage is obtained via a method that returns an autoreleased object since UIGraphicsGetImageFromCurrentImageContext() returns an autoreleased object
you own the original image but you don’t own scaledImage. You are responsible for releasing the original image, otherwise you have a leak.
In your code, you use a single variable to refer to both objects: the original image and the scaled image. This doesn’t change the fact that you own the first image, hence you need to release it to avoid leaks. Since you lose the original image reference by using the same variable, one common idiom is to send -autorelease to the original object:
UIImage *image = [[[UIImage alloc] initWithData:myData] autorelease];
image = [image scaleToSize:size];
Or, if you’d rather release the original image instead of autoreleasing it,
UIImage *image = [[UIImage alloc] initWithData:myData];
UIImage *scaledImage = [image scaleToSize:size];
[image release];
// use scaledImage from this point on, or assign image = scaledImage
IMO, it doesn’t make sense to change scaleToSize:. It is an instance method that creates an (autoreleased) image based on a given UIImage instance. It’s similar to -[NSString stringByAppendingString:], which creates a (an autoreleased) string based on a given NSString instance. It doesn’t and shouldn’t care about the ownership of the original string, and the same applies to your scaleToSize: method. How would the method know whether the caller wants to keep the original image?
I’d also rename scaleToSize: to imageByScalingToSize: to make it similar to Cocoa’s naming conventions, since you’re getting an image by applying an operation to an existing image.
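Regarding the edit about loading many images in a loop: because the scaled images are autoreleased, they accumulate until the surrounding autorelease pool drains. A hedged sketch (pre-ARC; the collection name is illustrative) that keeps the number of live UIImages small is to drain a local pool each iteration:
// Hedged sketch: drain a local autorelease pool per iteration so autoreleased
// images (such as the scaled copies) don't pile up across the whole loop.
for (NSData *myData in allImageData) { // allImageData is an assumed collection
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    UIImage *image = [[UIImage alloc] initWithData:myData];
    UIImage *scaledImage = [image scaleToSize:size];
    [image release]; // release the original immediately

    // ... use or store scaledImage here; retain it if it must outlive the pool ...

    [pool drain]; // frees the autoreleased scaled copy
}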
Yes, you certainly have a leak: the object previously stored in image is no longer referenced, but it has not yet been deallocated.
On my screen there is an image, and I have to save this image by taking a screenshot. Is there any way of saving just a particular area of the image?
If there is a method, please guide me with a simple example.
The following code will allow you to grab just a section of the screen; you could adapt it for the full screen:
CGRect screenRect = CGRectMake(0, 0, 200, 200);
UIGraphicsBeginImageContext(screenRect.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextFillRect(ctx, screenRect);
[self.view.layer renderInContext:ctx];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
//if you want to save this image to the photo album uncomment the next line
//UIImageWriteToSavedPhotosAlbum(newImage, nil, nil, nil);
UIGraphicsEndImageContext();
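The snippet above captures the 200×200 area at the top-left of the view. If the region you want doesn't start at the origin, a hedged approach is to translate the context by the negative of the region's origin before rendering, so that region lands at (0, 0) in the resulting image (the rect below is only an example):
// Hedged sketch: capture an arbitrary sub-rectangle of the view.
CGRect captureRect = CGRectMake(50, 100, 200, 200); // example region
UIGraphicsBeginImageContext(captureRect.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
// Shift the layer so the desired region lines up with the context's origin
CGContextTranslateCTM(ctx, -captureRect.origin.x, -captureRect.origin.y);
[self.view.layer renderInContext:ctx];
UIImage *regionImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();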
If you simply want a screenshot, though, just press the Home and top (power) buttons together; that puts a screenshot in the photo album each time you do it.
If you need a programmatic way to achieve this you could use the following (note that NSImage and dataWithPDFInsideRect: are AppKit APIs, so this applies to a Mac app rather than iOS):
[[NSImage alloc] initWithData:[view dataWithPDFInsideRect:[view bounds]]];
Try pressing the home and close/power button at the same time.
I want to take screenshots in landscape mode.
Currently my code below takes screenshots in PORTRAIT mode.
I also want to store the images at a given location, not in the photo library.
How can I achieve this?
Thanks for any help.
Below is my code:
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, self, nil, nil);
As far as I know there is no way to take screenshots in landscape mode. Images taken with the iPhone contain a piece of information called imageOrientation, which is then used to rotate a UIImageView displaying that image.
But you should be able to take an image in portrait mode and rotate it by 90 degrees before saving it. I can't try it right now, but the following should work:
Create a method for rotation and pass the UIImage as an argument.
CGSize size = image.size;
// For a 90-degree rotation the canvas needs swapped width/height
UIGraphicsBeginImageContext(CGSizeMake(size.height, size.width));
CGContextRef ctx = UIGraphicsGetCurrentContext();
// Rotate around the centre of the new canvas
CGContextTranslateCTM(ctx, size.height / 2, size.width / 2);
CGContextRotateCTM(ctx, angleInRadians); // (M_PI/2) or (3*M_PI/2) depending on left/right rotation
// drawInRect: respects UIKit's flipped coordinate system, unlike CGContextDrawImage
[image drawInRect:CGRectMake(-size.width / 2, -size.height / 2, size.width, size.height)];
UIImage *copy = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return copy;
To store it at a given location you can use NSData, as I answered on your other question (Saving images to a given location).
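Putting the pieces together, here is a hedged sketch of capturing the view, rotating it, and writing the result as a PNG into the Documents directory. The rotation helper name rotatedImage:byRadians: and the file name are illustrative, standing in for the rotation method described above:
// Hedged sketch: capture, rotate for landscape, and save under Documents.
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

UIImage *landscapeImage = [self rotatedImage:viewImage byRadians:M_PI / 2]; // assumed wrapper around the rotation code above

NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [documentsDir stringByAppendingPathComponent:@"landscape-screenshot.png"]; // example file name
[UIImagePNGRepresentation(landscapeImage) writeToFile:path atomically:YES];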