iPhone Screenshots

I've been using UIGetScreenImage() to take a screenshot of a UIImagePickerController. Basically, I use a camera overlay, and when I take the screenshot I get the image the camera preview was showing with my overlay on top, which is exactly what I need.
Now that UIGetScreenImage() has been banned, I haven't been able to find a way to do this; the capture just shows black where the camera preview should be.
Edit: all of my other views show up absolutely fine, it's just the actual camera preview that doesn't. Any ideas?
Here's the code I am using at the moment.
UIGraphicsBeginImageContext(picker.view.bounds.size);
CGContextRef context = UIGraphicsGetCurrentContext();
// Render the picker's view hierarchy (including my overlay) into the context.
[picker.view.layer renderInContext:context];
// camView is a UIImage holding the captured camera frame.
CGContextDrawImage(context, picker.view.bounds, camView.CGImage);
UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(screenImage, nil, nil, nil);
Any ideas how I can get the overlay + the camera image?
Thanks!

This is slightly different from yours.
CGRect screenRect = [[UIScreen mainScreen] bounds];
UIGraphicsBeginImageContext(screenRect.size);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *sShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum (sShot, nil, nil, nil);
Does that work?
If not, you might want to try asking over at iphonedevsdk.com since they specialize in all things iPhone.

Have the user take a picture, then overlay your image on top of it (a minimal sketch follows the link below).
Create a UIImage from two other UIImages on the iPhone
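A minimal sketch of that compositing step, assuming photo is the UIImage the picker hands back in imagePickerController:didFinishPickingMediaWithInfo: and overlay is your overlay already rendered as a UIImage of the same size:
UIGraphicsBeginImageContext(photo.size);
// Draw the camera shot first, then the overlay on top at the same size.
[photo drawInRect:CGRectMake(0, 0, photo.size.width, photo.size.height)];
[overlay drawInRect:CGRectMake(0, 0, photo.size.width, photo.size.height)];
UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();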

Related

iPhone: when saving an image to the device with renderInContext, the image is blurry

I'm using this code to render an image from a view, then saving it to the photo album.
The image comes out blurry. Why, and is there a solution?
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Thanks all.
You are probably using a Retina device.
Change the following
UIGraphicsBeginImageContext(self.view.bounds.size)
to
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
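Passing 0.0 as the scale tells UIKit to use the device's main screen scale (2.0 on Retina), so the view is rendered at full pixel resolution instead of at 1x. For completeness, here is a sketch of the corrected capture-and-save sequence, based on the code in the question:
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
// 0.0 = use the screen's scale factor, so nothing is rendered at 1x and upscaled.
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);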

Taking a screenshot in-app then putting it in a UIImageView does so at normal resolution

After zooming the iOS Simulator up to 100%, I've noticed that the icons my app creates by taking a 'screenshot' of a view with this code come out at normal resolution, whereas everything else appears at Retina resolution:
UIGraphicsBeginImageContext(rect.size);
[object.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Any ideas how to make it come out at Retina resolution?
UIGraphicsBeginImageContext() is not Retina display-aware.
On iOS 4, you'll need to use UIGraphicsBeginImageContextWithOptions() instead, passing 0 as the last argument to have iOS automatically scale it based on the device's screen resolution:
UIGraphicsBeginImageContextWithOptions(rect.size, NO, 0);
[object.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

HTML to texture on iPhone

Is there a way to grab the pixels from a UIWebView and render them to an OpenGL texture?
I don't know what it takes to load an OpenGL texture, but this will render a view's contents into a UIImage.
UIGraphicsBeginImageContext(self.bounds.size);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Sorry for the half answer; hope someone can take it from here...
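Picking up from that half answer, here is a hedged sketch of the missing piece: copying the UIImage's pixels into an RGBA buffer and uploading it with glTexImage2D. It assumes an EAGLContext is already current and that the view's dimensions are powers of two (a hard requirement for textures on OpenGL ES 1 hardware); the method name textureFromImage: is made up for illustration.
#import <OpenGLES/ES1/gl.h>

- (GLuint)textureFromImage:(UIImage *)image {
    CGImageRef cgImage = image.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    // Redraw the image into a buffer with a known RGBA byte layout.
    GLubyte *pixels = (GLubyte *)calloc(width * height * 4, sizeof(GLubyte));
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8,
                                             width * 4, colorSpace,
                                             kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);

    // Hand the buffer to OpenGL as a 2D texture.
    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    free(pixels);
    return texture;
}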

Taking a Screenshot of an iPhone app mixing UIKit and OpenGL

So, I know how to take a screenshot with UIKit:
UIWindow *window = [[[UIApplication sharedApplication] delegate] window];
UIGraphicsBeginImageContext(window.bounds.size);
[window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
And I know how to take a screenshot with OpenGL
// Some preparation to read pixels into a buffer
glReadPixels(0,0,backingWidth, backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
// Some code to massage the raw pixel data into an image
My app mixes OpenGL and UIKit, so any screenshot taken with just one method doesn't look right.
What's the best way to either
a) Composite the two images together, or
b) Take a screenshot that can capture both OpenGL and UIKit together
This is a similar post with some answers. Apparently you can use either UIGetScreenImage or glReadPixels for this task; a sketch of the glReadPixels route is below. Hope that helps!
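To flesh out the glReadPixels route, here is a hedged sketch that reads the GL framebuffer, wraps the raw pixels in a CGImage, and draws the UIKit layer tree on top. It assumes backingWidth and backingHeight are the renderbuffer dimensions from the question, and that the GL view's spot in the UIKit hierarchy is transparent (a CAEAGLLayer contributes nothing to renderInContext, so the GL pixels show through underneath):
// Read the GL framebuffer into an RGBA buffer (rows are ordered bottom-up).
GLubyte *buffer = (GLubyte *)malloc(backingWidth * backingHeight * 4);
glReadPixels(0, 0, backingWidth, backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

// Wrap the raw pixels in a CGImage.
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer,
    backingWidth * backingHeight * 4, NULL);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGImageRef glCGImage = CGImageCreate(backingWidth, backingHeight, 8, 32,
    backingWidth * 4, colorSpace,
    kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast,
    provider, NULL, NO, kCGRenderingIntentDefault);

// Composite: GL frame first, then the UIKit layer tree on top.
UIWindow *window = [[[UIApplication sharedApplication] delegate] window];
UIGraphicsBeginImageContext(window.bounds.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
// CGContextDrawImage ignores UIKit's flipped coordinates, which cancels out
// the bottom-up row order of the glReadPixels data, so the GL content lands
// right side up.
CGContextDrawImage(ctx, window.bounds, glCGImage);
[window.layer renderInContext:ctx];
UIImage *composite = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

CGImageRelease(glCGImage);
CGColorSpaceRelease(colorSpace);
CGDataProviderRelease(provider);
free(buffer);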

iPhone: Get camera preview

I'd like to get the image the UIImagePickerController is displaying when the user uses the camera. Once I have it, I want to process the image and display it in place of the regular camera view.
The problem is that when I try to grab the camera view, the image is just a black rectangle.
Here's my code:
// Dig the live camera preview view out of the picker's view hierarchy.
UIView *cameraView = [[[[[[imagePicker.view subviews] objectAtIndex:0]
                          subviews] objectAtIndex:0]
                          subviews] objectAtIndex:0];
UIGraphicsBeginImageContext(CGSizeMake(320, 427));
[cameraView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
imageToDisplay.image = [PixelProcessing processImage:viewImage]; // In this case the image is black
//imageToDisplay.image = viewImage; // In this case the image is black too
//imageToDisplay.image = [UIImage imageNamed:@"icon.png"]; // In this case the image displays properly
What am I doing wrong?
Thanks.
This one also works quite well. Use it while the camera preview is open:
UIImage *viewImage = [[(id)objc_getClass("PLCameraController")
    performSelector:@selector(sharedInstance)]
    performSelector:@selector(_createPreviewImage)];
But as far as I can tell, it gives the same results as the following solution, which takes a 'screenshot' of the current screen:
extern CGImageRef UIGetScreenImage();
CGImageRef cgoriginal = UIGetScreenImage();
// rect is the portion of the screen you want, e.g. the camera preview area.
CGImageRef cgimg = CGImageCreateWithImageInRect(cgoriginal, rect);
UIImage *viewImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgoriginal);
CGImageRelease(cgimg);
One problem I still haven't found a fix for: how can you grab the camera image quickly, without any overlays?
The unofficial call is:
UIGetScreenImage()
which you declare above the @implementation as:
extern CGImageRef UIGetScreenImage();
There may be a documented way to do this in 3.1, but I'm not sure. If not, please file a Radar with Apple asking them to make some kind of screen-grab access public!
Filing a Radar uses the same Apple ID you log in to the iPhone developer portal with.
Update: this call is still not documented, but Apple has explicitly given the OK to use it in App Store apps.
At least for now, there's no way to do this: certainly no official, documented way, and as far as I know nobody has figured out an unofficial way either.
The camera preview data is drawn by the OS in a way that bypasses the normal graphics methods.