Getting an image representation of the camera preview in UIImagePickerController - iPhone

When I use the standard [view.layer renderInContext:UIGraphicsGetCurrentContext()]; method for turning views into images on the iPhone camera modal view, all I get is the controls with black where the camera preview was.
Anyone have a solution for this? Speed is not an issue.

It's not possible with any documented API, but it is possible. Find the "PLCameraView" view among the camera controller's subviews, then call
CGImageRef img = (CGImageRef)[foundCameraView imageRef];
This will return a reference to the image that camera holds.
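For illustration, a minimal sketch of the kind of search that answer implies, assuming the private "PLCameraView" class name and the -imageRef selector still exist (both are undocumented and could change or disappear in any OS release; the method name below is mine):

- (UIView *)cameraPreviewViewIn:(UIView *)view
{
    // PLCameraView is a private class; matching by class name avoids linking
    // against private headers, but this is still undocumented behaviour.
    if ([NSStringFromClass([view class]) isEqualToString:@"PLCameraView"]) {
        return view;
    }
    for (UIView *subview in view.subviews) {
        UIView *match = [self cameraPreviewViewIn:subview];
        if (match) {
            return match;
        }
    }
    return nil;
}

You would run this over the presented picker's view hierarchy (e.g. picker.view) and then send imageRef to whatever it returns, as in the snippet above. Since none of this is documented, expect it to break between OS releases and to be rejected by App Store review.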

You can do this with the new AVFoundation stuff in iOS 4.0.
You should be able to call UIGetScreenImage() (which returns a CGImageRef) to get a current capture of the whole screen, including the preview. That's how all the barcode apps worked before. But supposedly Apple is disallowing that now and only allowing the AVFoundation technique - which only works under 4.0.
The whole reason there's even an issue is because UIGetScreenImage() is not part of the documented API, but Apple made a specific exception for using it. It's not like they are pulling current apps, but they are not allowing new submissions (or updates) that use the older technique.
There is some lobbying on behalf of a number of people to convince Apple to let app developers use the old technique for iOS 3.x only, so send an email to developer relations if you want to use it.

This isn't possible, unfortunately. For performance, the iPhone uses some form of direct rendering to draw the camera preview directly onto the screen rather than as part of a UIView surface. As such, when you "capture" the camera view you only get the UIView elements, not the preview image.
(FWIW, this is similar to the reason it's difficult to screen-grab some movie software on Windows/Mac.)

You could try AVCaptureVideoPreviewLayer
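For reference, a minimal sketch of wiring that up (iOS 4+). This isn't the asker's code, just the usual session-plus-preview-layer setup, assuming it runs inside a view controller with a valid self.view; in real code you would keep the session in a retained property.

#import <AVFoundation/AVFoundation.h>

// Build a capture session and attach a live preview layer to our own view.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) {
    [session addInput:input];
}

AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:previewLayer];

[session startRunning];

Because you own the session, you can also add an AVCaptureVideoDataOutput (or AVCaptureStillImageOutput) to it and get at the actual frames, which is exactly what renderInContext: cannot do for the camera preview.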

Related

Is it possible to avoid the "use"/"retake" screen after taking a photo in an iOS app?

I have hired a programmer to create an iPhone app for me. The purpose of the app is to take a photo and upload it to a server. We want to make a special-purpose screen to review the photo before uploading it. This specially developed screen will crucially have zooming functionality.
He claims that after taking a photo, it is impossible to prevent the "use"/"retake" screen from showing up, so now we have two screens to review the photo: first the standard one from Apple, then our own with zoom. Is he right about that? It just sounds so unreasonable that Apple would impose such a restriction.
Edit: I mean taking a photo using the camera.
As per Apple's documentation:
To perform fully-customized image or movie capture, instead use the AV Foundation framework as described in "Media Capture and Access to Camera" in the AV Foundation Programming Guide. To create a fully-customized image picker for browsing the photo library, use classes from the Assets Library framework. For example, you could create a custom image picker that displays larger thumbnail images, that makes use of EXIF metadata including timestamp and location information, or that integrates with other frameworks such as Map Kit. For more information, see Assets Library Framework Reference. Media browsing using the Assets Library framework is available starting in iOS 4.0.
In short, yes, it is possible; check out this sample.
[Update]
Use the allowsEditing property on your UIImagePickerController
imagePickerController.allowsEditing = NO;
The previous answer was a bit of a hack that took advantage of a code path that didn't show the buttons, but it wasn't awesome.
[Previous answer]
You can actually avoid it without going through the hassle of setting up your own image capture from AV Foundation.
Including the following will remove the need to show the "review" screen. All you have to do is put in a few of your own buttons and wire them up to the appropriate functionality.
[self.imagePickerController setShowsCameraControls:NO];
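To round that out, a sketch of the rest of the wiring, assuming the picker is presented with the camera source type; the shutterTapped: selector and the button geometry are made up for illustration.

// Hide the stock controls and add our own shutter button via an overlay view.
self.imagePickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
self.imagePickerController.showsCameraControls = NO;

UIView *overlay = [[[UIView alloc] initWithFrame:[[UIScreen mainScreen] bounds]] autorelease];
UIButton *shutter = [UIButton buttonWithType:UIButtonTypeRoundedRect];
shutter.frame = CGRectMake(110.0f, 420.0f, 100.0f, 40.0f);
[shutter setTitle:@"Snap" forState:UIControlStateNormal];
[shutter addTarget:self action:@selector(shutterTapped:) forControlEvents:UIControlEventTouchUpInside];
[overlay addSubview:shutter];
self.imagePickerController.cameraOverlayView = overlay;

// In shutterTapped: simply call [self.imagePickerController takePicture];
// the photo still arrives in imagePickerController:didFinishPickingMediaWithInfo:,
// but no "Retake"/"Use" screen is shown because the standard controls are hidden.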
It is a little bit too late, I know, but for future reference:
This is far simpler than the answers already provided; what you are looking for is the allowsEditing option.
imagePickerController.allowsEditing = NO;
That should be enough to avoid showing the "Retake"/"Use" screen after the user takes a picture.

iOS: Getting started with the Camera and Custom Image Editing

My app will let users cut out things from photos. They'll be able to either select a photo already in their iPhone's photo library, or take a new one with the camera. From what I understand, UIImagePicker is the simplest way to accomplish picking a photo from the library or taking a new one. However, I also understand that it only provides basic image editing (zoom, crop). I want my image editing to allow for the creation of Bezier curves that, once all joined together, will cut out the enclosed area, saving it without the surrounding background.
The official apple documentation on UIImagePicker suggested that the AV Framework is required for providing custom image editing as opposed to the basic zoom and crop. So my first questions are:
Is the AV Framework indeed what I want to use?
Will it get used in conjunction with UIImagePicker (i.e., UIImagePicker is used to select the photo or take a new one, and then my AV Framework code takes over for the image editing)?
Can anyone offer good resources on getting started on learning the code for this process?
My final question is about the actual Bezier curve generation/manipulation. It appears that the Core Graphics Framework has support for this, but there is also the UIBezierPath object, which is apparently some kind of wrapper for the Core Graphics tools I would otherwise use.
So my final question: will I want to use the UIBezierPath object, or does what I previously described require more fine-grained control that UIBezierPath can't provide, thereby forcing me to use the Core Graphics framework directly?
Thanks!
AV Foundation allows you to talk to the camera, to configure it in various ways, and to receive a live feed from it. So it's good for taking new pictures or movies, but not for selecting them from the camera roll or for editing them. You'd likely want to use AV Foundation to replace the image capture duties that UIImagePicker supplies. You'll probably want to use a UIImagePicker with allowsEditing set to NO so as to be able to provide your own, entirely separate editing interface.
no, it's a different sort of task.
I'm unaware of any tutorials on this sort of thing, but the docs are pretty good. I've posted the full setup for capturing a live feed from the camera in answers like this one; I'm not sure if that's a more helpful way to see how some of the AV Foundation classes can be chained together.
What you'll probably end up doing in order to edit an image is starting with a UIImage, creating a CoreGraphics bitmap context (which is something you can draw to), doing some sort of compositing to that and then converting the result into an image and saving it back out to the camera roll.
UIBezierPath is a wrapper over the Core Graphics stuff, but will probably do what you want. addClip can set a defined path to be the new clipping path on the current context, or you can use the CGPath property if you need to go a bit further afield than UIKit's idea of a current context.
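As a concrete illustration of those two points, here is a rough sketch (the method name is mine) that clips a photo to a UIBezierPath in an image context and reads the result back as a UIImage:

// Cut out the region of `sourceImage` enclosed by `path`; everything outside
// the path is left transparent.
- (UIImage *)imageByClipping:(UIImage *)sourceImage toPath:(UIBezierPath *)path
{
    // Opaque = NO keeps the area outside the clip transparent;
    // scale 0.0 means "use the screen's scale".
    UIGraphicsBeginImageContextWithOptions(sourceImage.size, NO, 0.0);

    [path addClip];                          // restrict drawing to the closed path
    [sourceImage drawAtPoint:CGPointZero];   // composite the photo inside it

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}

You could then hand the result to UIImageWriteToSavedPhotosAlbum() to save it back to the camera roll.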
Look for the iPhone cookbook; maybe kickasstorrents still has it.
C07 has everything you need: camera, overlay, loading, picking, editing, snapping, hiding the camera, saving docs, sending images, image scroller, thumbnails, masking, etc.

iPhone demo help: anyone know of a faster screen capture alternative to UIGetScreenImage()?

I'm working on an iPhone app that I'm going to be demo'ing to a live audience soon.
I'd really like to demo the app live over VGA to a projector, rather than show screenshots.
I bought a VGA adapter for iPhone, and have adapted Rob Terrell's TVOutManager to suit my needs. Unfortunately, the frame rate after testing on my television at home just isn't that good - even on an iPhone 4 (perhaps 4-5 frames per second, it varies).
I believe the reason for this slowness is that the main routine I'm using to capture the device's screen (which is then being displayed on an external display) is UIGetScreenImage(). This routine, which is no longer allowed to be part of shipping apps, is actually quite slow. Here's the code I'm using to capture the screen (FYI mirrorView is a UIImageView):
CGImageRef cgScreen = UIGetScreenImage();
self.mirrorView.image = [UIImage imageWithCGImage:cgScreen];
CGImageRelease(cgScreen);
Is there a faster method I can use to capture the iPhone's screen and achieve a better frame rate (shooting for 20+ fps)? It doesn't need to pass Apple's app review - this demo code won't be in the shipping app. If anyone knows of any faster private APIs, I'd really appreciate the help!
Also, the above code is being executed using a repeating NSTimer which fires every 1.0/desiredFrameRate seconds (currently every 0.1 seconds). I'm wondering if instead wrapping those calls in a block and using GCD or an NSOperationQueue might be more efficient than having the NSTimer invoke my updateTVOut obj-c method that currently contains those calls. Would appreciate some input on that too - some searching seems to indicate that obj-c message sending is somewhat slow compared to other operations.
Finally, as you can see above, the CGImageRef that UIGetScreenImage() returns is being turned into a UIImage and then that UIImage is being passed to a UIImageView, which is probably resizing the image on the fly. I'm wondering if the resizing might be slowing things down even more. Ideas of how to do this faster?
Have you looked at Apple's recommended alternatives to UIGetScreenImage? From the "Notice regarding UIGetScreenImage()" thread:
Applications using UIGetScreenImage() to capture images from the camera should instead use AVCaptureSession and related classes in the AV Foundation Framework. For an example, see Technical Q&A QA1702, "How to capture video frames from the camera as images using AV Foundation". Note that use of AVCaptureSession is supported in iOS4 and above only.
Applications using UIGetScreenImage() to capture the contents of interface views and layers should instead use the -renderInContext: method of CALayer in the QuartzCore framework. For an example, see Technical Q&A QA1703, "Screen capture in UIKit applications".
Applications using UIGetScreenImage() to capture the contents of OpenGL ES based views and layers should instead use the glReadPixels() function to obtain pixel data. For an example, see Technical Q&A QA1704, "OpenGL ES View Snapshot".
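For the UIKit case, the QA1703 approach boils down to something like this sketch (the method name is mine):

#import <QuartzCore/QuartzCore.h>

// Render a view hierarchy into a UIImage. Note this captures ordinary CALayer
// content only; it will not pick up the camera preview or OpenGL ES layers.
- (UIImage *)snapshotOfView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}

Whether this ends up faster than UIGetScreenImage() for a full-screen grab at 20+ fps is another question, but it is at least App Store legal.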
New solution: get an iPad 2 and mirror the output! :)
I don't know how fast this is, but it's worth a try ;)
CGImageRef screenshot = [[UIApplication sharedApplication] _createDefaultImageSnapshot];
[myVGAView.layer setContents:(id)screenshot];
where _createDefaultImageSnapshot is a private API. (Since it's for a demo... it's OK, I suppose.)
and myVGAView is a normal UIView.
If you get CGImageRefs, just pass them to the contents of a layer; it's lighter and should be a little bit faster (but just a little bit ;) )
I haven't got the solution you want (simulating video mirroring), but you can move your views to the external display. This is what I do, and there is no appreciable impact on the frame rate. However, since the view is obviously no longer on the device's screen, you can no longer directly interact with it or see it. If you have something like a game controlled with the accelerometer this shouldn't be a problem; something touch-based, however, will require some work. What I do is have an alternative view on the device when the primary view is external. For me this is a 2D control view to "command" the normal 3D view. If you have a game, you could perhaps create an alternative input view to control it (virtual buttons/joystick, etc.). It really depends on what you have as to how best to work around it.
Not having jailbroken a device myself, I can't say for sure, but I am under the impression that a jailbroken device can essentially enable video mirroring (like they use at the Apple demos...). If true, that is likely your easiest route if all you want is a demo.
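A rough sketch of the external-screen approach described above, assuming manual reference counting and an externalWindow property that you retain; presentationView stands in for whatever view the audience should see (both names are mine):

// Attach a second UIWindow to the external display (iOS 3.2+). Content placed
// in this window is drawn on the adapter output directly, with no capture loop.
- (void)moveContentToExternalScreen
{
    if ([[UIScreen screens] count] < 2) {
        return;  // no external display connected
    }

    UIScreen *externalScreen = [[UIScreen screens] objectAtIndex:1];

    self.externalWindow = [[[UIWindow alloc] initWithFrame:externalScreen.bounds] autorelease];
    self.externalWindow.screen = externalScreen;

    // presentationView goes out to the projector; an alternative
    // "controller" view stays on the device itself.
    [self.externalWindow addSubview:self.presentationView];
    self.externalWindow.hidden = NO;
}

You would also want to listen for UIScreenDidConnectNotification / UIScreenDidDisconnectNotification so the demo copes with the adapter being plugged in late.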

iPhone 4 - How do I record the video overlay on top of the video as well?

Does anyone know how to do this?
Is anyone able to provide an example? I believe this is out of NDA now, as it was available in version 4.0?
Take a look at the AVFoundation framework, especially AVCaptureSession, AVCaptureDeviceInput, and related classes. You can find some listings in the iPhone dev center forums: search for "avcapture".
AVFoundation is the framework you want to use to record, modify raw frames, show them, and of course add some overlay.
If you only want to do the overlay, then UIImagePickerController should be enough.
If I understand what you are trying to do correctly, you have set up video capture something like Apple suggests in this Q&A: http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html
From there it shouldn't be much of a problem to use the method described in this answer (blend two UIImages based on the alpha/transparency of the top image) to blend the preview with your overlay, provided you draw the overlay into a UIImage first. Feed the resulting images to a buffer and save them accordingly.
This method most certainly wouldn't give you a lot of frames per second, though.
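For what it's worth, the blending step could look roughly like this (the method name is mine); frame would be a UIImage made from the captured pixel buffer, and overlay your overlay drawn into a UIImage:

// Composite an overlay image on top of a captured frame, honouring the
// overlay's alpha, and return the blended result.
- (UIImage *)imageByCompositingOverlay:(UIImage *)overlay ontoFrame:(UIImage *)frame
{
    CGRect bounds = CGRectMake(0.0f, 0.0f, frame.size.width, frame.size.height);

    UIGraphicsBeginImageContextWithOptions(frame.size, YES, frame.scale);
    [frame drawInRect:bounds];
    [overlay drawInRect:bounds blendMode:kCGBlendModeNormal alpha:1.0f];
    UIImage *blended = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return blended;
}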

How to create iOS image buttons that scale well across multiple resolutions?

I've run into the issue of using a UIBarButtonItem with a custom color. Everything out on the 'net seems to indicate that the only way around this lack of official API support revolves around the use of images. This is all fine and dandy when developing for pre-iOS 4 devices, except when using the new iPhone 4. Creating an image for iPad and pre-iOS 4 devices is straightforward enough, but the images developed for those devices look absolutely horrid on iPhone 4. I suspect that this problem will be exacerbated further with the introduction of next generation devices.
Consider the example below. Notice how the default colored button is nice and smooth, but the iPhone 3GS image looks terrible. It does not seem very scalable (pun intended) to have to include multiple images for different resolution devices.
In the absence of an official API for changing the color of a UIBarButtonItem, what strategies are out there for creating images that scale well against differing resolution devices? This problem is hardly unique to UIBarButtonItems, how is the community adapting to other UI elements that are bitmapped? Is there a better solution for this particular case than using an image (such as using Quartz to draw it)?
If at all possible, please offer concrete code examples.
You can include any image as Image@2x.png alongside Image.png, and the system will select the appropriate one at runtime.
If you look at the source for Three20 you can see how they draw custom buttons and shapes that will scale well, regardless of resolution.
Give Opacity (for Mac) a try. Draw your button in it with vector elements and effects, and it'll spit out the necessary Quartz code to reproduce it, drawing natively in your iOS application. You get Retina (@2x) support automatically.
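If you'd rather skip image assets entirely, here is a minimal sketch of drawing a flat, rounded button background with UIKit/Quartz at the device's scale (iOS 4+), so standard and Retina screens both come out crisp; the method name and corner radius are arbitrary:

// Render a rounded, solid-colour button background at the current screen scale.
- (UIImage *)buttonImageWithColor:(UIColor *)color size:(CGSize)size
{
    // Scale 0.0 means "use the main screen's scale" (1.0 on older devices, 2.0 on iPhone 4).
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);

    UIBezierPath *rounded = [UIBezierPath bezierPathWithRoundedRect:CGRectMake(0.0f, 0.0f, size.width, size.height)
                                                       cornerRadius:6.0f];
    [color setFill];
    [rounded fill];

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}

The resulting image can be used in a custom UIButton or passed to -[UIBarButtonItem initWithImage:style:target:action:], and because it is drawn at runtime there is no separate @2x asset to maintain.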
Been over a year since I posted this question, but ran into a use case where I wanted to be able to do this, so instead of having to draw or otherwise create the buttons, I decided to write an open source application to create them. This application uses private APIs to change the colors of the UIBarButtonItem objects and then uses a graphics context to save them to a determined location on your computer's file system. This way you can have pixel perfect UIBarButtonItem images to use in your UIToolbars.
The app creates both the standard and @2x resolution images.
UIBarButtonItem-Generator # GitHub
Any vector drawing app may work, but I would also consider POV-Ray, which lets you build a scene in a C-like scripting language and then render it at any pixel size you choose.
http://povray.org
I had the same problem with the navigation bar and solved it as follows:
First, I subclassed my navigation bar; inside this class:
- (void)drawRect:(CGRect)rect
{
    UIImage *image = [UIImage imageNamed:@"MyImage.png"];
    self.frame = CGRectMake(0, 0, image.size.width, image.size.height);
    self.backgroundImage = image;
}
Finally, save the same image at the higher resolution with @2x at the end of its filename.