Freeze or pause iPhone camera image

On the iPhone's built-in Camera application (OS 3.1), tapping the shutter button shows an iris animation, then displays the image that was just taken for a second or so before animating it away.
Is anyone aware of a simple way to get this "brief pause" behavior? Or do I have to resort to manually adding the image as part of my custom cameraOverlayView?
Bonus points for the iris animation too (without interfering with said custom overlay).

Ultimately, I ended up using UIGetScreenImage() (which is now officially blessed for use by Apple) and pushing that image to a previously hidden UIImageView.
Meh.
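For reference, a minimal sketch of that approach. It assumes a view controller that owns a hidden UIImageView property named freezeFrameView (a made-up name) layered over the cameraOverlayView; since UIGetScreenImage() has no public header, it is declared by hand, and the memory-ownership comment reflects common usage of this private function:

    // UIGetScreenImage() is private, so declare it yourself.
    CGImageRef UIGetScreenImage(void);

    - (void)freezeCameraPreview
    {
        CGImageRef screen = UIGetScreenImage();       // grabs the whole screen, live preview included
        UIImage *frozen = [UIImage imageWithCGImage:screen];
        CGImageRelease(screen);                       // callers conventionally release the returned image

        self.freezeFrameView.image = frozen;          // hypothetical hidden UIImageView on the overlay
        self.freezeFrameView.hidden = NO;

        // Hide the frozen frame again after roughly the pause the built-in Camera app shows.
        [self performSelector:@selector(unfreezeCameraPreview) withObject:nil afterDelay:1.0];
    }

    - (void)unfreezeCameraPreview
    {
        self.freezeFrameView.hidden = YES;
    }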

Did you try releasing the UIImagePickerController after your code is done?

Related

How can I animate a Camera Iris animation using CAFilter?

I am developing a simple camera app, where I need to animate a camera shutter opening/closing animation. I googled CAFilter and CATransition, but got confused: how are they going to help me animate? For example, say I have a view called view and a method declared in my interface as
-(void)pressed;
How can I implement an animation on my view using CAFilter? Can anyone give me any other example of this, maybe rotating a view 360 degrees on the press of a button?
Asked and answered elsewhere on this forum. Here are some references for you:
This uses CATransition:
Shutter animation AVFoundation iphone
That uses @"cameraIris", which is an undocumented Apple transition type, so apps using it might be rejected. But it sure looks good.
There's also this 26 MB movie:
http://www.juicybitssoftware.com/2009/08/31/iphone-camera-shutter-animation/
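To make that concrete, here is a rough CATransition sketch (not taken from the linked answer) wired to the -(void)pressed method from the question. @"cameraIris" is the undocumented type mentioned above, with the documented kCATransitionFade noted as the safe fallback; the 360 degree spin uses a plain CABasicAnimation:

    #import <QuartzCore/QuartzCore.h>

    // Shutter-style transition on the view's layer.
    - (void)pressed
    {
        CATransition *shutter = [CATransition animation];
        shutter.duration = 0.5;
        shutter.timingFunction =
            [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseInEaseOut];
        shutter.type = @"cameraIris";   // undocumented; use kCATransitionFade to stay within the public API
        [self.view.layer addAnimation:shutter forKey:@"shutter"];
    }

    // The other example from the question: rotate a view 360 degrees on a button press.
    - (void)spinPressed
    {
        CABasicAnimation *spin =
            [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
        spin.fromValue = [NSNumber numberWithDouble:0.0];
        spin.toValue = [NSNumber numberWithDouble:2.0 * M_PI];
        spin.duration = 1.0;
        [self.view.layer addAnimation:spin forKey:@"spin"];
    }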

iPhone Camera Shutter issue

Basically, I am trying to use the camera but want to be able to upload a picture as soon as the camera button is pressed, without touching the Use button.
Therefore the camera delegate method never gets called. I am instead trying to capture the screen to get the image, using the following method:
UIGetScreenImage()
This seems unorthodox but does the trick. My issue is that sometimes I get the shutter image instead. Is there a delegate method that is called when the shutter animation is complete?
If so, any help is more than welcome. Thanks.
Use stillCamera from Brad Larson's GPUImage: https://github.com/BradLarson/GPUImage
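A rough sketch of that GPUImage route, assuming the current GPUImage API still matches its README; the filter choice and the stillCamera/filter property names are placeholders you would keep as retained ivars or properties:

    #import "GPUImage.h"

    // Setup, e.g. in viewDidLoad of the view controller showing the preview.
    - (void)setupGPUImageCamera
    {
        self.stillCamera = [[GPUImageStillCamera alloc] init];
        self.stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

        self.filter = [[GPUImageSepiaFilter alloc] init];   // any filter (or none) will do
        [self.stillCamera addTarget:self.filter];

        GPUImageView *previewView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
        [self.view addSubview:previewView];
        [self.filter addTarget:previewView];

        [self.stillCamera startCameraCapture];
    }

    // Your own "camera" button handler: no system camera UI is ever shown,
    // so no shutter graphic can end up in the captured image.
    - (void)cameraButtonTapped
    {
        [self.stillCamera capturePhotoAsImageProcessedUpToFilter:self.filter
                                           withCompletionHandler:^(UIImage *processedImage, NSError *error) {
            // Upload processedImage here.
        }];
    }

Because this bypasses UIImagePickerController entirely, there is no shutter animation and no Use button to work around.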

How to modify the video stream from the iPhone camera in real time?

Every time the iPhone camera captures a new image, I want to modify it and then display it on the iPhone screen. In other words: how do I modify the video stream from the iPhone camera in real time?
I need a method that is called every time a new image comes from the camera.
Thanks for your help! :-)
EDIT: What I want to do is like augmented reality: while I'm shooting video, every frame is modified and shown in real time on the iPhone screen.
You could capture an image with the UIImagePickerController's takePicture method and draw your modified version to the cameraOverlayView.
The picture recorded as a result of the takePicture message is delivered to the UIImagePickerController's delegate in imagePickerController:didFinishPickingMediaWithInfo:. From the dictionary supplied to that method, you can get the original image, which you can modify and draw to the overlay.
Here is an example of using the cameraOverlayView. You should be able to re-use the captured image from your delegate when drawing your overlay view.
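A minimal sketch of that delegate flow, assuming showsCameraControls is NO, the overlay contains a (hypothetical) UIImageView named overlayImageView, and applyMyEffectToImage: stands in for your own image processing:

    // Your own shutter button in the overlay triggers the capture.
    - (void)shutterButtonTapped
    {
        [self.imagePicker takePicture];   // self.imagePicker: the presented UIImagePickerController
    }

    // Called by the picker once the photo has been taken.
    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info
    {
        UIImage *original = [info objectForKey:UIImagePickerControllerOriginalImage];
        UIImage *modified = [self applyMyEffectToImage:original];   // placeholder for your processing

        // Draw the modified picture into the overlay instead of dismissing the picker.
        self.overlayImageView.image = modified;
    }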
Many augmented reality apps do not actually modify the image but just overlay information based on what they think is on the screen from input from the accelerometer and compass. If this is the kind of AR you are looking to do then try looking at ARKit.
You cannot process the camera's image data in real time. There is no API to do this. File a request with Apple using their bug tracker. Many of us have done this already. More requests might lead to this being possible.
Oh, yes, just use an overlay view then. You were talking about modifying the video stream, which is clearly not needed.

Modify photo selected by UIImagePicker before it's used in the move-and-scale screen

As you probably know, the UIImagePickerController in the UIKit framework allows the user to take a photo, which is then handed over to my app for further processing.
If the property allowsEditing is set to YES, the user is allowed to move and scale the photo after taking it.
What I'd like to accomplish is to modify the photo in my code BEFORE the user is presented with the move and scale screen. This is because I'd like to add some visual effects to the photo and the user should be able to move and scale with these effects already applied to the photo.
I know that there's the cameraOverlayView property, but this is not useful in my case as far as I'm concerned.
Any ideas?
Thanks, Thomas
A not-so-easy way is to implement the move-and-scale functionality on your own.
Set showsCameraControls to NO; you can then design your own preview screen (with the modified image).
It looks like there are issues with touches in the iOS 5 cameraOverlayView, so make sure the above solution works on iOS 5 as well :).
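For reference, a sketch of the picker setup that answer implies; makeOverlayView is a placeholder for whatever custom preview/crop UI you build:

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.showsCameraControls = NO;                     // suppresses Use/Retake and move-and-scale
    picker.allowsEditing = NO;                           // you provide your own move-and-scale UI
    picker.cameraOverlayView = [self makeOverlayView];   // your shutter button + preview image view
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];

After calling takePicture, apply your effects in imagePickerController:didFinishPickingMediaWithInfo: and hand the processed image to your own crop/scale view.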

Getting an image representation of the camera preview in UIImagePickerController

When I use the standard [view.layer renderInContext:UIGraphicsGetCurrentContext()]; method for turning views into images on the iPhone camera modal view, all I get is the controls with black where the camera preview was.
Anyone have a solution for this? Speed is not an issue.
Not possible with a documented API, but possible. Find the "PLCameraView" view in the camera's subviews, then call
CGImageRef img = (CGImageRef)[foundCameraView imageRef];
This will return a reference to the image that the camera holds.
You can do this with the new AVFoundation stuff in iOS 4.0.
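Sketched below is what that AVFoundation route might look like (iOS 4+): an AVCaptureVideoDataOutput hands you every preview frame directly, so nothing has to be scraped off the screen. The captureSession property name is a placeholder, and the object must adopt AVCaptureVideoDataOutputSampleBufferDelegate:

    #import <AVFoundation/AVFoundation.h>
    #import <CoreMedia/CoreMedia.h>
    #import <CoreVideo/CoreVideo.h>

    - (void)startCapture
    {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;

        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (input) [session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.videoSettings = [NSDictionary
            dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                          forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        // A dedicated serial queue is better in production; the main queue keeps the sketch short.
        [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
        [session addOutput:output];

        [session startRunning];
        self.captureSession = session;   // hypothetical property; keep the session alive
    }

    // Delivered once per frame; the buffer is the "preview image" you were after.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Read or convert pixelBuffer (e.g. to CGImage/UIImage) here.
    }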
You should be able to call UIGetScreenImage() (it returns a CGImageRef) to get a current capture of the whole screen, including the preview. That's how all the barcode apps worked before. But supposedly Apple is disallowing that now and only allows the AVFoundation technique, which only works on iOS 4.0 and later.
The whole reason there's even an issue is because UIGetScreenImage() is not part of the documented API, but Apple made a specific exception for using it. It's not like they are pulling current apps, but they are not allowing new submissions (or updates) that use the older technique.
There is some lobbying on behalf on a number of people to convince Apple to let app developers use the old technique for iOS 3.x only, so send an email to developer relations if you want to use it.
This isn't possible, unfortunately. For performance reasons, the iPhone uses some form of direct rendering to draw the camera preview directly onto the screen instead of as part of a UIView surface. As such, when you "capture" the camera view you will only get the UIView elements, not the preview image.
(FWIW, this is similar to the reason why it's difficult to screen-grab some movie software on Windows/Mac.)
You could try AVCaptureVideoPreviewLayer.
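If all you need is a preview you control (rather than a screenshot of Apple's), a minimal AVCaptureVideoPreviewLayer sketch looks like this, assuming session is a running AVCaptureSession such as the one set up above:

    #import <AVFoundation/AVFoundation.h>

    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:session];
    previewLayer.frame = self.view.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:previewLayer];

Note that the preview layer's video content generally won't show up in renderInContext: either; to actually get pixels, attach a video data output or still image output to the same session.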