iPhone camera shutter issue

Basically, I am trying to use the camera, but I want to upload a picture as soon as the camera button is pressed, without the user touching the Use button.
Because of that, the camera delegate method never gets called. Instead, I am trying to capture the screen to get the image, using:
UIGetScreenImage()
This seems unorthodox but does the trick. My issue is that sometimes I get the shutter image instead. Is there a delegate method that is called when the shutter animation is complete?
If so, any help is more than welcome. Thanks.

Use the still-camera support from Brad Larson's GPUImage framework: https://github.com/BradLarson/GPUImage

Related

How can I animate a Camera Iris animation using CAFilter?

I am developing a simple camera app in which I need to animate a camera shutter opening/closing. I googled CAFilter and CATransition but got confused: how would they help me animate? For example, suppose I have a view called view and a method declared in my interface as:
-(void)pressed;
How can I implement an animation on my view using CAFilter? Can anyone give me another example of this? Maybe rotating a view 360 degrees on the press of a button?
Asked and answered elsewhere on this forum. Here are some references for you:
This uses CATransition:
Shutter animation AVFoundation iphone
That one uses @"cameraIris", which is an undocumented Apple transition type, so apps using it might be rejected. But it sure looks good.
There's also this 26 MB movie:
http://www.juicybitssoftware.com/2009/08/31/iphone-camera-shutter-animation/
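For reference, a minimal sketch of the CATransition approach (the duration and the method body are my own choices; @"cameraIris" is the undocumented transition type mentioned above, with kCATransitionFade noted as a documented fallback):

```objectivec
#import <QuartzCore/QuartzCore.h>

// Sketch: run a camera-iris style transition on a view's layer when a button is pressed.
// @"cameraIris" is undocumented and may cause App Store rejection;
// kCATransitionFade is a safe, documented alternative.
- (void)pressed {
    CATransition *shutter = [CATransition animation];
    shutter.type = @"cameraIris";
    shutter.duration = 0.6;
    shutter.timingFunction = [CAMediaTimingFunction
        functionWithName:kCAMediaTimingFunctionEaseInEaseOut];
    [self.view.layer addAnimation:shutter forKey:@"shutter"];
}
```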

How to get the color of the tapped point on the iPhone's camera input in real time?

I want to get the color of the point the user is panning over on the image coming from the camera, and this needs to happen in real time.
I'm using the UIImagePickerController class with its sourceType property set to UIImagePickerControllerSourceTypeCamera.
So the user opens the video camera, and after the iris opens, he can tap on it.
While he is panning over the camera view, I want the application to show the color of the point under his finger, in real time.
Could someone please tell me whether this is possible and how to do it?
I tried to use the code from here:
http://developer.apple.com/library/ios/#qa/qa1702/_index.html
First, I get a warning:
warning: 'CameraViewController' may not respond to '-setSession:'
Then I get a lot of errors when trying to compile. I included this in the .h file:
#import <AVFoundation/AVFoundation.h>
Do I have to include more than this?
Also do I still need to use the UIImagePickerController to show the camera?
I'm new to iOS and very confused with this.
OK, I did it using the example from http://developer.apple.com/library/ios/#qa/qa1702/_index.html
The problem I have now is that it only works on the iPhone itself. On the simulator I still get those errors about the frameworks not being recognized.
You will have to use AVFoundation for this. An AVCaptureSession can deliver live video frames to AVCaptureVideoDataOutput's delegate method captureOutput:didOutputSampleBuffer:fromConnection: where you then have to analyze each frame to determine the color at a particular position.
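A rough sketch of that delegate method, assuming the video output is configured for kCVPixelFormatType_32BGRA and that tapPoint and colorSwatch are hypothetical properties (the touch location already converted to pixel coordinates, and a view for displaying the result):

```objectivec
#import <AVFoundation/AVFoundation.h>

// Sketch: sample the BGRA pixel under the finger from each live video frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // Read the four BGRA bytes at the tapped point.
    uint8_t *p = base + (size_t)self.tapPoint.y * bytesPerRow
                      + (size_t)self.tapPoint.x * 4;
    uint8_t blue = p[0], green = p[1], red = p[2];

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    dispatch_async(dispatch_get_main_queue(), ^{
        // Update the UI with the sampled color on the main thread.
        self.colorSwatch.backgroundColor =
            [UIColor colorWithRed:red / 255.0 green:green / 255.0
                             blue:blue / 255.0 alpha:1.0];
    });
}
```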

How to modify in real time the video stream from the iPhone camera?

Every time the iPhone's camera captures a new image, I want to modify it and then display it on the iPhone screen. In other words: how do I modify the video stream from the iPhone camera in real time?
I need a method that is called every time a new image comes from the camera.
Thanks for your help! :-)
EDIT: what I want to do is like augmented reality: while I'm shooting video, every frame is modified and shown in real time on the iPhone screen.
You could capture an image with the UIImagePickerController's takePicture method and draw your modified version to the cameraOverlayView.
You get the picture recorded as a result of the takePicture message in the UIImagePickerController delegate's imagePickerController:didFinishPickingMediaWithInfo: method. From the dictionary supplied to that method, you can get the original image, which you modify and draw to the overlay.
Here is an example for using the cameraOverlayView. You should be able to re-use the captured image from your delegate for drawing your overlay view.
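A sketch of that flow (applyMyFilterTo: and overlayImageView are hypothetical names I've made up, not API):

```objectivec
// Sketch: receive the picture from takePicture, modify it, and show the
// result in an image view that lives inside the picker's cameraOverlayView.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *original = [info objectForKey:UIImagePickerControllerOriginalImage];
    UIImage *modified = [self applyMyFilterTo:original]; // hypothetical filter method
    self.overlayImageView.image = modified;              // assumed subview of cameraOverlayView
    [picker takePicture];                                // grab the next frame
}
```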
Many augmented reality apps do not actually modify the image but just overlay information based on what they think is on the screen from input from the accelerometer and compass. If this is the kind of AR you are looking to do then try looking at ARKit.
You cannot process the camera's image data in real time. There is no API to do this. File a request with Apple using their bug tracker. Many of us have done this already; more requests might lead to it becoming possible.
Oh, yes, just use an overlay view then. You were talking about modifying the video stream, which is clearly not needed.

Freeze or pause iPhone camera image

In the iPhone's built-in Camera application (OS 3.1), touching the shutter button shows an iris animation, then displays the image that was taken for a second or so before animating it away.
Is anyone aware of a simple way to get this "brief pause" activity? Or do I have to resort to manually adding the image as part of my custom cameraOverlayView?
Bonus points for the iris animation too (without interfering with said custom overlay).
Ultimately, I ended up using UIGetScreenImage() (which is now officially blessed for use by Apple) and pushing that image to a previously hidden UIImageView.
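That ended up looking roughly like this (frozenView is an assumed, initially hidden UIImageView layered over the camera preview; the availability of UIGetScreenImage() has varied across SDK releases):

```objectivec
// Sketch: freeze the live preview by grabbing the screen and showing the
// result in a previously hidden UIImageView.
CGImageRef screen = UIGetScreenImage();
UIImage *frozen = [UIImage imageWithCGImage:screen];
CGImageRelease(screen);

self.frozenView.image = frozen;
self.frozenView.hidden = NO;   // reveal the frozen frame over the preview
```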
Meh.
Did you try releasing the image picker controller after your code is done?

Taking images from camera without user interaction?

I am creating an app in which, as soon as the UIImagePickerController loads (i.e. the camera view), it should start taking pictures without any tap and store the images in an array. How can I do this without the user pressing the "shoot" button?
In the reference library, UIImagePickerController has an instance method, -takePicture. Can somebody tell me whether this method will do the trick if I call it from a timer?
Thanks in advance.
-takePicture should indeed do the trick. You have to provide a custom UI for the camera controls, because otherwise (for me) it doesn't work. Check the developer documentation in Xcode and search for takePicture; the method description has everything you need.
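A sketch of the timer approach (picker, captureTimer, and capturedImages are assumed properties, and the 2-second interval is arbitrary):

```objectivec
// Sketch: hide the default controls, then fire -takePicture on a repeating timer.
- (void)startCapturing {
    self.picker.showsCameraControls = NO;   // takePicture requires custom controls
    self.captureTimer = [NSTimer scheduledTimerWithTimeInterval:2.0
                                                         target:self
                                                       selector:@selector(snap:)
                                                       userInfo:nil
                                                        repeats:YES];
}

- (void)snap:(NSTimer *)timer {
    [self.picker takePicture];   // image arrives via the delegate callback below
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Store each captured image in an array.
    [self.capturedImages addObject:
        [info objectForKey:UIImagePickerControllerOriginalImage]];
}
```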