Filter live camera feed - iOS 5

So I've been using UIImagePickerController to access the camera for photo and video capture, and I wanted to apply filters to both sources. I succeeded with filtering the taken photos, but I'm having trouble finding a solution for the rest. All I need is access to the raw image data of the live feed the camera is showing, so I can apply the filter and then display the filtered frames instead. Any help or advice will be appreciated.

UIImagePickerController doesn't give you low-level access to the camera buffer.
You should set up an AVCaptureSession and use its delegate to process each CMSampleBufferRef.
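A minimal sketch of that setup might look something like this (assuming ARC and iOS 5's Core Image; captureSession, sepiaFilter, ciContext and imageView are illustrative ivar names, and error handling is omitted):

    // Rough sketch (not production code): live capture + Core Image filtering.
    // Assumes a class that adopts AVCaptureVideoDataOutputSampleBufferDelegate
    // with these illustrative ivars:
    //   AVCaptureSession *captureSession; CIFilter *sepiaFilter;
    //   CIContext *ciContext; UIImageView *imageView;
    #import <AVFoundation/AVFoundation.h>
    #import <CoreImage/CoreImage.h>

    - (void)startCapture
    {
        captureSession = [[AVCaptureSession alloc] init];
        captureSession.sessionPreset = AVCaptureSessionPreset640x480;

        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
        [captureSession addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        // BGRA frames are easy to hand to Core Image / Core Graphics.
        output.videoSettings = [NSDictionary dictionaryWithObject:
                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
            forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        // The main queue keeps the sketch short; a dedicated serial queue is better.
        [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
        [captureSession addOutput:output];

        sepiaFilter = [CIFilter filterWithName:@"CISepiaTone"];
        [sepiaFilter setValue:[NSNumber numberWithFloat:0.8f] forKey:@"inputIntensity"];
        ciContext = [CIContext contextWithOptions:nil];

        [captureSession startRunning];
    }

    // Called for every frame the camera produces.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        [sepiaFilter setValue:frame forKey:kCIInputImageKey];
        CGImageRef filtered = [ciContext createCGImage:[sepiaFilter outputImage]
                                               fromRect:[frame extent]];

        // Display the filtered frame instead of the raw feed (simple, not the fastest way).
        imageView.image = [UIImage imageWithCGImage:filtered];
        CGImageRelease(filtered);
    }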
Take a look at the AVCam and SquareCam demos from Apple; they give a good introduction to video capture.
http://developer.apple.com/library/ios/#samplecode/AVCam/Introduction/Intro.html
http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html
An easier solution is to use https://github.com/BradLarson/GPUImage
Thanks
Adam

Related

Capture video without displaying the actual video feed

So I have an application that can currently capture video with the front-facing iPhone camera and then do some processing on the video feed in real time. What I'm trying to do, however, is make this process run in the background and put other controls onscreen. So, for example, say I'd like to run the camera and process the image feed, but I want the user to see a black screen with some buttons on it. Any ideas on how to do this?
Just so we get terminology right, by "in the background", you mean running the camera capture while your application is in the foreground, but not displaying the actual video feed. This is possible, but I wanted to make clear that if you move your whole application into the background you will not have access to the camera then.
There are a few ways to do this, but the one that I've spent the most time with is grabbing frames of video (or photos) via AV Foundation. Using an AVCaptureDevice and AVCaptureSession, you can grab the frames of video and route them to an encoder for saving to disk or for processing using your own custom code. None of this requires the camera feed to be displayed onscreen, so you can put up whatever interface you like and do this video recording or photo capture without any onscreen indication.
I would caution that you should make it explicit to your users what you are doing, so that you do not run the risk of violating someone's privacy. Apple does not react kindly to those who do this (for good reason).
I encapsulate a lot of this within my open source GPUImage video and photo processing framework, so you could look at the code for the GPUImageVideoCamera class there to see how I configure the capture inputs. I hand the video frames off to OpenGL ES for the application of filters and other processing operations, but you could ignore that portion of it if you just wanted to do your own encoding or processing.
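Just as a rough illustration of the "route them to an encoder for saving to disk" part, here is a hedged sketch (assetWriter and writerInput are assumed ivars, error handling is omitted) that pushes the delegate's frames into an AVAssetWriter; no preview layer is ever created, so nothing camera-related appears on screen:

    // Sketch: encode incoming camera frames to a QuickTime file without
    // ever attaching an AVCaptureVideoPreviewLayer.
    // Assumed ivars: AVAssetWriter *assetWriter; AVAssetWriterInput *writerInput;
    - (void)setUpWriterWithURL:(NSURL *)outputURL size:(CGSize)size
    {
        assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                fileType:AVFileTypeQuickTimeMovie
                                                   error:NULL];

        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
            AVVideoCodecH264,                          AVVideoCodecKey,
            [NSNumber numberWithInt:(int)size.width],  AVVideoWidthKey,
            [NSNumber numberWithInt:(int)size.height], AVVideoHeightKey, nil];

        writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                         outputSettings:settings];
        writerInput.expectsMediaDataInRealTime = YES;
        [assetWriter addInput:writerInput];
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Start the writer's timeline at the first frame's timestamp.
        if (assetWriter.status == AVAssetWriterStatusUnknown) {
            [assetWriter startWriting];
            [assetWriter startSessionAtSourceTime:
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        }
        // The user never sees this feed; only whatever interface
        // (e.g. a black view with buttons) you put on screen.
        if (writerInput.readyForMoreMediaData) {
            [writerInput appendSampleBuffer:sampleBuffer];
        }
    }

    - (void)stopRecording
    {
        [writerInput markAsFinished];
        [assetWriter finishWriting]; // later iOS versions prefer the block-based variant
    }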
Here's example code from Apple's docs:
http://developer.apple.com/library/ios/#samplecode/PhotoPicker/Introduction/Intro.html
There is also a way to customize the camera interface.
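If you stay with UIImagePickerController, hiding the live preview behind your own controls can be done through its overlay support; a small hedged sketch (the black overlay and the button are placeholders):

    // Sketch: UIImagePickerController whose preview is completely covered,
    // so the user only sees a black screen with a button.
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.showsCameraControls = NO;                 // hide the default shutter/flash UI

    UIView *overlay = [[UIView alloc] initWithFrame:picker.view.bounds];
    overlay.backgroundColor = [UIColor blackColor];  // hides the preview entirely

    UIButton *button = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    button.frame = CGRectMake(20.0f, 40.0f, 160.0f, 44.0f);
    [button setTitle:@"Do something" forState:UIControlStateNormal];
    [overlay addSubview:button];

    picker.cameraOverlayView = overlay;
    [self presentViewController:picker animated:YES completion:NULL];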

How do I create an AVAsset with a UIImage captured from a camera?

I am a newbie trying to capture camera video images using AVFoundation, and I want to render the captured frames without using AVCaptureVideoPreviewLayer. I want a slider control to be able to slow down or speed up the rate at which camera images are displayed.
Using other people's code as examples, I can capture images and, using an NSTimer driven by my slider control, define on the fly how often to display them, but I can't convert the image into something I can display. I want to move these images into a UIView or UIImageView and render them in the timer's fire function.
I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer's frame rate, but that can't be done on the fly.)
I have looked at the AVFoundation Programming Guide, which talks about AVAssets and AVPlayer, etc., but I can't see how a camera image can be turned into an AVAsset. When I look at the AVFoundation guide and other demos that show how to define an AVAsset, they only give me the choice of using HTTP stream data to create the asset, or a URL to define an asset from an existing file. I can't figure out how to make my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems, and AVAssetTracks to show the image, with an observeValueForKeyPath function checking status and doing [myPlayer play]. (I also studied WWDC session 405, "Exploring AV Foundation", to see how that is done.)
I have tried code similar to that in WWDC session 409, "Using the Camera on iPhone." Like that myCone demo, I can set up the device, the input, the capture session, the output, and a callback that receives a CMSampleBuffer, and I can collect UIImages and size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer). This wasn't explained, and I guess it just assumes a knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to stuff the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?
I have looked at UIImagePickerController as an alternate method of controlling how often I display captured camera images, and I don't see that I can change the display interval on the fly using that controller either.
So, as you can see, I am learning this stuff from the Apple developer forum and documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of going from camera to screen without using AVCaptureVideoPreviewLayer, UIImagePickerController, or an AVAsset that isn't already a file or HTTP stream.
Can anybody make a suggestion? Thanks in advance.
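Not an authoritative answer, but the piece that seems to be missing is converting each CMSampleBuffer into a UIImage and then setting it on the UIImageView on the main thread; two common reasons "nothing gets displayed" are a pixel format other than BGRA and touching UIKit from the capture queue. A hedged sketch (latestImage and imageView are illustrative names; the AVCaptureVideoDataOutput must be configured for kCVPixelFormatType_32BGRA):

    // Sketch: convert a BGRA CMSampleBuffer to a UIImage, then let an
    // NSTimer (driven by the slider) decide when to show the latest frame.
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        void *baseAddress  = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width       = CVPixelBufferGetWidth(imageBuffer);
        size_t height      = CVPixelBufferGetHeight(imageBuffer);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
            bytesPerRow, colorSpace,
            kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);
        UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        return image;
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        @synchronized (self) { latestImage = image; }   // keep only the newest frame
    }

    - (void)timerFired:(NSTimer *)timer
    {
        // UIKit must be driven from the main thread; schedule the timer there.
        @synchronized (self) { imageView.image = latestImage; }
    }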

How to get raw image data (the RGB value of each pixel) from the iPhone camera in video capture mode?

I have to make an iPhone project that can process video data in real time. The app has to be able to recognize the color of an object in the video frame. After searching for information about video processing in iOS, I found that I can use the AVFoundation framework to achieve this, but I don't know which APIs or functions of AVFoundation are able to do this video processing task.
Can anyone suggest which functions to use to get image frames or raw image data out of a video stream in real time? I'd appreciate it if you could give me some example code.
Thank you very much for helping me...
You can first of all make use of the AVAsset class by initializing it with your video file's URL.
You can then use an AVAssetReader object to obtain the media data of that asset.
This will let you obtain video frames, which you can read using an AVAssetReaderVideoCompositionOutput object. Accessing the RGB channel data from those frames involves the CGImage classes and their methods.
Hope this helps you get started.
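Whichever route delivers the frames (a live AVCaptureVideoDataOutput callback or an AVAssetReader output), the per-pixel RGB values can also be read straight out of the CVPixelBufferRef, provided the output was configured for kCVPixelFormatType_32BGRA. A hedged sketch:

    // Sketch: walk the BGRA bytes of one frame.
    // In a capture callback, get the buffer with
    // CMSampleBufferGetImageBuffer(sampleBuffer); the result of an asset
    // reader's copyNextSampleBuffer works the same way.
    void InspectPixels(CVPixelBufferRef pixelBuffer)
    {
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        uint8_t *base      = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t width       = CVPixelBufferGetWidth(pixelBuffer);
        size_t height      = CVPixelBufferGetHeight(pixelBuffer);

        for (size_t y = 0; y < height; y++) {
            uint8_t *row = base + y * bytesPerRow;
            for (size_t x = 0; x < width; x++) {
                uint8_t blue  = row[x * 4 + 0];
                uint8_t green = row[x * 4 + 1];
                uint8_t red   = row[x * 4 + 2];
                // ...compare (red, green, blue) against the color you want to recognize...
            }
        }

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }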

Camera differences between UIImagePickerController and AVCaptureSession on iPhone

I'm trying to build a replacement for UIImagePickerController, using AVCaptureSession with AVCaptureDeviceInput and AVCaptureStillImageOutput, as input/output respectively.
To preview the camera stream I'm using AVCaptureVideoPreviewLayer.
It's now working correctly for capturing and storing photos just like the default camera.
However, I found 3 problems I was unable to solve:
captured photos don't have the same quality the default camera provides
the viewing/capture angle is narrower, just like when using video capture on the default camera
there is no way to control camera-specific options like flash
Is there any way to get to the level of UIImagePickerController using a more customizable approach (i.e. AVFoundation or any other)?
Check out "Session 409 - Using the Camera with AV Foundation" in the WWDC 2010 videos. Based on the video, it looks like you can resolve all three of your issues with AVFoundation.
Hope this helps!
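For what it's worth, two of those issues map to specific AVFoundation settings covered in that session: the lower quality and narrower angle usually come from using a video preset instead of AVCaptureSessionPresetPhoto, and flash is configured on the AVCaptureDevice itself. A hedged sketch:

    // Sketch: full-resolution stills and flash control with AVFoundation.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    // The photo preset gives full sensor resolution and the still-camera
    // field of view, instead of the cropped video formats.
    session.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Flash (like torch, focus and exposure) is set on the device,
    // bracketed by lockForConfiguration / unlockForConfiguration.
    if ([camera hasFlash] && [camera isFlashModeSupported:AVCaptureFlashModeAuto]) {
        NSError *error = nil;
        if ([camera lockForConfiguration:&error]) {
            camera.flashMode = AVCaptureFlashModeAuto;
            [camera unlockForConfiguration];
        }
    }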

iPhone video capture (best method)

I am curious about the new iOS APIs for the iPhone: AVCapture...
Does this include a documented way to grab a screenshot of the camera preview? The doc seems a bit confusing to me, and since it is out of NDA now, I thought I would post my question here.
Many thanks,
Brett
With AVFoundation you can grab photos from the camera session. The way it works is that you use one of the subclasses of AVCaptureOutput to get what you need; for still images you are going to want the AVCaptureStillImageOutput subclass (here is a link: AVCaptureStillImageOutput ref). Besides that, you also have AVCaptureMovieFileOutput, which records a QuickTime movie from the capture session to a file, and AVCaptureVideoDataOutput, which allows you to intercept uncompressed individual frames from the capture session. There are audio outputs you can use as well... hope this helps.
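For the still-image case specifically, the call that grabs a frame of what the camera currently sees looks roughly like this (a sketch; stillImageOutput is assumed to be an AVCaptureStillImageOutput already added to a running session, and error handling is omitted):

    // Sketch: capture one still frame from a running AVCaptureSession.
    AVCaptureConnection *connection =
        [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                   completionHandler:
        ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            if (imageSampleBuffer == NULL) {
                return;                             // capture failed; inspect `error`
            }
            NSData *jpegData = [AVCaptureStillImageOutput
                                jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *image = [UIImage imageWithData:jpegData];
            // use `image` or save `jpegData` as needed
        }];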