I am curious about the new AVCapture... APIs in iOS.
Do they include a documented way to grab a screenshot of the camera preview? The documentation seems a bit confusing to me, and since it is out of NDA now, I thought I would post my question here.
Many thanks,
Brett
With AVFoundation you can grab photos from the camera session. The way it works is that you use one of the subclasses of AVCaptureOutput to get what you need. For still images you want the AVCaptureStillImageOutput subclass; here is a link to the AVCaptureStillImageOutput reference. Besides that you also have AVCaptureMovieFileOutput, which records a QuickTime movie from the capture session to a file, and AVCaptureVideoDataOutput, which lets you intercept uncompressed individual frames from the capture session. There are audio outputs you can use as well... hope this helps.
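For what it's worth, a minimal still-capture sketch (Objective-C, iOS 4-era API) might look like the following; `captureSession` is assumed to be an AVCaptureSession you have already configured and started, and the connection lookup uses the usual loop over the output's connections:

    #import <AVFoundation/AVFoundation.h>
    #import <UIKit/UIKit.h>

    // Attach a still-image output to the existing session.
    AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
    stillOutput.outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                                              forKey:AVVideoCodecKey];
    [captureSession addOutput:stillOutput];

    // Find the video connection on the still-image output.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillOutput.connections) {
        for (AVCaptureInputPort *port in connection.inputPorts) {
            if ([port.mediaType isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
    }

    // Grab one still frame asynchronously and turn it into a UIImage.
    [stillOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                              completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:jpegData];
            // Use the UIImage here (save it, show it in a UIImageView, etc.).
        }
    }];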
I'm making a photo gallery app. I use the photos as slides and show them one by one, with transitions done with Core Animation.
Now I need to turn this process into a video so that I can share it on YouTube, but I don't know where to start or which reference I should read.
To make a video on the iPhone you can use AVAssetWriter.
Also have a look at this post. You can capture your animation as images and make the video from them.
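To make that a bit more concrete, here is a rough sketch of the AVAssetWriter route, assuming you already have the slides as an NSArray of UIImages. The output size, the one-slide-per-second timing, and the pixelBufferFromImage() helper (which would draw the image's CGImage into a CVPixelBuffer via a CGBitmapContext) are illustrative assumptions, not part of any Apple sample:

    #import <AVFoundation/AVFoundation.h>

    NSError *error = nil;
    AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:outputURL   // where the .mov should go
                                                      fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:640], AVVideoWidthKey,
                                   [NSNumber numberWithInt:480], AVVideoHeightKey, nil];
    AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                   outputSettings:videoSettings];
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                                  sourcePixelBufferAttributes:nil];
    [writer addInput:input];

    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    // Append each slide as a frame with its own presentation time.
    for (NSUInteger i = 0; i < images.count; i++) {
        CVPixelBufferRef buffer = pixelBufferFromImage([images objectAtIndex:i]); // hypothetical helper
        CMTime presentationTime = CMTimeMake(i, 1);   // one slide per second
        while (!input.readyForMoreMediaData) { /* wait, or use requestMediaDataWhenReadyOnQueue: */ }
        [adaptor appendPixelBuffer:buffer withPresentationTime:presentationTime];
        CVPixelBufferRelease(buffer);
    }

    [input markAsFinished];
    [writer finishWriting];

In a real app you would drive the appends from requestMediaDataWhenReadyOnQueue:usingBlock: rather than spinning in a while loop, but the calls above show the shape of the API.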
The AVAsset family of Apple APIs is what is used to do this sort of thing. But be warned, it is not easy to use. If you would like to check out a working example Xcode project, see AVDecodeEncode. This example shows decoding from h.264 and then encoding the result back to h.264. You could write all your own code to do this, but just be aware that it is not for the faint of heart.
I want to make an app for the iPhone with the ability to record video. One thing I want to do with the video once it's recorded is take the audio from it and alter it, for example to make it sound feminine or masculine, etc. I've never done this before, so is it better to use AVFoundation or UIImagePickerController? I've read that UIImagePickerController is easier to use, but will it allow me to extract and edit the audio of the recorded video and put it back in?
Any help or suggestions as to how to approach this are appreciated.
Thanks
It is easiest to capture with UIImagePickerController if you do not need to alter the video live. Once the video is captured and you have altered the audio track somehow, write it out again using AVFoundation's AVMutableComposition and an audio mix.
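As a rough sketch of that second step (not a drop-in solution), assuming the captured movie is at sourceURL and your processed audio has already been written to processedAudioURL, the composition and audio mix could be assembled like this:

    #import <AVFoundation/AVFoundation.h>

    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:processedAudioURL options:nil];

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    // Lay the original video and the new audio side by side in the composition.
    CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    NSError *error = nil;
    [videoTrack insertTimeRange:fullRange
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:&error];
    [audioTrack insertTimeRange:fullRange
                        ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero error:&error];

    // An AVMutableAudioMix can adjust the volume of the replacement track if needed.
    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
    [params setVolume:1.0 atTime:kCMTimeZero];
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = [NSArray arrayWithObject:params];

    // Export the combined result.
    AVAssetExportSession *export =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetHighestQuality];
    export.audioMix = audioMix;
    export.outputURL = destinationURL;                 // where to write the finished movie
    export.outputFileType = AVFileTypeQuickTimeMovie;
    [export exportAsynchronouslyWithCompletionHandler:^{
        // check export.status / export.error here
    }];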
How to record audio and video using AVFoundation frame by frame in iOS 4?
The AVCamDemo you mention is close to what you need to do, and you should be able to use it as a reference. These are the classes you need in order to achieve what you are trying to do (all of them are part of AVFoundation):
AVCaptureVideoDataOutput and AVCaptureAudioDataOutput - use these classes to get raw samples from the video camera and the microphone.
Use AVAssetWriter and AVAssetWriterInput to encode the raw samples into a file. The following sample Mac OS X project shows how to use these classes (the sample should work for iOS too); however, it uses an AVAssetReader for input (it re-encodes a movie file) instead of the camera and microphone. In your case you can feed the outputs mentioned above into the writer inputs to write what you want.
That should be all you need in order to achieve what you want to do.
Here's a link showing how to use VideoDataOutput.
Hope it helps.
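A condensed sketch of that pipeline is below, with session/writer configuration and error handling omitted; the ivars session, writer, videoWriterInput, and audioWriterInput are assumed to be set up elsewhere (with AVAssetWriter and AVAssetWriterInput, as in Apple's sample), and the class adopts both sample-buffer delegate protocols:

    #import <AVFoundation/AVFoundation.h>

    // Attach the raw-sample outputs to the capture session; delegate callbacks run on 'queue'.
    - (void)attachDataOutputs
    {
        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        dispatch_queue_t queue = dispatch_queue_create("com.example.capture", NULL);
        [videoOutput setSampleBufferDelegate:self queue:queue];
        [audioOutput setSampleBufferDelegate:self queue:queue];
        [session addOutput:videoOutput];
        [session addOutput:audioOutput];
    }

    // Shared delegate callback for both the video and the audio data output.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Start the writer session at the timestamp of the first buffer we see.
        if (writer.status == AVAssetWriterStatusUnknown) {
            [writer startWriting];
            [writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        }
        // Route each buffer to the matching writer input.
        if (captureOutput == videoOutput && videoWriterInput.readyForMoreMediaData) {
            [videoWriterInput appendSampleBuffer:sampleBuffer];
        } else if (captureOutput == audioOutput && audioWriterInput.readyForMoreMediaData) {
            [audioWriterInput appendSampleBuffer:sampleBuffer];
        }
    }

When recording stops, call markAsFinished on both inputs and then finishWriting on the writer.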
If you are a registered developer, look at the videos from the 2011 WWDC (which you can find by searching in the developer portal). There are two sessions relating to AVFoundation. There was also some sample code from one of the WWDC sessions, which was extremely useful.
I am a newbie trying to capture camera video images using AVFoundation and want to render the captured frames without using AVCaptureVideoPreviewLayer. I want a slider control to be able to slow down or speed up the rate of display of camera images.
Using other people's code as examples, I can capture images, and with an NSTimer and my slider control I can define on the fly how often to display them, but I can't convert the image to something I can display. I want to move these images into a UIView or UIImageView and render them in the timer's fire function.
I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer frame rate, but that can't be done on the fly.)
I have looked at the AVFoundation Programming Guide, which talks about AVAssets and AVPlayer, etc., but I can't see how a camera image can be turned into an AVAsset. When I look at the AVFoundation guide, and other demos which show how to define an AVAsset, I only get choices of using HTTP stream data to create the asset, or a URL to define an asset from an existing file. I can't figure out how to make my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems and AVAssetTracks to show the image, with an observeValueForKeyPath function checking status and doing [myPlayer play]. (I also studied the WWDC Session 405, "Exploring AV Foundation," to see how that is done.)
I have tried code similar to that in the WWDC Session 409, "Using the Camera on iPhone." Like that myCone demo, I can set up the device, the input, the capture session, the output, and a callback that receives a CMSampleBuffer, and I can collect UIImages and size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer). This wasn't explained, and I guess it's just assuming a knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to stuff the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?
I have looked at UIImagePickerController as an alternate method of controlling how often I display captured camera images, and I don't see that I can change the display timing on the fly using that controller either.
So, as you can see, I am learning this stuff through the Apple developer forum and documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of going from camera to screen without using AVCaptureVideoPreviewLayer, UIImagePickerController, or an AVAsset that isn't already a file or HTTP stream.
Can anybody make a suggestion? Thanks in advance.
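(For reference, the piece that usually trips people up here is converting the CMSampleBuffer delivered to the data-output callback into a UIImage. Below is a sketch of the common approach, assuming the AVCaptureVideoDataOutput is configured to deliver kCVPixelFormatType_32BGRA; the ImageFromSampleBuffer name is just for illustration. The resulting image can then be assigned to the UIImageView's image property on the main thread from the timer's fire function.)

    #import <AVFoundation/AVFoundation.h>
    #import <CoreVideo/CoreVideo.h>
    #import <UIKit/UIKit.h>

    UIImage *ImageFromSampleBuffer(CMSampleBufferRef sampleBuffer)
    {
        // Get the pixel buffer and lock it so we can read its base address.
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width  = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Wrap the BGRA pixel data in a bitmap context and make a CGImage from it.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                     colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);

        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        return image;   // e.g. imageView.image = image; (set on the main thread)
    }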
Is there a way to have my iPhone program step frame by frame through a movie recorded by the iPhone? What I want to do is have the user record a QuickTime movie, then be able to step through the movie frame by frame.
Alternatively, I suppose if there were a way to extract every single frame from the movie to a JPEG, then I could easily step through the pictures. Anyone know of a way to do this?
I suppose the third option (which might not get past Apple's store) is to capture the movie the way the old jailbroken apps did, which is to somehow capture the pictures directly from the camera view.
Any help appreciated. Thanks in advance!
You cannot step through a movie frame by frame. That functionality does not exist in the public API.
You can include your own media decoder code (open source or not) and use that to parse your movies of course. It is perfectly fine to do that.