Creating a video from the visible window or view - iPhone

I have a requirement to record video from the visible view, i.e. not from the camera, like in the TalkingTom application. Can anyone suggest a solution?

You can take screenshots (using UIGetScreenImage) and store the images in an array.
Then encode them into a video with an MPEG encoder.
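Below is a minimal sketch of one way to do that second step: encoding an array of same-sized UIImages into a movie with AVAssetWriter (the class mentioned in a later answer). The function names, the fixed frame rate, and the busy-wait are illustrative assumptions rather than production code.

    // Minimal sketch (assuming ARC): encode an array of same-sized UIImages into a
    // QuickTime movie with AVAssetWriter. Error handling is omitted.
    #import <UIKit/UIKit.h>
    #import <AVFoundation/AVFoundation.h>

    static CVPixelBufferRef PixelBufferFromImage(UIImage *image, CGSize size)
    {
        NSDictionary *attrs = @{ (__bridge id)kCVPixelBufferCGImageCompatibilityKey : @YES,
                                 (__bridge id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES };
        CVPixelBufferRef buffer = NULL;
        CVPixelBufferCreate(kCFAllocatorDefault, (size_t)size.width, (size_t)size.height,
                            kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)attrs, &buffer);

        CVPixelBufferLockBaseAddress(buffer, 0);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                                     (size_t)size.width, (size_t)size.height, 8,
                                                     CVPixelBufferGetBytesPerRow(buffer),
                                                     colorSpace, kCGImageAlphaNoneSkipFirst);
        CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image.CGImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(buffer, 0);
        return buffer;  // caller releases
    }

    void WriteFramesToMovie(NSArray *frames, NSURL *outputURL, int32_t fps)
    {
        UIImage *first = frames[0];
        CGSize size = first.size;

        AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                         fileType:AVFileTypeQuickTimeMovie
                                                            error:nil];
        AVAssetWriterInput *input =
            [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                               outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                                                 AVVideoWidthKey  : @(size.width),
                                                                 AVVideoHeightKey : @(size.height) }];
        AVAssetWriterInputPixelBufferAdaptor *adaptor =
            [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                                             sourcePixelBufferAttributes:nil];
        [writer addInput:input];
        [writer startWriting];
        [writer startSessionAtSourceTime:kCMTimeZero];

        for (NSUInteger i = 0; i < frames.count; i++) {
            while (!input.readyForMoreMediaData) { /* a real app would wait properly */ }
            CVPixelBufferRef buffer = PixelBufferFromImage(frames[i], size);
            [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake((int64_t)i, fps)];
            CVPixelBufferRelease(buffer);
        }

        [input markAsFinished];
        [writer finishWriting];  // newer SDKs prefer finishWritingWithCompletionHandler:
    }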

Related

Recording video with option of manipulating the pixels before writing to file

I know that I can access raw video images from the iPhone's camera with AVCaptureVideoDataOutput. I also know that I can record video to a file with AVCaptureMovieFileOutput. But how can I first access the raw video images, manipulate them, and then write the manipulated ones into the video file? I've already seen apps in the App Store that do this, so it must be possible.
OK, I now know that it's done with AVAssetWriter.
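For reference, here is a rough sketch of what that pipeline can look like: raw frames arrive from AVCaptureVideoDataOutput, the pixels are modified in place, and the buffer is appended through an AVAssetWriterInputPixelBufferAdaptor. The class name, property names and the blue-channel tweak are assumptions for illustration; the writer, input and adaptor are assumed to be configured elsewhere (e.g. as in the earlier sketch), with startSessionAtSourceTime: called using the first frame's timestamp.

    // Rough sketch (assuming ARC) of the capture-and-write side of the pipeline.
    #import <AVFoundation/AVFoundation.h>

    @interface FrameRecorder : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    @property (nonatomic, strong) AVAssetWriterInput *writerInput;                // assumed configured elsewhere
    @property (nonatomic, strong) AVAssetWriterInputPixelBufferAdaptor *adaptor;  // assumed configured elsewhere
    @end

    @implementation FrameRecorder

    - (void)startCapture
    {
        self.session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        [self.session addInput:[AVCaptureDeviceInput deviceInputWithDevice:camera error:nil]];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.videoSettings = @{ (__bridge id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
        [output setSampleBufferDelegate:self queue:dispatch_queue_create("frames", NULL)];
        [self.session addOutput:output];
        [self.session startRunning];
    }

    // Called once per captured frame, on the queue passed above.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        // Manipulate the raw BGRA bytes; here we simply zero the blue channel.
        uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        for (size_t y = 0; y < height; y++) {
            uint8_t *row = base + y * bytesPerRow;
            for (size_t x = 0; x < width; x++) {
                row[x * 4] = 0;  // B of BGRA
            }
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        // Append the modified buffer, stamped with its original capture time.
        if (self.writerInput.readyForMoreMediaData) {
            [self.adaptor appendPixelBuffer:pixelBuffer
                       withPresentationTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        }
    }

    @end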

How do I create an AVAsset with a UIImage captured from a camera?

I am a newbie trying to capture camera video images using AVFoundation, and I want to render the captured frames without using AVCaptureVideoPreviewLayer. I want a slider control to be able to slow down or speed up the rate at which camera images are displayed.
Using other people's code as examples, I can capture images and, using an NSTimer driven by my slider control, define on the fly how often to display them, but I can't convert the image into something I can display. I want to move these images into a UIView or UIImageView and render them in the timer's fire function.
I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer's frame rate, but that can't be done on the fly.)
I have looked at the AVFoundation Programming Guide, which talks about AVAssets and AVPlayer, etc., but I can't see how a camera image can be turned into an AVAsset. The guide, and the other demos that show how to define an AVAsset, only give me the choice of creating the asset from HTTP stream data or from a URL pointing at an existing file. I can't figure out how to turn my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems and AVAssetTracks to show the image, with an observeValueForKeyPath function checking status and doing [myPlayer play]. (I also studied WWDC session 405, "Exploring AV Foundation", to see how that is done.)
I have tried code similar to WWDC session 409, "Using the Camera on iPhone." Like that myCone demo, I can set up the device, the input, the capture session, the output, and the callback that delivers a CMSampleBuffer, and I can collect UIImages and size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer); this wasn't explained, and I guess it just assumes a knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to stuff the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?
I have looked at UIImagePickerController as an alternative way of controlling how often captured camera images are displayed, but I don't see that I can change the display interval on the fly with that controller either.
So, as you can see, I am learning this stuff from the Apple developer forums and documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of getting camera images to the screen without using AVCaptureVideoPreviewLayer, UIImagePickerController, or an AVAsset that isn't already a file or HTTP stream.
Can anybody make a suggestion? Thanks in advance.
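Not an authoritative answer, but a sketch of the missing step the question describes: converting the CMSampleBuffer delivered to the data-output callback into a UIImage and pushing it into the IBOutlet UIImageView. The method and property names are assumptions, and it assumes the output is configured for BGRA frames. One common reason "nothing gets displayed" is that the delegate callback runs on a background queue, so the view must be updated on the main thread.

    // Sketch: these methods are assumed to live in the class that acts as the
    // AVCaptureVideoDataOutputSampleBufferDelegate and owns the IBOutlet
    // UIImageView (self.imageView). The video data output is assumed to be
    // configured for kCVPixelFormatType_32BGRA.
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                     CVPixelBufferGetWidth(pixelBuffer),
                                                     CVPixelBufferGetHeight(pixelBuffer),
                                                     8,
                                                     CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                     colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);
        UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        return image;
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        // The delegate runs on a background queue; UIKit views must be touched
        // on the main thread, which is one common reason nothing shows up.
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = image;  // or stash it and let the NSTimer display it
        });
    }

Instead of setting the image directly in the callback, the latest UIImage could be stored in a property and displayed from the NSTimer's fire method, which keeps the slider-controlled display rate independent of the capture rate.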

iPhone: get image from camera without UIImagePickerController

Please help me with my question.
Is there any way to get an image from the camera without UIImagePickerController?
I need to render the current image (from the camera) into an image on my view and update it on a timer.
Maybe AVCaptureStillImageOutput? I didn't find any examples.
Any ideas?
Yes, you can do it easily using the AVCamCaptureManager and AVCamRecorder classes. Apple has a demo program, named AVCam, on its developer site. In simple terms, when you tap to open the camera it calls the classes and methods responsible for opening the iPhone's camera and recording video or capturing audio; these are the same classes that UIImagePickerController calls.
I hope it helps.
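Since the question already mentions AVCaptureStillImageOutput, here is a small sketch, with assumed property names, of one way to grab a still from an already-running AVCaptureSession each time a timer fires (the alternative is a video data output delegate, as in the earlier questions):

    // Sketch with assumed property names: self.stillOutput is an
    // AVCaptureStillImageOutput already added to a running AVCaptureSession, and
    // self.imageView is the view to update each time the timer fires.
    - (void)timerFired:(NSTimer *)timer
    {
        AVCaptureConnection *connection =
            [self.stillOutput connectionWithMediaType:AVMediaTypeVideo];

        [self.stillOutput captureStillImageAsynchronouslyFromConnection:connection
            completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                if (imageDataSampleBuffer == NULL) {
                    return;  // e.g. the session was interrupted
                }
                NSData *jpeg = [AVCaptureStillImageOutput
                    jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [UIImage imageWithData:jpeg];
                dispatch_async(dispatch_get_main_queue(), ^{
                    self.imageView.image = image;  // update UIKit on the main thread
                });
            }];
    }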

Record Camera from iPhone and use as overlay in another video

I know how to record a video from the camera on the iPhone. My question is: is it possible to take the recording, overlay it on a saved video, and save it out as another file?
No, I don't think so, at least not using a standard framework. You could probably do something involving screen capture and combining a load of images to make a video, but it would be complicated.

How to step frames in a movie recorded by iPhone

Is there a way to have my iPhone program step frame by frame through a movie recorded by the iPhone? What I want to do is have the user record a QuickTime movie, then be able to step through the movie frame by frame.
Alternately, I suppose if there were a way to extract every single frame from the movie to a JPEG, then I could easily step through the pictures. Does anyone know of a way to do this?
I suppose the third option (which might not get past Apple's store) is to capture the movie the way the old jailbroken apps did, which is to somehow capture the pictures directly from the camera view.
Any help appreciated. Thanks in advance!
You cannot step through a movie frame by frame. That functionality does not exist in the public API.
You can include your own media decoder code (open source or not) and use that to parse your movies of course. It is perfectly fine to do that.