Extract frames from video - iPhone

I am working on an app where I let the user shoot a video, then capture a frame of the video every second and write the images to the documents directory. I compiled the iFrameExtractor project, which works fine on the simulator but shows errors when running on a device. Any suggestions?

You could simply use the thumbnailImageAtTime:timeOption: method of the MPMoviePlayerController class.
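A rough sketch of that approach (assuming videoURL is a file URL for the recorded movie, and manual reference counting as in the SDKs of that era) might look like this:

    #import <MediaPlayer/MediaPlayer.h>

    // Rough sketch: grab a UIImage roughly every second and write it to the
    // Documents directory. `videoURL` is assumed to be a file URL for the movie.
    - (void)writeFramesForVideoAtURL:(NSURL *)videoURL
    {
        MPMoviePlayerController *player =
            [[MPMoviePlayerController alloc] initWithContentURL:videoURL];

        NSString *documentsDir =
            [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                 NSUserDomainMask, YES) lastObject];

        // Note: duration can be 0 until the movie's metadata has loaded
        // (listen for MPMovieDurationAvailableNotification in that case).
        NSTimeInterval duration = player.duration;
        for (NSTimeInterval t = 0; t < duration; t += 1.0) {
            UIImage *frame = [player thumbnailImageAtTime:t
                                               timeOption:MPMovieTimeOptionNearestKeyFrame];
            if (frame) {
                NSString *path = [documentsDir stringByAppendingPathComponent:
                                     [NSString stringWithFormat:@"frame_%.0f.png", t]];
                [UIImagePNGRepresentation(frame) writeToFile:path atomically:YES];
            }
        }
        [player release]; // assuming manual reference counting
    }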

Related

iOS 6 in-app camera

I'm building an app that allows the user to record a video (in-app) by pressing a button on the main screen. I don't want the user to be taken to the Photos app, because the video will only be viewable inside the app (max of 15 seconds), and I can't quite get it working. Does anyone have code to do this? A good example of what I want the camera to do is the camera in the Cinemagram app. Thanks for any help.
If you plan on saving the movie to the user's photo library, then you can use UIImagePickerController. In particular, you should read the guide that accompanies the class.
However, if you only want the video to be temporary, then you will probably want to use AVFoundation. You would then need to configure an AVCaptureSession with an AVCaptureMovieFileOutput to write the video to disk. Then, when you are ready to play the video, create an AVURLAsset with the file URL you just wrote, use that to create an AVPlayer, and add an AVPlayerLayer backed by that player to your view to display the video.
Either way, I would recommend studying the examples that Apple provides. AVCam and AVPlayerDemo should be more than enough to get you started (especially the AVCam example project).
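A minimal sketch of that AVFoundation route (recording first, playback second; outputURL is a placeholder and error handling is omitted):

    #import <AVFoundation/AVFoundation.h>

    // Recording: a bare-bones capture session that writes video to a temp file.
    // `self` is assumed to adopt AVCaptureFileOutputRecordingDelegate.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
    [session addInput:input];

    AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];
    [session addOutput:output];
    [session startRunning];

    NSURL *outputURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"]];
    [output startRecordingToOutputFileURL:outputURL recordingDelegate:self];
    // ...later, call [output stopRecording]; the delegate is notified when
    // the file has been finished.

    // Playback: wrap the recorded file in an AVPlayer and show it with a layer.
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:outputURL options:nil];
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:playerLayer];
    [player play];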

Using ffmpeg to extract a frame from an iPhone camera video

I am trying to extract an image frame from a video taken with the iPhone camera using ffmpeg, but it usually throws an EXC_BAD_ACCESS, and the stack trace points to method calls that are never made (I know my code didn't call them).
I am using the ffmpeg build from the instructions on the iFrameExtractor website. If anybody has done this successfully, please help me or, if possible, send me some code. I don't know why it crashes, although it works well on the simulator (where I manually imported a video into the library). My guess is that ffmpeg cannot decode video from the iPhone camera correctly.
I already tried all 3 sets of library files (armv6, armv7 and i386), but it doesn't work. My iPhone is a 3GS and my iPhone SDK is 3.1.3.
I think it was my fault in how I called the VideoFrameExtractor. The example code doesn't work as written; I had to change videoExtractor.currentImage to [videoExtractor currentImage].
Why would you use ffmpeg? You can extract frames using the AVFoundation framework in iOS 4; it's faster and easier to use.
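The answer doesn't name a specific class, but AVAssetImageGenerator (added in iOS 4) is one way to do this. A rough sketch, assuming videoURL is a file URL for the captured movie:

    #import <AVFoundation/AVFoundation.h>

    // Sketch: pull roughly one frame per second out of a movie file with
    // AVAssetImageGenerator.
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVAssetImageGenerator *generator =
        [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES; // respect video orientation

    Float64 seconds = CMTimeGetSeconds(asset.duration);
    for (Float64 t = 0; t < seconds; t += 1.0) {
        CMTime time = CMTimeMakeWithSeconds(t, 600);
        NSError *error = nil;
        CGImageRef cgImage = [generator copyCGImageAtTime:time
                                               actualTime:NULL
                                                    error:&error];
        if (cgImage) {
            UIImage *frame = [UIImage imageWithCGImage:cgImage];
            // ...write `frame` out or hand it to your UI...
            CGImageRelease(cgImage);
        }
    }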
Can you paste in your stack trace and possibly the code you are using to read the frames?

How to record a video automatically in an iPhone app without user interaction

I am working on an iPhone app that needs to record a video automatically.
I used the Mobile Core Services framework and, with that, got the app into video mode; tapping the record option starts capturing a video. But I want this to happen automatically, i.e. I should be able to record a video without tapping the record option: when the video mode comes up, it should start recording on its own.
Could anyone help?
You can look at UIImagePickerController's startVideoCapture method, which is used to start taking video from the camera. It is meant to be used when you aren't using the standard camera controls and provide your own overlay view. Here is a reference: the UIImagePickerController class reference. If this is not enough for you, you might want to look into the AVFoundation framework, which gives you a lot more control over the video capturing process. Hope that helps.
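A rough sketch of that approach (myOverlayView is a placeholder for your own overlay, and self is assumed to adopt the picker's delegate protocols); the short delay before startVideoCapture is a common workaround, since the call can fail if the camera isn't ready yet:

    #import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeMovie

    // Present the camera in video mode with no standard controls, then start
    // recording programmatically. `self` is assumed to adopt
    // UINavigationControllerDelegate and UIImagePickerControllerDelegate.
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.showsCameraControls = NO;            // hide the record button etc.
    picker.cameraOverlayView = myOverlayView;   // your own (possibly empty) overlay
    picker.delegate = self;

    [self presentModalViewController:picker animated:YES];

    // Give the camera a moment to come up, then start recording automatically.
    [picker performSelector:@selector(startVideoCapture)
                 withObject:nil
                 afterDelay:1.0];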

Cannot play a recorded sound on device

I'm using the exact code from the iPhone Application Programming Guide's Multimedia Support chapter: AVAudioRecorder to record a file to disk and then AVAudioPlayer to load and play that file.
This works fine in the simulator but not on the device. The file gets loaded (we can see its NSTimeInterval) but does not play (play returns NO).
After it didn't work with the sample code from the documentation, we tried switching to a bunch of different codecs with no success. And of course, the sound is on.
Thanks a bunch.
SOLVED!
The problem was that after setting the AVAudioSession category to Record for recording, you have to set it to Playback before playing. Sounds obvious now, but you know how it is. The Simulator uses a different underlying audio implementation, so this problem does not show up there.
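In code, the fix looks roughly like this (error handling trimmed):

    #import <AVFoundation/AVFoundation.h>

    // Use the Record category while recording, then switch to Playback
    // before handing the file to AVAudioPlayer.
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];

    // Before recording:
    [session setCategory:AVAudioSessionCategoryRecord error:&error];
    [session setActive:YES error:&error];
    // ...record with AVAudioRecorder...

    // Before playback:
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    [session setActive:YES error:&error];
    // ...now [audioPlayer play] should return YES on the device...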

Play audio and video at the same time in an iPhone application

Is it possible to play an audio file and a video file at the same time? I want to play a separate audio file and a video file simultaneously, and I also want controls for both. Is that possible?
Sorry, not that I know of. The movie view automatically stops all other audio and takes over the whole screen, so you are forced to listen to the video's own audio.
If the audio is part of the video file, yes.
If it's an MP3 file or some other external type, how are you planning to play it? The phone might allow this to happen, depending on what you're up to.
You might be in luck soon, though...