ffmpeg to extract a frame from iPhone video camera - iphone

I am trying to extract an image frame from a video taken with the iPhone camera using ffmpeg, but it usually throws an EXC_BAD_ACCESS, and the stack trace points to a method that is never called (I know my code didn't call it).
I am using ffmpeg built according to the instructions on the iFrameExtractor website. If anybody has done this successfully, please help me or, if possible, send me some code. I don't know why it crashes, although it works well on the simulator (into which I manually imported a video). My guess is that ffmpeg cannot decode the iPhone camera video correctly.
I already tried all 3 sets of library files (armv6, armv7 and i386), but none of them work. My iPhone is a 3GS and my iPhone SDK is 3.1.3.

I think the fault is in how I called the VideoFrameExtractor. The example code doesn't work well; I had to change videoExtractor.currentImage to [videoExtractor currentImage].

Why would you use ffmpeg? You can extract frames using the AVFoundation framework in iOS 4. It's faster and easier to use.
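For reference, a minimal sketch of that AVFoundation route, assuming iOS 4's AVAssetImageGenerator, a pre-ARC project, and a placeholder videoPath:

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Grab one frame near the 1-second mark of the captured movie.
NSURL *url = [NSURL fileURLWithPath:videoPath]; // videoPath is a placeholder
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES; // respect the camera's rotation metadata

NSError *error = nil;
CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMakeWithSeconds(1.0, 600)
                                       actualTime:NULL
                                            error:&error];
UIImage *frame = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
[generator release]; // pre-ARC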
Can you paste in your stack trace and possibly the code you are using to read the frames?

Related

What can be the substitute for SDL to direct ffmpeg decoded videos to screen in iOS?

I am making an iOS video player using ffmpeg; the flow looks like this:
Video file ---> [FFmpeg decoder] --> decoded frames --> [media director] --> iPhone screen (full and partial)
The media director handles rendering decoded video frames to the iOS UI (UIView, UIWindow, etc.), outputting audio samples to the iOS speaker, and thread management.
SDL is one such library, but it is mainly made for game development and seems not very mature on iOS.
What can be the substitute for SDL?
On Mac OS X I used CoreImage/CoreVideo for this, decoding frames into a CVImageBuffer and rendering them into a CoreImage context. I'm not sure CoreImage contexts are supported on iOS, though. Maybe this thread will help: How to turn a CVPixelBuffer into a UIImage?
A better way on iOS might be to draw your frames with OpenGL ES.
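For what it's worth, here is a minimal sketch of the CVPixelBuffer-to-UIImage conversion that thread describes, assuming a BGRA pixel buffer; the function name is illustrative:

#import <CoreVideo/CoreVideo.h>
#import <UIKit/UIKit.h>

static UIImage *UIImageFromPixelBuffer(CVPixelBufferRef pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // Wrap the raw BGRA bytes in a bitmap context and snapshot it as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(base, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return image;
}

For steady playback, though, uploading each frame as an OpenGL ES texture avoids the CPU-side copy.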
SDL uses OpenGL and FFmpeg; you can come pretty close using ffmpeg and Apple's native APIs. We've done it with several video players.
This will certainly get you started:
https://github.com/mooncatventures-group

Extract frames from Video

I am working on an app where I let the user shoot a video, then capture a frame of the video every second and write the images to the documents directory. I compiled the iFrameExtractor project, which works fine on the simulator but shows errors when running on the device. Any suggestions?
You could simply use the thumbnailImageAtTime:timeOption: method from the MPMoviePlayerController class.
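A minimal sketch of that approach, assuming videoURL points at the recorded movie and the project predates ARC:

#import <MediaPlayer/MediaPlayer.h>

// Grab a thumbnail near the 2-second mark; nearest-keyframe is fast but inexact.
MPMoviePlayerController *player =
    [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
UIImage *thumbnail = [player thumbnailImageAtTime:2.0
                                       timeOption:MPMovieTimeOptionNearestKeyFrame];
[player release]; // pre-ARC

Calling it once per second of playback time gives you one image per second to write out with UIImagePNGRepresentation or UIImageJPEGRepresentation.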

Recording Audio and Video using AVFoundation frame by frame

How do I record audio and video frame by frame using AVFoundation in iOS 4?
The AVCamDemo you mention is close to what you need, and you should be able to use it as a reference. These are the classes you need in order to achieve what you are trying to do; all of them are part of AVFoundation:
AVCaptureVideoDataOutput and AVCaptureAudioDataOutput - use these classes to get raw samples from the video camera and the microphone.
AVAssetWriter and AVAssetWriterInput - use these to encode the raw samples into a file. A sample Mac OS X project shows how to use these classes (and it should work for iOS too), although it uses an AVAssetReader for input (it re-encodes a movie file) instead of the camera and microphone. In your case you can feed the outputs mentioned above into the writer to record what you want.
That should be all you need; a condensed sketch of the wiring follows.
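This is only a sketch, with session configuration and error handling omitted; the FrameRecorder class name and the 640x480 H.264 settings are illustrative, not from the sample project:

#import <AVFoundation/AVFoundation.h>

@interface FrameRecorder : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate> {
    AVAssetWriter *writer;
    AVAssetWriterInput *videoIn;
}
@end

@implementation FrameRecorder

- (void)attachToSession:(AVCaptureSession *)session outputURL:(NSURL *)outputURL {
    // Raw camera frames arrive through a sample buffer delegate on its own queue.
    AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
    [videoOut setSampleBufferDelegate:self
                                queue:dispatch_queue_create("video.capture", NULL)];
    [session addOutput:videoOut];

    // The writer encodes appended samples into a QuickTime movie.
    writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                       fileType:AVFileTypeQuickTimeMovie
                                          error:NULL];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264, AVVideoCodecKey,
        [NSNumber numberWithInt:640], AVVideoWidthKey,
        [NSNumber numberWithInt:480], AVVideoHeightKey, nil];
    videoIn = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                  outputSettings:settings] retain];
    videoIn.expectsMediaDataInRealTime = YES;
    [writer addInput:videoIn];
}

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Start the writer on the first sample, then append while it can accept data.
    if (writer.status == AVAssetWriterStatusUnknown) {
        [writer startWriting];
        [writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (videoIn.readyForMoreMediaData) {
        [videoIn appendSampleBuffer:sampleBuffer];
    }
}

@end

Audio works the same way: add an AVCaptureAudioDataOutput to the session and a second AVAssetWriterInput with AVMediaTypeAudio.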
Here's a link showing how to use VideoDataOutput.
Hope it helps.
If you are a registered developer, look at the videos from the 2011 WWDC (which you can find by searching in the developer portal). There are two sessions relating to AVFoundation. There was also some sample code from one of the WWDC sessions, which was extremely useful.

OGG Vorbis in iPhone SDK

I want to know if it's possible to play an Ogg Vorbis audio file with the iPhone SDK, or if there is a library or framework that allows this.
I've read something about OpenAL but I can't find any tutorial...
Can anyone help me?
Better late than never ;)
I found the answer here: Use cocos2d for playing ogg file in my project?
PASoundMgr is a different sound engine that had support for ogg file playback. However, it hasn't been updated since iOS 2.x and there are numerous issues that have cropped up since then that it doesn't handle.
Why do you need to play ogg files? If you convert them to aac you will be able to play them back using hardware decoding, which is much more efficient from a cpu usage point of view.
They mentioned PASoundMgr, and it worked for me. I just copied from the cocos2d framework all the files and libraries that the SoundEngineTest was based on, and got rid of the unnecessary code.
Here is my demo project that shows how to play ogg on iOS.
Be careful with the iOS 5.* simulators; they have some problems with the sound library. My demo works on the 4.3 simulator and on a device.
Here are the steps I took to create the demo:
First you will need the cocos2d-iphone framework. I already had it, but you can find it here: cocos-2d_download.
As you can see, the SoundEngineTest depends on libvorbis.a, a library built from the files in the external/Tremor group. It also depends on the OpenAL and AudioToolbox frameworks.
I copied all the files from the Tremor group to my project, created a "vorbis" Cocoa Touch static library target (without ARC), and added all the source files and headers to the "vorbis" target in the Build Phases tab.
In the Build Phases of OggPlayDemo, I added the libraries (libvorbis, OpenAL, AudioToolbox) to the Link Binary With Libraries box.
I added the PA classes to the project and checked OggPlayDemo as a target. To avoid problems with ARC, I disabled ARC compilation for these 3 PA files (see disable ARC for a single file).
I removed all cocos2d references. There was some code related to correcting the position of the listener depending on orientation; I commented it out, since I don't need that feature for just playing audio.
I copied in the audio file.
And finally I added this code to the ViewController:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Initialize the sound manager and place the listener at the origin.
    [PASoundMgr sharedSoundManager];
    [[[PASoundMgr sharedSoundManager] listener] setPosition:CGPointMake(0, 0)];
    self.audioSource = [[PASoundMgr sharedSoundManager] addSound:@"trance-loop"
                                                   withExtension:@"ogg"
                                                        position:CGPointMake(0, 0)
                                                          looped:YES];
    // Lower the music track volume.
    [self.audioSource setGain:0.5f];
}

- (IBAction)play:(id)sender {
    [self.audioSource playAtListenerPosition];
}
Cricket Audio will play ogg files, among others, and works on iOS/Android/WP8.

Cannot play a recorded sound on device

I'm using the exact code from the iPhone Application Programming Guide's Multimedia Support chapter to record a file to disk with AVAudioRecorder and then load and play that file with AVAudioPlayer.
This works fine in the simulator but not on the device. The file gets loaded (we can see its duration as an NSTimeInterval) but does not play (play returns NO).
When the sample code from the website didn't work, we tried switching to a bunch of different codecs with no success. And of course, the sound is turned on.
Thanks a bunch.
SOLVED!
The problem was that after you set the AVAudioSession category to Record for recording, you have to set it back to Playback before playing. Sounds obvious now, but you know how it is. The Simulator uses a different underlying audio implementation, so the problem does not show up there.
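In code, the fix looks roughly like this (a sketch, assuming the recorder and player from the guide's sample):

#import <AVFoundation/AVFoundation.h>

NSError *error = nil;

// Before recording:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord
                                       error:&error];
// ... record with AVAudioRecorder ...

// Before playback; on the device, -play fails without this switch.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                       error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];
// ... now [player play] returns YES ...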