Merge UIImage and CAF audio into an AV file - iPhone

I want to merge an image file and an audio file.
I have a UIImage object containing my image.
I also have a CAF audio recording of 4-5 seconds.
Now, I would like to create a movie file with image repeating for the length of audio.
Which classes/methods can I use to do this? Any ideas?
Thanks in Advance.

The answer to this question should help you:
How do I export UIImage array as a movie?
The only difference is that the question uses several UIImages; you only need to repeat the same one.
Good Luck
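The approach in that linked answer can be sketched as follows. This is a minimal Swift sketch, assuming AVAssetWriter with H.264 at 30 fps and a .mov container; the pixelBuffer(from:) helper, the frame rate, and all names are my choices, not part of the linked answer.

```swift
import AVFoundation
import UIKit

// Hypothetical helper: render a UIImage into a BGRA CVPixelBuffer.
func pixelBuffer(from image: UIImage) -> CVPixelBuffer? {
    let width = Int(image.size.width), height = Int(image.size.height)
    var pb: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, nil, &pb)
    guard let buffer = pb, let cgImage = image.cgImage else { return nil }
    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                            width: width, height: height, bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                | CGBitmapInfo.byteOrder32Little.rawValue)
    context?.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return buffer
}

// Write a video that shows `image` for `duration` seconds
// (the length of the CAF recording).
func writeVideo(image: UIImage, duration: Double, to url: URL) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: image.size.width,
        AVVideoHeightKey: image.size.height])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    let fps: Int32 = 30
    let frameCount = Int(duration * Double(fps))  // e.g. 4.5 s -> 135 frames
    guard let buffer = pixelBuffer(from: image) else { return }
    for i in 0..<frameCount {
        while !input.isReadyForMoreMediaData { usleep(1000) } // crude back-pressure
        adaptor.append(buffer,
                       withPresentationTime: CMTime(value: CMTimeValue(i), timescale: fps))
    }
    input.markAsFinished()
    writer.finishWriting { }
}
```

The resulting silent movie can then be combined with the CAF audio track in an AVMutableComposition and exported with AVAssetExportSession.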

Related

Save Audio Produced by MixerHost Sample on developer.apple

I got the MixerHost sample from developer.apple.com, and now I want to save the mixed sound to another audio file.
Can anyone help me?
First use ExtAudioFileCreateWithURL to create an audio file object on disk. Then use ExtAudioFileWriteAsync to write the contents of the I/O buffer to that file object; this can be done in the render callback.
When you are done, close the file with ExtAudioFileDispose.
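A condensed Swift sketch of that sequence, assuming a CAF destination container; MixerHost itself produces LPCM in its render callback, and error checking is omitted here.

```swift
import AudioToolbox

// 1. Create the file object up front, before rendering starts.
//    `format` describes the data the render callback will produce.
func makeRecordingFile(url: URL,
                       format: inout AudioStreamBasicDescription) -> ExtAudioFileRef? {
    var file: ExtAudioFileRef?
    ExtAudioFileCreateWithURL(url as CFURL, kAudioFileCAFType, &format, nil,
                              AudioFileFlags.eraseFile.rawValue, &file)
    return file
}

// 2. In the render callback, after the mixer has filled `ioData`:
//      ExtAudioFileWriteAsync(file, inNumberFrames, ioData)
//    (the first call primes the async machinery; it is then safe to call
//     from the real-time thread).
// 3. When recording is finished:
//      ExtAudioFileDispose(file)
```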

How to Use ffmpeg codes in iPHone

I need to convert a video into PNG images. I did that using ffmpeg, but it takes a lot of time. To reduce the conversion time, I searched a lot and found the command "ffmpeg -i video.mpg image%d.jpg" as a solution. Please teach me how to use commands like this.
1. Shoot and save the video with AVCaptureSession + AVCaptureMovieFileOutput.
2. Use AVAssetReader to extract the individual frames from the video as BGRA CVImageBufferRefs.
3. Save as PNG: CVImageBufferRef -> UIImage -> UIImagePNGRepresentation.
This should be faster than ffmpeg because step 2 is hardware accelerated, and it also lets you discard a cumbersome LGPLed 3rd-party library.
Enjoy!
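A sketch of steps 2 and 3 in Swift: pull BGRA frames out of the movie with AVAssetReader and write each one as a PNG. The "frameN.png" naming and the output directory are assumptions, and error handling is minimal.

```swift
import AVFoundation
import UIKit

func extractFrames(from movieURL: URL, into directory: URL) throws {
    let asset = AVAsset(url: movieURL)
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let reader = try AVAssetReader(asset: asset)
    // Ask for decoded BGRA pixel buffers rather than compressed samples.
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()

    let context = CIContext()
    var index = 0
    while let sample = output.copyNextSampleBuffer(),
          let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
        // Render the pixel buffer to a CGImage so pngData() works reliably.
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
            let png = UIImage(cgImage: cgImage).pngData()
            try png?.write(to: directory.appendingPathComponent("frame\(index).png"))
        }
        index += 1
    }
}
```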
With ffmpeg you can split a video frame by frame, and you can also mix audio with video.

GLvideoFrame with MP4 file

I want to display a video frame buffer on an OpenGL ES texture.
I have downloaded and read the GLVideoFrame sample from Apple.
It's great code, but I don't understand how to modify it to use a movie file instead of the video device.
You can use AVAssetReader to read frames from a file.
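Once AVAssetReader hands you a decoded frame, uploading it to a texture can be sketched like this in Swift (using the old OpenGL ES API, as in the sample). Obtaining the CVPixelBuffer via copyNextSampleBuffer and the caller-supplied texture id are assumptions; GL_BGRA is an Apple extension on iOS.

```swift
import AVFoundation
import OpenGLES

// Upload one BGRA CVPixelBuffer into an existing GL_TEXTURE_2D.
func upload(_ pixelBuffer: CVPixelBuffer, to texture: GLuint) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    glBindTexture(GLenum(GL_TEXTURE_2D), texture)
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA,
                 GLsizei(width), GLsizei(height), 0,
                 GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE),
                 CVPixelBufferGetBaseAddress(pixelBuffer))
}
```

For better performance, CVOpenGLESTextureCache can map the pixel buffer to a texture without the copy.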

possible to create a video file from RGB frames using AV Foundation

I have an iOS app that I want to record some of its visual output into a video. It looks like the way to create a video on iOS is to use AVMutableComposition and feed AVAssets to it via insertTimeRange.
All the documentation and examples that I can find only add video and audio assets to an AVMutableComposition. Is there a way to add image data to it (i.e. add an image for each frame of the video)? I can get this image data as straight RGB, PNG, JPG, UIImage, or whatever is easiest to feed to AV Foundation (if it's even possible).
If it's not possible to feed images into an AVMutableComposition for the video frames, is there another way to generate an .mp4 file from frames on iOS?
To generate movies from frames you can use AVAssetWriter; there is a question here on SO that sort of covers that.
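The piece specific to raw RGB data is wrapping each frame's bytes in a CVPixelBuffer that an AVAssetWriterInputPixelBufferAdaptor can append. A sketch in Swift, assuming BGRA byte order; the row-by-row copy is there because the buffer's bytes-per-row may be padded beyond width * 4.

```swift
import AVFoundation

// Turn one frame of raw BGRA bytes (width * height * 4) into a CVPixelBuffer.
func makePixelBuffer(bgra: [UInt8], width: Int, height: Int) -> CVPixelBuffer? {
    var pb: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, nil, &pb)
    guard let buffer = pb else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    let dest = CVPixelBufferGetBaseAddress(buffer)!
    let destStride = CVPixelBufferGetBytesPerRow(buffer)
    bgra.withUnsafeBytes { src in
        for row in 0..<height {
            // Copy one row at a time to respect the destination stride.
            memcpy(dest + row * destStride,
                   src.baseAddress! + row * width * 4,
                   width * 4)
        }
    }
    return buffer
}

// Each buffer is then appended with:
//   adaptor.append(buffer, withPresentationTime: time)
```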

How to Convert Audio File in iPhone

I want to programmatically convert a WAV audio file to a compressed audio format (IMA4 or iLBC).
What is the best way to do that?
Thanks.
Have a look at TPAACAudioConverter and Apple's iPhoneExtAudioFileConvertTest sample.
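A condensed Swift sketch of what iPhoneExtAudioFileConvertTest does, for the IMA4 case: read the WAV through a common LPCM client format and let ExtAudioFile encode on write. The CAF container, the 4096-frame read size, and the assumption that the source WAV is already LPCM are my choices; error checking is omitted.

```swift
import AudioToolbox

func convertToIMA4(src: URL, dst: URL) {
    var srcFile: ExtAudioFileRef?
    ExtAudioFileOpenURL(src as CFURL, &srcFile)

    var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
    var srcFormat = AudioStreamBasicDescription()
    ExtAudioFileGetProperty(srcFile!, kExtAudioFileProperty_FileDataFormat,
                            &size, &srcFormat)

    // Describe the destination; AudioFormatGetProperty fills in the
    // packet/frame fields for the compressed format.
    var dstFormat = AudioStreamBasicDescription()
    dstFormat.mFormatID = kAudioFormatAppleIMA4
    dstFormat.mSampleRate = srcFormat.mSampleRate
    dstFormat.mChannelsPerFrame = srcFormat.mChannelsPerFrame
    var propSize = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
    AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, nil,
                           &propSize, &dstFormat)

    var dstFile: ExtAudioFileRef?
    ExtAudioFileCreateWithURL(dst as CFURL, kAudioFileCAFType, &dstFormat, nil,
                              AudioFileFlags.eraseFile.rawValue, &dstFile)

    // Both files use the (LPCM) source format as the client format,
    // so ExtAudioFile does the encoding for us on write.
    var clientFormat = srcFormat
    ExtAudioFileSetProperty(srcFile!, kExtAudioFileProperty_ClientDataFormat,
                            size, &clientFormat)
    ExtAudioFileSetProperty(dstFile!, kExtAudioFileProperty_ClientDataFormat,
                            size, &clientFormat)

    let framesPerRead: UInt32 = 4096
    let byteCount = Int(framesPerRead * clientFormat.mBytesPerFrame)
    var data = [UInt8](repeating: 0, count: byteCount)
    while true {
        var frames = framesPerRead
        data.withUnsafeMutableBytes { raw in
            var bufList = AudioBufferList(
                mNumberBuffers: 1,
                mBuffers: AudioBuffer(mNumberChannels: clientFormat.mChannelsPerFrame,
                                      mDataByteSize: UInt32(byteCount),
                                      mData: raw.baseAddress))
            ExtAudioFileRead(srcFile!, &frames, &bufList)
            if frames > 0 { ExtAudioFileWrite(dstFile!, frames, &bufList) }
        }
        if frames == 0 { break }   // end of source file
    }
    ExtAudioFileDispose(srcFile!)
    ExtAudioFileDispose(dstFile!)
}
```

Note that iLBC (kAudioFormatiLBC) is mono 8 kHz only, so converting to it may additionally require a sample-rate conversion through the client format.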