I'm recording audio separately while building an MPEG-4 video in an iPhone app, and I'm trying to combine the two using AVAssetExportSession.
The audio session uses 1 channel, kAudioFormatAppleLossless, and AVAudioQualityMax. The video is AVVideoCodecH264, using pixel format type kCVPixelFormatType_32ARGB. The video without audio plays fine on my iPhone via the photo library.
The exporter tells me these are the supported presets (I'm using AVAssetExportPresetHighestQuality):
(
AVAssetExportPresetAppleM4A,
AVAssetExportPresetLowQuality,
AVAssetExportPresetHighestQuality,
AVAssetExportPreset640x480,
AVAssetExportPresetMediumQuality
)
and it's telling me this is the supported file type:
(
"com.apple.quicktime-movie"
)
If I switch to AVAssetExportPresetPassthrough, I get these file types:
(
"com.apple.quicktime-movie",
"com.apple.m4a-audio",
"public.mpeg-4",
"com.apple.m4v-video",
"public.3gpp",
"org.3gpp.adaptive-multi-rate-audio",
"com.microsoft.waveform-audio",
"public.aiff-audio",
"public.aifc-audio"
)
I have tried both the QuickTime movie and MPEG-4 types as outputFileType and still get the same results.
EDIT: May 10th
Crash was due to a pretty silly error. I have audio and video now ... sweet!
The key is the .mp4 extension, [exportSession setOutputFileType:AVFileTypeQuickTimeMovie], and presetName:AVAssetExportPresetPassthrough.
My crash was a simple issue. The bigger problem I had was using the wrong filename extension. I'm now using a .mp4 extension, AVFileTypeQuickTimeMovie, and AVAssetExportPresetPassthrough.
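A minimal sketch of the working configuration described above; `composition` is assumed to be an AVAsset containing both the audio and video tracks, and the output path is a placeholder:

```objc
// Passthrough preset + QuickTime movie file type + .mp4 extension.
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetPassthrough];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"combined.mp4"]];
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished: %@", exporter.outputURL);
    } else {
        NSLog(@"Export failed: %@", exporter.error);
    }
}];
```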
Related
In my project, I have a video playing silently as a background video as an introduction. However, when I add the video to my project using NSURL I get this error (see screenshot). I have also provided the code below too. Any ideas?
let videoURL: NSURL = NSBundle.mainBundle().URLForResource("Background_2", withExtension: "mp4")!
I had this problem myself. I would suggest re-encoding the video (Adobe Media Encoder CC works well). In the encoder, change the video codec to H.264 and give the file a QuickTime .mov extension; this will ensure Xcode and your application can read the video correctly.
In addition, once you have successfully changed the codec, make sure you add the video to your Bundle Resources!
Good luck!
What I'm doing :
I need to play audio and video files that are not natively supported by Apple on iPhone/iPad, for example mkv/mka files, which may contain several audio channels.
I'm using libffmpeg to find audio and video streams in media file.
Video is decoded with avcodec_decode_video2 and audio with avcodec_decode_audio3.
The output of each function is as follows:
avcodec_decode_video2 - fills an AVFrame structure that encapsulates the decoded video frame from the packet; specifically, it has a data field, which is a pointer to the picture/channel planes.
avcodec_decode_audio3 - produces samples of type int16_t *, which I guess is the raw audio data.
So basically I've done all this and successfully decoding the media content.
What I have to do :
I have to play the audio and video accordingly using Apple's frameworks. The playback I need should support mixing of audio channels while playing video; say an mkv file contains two audio channels and a video channel. So I would like to know which service is the appropriate choice for me. My research suggests that the AudioQueue service might be useful for audio playback, and probably AVFoundation for video.
Please help me find the right technology for my case, i.e. video playback plus audio playback with possible audio-channel mixing.
You are on the right path. If you are only playing audio (not recording at all), then I would use Audio Queues; they will do the mixing for you. If you are recording, then you should use Audio Units; take a look at the MixerHost example project from Apple. For video I recommend OpenGL. Assuming the image buffer is in YUV420, you can render it with a simple two-pass shader setup; I believe there is an Apple example project showing how to do this. In any case, you can render any pixel format using OpenGL with a shader that converts the pixel format to RGBA. Hope this helps.
I am using the AVFoundation framework to get video camera frames in real time and then modifying those frames with an algorithm (which produces a new, modified image).
Now I want all the modified frames to be saved as a video to the iPhone library. I found a way to save the original input frames using AVCaptureMovieFileOutput, but not the modified frames.
Is there any way to save the modified frames to the iPhone library as a video?
UISaveVideoAtPathToSavedPhotosAlbum
Adds the movie at the specified path to the user’s Camera Roll album.
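A short sketch of how this could be used, assuming the modified frames have already been written to a file (e.g. with an AVAssetWriter); the path below is a placeholder:

```objc
// After writing the modified frames to a movie file, save that file
// to the user's Camera Roll album.
NSString *moviePath = [NSTemporaryDirectory()
    stringByAppendingPathComponent:@"filtered.mp4"];
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
    UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, NULL, NULL);
}
```

The compatibility check avoids attempting to save a movie the Photos app cannot play.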
I am using an AVCaptureSession with the AVCaptureSessionPresetMedium preset to capture video, and I am applying an effect to this video with OpenGL shaders.
I use an AVAssetWriter to write the video to an mp4 file. The problem is that the resulting video is slow, especially when I add audio output.
This is how my code works :
In the -captureOutput:didOutputSampleBuffer:fromConnection: delegate method, I apply the OpenGL filter to the captured frames.
Then I check whether the capture output is video or audio. If it's video, I use glReadPixels to create a CVPixelBufferRef that I send to an AVAssetWriterInputPixelBufferAdaptor to write.
If it's audio, I write the CMSampleBufferRef directly.
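A hedged sketch of the delegate flow described above; `videoOutput`, `pixelAdaptor`, `audioInput`, and the two helper methods are assumptions standing in for the asker's own code:

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMTime time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (captureOutput == videoOutput) {
        [self applyGLFilter:sampleBuffer];           // run the shader pass
        CVPixelBufferRef pixels = [self pixelBufferFromGLReadPixels];
        [pixelAdaptor appendPixelBuffer:pixels withPresentationTime:time];
        CVPixelBufferRelease(pixels);
    } else if (audioInput.readyForMoreMediaData) {
        [audioInput appendSampleBuffer:sampleBuffer]; // audio passes through
    }
}
```

One common culprit in this pattern is glReadPixels itself: it blocks until the GPU finishes rendering, which stalls the capture callback.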
Can someone tell me what's wrong with my approach, or which part is making the resulting video slow?
Hi all,
I am playing a video (.mp4 format) that has no sound (it is a mute video), and in the background I am playing an audio file (.mp3 format). When I run my code in the simulator it works as I want: when I tap the video it is mute, but the audio plays behind it, so to the user it seems the video has that sound. But when I install my code on a device and play the video, it doesn't work like that; the video plays but without the sound. How can I play an audio file and a video together in these formats?
Actually, we are not just playing a single video or audio file; both are chosen randomly from arrays, so we can't simply combine them in advance. Any other ideas?
Should we use another format for the audio or video to make this work?
Thanks for the help,
Balraj Verma
The problem is that the default Audio Session does not allow audio mixing.
In the Reference Library (Working with Movie Players), Apple says you should use a mixable category configuration for your audio session, for example the Ambient category. In the application delegate:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *setCategoryError = nil;
[audioSession setCategory:AVAudioSessionCategoryAmbient error: &setCategoryError];
Apple's documentation states that some audio formats are not suited to be played back simultaneously because of hardware restrictions. See Playing Multiple Sounds Simultaneously for more details.
A solution for you would be to use PCM encoded audio which should allow simultaneous playback.
I would like to add that I managed to play two mp3 audio files at the same time on 3G and 3GS iPhones, which shouldn't be possible according to the documentation but worked for me.
Alternatively, you can use an instance of AVAudioPlayer on a new thread. See the following link for how this works:
http://www.mobileorchard.com/easy-audio-playback-with-avaudioplayer/
Create an instance of MPMoviePlayerController to start playback of the videos.
Hope this will work.
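A minimal sketch of that combination, assuming a mixable audio session has already been set as shown earlier; the resource names are placeholders for the randomly chosen files:

```objc
// Play a random mp3 with AVAudioPlayer while MPMoviePlayerController
// shows the mute video on top of it.
NSURL *audioURL = [[NSBundle mainBundle] URLForResource:@"track1"
                                          withExtension:@"mp3"];
NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"clip1"
                                          withExtension:@"mp4"];

AVAudioPlayer *audioPlayer =
    [[AVAudioPlayer alloc] initWithContentsOfURL:audioURL error:nil];
[audioPlayer prepareToPlay];
[audioPlayer play];

MPMoviePlayerController *moviePlayer =
    [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
moviePlayer.view.frame = self.view.bounds;
[self.view addSubview:moviePlayer.view];
[moviePlayer play];
```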
You should combine the audio and video beforehand using video editing software like iMovie or Windows Movie Maker. Trying to "composite" audio and video together at runtime is not a good idea; it adds overhead, synchronization issues, etc.
As far as I know, you can't play AVAudioPlayer and MPMoviePlayer at the same time.