How to add audio file contents to a movie using AVFoundation on iOS 4? - iphone

I am using AVFoundation to create a movie file (mp4) from images and sound files.
I have successfully created the movie file using AVAssetWriterInputPixelBufferAdaptor, which appends CVPixelBufferRefs (extracted from UIImage objects) to the movie file.
Now I want to add audio content from a file to that movie. Fetching data from the device microphone is not what I am after here. And I could not find anything similar to AVAssetWriterInputPixelBufferAdaptor that helps write audio data into that movie file.
Am I missing something here?

At least for me, the solution was to use the AVMutableComposition class:
1) create an AVMutableComposition object
2) create two AVURLAsset objects, the first based on your video file and the second based on the file you want to extract the audio track from
3) create two AVMutableCompositionTrack objects, one for the audio track and one for the video track (each based on the appropriate asset from step 2)
4) create an AVAssetExportSession based on the composition object from step 1
5) export your session
Best regards

Thanks @peter. Here is the solution in code.
-(BOOL)compositeVideo {
    // Load the current video and audio from the bundle
    NSURL *curAudio = [[NSBundle mainBundle] URLForResource:@"a" withExtension:@"pcm"];
    NSURL *curVideo = [[NSBundle mainBundle] URLForResource:@"v" withExtension:@"mp4"];
    AVAsset *video = [AVAsset assetWithURL:curVideo];
    AVAsset *audio = [AVAsset assetWithURL:curAudio];
    AVAssetTrack *vTrack = [[video tracksWithMediaType:AVMediaTypeVideo] firstObject];
    AVAssetTrack *aTrack = [[audio tracksWithMediaType:AVMediaTypeAudio] firstObject];
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *visualTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *error = nil;
    // Check the BOOL return value; the error pointer is only meaningful on failure
    if (![visualTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video.duration) ofTrack:vTrack atTime:kCMTimeZero error:&error]) {
        NSLog(@"video composition failed! error: %@", error);
        return NO;
    }
    if (![audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audio.duration) ofTrack:aTrack atTime:kCMTimeZero error:&error]) {
        NSLog(@"audio composition failed! error: %@", error);
        return NO;
    }
    AVAssetExportSession *exporter = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    NSString *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
    NSString *outputPath = [path stringByAppendingPathComponent:@"compositedVideo.mp4"];
    // Remove the old output file, not the whole Documents directory
    [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
    exporter.outputURL = [NSURL fileURLWithPath:outputPath];
    // Use a file type that matches the .mp4 extension
    exporter.outputFileType = AVFileTypeMPEG4;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (exporter.error) {
            NSLog(@"exporter synthesization failed! error: %@", exporter.error);
            [self.delegate compositeDidFinishAtURL:nil duration:-1];
        } else {
            [self.delegate compositeDidFinishAtURL:exporter.outputURL duration:CMTimeGetSeconds(video.duration)];
        }
    }];
    return YES;
}

Related

Exporting audio from 3gp video files with AVAssetExportSession

I am trying to export the audio from a 3gpp video file and it is not working... Does anyone know what I may be doing wrong? Here is the code I am using:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *filePath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"newFile.m4a"];
NSString *tempFilePath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"oldFile.3gp"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:tempFilePath] options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
    // HERE IS THE PROBLEM. THE ARRAY OF TRACKS IS EMPTY FOR SOME REASON.
    AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    AVMutableComposition *audioComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *audioCompositionTrack = [audioComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [audioCompositionTrack insertTimeRange:[audioTrack timeRange] ofTrack:audioTrack atTime:CMTimeMake(0, 1) error:nil];
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:audioComposition presetName:AVAssetExportPresetAppleM4A];
    NSURL *toFileURL = [NSURL URLWithString:filePath];
    exportSession.outputURL = toFileURL;
    exportSession.outputFileType = @"com.apple.m4a-audio";
    NSLog(@"exportAsynchronouslyWithCompletionHandler will start");
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Export success");
        } else {
            NSLog(@"Export failed");
        }
    }];
}];
Try to load the asset with
[AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:tempFilePath] options:nil];
instead of
[AVURLAsset URLAssetWithURL:[NSURL URLWithString:tempFilePath] options:nil];
Since your task is to manipulate the audio sample buffers directly, you should use the second option AVFoundation gives you: a paired AVAssetReader and AVAssetWriter setup. You'll find proper sample code in AVReaderWriterOSX from the Apple developer sources. This should also work on iOS, although different I/O format settings are available there. Decompressing audio to PCM and writing it back to an uncompressed .wav or audio file should be possible.
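A minimal sketch of that reader/writer pairing, assuming `srcURL` and `dstURL` are placeholders you supply and error handling is trimmed; the reader decodes the track to linear PCM so each CMSampleBuffer can be inspected before it is written out:

```objc
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:srcURL options:nil];
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];

NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
// Decode to linear PCM so the sample buffers can be inspected/modified
NSDictionary *pcmSettings = @{ AVFormatIDKey : @(kAudioFormatLinearPCM) };
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track outputSettings:pcmSettings];
[reader addOutput:output];

AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:dstURL
                                                 fileType:AVFileTypeWAVE
                                                    error:&error];
// nil settings = pass the samples through in the format they arrive in
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];
[writer addInput:input];

[reader startReading];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t queue = dispatch_queue_create("audio.readwrite", NULL);
[input requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while (input.readyForMoreMediaData) {
        CMSampleBufferRef buf = [output copyNextSampleBuffer];
        if (!buf) {                          // no more samples
            [input markAsFinished];
            [writer finishWriting];          // finishWritingWithCompletionHandler: on newer SDKs
            break;
        }
        // ...touch the PCM data here if needed...
        [input appendSampleBuffer:buf];
        CFRelease(buf);
    }
}];
```

The same shape works for compressed output too; you would give the writer input an AAC settings dictionary instead of nil.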

How To Record Audio With background Music?

For the last few days I have been working on an iPhone application which needs to record the user's audio and save it with background music in it; in simple words, by appending two audio files, generate a third audio file.
I tried to do it using the AudioToolbox API but had no success. Can anyone suggest the right direction to go, or any other suggestion?
Thanks,
You can do this by:
- (BOOL)combineVoices1
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    // Create an AVMutableComposition. This object will hold our multiple AVMutableCompositionTracks.
    AVMutableComposition *composition = [[AVMutableComposition alloc] init];

    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack setPreferredVolume:0.8];
    NSString *soundOne = [[NSBundle mainBundle] pathForResource:@"test1" ofType:@"caf"];
    NSURL *url = [NSURL fileURLWithPath:soundOne];
    AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetTrack *clipAudioTrack = [[avAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];

    AVMutableCompositionTrack *compositionAudioTrack1 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack1 setPreferredVolume:0.3];
    NSString *soundOne1 = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"caf"];
    NSURL *url1 = [NSURL fileURLWithPath:soundOne1];
    AVAsset *avAsset1 = [AVURLAsset URLAssetWithURL:url1 options:nil];
    AVAssetTrack *clipAudioTrack1 = [[avAsset1 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [compositionAudioTrack1 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset1.duration) ofTrack:clipAudioTrack1 atTime:kCMTimeZero error:nil];

    AVMutableCompositionTrack *compositionAudioTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack2 setPreferredVolume:1.0];
    NSString *soundOne2 = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"caf"];
    NSURL *url2 = [NSURL fileURLWithPath:soundOne2];
    AVAsset *avAsset2 = [AVURLAsset URLAssetWithURL:url2 options:nil];
    AVAssetTrack *clipAudioTrack2 = [[avAsset2 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [compositionAudioTrack2 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset2.duration) ofTrack:clipAudioTrack2 atTime:kCMTimeZero error:nil];

    AVAssetExportSession *exportSession = [AVAssetExportSession
                                           exportSessionWithAsset:composition
                                           presetName:AVAssetExportPresetAppleM4A];
    if (nil == exportSession) return NO;

    NSString *soundOneNew = [documentsDirectory stringByAppendingPathComponent:@"combined10.m4a"];
    //NSLog(@"Output file path - %@", soundOneNew);
    // configure export session output with all our parameters
    exportSession.outputURL = [NSURL fileURLWithPath:soundOneNew]; // output path
    exportSession.outputFileType = AVFileTypeAppleM4A; // output file type
    // perform the export
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            NSLog(@"AVAssetExportSessionStatusCompleted");
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            // a failure may happen because of an event out of your control,
            // for example an interruption like a phone call coming in;
            // make sure to handle this case appropriately
            NSLog(@"AVAssetExportSessionStatusFailed");
        } else {
            NSLog(@"Export Session Status: %ld", (long)exportSession.status);
        }
    }];
    return YES;
}
You won't find any pre-rolled tool for doing this, but it's not too hard once you get the recording bit down. After that, you will need to mix the file with the background music, which can be done simply by adding the raw samples together.
For that part to work, you will need to decode the background music from whatever compressed format you are using to raw PCM, so you can manipulate the samples directly. It's been a long time since I did any iOS development, so I don't know whether the iOS SDK can do this directly or whether you will need to bundle libffmpeg (or something similar) with your code. But IIRC, the iPhone does support decoding compressed audio to PCM, but not encoding it (more on that in a second).
Otherwise, you can distribute compressed (as zip, not mp3/aac/ogg/whatever) raw PCM files with your app and unzip them to get the sample data directly.
Once you get the final mixdown, you can stream it directly back through the playback device as raw PCM. If you need to save or export it, you'll need to look again into a decoding/encoding library.
Speaking from experience on this issue, you will probably want to do a bit of basic processing to the vocals before mixing down with the background music. First, you will want to have your background tracks normalized to -3dB (or so) so that the user's voice is audible over the music. Second, you should apply a highpass filter to the vocals to remove all frequencies below 60Hz, as wind or other background noises can be picked up by the iPhone's mic. Finally, you will probably want to apply compression + limiter to the vocal sample to make the vocals a bit easier to hear during quiet stretches.
Unfortunately, the question you asked isn't as simple as "just use function mixdownTracksTogether()", but you can definitely get this working by chaining other tools and functions together. Hope this gets you on the right track!
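The core of the mixdown step described above, adding raw samples together, can be sketched in plain C. This assumes 16-bit signed PCM and clamps the sum to avoid wraparound distortion; the function names are illustrative, not from any SDK:

```c
#include <stddef.h>
#include <stdint.h>
#include <limits.h>

/* Mix two 16-bit PCM samples by adding them, clamping to avoid overflow. */
static int16_t mix_samples(int16_t voice, int16_t music) {
    int32_t sum = (int32_t)voice + (int32_t)music;   /* widen before adding */
    if (sum > INT16_MAX) sum = INT16_MAX;            /* clamp positive peaks */
    if (sum < INT16_MIN) sum = INT16_MIN;            /* clamp negative peaks */
    return (int16_t)sum;
}

/* Mix two equal-length PCM buffers sample-by-sample into out. */
static void mix_buffers(const int16_t *voice, const int16_t *music,
                        int16_t *out, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = mix_samples(voice[i], music[i]);
}
```

Normalizing the background track to -3 dB before calling this (multiplying each music sample by roughly 0.7) is what keeps the clamping branch from firing often.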

recording video like "Talking tom" iphone (issue in adding audio)

I have to record a video of my app, exactly like "Talking Tom".
Taking help from here and here, I have captured the screen and made a video from those images, but it does not have any sound.
I have recorded the sound and video files separately but don't know how to combine them.
Can anyone tell me how to add sound to this video, or how to record it with sound?
-(void)processVideo:(NSURL *)videoUrl {
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
    NSError *error = nil;
    for (NSMutableDictionary *audioInfo in appDelegate.audioInfoArray) {
        NSString *pathString = [[NSHomeDirectory() stringByAppendingString:@"/Documents/"] stringByAppendingString:[audioInfo objectForKey:@"fileName"]];
        AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];
        AVAssetTrack *audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
        NSLog(@"%lf", [[audioInfo objectForKey:@"startTime"] doubleValue]);
        CMTime audioStartTime = CMTimeMake(([[audioInfo objectForKey:@"startTime"] doubleValue] * TIME_SCALE), TIME_SCALE);
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, urlAsset.duration) ofTrack:audioAssetTrack atTime:audioStartTime error:&error];
    }
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero error:nil];
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetPassthrough];
    NSString *videoName = @"export.mov";
    NSString *exportPath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:videoName];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    NSLog(@"file type %@", _assetExport.outputFileType);
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;
    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        switch (_assetExport.status) {
            case AVAssetExportSessionStatusCompleted:
                // export complete
                NSLog(@"Export Complete");
                //[self uploadToYouTube];
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export Failed");
                NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                // export error (see exportSession.error)
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export Cancelled");
                NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                // export cancelled
                break;
        }
    }];
}
Just assign your movie file (i.e. without audio) to an NSURL and pass it to the processVideo method above. Then add the sound files you want to merge with the video to audioInfoArray somewhere else in the program, before calling processVideo. It will then merge the audio with your video file.
You can also decide where each sound starts playing in the video, via the value assigned under the key "startTime" in audioInfoArray. Using the switch cases, you can play the video, upload to Facebook, etc. as you wish.
An iOS app can't really record (using any public API) the sound that it itself makes. What an app can do is generate the same audio twice, one for playing, one for streaming to a file. You have to stick with only sounds that you know how to do both ways, such as copying PCM waveforms into buffers, etc.
Once you have your duplicate buffer of audio samples, there should be example code on how to send it to an AVAssetWriter.
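One possible shape for that last step, sketched here under stated assumptions (16-bit mono PCM at 44.1 kHz; the writer has already called startWriting and startSessionAtSourceTime:; the helper name `appendPCM` is made up for illustration):

```objc
// Wrap a buffer of 16-bit mono PCM samples in a CMSampleBuffer and append it
// to an AVAssetWriterInput.
static void appendPCM(AVAssetWriterInput *input, void *pcmBytes,
                      size_t byteCount, CMTime pts) {
    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate       = 44100;
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    asbd.mChannelsPerFrame = 1;
    asbd.mBitsPerChannel   = 16;
    asbd.mBytesPerFrame    = 2;
    asbd.mFramesPerPacket  = 1;
    asbd.mBytesPerPacket   = 2;

    CMAudioFormatDescriptionRef fmt = NULL;
    CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &asbd,
                                   0, NULL, 0, NULL, NULL, &fmt);

    CMBlockBufferRef block = NULL;
    // kCFAllocatorNull: the block buffer borrows pcmBytes instead of copying it
    CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, pcmBytes, byteCount,
                                       kCFAllocatorNull, NULL, 0, byteCount,
                                       0, &block);

    CMSampleBufferRef sample = NULL;
    CMItemCount frames = byteCount / asbd.mBytesPerFrame;
    CMAudioSampleBufferCreateWithPacketDescriptions(kCFAllocatorDefault, block,
                                                    true, NULL, NULL, fmt,
                                                    frames, pts, NULL, &sample);
    [input appendSampleBuffer:sample];

    CFRelease(sample);
    CFRelease(block);
    CFRelease(fmt);
}
```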

How to concatenate 2 or 3 audio files in iOS?

I am a newcomer to Objective-C and have only 5 months' experience in iPhone development.
What I need:
I need to concatenate 2 or more audio files into one, and export the result in aiff, mp3, caf or m4a format.
For example:
the first audio file contains "You need", the second "download" and the third "document".
Every audio part depends on actions from the user.
I have spent 2 days on this without luck. This place is my last frontier.
I would very much appreciate a piece of code.
The code below can be used to merge audio files.
Input files: the ids of the files are supplied in the array audioIds, e.g. audio1.mp3, audio2.mp3 … audioN.mp3, available in the documents folder.
Output file: combined.m4a
- (BOOL)combineVoices {
    NSError *error = nil;
    BOOL ok = NO;
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    CMTime nextClipStartTime = kCMTimeZero;
    // Create an AVMutableComposition. This object will hold the AVMutableCompositionTrack.
    AVMutableComposition *composition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    for (int i = 0; i < [self.audioIds count]; i++) {
        int key = [[self.audioIds objectAtIndex:i] intValue];
        NSString *audioFileName = [NSString stringWithFormat:@"audio%d", key];
        // Build the filename with path
        NSString *soundOne = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mp3", audioFileName]];
        //NSLog(@"voice file - %@", soundOne);
        NSURL *url = [NSURL fileURLWithPath:soundOne];
        AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
        NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
        if ([tracks count] == 0)
            return NO;
        CMTimeRange timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [avAsset duration]);
        AVAssetTrack *clipAudioTrack = [tracks objectAtIndex:0];
        // Append each clip where the previous one ended
        ok = [compositionAudioTrack insertTimeRange:timeRangeInAsset ofTrack:clipAudioTrack atTime:nextClipStartTime error:&error];
        if (!ok) {
            NSLog(@"Current Audio Track Error: %@", error);
        }
        nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration);
    }
    // create the export session
    // no need for a retain here, the session will be retained by the
    // completion handler since it is referenced there
    AVAssetExportSession *exportSession = [AVAssetExportSession
                                           exportSessionWithAsset:composition
                                           presetName:AVAssetExportPresetAppleM4A];
    if (nil == exportSession) return NO;
    NSString *soundOneNew = [documentsDirectory stringByAppendingPathComponent:@"combined.m4a"];
    //NSLog(@"Output file path - %@", soundOneNew);
    // configure export session output with all our parameters
    exportSession.outputURL = [NSURL fileURLWithPath:soundOneNew]; // output path
    exportSession.outputFileType = AVFileTypeAppleM4A; // output file type
    // perform the export
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            NSLog(@"AVAssetExportSessionStatusCompleted");
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            // a failure may happen because of an event out of your control,
            // for example an interruption like a phone call coming in;
            // make sure to handle this case appropriately
            NSLog(@"AVAssetExportSessionStatusFailed");
        } else {
            NSLog(@"Export Session Status: %ld", (long)exportSession.status);
        }
    }];
    return YES;
}
You can use this method to merge 3 sounds together.
- (BOOL)combineVoices1
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    // Create an AVMutableComposition. This object will hold our multiple AVMutableCompositionTracks.
    AVMutableComposition *composition = [[AVMutableComposition alloc] init];

    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack setPreferredVolume:0.8];
    NSString *soundOne = [[NSBundle mainBundle] pathForResource:@"test1" ofType:@"caf"];
    NSURL *url = [NSURL fileURLWithPath:soundOne];
    AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetTrack *clipAudioTrack = [[avAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];

    AVMutableCompositionTrack *compositionAudioTrack1 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack1 setPreferredVolume:0.3];
    NSString *soundOne1 = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"caf"];
    NSURL *url1 = [NSURL fileURLWithPath:soundOne1];
    AVAsset *avAsset1 = [AVURLAsset URLAssetWithURL:url1 options:nil];
    AVAssetTrack *clipAudioTrack1 = [[avAsset1 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [compositionAudioTrack1 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset1.duration) ofTrack:clipAudioTrack1 atTime:kCMTimeZero error:nil];

    AVMutableCompositionTrack *compositionAudioTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack2 setPreferredVolume:1.0];
    NSString *soundOne2 = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"caf"];
    NSURL *url2 = [NSURL fileURLWithPath:soundOne2];
    AVAsset *avAsset2 = [AVURLAsset URLAssetWithURL:url2 options:nil];
    AVAssetTrack *clipAudioTrack2 = [[avAsset2 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [compositionAudioTrack2 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset2.duration) ofTrack:clipAudioTrack2 atTime:kCMTimeZero error:nil];

    AVAssetExportSession *exportSession = [AVAssetExportSession
                                           exportSessionWithAsset:composition
                                           presetName:AVAssetExportPresetAppleM4A];
    if (nil == exportSession) return NO;

    NSString *soundOneNew = [documentsDirectory stringByAppendingPathComponent:@"combined10.m4a"];
    //NSLog(@"Output file path - %@", soundOneNew);
    // configure export session output with all our parameters
    exportSession.outputURL = [NSURL fileURLWithPath:soundOneNew]; // output path
    exportSession.outputFileType = AVFileTypeAppleM4A; // output file type
    // perform the export
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            NSLog(@"AVAssetExportSessionStatusCompleted");
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            // a failure may happen because of an event out of your control,
            // for example an interruption like a phone call coming in;
            // make sure to handle this case appropriately
            NSLog(@"AVAssetExportSessionStatusFailed");
        } else {
            NSLog(@"Export Session Status: %ld", (long)exportSession.status);
        }
    }];
    return YES;
}
You should probably look into this post for doing the same:
Combine two audio files into one in objective c
The answer is similar to what Dimitar has suggested, but there are two important things to keep in mind: first, it works only for the mp3 format, and second, the bit rate of all the files you are trying to concatenate must be the same, or else only parts of your file will play in the final output. It stops playing where the bit rate changes.
@SSteve: files like WAV files have their own headers, and if you just write one file after another, it will play just the first file and then stop, despite the file info showing a greater file size. This is because the header of the first file was never updated.
I have taken two different mp3 files, each in its own AVMutableCompositionTrack, and stored both tracks in the same AVMutableComposition.
When the button is pressed, the path of the new merged file is printed to the console.
-(IBAction)play {
    [self mixAudio];
}

-(void)mixAudio {
    CFAbsoluteTime currentTime = CFAbsoluteTimeGetCurrent();
    AVMutableComposition *composition = [[AVMutableComposition alloc] init];

    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack setPreferredVolume:0.8];
    NSString *soundOne = [[NSBundle mainBundle] pathForResource:@"KICK1" ofType:@"mp3"];
    NSURL *url = [NSURL fileURLWithPath:soundOne];
    AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack *clipAudioTrack = [tracks objectAtIndex:0];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];

    AVMutableCompositionTrack *compositionAudioTrack1 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack1 setPreferredVolume:0.8];
    NSString *soundOne1 = [[NSBundle mainBundle] pathForResource:@"KICK2" ofType:@"mp3"];
    NSURL *url1 = [NSURL fileURLWithPath:soundOne1];
    AVAsset *avAsset1 = [AVURLAsset URLAssetWithURL:url1 options:nil];
    NSArray *tracks1 = [avAsset1 tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack *clipAudioTrack1 = [tracks1 objectAtIndex:0];
    [compositionAudioTrack1 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset1.duration) ofTrack:clipAudioTrack1 atTime:kCMTimeZero error:nil];

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSLibraryDirectory, NSUserDomainMask, YES);
    NSString *libraryCachesDirectory = [paths objectAtIndex:0];
    NSString *requiredOutputPath = [libraryCachesDirectory stringByAppendingPathComponent:@"output.m4a"];
    NSURL *audioFileOutput = [NSURL fileURLWithPath:requiredOutputPath];
    // Remove any previous output; the export fails if the file already exists
    [[NSFileManager defaultManager] removeItemAtURL:audioFileOutput error:NULL];

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetAppleM4A];
    exporter.outputURL = audioFileOutput;
    exporter.outputFileType = AVFileTypeAppleM4A;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"Output path is \n%@", requiredOutputPath);
        NSLog(@"export complete: %lf", CFAbsoluteTimeGetCurrent() - currentTime);
        NSError *error = nil;
        audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileOutput error:&error];
        audioPlayer.numberOfLoops = 0;
        [audioPlayer play];
    }];
}
Have you tried something like this:
AudioFileCreateWithURL //to create the output file
For each input file:
    AudioFileOpenURL      // open input file
    repeat
        AudioFileReadBytes    // read from input file
        AudioFileWriteBytes   // write to output file
    until eof(input file)
    AudioFileClose        // close input file
AudioFileClose            // close output file
This would probably require that the input files are all the same format and would create the output file in that same format. If you need to convert the format, that might be better done after creating the output file.
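A rough sketch of that loop with the Audio File Services API. This is an assumption-laden outline, not a drop-in implementation: error handling is minimal, a fixed buffer size is used, all inputs are assumed to share the format described by `asbd`, and `inURLs`, `outURL` and `asbd` are placeholders you would supply.

```objc
#include <AudioToolbox/AudioToolbox.h>

// Concatenate same-format audio files byte-for-byte into one output file.
// Only sensible for formats where raw data bytes can simply be appended
// (e.g. linear PCM in a CAF container).
void concatenateAudioFiles(CFArrayRef inURLs, CFURLRef outURL,
                           const AudioStreamBasicDescription *asbd) {
    AudioFileID outFile;
    AudioFileCreateWithURL(outURL, kAudioFileCAFType, asbd,
                           kAudioFileFlags_EraseFile, &outFile);
    SInt64 writePos = 0;
    for (CFIndex i = 0; i < CFArrayGetCount(inURLs); i++) {
        CFURLRef inURL = (CFURLRef)CFArrayGetValueAtIndex(inURLs, i);
        AudioFileID inFile;
        AudioFileOpenURL(inURL, kAudioFileReadPermission, 0, &inFile);
        char buffer[32 * 1024];
        SInt64 readPos = 0;
        for (;;) {
            UInt32 ioBytes = sizeof(buffer);
            OSStatus st = AudioFileReadBytes(inFile, false, readPos, &ioBytes, buffer);
            if (ioBytes == 0) break;                 // nothing left in this input
            AudioFileWriteBytes(outFile, false, writePos, &ioBytes, buffer);
            readPos  += ioBytes;
            writePos += ioBytes;
            if (st != noErr) break;                  // kAudioFileEndOfFileError etc.
        }
        AudioFileClose(inFile);
    }
    AudioFileClose(outFile);
}
```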
The easiest way to concatenate multiple AAC recordings:
- (NSString *)concatenatedAACVoicesPath {
    NSMutableData *concatenatedData = [[NSMutableData alloc] init];
    NSArray *aacPathArr = [self queryAAC];
    for (NSString *path in aacPathArr) {
        NSData *data = [[NSData alloc] initWithContentsOfFile:path];
        [concatenatedData appendData:data];
    }
    NSString *fileNamePath = [NSString stringWithFormat:@"%@/%@.aac", [NSString createPath:currRecordDocName], currRecordDocName];
    [concatenatedData writeToFile:fileNamePath atomically:YES];
    return fileNamePath;
}
I have an idea, not sure it will work. Try to get NSData from these 3 files, append the data into another NSData and then write it. Something like:
NSMutableData *concatenatedData = [[NSMutableData alloc] init];
NSData *data1 = [[NSData alloc] initWithContentsOfFile:path1];
NSData *data2 = [[NSData alloc] initWithContentsOfFile:path2];
NSData *data3 = [[NSData alloc] initWithContentsOfFile:path3];
[concatenatedData appendData:data1];
[concatenatedData appendData:data2];
[concatenatedData appendData:data3];
[concatenatedData writeToFile:@"/path/to/concatenatedData.mp3" atomically:YES];
It's a theory, and I'm not sure it will work :) - but it actually does: if I open an mp3 in a hex editor, copy everything and paste it at the end, I get the same sound twice. Please try it and let us know if it works.

When I export the video, it does not play

Here, creating the video succeeds, the video and audio are combined and merged into the MOV format, and the file is exported using AVAssetExportSession. But when the file is opened in a media player it does not play; it just displays a blank screen.
Here is the merging code for video and audio:
-(void)combine:(NSString *)audiopathvalue videoURL:(NSString *)videopathValue
{
    // 1. Create an AVMutableComposition
    CFAbsoluteTime currentTime = CFAbsoluteTimeGetCurrent(); // Debug purpose - used to calculate the total time taken
    NSError *error = nil;
    AVMutableComposition *saveComposition = [AVMutableComposition composition];
    // 2. Get the video and audio file paths
    NSString *tempPath = NSTemporaryDirectory();
    NSString *videoPath = videopathValue; //<Video file path>
    NSString *audioPath = audiopathvalue; //<Audio file path>
    // 3. Create the video asset
    NSURL *url1 = [[NSURL alloc] initFileURLWithPath:videoPath];
    AVURLAsset *video = [AVURLAsset URLAssetWithURL:url1 options:nil];
    [url1 release];
    // 4. Get the AVMutableCompositionTrack for video and add the video track to it.
    // The method insertTimeRange:ofTrack:atTime: decides what portion of the video is added,
    // and also where the video track should appear in the final video.
    AVMutableCompositionTrack *compositionVideoTrack = [saveComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipVideoTrack = [[video tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [video duration]) ofTrack:clipVideoTrack atTime:kCMTimeZero error:nil];
    NSLog(@"%f %@", CMTimeGetSeconds([video duration]), error);
    // 5. Create the audio asset
    NSLog(@"audioPath: %@", audioPath);
    NSURL *url2 = [[NSURL alloc] initFileURLWithPath:audioPath];
    AVURLAsset *audio = [AVURLAsset URLAssetWithURL:url2 options:nil];
    [url2 release];
    // 6. Get the AVMutableCompositionTrack for audio and add the audio track to it.
    AVMutableCompositionTrack *compositionAudioTrack = [saveComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipAudioTrack = [[audio tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [audio duration]) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
    NSLog(@"%f %@", CMTimeGetSeconds([audio duration]), error);
    // 7. Get the file path of the final video.
    NSString *path = [tempPath stringByAppendingPathComponent:@"mergedvideo.MOV"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
        [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
    }
    NSURL *url = [[NSURL alloc] initFileURLWithPath:path];
    // 8. Create the AVAssetExportSession and set the preset.
    // The completion handler will be called upon completion of the export.
    AVAssetExportSession *exporter = [[[AVAssetExportSession alloc] initWithAsset:saveComposition presetName:AVAssetExportPresetHighestQuality] autorelease];
    exporter.outputURL = url;
    exporter.outputFileType = @"com.apple.quicktime-movie";
    NSLog(@"file type %@", exporter.outputFileType);
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[exporter error] localizedDescription]);
                NSLog(@"ExportSessionError: %@", exporter.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            case AVAssetExportSessionStatusCompleted:
            {
                NSLog(@"Export Completed");
                ImageToAirPlayAppDelegate *theApp_iphone = (ImageToAirPlayAppDelegate *)[[UIApplication sharedApplication] delegate];
                [theApp_iphone call];
                break;
            }
            default:
                break;
        }
        //[exporter release];
    }];
}
The video path contains a video built from a series of images, and the audio path contains only one audio file.
Try doing the processing in this delegate method (not in your code):
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
It gives you the outputFileURL, which is the one you have to use in your mix. There is no reason to use an NSString in the combine function.
I also recommend using AVFileTypeQuickTimeMovie instead of "com.apple.quicktime-movie". It is the same, but easier to handle in case you want to experiment with other formats.
To know the available formats just use
NSLog(@"%@", [exporter supportedFileTypes]);