How to add audio to a video file with the iPhone SDK - iphone

I have a video file and an audio file. Is it possible to merge them into a single video that includes the sound track? I think AVMutableComposition should help me, but I still don't understand how. Any advice?

Thanks Daniel. I figured it out; it's easy.
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];

AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                     atTime:kCMTimeZero error:nil];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                               ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                atTime:kCMTimeZero error:nil];

AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                       presetName:AVAssetExportPresetPassthrough];

NSString *videoName = @"export.mov";
NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
}

_assetExport.outputFileType = @"com.apple.quicktime-movie";
DLog(@"file type %@", _assetExport.outputFileType);
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;

[_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
    // your completion code here
}];
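The completion block above is left empty, so a failed export goes unnoticed; in practice it helps to check the session's status there. A minimal sketch of such a completion handler, reusing the _assetExport and exportUrl variables from the snippet above:
[_assetExport exportAsynchronouslyWithCompletionHandler:^{
    // Inspect the result of the export before using the output file.
    switch (_assetExport.status) {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Export finished: %@", exportUrl);
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", _assetExport.error);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export cancelled");
            break;
        default:
            break;
    }
}];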

Yes, it is possible. Here is a snippet of code that adds audio to an existing composition. I grabbed this from Apple's sample code; you should probably view the whole project, as you'll find it very useful. The project is AVEditDemo and you can find it in the WWDC 2010 material posted at developer.apple.com/videos/wwdc/2010. Hope that helps.
- (void)addCommentaryTrackToComposition:(AVMutableComposition *)composition withAudioMix:(AVMutableAudioMix *)audioMix
{
    NSInteger i;
    NSArray *tracksToDuck = [composition tracksWithMediaType:AVMediaTypeAudio]; // before we add the commentary

    // Clip commentary duration to composition duration.
    CMTimeRange commentaryTimeRange = CMTimeRangeMake(self.commentaryStartTime, self.commentary.duration);
    if (CMTIME_COMPARE_INLINE(CMTimeRangeGetEnd(commentaryTimeRange), >, [composition duration]))
        commentaryTimeRange.duration = CMTimeSubtract([composition duration], commentaryTimeRange.start);

    // Add the commentary track.
    AVMutableCompositionTrack *compositionCommentaryTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, commentaryTimeRange.duration) ofTrack:[[self.commentary tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:commentaryTimeRange.start error:nil];

    NSMutableArray *trackMixArray = [NSMutableArray array];
    CMTime rampDuration = CMTimeMake(1, 2); // half-second ramps
    for (i = 0; i < [tracksToDuck count]; i++) {
        AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[tracksToDuck objectAtIndex:i]];
        [trackMix setVolumeRampFromStartVolume:1.0 toEndVolume:0.2 timeRange:CMTimeRangeMake(CMTimeSubtract(commentaryTimeRange.start, rampDuration), rampDuration)];
        [trackMix setVolumeRampFromStartVolume:0.2 toEndVolume:1.0 timeRange:CMTimeRangeMake(CMTimeRangeGetEnd(commentaryTimeRange), rampDuration)];
        [trackMixArray addObject:trackMix];
    }
    audioMix.inputParameters = trackMixArray;
}
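Note that the audio mix built by this method only takes effect if it is attached to whatever plays or exports the composition. A minimal sketch of applying it through an export session, assuming a composition already built; the preset, output path, and completion handling here are assumptions, not part of Apple's sample:
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
[self addCommentaryTrackToComposition:composition withAudioMix:audioMix];

// A mix generally requires re-encoding, so a non-passthrough preset is assumed here.
AVAssetExportSession *export = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                presetName:AVAssetExportPresetMediumQuality];
export.audioMix = audioMix; // the mix populated by the method above
export.outputFileType = AVFileTypeQuickTimeMovie;
export.outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"commentary.mov"]];
[export exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"export status: %ld", (long)export.status);
}];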

Here is the Swift version:
func mixAudio(audioURL audioURL: NSURL, videoURL: NSURL) {
    let audioAsset = AVURLAsset(URL: audioURL)
    let videoAsset = AVURLAsset(URL: videoURL)
    let mixComposition = AVMutableComposition()

    // add audio
    let compositionCommentaryTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    let timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
    let track = audioAsset.tracksWithMediaType(AVMediaTypeAudio)[0]
    do {
        try compositionCommentaryTrack.insertTimeRange(timeRange, ofTrack: track, atTime: kCMTimeZero)
    } catch {
        print("Error insertTimeRange for audio track \(error)")
    }

    // add video
    let compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let timeRangeVideo = CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
    let trackVideo = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    do {
        try compositionVideoTrack.insertTimeRange(timeRangeVideo, ofTrack: trackVideo, atTime: kCMTimeZero)
    } catch {
        print("Error insertTimeRange for video track \(error)")
    }

    // export
    let assetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough)
    let videoName = "export.mov"
    exportPath = "\(NSTemporaryDirectory())/\(videoName)"
    let exportURL = NSURL(fileURLWithPath: exportPath!)
    if NSFileManager.defaultManager().fileExistsAtPath(exportPath!) {
        do {
            try NSFileManager.defaultManager().removeItemAtPath(exportPath!)
        } catch {
            print("Error deleting export.mov: \(error)")
        }
    }
    assetExportSession?.outputFileType = "com.apple.quicktime-movie"
    assetExportSession?.outputURL = exportURL
    assetExportSession?.shouldOptimizeForNetworkUse = true
    assetExportSession?.exportAsynchronouslyWithCompletionHandler({
        print("Mixed audio and video!")
        dispatch_async(dispatch_get_main_queue(), {
            print(self.exportPath!)
        })
    })
}

Related

Video recorded from EAGLView is flipped when uploaded to YouTube

In my iOS app, I am trying to record a video of the content of an EAGLView (there is no camera involvement). I have no problem recording the video.
After recording, I have to add a few sound tracks to the video and then share it to YouTube and Facebook. My problem is that the video looks fine when I play it on the iPhone or on the Mac, but when I upload it to YouTube (using the YouTube Data API v3) the video is vertically inverted, i.e. upside down.
I guess I need to rotate the frames in the video before uploading, but I don't know how to do that.
Any help will be highly appreciated.
The code I am using to add audio tracks to the video is below:
-(void)prepareVideoForPath:(NSString *)videoPath usingAudio:(NSArray *)audioArray andOutputPath:(NSString *)exportPath{
    NSDictionary *optionsDictionary = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    NSURL *videoUrl = [NSURL fileURLWithPath:videoPath];
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoUrl options:optionsDictionary];
    AVAssetTrack *FirstAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGAffineTransform firstTransform = FirstAssetTrack.preferredTransform;

    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    // Video track
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:FirstAssetTrack atTime:kCMTimeZero error:nil];
    [compositionVideoTrack setPreferredTransform:firstTransform];

    // Audio track
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime audioStartTime = kCMTimeZero;
    for (NSURL *audioURL in audioArray) {
        AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:optionsDictionary];
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:audioStartTime error:nil];
        audioStartTime = CMTimeAdd(audioStartTime, audioAsset.duration);
    }

    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    NSLog(@"file type %@", _assetExport.outputFileType);
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;
    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        // your completion code here
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"Mixing complete");
        });
    }];
}
OK, I found out how to rotate the video.
Here is the code:
-(void)fixVideoOrientationForURL:(NSURL *)videoURL andOutputPath:(NSString *)exportPath{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:@{ AVURLAssetPreferPreciseDurationAndTimingKey: @YES }];
    AVMutableVideoCompositionInstruction *instruction = nil;
    AVMutableVideoCompositionLayerInstruction *layerInstruction = nil;
    CGAffineTransform transform;
    AVAssetTrack *assetVideoTrack = nil;
    AVAssetTrack *assetAudioTrack = nil;

    // Check if the asset contains video and audio tracks
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    }
    if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
        assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
    }

    CMTime insertionPoint = kCMTimeZero;
    NSError *error = nil;

    // Step 1
    // Create a composition with the given asset and insert audio and video tracks into it from the asset
    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    // Insert the video and audio tracks from AVAsset
    if (assetVideoTrack != nil) {
        AVMutableCompositionTrack *compositionVideoTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error];
    }
    if (assetAudioTrack != nil) {
        AVMutableCompositionTrack *compositionAudioTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error];
    }

    // Step 2
    // Flip the video vertically and translate it back into frame
    // (the flip alone would move the picture out of the render rectangle)
    transform = CGAffineTransformMake(1, 0, 0, -1, 0, assetVideoTrack.naturalSize.height);

    // Step 3
    // Set the appropriate render size and frame duration
    // Create a new video composition
    AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
    mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.width, assetVideoTrack.naturalSize.height);
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30);

    // The flip transform is set on a layer instruction
    instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mutableComposition duration]);
    layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:(mutableComposition.tracks)[0]];
    [layerInstruction setTransform:transform atTime:kCMTimeZero];

    // Step 4
    // Add the transform instructions to the video composition
    instruction.layerInstructions = @[layerInstruction];
    mutableVideoComposition.instructions = @[instruction];

    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:[mutableComposition copy] presetName:AVAssetExportPresetHighestQuality];
    exportSession.videoComposition = mutableVideoComposition;
    exportSession.outputURL = [NSURL fileURLWithPath:exportPath];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession.shouldOptimizeForNetworkUse = YES;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void){
        dispatch_async(dispatch_get_main_queue(), ^{
            switch (exportSession.status) {
                case AVAssetExportSessionStatusCompleted:
                    NSLog(@"writing complete");
                    break;
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Failed:%@", exportSession.error);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Canceled:%@", exportSession.error);
                    break;
                default:
                    break;
            }
        });
    }];
}

Play and save video file

I have recorded a video and saved it into the documents directory.
Now I want to edit the video to play at slow, fast, or normal speed.
I am able to do fast and normal speed.
But I can't get slow speed to work; the video always ends up at normal speed.
Here is my code:
AVMutableComposition *composition = [AVMutableComposition composition];
AVURLAsset *sourceAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:inputVideoPath] options:nil];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
BOOL ok = NO;
AVAssetTrack *sourceVideoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CMTimeRange x = CMTimeRangeMake(kCMTimeZero, [sourceAsset duration]);
ok = [compositionVideoTrack insertTimeRange:x ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];

CMTime fiveSecondsIn;
if ([Cell.text isEqualToString:@"Fast"]) {
    fiveSecondsIn = CMTimeMake(10, 1);
}
else if ([Cell.text isEqualToString:@"Normal"]) {
    fiveSecondsIn = CMTimeMake(1, 1);
}
else if ([Cell.textLabel.text isEqualToString:@"Slow"]) {
    fiveSecondsIn = CMTimeMake(1, .5);
}

CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, sourceAsset.duration);
[compositionVideoTrack scaleTimeRange:video_timeRange toDuration:fiveSecondsIn];

if ([[NSFileManager defaultManager] fileExistsAtPath:filePath]) {
    [[NSFileManager defaultManager] removeItemAtPath:filePath error:nil];
}
NSURL *exportUrl = [NSURL fileURLWithPath:filePath];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = exportUrl;
exporter.outputFileType = @"com.apple.quicktime-movie";
[exporter exportAsynchronouslyWithCompletionHandler:^(void){
    [self performSelectorOnMainThread:@selector(playnewVideo) withObject:nil waitUntilDone:NO];
}];
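The "Slow" branch is the likely culprit: CMTimeMake takes integer arguments, so CMTimeMake(1, .5) truncates the timescale to 0 and yields an invalid time, and the scaleTimeRange:toDuration: call then has no visible effect. A sketch of one way to get slow motion instead, scaling the clip's own duration by a factor (the 2.0 factor is just an example, not from the question):
// Scale the whole clip relative to its own duration instead of to a hand-built CMTime.
// A factor > 1.0 slows the video down, a factor < 1.0 speeds it up (2.0 here is just an example).
Float64 slowDownFactor = 2.0;
CMTime scaledDuration = CMTimeMultiplyByFloat64(sourceAsset.duration, slowDownFactor);
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, sourceAsset.duration)
                           toDuration:scaledDuration];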

Rotate an AVAsset with AVAssetExportSession

I'm trying to rotate a video to its correct orientation using an AVAssetExportSession and I always get the following error:
Error Domain=AVFoundationErrorDomain Code=-11841 "The operation couldn’t be completed. (AVFoundationErrorDomain error -11841.)"
That translates to AVErrorInvalidVideoComposition but I cannot see anything wrong with my video composition. Here's the code:
AVAssetTrack *sourceVideo = [[avAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
AVAssetTrack *sourceAudio = [[avAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
CGAffineTransform preferredTransform = [sourceVideo preferredTransform];

AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                             preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetExportSession *exporter = [[[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality] autorelease];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                               ofTrack:sourceVideo
                                atTime:kCMTimeZero
                                 error:nil];

if (!CGAffineTransformIsIdentity(preferredTransform)) {
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.renderSize = CGSizeMake([avAsset naturalSize].height, [avAsset naturalSize].width);
    videoComposition.frameDuration = CMTimeMake(1, compositionVideoTrack.naturalTimeScale);

    AVMutableVideoCompositionLayerInstruction *instruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:sourceVideo];
    [instruction setTransform:preferredTransform atTime:kCMTimeZero];

    AVMutableVideoCompositionInstruction *videoTrackInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoTrackInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, avAsset.duration);
    videoTrackInstruction.layerInstructions = [NSArray arrayWithObject:instruction];
    [videoComposition setInstructions:[NSArray arrayWithObject:videoTrackInstruction]];

    exporter.videoComposition = videoComposition;
}

AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                             preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
                               ofTrack:sourceAudio
                                atTime:kCMTimeZero
                                 error:nil];

exporter.outputURL = tempPathUrl;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{ }];
What could be wrong with the composition? I've been through the documentation and cannot see anything wrong with it so far.
It might have to do with your frame duration. You're using
CMTimeMake(1, naturalTimeScale)
You should check the naturalTimeScale as it does not always equal the fps. See the AVFoundation Programming Guide 'Representations of Time' for more info.
I think it's because the order of width and height arguments is incorrect.
You have:
videoComposition.renderSize = CGSizeMake([avAsset naturalSize].height, [avAsset naturalSize].width);
Shouldn't it be
videoComposition.renderSize = CGSizeMake([avAsset naturalSize].width, [avAsset naturalSize].height);
instead?
The error comes from how you set up the layer instruction:
... videoCompositionLayerInstructionWithAssetTrack:sourceVideo]
sourceVideo is not a track of the composition. In your case this should be compositionVideoTrack.
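Putting the suggestions above together, a sketch of the corrected setup against the question's own variables (the fixed 30 fps frame duration is an assumption, not taken from the original code):
// Build the layer instruction from the composition's own track, not the source asset's track.
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
[layerInstruction setTransform:preferredTransform atTime:kCMTimeZero];

// Use an explicit frame duration rather than the track's naturalTimeScale (30 fps assumed here).
videoComposition.frameDuration = CMTimeMake(1, 30);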

MOV to MP4 video conversion on iPhone programmatically

I am developing a media server for the PlayStation 3 on iPhone.
I came to know that the PS3 doesn't support .MOV files, so I have to transcode them into MP4 or some other format that the PS3 supports.
This is what I have done, but it crashes if I set a file type different from the source file's:
AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
if ([compatiblePresets containsObject:AVAssetExportPresetLowQuality])
{
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:avAsset presetName:AVAssetExportPresetLowQuality];
    exportSession.outputURL = [NSURL fileURLWithPath:videoPath];
    exportSession.outputFileType = AVFileTypeMPEG4;
    CMTime start = CMTimeMakeWithSeconds(1.0, 600);
    CMTime duration = CMTimeMakeWithSeconds(3.0, 600);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    exportSession.timeRange = range;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([exportSession status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            default:
                break;
        }
        [exportSession release];
    }];
}
If I set AVFileTypeMPEG4 here it crashes, saying "Invalid file type", so I have to set it to AVFileTypeQuickTimeMovie, which gives me a MOV file.
Is it possible in iOS to convert a video from MOV to MP4 through AVAssetExportSession, or at least without any third-party libraries?
For presetName, use AVAssetExportPresetPassthrough instead of AVAssetExportPresetLowQuality:
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:avAsset presetName:AVAssetExportPresetPassthrough];
MOV is very similar to MP4; you might be able to just change the extension and have it work. Windows Phone can't play .MOV files but can play MP4, and all I did to get that to work was change the extension from .mov to .mp4 on videos shot on the iPhone, and it works fine. Failing that, you can definitely try exporting with AVAssetExportSession; there are file types for MP4 and M4A among the file-format UTIs here.
Hope it helps.
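Regarding the "Invalid file type" crash itself: an export session only accepts output types listed in its supportedFileTypes, so it can help to check that list before assigning outputFileType. A minimal sketch, using the exportSession from the snippet above:
// Only assign AVFileTypeMPEG4 if the chosen preset/asset combination actually supports it.
if ([exportSession.supportedFileTypes containsObject:AVFileTypeMPEG4]) {
    exportSession.outputFileType = AVFileTypeMPEG4;
} else {
    NSLog(@"MP4 not supported here, falling back to QuickTime. Supported: %@", exportSession.supportedFileTypes);
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
}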
You can convert the video to MP4 with AVAssets:
AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:avAsset presetName:AVAssetExportPresetLowQuality];
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
// The output file name is just an example; build the URL from the documents directory.
NSURL *url = [NSURL fileURLWithPath:[documentsDirectory stringByAppendingPathComponent:@"converted.mp4"]];
exportSession.outputURL = url;
// Set the output file format if you want another file format (e.g. .3gp)
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.shouldOptimizeForNetworkUse = YES;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch ([exportSession status])
    {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export session failed");
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export canceled");
            break;
        case AVAssetExportSessionStatusCompleted:
        {
            // Video conversion finished
            NSLog(@"Successful!");
        }
            break;
        default:
            break;
    }
}];
To easily convert video to MP4, use this link.
You can also find a sample project there for converting video to MP4.
You need an AVMutableComposition to do this, because the asset can't be transcoded to MP4 directly under iOS 5.0.
- (BOOL)encodeVideo:(NSURL *)videoURL
{
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];

    // Create the composition and tracks
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    NSArray *assetVideoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if (assetVideoTracks.count <= 0)
    {
        NSLog(@"Error reading the transformed video track");
        return NO;
    }

    // Insert the tracks in the composition's tracks
    AVAssetTrack *assetVideoTrack = [assetVideoTracks firstObject];
    [videoTrack insertTimeRange:assetVideoTrack.timeRange ofTrack:assetVideoTrack atTime:CMTimeMake(0, 1) error:nil];
    [videoTrack setPreferredTransform:assetVideoTrack.preferredTransform];

    AVAssetTrack *assetAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [audioTrack insertTimeRange:assetAudioTrack.timeRange ofTrack:assetAudioTrack atTime:CMTimeMake(0, 1) error:nil];

    // Export to mp4
    NSString *mp4Quality = [MGPublic isIOSAbove:@"6.0"] ? AVAssetExportPresetMediumQuality : AVAssetExportPresetPassthrough;
    NSString *exportPath = [NSString stringWithFormat:@"%@/%@.mp4",
                            [NSHomeDirectory() stringByAppendingString:@"/tmp"],
                            [BSCommon uuidString]];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:mp4Quality];
    exportSession.outputURL = exportUrl;
    CMTime start = CMTimeMakeWithSeconds(0.0, 600);
    CMTimeRange range = CMTimeRangeMake(start, [asset duration]);
    exportSession.timeRange = range;
    exportSession.outputFileType = AVFileTypeMPEG4;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([exportSession status])
        {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"MP4 Successful!");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            default:
                break;
        }
    }];
    return YES;
}
Use the code below:
NSURL *mediaURL = [info objectForKey:UIImagePickerControllerMediaURL];
AVAsset *video = [AVAsset assetWithURL:mediaURL];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPresetMediumQuality];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeMPEG4;

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *basePath = ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
basePath = [basePath stringByAppendingPathComponent:@"videos"];
if (![[NSFileManager defaultManager] fileExistsAtPath:basePath])
    [[NSFileManager defaultManager] createDirectoryAtPath:basePath withIntermediateDirectories:YES attributes:nil error:nil];

compressedVideoUrl = nil;
compressedVideoUrl = [NSURL fileURLWithPath:basePath];
long CurrentTime = [[NSDate date] timeIntervalSince1970];
NSString *strImageName = [NSString stringWithFormat:@"%ld", CurrentTime];
compressedVideoUrl = [compressedVideoUrl URLByAppendingPathComponent:[NSString stringWithFormat:@"%@.mp4", strImageName]];
exportSession.outputURL = compressedVideoUrl;

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"done processing video!");
    NSLog(@"%@", compressedVideoUrl);
    if (!dataMovie)
        dataMovie = [[NSMutableData alloc] init];
    dataMovie = [NSData dataWithContentsOfURL:compressedVideoUrl];
}];
Here is the code:
func encodeVideo(videoURL: NSURL) {
    let avAsset = AVURLAsset(URL: videoURL, options: nil)
    let startDate = NSDate()

    // Create the export session
    exportSession = AVAssetExportSession(asset: avAsset, presetName: AVAssetExportPresetPassthrough)
    // exportSession = AVAssetExportSession(asset: composition, presetName: mp4Quality)

    // Create a temp path to save the converted video
    let documentsDirectory = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0]
    let myDocumentPath = NSURL(fileURLWithPath: documentsDirectory).URLByAppendingPathComponent("temp.mp4").absoluteString
    let documentsDirectory2 = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0] as NSURL
    let filePath = documentsDirectory2.URLByAppendingPathComponent("rendered-Video.mp4")
    deleteFile(filePath)

    // Check if the file already exists, then remove the previous file
    if NSFileManager.defaultManager().fileExistsAtPath(myDocumentPath) {
        do {
            try NSFileManager.defaultManager().removeItemAtPath(myDocumentPath)
        } catch let error {
            print(error)
        }
    }

    exportSession!.outputURL = filePath
    exportSession!.outputFileType = AVFileTypeMPEG4
    exportSession!.shouldOptimizeForNetworkUse = true
    let start = CMTimeMakeWithSeconds(0.0, 600)
    let range = CMTimeRangeMake(start, avAsset.duration)
    exportSession!.timeRange = range

    exportSession!.exportAsynchronouslyWithCompletionHandler({() -> Void in
        switch self.exportSession!.status {
        case .Failed:
            print(self.exportSession?.error)
        case .Cancelled:
            print("Export canceled")
        case .Completed:
            // Video conversion finished
            let endDate = NSDate()
            let time = endDate.timeIntervalSinceDate(startDate)
            print(time)
            print("Successful!")
            print(self.exportSession!.outputURL)
        default:
            break
        }
    })
}
func deleteFile(filePath: NSURL) {
    guard NSFileManager.defaultManager().fileExistsAtPath(filePath.path!) else {
        return
    }
    do {
        try NSFileManager.defaultManager().removeItemAtPath(filePath.path!)
    } catch {
        fatalError("Unable to delete file: \(error) : \(__FUNCTION__).")
    }
}
Just wanted to say that the URL cannot be built like
[NSURL URLWithString:[@"~/Documents/movie.mov" stringByExpandingTildeInPath]]
It must be like
[NSURL fileURLWithPath:[@"~/Documents/movie.mov" stringByExpandingTildeInPath]]
Took me a while to figure that out :-)

iPhone AVComposition issue

I'm trying to create a video that shows two videos one after the other, using AVComposition on the iPhone. This code works; however, I can only see one of the videos for the entire duration of the newly created video.
- (void)startEdit{
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    NSString *a_inputFileName = @"export.mov";
    NSString *a_inputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:a_inputFileName];
    NSURL *a_inputFileUrl = [NSURL fileURLWithPath:a_inputFilePath];

    NSString *b_inputFileName = @"output.mov";
    NSString *b_inputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:b_inputFileName];
    NSURL *b_inputFileUrl = [NSURL fileURLWithPath:b_inputFilePath];

    NSString *outputFileName = @"outputFile.mov";
    NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:outputFileName];
    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset *a_videoAsset = [[AVURLAsset alloc] initWithURL:a_inputFileUrl options:nil];
    CMTimeRange a_timeRange = CMTimeRangeMake(kCMTimeZero, a_videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:a_timeRange ofTrack:[[a_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);

    AVURLAsset *b_videoAsset = [[AVURLAsset alloc] initWithURL:b_inputFileUrl options:nil];
    CMTimeRange b_timeRange = CMTimeRangeMake(kCMTimeZero, b_videoAsset.duration);
    AVMutableCompositionTrack *b_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionVideoTrack insertTimeRange:b_timeRange ofTrack:[[b_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetLowQuality];
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    _assetExport.outputURL = outputFileUrl;
    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        [self saveVideoToAlbum:outputFilePath];
    }];
}

- (void)saveVideoToAlbum:(NSString *)path{
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
        UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
    }
}

- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"Finished saving video with error: %@", error);
}
I've posted the whole code as it may help someone else.
Shouldn't
nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);
[b_compositionVideoTrack insertTimeRange:b_timeRange ofTrack:[[b_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
add the second video to the end of the first?
Cheers
Figured it out. Should have only had one AVMutableCompositionTrack.
Like so:
CMTime nextClipStartTime = kCMTimeZero;
AVURLAsset* a_videoAsset = [[AVURLAsset alloc]initWithURL:a_inputFileUrl options:nil];
CMTimeRange a_timeRange = CMTimeRangeMake(kCMTimeZero,a_videoAsset.duration);
AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[a_compositionVideoTrack insertTimeRange:a_timeRange ofTrack:[[a_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);
AVURLAsset* b_videoAsset = [[AVURLAsset alloc]initWithURL:b_inputFileUrl options:nil];
CMTimeRange b_timeRange = CMTimeRangeMake(kCMTimeZero, b_videoAsset.duration);
[a_compositionVideoTrack insertTimeRange:b_timeRange ofTrack:[[b_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
I haven't spotted the flaw yet, but I do have a couple of suggestions:
First, capture the error in insertTimeRange and every other opportunity and inspect it.
Second, for the simple case of appending videos, you can use AVMutableComposition without so much track mucking. Use "insertTimeRange:ofAsset:atTime:error:" with AVAssets initialized from the files and you will simplify your code greatly. If you need to do something more complicated such as crossfades, you'll need to use a video composition and audio mix as well, and at that point you can deal with the complexity of tracks.