Rotate an AVAsset with AVAssetExportSession - iPhone

I'm trying to rotate a video to its correct orientation using an AVAssetExportSession and I always get the following error:
Error Domain=AVFoundationErrorDomain Code=-11841 "The operation couldn’t be completed. (AVFoundationErrorDomain error -11841.)"
That translates to AVErrorInvalidVideoComposition but I cannot see anything wrong with my video composition. Here's the code:
AVAssetTrack *sourceVideo = [[avAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
AVAssetTrack *sourceAudio = [[avAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
CGAffineTransform preferredTransform = [sourceVideo preferredTransform];
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetExportSession *exporter = [[[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality] autorelease];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
ofTrack:sourceVideo
atTime:kCMTimeZero
error:nil];
if( !CGAffineTransformIsIdentity(preferredTransform) ) {
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake([avAsset naturalSize].height, [avAsset naturalSize].width);
videoComposition.frameDuration = CMTimeMake(1, compositionVideoTrack.naturalTimeScale);
AVMutableVideoCompositionLayerInstruction *instruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:sourceVideo];
[instruction setTransform:preferredTransform atTime:kCMTimeZero];
AVMutableVideoCompositionInstruction *videoTrackInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoTrackInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, avAsset.duration);
videoTrackInstruction.layerInstructions = [NSArray arrayWithObject:instruction];
[videoComposition setInstructions:[NSArray arrayWithObject:videoTrackInstruction]];
exporter.videoComposition = videoComposition;
}
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration)
ofTrack:sourceAudio
atTime:kCMTimeZero
error:nil];
exporter.outputURL = tempPathUrl;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{ }];
What could be wrong with the composition? I've been through the documentation and cannot see anything wrong with it so far.

It might have to do with your frame duration. You're using
CMTimeMake(1, naturalTimeScale)
You should check the naturalTimeScale as it does not always equal the fps. See the AVFoundation Programming Guide 'Representations of Time' for more info.
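For example, a minimal sketch of deriving the frame duration from the track's nominal frame rate instead (using the sourceVideo and videoComposition from your code, and assuming ~30 fps only as a fallback):
// Base the frame duration on the track's reported frame rate rather than its time scale.
float fps = [sourceVideo nominalFrameRate];
if (fps <= 0) fps = 30; // fallback assumption if the track does not report a rate
videoComposition.frameDuration = CMTimeMake(1, (int32_t)roundf(fps));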

I think it's because the order of width and height arguments is incorrect.
You have:
videoComposition.renderSize = CGSizeMake([avAsset naturalSize].height, [avAsset naturalSize].width);
Shouldn't it be
videoComposition.renderSize = CGSizeMake([avAsset naturalSize].width, [avAsset naturalSize].height);
instead?

The error comes from how you set up the layer instruction:
... videoCompositionLayerInstructionWithAssetTrack:sourceVideo]
sourceVideo is not a track of the composition. In your case this should be compositionVideoTrack.
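A sketch of the corrected lines, assuming the rest of the composition code stays as you posted it:
// Build the layer instruction against the track that actually lives in the composition.
AVMutableVideoCompositionLayerInstruction *instruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
[instruction setTransform:preferredTransform atTime:kCMTimeZero];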

Related

Video recorded from EAGLView is flipped when uploaded to youtube

In my iOS app, I am trying to record a video of the content of an EAGLView (there is no camera involvement). I have no problem recording the video.
After recording, I have to add a few sound tracks to the video and then share it to YouTube and Facebook. My problem is that the video is fine when I play it on the iPhone or on a Mac, but when I upload it to YouTube (using the YouTube Data API v3), the video is vertically inverted, i.e. upside down.
I guess I need to rotate the frames of the video before uploading, but I don't know how to do that.
Any help will be highly appreciated.
The code I am using to add audio tracks to the video is below:
-(void)prepareVideoForPath:(NSString *)videoPath usingAudio:(NSArray *)audioArray andOutputPath:(NSString *)exportPath{
NSDictionary *optionsDictionary = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
NSURL *videoUrl=[NSURL fileURLWithPath:videoPath];
AVURLAsset* videoAsset = [AVURLAsset URLAssetWithURL:videoUrl options:optionsDictionary];
AVAssetTrack *FirstAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGAffineTransform firstTransform = FirstAssetTrack.preferredTransform;
AVMutableComposition* mixComposition = [AVMutableComposition composition];
//VideoTrack
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:FirstAssetTrack atTime:kCMTimeZero error:nil];
[compositionVideoTrack setPreferredTransform:firstTransform];
//Audio Track
AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime audioStartTime = kCMTimeZero;
for (NSURL *audioURL in audioArray) {
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:optionsDictionary];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:audioStartTime error:nil];
audioStartTime = CMTimeAdd(audioStartTime, audioAsset.duration);
}
AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
_assetExport.outputFileType = AVFileTypeQuickTimeMovie;
NSLog(#"file type %#",_assetExport.outputFileType);
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;
[_assetExport exportAsynchronouslyWithCompletionHandler:
^(void ) {
// your completion code here
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(#"Mixing complete");
});
}
];
}
OK, I found out how to rotate the video.
Here is the code:
-(void)fixVideoOrientationForURL:(NSURL *)videoURL andOutputPath:(NSString *)exportPath{
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:@{ AVURLAssetPreferPreciseDurationAndTimingKey: @YES }];
AVMutableVideoCompositionInstruction *instruction = nil;
AVMutableVideoCompositionLayerInstruction *layerInstruction = nil;
CGAffineTransform transform;
AVAssetTrack *assetVideoTrack = nil;
AVAssetTrack *assetAudioTrack = nil;
// Check if the asset contains video and audio tracks
if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
}
if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
}
CMTime insertionPoint = kCMTimeZero;
NSError *error = nil;
// Step 1
// Create a composition with the given asset and insert audio and video tracks into it from the asset
AVMutableComposition *mutableComposition = [AVMutableComposition composition];
// Insert the video and audio tracks from AVAsset
if (assetVideoTrack != nil) {
AVMutableCompositionTrack *compositionVideoTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error];
}
if (assetAudioTrack != nil) {
AVMutableCompositionTrack *compositionAudioTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error];
}
// Step 2
// Flip the video vertically and translate it back into frame (the recorded EAGLView content is upside down, and the flip alone would move it out of frame)
transform = CGAffineTransformMake(1, 0, 0, -1, 0, assetVideoTrack.naturalSize.height);
// Step 3
// Set the appropriate render sizes and rotational transforms
// Create a new video composition
AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.width,assetVideoTrack.naturalSize.height);
mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
// The rotate transform is set on a layer instruction
instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mutableComposition duration]);
layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:(mutableComposition.tracks)[0]];
[layerInstruction setTransform:transform atTime:kCMTimeZero];
// Step 4
// Add the transform instructions to the video composition
instruction.layerInstructions = @[layerInstruction];
mutableVideoComposition.instructions = @[instruction];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:[mutableComposition copy] presetName:AVAssetExportPresetHighestQuality];
exportSession.videoComposition = mutableVideoComposition;
exportSession.outputURL = [NSURL fileURLWithPath:exportPath];
exportSession.outputFileType=AVFileTypeQuickTimeMovie;
exportSession.shouldOptimizeForNetworkUse = YES;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void){
dispatch_async(dispatch_get_main_queue(), ^{
switch (exportSession.status) {
case AVAssetExportSessionStatusCompleted:
NSLog(#"writing complete");
break;
case AVAssetExportSessionStatusFailed:
NSLog(#"Failed:%#",exportSession.error);
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Canceled:%#",exportSession.error);
break;
default:
break;
}
});
}];
}

How to use scaleTimeRange in an AVMutableComposition?

I have a question about scaling the length of video in an AVMutableComposition to speed it up or slow it down. I know the code to do it is
scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration
The problem is that I don't know how to use that code. Below is what I currently have for my composition, which has a video track that's been recorded and an audio track that's been supplied.
Can someone show me how to add this in to get my video track to be sped up or slowed down?
Here's my code :
AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audioURL options:nil];
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:videoURL options:nil];
AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetPassthrough];
NSString* videoName = @"export.mov";
NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
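For what it's worth, here is a minimal sketch of how scaleTimeRange:toDuration: might be applied to the two composition tracks above, before the export session runs; a factor of 0.5 doubles the speed, while a factor greater than 1 would slow the video down:
// Scale the full range of both tracks to half the original duration (2x speed).
CMTime originalDuration = mixComposition.duration;
CMTime scaledDuration = CMTimeMultiplyByFloat64(originalDuration, 0.5);
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, originalDuration) toDuration:scaledDuration];
[compositionCommentaryTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, originalDuration) toDuration:scaledDuration];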

play and save video file

I have recorded a video and saved it to the Documents directory.
Now I want to edit the video to play at slow, fast, or normal speed.
I can do fast and normal speed, but I can't get slow speed to work; it always comes out at normal speed.
Here is my code:
AVMutableComposition *composition = [AVMutableComposition composition];
AVURLAsset * sourceAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:inputVideoPath] options:nil];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
BOOL ok = NO;
AVAssetTrack * sourceVideoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CMTimeRange x = CMTimeRangeMake(kCMTimeZero, [sourceAsset duration]);
ok = [compositionVideoTrack insertTimeRange:x ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];
CMTime fiveSecondsIn;
if([Cell.text isEqualToString:@"Fast"]){
fiveSecondsIn = CMTimeMake(10,1);
}
else if([Cell.text isEqualToString:@"Normal"]){
fiveSecondsIn = CMTimeMake(1,1);
}
else if([Cell.textLabel.text isEqualToString:@"Slow"]){
fiveSecondsIn = CMTimeMake(1,.5);
}
CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,sourceAsset.duration);
[compositionVideoTrack scaleTimeRange:video_timeRange toDuration:fiveSecondsIn];
if([[NSFileManager defaultManager] fileExistsAtPath:filePath]){
[[NSFileManager defaultManager] removeItemAtPath:filePath error:nil];
}
NSURL *exportUrl = [NSURL fileURLWithPath:filePath];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc]initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=exportUrl;
exporter.outputFileType = @"com.apple.quicktime-movie";
[exporter exportAsynchronouslyWithCompletionHandler:^(void){
[self performSelectorOnMainThread:@selector(playnewVideo) withObject:nil waitUntilDone:NO];
}];
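For reference, one sketch of getting slow motion is to scale the composition to a longer duration rather than a fixed short one (note that CMTimeMake takes integer arguments, so a fractional timescale such as .5 is truncated):
// Stretch the whole composition to twice its original length (0.5x speed).
CMTime originalDuration = sourceAsset.duration;
CMTime slowDuration = CMTimeMultiplyByFloat64(originalDuration, 2.0);
[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, originalDuration) toDuration:slowDuration];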

Switch audio tracks for an AVURLAsset with multiple AVAssetTracks of type audio

I have an AVURLAsset with multiple AVAssetTracks of type audio. I would like to allow the user to switch between these different audio tracks by tapping a button. Turning the volume of the first track on and off works, but the other tracks are not heard when their volume is set to 1.0.
Here is the code for adjusting the volume of the tracks (sender is a UIButton whose tag is set to the index of the track in audioTracks).
AVURLAsset *asset = (AVURLAsset*)[[player currentItem] asset];
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
NSMutableArray *allAudioParams = [NSMutableArray array];
int i = 0;
NSLog(#"%#", audioTracks);
for (AVAssetTrack *track in audioTracks) {
AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
float volume = i == sender.tag ? 1.0 : 0.0;
[audioInputParams setVolume:volume atTime:kCMTimeZero];
[audioInputParams setTrackID:[track trackID]];
[allAudioParams addObject:audioInputParams];
i++;
}
AVMutableAudioMix *audioZeroMix = [AVMutableAudioMix audioMix];
[audioZeroMix setInputParameters:allAudioParams];
[[player currentItem] setAudioMix:audioZeroMix];
Do I need to do something to make the desired track the active one?
OK, found the issue. It was not related to the code above, which works fine. The problem was that the audio AVAssetTracks other than the first track were not enabled. To enable them, I had to recreate the asset using an AVMutableComposition:
NSURL *fileURL = [[NSBundle mainBundle]
URLForResource:@"movie" withExtension:@"mp4"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
NSError* error = NULL;
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,asset.duration)
ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo]objectAtIndex:0]
atTime:kCMTimeZero
error:&error];
NSArray *allAudio = [asset tracksWithMediaType:AVMediaTypeAudio];
for (int i=0; i < [allAudio count]; i++) {
NSError* error = NULL;
AVAssetTrack *audioAsset = (AVAssetTrack*)[allAudio objectAtIndex:i];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,asset.duration)
ofTrack:audioAsset
atTime:kCMTimeZero
error:&error];
NSLog(#"Error : %#", error);
}
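A sketch of what might follow, assuming the same player object as in the first snippet: the new composition replaces the player's current item, after which the volume-switching code above works for every track.
// Play the composition instead of the original asset so all audio tracks are enabled.
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
[player replaceCurrentItemWithPlayerItem:playerItem];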

IPhone avcomposition issue

I'm trying to create a video that shows two videos one after the other, using an AVComposition on the iPhone. This code works; however, I can only see one of the videos for the entire duration of the newly created video.
- (void) startEdit{
AVMutableComposition* mixComposition = [AVMutableComposition composition];
NSString* a_inputFileName = @"export.mov";
NSString* a_inputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:a_inputFileName];
NSURL* a_inputFileUrl = [NSURL fileURLWithPath:a_inputFilePath];
NSString* b_inputFileName = @"output.mov";
NSString* b_inputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:b_inputFileName];
NSURL* b_inputFileUrl = [NSURL fileURLWithPath:b_inputFilePath];
NSString* outputFileName = @"outputFile.mov";
NSString* outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:outputFileName];
NSURL* outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
[[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
CMTime nextClipStartTime = kCMTimeZero;
AVURLAsset* a_videoAsset = [[AVURLAsset alloc]initWithURL:a_inputFileUrl options:nil];
CMTimeRange a_timeRange = CMTimeRangeMake(kCMTimeZero,a_videoAsset.duration);
AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[a_compositionVideoTrack insertTimeRange:a_timeRange ofTrack:[[a_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);
AVURLAsset* b_videoAsset = [[AVURLAsset alloc]initWithURL:b_inputFileUrl options:nil];
CMTimeRange b_timeRange = CMTimeRangeMake(kCMTimeZero, b_videoAsset.duration);
AVMutableCompositionTrack *b_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[b_compositionVideoTrack insertTimeRange:b_timeRange ofTrack:[[b_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetLowQuality];
_assetExport.outputFileType = @"com.apple.quicktime-movie";
_assetExport.outputURL = outputFileUrl;
[_assetExport exportAsynchronouslyWithCompletionHandler:
^(void ) {
[self saveVideoToAlbum:outputFilePath];
}
];
}
- (void) saveVideoToAlbum:(NSString*)path{
if(UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)){
UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
}
}
- (void) video: (NSString *) videoPath didFinishSavingWithError: (NSError *) error contextInfo: (void *) contextInfo {
NSLog(#"Finished saving video with error: %#", error);
}
I've posted the whole code as it may help someone else.
Shouldn't
nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);
[b_compositionVideoTrack insertTimeRange:b_timeRange ofTrack:[[b_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
add the second video to the end of the first?
Cheers
Figured it out. Should have only had one AVMutableCompositionTrack.
Like so:
CMTime nextClipStartTime = kCMTimeZero;
AVURLAsset* a_videoAsset = [[AVURLAsset alloc]initWithURL:a_inputFileUrl options:nil];
CMTimeRange a_timeRange = CMTimeRangeMake(kCMTimeZero,a_videoAsset.duration);
AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[a_compositionVideoTrack insertTimeRange:a_timeRange ofTrack:[[a_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);
AVURLAsset* b_videoAsset = [[AVURLAsset alloc]initWithURL:b_inputFileUrl options:nil];
CMTimeRange b_timeRange = CMTimeRangeMake(kCMTimeZero, b_videoAsset.duration);
[a_compositionVideoTrack insertTimeRange:b_timeRange ofTrack:[[b_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
I haven't spotted the flaw yet, but I do have a couple of suggestions:
First, capture the error returned by insertTimeRange (and by every other call that takes an error) and inspect it.
Second, for the simple case of appending videos, you can use AVMutableComposition without so much track mucking. Use "insertTimeRange:ofAsset:atTime:error:" with AVAssets initialized from the files and you will simplify your code greatly. If you need to do something more complicated such as crossfades, you'll need to use a video composition and audio mix as well, and at that point you can deal with the complexity of tracks.
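A rough sketch of that simpler approach, reusing the a_videoAsset and b_videoAsset names from the question:
// Append whole assets back to back; the composition creates the necessary tracks itself.
NSError *error = nil;
AVMutableComposition *mixComposition = [AVMutableComposition composition];
[mixComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, a_videoAsset.duration) ofAsset:a_videoAsset atTime:kCMTimeZero error:&error];
[mixComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, b_videoAsset.duration) ofAsset:b_videoAsset atTime:mixComposition.duration error:&error];
if (error) NSLog(@"insertTimeRange failed: %@", error);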