AVVideoComposition Failure - iPhone

I am stringing several different video clips together into an AVMutableComposition and attempting to correct their orientation where needed.
Here is my code:
composition = [[AVMutableComposition alloc] init];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime nextClipStartTime = kCMTimeZero;
// orientation compensation vars
AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
NSMutableArray *compInst = [[NSMutableArray alloc] init];
// get view size
CGSize viewSize = playerView.frame.size;
// generate movie assets
for (NSString* moviePath in [currentBlam valueForKey:@"movies"]) {
NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
AVURLAsset *movieAsset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
// scale asset to fit screen
CMTimeRange tr = CMTimeRangeFromTimeToTime(CMTimeMakeWithSeconds(0.0f, 1), CMTimeMakeWithSeconds(0.0f, 1));
// create video track
AVAssetTrack *clipVideoTrack = [[movieAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
// create audio track
AVAssetTrack *clipAudioTrack = [[movieAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
tr = CMTimeRangeFromTimeToTime(CMTimeMakeWithSeconds(0.0f, 1), CMTimeMakeWithSeconds(CMTimeGetSeconds([movieAsset duration]), 1));
AVMutableVideoCompositionLayerInstruction *layerInst;
layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
int or = [self orientationForTrack:movieAsset];
if (or==1) {
float rot = (0.0f);
[layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
} else if (or==2) {
float rot = (M_PI);
[layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
} else if (or==3) {
float rot = (M_PI*-0.5f);
[layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
} else if (or==4) {
float rot = (M_PI*0.5f);
[layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
}
[layerInst setTransform:clipVideoTrack.preferredTransform atTime:nextClipStartTime];
[compInst addObject:layerInst];
// insert video track
[compositionVideoTrack insertTimeRange:tr
ofTrack:clipVideoTrack
atTime:nextClipStartTime
error:nil];
// insert audio track
[compositionAudioTrack insertTimeRange:tr
ofTrack:clipAudioTrack
atTime:nextClipStartTime
error:nil];
nextClipStartTime = CMTimeAdd(nextClipStartTime, tr.duration);
}
//set size and duration
composition.naturalSize = viewSize;
videoComposition.frameDuration = composition.duration;
videoComposition.renderSize = viewSize;
videoComposition.renderScale = 1.0f;
//apply instructions
inst.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
inst.layerInstructions = compInst;
videoComposition.instructions = [NSArray arrayWithObject:inst];
playerItem = [[AVPlayerItem alloc] initWithAsset:composition];
playerItem.videoComposition = videoComposition;
[playerItem addObserver:self forKeyPath:@"status" options:0 context:&ItemStatusContext];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:playerItem];
player = [AVPlayer playerWithPlayerItem:playerItem];
[playerView setPlayer:player];
When I run this and populate it with content, nothing shows.
This was working previously, before the AVVideoComposition was applied to the player. In fact, commenting out the playerItem.videoComposition = videoComposition line allows it to work, albeit without correcting the assets' rotations.
At this point I know I'm just misunderstanding something here. Can someone please point out what?

Did you consider that maybe the point the video is being rotated about is not the center, and the video is being drawn off screen? This may be totally wrong, but it's just my first thought.
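If so, here's a rough sketch of the usual compensation for the +90° case, reusing the layerInst and clipVideoTrack from your loop (an illustration, not a drop-in fix):
// CGAffineTransformMakeRotation rotates about the origin (top-left corner),
// so a rotated frame can land entirely outside the render rectangle.
// Concatenating a translation moves it back on screen.
CGSize natural = clipVideoTrack.naturalSize;
CGAffineTransform rotate = CGAffineTransformMakeRotation(M_PI_2);
// After a +90 degree rotation the frame spans x in [-height, 0],
// so shift it right by the source height.
CGAffineTransform onScreen = CGAffineTransformConcat(rotate,
CGAffineTransformMakeTranslation(natural.height, 0));
[layerInst setTransform:onScreen atTime:nextClipStartTime];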

I think you are setting each single frame to be the length of the entire clip:
videoComposition.frameDuration = composition.duration;
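The frame duration should be the length of a single frame, not of the whole composition; for 30 fps, for example:
// One frame lasts 1/30 of a second at 30 fps
videoComposition.frameDuration = CMTimeMake(1, 30);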

Related

How to merge Audio and video?

I'm merging video with video and audio with video. It works fine in the video-with-video case, but when an audio file is merged I get a black screen. I don't know what I'm doing wrong in this code:
-(void)mergeAllMediaAtTime:(NSMutableArray*)startTimeArray {
NSURL *firstURL = [NSURL fileURLWithPath:[urlArray objectAtIndex:counter]];
firstAsset = [AVAsset assetWithURL:firstURL];
NSString* videoDirPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Crop Videos"];
NSString* fileName = [VideoAndAudioNameArray objectAtIndex:counter];
NSString *pSecondVideoPath = [videoDirPath stringByAppendingPathComponent:fileName];
NSURL *secondURL = [NSURL fileURLWithPath:pSecondVideoPath];
secondAsset = [AVAsset assetWithURL:secondURL];
if(firstAsset !=nil && secondAsset!=nil)
{
AVVideoComposition *origionalComposition = [AVVideoComposition videoCompositionWithPropertiesOfAsset:firstAsset];
//Create AVMutableComposition Object.This object will hold our multiple AVMutableCompositionTrack.
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
//VIDEO TRACK
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack* track = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:track atTime:kCMTimeZero error:nil];
int time = [[startTimeArray objectAtIndex:counter] intValue];
CMTime pTime = CMTimeMake(time, 1);
///////////////////////
AVMutableCompositionTrack *secondTrack;
if ([[fileName pathExtension] isEqualToString:@"mov"])
{
secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:pTime error:nil];
}
// If Audio file
else
{
secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:pTime error:nil];
NSLog(#"Audio file's Merging");
}
/****** First Video *********/
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);
MainInstruction.backgroundColor = [[UIColor clearColor] CGColor];
//FIXING ORIENTATION//
AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
AVAssetTrack *FirstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[FirstlayerInstruction setTransform:FirstAssetTrack.preferredTransform atTime:kCMTimeZero];
[FirstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];
if ([[fileName pathExtension] isEqualToString:@"mov"])
{
/****** Second Video *********/
AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
AVAssetTrack *SecondAssetTrack = [[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[SecondlayerInstruction setOpacity:1.0 atTime:pTime];
UIImageOrientation SecondAssetOrientation_ = UIImageOrientationUp;
BOOL isSecondAssetPortrait_ = NO;
CGAffineTransform secondTransform = SecondAssetTrack.preferredTransform;
if(secondTransform.a == 0 && secondTransform.b == 1.0 && secondTransform.c == -1.0 && secondTransform.d == 0) {SecondAssetOrientation_= UIImageOrientationRight; isSecondAssetPortrait_ = YES;}
if(secondTransform.a == 0 && secondTransform.b == -1.0 && secondTransform.c == 1.0 && secondTransform.d == 0) {SecondAssetOrientation_ = UIImageOrientationLeft; isSecondAssetPortrait_ = YES;}
if(secondTransform.a == 1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == 1.0) {SecondAssetOrientation_ = UIImageOrientationUp;}
if(secondTransform.a == -1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == -1.0) {SecondAssetOrientation_ = UIImageOrientationDown;}
CGFloat SecondAssetScaleToFitRatioOfWidth = nRenderWidth/SecondAssetTrack.naturalSize.width;
if(isSecondAssetPortrait_)
{
CGFloat SecondAssetScaleToFitRatioOfHeight = nRenderWidth/SecondAssetTrack.naturalSize.height;
CGAffineTransform SecondAssetScaleFactor = CGAffineTransformMakeScale(SecondAssetScaleToFitRatioOfWidth,SecondAssetScaleToFitRatioOfHeight);
[SecondlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformMakeScale(1.0f,1.0f), SecondAssetScaleFactor) atTime:kCMTimeZero];
//CGAffineTransformConcat(CGAffineTransformMakeScale(1.0f,1.0f), SecondAssetScaleFactor)
}
else
{
CGFloat SecondAssetScaleToFitRatioOfWidth = nRenderWidth/SecondAssetTrack.naturalSize.width;
CGFloat SecondAssetScaleToFitRatioOfHeight = nRenderWidth/SecondAssetTrack.naturalSize.height;
CGAffineTransform SecondAssetScaleFactor = CGAffineTransformMakeScale(SecondAssetScaleToFitRatioOfWidth,SecondAssetScaleToFitRatioOfHeight);
//[SecondlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(SecondAssetTrack.preferredTransform, SecondAssetScaleFactor),CGAffineTransformMakeTranslation(0, 160)) atTime:firstAsset.duration];
[SecondlayerInstruction setTransform:CGAffineTransformConcat(SecondAssetScaleFactor ,CGAffineTransformMakeTranslation(1, 1)) atTime:kCMTimeZero];
//CGAffineTransformConcat(CGAffineTransformMakeScale(1.0f,1.0f),CGAffineTransformMakeTranslation(0, 100))
}
[SecondlayerInstruction setOpacity:0.0 atTime:CMTimeAdd(pTime, secondAsset.duration)];
MainInstruction.layerInstructions = [NSArray arrayWithObjects:SecondlayerInstruction, FirstlayerInstruction,nil];
}
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = origionalComposition.frameDuration;
MainCompositionInst.renderScale = 1.0;
MainCompositionInst.renderSize = CGSizeMake(nRenderWidth, nRenderHeight);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeVideo_%d.mov",arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
[urlArray addObject:myPathDocs];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = MainCompositionInst;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
[self exportDidFinish:exporter];
});
}];
}
}
In your code, the audio starts at the end of the video frames. You can use "atTime:kCMTimeZero" instead, as shown in the code below:
else
{
secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
NSLog(#"Audio file's Merging");
}

How to fix video orientation issue in iOS

I am working on an app in which the user picks a video from Photos and uploads it to a server. As my server is a .Net server, the video gets rotated. I know the reason for the problem is probably the same as in the image case (see my earlier answer: https://stackoverflow.com/a/10601175/1030951), so I googled for code to fix video orientation, found some on RayWenderlich.com, and modified it in the following way. Now my output video works fine, but it is mute: it plays, but with no audio. Kindly help me if I am missing something.
I pass the info dictionary from the -(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info method:
- (void)fix:(NSDictionary*)pobjInfoDirectory withFileName:(NSString*)pstrOutputFileName
{
firstAsset = [AVAsset assetWithURL:[pobjInfoDirectory objectForKey:UIImagePickerControllerMediaURL]];
if(firstAsset !=nil)
{
//Create AVMutableComposition Object.This object will hold our multiple AVMutableCompositionTrack.
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
//VIDEO TRACK
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);
//FIXING ORIENTATION//
AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
AVAssetTrack *FirstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation FirstAssetOrientation_ = UIImageOrientationUp;
BOOL isFirstAssetPortrait_ = NO;
CGAffineTransform firstTransform = FirstAssetTrack.preferredTransform;
if(firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0)
{
FirstAssetOrientation_= UIImageOrientationRight; isFirstAssetPortrait_ = YES;
}
if(firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0)
{
FirstAssetOrientation_ = UIImageOrientationLeft; isFirstAssetPortrait_ = YES;
}
if(firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0)
{
FirstAssetOrientation_ = UIImageOrientationUp;
}
if(firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0)
{
FirstAssetOrientation_ = UIImageOrientationDown;
}
CGFloat FirstAssetScaleToFitRatio = 320.0/FirstAssetTrack.naturalSize.width;
if(isFirstAssetPortrait_)
{
FirstAssetScaleToFitRatio = 320.0/FirstAssetTrack.naturalSize.height;
CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
[FirstlayerInstruction setTransform:CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor) atTime:kCMTimeZero];
}
else
{
CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
[FirstlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor),CGAffineTransformMakeTranslation(0, 160)) atTime:kCMTimeZero];
}
[FirstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];
MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction,nil];
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = CMTimeMake(1, 30);
MainCompositionInst.renderSize = CGSizeMake(320.0, 480.0);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeVideo-%d.mov",arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = MainCompositionInst;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^
{
dispatch_async(dispatch_get_main_queue(), ^{
[self exportDidFinish:exporter];
});
}];
}
}
- (void)exportDidFinish:(AVAssetExportSession*)session
{
if(session.status == AVAssetExportSessionStatusCompleted)
{
NSURL *outputURL = session.outputURL;
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL])
{
if ([self.delegate respondsToSelector:@selector(videoExported)])
[self.delegate videoExported];
}
}
firstAsset = nil;
}
Add this after the //VIDEO TRACK part
//AUDIO TRACK
AVMutableCompositionTrack *firstAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[firstAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVURLAsset* videoAsset = [[AVURLAsset alloc] initWithURL:yourVideoURL options:nil]; // yourVideoURL: the NSURL of your video here...
Add this after creating the AVMutableCompositionTrack. Use setPreferredTransform: to give the composition track the same orientation as the source video you want to export.
// Grab the source track from AVURLAsset for example.
AVAssetTrack *assetVideoTrack = [videoAsset tracksWithMediaType:AVMediaTypeVideo].lastObject;
// Grab the composition video track from AVMutableComposition you already made.
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition tracksWithMediaType:AVMediaTypeVideo].lastObject;
// Apply the original transform.
if (assetVideoTrack && compositionVideoTrack) {
[compositionVideoTrack setPreferredTransform:assetVideoTrack.preferredTransform];
}

Transform not working in AVMutableVideoComposition while exporting

My goal is to compose a set of clips recorded from the camera and export them at a certain preferred size. Of course, the video orientation needs to be rotated before exporting.
I'm doing this by composing an AVMutableComposition from an array of video clips, stored in avAssets below. I am able to compose them fine, and export it. However, the rotation transform I am setting on the AVMutableVideoComposition is not being honored. If I use the same transform and set it on the preferredTransform property of the video track, then it works. In both cases, the video renderSize is not being honored. It's as if the exporter ignores the videoComposition completely. Any ideas what could be happening?
I do have an AVCaptureSession running, but I turned it off before exporting and that didn't make any difference. I am fairly new to iOS programming, so it could be I'm missing something basic. :)
My code:
-(void) finalRecord{
NSError *error = nil;
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
NSLog(#"Video track id is %d", [compositionVideoTrack trackID]);
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
// avAssets hold the video clips to be composited
int pieces = [avAssets count];
CGAffineTransform transform = CGAffineTransformMakeRotation( M_PI_2);
// [compositionVideoTrack setPreferredTransform:transform];
for (int i = 0; i<pieces; i++) {
AVURLAsset *sourceAsset = [avAssets objectAtIndex:i];
AVAssetTrack *sourceVideoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *sourceAudioTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[timeRanges addObject:[NSValue valueWithCMTimeRange:CMTimeRangeMake(kCMTimeZero, sourceAsset.duration)]];
[videoTracks addObject:sourceVideoTrack];
[audioTracks addObject:sourceAudioTrack];
}
[compositionVideoTrack insertTimeRanges:timeRanges ofTracks:videoTracks atTime:kCMTimeZero error:&error];
[compositionAudioTrack insertTimeRanges:timeRanges ofTracks:audioTracks atTime:kCMTimeZero error:&error];
AVMutableVideoCompositionInstruction *vtemp = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
vtemp.timeRange = CMTimeRangeMake(kCMTimeZero, [composition duration]);
NSLog(#"\nInstruction vtemp's time range is %f %f", CMTimeGetSeconds( vtemp.timeRange.start),
CMTimeGetSeconds(vtemp.timeRange.duration));
// Also tried videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack
AVMutableVideoCompositionLayerInstruction *vLayerInstruction = [AVMutableVideoCompositionLayerInstruction
videoCompositionLayerInstructionWithAssetTrack:composition.tracks[0]];
[vLayerInstruction setTransform:transform atTime:kCMTimeZero];
vtemp.layerInstructions = @[vLayerInstruction];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(320.0, 240.0);
videoComposition.frameDuration = CMTimeMake(1,30);
videoComposition.instructions = @[vtemp];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:gVideoExportQuality];
NSParameterAssert(exporter != nil);
exporter.videoComposition = videoComposition;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
NSString *rootName = [[self captureManager] tempFileRoot];
NSString *temp = [NSString stringWithFormat:@"%@%@.mov", NSTemporaryDirectory(), rootName];
exporter.outputURL = [NSURL fileURLWithPath:temp ];
[exporter exportAsynchronouslyWithCompletionHandler:^{
switch ([exporter status]) {
case AVAssetExportSessionStatusFailed:
NSLog(#"Export failed: %#", [exporter error]);
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Export canceled");
break;
case AVAssetExportSessionStatusCompleted:
NSLog(#"Export successfully");
[self exportFile:exporter.outputURL];
[self.delegate recordingEndedWithFile:exporter.outputURL];
isExporting = FALSE;
[[[self captureManager] session] startRunning];
break;
default:
break;
}
if (exporter.status != AVAssetExportSessionStatusCompleted){
NSLog(#"Retry export");
}
}];
}
OK, figured it out, and posting here to help other people not waste the time that I did.
The issue is that if you use AVAssetExportPresetPassthrough on an AVAssetExportSession, the exporter will ignore the video composition instructions. I expected it to at least honor the video composition instructions while passing the format through, but apparently that isn't how it works. After I filed a documentation bug, Apple covered this in a Technical Q&A.
The solution if you want to use AVAssetExportPresetPassthrough:
compositionVideoTrack.preferredTransform = transform;
more info here: https://developer.apple.com/library/archive/qa/qa1744/_index.html
In short: if you specify the AVAssetExportPresetPassthrough export option to let all tracks pass through but still want to apply a transform to the composition, set the preferredTransform property on the composition tracks as described above.
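To make the two paths concrete, here's a minimal sketch using the composition, compositionVideoTrack, transform, and videoComposition from the code above:
// Path 1: passthrough export. The videoComposition is ignored, so bake the
// rotation into the composition track's preferredTransform instead.
compositionVideoTrack.preferredTransform = transform;
AVAssetExportSession *passthrough = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
passthrough.outputFileType = AVFileTypeQuickTimeMovie;
// Path 2: any re-encoding preset. The videoComposition (transform and
// renderSize) is honored, so the layer-instruction transform works.
AVAssetExportSession *reencoder = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
reencoder.videoComposition = videoComposition;
reencoder.outputFileType = AVFileTypeQuickTimeMovie;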

AVErrorInvalidVideoComposition = -11841

I am merging multiple videos and multiple songs, and I cannot see what is wrong in the code, because the same code was running absolutely fine yesterday. Today I'm getting the following response:
AVAssetExportSessionStatus = 4, error = Error Domain=AVFoundationErrorDomain Code=-11841 "The operation couldn’t be completed. (AVFoundationErrorDomain error -11841.)"
I did some research and found that the export is failing due to an invalid video composition. Please help me find out what is wrong with the video composition.
- (void)mergeAllselectedVideos
{
NSArray *pathArray = [DocumentDirectory getUrlFromDocumentDirectoryOfList:self.selectedClipsArray];
AVMutableComposition *mixComposition = [[AVMutableComposition alloc]init];
NSMutableArray *layerinstructions = [[NSMutableArray alloc]init];
CMTime time = kCMTimeZero;
CMTime previousSongDuration = kCMTimeZero;
for (int i = 0 ; i < pathArray.count; i++)
{
//VIDEO TRACK//
time = CMTimeAdd(time, previousSongDuration);
NSURL *url = [NSURL URLWithString:[pathArray objectAtIndex:i]];
AVAsset *avAsset = [AVAsset assetWithURL:url];
AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[track insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:[[avAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:time error:nil];
previousSongDuration = avAsset.duration;
}
CMTime audioTime = kCMTimeZero;
for (int i = 0; i < self.selectedSongsArray.count; i++)
{
MPMediaItem * songItem = [self.selectedSongsArray objectAtIndex:i];
NSURL *songURL = [songItem valueForProperty: MPMediaItemPropertyAssetURL];
AVAsset *audioAsset = [AVAsset assetWithURL:songURL];
AVMutableCompositionTrack *AudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
CMTimeRange timeRange = CMTimeRangeMake(audioTime, audioAsset.duration);
if(CMTimeGetSeconds(CMTimeAdd(audioTime, audioAsset.duration)) > CMTimeGetSeconds(time))
{
timeRange = CMTimeRangeMake(audioTime, CMTimeSubtract(time,audioTime));
}
[AudioTrack insertTimeRange:timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
audioTime = CMTimeAdd(audioTime, audioAsset.duration);
}
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time);
MainInstruction.layerInstructions = layerinstructions;
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = CMTimeMake(1, 30);
MainCompositionInst.renderSize = CGSizeMake(320.0, 480.0);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
movieName = [CoreDataFunctions getNameForMovieForDate:[CalendarFunctions getCurrentDateString]];
self.moviePlayButton.titleLabel.text = movieName;
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:movieName];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = MainCompositionInst;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^{dispatch_async(dispatch_get_main_queue(), ^{[self exportDidFinish:exporter];});}];
}
- (void)exportDidFinish:(AVAssetExportSession*)session
{
//Printing error
NSLog(#"AVAssetExportSessionStatus = %i,error = %#",session.status,session.error);
}
I found your question while having the same problem. My theory on this issue is that not all of the properties of the video composition are set at export time, so it's crapping out. Here's the stanza I'm now using, which results in an error-free export:
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1,30);
videoComposition.renderScale = 1.0;
videoComposition.renderSize = CGSizeMake(352.0, 288.0);
instruction.layerInstructions = [NSArray arrayWithObject: layerInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
videoComposition.instructions = [NSArray arrayWithObject: instruction];
In my case, I was missing the timeRange property on the instruction. Check your own properties to ensure they're getting the correct values. Good luck! This stuff is hard.
If you happen to be passing more than one main instruction to the instructions array of AVMutableVideoComposition, make sure that the time ranges do not overlap or it will cause this error.
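For example, two instructions should tile the timeline back to back, something like this sketch (firstAsset and secondAsset standing in for your clips):
// The second instruction starts exactly where the first one ends,
// so the time ranges neither overlap nor leave a gap.
AVMutableVideoCompositionInstruction *first = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
first.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);
AVMutableVideoCompositionInstruction *second = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
second.timeRange = CMTimeRangeMake(firstAsset.duration, secondAsset.duration);
MainCompositionInst.instructions = @[first, second];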
You need to set opacity for the first LayerInstruction, e.g.:
[FirstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];

naturalSize returning wrong orientation from AVURLAsset

I'm using the code found [here][1] to attach an image to a video taken using UIImagePickerController.
The video is in portrait and plays fine, but once I use AVURLAsset it reports landscape orientation instead of portrait, and I can't find out why.
Can anyone point me in the right direction?
My Code:
-(IBAction)addWaterMark:(id)sender {
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:[NSURL fileURLWithPath:tempPath] options:nil];
AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:clipVideoTrack
atTime:kCMTimeZero error:nil];
[compositionVideoTrack setPreferredTransform:[[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];
CGSize videoSize = [videoAsset naturalSize];
NSLog(#"%f %f",videoSize.width,videoSize.height);
}
At this point I get 480x360 instead of the correct size for tempPath.
I found a detailed tutorial describing how to edit and manipulate videos with AVFoundation here.
If they are landscape, we can use the naturalSize property we are supplied with, but if they are portrait, we must flip the naturalSize so that the width is now the height and vice versa.
I'm leaving the correct code here in case others need it, although you've accepted your own answer as the correct one.
//readout the size of video
CGFloat orgWidth = 0.0f;
CGFloat orgHeight = 0.0f;
AVAssetTrack *vT = nil;
if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0)
{
vT = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
}
if (vT != nil)
{
orgWidth = vT.naturalSize.width;
orgHeight = vT.naturalSize.height;
}
//check the orientation via the preferred transform
CGAffineTransform txf = [vT preferredTransform];
if ((orgWidth == txf.tx && orgHeight == txf.ty) || (txf.tx == 0 && txf.ty == 0))
{
//Landscape
}
else
{
//Portrait
}
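With that check you can then derive the size the video actually displays at; a sketch built on the variables above:
// Swap the reported naturalSize for portrait tracks; the condition mirrors
// the landscape check above.
BOOL isLandscape = (orgWidth == txf.tx && orgHeight == txf.ty) || (txf.tx == 0 && txf.ty == 0);
CGSize displaySize = isLandscape ? CGSizeMake(orgWidth, orgHeight) : CGSizeMake(orgHeight, orgWidth);
NSLog(@"Display size: %.0f x %.0f", displaySize.width, displaySize.height);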