AVErrorInvalidVideoComposition = -11841 - iPhone

I am merging multiple videos and multiple songs, and I cannot figure out what is wrong in the code, because the same code was running absolutely fine yesterday but today I'm getting the following response:
AVAssetExportSessionStatus = 4,error = Error Domain=AVFoundationErrorDomain Code=-11841 "The operation couldn’t be completed. (AVFoundationErrorDomain error -11841.)"
I did some research and found that the export is failing because of an invalid video composition. Please help me find out what is wrong with the video composition.
- (void)mergeAllselectedVideos
{
    NSArray *pathArray = [DocumentDirectory getUrlFromDocumentDirectoryOfList:self.selectedClipsArray];
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    NSMutableArray *layerinstructions = [[NSMutableArray alloc] init];
    CMTime time = kCMTimeZero;
    CMTime previousSongDuration = kCMTimeZero;
    for (int i = 0; i < pathArray.count; i++)
    {
        //VIDEO TRACK//
        time = CMTimeAdd(time, previousSongDuration);
        NSURL *url = [NSURL URLWithString:[pathArray objectAtIndex:i]];
        AVAsset *avAsset = [AVAsset assetWithURL:url];
        AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:[[avAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:time error:nil];
        previousSongDuration = avAsset.duration;
    }
    CMTime audioTime = kCMTimeZero;
    for (int i = 0; i < self.selectedSongsArray.count; i++)
    {
        MPMediaItem *songItem = [self.selectedSongsArray objectAtIndex:i];
        NSURL *songURL = [songItem valueForProperty:MPMediaItemPropertyAssetURL];
        AVAsset *audioAsset = [AVAsset assetWithURL:songURL];
        AVMutableCompositionTrack *AudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        CMTimeRange timeRange = CMTimeRangeMake(audioTime, audioAsset.duration);
        if (CMTimeGetSeconds(CMTimeAdd(audioTime, audioAsset.duration)) > CMTimeGetSeconds(time))
        {
            timeRange = CMTimeRangeMake(audioTime, CMTimeSubtract(time, audioTime));
        }
        [AudioTrack insertTimeRange:timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
        audioTime = CMTimeAdd(audioTime, audioAsset.duration);
    }
    AVMutableVideoCompositionInstruction *MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time);
    MainInstruction.layerInstructions = layerinstructions;
    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
    MainCompositionInst.frameDuration = CMTimeMake(1, 30);
    MainCompositionInst.renderSize = CGSizeMake(320.0, 480.0);
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    movieName = [CoreDataFunctions getNameForMovieForDate:[CalendarFunctions getCurrentDateString]];
    self.moviePlayButton.titleLabel.text = movieName;
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:movieName];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.videoComposition = MainCompositionInst;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self exportDidFinish:exporter];
        });
    }];
}
- (void)exportDidFinish:(AVAssetExportSession *)session
{
    //Printing error
    NSLog(@"AVAssetExportSessionStatus = %i, error = %@", session.status, session.error);
}

I found your question while having the same problem. My theory is that not all the properties of the video composition are set at export time, so it's crapping out. Here's the stanza I am now using, which results in an error-free export:
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1,30);
videoComposition.renderScale = 1.0;
videoComposition.renderSize = CGSizeMake(352.0, 288.0);
instruction.layerInstructions = [NSArray arrayWithObject: layerInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
videoComposition.instructions = [NSArray arrayWithObject: instruction];
In my case, I was missing the timeRange property on the instruction. Check your own properties to ensure they're getting the correct values. Good luck! This stuff is hard.
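As a further debugging aid, an AVVideoComposition can validate itself against an asset before you ever hit export (available since iOS 5). A minimal sketch, reusing the variable names from the question's code; checking the whole composition's duration is an assumption here:
    BOOL valid = [MainCompositionInst isValidForAsset:mixComposition
                                            timeRange:CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
                                   validationDelegate:nil];
    if (!valid) NSLog(@"Video composition is invalid for this asset");
Passing nil for the delegate just gives you a YES/NO; implement AVVideoCompositionValidationHandling if you want to be told which instruction or time range is the problem.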

If you happen to be passing more than one main instruction to the instructions array of AVMutableVideoComposition, make sure that the time ranges do not overlap or it will cause this error.
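If you build your instructions in a loop, the safest pattern is to let each instruction start exactly where the previous one ended. A rough sketch, assuming a clips array of AVAssets and a layerInstructionFor: helper of your own (both are placeholders, not from the code above):
    NSMutableArray *instructions = [NSMutableArray array];
    CMTime cursor = kCMTimeZero;
    for (AVAsset *clip in clips) {
        AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        inst.timeRange = CMTimeRangeMake(cursor, clip.duration); // starts exactly where the last range ended
        inst.layerInstructions = [NSArray arrayWithObject:[self layerInstructionFor:clip]];
        [instructions addObject:inst];
        cursor = CMTimeAdd(cursor, clip.duration);
    }
    videoComposition.instructions = instructions;
The ranges must also be contiguous from time zero, with no gaps, or the composition is likewise invalid.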

You need to set opacity for the first LayerInstruction, e.g.:
[FirstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];

Related

How to merge Audio and video?

I'm merging video with video, and audio with video. Merging video with video works fine, but when I merge an audio file the result shows a black screen. I don't know what I'm doing wrong in this code:
-(void)mergeAllMediaAtTime:(NSMutableArray*)startTimeArray {
    NSURL *firstURL = [NSURL fileURLWithPath:[urlArray objectAtIndex:counter]];
    firstAsset = [AVAsset assetWithURL:firstURL];
    NSString *videoDirPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Crop Videos"];
    NSString *fileName = [VideoAndAudioNameArray objectAtIndex:counter];
    NSString *pSecondVideoPath = [videoDirPath stringByAppendingPathComponent:fileName];
    NSURL *secondURL = [NSURL fileURLWithPath:pSecondVideoPath];
    secondAsset = [AVAsset assetWithURL:secondURL];
    if (firstAsset != nil && secondAsset != nil)
    {
        AVVideoComposition *originalComposition = [AVVideoComposition videoCompositionWithPropertiesOfAsset:firstAsset];
        //Create AVMutableComposition object. This object will hold our multiple AVMutableCompositionTracks.
        AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
        //VIDEO TRACK
        AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        AVAssetTrack *track = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:track atTime:kCMTimeZero error:nil];
        int time = [[startTimeArray objectAtIndex:counter] intValue];
        CMTime pTime = CMTimeMake(time, 1);
        ///////////////////////
        AVMutableCompositionTrack *secondTrack;
        if ([[fileName pathExtension] isEqualToString:@"mov"])
        {
            secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:pTime error:nil];
        }
        // If audio file
        else
        {
            secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
            [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:pTime error:nil];
            NSLog(@"Audio file's Merging");
        }
        /****** First Video *********/
        AVMutableVideoCompositionInstruction *MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);
        MainInstruction.backgroundColor = [[UIColor clearColor] CGColor];
        //FIXING ORIENTATION//
        AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
        AVAssetTrack *FirstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [FirstlayerInstruction setTransform:FirstAssetTrack.preferredTransform atTime:kCMTimeZero];
        [FirstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];
        if ([[fileName pathExtension] isEqualToString:@"mov"])
        {
            /****** Second Video *********/
            AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
            AVAssetTrack *SecondAssetTrack = [[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
            [SecondlayerInstruction setOpacity:1.0 atTime:pTime];
            UIImageOrientation SecondAssetOrientation_ = UIImageOrientationUp;
            BOOL isSecondAssetPortrait_ = NO;
            CGAffineTransform secondTransform = SecondAssetTrack.preferredTransform;
            if (secondTransform.a == 0 && secondTransform.b == 1.0 && secondTransform.c == -1.0 && secondTransform.d == 0) { SecondAssetOrientation_ = UIImageOrientationRight; isSecondAssetPortrait_ = YES; }
            if (secondTransform.a == 0 && secondTransform.b == -1.0 && secondTransform.c == 1.0 && secondTransform.d == 0) { SecondAssetOrientation_ = UIImageOrientationLeft; isSecondAssetPortrait_ = YES; }
            if (secondTransform.a == 1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == 1.0) { SecondAssetOrientation_ = UIImageOrientationUp; }
            if (secondTransform.a == -1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == -1.0) { SecondAssetOrientation_ = UIImageOrientationDown; }
            CGFloat SecondAssetScaleToFitRatioOfWidth = nRenderWidth / SecondAssetTrack.naturalSize.width;
            if (isSecondAssetPortrait_)
            {
                CGFloat SecondAssetScaleToFitRatioOfHeight = nRenderWidth / SecondAssetTrack.naturalSize.height;
                CGAffineTransform SecondAssetScaleFactor = CGAffineTransformMakeScale(SecondAssetScaleToFitRatioOfWidth, SecondAssetScaleToFitRatioOfHeight);
                [SecondlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformMakeScale(1.0f, 1.0f), SecondAssetScaleFactor) atTime:kCMTimeZero];
            }
            else
            {
                CGFloat SecondAssetScaleToFitRatioOfWidth = nRenderWidth / SecondAssetTrack.naturalSize.width;
                CGFloat SecondAssetScaleToFitRatioOfHeight = nRenderWidth / SecondAssetTrack.naturalSize.height;
                CGAffineTransform SecondAssetScaleFactor = CGAffineTransformMakeScale(SecondAssetScaleToFitRatioOfWidth, SecondAssetScaleToFitRatioOfHeight);
                //[SecondlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(SecondAssetTrack.preferredTransform, SecondAssetScaleFactor),CGAffineTransformMakeTranslation(0, 160)) atTime:firstAsset.duration];
                [SecondlayerInstruction setTransform:CGAffineTransformConcat(SecondAssetScaleFactor, CGAffineTransformMakeTranslation(1, 1)) atTime:kCMTimeZero];
            }
            [SecondlayerInstruction setOpacity:0.0 atTime:CMTimeAdd(pTime, secondAsset.duration)];
            MainInstruction.layerInstructions = [NSArray arrayWithObjects:SecondlayerInstruction, FirstlayerInstruction, nil];
        }
        AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
        MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
        MainCompositionInst.frameDuration = originalComposition.frameDuration;
        MainCompositionInst.renderScale = 1.0;
        MainCompositionInst.renderSize = CGSizeMake(nRenderWidth, nRenderHeight);
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeVideo_%d.mov", arc4random() % 1000]];
        NSURL *url = [NSURL fileURLWithPath:myPathDocs];
        [urlArray addObject:myPathDocs];
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.videoComposition = MainCompositionInst;
        exporter.shouldOptimizeForNetworkUse = YES;
        [exporter exportAsynchronouslyWithCompletionHandler:^
        {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self exportDidFinish:exporter];
            });
        }];
    }
}
In that case, your audio starts from the end of the video frames, because you insert it at pTime. Insert it at kCMTimeZero instead, as shown in the code below:
else
{
    secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    NSLog(@"Audio file's Merging");
}

How to fix video orientation issue in iOS

I am working with an app in which the user picks a video from Photos and uploads it to a server. As my server is a .NET server, the video gets rotated. I know the cause of the problem is probably the same as in the image case (see my earlier answer https://stackoverflow.com/a/10601175/1030951), so I googled and found code to fix the video orientation. I got the code from RayWenderlich.com and modified it the following way. Now my output video works fine, but the video is mute: it plays but has no audio. Kindly help me if I am missing something.
I pass the info dictionary from the -(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info method:
- (void)fix:(NSDictionary *)pobjInfoDirectory withFileName:(NSString *)pstrOutputFileName
{
    firstAsset = [AVAsset assetWithURL:[pobjInfoDirectory objectForKey:UIImagePickerControllerMediaURL]];
    if (firstAsset != nil)
    {
        //Create AVMutableComposition object. This object will hold our multiple AVMutableCompositionTracks.
        AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
        //VIDEO TRACK
        AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
        AVMutableVideoCompositionInstruction *MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);
        //FIXING ORIENTATION//
        AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
        AVAssetTrack *FirstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        UIImageOrientation FirstAssetOrientation_ = UIImageOrientationUp;
        BOOL isFirstAssetPortrait_ = NO;
        CGAffineTransform firstTransform = FirstAssetTrack.preferredTransform;
        if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0)
        {
            FirstAssetOrientation_ = UIImageOrientationRight; isFirstAssetPortrait_ = YES;
        }
        if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0)
        {
            FirstAssetOrientation_ = UIImageOrientationLeft; isFirstAssetPortrait_ = YES;
        }
        if (firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0)
        {
            FirstAssetOrientation_ = UIImageOrientationUp;
        }
        if (firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0)
        {
            FirstAssetOrientation_ = UIImageOrientationDown;
        }
        CGFloat FirstAssetScaleToFitRatio = 320.0 / FirstAssetTrack.naturalSize.width;
        if (isFirstAssetPortrait_)
        {
            FirstAssetScaleToFitRatio = 320.0 / FirstAssetTrack.naturalSize.height;
            CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio, FirstAssetScaleToFitRatio);
            [FirstlayerInstruction setTransform:CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor) atTime:kCMTimeZero];
        }
        else
        {
            CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio, FirstAssetScaleToFitRatio);
            [FirstlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor), CGAffineTransformMakeTranslation(0, 160)) atTime:kCMTimeZero];
        }
        [FirstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];
        MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction, nil];
        AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
        MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
        MainCompositionInst.frameDuration = CMTimeMake(1, 30);
        MainCompositionInst.renderSize = CGSizeMake(320.0, 480.0);
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeVideo-%d.mov", arc4random() % 1000]];
        NSURL *url = [NSURL fileURLWithPath:myPathDocs];
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.videoComposition = MainCompositionInst;
        exporter.shouldOptimizeForNetworkUse = YES;
        [exporter exportAsynchronouslyWithCompletionHandler:^
        {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self exportDidFinish:exporter];
            });
        }];
    }
}
- (void)exportDidFinish:(AVAssetExportSession *)session
{
    if (session.status == AVAssetExportSessionStatusCompleted)
    {
        NSURL *outputURL = session.outputURL;
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL])
        {
            if ([self.delegate respondsToSelector:@selector(videoExported)])
                [self.delegate videoExported];
        }
    }
    firstAsset = nil;
}
Add this after the //VIDEO TRACK part
//AUDIO TRACK
AVMutableCompositionTrack *firstAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[firstAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:yourVideoURL options:nil]; // yourVideoURL: "your video url here..."
Add this after the AVMutableCompositionTrack is created. Use setPreferredTransform: to give the composition track the same orientation as the source video you want to export:
// Grab the source track from the AVURLAsset, for example.
AVAssetTrack *assetVideoTrack = [videoAsset tracksWithMediaType:AVMediaTypeVideo].lastObject;
// Grab the composition video track from the AVMutableComposition you already made.
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition tracksWithMediaType:AVMediaTypeVideo].lastObject;
// Apply the original transform.
if (assetVideoTrack && compositionVideoTrack) {
    [compositionVideoTrack setPreferredTransform:assetVideoTrack.preferredTransform];
}

Save audio with fade in fade out with setVolumeRampFromStartVolume not working in iOS

I am trying to cut an audio file for an iPhone project. I can cut it and save it, but any fade in / fade out that I try to apply doesn't work: the audio file is saved cut, but not faded.
I am using the following code:
//
// NO PROBLEMS TO SEE HERE, MOVE ON
//
NSArray *documentsFolders = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
int currentFileNum = 10;
NSURL *url = [NSURL fileURLWithPath:[[documentsFolders objectAtIndex:0] stringByAppendingPathComponent:[NSString stringWithFormat:@"%@%d.%@", AUDIO_SOURCE_FILE_NAME, currentFileNum, AUDIO_SOURCE_FILE_EXTENSION]]];
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                    forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:options];
AVAssetExportSession *exporter = [AVAssetExportSession exportSessionWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
for (NSString *filetype in exporter.supportedFileTypes) {
    if ([filetype isEqualToString:AVFileTypeAppleM4A]) {
        exporter.outputFileType = AVFileTypeAppleM4A;
        break;
    }
}
if (exporter.outputFileType == nil) {
    NSLog(@"Needed output file type not found? (%@)", AVFileTypeAppleM4A);
    //return;
}
NSString *outPath = [[documentsFolders objectAtIndex:0] stringByAppendingPathComponent:[NSString stringWithFormat:@"%@%d.%@", AUDIO_CUTTED_FILE_NAME, currentFileNum, AUDIO_SOURCE_FILE_EXTENSION]];
NSURL *const outUrl = [NSURL fileURLWithPath:outPath];
exporter.outputURL = outUrl;
float endTrimTime = CMTimeGetSeconds(asset.duration);
float startTrimTime = fminf(AUDIO_DURATION, endTrimTime);
CMTime startTrimCMTime = CMTimeSubtract(asset.duration, CMTimeMake(startTrimTime, 1));
exporter.timeRange = CMTimeRangeMake(startTrimCMTime, asset.duration);
//
// TRYING TO APPLY FADEIN FADEOUT, NOT WORKING, NO RESULTS, "CODE IGNORED"
//
AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
NSMutableArray *inputParameters = [NSMutableArray arrayWithCapacity:1];
CMTime startFadeInTime = startTrimCMTime;
CMTime endFadeInTime = CMTimeMake(startTrimTime + 1, 1);
CMTime startFadeOutTime = CMTimeMake(endTrimTime - 1, 1);
CMTime endFadeOutTime = CMTimeMake(endTrimTime, 1);
CMTimeRange fadeInTimeRange = CMTimeRangeFromTimeToTime(startFadeInTime, endFadeInTime);
CMTimeRange fadeOutTimeRange = CMTimeRangeFromTimeToTime(startFadeOutTime, endFadeOutTime);
AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParameters];
[exportAudioMixInputParameters setVolume:0.0 atTime:CMTimeMakeWithSeconds(startTrimTime - 0.01, 1)];
[exportAudioMixInputParameters setVolumeRampFromStartVolume:0.0 toEndVolume:1.0 timeRange:fadeInTimeRange];
[exportAudioMixInputParameters setVolumeRampFromStartVolume:1.0 toEndVolume:0.0 timeRange:fadeOutTimeRange];
[inputParameters insertObject:exportAudioMixInputParameters atIndex:0];
exportAudioMix.inputParameters = inputParameters;
exporter.audioMix = exportAudioMix;
[exporter exportAsynchronouslyWithCompletionHandler:^(void) {
    NSString *message;
    switch (exporter.status) {
        case AVAssetExportSessionStatusFailed:
            message = [NSString stringWithFormat:@"Export failed. Error: %@", exporter.error.description];
            [asset release];
            break;
        case AVAssetExportSessionStatusCompleted: {
            [asset release];
            [self reallyConvert:currentFileNum];
            message = [NSString stringWithFormat:@"Export completed: %@", outPath];
            break;
        }
        case AVAssetExportSessionStatusCancelled:
            message = [NSString stringWithFormat:@"Export cancelled!"];
            [asset release];
            break;
        default:
            NSLog(@"Export 4 unhandled status: %d", exporter.status);
            [asset release];
            break;
    }
}];
You need to select the track. Instead of calling:
AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParameters];
Call:
AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio]objectAtIndex:0];
AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:assetTrack];
In your existing code you can also specify the track like this:
exportAudioMixInputParameters.trackID = [[[asset tracksWithMediaType:AVMediaTypeAudio]objectAtIndex:0] trackID];
Good luck!
Here is the solution. setVolumeRampFromStartVolume didn't work for me, so I set stepped volume levels instead:
AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
//fade in
[exportAudioMixInputParameters setVolume:0.0 atTime:CMTimeMakeWithSeconds(start - 1, 1)];
[exportAudioMixInputParameters setVolume:0.1 atTime:CMTimeMakeWithSeconds(start, 1)];
[exportAudioMixInputParameters setVolume:0.5 atTime:CMTimeMakeWithSeconds(start + 1, 1)];
[exportAudioMixInputParameters setVolume:1.0 atTime:CMTimeMakeWithSeconds(start + 2, 1)];
//fade out
[exportAudioMixInputParameters setVolume:1.0 atTime:CMTimeMakeWithSeconds((start + length - 2), 1)];
[exportAudioMixInputParameters setVolume:0.5 atTime:CMTimeMakeWithSeconds((start + length - 1), 1)];
[exportAudioMixInputParameters setVolume:0.1 atTime:CMTimeMakeWithSeconds((start + length), 1)];
exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];
// configure export session output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:filePath]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A;          // output file type
exportSession.timeRange = exportTimeRange;                  // trim time range
exportSession.audioMix = exportAudioMix;                    // fade in audio mix
// perform the export
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (AVAssetExportSessionStatusCompleted == exportSession.status) {
        NSLog(@"AVAssetExportSessionStatusCompleted");
    } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
        NSLog(@"AVAssetExportSessionStatusFailed");
    } else {
        NSLog(@"Export Session Status: %d", exportSession.status);
    }
}];
I've made the same mistake as you dozens of times!
Apple's API is really easy to misuse here:
CMTimeRange fadeInTimeRange = CMTimeRangeFromTimeToTime(startFadeInTime, endFadeInTime);
CMTimeRange fadeOutTimeRange = CMTimeRangeFromTimeToTime(startFadeOutTime, endFadeOutTime);
Should be:
CMTimeRange fadeInTimeRange = CMTimeRangeMake(startFadeInTime, fadeInDuration);
CMTimeRange fadeOutTimeRange = CMTimeRangeMake(startFadeOutTime, fadeOutDuration);
A CMTimeRange is created from a start and a duration, not from a start and an end; CMTimeRangeFromTimeToTime is the variant that takes a start and an end.
But most of the time the end time is also the duration (when the start time is 0), which is why so many people (including me) make the mistake.
And no, Apple, that's not intuitive at all!
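To make the difference between the two constructors concrete, here is a small sketch with made-up times; both ranges below describe the same three seconds from t=5s to t=8s:
    CMTime start    = CMTimeMakeWithSeconds(5, 600);
    CMTime end      = CMTimeMakeWithSeconds(8, 600);
    CMTime duration = CMTimeMakeWithSeconds(3, 600);
    CMTimeRange a = CMTimeRangeMake(start, duration);      // takes (start, DURATION)
    CMTimeRange b = CMTimeRangeFromTimeToTime(start, end); // takes (start, END)
    // CMTimeRangeEqual(a, b) is true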
This is my working code, just take it and have a nice day!
+ (void)makeAudioFadeOutWithSourceURL:(NSURL *)sourceURL destinationURL:(NSURL *)destinationURL fadeOutBeginSecond:(NSInteger)beginTime fadeOutEndSecond:(NSInteger)endTime fadeOutBeginVolume:(CGFloat)beginVolume fadeOutEndVolume:(CGFloat)endVolume callback:(void (^)(BOOL))callback
{
    NSAssert(callback, @"need callback");
    NSParameterAssert(beginVolume >= 0 && beginVolume <= 1);
    NSParameterAssert(endVolume >= 0 && endVolume <= 1);
    BOOL sourceExist = [[NSFileManager defaultManager] fileExistsAtPath:sourceURL.path];
    NSAssert(sourceExist, @"source not exist");
    AVURLAsset *asset = [AVAsset assetWithURL:sourceURL];
    AVAssetExportSession *exporter = [AVAssetExportSession exportSessionWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
    exporter.outputURL = destinationURL;
    exporter.outputFileType = AVFileTypeAppleM4A;
    AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:asset.tracks.lastObject];
    [exportAudioMixInputParameters setVolumeRampFromStartVolume:beginVolume toEndVolume:endVolume timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(beginTime, 1), CMTimeSubtract(CMTimeMakeWithSeconds(endTime, 1), CMTimeMakeWithSeconds(beginTime, 1)))];
    NSArray *audioMixParameters = @[exportAudioMixInputParameters];
    exportAudioMix.inputParameters = audioMixParameters;
    exporter.audioMix = exportAudioMix;
    [exporter exportAsynchronouslyWithCompletionHandler:^(void) {
        AVAssetExportSessionStatus status = exporter.status;
        if (status != AVAssetExportSessionStatusCompleted) {
            if (callback) {
                callback(NO);
            }
        }
        else {
            if (callback) {
                callback(YES);
            }
        }
        NSError *error = exporter.error;
        NSLog(@"export done, error %@, status %ld", error, (long)status);
    }];
}
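For completeness, a hypothetical call site (the class name AudioHelper, the paths, and the fade window are placeholders; substitute whatever class you put the method on):
    NSURL *src = [NSURL fileURLWithPath:sourcePath];       // sourcePath: your input audio file
    NSURL *dst = [NSURL fileURLWithPath:destinationPath];  // destinationPath: where the .m4a should go
    [AudioHelper makeAudioFadeOutWithSourceURL:src
                                destinationURL:dst
                            fadeOutBeginSecond:10
                              fadeOutEndSecond:13
                            fadeOutBeginVolume:1.0
                              fadeOutEndVolume:0.0
                                      callback:^(BOOL success) {
                                          NSLog(@"fade-out export %@", success ? @"succeeded" : @"failed");
                                      }];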

iPhone AVErrorInvalidVideoComposition error when doing image overlay on video?

I have combined the 2 code chunks found here into one solid chunk (and verified the process with an Apple Developer Xcode tutorial file). When I run it, however, I get an error. It says:
Error Domain=AVFoundationErrorDomain Code=-11841 "The operation couldn’t be completed. (AVFoundationErrorDomain error -11841.)"
Any idea why it throws an AVErrorInvalidVideoComposition error? Thanks! (I'm new here so please let me know if you need more info.)
NSURL *videoURL = [info valueForKey:UIImagePickerControllerMediaURL];
/// UIImage into CALayer
UIImage *myImage = [UIImage imageNamed:@"Test.png"];
CALayer *aLayer = [CALayer layer];
aLayer.contents = (id)myImage.CGImage;
AVURLAsset *url = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVMutableComposition *videoComposition = [[AVMutableComposition alloc] init];
NSError *error;
NSFileManager *fileManager = [NSFileManager defaultManager];
AVMutableCompositionTrack *compositionVideoTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *clipVideoTrack = [[url tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:clipVideoTrack atTime:kCMTimeZero error:&error];
AVMutableVideoComposition *videoComp = [[AVMutableVideoComposition alloc] init];
videoComp.renderSize = CGSizeMake(640, 480);
videoComp.frameDuration = CMTimeMake(1, 30);
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoComp.renderSize.width, videoComp.renderSize.height);
videoLayer.frame = CGRectMake(0, 0, videoComp.renderSize.width, videoComp.renderSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:aLayer];
videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
/// instruction
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30));
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComp.instructions = [NSArray arrayWithObject:instruction];
/// outputs
NSString *filePath = nil;
filePath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
filePath = [filePath stringByAppendingPathComponent:@"temp.mov"];
NSLog(@"exporting to: %@", filePath);
if ([fileManager fileExistsAtPath:filePath])
{
    BOOL success = [fileManager removeItemAtPath:filePath error:&error];
    if (!success) NSLog(@"FM error: %@", [error localizedDescription]);
}
/// exporting
AVAssetExportSession *exporter;
exporter = [[AVAssetExportSession alloc] initWithAsset:videoComposition presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComp;
exporter.outputURL = [NSURL fileURLWithPath:filePath];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^(void) {
    switch (exporter.status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"exporting failed: %@", exporter.error);
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"exporting completed");
            UISaveVideoAtPathToSavedPhotosAlbum(filePath, self, @selector(video:didFinishSavingWithError:contextInfo:), NULL);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"export cancelled");
            break;
    }
}];
One common problem with invalid compositions is time.
If the various assets you are combining don't all have the same length, make sure the composition you create covers the entire segment.
Check that your AVMutableVideoCompositionInstruction's timeRange is correct.
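In the code above, for example, the instruction's range is hard-coded to 60 seconds regardless of the clip's actual length; deriving it from the composition avoids the mismatch (a sketch using the question's variable names, where videoComposition is the AVMutableComposition):
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [videoComposition duration]);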

AVVideoComposition Failure

I am stringing several different video clips together into an AVMutableComposition and attempting to correct their orientation if needed.
Here is my code:
composition = [[AVMutableComposition alloc] init];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime nextClipStartTime = kCMTimeZero;
// orientation compensation vars
AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
NSMutableArray *compInst = [[NSMutableArray alloc] init];
// get view size
CGSize viewSize = playerView.frame.size;
// generate movie assets
for (NSString *moviePath in [currentBlam valueForKey:@"movies"]) {
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    AVURLAsset *movieAsset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
    // scale asset to fit screen
    CMTimeRange tr = CMTimeRangeFromTimeToTime(CMTimeMakeWithSeconds(0.0f, 1), CMTimeMakeWithSeconds(0.0f, 1));
    // create video track
    AVAssetTrack *clipVideoTrack = [[movieAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    // create audio track
    AVAssetTrack *clipAudioTrack = [[movieAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    tr = CMTimeRangeFromTimeToTime(CMTimeMakeWithSeconds(0.0f, 1), CMTimeMakeWithSeconds(CMTimeGetSeconds([movieAsset duration]), 1));
    AVMutableVideoCompositionLayerInstruction *layerInst;
    layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
    int or = [self orientationForTrack:movieAsset];
    if (or == 1) {
        float rot = (0.0f);
        [layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
    } else if (or == 2) {
        float rot = (M_PI);
        [layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
    } else if (or == 3) {
        float rot = (M_PI * -0.5f);
        [layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
    } else if (or == 4) {
        float rot = (M_PI * 0.5f);
        [layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
    }
    [layerInst setTransform:clipVideoTrack.preferredTransform atTime:nextClipStartTime];
    [compInst addObject:layerInst];
    // insert video track
    [compositionVideoTrack insertTimeRange:tr
                                   ofTrack:clipVideoTrack
                                    atTime:nextClipStartTime
                                     error:nil];
    // insert audio track
    [compositionAudioTrack insertTimeRange:tr
                                   ofTrack:clipAudioTrack
                                    atTime:nextClipStartTime
                                     error:nil];
    nextClipStartTime = CMTimeAdd(nextClipStartTime, tr.duration);
}
// set size and duration
composition.naturalSize = viewSize;
videoComposition.frameDuration = composition.duration;
videoComposition.renderSize = viewSize;
videoComposition.renderScale = 1.0f;
// apply instructions
inst.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
inst.layerInstructions = compInst;
videoComposition.instructions = [NSArray arrayWithObject:inst];
playerItem = [[AVPlayerItem alloc] initWithAsset:composition];
playerItem.videoComposition = videoComposition;
[playerItem addObserver:self forKeyPath:@"status" options:0 context:&ItemStatusContext];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:playerItem];
player = [AVPlayer playerWithPlayerItem:playerItem];
[playerView setPlayer:player];
When I run this and populate it with content, nothing shows.
This was working previously, before the AVVideoComposition was applied to the player item. In fact, commenting out playerItem.videoComposition = videoComposition makes it work, albeit without correcting the assets' rotations.
At this point I know I'm just misunderstanding something. Can someone please point out what?
Did you consider that maybe the point the video is being rotated about is not the center, so the video is being drawn off screen? This may be totally wrong, but it's just my first thought.
I think you are setting each single frame to be the length of the entire clip:
videoComposition.frameDuration = composition.duration;
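frameDuration is the duration of one output frame, not of the whole movie, so for roughly 30 fps output it should be a thirtieth of a second:
    videoComposition.frameDuration = CMTimeMake(1, 30); // one frame lasts 1/30 s, i.e. ~30 fps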