Transform not working in AVMutableVideoComposition while exporting - iPhone

My goal is to compose a set of clips recorded from the camera and export them at a certain preferred size. Of course, the video orientation needs to be rotated before exporting.
I'm doing this by composing an AVMutableComposition from an array of video clips, stored in avAssets below. I am able to compose them fine, and export it. However, the rotation transform I am setting on the AVMutableVideoComposition is not being honored. If I use the same transform and set it on the preferredTransform property of the video track, then it works. In both cases, the video renderSize is not being honored. It's as if the exporter ignores the videoComposition completely. Any ideas what could be happening?
I do have an AVCaptureSession running, but I turned it off before exporting and that didn't make any difference. I am fairly new to iOS programming, so it could be I'm missing something basic. :)
My code:
- (void)finalRecord {
    NSError *error = nil;
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    NSLog(@"Video track id is %d", [compositionVideoTrack trackID]);
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    // avAssets holds the video clips to be composited
    int pieces = [avAssets count];
    CGAffineTransform transform = CGAffineTransformMakeRotation(M_PI_2);
    // [compositionVideoTrack setPreferredTransform:transform];

    // timeRanges, videoTracks and audioTracks are assumed to be NSMutableArray ivars
    for (int i = 0; i < pieces; i++) {
        AVURLAsset *sourceAsset = [avAssets objectAtIndex:i];
        AVAssetTrack *sourceVideoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *sourceAudioTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [timeRanges addObject:[NSValue valueWithCMTimeRange:CMTimeRangeMake(kCMTimeZero, sourceAsset.duration)]];
        [videoTracks addObject:sourceVideoTrack];
        [audioTracks addObject:sourceAudioTrack];
    }
    [compositionVideoTrack insertTimeRanges:timeRanges ofTracks:videoTracks atTime:kCMTimeZero error:&error];
    [compositionAudioTrack insertTimeRanges:timeRanges ofTracks:audioTracks atTime:kCMTimeZero error:&error];

    AVMutableVideoCompositionInstruction *vtemp = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    vtemp.timeRange = CMTimeRangeMake(kCMTimeZero, [composition duration]);
    NSLog(@"\nInstruction vtemp's time range is %f %f", CMTimeGetSeconds(vtemp.timeRange.start),
          CMTimeGetSeconds(vtemp.timeRange.duration));

    // Also tried videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack
    AVMutableVideoCompositionLayerInstruction *vLayerInstruction = [AVMutableVideoCompositionLayerInstruction
        videoCompositionLayerInstructionWithAssetTrack:composition.tracks[0]];
    [vLayerInstruction setTransform:transform atTime:kCMTimeZero];
    vtemp.layerInstructions = @[vLayerInstruction];

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.renderSize = CGSizeMake(320.0, 240.0);
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.instructions = @[vtemp];

    // gVideoExportQuality was set to AVAssetExportPresetPassthrough -- see the answer below
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:gVideoExportQuality];
    NSParameterAssert(exporter != nil);
    exporter.videoComposition = videoComposition;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;

    NSString *rootName = [[self captureManager] tempFileRoot];
    NSString *temp = [NSString stringWithFormat:@"%@%@.mov", NSTemporaryDirectory(), rootName];
    exporter.outputURL = [NSURL fileURLWithPath:temp];

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [exporter error]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export succeeded");
                [self exportFile:exporter.outputURL];
                [self.delegate recordingEndedWithFile:exporter.outputURL];
                isExporting = FALSE;
                [[[self captureManager] session] startRunning];
                break;
            default:
                break;
        }
        if (exporter.status != AVAssetExportSessionStatusCompleted) {
            NSLog(@"Retry export");
        }
    }];
}

Ok, figured it out, and I'm posting it here so other people don't waste the time that I did.
The issue is that if you use AVAssetExportPresetPassthrough on an AVAssetExportSession, the exporter ignores the videoComposition entirely. I was expecting it to at least honor the video composition instructions while passing the format through, but apparently that isn't how it works. I have since filed a documentation bug, and the behavior is now covered in Apple's Technical Q&A QA1744.

If you specify AVAssetExportPresetPassthrough to let all tracks pass through but still want the video rotated, set the preferredTransform property on the composition track instead:
compositionVideoTrack.preferredTransform = transform;
More info here: https://developer.apple.com/library/archive/qa/qa1744/_index.html
Otherwise, pick any re-encoding preset; those presets do honor the videoComposition, including its renderSize.
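For completeness, a minimal sketch of the re-encoding route (the medium-quality preset is just an example; any non-passthrough preset should behave the same, with composition and videoComposition as in the question's code):

    // A re-encoding preset applies exporter.videoComposition, so both the
    // rotation transform and the 320x240 renderSize take effect.
    AVAssetExportSession *exporter =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetMediumQuality];
    exporter.videoComposition = videoComposition;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;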

Related

How can I overlap audio files and combine for iPhone in Xcode?

So I have two audio files, same format, potentially different length. I would like to combine these files (overlay the audio from one onto the other, NOT join them at ends).
Let's say I have two files:
Audio File A, length 30 seconds, size 220k
Audio File B, length 45 seconds, size 300k
What I'd like is a combined audio file:
Audio File C, length 45 seconds, size 300k (I recognize this could be more)
Appreciate everyone's help!
Here's what I did in my app.
- (void)setUpAndAddAudioAtPath:(NSURL *)assetURL toComposition:(AVMutableComposition *)composition
{
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    AVMutableCompositionTrack *track = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *sourceAudioTrack = [[songAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    NSError *error = nil;
    BOOL ok = NO;
    CMTime startTime = CMTimeMakeWithSeconds(0, 1);
    CMTime trackDuration = songAsset.duration;
    //CMTime longestTime = CMTimeMake(848896, 44100); //(19.24 seconds)
    CMTimeRange tRange = CMTimeRangeMake(startTime, trackDuration);

    // Set volume
    AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [trackMix setVolume:0.8f atTime:startTime];
    [self.audioMixParams addObject:trackMix];

    // Insert audio into track
    ok = [track insertTimeRange:tRange ofTrack:sourceAudioTrack atTime:CMTimeMake(0, 44100) error:&error];
}
- (IBAction)saveRecording
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    audioMixParams = [[NSMutableArray alloc] init];

    // Add the audio tracks to the composition (repeat for each file you want to merge)
    NSString *URLPath1 = [[NSBundle mainBundle] pathForResource:@"mysound" ofType:@"mp3"];
    NSString *URLPath2 = [[NSBundle mainBundle] pathForResource:@"mysound2" ofType:@"mp3"];
    NSURL *assetURL1 = [NSURL fileURLWithPath:URLPath1];
    [self setUpAndAddAudioAtPath:assetURL1 toComposition:composition];
    NSURL *assetURL2 = [NSURL fileURLWithPath:URLPath2];
    [self setUpAndAddAudioAtPath:assetURL2 toComposition:composition];

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = [NSArray arrayWithArray:audioMixParams];

    // If you need to query what formats you can export to, here's a way to find out
    NSLog(@"compatible presets for composition: %@",
          [AVAssetExportSession exportPresetsCompatibleWithAsset:composition]);

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc]
                                      initWithAsset:composition
                                      presetName:AVAssetExportPresetAppleM4A];
    exporter.audioMix = audioMix;
    exporter.outputFileType = @"com.apple.m4a-audio"; // i.e. AVFileTypeAppleM4A
    NSURL *exportURL = [NSURL fileURLWithPath:exportFile]; // exportFile is the output path, defined elsewhere
    exporter.outputURL = exportURL;

    // Do the export
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        int exportStatus = exporter.status;
        NSError *exportError = exporter.error;
        switch (exportStatus) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"AVAssetExportSessionStatusFailed: %@", exportError);
                break;
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"AVAssetExportSessionStatusCompleted");
                break;
            case AVAssetExportSessionStatusUnknown:
                NSLog(@"AVAssetExportSessionStatusUnknown");
                break;
            case AVAssetExportSessionStatusExporting:
                NSLog(@"AVAssetExportSessionStatusExporting");
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"AVAssetExportSessionStatusCancelled");
                break;
            case AVAssetExportSessionStatusWaiting:
                NSLog(@"AVAssetExportSessionStatusWaiting");
                break;
            default:
                NSLog(@"didn't get export status");
                break;
        }
    }];
}
Beware that I did this a while ago, and you might have to tweak it a bit to get it working, but it did work at one point. Let me know if you're having problems.
If no audio track is present in the selected asset, objectAtIndex:0 will throw an exception. You can use this check to find out whether a particular video has sound:
if ([[songAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] == NULL)
{
    NSLog(@"Sound is not present");
}
else
{
    NSLog(@"Sound is present");
    // Safe to initialize everything that needs the audio track
    AVAssetTrack *sourceAudioTrack = [[songAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
}

How to change video metadata using AVAssetWriter?

How can I change the metadata of a video (.mp4) using the AVAssetWriter API?
I don't want to re-encode; I only want to modify the video's metadata.
How should the following code continue?
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:[NSURL URLWithString:myPath] fileType:AVFileTypeQuickTimeMovie error:nil];
If I'm mistaken, give me a hint.
Thanks!!
Refer to the following code.
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:@"your path"] options:nil];

NSMutableArray *metadata = [NSMutableArray array];
AVMutableMetadataItem *metaItem = [AVMutableMetadataItem metadataItem];
metaItem.key = AVMetadataCommonKeyPublisher;
metaItem.keySpace = AVMetadataKeySpaceCommon;
metaItem.value = @"your_value";
[metadata addObject:metaItem];

// Passthrough avoids re-encoding; only the metadata changes
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
exportSession.outputURL = [NSURL fileURLWithPath:@"your output path"];
CMTime start = CMTimeMakeWithSeconds(0.0, BASIC_TIMESCALE); // BASIC_TIMESCALE is a constant of your own
CMTimeRange range = CMTimeRangeMake(start, [asset duration]);
exportSession.timeRange = range;
exportSession.outputFileType = AVFileTypeAppleM4V; // or AVFileTypeMPEG4 / AVFileTypeQuickTimeMovie, depending on the container you want
exportSession.metadata = metadata;
exportSession.shouldOptimizeForNetworkUse = YES;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch ([exportSession status])
    {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Export success");
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export canceled");
            break;
        default:
            break;
    }
}];
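To verify the write, a minimal sketch that reads the exported file back and lists its common metadata (the output path is the assumed placeholder from above):

    AVURLAsset *check = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:@"your output path"] options:nil];
    for (AVMetadataItem *item in [check commonMetadata]) {
        // commonKey/value come from the common key space written above
        NSLog(@"%@ = %@", [item commonKey], [item value]);
    }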

naturalSize returning wrong orientation from AVURLAsset

I'm using the code found here to attach an image to a video taken with UIImagePickerController.
The video is portrait and plays fine, but once I load it into an AVURLAsset it reports a landscape orientation instead of portrait, and I can't find out why.
Can anyone point me in the right direction?
My Code:
- (IBAction)addWaterMark:(id)sender {
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:tempPath] options:nil];
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:clipVideoTrack
                                    atTime:kCMTimeZero
                                     error:nil];
    [compositionVideoTrack setPreferredTransform:[[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];
    CGSize videoSize = [videoAsset naturalSize];
    NSLog(@"%f %f", videoSize.width, videoSize.height);
}
At this point I get 480, 360 instead of the correct size for the video at tempPath.
I found a detailed introductory tutorial describing how to edit and manipulate videos with AVFoundation here.
If they are landscape, we can use the naturalSize property as supplied, but if they are portrait, we must flip the naturalSize so that the width becomes the height and vice versa.
I'm leaving the correct code here in case others need it, although you've already accepted your own answer.
// Read out the size of the video
AVAssetTrack *vT = nil;
CGFloat orgWidth = 0.0f;
CGFloat orgHeight = 0.0f;
if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0)
{
    vT = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
}
if (vT != nil)
{
    orgWidth = vT.naturalSize.width;
    orgHeight = vT.naturalSize.height;
}
// Check the orientation
CGAffineTransform txf = [vT preferredTransform];
if ((orgWidth == txf.tx && orgHeight == txf.ty) || (txf.tx == 0 && txf.ty == 0))
{
    // Landscape
}
else
{
    // Portrait
}
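Alternatively, a minimal sketch that derives the display size directly by pushing naturalSize through the preferredTransform (reusing vT from the snippet above):

    // Apply the track's transform to its natural size; take absolute values
    // because a 90-degree rotation produces negative components.
    CGSize transformed = CGSizeApplyAffineTransform(vT.naturalSize, vT.preferredTransform);
    CGSize displaySize = CGSizeMake(fabs(transformed.width), fabs(transformed.height));
    // A portrait clip reported as 480x360 comes out as 360x480 here.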

iPhone AVErrorInvalidVideoComposition error when doing image overlay on video?

I have combined the 2 code chunks found here into one solid chunk (and verified the process with an Apple Developer Xcode tutorial file). When I run it, however, I get an error. It says:
Error Domain=AVFoundationErrorDomain Code=-11841 "The operation couldn’t be completed. (AVFoundationErrorDomain error -11841.)"
Any idea why it throws an AVErrorInvalidVideoComposition error? Thanks! (I'm new here so please let me know if you need more info.)
NSURL *videoURL = [info valueForKey:UIImagePickerControllerMediaURL];

/// UIImage into CALayer
UIImage *myImage = [UIImage imageNamed:@"Test.png"];
CALayer *aLayer = [CALayer layer];
aLayer.contents = (id)myImage.CGImage;

AVURLAsset *url = [AVURLAsset URLAssetWithURL:videoURL options:nil];
// Note: despite the name, videoComposition here is the AVMutableComposition (the asset), not the AVVideoComposition
AVMutableComposition *videoComposition = [[AVMutableComposition alloc] init];
NSError *error;
NSFileManager *fileManager = [NSFileManager defaultManager];

AVMutableCompositionTrack *compositionVideoTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *clipVideoTrack = [[url tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:clipVideoTrack atTime:kCMTimeZero error:&error];

AVMutableVideoComposition *videoComp = [[AVMutableVideoComposition alloc] init];
videoComp.renderSize = CGSizeMake(640, 480);
videoComp.frameDuration = CMTimeMake(1, 30);

CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoComp.renderSize.width, videoComp.renderSize.height);
videoLayer.frame = CGRectMake(0, 0, videoComp.renderSize.width, videoComp.renderSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:aLayer];
videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

/// instruction
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30)); // hard-coded 60 s -- see the answer below
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComp.instructions = [NSArray arrayWithObject:instruction];

/// outputs
NSString *filePath = nil;
filePath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
filePath = [filePath stringByAppendingPathComponent:@"temp.mov"];
NSLog(@"exporting to: %@", filePath);
if ([fileManager fileExistsAtPath:filePath])
{
    BOOL success = [fileManager removeItemAtPath:filePath error:&error];
    if (!success) NSLog(@"FM error: %@", [error localizedDescription]);
}

/// exporting
AVAssetExportSession *exporter;
exporter = [[AVAssetExportSession alloc] initWithAsset:videoComposition presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComp;
exporter.outputURL = [NSURL fileURLWithPath:filePath];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^(void) {
    switch (exporter.status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"exporting failed: %@", exporter.error);
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"exporting completed");
            UISaveVideoAtPathToSavedPhotosAlbum(filePath, self, @selector(video:didFinishSavingWithError:contextInfo:), NULL);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"export cancelled");
            break;
        default:
            break;
    }
}];
One common cause of invalid compositions is timing.
If the assets you are combining don't all have the same length, make sure the composition you create covers the entire segment.
In particular, check that your AVMutableVideoCompositionInstruction's timeRange is correct: here it is hard-coded to 60 seconds rather than the actual clip duration.
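A minimal sketch of the fix against the question's own code (where, confusingly, videoComposition names the AVMutableComposition):

    // Cover the whole composition instead of a hard-coded 60 seconds; a
    // mismatch between instructions and asset duration yields AVErrorInvalidVideoComposition.
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [videoComposition duration]);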

AVVideoComposition Failure

I am stringing several different video clips together into an AVMutableComposition and attempting to correct their orientation where needed.
Here is my code:
composition = [[AVMutableComposition alloc] init];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime nextClipStartTime = kCMTimeZero;

// orientation compensation vars
AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
NSMutableArray *compInst = [[NSMutableArray alloc] init];

// get view size
CGSize viewSize = playerView.frame.size;

// generate movie assets
for (NSString *moviePath in [currentBlam valueForKey:@"movies"]) {
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    AVURLAsset *movieAsset = [AVURLAsset URLAssetWithURL:movieURL options:nil];

    // scale asset to fit screen
    CMTimeRange tr = CMTimeRangeFromTimeToTime(CMTimeMakeWithSeconds(0.0f, 1), CMTimeMakeWithSeconds(0.0f, 1));

    // video track
    AVAssetTrack *clipVideoTrack = [[movieAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    // audio track
    AVAssetTrack *clipAudioTrack = [[movieAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    tr = CMTimeRangeFromTimeToTime(CMTimeMakeWithSeconds(0.0f, 1), CMTimeMakeWithSeconds(CMTimeGetSeconds([movieAsset duration]), 1));

    AVMutableVideoCompositionLayerInstruction *layerInst;
    layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
    int or = [self orientationForTrack:movieAsset];
    if (or == 1) {
        float rot = (0.0f);
        [layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
    } else if (or == 2) {
        float rot = (M_PI);
        [layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
    } else if (or == 3) {
        float rot = (M_PI * -0.5f);
        [layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
    } else if (or == 4) {
        float rot = (M_PI * 0.5f);
        [layerInst setTransform:CGAffineTransformMakeRotation(rot) atTime:nextClipStartTime];
    }
    // note: this overwrites the rotation transform that was just set above
    [layerInst setTransform:clipVideoTrack.preferredTransform atTime:nextClipStartTime];
    [compInst addObject:layerInst];

    // insert video track
    [compositionVideoTrack insertTimeRange:tr
                                   ofTrack:clipVideoTrack
                                    atTime:nextClipStartTime
                                     error:nil];
    // insert audio track
    [compositionAudioTrack insertTimeRange:tr
                                   ofTrack:clipAudioTrack
                                    atTime:nextClipStartTime
                                     error:nil];
    nextClipStartTime = CMTimeAdd(nextClipStartTime, tr.duration);
}

// set size and duration
composition.naturalSize = viewSize;
videoComposition.frameDuration = composition.duration; // see the answer below
videoComposition.renderSize = viewSize;
videoComposition.renderScale = 1.0f;

// apply instructions
inst.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
inst.layerInstructions = compInst;
videoComposition.instructions = [NSArray arrayWithObject:inst];

playerItem = [[AVPlayerItem alloc] initWithAsset:composition];
playerItem.videoComposition = videoComposition;
[playerItem addObserver:self forKeyPath:@"status" options:0 context:&ItemStatusContext];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:playerItem];
player = [AVPlayer playerWithPlayerItem:playerItem];
[playerView setPlayer:player];
When I run this and populate it with content, nothing shows.
This was working previously, before the AVVideoComposition was applied to the player item. In fact, commenting out playerItem.videoComposition = videoComposition makes it work, albeit without correcting the assets' rotations.
At this point I know I'm just misunderstanding something here. Can someone please point out what?
Did you consider that maybe the point the video is being rotated about is not the center, so the video is being drawn off screen? This may be totally wrong, but it's just my first thought.
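If that is the cause, one possible fix is to pair the rotation with a translation so the rotated frame lands back inside the render rectangle. A sketch for the 90° case, using the loop's clipVideoTrack and layerInst:

    // Rotating +90 degrees about the origin pushes the frame into negative x,
    // so translate right by the clip's height to bring it back on screen.
    CGAffineTransform rotate = CGAffineTransformMakeRotation(M_PI_2);
    CGAffineTransform translate = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height, 0);
    [layerInst setTransform:CGAffineTransformConcat(rotate, translate) atTime:nextClipStartTime];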
I think you are setting each single frame to be the length of the entire clip:
videoComposition.frameDuration = composition.duration;
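frameDuration is the duration of a single output frame, not of the whole composition, so for a 30 fps render it should be something like:

    // One frame lasts 1/30 s; the instruction's timeRange, not frameDuration,
    // controls the overall length.
    videoComposition.frameDuration = CMTimeMake(1, 30);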