Having a problem where the estimatedOutputFileLength property of AVAssetExportSession always returns 0 (and returns -9223372036854775808 on the simulator).
I've tried everything to get this to work, trying different outputFileTypes, toggling shouldOptimizeForNetworkUse on and off, specifying (or not specifying) the outputURL... despite all this, nothing seems to work and I'm beginning to think this may be a bug in the SDK.
This is my code:
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality]; // doesn't matter which preset is used
//exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
NSLog(#"bytes = %lld", exportSession.estimatedOutputFileLength);
I just can't figure out why this isn't working! (iOS 6, iPhone 5)
You can work around this problem by setting a proper timeRange on the exportSession:
exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
It seems that on iOS, AVAssetExportSessionInternal.timeRange does not end up with a sensible value when estimating the file length.
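Applied to the code in the question, the workaround looks roughly like this (a sketch, reusing the same asset variable as above):
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
// Without an explicit timeRange the estimate comes back as 0
exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
NSLog(@"bytes = %lld", exportSession.estimatedOutputFileLength);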
You need to include the timeRange, i.e. how much of the file you intend to export. Without it, the estimate will return 0:
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset: songAsset presetName: AVAssetExportPresetAppleM4A];
exporter.outputFileType = AVFileTypeAppleM4A;
CMTime full = CMTimeMultiplyByFloat64(exporter.asset.duration, 1);
exporter.timeRange = CMTimeRangeMake(kCMTimeZero, full);
long long size = exporter.estimatedOutputFileLength;
fileInfo.fileSize = size;
Related
Some videos play fine on their own as simple AVAssets, but when inserted into an AVMutableVideoComposition they make the whole composition fail, even if they are inserted alone into an empty composition. Has anyone had similar issues? Sometimes re-encoding the videos before inserting them makes it work, sometimes not. I can't see any mistake in my timing instructions, and using other videos doesn't cause any problem at all, regardless of their length. Can there be issues with the number of frames, the duration of the asset, or their format? (All are single-track H.264.)
Well, it does indeed look like some videos may cause problems when being inserted. In the example here, the duration of the track is the issue.
AVAsset * video = ...;
NSArray * videoTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack * videoTrack = videoTracks.firstObject;
CMTime duration = videoTrack.timeRange.duration; // this time causes the error
AVMutableCompositionTrack * track = ...;
CMTime insertCMTime = kCMTimeZero;
CMTimeRange trackRange = CMTimeRangeMake(kCMTimeZero, duration);
[track insertTimeRange:trackRange
ofTrack:videoTrack
atTime:insertCMTime
error:nil];
Trimming the time range being inserted to a whole second solved the problem for all videos tested so far:
NSTimeInterval interval = (int)CMTimeGetSeconds(videoTrack.timeRange.duration);
CMTime roundedDuration = CMTimeMakeWithSeconds(interval, 60000);
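The rounded duration is then used for the insertion in place of the raw track duration (same track, videoTrack, and insertCMTime variables as in the snippet above):
CMTimeRange trackRange = CMTimeRangeMake(kCMTimeZero, roundedDuration);
[track insertTimeRange:trackRange
               ofTrack:videoTrack
                atTime:insertCMTime
                 error:nil];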
I am recording video in my iPhone application and uploading it to a server. The video needs to have an aspect ratio of 16:9, so I set
[videoRecorderController setVideoQuality:UIImagePickerControllerQualityTypeIFrame960x540]
but a video of only a few seconds takes up several MB of space.
Is there any way to reduce the size of the video while maintaining the 16:9 aspect ratio?
This helped me:
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL*)inputURL
outputURL:(NSURL*)outputURL
handler:(void (^)(AVAssetExportSession*))handler
{
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void)
{
handler(exportSession);
}];
}
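A possible call site, for illustration only; inputPath and outputPath are placeholder paths for the original recording and the compressed copy, and self is assumed to be the object that defines the method above:
NSURL *inputURL = [NSURL fileURLWithPath:inputPath];
NSURL *outputURL = [NSURL fileURLWithPath:outputPath];
[self convertVideoToLowQuailtyWithInputURL:inputURL outputURL:outputURL handler:^(AVAssetExportSession *exportSession) {
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Compressed video written to %@", outputURL); // export finished
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];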
I'm porting an app that works with AAC audio files to iOS 6 and I've found a strange behavior: when I try to get the duration of a (valid) AAC audio file, it always returns 0. On iOS 4 and iOS 5 it works fine.
Is there any bug in the AVAudioPlayer class that affects the duration property? I have read about some problems with the currentTime property.
Here's the code:
NSURL* urlFichero = [NSURL fileURLWithPath:rutaFichero];
avaPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL: urlFichero error:nil];
segundos = avaPlayer.duration;
NSLog(#"[ControladorFicheros] Fichero: '%#' Duración: '%f'", nombreFichero, segundos);
[avaPlayer stop];
[avaPlayer release];
Thanks ;)
In the end, the problem is that in newer versions of the API, AVAudioPlayer appears to only return the correct duration of a file once it is ready to play it, which is why my approach was wrong. The correct way to get the duration of a file (in seconds), if you don't want to play it, is:
AVURLAsset *asset = [[[AVURLAsset alloc] initWithURL:anURI_ToResource
options:[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES],
AVURLAssetPreferPreciseDurationAndTimingKey,
nil]] autorelease];
NSTimeInterval durationInSeconds = 0.0;
if (asset)
durationInSeconds = CMTimeGetSeconds(asset.duration) ;
Swift
let asset = AVURLAsset(url: url, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
let durationInSeconds = CMTimeGetSeconds(asset.duration)
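If you prefer to stay with AVAudioPlayer, the observation above suggests the duration only becomes reliable once the player is ready to play, so calling prepareToPlay first may be enough. This is an untested sketch based on that observation, reusing urlFichero from the question:
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:urlFichero error:nil];
[player prepareToPlay]; // get the player ready without actually starting playback
NSTimeInterval seconds = player.duration;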
I noticed the same problem. My solution was to use MPMoviePlayerController instead:
MPMoviePlayerController *testPlayer = [[MPMoviePlayerController alloc] initWithContentURL:filePath];
[testPlayer prepareToPlay];
[testPlayer play];
I am using the code below to try to merge two m4v files stored in the Documents folder:
CMTime insertionPoint = kCMTimeZero;
NSError * error = nil;
AVMutableComposition *composition = [AVMutableComposition composition];
AVURLAsset* asset = [AVURLAsset URLAssetWithURL: [assetURLArray objectForKey:kIntroVideo] options:nil];
if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
ofAsset:asset
atTime:insertionPoint
error:&error])
{
NSLog(#"error: %#",error);
}
insertionPoint = CMTimeAdd(insertionPoint, asset.duration);
AVURLAsset* asset2 = [AVURLAsset URLAssetWithURL: [assetURLArray objectForKey:kMainVideo] options:nil];
if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset2.duration)
ofAsset:asset2
atTime:insertionPoint
error:&error])
{
NSLog(#"error: %#",error);
}
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
NSString *exportVideoPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/FinishedVideo.m4v"];
NSURL *exportURL = [NSURL fileURLWithPath:exportVideoPath];
exportSession.outputURL = exportURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status) {
        case AVAssetExportSessionStatusFailed: {
            NSLog(@"FAIL");
            break;
        }
        case AVAssetExportSessionStatusCompleted: {
            NSLog(@"SUCCESS");
            break;
        }
        default:
            break;
    }
}];
The problem is that the two videos do not merge properly. The total duration of the merged movie is correct; however, the video never transitions to the second movie and keeps displaying the last frame of the first movie for the remainder. Oddly, I can hear the audio of the second video playing in the background.
Does anyone have any ideas what is wrong ?
EDIT - The odd thing is that if I merge two clips of exactly the same length, it works.
EDIT - I have tried changing the file extension to .mov, with the same problem.
You haven't set the composition on the exportSession.
After the line:
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
Add this line
exportSession.videoComposition = composition;
This should solve your problem.
OK, so I eventually got this working by using individual AVMutableComposition tracks: one mutable composition track for video and one for audio, as sketched below.
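A rough sketch of that approach, assuming asset and asset2 from the question each have one video and one audio track (error handling omitted):
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime cursor = kCMTimeZero;
for (AVURLAsset *clip in @[asset, asset2]) {
    CMTimeRange range = CMTimeRangeMake(kCMTimeZero, clip.duration);
    // Insert the clip's video and audio into their own composition tracks
    [videoTrack insertTimeRange:range ofTrack:[[clip tracksWithMediaType:AVMediaTypeVideo] firstObject] atTime:cursor error:nil];
    [audioTrack insertTimeRange:range ofTrack:[[clip tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:cursor error:nil];
    cursor = CMTimeAdd(cursor, clip.duration);
}
// Then export `composition` with AVAssetExportSession exactly as in the original code.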
I am trying to get a thumbnail from a .mov file that I captured with the iPhone camera. I currently have the movie saved in the Documents directory of the app. When I call [asset duration] it returns a null object. Also, when I call the copyCGImageAtTime:actualTime:error: method, it returns a null object as well. I've spent countless hours trying to figure this out. I've tried moving my code to another part of my app just to see if I could get it to work. I've also tried running it on the simulator, with no luck. Here is the code:
NSString* destinationPath = [NSString stringWithFormat:@"%@/aaa/aaa.mov", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0]];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:destinationPath] options:nil];
AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:asset];
gen.appliesPreferredTrackTransform = YES;
CMTime time = CMTimeMakeWithSeconds(0.0, 600);
NSError *error2 = nil;
CMTime actualTime;
CGImageRef image = [gen copyCGImageAtTime:time actualTime:&actualTime error:&error2];
UIImage *thumb = [[UIImage alloc] initWithCGImage:image];
CGImageRelease(image);
I have also confirmed that the movie does exist under that folder. Any help would be greatly appreciated. Thanks :)
--Edit--
Forgot to mention that the error returned by copyCGImageAtTime:actualTime:error: is the AVErrorUnknown error.
--Edit2--
Found out the problem: I didn't include file:// at the beginning of the URL. It works now.
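In other words, building the URL with fileURLWithPath: (which produces a file:// URL) instead of URLWithString: is what makes the asset load:
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:destinationPath] options:nil];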