I am using an MPMediaPickerController to allow the user to select videos and songs from the library on the device, initializing the picker with initWithMediaTypes:MPMediaTypeAny. The user can then play the song or video in-app after the export takes place. Here is my movie-exporting code after stripping it to its core functionality:
- (void)mediaPicker:(MPMediaPickerController*)mediaPicker didPickMediaItems:(MPMediaItemCollection*)mediaItemCollection {
    AVAssetExportSession *exportSession;
    NSString *filePath;
    NSURL *fileUrl;
    for (MPMediaItem *item in mediaItemCollection.items) {
        NSURL *assetUrl = [item valueForProperty:MPMediaItemPropertyAssetURL];
        AVAsset *currAsset = [AVAsset assetWithURL:assetUrl];
        exportSession = [[AVAssetExportSession alloc] initWithAsset:currAsset presetName:AVAssetExportPresetHighestQuality];
        exportSession.shouldOptimizeForNetworkUse = YES;
        exportSession.outputFileType = AVFileTypeQuickTimeMovie;
        filePath = [title stringByAppendingString:@".mov"]; // title is set elsewhere (stripped out here)
        fileUrl = [NSURL fileURLWithPath:[[NSFileManager documentDirectory] stringByAppendingPathComponent:filePath]]; // documentDirectory: helper defined elsewhere
        exportSession.outputURL = fileUrl;
        dispatch_group_enter(dispatchGroup);
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            // success
            dispatch_group_leave(dispatchGroup);
        }];
    }
}
Similar code works fine for audio, but for video, the video's audio does not play. Most content from iTunes is protected and non-exportable, so I wanted to test with a quick homemade video I shot with my iPhone. I shot the video and dragged it into iTunes (marking it as a "music video" so that it shows up properly and can be exported to my phone's library). Then I synced it to my device for testing.
In the app, the video shows up fine in the Media Picker, and I can export it with no errors that I can see. However, when I play it in-app, it only plays the video and not the audio. Other videos that I import from other sources work fine for playing the video's audio, so I don't 'think' it's the player itself.
Is there something I may be missing here on why the audio would not be coming across from this kind of export from the media picker? Thanks in advance for any assistance on this issue!
Not sure if this is the ideal solution, but the only way we found around this issue was to force the m4v format with the passthrough preset, i.e.:
exportSession = [[AVAssetExportSession alloc] initWithAsset:[AVAsset assetWithURL:assetUrl] presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeAppleM4V;
filePath = [title stringByAppendingString:@".m4v"];
After making that change, both audio and video work fine for videos imported locally this way.
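For completeness, here's a minimal sketch of the whole changed export, assuming assetUrl, title, and the dispatch group are set up as in the question:
AVAsset *asset = [AVAsset assetWithURL:assetUrl];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeAppleM4V;

NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filePath = [title stringByAppendingString:@".m4v"];
exportSession.outputURL = [NSURL fileURLWithPath:[documentsDir stringByAppendingPathComponent:filePath]];

dispatch_group_enter(dispatchGroup);
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"export finished: %@", exportSession.outputURL);
    } else {
        NSLog(@"export failed: %@", exportSession.error);
    }
    dispatch_group_leave(dispatchGroup);
}];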
Related
I have searched and read the docs, but I cannot seem to find a solution to this (seemingly simple) issue I've run into. I have song exports working fine from the user's iTunes library; they download into the user's documents folder with no issues every time, but videos just don't seem to work.
I show an MPMediaPickerController (allowsPickingMultipleItems = YES) to allow the user to select either videos or songs from their downloaded library. When they're done, here is the relevant code I'm using:
- (void)mediaPicker:(MPMediaPickerController*)mediaPicker didPickMediaItems:(MPMediaItemCollection*)mediaItemCollection {
    AVAssetExportSession *exportSession;
    NSString *filePath;
    for (MPMediaItem *item in mediaItemCollection.items) {
        NSURL *assetUrl = [item valueForProperty:MPMediaItemPropertyAssetURL];
        MPMediaType type = [[item valueForProperty:MPMediaItemPropertyMediaType] intValue];
        if (type >= MPMediaTypeMovie) {
            exportSession = [[AVAssetExportSession alloc] initWithAsset:[AVAsset assetWithURL:assetUrl] presetName:AVAssetExportPreset640x480];
            exportSession.outputFileType = AVFileTypeQuickTimeMovie;
            filePath = [title stringByAppendingString:@".mov"]; // title is set elsewhere
            exportSession.outputURL = [NSURL fileURLWithPath:[[NSFileManager documentDirectory] stringByAppendingPathComponent:filePath]];
        } // .. check for song-types here and set session up appropriately
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            // never gets into AVAssetExportSessionStatusCompleted here for videos
        }];
    }
}
The error I get every time is the following:
Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo=0x1e1a2180 {NSLocalizedRecoverySuggestion=Try exporting again., NSLocalizedDescription=Cannot Complete Export}
Not very helpful. :( I feel like I may be missing something obvious here. Am I going about this the correct way? Is it a problem that I'm trying to "force" it into MOV format? Or do I perhaps need a different way of setting up the export session?
For reference, I'm using iOS 6.0.1 on my iPhone 5 for testing, with a baseSDK of 6.0. Thanks in advance for any guidance that can be offered on this!
Additional Info #1: Something odd: it seems to crash immediately with a SIGTRAP if I set the outputFileType to AVFileTypeAppleM4V. I wanted to try M4V because when I log the assetURL, I see something like ipod-library://item/item.m4v?id=12345. I don't know if that makes a difference, but it's odd that it just crashes like that if I try the m4v format. Probably because it's not in the supported file types list (see the next info point).
Additional Info #2: The supported file types I get (from calling the supportedFileTypes method) are com.apple.quicktime-movie and public.mpeg-4. The presets from exportPresetsCompatibleWithAsset include all of the video ones, including m4a, low/med/high quality, and the specific-dimension ones (a sketch of these calls follows after Additional Info #5). I have tried EVERY combination of these, such as AVFileTypeQuickTimeMovie and AVFileTypeMPEG4 for the file types, and all of the presets, including low/med/high and all of the dimension ones. It never fails that I get the "Cannot Complete Export" error.
Additional Info #3: I am also using a Deployment Target of 5.1. But yes, I have tried 6.0, and it gives the same error. :(
Additional Info #4: In case it matters, the movie I'm testing with is a TV-show pilot episode, the first free video I saw in iTunes, which I downloaded for use in this app.
Additional Info #5: Not sure if this is important, but the "hasProtectedContent" method returns YES for the AVAsset (and AVURLAsset if I convert). May not make a difference, but thought I'd throw it out there.
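Roughly how I'm querying these, stripped down (currAsset is the AVAsset for the picked item):
NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:currAsset];
NSLog(@"compatible presets: %@", presets);

AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:currAsset presetName:AVAssetExportPreset640x480];
NSLog(@"supported file types: %@", [session supportedFileTypes]);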
After trying to replicate the issue and doing some testing, I strongly suspect the protected content is an issue. Here's why:
I copied your code, and tested it on my iPod Touch (5th gen, iOS 6.0.1), though instead of coming from a media picker, I just let it loop through all the videos I have on the device (7 of them.) It worked great, and called the completion handler and made proper .mov files in the documents directory of the app sandbox. I moved the .mov files to my Mac and they all played.
These video files all had hasProtectedContent set to NO.
Then I tried a video file obtained from the iTunes Store and confirmed it had hasProtectedContent set to YES. Interestingly, when I try to get the URL from MPMediaItemPropertyAssetURL, I get nil for the protected/iTunes-obtained video.
I strongly suspect the media protection is the problem.
Here's the variation of code that I used. I didn't change your conversion code at all, just how the URLs are supplied:
// select all the video files
MPMediaPropertyPredicate *predicate = [MPMediaPropertyPredicate predicateWithValue:[NSNumber numberWithInteger:MPMediaTypeMovie] forProperty:MPMediaItemPropertyMediaType];
MPMediaQuery *query = [[MPMediaQuery alloc] init];
[query addFilterPredicate:predicate];
NSArray *items = [query items];

// now go through them all to export them
NSString *title;
NSURL *url;
AVAssetExportSession *exportSession;
NSString *storePath;
AVAsset *theAsset;

// we will put the output at this path
NSString *applicationDocumentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];

// loop through the items and export
for (MPMediaItem *item in items)
{
    title = [item valueForProperty:MPMediaItemPropertyTitle];
    url = [item valueForProperty:MPMediaItemPropertyAssetURL];
    NSLog(@"Title: %@, URL: %@", title, url);
    theAsset = [AVAsset assetWithURL:url];
    if ([theAsset hasProtectedContent]) {
        NSLog(@"%@ is protected.", title);
    } else {
        NSLog(@"%@ is NOT protected.", title);
    }
    exportSession = [[AVAssetExportSession alloc] initWithAsset:theAsset presetName:AVAssetExportPreset640x480];
    storePath = [applicationDocumentsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov", title]];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession.outputURL = [NSURL fileURLWithPath:storePath];
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"done!");
    }];
}
Out of curiosity, are you checking the AVAsset exportable flag?
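In case it helps, a quick check along these lines before creating the export session would surface both flags (just a sketch, using theAsset and title from the loop above):
if (![theAsset isExportable] || [theAsset hasProtectedContent]) {
    NSLog(@"%@ can't be exported (exportable: %d, protected: %d)", title, [theAsset isExportable], [theAsset hasProtectedContent]);
    continue; // skip this item
}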
We can crop images. Can we crop videos?
Since a video is a collection of pictures, you can crop every frame from the video and then create a new video from the cropped frames. The AVFoundation programming guide describes the relevant tasks:
Putting it all Together: Capturing Video Frames as UIImage Objects
After this, you crop the images and write the new video:
You can use an asset writer to produce a QuickTime movie file or an
MPEG-4 file from media such as sample buffers or still images.
See the AV Foundation Framework documentation for more details.
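As a rough illustration of that route, here is a sketch that pulls a single frame with AVAssetImageGenerator and crops it with Core Graphics; videoURL and the crop rectangle are placeholders:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;

NSError *error = nil;
CGImageRef frame = [generator copyCGImageAtTime:CMTimeMakeWithSeconds(1.0, 600) actualTime:NULL error:&error];

CGRect cropRect = CGRectMake(0, 0, 320, 240); // hypothetical crop region
CGImageRef croppedRef = CGImageCreateWithImageInRect(frame, cropRect);
UIImage *croppedFrame = [UIImage imageWithCGImage:croppedRef];
NSLog(@"cropped frame: %@", croppedFrame);

CGImageRelease(croppedRef);
CGImageRelease(frame);

// Repeat for each frame you need, then feed the cropped images to an AVAssetWriter
// (via an AVAssetWriterInputPixelBufferAdaptor) to write the new movie.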
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetLowQuality];
exportSession.outputURL = outputURL;
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
CMTime start = CMTimeMakeWithSeconds(1.0, 600);
CMTime duration = CMTimeMakeWithSeconds(120.0, 600);
CMTimeRange range = CMTimeRangeMake(start, duration);
exportSession.timeRange = range;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    handler(exportSession);
    [exportSession release]; // pre-ARC; `handler` is a completion block supplied by the caller
}];
This gives us roughly the first two minutes of the video (starting one second in).
You should be able to do this using AVAssetExportSession, AVVideoComposition, and AVVideoCompositionCoreAnimationTool (and just set up a CALayer hierarchy with the positioning you want). I'm not sure if this is the most efficient way, though.
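For a plain rectangular crop you may not even need the Core Animation tool; an AVMutableVideoComposition with a render size and a transform can do it. Here's a sketch (the asset, output URL, output size, and offsets are placeholders):
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(320, 240); // size of the cropped output
videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
// Shift the video so the region you want lands inside renderSize; the offsets are hypothetical.
[layerInstruction setTransform:CGAffineTransformMakeTranslation(-100, -50) atTime:kCMTimeZero];

instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComposition.instructions = [NSArray arrayWithObject:instruction];

AVAssetExportSession *cropSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
cropSession.videoComposition = videoComposition;
cropSession.outputFileType = AVFileTypeQuickTimeMovie;
cropSession.outputURL = outputURL; // assumed file URL for the cropped movie
[cropSession exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"crop export status: %d", (int)cropSession.status);
}];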
It's not as simple as it is with images, but it could be as easy as specifying the video correctly; there isn't enough information here to say.
In the decoding settings you can manipulate the video pixels by geometry (i.e., anamorphic, squeezed, stretched), and in the player/browser settings for the image or player window you can specify a small player window and a magnification level. If you allow or disallow zoom/magnification, you'll force an offscreen draw or black bars.
I would encode to the correct size and platform for best quality; these kinds of fixes are 'kludges', but they work in a pinch. I would grab the QuickTime SDK and poke around.
I want to play a movie in iOS 4.3 on the iPad. I've successfully used MPMoviePlayerController and AVPlayer to load files from a remote URL when the filename has a file extension. However, when I use a CDN that doesn't return the filename (just an un-guessable random name), neither MPMoviePlayerController or AVPlayer seem to be able to cope.
Is there a way to tell either player that it really is a movie of type x and it should just get on playing it?
MPMoviePlayerController will return the following error from its playback-state-change notification:
{
MPMoviePlayerPlaybackDidFinishReasonUserInfoKey = 1;
error = "Error Domain=MediaPlayerErrorDomain Code=-12847 \"This movie format is not supported.\" UserInfo=0x5b60030 {NSLocalizedDescription=This movie format is not supported.}";
}
I know the file is a valid m4v file, because when I rename it (giving it the extension), everything plays fine.
The file is at tmp, referenced by NSString *_filePath.
Create a symlink:
NSFileManager *filemgr = [NSFileManager defaultManager];
NSString *slink = [_filePath stringByAppendingPathExtension:@"m4v"];
if (![filemgr fileExistsAtPath:slink]) {
    NSError *error = nil;
    [filemgr createSymbolicLinkAtPath:slink withDestinationPath:_filePath error:&error];
    if (error) {
        ...
    }
}
...
Then play the video using the symlink path.
If the player can't guess the file format, you need to check that the CDN sends the right MIME type back. My guess is that your CDN isn't reporting the MIME type correctly, so the player can't guess the format either.
In most cases this is due to how the CDN presents the HTTP header. Check that the "Content-Type" header is set to a video format matching your content.
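One quick way to see what the CDN actually returns is a HEAD request; here's a sketch (synchronous only for brevity, videoURL being your remote URL):
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:videoURL];
[request setHTTPMethod:@"HEAD"];

NSURLResponse *response = nil;
NSError *error = nil;
[NSURLConnection sendSynchronousRequest:request returningResponse:&response error:&error];
NSDictionary *headers = [(NSHTTPURLResponse *)response allHeaderFields];
NSLog(@"Content-Type: %@", [headers objectForKey:@"Content-Type"]);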
WebKit handles this with a private AVURLAsset option, AVURLAssetOutOfBandMIMETypeKey. This option is used when you specify a MIME type in an HTML video tag.
You can use this option like:
NSString *mimeType = @"video/mp4";
// or even with codecs
mimeType = @"video/mp4; codecs=\"avc1.42E01E, mp4a.40.2\"";
// create the asset
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:@{@"AVURLAssetOutOfBandMIMETypeKey": mimeType}];
// create an AVPlayer with the AVURLAsset
AVPlayer *player = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:asset]];
Since it is a private key, you may want to obfuscate it if you plan to submit the app to the App Store.
The WebKit source can be found here:
https://opensource.apple.com/source/WebCore/WebCore-7604.1.38.1.6/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm.auto.html
Finally, I found the answer.
You should use AVURLAsset (the subclass of AVAsset) and set the MIMEType in the options input :
let mimeType = "video/mp4; codecs=\"avc1.42E01E, mp4a.40.2\""
let urlAsset = AVURLAsset(url: url, options: ["AVURLAssetOutOfBandMIMETypeKey": mimeType])
Source -> https://stackoverflow.com/a/54087143/6736184
The iPhone supports H.264 and MPEG-4 video in .mp4, .m4v, and .mov containers, and audio in AAC, MP3, M4A, Apple Lossless, and Audible formats.
You can use NSFileManager's -contentsOfDirectoryAtPath:error: method to get an array with the contents of a directory (as strings). Then you just do string operations.
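For example, something like this would list any movie files already sitting in the Documents directory (a sketch):
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSError *error = nil;
NSArray *contents = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:documentsDir error:&error];
for (NSString *name in contents) {
    if ([[name pathExtension] isEqualToString:@"m4v"] || [[name pathExtension] isEqualToString:@"mov"]) {
        NSLog(@"found movie file: %@", name);
    }
}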
Dylan is correct.
Both MPMoviePlayerController and AVPlayer need a file extension in order to play a file from a URL; otherwise an error is shown, so you have to resort to some kind of trick.
If you have trouble getting the Content-Type of your connection, you can cycle through the playable MIME types, create a symbolic link to the actual file with each corresponding extension, and check whether the result is playable. Like so:
NSLog(@"linked path: %@", [videoURL absoluteString]);
NSString *linkedPath;
AVURLAsset *asset;
NSFileManager *filemgr = [NSFileManager defaultManager];
for (NSString *string in [AVURLAsset audiovisualMIMETypes]) {
    if ([string containsString:@"video/"]) {
        NSLog(@"Trying: %@", string);
        linkedPath = [[videoURL absoluteString] stringByAppendingPathExtension:[string stringByReplacingOccurrencesOfString:@"video/" withString:@""]];
        NSLog(@"linked path: %@", linkedPath);
        if (![filemgr fileExistsAtPath:linkedPath]) {
            NSError *error = nil;
            [filemgr createSymbolicLinkAtURL:[NSURL URLWithString:linkedPath] withDestinationURL:videoURL error:&error];
            if (error) {
                NSLog(@"error %@", error.localizedDescription);
            }
        }
        asset = [AVURLAsset assetWithURL:[NSURL URLWithString:linkedPath]];
        if ([asset isPlayable]) {
            NSLog(@"Playable");
            break;
        } else {
            NSLog(@"Not Playable");
            asset = nil;
        }
    }
}
It's sort of a hack, but what you could do is run each name through a method that checks for a period with three characters after it; if there isn't one, just append .m4v automatically. Or get the MIME type and append an extension automatically based on the returned type, if available. Look at the NSString documentation for more info. Good luck! Let me know if that helped.
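A sketch of that check (the helper name is made up):
// Append ".m4v" when the name doesn't already end in a three-character extension.
- (NSString *)nameWithVideoExtension:(NSString *)name {
    if ([[name pathExtension] length] == 3) {
        return name;
    }
    return [name stringByAppendingPathExtension:@"m4v"];
}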
The new features list for iOS 4.0 says that the AV Foundation framework has media asset management, track management, media editing, and metadata management for media items. What do they mean by this?
Using track management and media asset management, can I access media files from the Photos app?
Can I make my own custom compositions using AVComposition, export them, and send them to a server?
Can I rename, move, or edit the metadata information of an asset?
I tried to get some help/documentation on this and couldn't find anything.
Thanks,
Tony
Yes, you can do most of the stuff you mentioned.
I don't think it's that simple to access the media files on your phone directly, but you can read data from the network and export it to your Camera Roll if you like.
First you have to import your videos or audio files.
To get started, you create your own video player in your own view.
If you don't want to play your videos but simply want to compose your material, you can go without a view.
This is very easy:
First, create a mutable composition:
AVMutableComposition *composition = [AVMutableComposition composition];
This will hold your videos; you now have an empty composition asset.
Add some files from your directory or from the web:
NSURL* sourceMovieURL = [NSURL fileURLWithPath:moviePath];
AVURLAsset* sourceAsset = [AVURLAsset URLAssetWithURL:sourceMovieURL options:nil];
Then add your video to your composition
// calculate the time range to insert
CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), CMTimeMake(sourceAsset.duration.value, sourceAsset.duration.timescale));
// and add it into your composition
NSError *editError = nil;
BOOL result = [composition insertTimeRange:editRange ofAsset:sourceAsset atTime:composition.duration error:&editError];
If you would like to add more videos to your composition, you can add further assets and insert them the same way, each with its own time range (a small sketch follows).
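For example, appending a second clip after the first might look like this (secondMovieURL is a placeholder):
// Hypothetical second source; inserted at the current end of the composition.
AVURLAsset *secondAsset = [AVURLAsset URLAssetWithURL:secondMovieURL options:nil];
CMTimeRange secondRange = CMTimeRangeMake(kCMTimeZero, secondAsset.duration);
NSError *appendError = nil;
[composition insertTimeRange:secondRange ofAsset:secondAsset atTime:composition.duration error:&appendError];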
Now you can EXPORT your new composition using code like this:
NSError *exportError = nil;
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
NSURL *exportURL = [NSURL fileURLWithPath:exportVideoPath];
exportSession.outputURL = exportURL;
exportSession.outputFileType = @"com.apple.quicktime-movie";
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status) {
        case AVAssetExportSessionStatusFailed: {
            NSLog(@"FAIL");
            [self performSelectorOnMainThread:@selector(doPostExportFailed:)
                                   withObject:nil
                                waitUntilDone:NO];
            break;
        }
        case AVAssetExportSessionStatusCompleted: {
            NSLog(@"SUCCESS");
            [self performSelectorOnMainThread:@selector(doPostExportSuccess:)
                                   withObject:nil
                                waitUntilDone:NO];
            break;
        }
        default:
            break;
    }
}];
If you want to PLAY your videos, use code like this (I assume, you have access to your view):
// create an AVPlayer with your composition
AVPlayer* mp = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:composition]];
// Add the player to your UserInterface
// Create a PlayerLayer:
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:mp];
// integrate it to your view. Here you can customize your player (Fullscreen, or a small preview)
[[self view].layer insertSublayer:playerLayer atIndex:0];
playerLayer.frame = [self view].layer.bounds;
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
And finally play your video:
[mp play];
Export to camera roll:
NSString* exportVideoPath = >>your local path to your exported file<<
UISaveVideoAtPathToSavedPhotosAlbum(exportVideoPath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
And get notified when it's finished (your callback method):
- (void) video: (NSString *) videoPath didFinishSavingWithError: (NSError *) error contextInfo: (void *) contextInfo {
// Error is nil, if succeeded
NSLog(@"Finished saving video with error: %@", error);
// do postprocessing here, i.e. notifications or UI stuff
}
Unfortunately, I haven't found any "legal" way to read from the Camera Roll.
A very good source on getting started is:
http://www.subfurther.com/blog/?cat=51
Download VTM_Player.zip, VTM_AVRecPlay.zip, or VTM_AVEditor.zip for a very nice introduction to this.
How do I get a thumbnail of a video imported from the camera roll, or the camera itself?
This has been asked before, and has been answered. However, the answers kind of suck for me.
This thread, "iphone sdk > 3.0 . Video Thumbnail?", has some options that boil down to:
Crawl some filesystem directory for a JPG with the latest modification date that should correspond to the video you just picked. This is extremely messy, and involves rooting around in directories Apple would probably not really want me doing.
Use ffmpeg. But this is so general that I cannot seem to figure out the steps that it would take to import ffmpeg into my project and to actually call it to extract images.
Is there really no other way? This seems like a HUGE oversight in the SDK to me. I mean, the video picker shows thumbnails, so Apple must be doing something to generate those, yet doesn't give us a way to do the same?
- (void)testGenerateThumbNailDataWithVideo {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"IMG_0106" ofType:@"MOV"];
    NSURL *url = [NSURL fileURLWithPath:path];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60);
    CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err];
    [generate release];
    NSLog(@"err==%@, imageRef==%@", err, imgRef);
    UIImage *currentImg = [[UIImage alloc] initWithCGImage:imgRef];
    CGImageRelease(imgRef); // copyCGImageAtTime: returns a +1 reference
    static BOOL flag = YES;
    if (flag) {
        NSData *tmpData = UIImageJPEGRepresentation(currentImg, 0.8);
        NSString *path = [NSString stringWithFormat:@"%@thumbNail.png", NSTemporaryDirectory()];
        BOOL ret = [tmpData writeToFile:path atomically:YES];
        NSLog(@"write to path=%@, flag=%d", path, ret);
        flag = NO;
    }
    [currentImg release];
}
Best method I've found... MPMoviePlayerController thumbnailImageAtTime:timeOption
Never mind this... see the first comment below. That's the answer.
We use ffmpeg; you can explore our site for hints on how to do it, and eventually I want to put up a tutorial.
But right now I'm more focused on getting ffmpeg to play movies.
Understand that once you have that code working, the code to generate a thumbnail is just a subset of it.
http://sol3.typepad.com/tagalong_developer_journa/
This tutorial has helped us, and probably the majority of developers using ffmpeg, to get started:
dranger.com/ffmpeg/
Finally, Apple probably wouldn't have a problem with you using the thumbnail generated by the video camera; I don't think it's in a private folder. However, that thumbnail is only created by the camera, not for videos picked from the image picker.