The iOS 4.0 new-features list says the AV Foundation framework provides media asset management, track management, media editing, and metadata management for media items. What do they mean by this?
Using track management and media asset management, can I access media files from the Photos app?
Can I make my own custom compositions using AVComposition, export them, and send them to a server?
Can I rename, move, or edit the metadata of an asset?
I tried to find help/documentation on this and couldn't find anything.
Thanks,
Tony
Yes, you can do most of the stuff you mentioned.
Accessing the media files on your phone is not that simple, but you can read data from the network and export the result to your camera roll if you like.
First you have to import your videos or audio files.
To get started with playback, you need your own video player, which you create in your own view.
If you don't want to play your videos but simply compose your material, you can go without a view.
This is very easy:
1. create a mutable composition:
AVMutableComposition *composition = [AVMutableComposition composition];
This will hold your videos. Now you have an empty composition asset.
Add some files from your directory or from the web:
NSURL* sourceMovieURL = [NSURL fileURLWithPath:moviePath];
AVURLAsset* sourceAsset = [AVURLAsset URLAssetWithURL:sourceMovieURL options:nil];
Then add your video to your composition:
NSError *editError = nil;
// calculate the time range covering the whole source asset
CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), CMTimeMake(sourceAsset.duration.value, sourceAsset.duration.timescale));
// and insert it at the end of the composition
BOOL result = [composition insertTimeRange:editRange ofAsset:sourceAsset atTime:composition.duration error:&editError];
If you would like to add more videos to your composition, load further assets the same way and insert each one with its own time range, as in the sketch below.
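A minimal sketch of appending a second clip, assuming secondMoviePath points at another local file (the variable name is hypothetical):
// load a second asset and append it after the material already in the composition
NSURL *secondMovieURL = [NSURL fileURLWithPath:secondMoviePath]; // hypothetical path
AVURLAsset *secondAsset = [AVURLAsset URLAssetWithURL:secondMovieURL options:nil];
NSError *appendError = nil;
CMTimeRange secondRange = CMTimeRangeMake(kCMTimeZero, secondAsset.duration);
[composition insertTimeRange:secondRange
                     ofAsset:secondAsset
                      atTime:composition.duration // append at the current end
                       error:&appendError];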
Now you can EXPORT your new composition using code like this:
NSError *exportError = nil;
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
NSURL *exportURL = [NSURL fileURLWithPath:exportVideoPath]; // exportVideoPath is your local output path (defined below)
exportSession.outputURL = exportURL;
exportSession.outputFileType = @"com.apple.quicktime-movie";
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status) {
        case AVAssetExportSessionStatusFailed: {
            NSLog(@"FAIL");
            [self performSelectorOnMainThread:@selector(doPostExportFailed:)
                                   withObject:nil
                                waitUntilDone:NO];
            break;
        }
        case AVAssetExportSessionStatusCompleted: {
            NSLog(@"SUCCESS");
            [self performSelectorOnMainThread:@selector(doPostExportSuccess:)
                                   withObject:nil
                                waitUntilDone:NO];
            break;
        }
        default:
            break;
    }
}];
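Note that not every export preset is compatible with every asset; a quick hedged check using AVAssetExportSession's exportPresetsCompatibleWithAsset: class method (the fallback handling is just a sketch):
// ask AVFoundation which presets can actually handle this composition
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:composition];
if (![compatiblePresets containsObject:AVAssetExportPresetHighestQuality]) {
    // pick another entry from compatiblePresets here instead
    NSLog(@"Highest-quality preset not available for this asset");
}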
If you want to PLAY your videos, use code like this (I assume you have access to your view):
// create an AVPlayer with your composition
AVPlayer *mp = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:composition]];
// add the player to your user interface: create a player layer...
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:mp];
// ...and integrate it into your view. Here you can customize your player (fullscreen or a small preview)
[[self view].layer insertSublayer:playerLayer atIndex:0];
playerLayer.frame = [self view].layer.bounds;
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
And finally play your video:
[mp play];
Export to camera roll:
NSString* exportVideoPath = >>your local path to your exported file<<
UISaveVideoAtPathToSavedPhotosAlbum(exportVideoPath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
And get notified when it's finished (your callback method):
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    // error is nil if saving succeeded
    NSLog(@"Finished saving video with error: %@", error);
    // do post-processing here, e.g. notifications or UI updates
}
Unfortunately I haven't found any "legal" solution to read from the camera roll.
A very good source on getting started is:
http://www.subfurther.com/blog/?cat=51
download VTM_Player.zip, VTM_AVRecPlay.zip, or VTM_AVEditor.zip for a very nice introduction to this.
This is my first question here, so sorry for any mistakes and an unclear description.
I'm developing an app that captures videos in a loop and sends them to a server in the background.
I want to send them as separate files (POST requests) and then join them into one on the server side.
I'm not very experienced with AVFoundation, so I used the AVCam sample project as a base and modified it for my needs.
Here is the problem I need to solve right now:
I'm looking for a way to record short video files of about 2 MB each so I can upload them to the server. The main issue I've got is a gap between the video parts after concatenation, and I don't know how to get these parts without stopping the recorder.
I've tried a few things. First, using a timer: record 20-second fragments, stop on the timer, and begin a new recording cycle.
Second, my idea was to use AVURLAsset (pointing at the video file currently being recorded in the temp directory) and to export the not-yet-saved tail of the video by its duration with AVAssetExportSession. It seems to work: I can record a few videos, but again there are freezes between them; the last 1-2 seconds of each video are just a single frame, so after concatenation it doesn't look like one movie.
- (void)startRecording
{
    self.videoCounter = 0;
    self.videoTimer = [NSTimer scheduledTimerWithTimeInterval:lengthTimer
                                                       target:self
                                                     selector:@selector(endRecordingVideo)
                                                     userInfo:nil
                                                      repeats:YES];
    NSLog(@"timer started");

    if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
    }
    [self removeFile:[[self recorder] outputFileURL]];
    [[self recorder] startRecordingWithOrientation:orientation];
}
- (void)saveVideoPart
{
    // find an unused file name in the temp directory
    NSUInteger count = 0;
    NSString *filePath = nil;
    do {
        NSString *fileName = [NSString stringWithFormat:@"buf-%@-%u", AVAssetExportPresetLowQuality, count];
        filePath = NSTemporaryDirectory();
        filePath = [filePath stringByAppendingPathComponent:fileName];
        filePath = [filePath stringByAppendingPathExtension:@"mov"];
        count++;
    } while ([[NSFileManager defaultManager] fileExistsAtPath:filePath]);

    NSURL *outputURL = [NSURL fileURLWithPath:filePath];
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:[self tempFileURL] options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;

    // export only the part recorded since the last export
    Float64 durationSeconds = CMTimeGetSeconds([videoAsset duration]);
    CMTime start = CMTimeMakeWithSeconds(self.videoCounter, 1);
    CMTime duration = CMTimeMakeWithSeconds(durationSeconds - self.videoCounter, 1);
    self.videoCounter += durationSeconds - self.videoCounter;
    NSLog(@"duration video %f, recorded %f", durationSeconds, self.videoCounter);
    exportSession.timeRange = CMTimeRangeMake(start, duration);

    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch (exportSession.status) {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Exported");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed: %@", exportSession.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Cancelled: %@", exportSession.error);
                break;
            default:
                break;
        }
    }];
}
So I'll be glad for any proposed algorithm, or for a pointer in the right direction. The main thing for me is dividing the video into parts while recording. For example, a video may be about 10 minutes long without pauses; I need to split it into parts of about 2 MB in the background and then upload them.
I am using an MPMediaPickerController to allow the user to select videos and songs from the library on the device. I allow this by initializing the picker with initWithMediaTypes:MPMediaTypeAny. The user can then play the song or video in-app after the export takes place. Here is my movie-exporting code after stripping it down to its core functionality:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    AVAssetExportSession *exportSession;
    NSString *filePath;
    NSURL *fileUrl;
    for (MPMediaItem *item in mediaItemCollection.items) {
        NSURL *assetUrl = [item valueForProperty:MPMediaItemPropertyAssetURL];
        AVAsset *currAsset = [AVAsset assetWithURL:assetUrl];
        exportSession = [[AVAssetExportSession alloc] initWithAsset:currAsset presetName:AVAssetExportPresetHighestQuality];
        exportSession.shouldOptimizeForNetworkUse = YES;
        exportSession.outputFileType = AVFileTypeQuickTimeMovie;
        filePath = [title stringByAppendingString:@".mov"]; // 'title' is defined elsewhere in the original code
        fileUrl = [NSURL fileURLWithPath:[[NSFileManager documentDirectory] stringByAppendingPathComponent:filePath]]; // documentDirectory is a custom helper
        exportSession.outputURL = fileUrl;
        dispatch_group_enter(dispatchGroup);
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            // success
            dispatch_group_leave(dispatchGroup);
        }];
    }
}
Similar code works fine for audio, but for video the video's audio does not play. Most content from iTunes is protected and non-exportable, so I wanted to test with a quick homemade video I shot with my iPhone. I shot the video, dragged it into iTunes (and made it a "music video" so that it shows up properly and can be exported to my phone's library), then synced it to my device for testing.
In the app, the video shows up fine in the Media Picker, and I can export it with no errors that I can see. However, when I play it in-app, it only plays the video and not the audio. Other videos that I import from other sources work fine for playing the video's audio, so I don't 'think' it's the player itself.
Is there something I may be missing here on why the audio would not be coming across from this kind of export from the media picker? Thanks in advance for any assistance on this issue!
Not sure if this is the ideal solution, but the only way we found around this issue was to force the m4v format with the passthrough preset, i.e.:
exportSession = [[AVAssetExportSession alloc] initWithAsset:[AVAsset assetWithURL:assetUrl] presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeAppleM4V;
filePath = [title stringByAppendingString:@".m4v"];
Audio and video seem to work fine for videos imported locally, after making that change.
How can I play a YouTube video in an MPMoviePlayerController on the iPhone while avoiding going into fullscreen mode?
This question has been raised here: MPMoviePlayerController is playing YouTube video? and here: Play Youtube video in MPMoviePlayerController or play RTSP - 3GP link with answers claiming such functionality was impossible.
Yet this app, Deja, has exactly the functionality I would like: a seamless MPMoviePlayerController whose frame I have explicit control over. http://itunes.apple.com/app/deja/id417625158
How is this done!?
1. Add this sample to your project.
2. Instantiate YoutubeStreamPathExtractorTest.
3. Invoke the test method of the YoutubeStreamPathExtractorTest instance.
4. Follow the logs and be happy.
#import "AFHTTPRequestOperationManager.h"
#import <MediaPlayer/MediaPlayer.h>
typedef void (^CallbackBlock)(NSArray* result, NSError* error);
static NSString* const kYouTubeStreamPathPattern = #"\\\"url_encoded_fmt_stream_map\\\\\":.*?url=(.*?)\\\\u0026";
#interface YoutubeStreamPathExtractorTest : NSObject
- (void)test;
- (void)youtubeURLPath:(NSString*)youtubeURLPath extractStreamURLPathsWithCallback:(CallbackBlock)callback;
#end
#implementation YoutubeStreamPathExtractorTest
- (void) test {
NSString* path = #"http://www.youtube.com/watch?v=TEV5DZpAXSw";
[self youtubeURLPath:path extractStreamURLPathsWithCallback:^(NSArray *result, NSError *error) {
if (error){
NSLog(#"extracting error:%#",[error localizedDescription]);
}
for(NSString* streamURLPath in result) {
NSLog(#"streamURLPath:%#",streamURLPath);
/*
NSURL* url = [NSURL URLWithString:streamURLPath];
MPMoviePlayerController* mpMoviePlayerController_ = [[MPMoviePlayerController alloc] initWithContentURL:url];
mpMoviePlayerController_.controlStyle = MPMovieControlStyleDefault;
[mpMoviePlayerController_ play];
*/
}
}];
}
- (void)youtubeURLPath:(NSString*)youtubeURLPath extractStreamURLPathsWithCallback:(CallbackBlock)callback {
__block NSMutableArray* resultArray = [NSMutableArray new];
AFHTTPRequestOperationManager* manager = [[AFHTTPRequestOperationManager alloc] initWithBaseURL:nil];
manager.responseSerializer = [AFHTTPResponseSerializer serializer];
manager.responseSerializer.acceptableContentTypes = [NSSet setWithObjects:#"text/html", nil];
[manager GET:youtubeURLPath
parameters:nil
success:^(AFHTTPRequestOperation* operation, id responseObject) {
NSData* data = (NSData*)responseObject;
NSString* string = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
NSError* error = nil;
NSRegularExpression* expression = [NSRegularExpression regularExpressionWithPattern:kYouTubeStreamPathPattern
options:NSRegularExpressionCaseInsensitive
error:&error];
NSRange range = NSMakeRange(0,[string length]);
NSArray* matches = [expression matchesInString:string options:0 range:range];
for(NSTextCheckingResult* checkingResult in matches) {
if ([checkingResult numberOfRanges]>1){
NSString* resultStr = [string substringWithRange:[checkingResult rangeAtIndex:1]];
//remove extra slashes
[resultArray addObject:[resultStr stringByReplacingOccurrencesOfString:#"\\" withString:#""]];
}
}
if (callback) {
callback(resultArray,error);
}
} failure:^(AFHTTPRequestOperation* operation, NSError* error) {
if (callback) {
callback(resultArray, error);
}
}];
}
#end
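Putting the steps above into code, usage might look like this (a minimal sketch; error handling omitted):
// run the extractor once and watch the console for the extracted stream URLs
YoutubeStreamPathExtractorTest *extractor = [YoutubeStreamPathExtractorTest new];
[extractor test];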
MPMoviePlayerController does not support the playback of YouTube SWF (Flash) video, period.
The app you mention actually plays progressively downloaded files in MP4 format, which YouTube also offers for some of its content. This actually violates Apple's guidelines, as it will (and does) exceed the maximum amount of progressive download per app per timeframe. I am surprised it got through iTunes approval.
Warning: iOS apps submitted for distribution in the App Store must
conform to these requirements. If your app delivers video over
cellular networks, and the video exceeds either 10 minutes duration or
5 MB of data in a five minute period, you are required to use HTTP
Live Streaming. (Progressive download may be used for smaller clips.)
If your app uses HTTP Live Streaming over cellular networks, you are
required to provide at least one stream at 64 Kbps or lower bandwidth
(the low-bandwidth stream may be audio-only or audio with a still
image).
These requirements apply to iOS apps submitted for distribution in the
App Store for use on Apple products. Non-compliant apps may be
rejected or removed, at the discretion of Apple.
So your task boils down to the question of how to get the MP4 URL of a video offered through YouTube. That part is really tricky and nicely solved by Deja. Just use a packet sniffer and you will see that it actually creates a local server that feeds MPMoviePlayerController.
Try this code:
NSString *urlStr = ...; // your URL here
NSURL *url = [NSURL URLWithString:urlStr]; // for a local file, use fileURLWithPath: instead
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
[self.view addSubview:moviePlayer.view];
moviePlayer.view.frame = ...; // set your frame here
[moviePlayer play];
[moviePlayer setFullscreen:NO animated:YES];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(moviePlayBackDidFinish)
                                             name:MPMoviePlayerPlaybackDidFinishNotification
                                           object:nil];
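The selector registered above needs a matching handler; a minimal hypothetical version (what you do on finish is up to you):
- (void)moviePlayBackDidFinish {
    // playback ended: stop observing; tear down the player's view here if desired
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:nil];
}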
I guess it is against YouTube's ToS, but you can use this code here:
https://github.com/larcus94/LBYouTubeView
It is simple to use and works like a charm!
Use a UIWebView and load the YouTube video's HTML embed code:
UIWebView *movie = [[UIWebView alloc] initWithFrame:CGRectMake(0, 0, 320, 460)];
NSString *urlString = @"<paste the HTML embed code here>";
[movie loadHTMLString:urlString baseURL:nil];
[self.view addSubview:movie];
I am getting frames one by one from a video file using AVAssetReader, doing some operation on each frame, and then saving the new frame to a temp file using AVAssetWriter. Now I have the temp file path where all the new frames are being saved one by one.
Is there any way to play the video while frames are still being added to the temp file?
Here is the code to play the video from the temp path (where frames are continuously being added):
- (void)loadAssetFromFile {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[(mMediaReader.mCameraRecorder) tempVideoFilePath]] options:nil];
    NSString *tracksKey = @"tracks";
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:^{
        // Completion handler block.
        dispatch_async(dispatch_get_main_queue(), ^{
            NSError *error = nil;
            AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
            if (status == AVKeyValueStatusLoaded) {
                self.mPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
                [mPlayerItem addObserver:self forKeyPath:@"status"
                                 options:0 context:&ItemStatusContext];
                [[NSNotificationCenter defaultCenter] addObserver:self
                                                         selector:@selector(playerItemDidReachEnd:)
                                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                                           object:mPlayerItem];
                self.mPlayer = [AVPlayer playerWithPlayerItem:mPlayerItem];
                [mPlayerView setPlayer:mPlayer];
                [self play:nil];
            }
            else {
                // You should deal with the error appropriately.
                NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
            }
        });
    }];
}

- (IBAction)play:sender {
    [mPlayer play];
}
And the code inside the completion block never runs.
Dividing the video into multiple sub-videos worked for me.
Instead of saving the full video to one temp path, I divided it into multiple sub-videos and then replaced the AVPlayerItem of the AVPlayer accordingly.
So now the functionality works just like video streaming. :)
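A minimal sketch of that swap, assuming nextPartURL points at the next finished sub-video (the variable name is hypothetical):
// when the current part finishes, queue up the next exported file
- (void)playerItemDidReachEnd:(NSNotification *)notification {
    AVPlayerItem *nextItem = [AVPlayerItem playerItemWithURL:nextPartURL]; // hypothetical URL of the next part
    // re-register the AVPlayerItemDidPlayToEndTimeNotification observer for nextItem as needed
    [self.mPlayer replaceCurrentItemWithPlayerItem:nextItem];
    [self.mPlayer play];
}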
You can also convert each CMSampleBuffer that AVAssetReader returns into a CGImage and then a UIImage, and display that in a UIImageView to render the frames as they are pulled out of the original video file.
There is example code in the AVFoundation Programming Guide that shows how to do this conversion.
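A condensed sketch of that conversion, assuming the reader's output is configured for the kCVPixelFormatType_32BGRA pixel format (adapted from the pattern shown in the guide):
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // lock the pixel buffer so we can read its base address
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // draw the BGRA pixels into a bitmap context and snapshot it as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}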
Is it possible to pick media items using MPMediaPickerController and then load them into an AVAudioPlayer object?
If MPMusicPlayerController doesn't meet your needs, you can copy the audio to your local bundle so you can use AVAudioPlayer.
EDIT
You basically have three options for playing audio from the user's iPod library: MPMusicPlayerController, AVPlayer, and AVAudioPlayer.
Here are examples for MPMusicPlayerController and AVPlayer:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)collection {
    MPMediaItem *item = [[collection items] objectAtIndex:0];
    NSURL *url = [item valueForProperty:MPMediaItemPropertyAssetURL];
    [self dismissModalViewControllerAnimated:YES];

    // Play the item using MPMusicPlayerController
    MPMusicPlayerController *appMusicPlayer = [MPMusicPlayerController applicationMusicPlayer];
    [appMusicPlayer setQueueWithItemCollection:collection];
    [appMusicPlayer play];

    // Or play the item using AVPlayer
    AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL:url];
    AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
    [player play];
}
If you need to use AVAudioPlayer for some reason, or you need access to the audio file's actual audio data, you first have to copy the audio file into your app's directory and work with it there. AVAsset + AVPlayer is the closest analogy to ALAsset if you're used to working with photos and videos.
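A rough sketch of that copy step, assuming the item is exportable and using the AVAssetExportPresetAppleM4A preset (the output file name and the minimal error handling are illustrative only):
// export the iPod library item into the app's Documents directory,
// then hand the local copy to AVAudioPlayer
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                 presetName:AVAssetExportPresetAppleM4A];
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
session.outputURL = [NSURL fileURLWithPath:[docs stringByAppendingPathComponent:@"exported.m4a"]]; // illustrative name
session.outputFileType = AVFileTypeAppleM4A;

[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        // keep a strong reference to the player in real code, or playback stops immediately
        self.avAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:session.outputURL error:nil];
        [self.avAudioPlayer play];
    }
}];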
Just wanted to say that it appears that in iOS 6 and 7, AVAudioPlayer can play file URLs directly from the iPod library without having to copy the audio data into your app directory as Art suggested.
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)collection {
    MPMediaItem *item = [[collection items] objectAtIndex:0];
    NSURL *url = [item valueForProperty:MPMediaItemPropertyAssetURL];

    // Play the item using AVAudioPlayer
    self.avAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
    [self.avAudioPlayer play];
}
I wanted to add (I can't comment on SO yet) that it seems that in iOS 8, and perhaps before, songs stored in iCloud do not have a value for the assetURL property and return nil for [npItem valueForProperty:MPMediaItemPropertyAssetURL].
Here is a sample output:
MPMediaItem *npItem = self.musicPlayerController.nowPlayingItem;
NSLog(@"Song Title: %@\n assetURL: %@\n Cloud Item: %d", npItem.title, [npItem valueForProperty:MPMediaItemPropertyAssetURL], npItem.cloudItem);
// Log Output
Song Title: Time We Had
assetURL: (null)
Cloud Item: 1
Song Title: The Quiet
assetURL: ipod-library://item/item.m4a?id=1529654720874100371
Cloud Item: 0
I think you'll find that [NSBundle mainBundle] is read-only. It's not possible to write files into the app bundle itself, therefore AVAudioPlayer will not work for iPod-library MPMediaItems copied that way!