Play video using AVPlayer - iPhone

I am getting frame buffers one by one from a video file using AVAssetReader, doing some operation on each frame, and then saving the new frame to a temp file using AVAssetWriter. Now I have the temp file path where all the new frames are being saved one by one.
Is there any way to play the video while frames are still being appended to the temp file?
Here is the code to play the video from the temp path (where frames are continuously being added):
- (void)loadAssetFromFile {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[(mMediaReader.mCameraRecorder) tempVideoFilePath]] options:nil];
    NSString *tracksKey = @"tracks";
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:^{
        // Completion handler block.
        dispatch_async(dispatch_get_main_queue(), ^{
            NSError *error = nil;
            AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
            if (status == AVKeyValueStatusLoaded) {
                self.mPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
                [mPlayerItem addObserver:self forKeyPath:@"status"
                                 options:0 context:&ItemStatusContext];
                [[NSNotificationCenter defaultCenter] addObserver:self
                                                         selector:@selector(playerItemDidReachEnd:)
                                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                                           object:mPlayerItem];
                self.mPlayer = [AVPlayer playerWithPlayerItem:mPlayerItem];
                [mPlayerView setPlayer:mPlayer];
                [self play:nil];
            }
            else {
                // You should deal with the error appropriately.
                NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
            }
        });
    }];
}
- (IBAction)play:(id)sender {
    [mPlayer play];
}
And the code inside the block never runs.

Dividing the video into multiple sub-videos worked for me.
Instead of saving the full video to one temp path, I divided it into multiple sub-videos and then replaced the AVPlayerItem of the AVPlayer accordingly.
So now the functionality works the same way as video streaming. :) A minimal sketch of the swap is below.
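The item swap can hang off the end-of-item notification that the code above already registers. This is only a sketch: the segment naming and the pathForSegmentAtIndex: helper (and the mSegmentIndex ivar) are hypothetical, and it assumes the writer finishes each segment file before the previous one stops playing:
- (void)playerItemDidReachEnd:(NSNotification *)notification
{
    // Hypothetical helper returning e.g. segment-<index>.mov under the temp directory.
    NSString *nextPath = [self pathForSegmentAtIndex:++mSegmentIndex];
    if (![[NSFileManager defaultManager] fileExistsAtPath:nextPath]) {
        return; // the writer has not finished the next segment yet
    }
    AVURLAsset *nextAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:nextPath] options:nil];
    AVPlayerItem *nextItem = [AVPlayerItem playerItemWithAsset:nextAsset];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemDidReachEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:nextItem];
    // Swap the finished segment for the next one; playback continues almost seamlessly.
    [mPlayer replaceCurrentItemWithPlayerItem:nextItem];
    [mPlayer play];
}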

You can also convert the CMSampleBuffer that the AVAssetReader returns into a CGImage and then a UIImage, and display that in a UIImageView to render the frames as they are pulled out of the original video file.
There is example code in the AVFoundation Programming Guide that shows how to do this conversion.
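The conversion looks roughly like this (a sketch along the lines of the guide's example, assuming the reader's output is configured for BGRA pixel buffers):
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the raw BGRA pixel data in a bitmap context, then snapshot it as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image; // hand this to a UIImageView on the main thread
}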

Related

How to play video file from asset library using AVPlayerLayer

I'm trying to play an ALAsset MOV file from the asset library. Does anyone use AVPlayerLayer?
If yes, some hints would be great...
NSURL *myMovieURL = [NSURL URLWithString:_vUrl.absoluteString];
AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:myMovieURL options:nil];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:avAsset];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:playerItem];
self.player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
_playerlayer = (AVPlayerLayer *)[self layer];
[_playerlayer addObserver:self forKeyPath:@"readyForDisplay" options:0 context:nil];
[_playerlayer setPlayer:_player];
[_player seekToTime:kCMTimeZero];
You can't get the actual file path from the Assets Library because of sandboxing. However, you have various options to access/play the video file.
1) Query the URL of the asset using the url method of ALAssetRepresentation and pass it to an instance of MPMoviePlayerController to play the video. This URL starts with assets-library:// and is not a file-system URL, but MPMoviePlayerController knows how to handle such a URL (see the sketch below).
2) Get the video contents using getBytes:fromOffset:length:error: of ALAssetRepresentation, either to save the video to your own app sandbox so you can play/edit/share it, or to stream the video contents.
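A minimal sketch of option 1, assuming asset is the ALAsset you got back from the library enumeration:
ALAssetRepresentation *representation = [asset defaultRepresentation];
NSURL *assetURL = [representation url]; // assets-library://asset/asset.MOV?id=...
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:assetURL];
[moviePlayer.view setFrame:self.view.bounds];
[self.view addSubview:moviePlayer.view];
[moviePlayer play];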

player.duration always shows zero in MPMoviePlayerController for video file

I am playing a small video in MPMoviePlayerController using this code:
MPMoviePlayerController *player = [[MPMoviePlayerController alloc]
    initWithContentURL:[NSURL fileURLWithPath:videostr]];
where videostr is the path of that video file.
Now I need to get the length of that video file, for which I am using this code:
length = player.duration;
But it always shows 0.000, even though the video plays fine.
Everywhere I looked while googling, the code to get the video duration is just player.duration.
I also tried some other code:
AVURLAsset *asset = [[[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:videostr]
    options:[NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil]] autorelease];
NSTimeInterval duration;
if (asset)
    duration = CMTimeGetSeconds(asset.duration);
NSLog(@">>>>>>>>>>>>>>>>>>>> %f", duration);
Even so, it shows zero. Can anyone please help me?
Thanks in advance.
You cannot get a useful duration until the content is actually playable.
Register for load state changes:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(MPMoviePlayerLoadStateDidChange:)
                                             name:MPMoviePlayerLoadStateDidChangeNotification
                                           object:nil];
Evaluate the state once you are notified:
- (void)MPMoviePlayerLoadStateDidChange:(NSNotification *)notification
{
    if ((player.loadState & MPMovieLoadStatePlaythroughOK) == MPMovieLoadStatePlaythroughOK)
    {
        NSLog(@"content play length is %g seconds", player.duration);
    }
}
Swift
You can do it in Swift like this:
func getMediaDuration(url: NSURL!) -> Float64 {
    let asset = AVURLAsset(URL: url, options: nil)
    return CMTimeGetSeconds(asset.duration)
}
Use MPMovieDurationAvailableNotification. Register before calling prepareToPlay or play:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(movieDurationAvailable:)
                                             name:MPMovieDurationAvailableNotification
                                           object:nil];
Then read the duration in the callback:
- (void)movieDurationAvailable:(NSNotification *)notification {
    NSLog(@"content play length is %g seconds", moviePlayer.duration);
}
You can fetch the duration from a video file using the code below:
NSURL *videoURL = [NSURL fileURLWithPath:VideoFileURL];
AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
CMTime duration = sourceAsset.duration;
double durationInSeconds = CMTimeGetSeconds(duration);
The durationInSeconds variable then holds the exact duration in seconds. (Note that dividing duration.value by duration.timescale directly would perform integer division and truncate the result.)

iPhone: trying to play video from a file but only get a black screen

I'm trying to download an mp4 file from a server and save it to a file. After it is saved, I try to play it. When the program runs, it pauses while the file is being downloaded, then the screen turns black when I create the MPMoviePlayerController object.
I did the following:
1. Traced through the code and viewed the NSData object to make sure the right amount of data was being loaded.
2. Added an if ([mFile fileExistsAtPath:filename]==true) check to make sure the file exists.
3. Checked all the return values for nil.
I'm out of ideas now!
// set up file name to save it under and play it
NSFileManager *mFile = [NSFileManager defaultManager];
NSString *filename = [NSHomeDirectory() stringByAppendingPathComponent:@"ted10.dat"];
// load video
NSURL *movieUrl = [[NSURL alloc] initWithString:@"http://www.besttechsolutions.biz/projects/golfflix/ted.mp4"];
NSData *data = [NSData dataWithContentsOfURL:movieUrl];
[data writeToURL:movieUrl atomically:YES];
[mFile createFileAtPath:filename contents:data attributes:nil];
// make sure file exists
if ([mFile fileExistsAtPath:filename] == true)
{
    // play it
    NSURL *fileUrl = [NSURL fileURLWithPath:filename];
    mp = [[MPMoviePlayerController alloc] initWithContentURL:fileUrl];
    [mp.view setFrame:self.view.bounds];
    [self.view addSubview:mp.view];
    if ([mp respondsToSelector:@selector(loadState)])
    {
        // Set movie player layout
        [mp setControlStyle:MPMovieControlStyleFullscreen];
        [mp setFullscreen:YES];
        // May help to reduce latency
        [mp prepareToPlay];
        // Register that the load state changed (movie is ready)
    }
    else
    {
    }
    [mp play];
}
Here's your fixed code:
- (IBAction)doMovie:(id)sender
{
    // set up file name to save it under and play it
    NSFileManager *mFile = [NSFileManager defaultManager];
    NSString *filename = [NSHomeDirectory() stringByAppendingPathComponent:@"ted10.mp4"];
    // load video
    NSURL *movieUrl = [[NSURL alloc] initWithString:@"http://www.besttechsolutions.biz/projects/golfflix/ted.mp4"];
    NSError *error = nil;
    NSData *data = [NSData dataWithContentsOfURL:movieUrl];
    if ([data writeToFile:filename options:NSDataWritingAtomic error:&error] != YES)
    {
        NSLog(@"error writing data to file %@", filename);
    } else {
        // make sure file exists
        if ([mFile fileExistsAtPath:filename] == true)
        {
            // play it
            NSURL *fileUrl = [NSURL fileURLWithPath:filename];
            MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] initWithContentURL:fileUrl];
            [mp.view setFrame:self.view.bounds];
            [self.view addSubview:mp.view];
            // Set movie player layout
            [mp setControlStyle:MPMovieControlStyleFullscreen];
            [mp setFullscreen:YES];
            // May help to reduce latency
            [mp prepareToPlay];
            [mp play];
        }
    }
}
1) Most importantly, you were originally trying to write your movie back to the remote URL. I think what you really wanted to do was write it to a local file (or cache).
2) The file extension (.mp4) gives the movie player a hint about what kind of file it is playing. ".dat" is too generic.
3) Speaking of writing to a local file, don't put it in the home directory. Find the Caches directory, or somewhere else that's more appropriate (see the sketch below).
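For example, a small sketch of building the path in the Caches directory (the ted10.mp4 name just mirrors the code above):
// locate the app's Caches directory and append the movie file name
NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filename = [cachesDir stringByAppendingPathComponent:@"ted10.mp4"];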

AVAsset and AVAssetTrack - Track Management in iOS 4.0

The new features list of iOS 4.0 says that the AV Foundation framework has media asset management, track management, media editing, and metadata management for media items. What do they mean by this?
Using track management and media asset management, can I access media files from the Photos app?
Can I make my own custom compositions using AVComposition and export and send them to a server?
Can I rename, move, or edit the metadata information of an asset?
I tried to find some help/documentation on this and couldn't find anything.
Thanks,
Tony
Yes, you can do most of the stuff you mentioned.
I think it's not that simple to access the media files on your phone, but you can read data from the network and export to your camera roll if you like.
First you have to import your videos or audio files.
What you need to get started is your own video player, which you create in your own view.
If you don't want to play your videos but simply compose your stuff, you can go without a view.
This is very easy:
1. Create a mutable composition:
AVMutableComposition *composition = [AVMutableComposition composition];
This will hold your videos. Now you have an empty composition asset.
2. Add some files from your directory or from the web:
NSURL *sourceMovieURL = [NSURL fileURLWithPath:moviePath];
AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:sourceMovieURL options:nil];
3. Then add your video to your composition:
// calculate the time range
CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), CMTimeMake(sourceAsset.duration.value, sourceAsset.duration.timescale));
// and insert it into your composition
NSError *editError = nil;
BOOL result = [composition insertTimeRange:editRange ofAsset:sourceAsset atTime:composition.duration error:&editError];
If you would like to add more videos to your composition, you can add further assets and insert them again using their time ranges, as in the sketch below.
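For instance, appending a second clip is just the same call again; secondAsset here is a hypothetical second AVURLAsset loaded the same way as sourceAsset:
CMTimeRange secondRange = CMTimeRangeMake(kCMTimeZero, secondAsset.duration);
[composition insertTimeRange:secondRange ofAsset:secondAsset atTime:composition.duration error:&editError];
// composition.duration has grown, so each insert lands after the previous clip.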
Now you can EXPORT your new composition using code like this:
NSError *exportError = nil;
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
NSURL *exportURL = [NSURL fileURLWithPath:exportVideoPath];
exportSession.outputURL = exportURL;
exportSession.outputFileType = @"com.apple.quicktime-movie";
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status) {
        case AVAssetExportSessionStatusFailed: {
            NSLog(@"FAIL");
            [self performSelectorOnMainThread:@selector(doPostExportFailed:)
                                   withObject:nil
                                waitUntilDone:NO];
            break;
        }
        case AVAssetExportSessionStatusCompleted: {
            NSLog(@"SUCCESS");
            [self performSelectorOnMainThread:@selector(doPostExportSuccess:)
                                   withObject:nil
                                waitUntilDone:NO];
            break;
        }
    }
}];
If you want to PLAY your videos, use code like this (I assume you have access to your view):
// create an AVPlayer with your composition
AVPlayer *mp = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:composition]];
// add the player to your user interface: create a player layer ...
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:mp];
// ... and integrate it into your view; here you can customize your player (fullscreen, or a small preview)
[[self view].layer insertSublayer:playerLayer atIndex:0];
playerLayer.frame = [self view].layer.bounds;
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
And finally play your video:
[mp play];
Export to camera roll:
NSString *exportVideoPath = >>your local path to your exported file<<
UISaveVideoAtPathToSavedPhotosAlbum(exportVideoPath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
And get notified when it's finished (your callback method):
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    // error is nil if saving succeeded
    NSLog(@"Finished saving video with error: %@", error);
    // do postprocessing here, i.e. notifications or UI stuff
}
Unfortunately I haven't found any "legal" solution for reading from the camera roll.
A very good source for getting started is:
http://www.subfurther.com/blog/?cat=51
Download VTM_Player.zip, VTM_AVRecPlay.zip, or VTM_AVEditor.zip for a very nice introduction to this.