Having issue playing video using AVFoundation framework - iPhone

I am getting frame buffers one by one from a video file using AVAssetReader, doing some operation on each frame, and then saving the new frame to a temp file using AVAssetWriter. So now I have the temp file path where the new frames are being saved one by one. Is there any way to play the video while frames are still being added to the temp file?
Here is the code to play the video from the temp path (where frames are continuously being added):
- (void)loadAssetFromFile {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[(mMediaReader.mCameraRecorder) tempVideoFilePath]] options:nil];
    NSString *tracksKey = @"tracks";
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
    ^{
        // Completion handler block.
        dispatch_async(dispatch_get_main_queue(),
        ^{
            NSError *error = nil;
            AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
            if (status == AVKeyValueStatusLoaded) {
                self.mPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
                [mPlayerItem addObserver:self forKeyPath:@"status"
                                 options:0 context:&ItemStatusContext];
                [[NSNotificationCenter defaultCenter] addObserver:self
                                                         selector:@selector(playerItemDidReachEnd:)
                                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                                           object:mPlayerItem];
                self.mPlayer = [AVPlayer playerWithPlayerItem:mPlayerItem];
                [mPlayerView setPlayer:mPlayer];
                [self play:nil];
            }
            else {
                // You should deal with the error appropriately.
                NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
            }
        });
    }];
}
- (IBAction)play:sender {
    [mPlayer play];
}
And the code inside the block never runs.

You can convert the CMSampleBuffer that the AVAssetReader returns into a CGImage and then a UIImage and display that in a UIImageView, to render the frames as they are pulled out of the original video file.
There is example code inside the AVFoundation Programming Guide that shows how to do this conversion.
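For reference, a minimal sketch of that conversion along the lines of what the guide shows (this assumes the asset reader's output settings request kCVPixelFormatType_32BGRA pixel buffers; the method name is only illustrative):
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and snapshot it as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;   // caller can display this in a UIImageView
}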

Related

How to play video file from asset library using AVPlayerLayer

I'm trying to play an ALAsset MOV file from the asset library. Does anyone use AVPlayerLayer?
If yes, some hints would be great...
NSURL *myMovieURL = [NSURL URLWithString:_vUrl.absoluteString];
AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:myMovieURL options:nil];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:avAsset];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:playerItem];
self.player = [[AVPlayer alloc]initWithPlayerItem:playerItem];
[_playerlayer addObserver:self forKeyPath:@"readyForDisplay" options:0 context:nil];
((AVPlayerLayer *)[self layer]).bounds = ((AVPlayerLayer *)[self layer]).bounds;
[(AVPlayerLayer *)[self layer] setPlayer:_player];
_playerlayer = (AVPlayerLayer *)[self layer];
[_player seekToTime:kCMTimeZero];
You can't get the actual file-path from the AssetsLibrary because of sandboxing. However, you have various options to access/play the video file.
1) Query the URL of the asset using the url method of ALAssetRepresentation and pass it to an instance of MPMoviePlayerController to play the video. This URL starts with assets-library:// and is not a file-system URL, but MPMoviePlayerController knows how to handle such a URL.
2) Get the video contents using getBytes:fromOffset:length:error: of ALAssetRepresentation to save the video into your own app sandbox so you can play/edit/share it, or use getBytes:fromOffset:length:error: to stream the video contents.
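For illustration, a rough sketch of both options (myAsset, the chunk size, and the output file name are placeholders, not anything from the question):
// Option 1: hand the assets-library:// URL straight to MPMoviePlayerController.
NSURL *assetURL = [[myAsset defaultRepresentation] url];
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:assetURL];
moviePlayer.view.frame = self.view.bounds;
[self.view addSubview:moviePlayer.view];
[moviePlayer play];

// Option 2: copy the asset's bytes into the sandbox, then play/edit/share the local file.
ALAssetRepresentation *rep = [myAsset defaultRepresentation];
NSString *localPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"copy.mov"];
[[NSFileManager defaultManager] createFileAtPath:localPath contents:nil attributes:nil];
NSFileHandle *handle = [NSFileHandle fileHandleForWritingAtPath:localPath];

long long offset = 0;
NSUInteger chunkSize = 1024 * 1024;
uint8_t *buffer = malloc(chunkSize);
NSError *error = nil;
while (offset < rep.size) {
    NSUInteger bytesRead = [rep getBytes:buffer fromOffset:offset length:chunkSize error:&error];
    if (bytesRead == 0) break;   // stop on error or end of data
    [handle writeData:[NSData dataWithBytesNoCopy:buffer length:bytesRead freeWhenDone:NO]];
    offset += bytesRead;
}
free(buffer);
[handle closeFile];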

How to play another video after finishing one, when the video names are selected by the user in a UIPickerView

My objective is to put video files in a UIPickerView. I have added my video file names to an array and I am successfully extracting the video file name and extension. My target is this: when the user taps a row of the picker, the selected video from the array is passed to an MPMoviePlayerController object, which then plays the movie. Please let me know how I can play another selected file after one file has finished, i.e. in the MPMoviePlayerPlaybackDidFinishNotification handler.
My code to play a single video file is below.
Just let me know how I can play another video from the array:
-(IBAction)play:(id)sender
{
for (i = 0; i < [arrayofvideo count]; i++) {
NSLog(@"%d", i);
NSString *fullFileName = [NSString stringWithFormat:@"%@", [arrayofvideo objectAtIndex:i]];
NSString *filename = [fullFileName stringByDeletingPathExtension];
NSLog(@"%@", filename);
NSString *extension = [fullFileName pathExtension];
NSLog(@"%@", extension);
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
pathForResource:@"filename"
ofType:@"extension"]];
movieplayer = [[MPMoviePlayerController alloc]initWithContentURL:url];
movieplayer.movieSourceType = MPMovieSourceTypeFile;
movieplayer.controlStyle = MPMovieControlStyleDefault;
movieplayer.view.frame = CGRectMake(0,0,300,400);
[self.view addSubview:movieplayer.view];
[[NSNotificationCenter defaultCenter]
addObserver:self
selector:@selector(movieFinishedCallback:)
name:MPMoviePlayerPlaybackDidFinishNotification
object:movieplayer];
//---play movie---
[movieplayer play];
}
}
-(void)movieFinishedCallback:(NSNotification*) aNotification
{
MPMoviePlayerController *player = [aNotification object];
[[NSNotificationCenter defaultCenter]
removeObserver:self
name:MPMoviePlayerPlaybackDidFinishNotification
object:player];
[player stop];
// self.view removeFromSuperView;
[self.view removeFromSuperview];
//call the play button again
// pls tell me what to write here to call play function
}
You need to restructure your code.
The code you posted builds a file name from your array of video names, but then passes the fixed literals @"filename" and @"extension" to pathForResource:ofType:. That's not right.
You should write a method "playFileAtIndex" that finds a filename at a specific index and plays that file. I would suggest adding a BOOL parameter playMultipleVideos to the method. If the flag is true, the method would set up a movieFinishedCallback to increment the index and start the next movie playing.
Then make your button's IBAction method set an index ivar to the selected row and invoke the playFileAtIndex method.
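A rough sketch of that structure (the ivars mCurrentIndex and mPlayMultiple and the picker outlet mPickerView are assumed names, not anything from the question's code):
- (void)playFileAtIndex:(NSUInteger)index playMultipleVideos:(BOOL)playMultiple
{
    mCurrentIndex = index;         // assumed NSUInteger ivar
    mPlayMultiple = playMultiple;  // assumed BOOL ivar

    NSString *fullFileName = [arrayofvideo objectAtIndex:index];
    NSString *name = [fullFileName stringByDeletingPathExtension];
    NSString *extension = [fullFileName pathExtension];
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:name ofType:extension]];

    movieplayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
    movieplayer.view.frame = CGRectMake(0, 0, 300, 400);
    [self.view addSubview:movieplayer.view];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(movieFinishedCallback:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:movieplayer];
    [movieplayer play];
}

- (IBAction)play:(id)sender
{
    NSInteger row = [mPickerView selectedRowInComponent:0];   // assumed UIPickerView outlet
    [self playFileAtIndex:row playMultipleVideos:YES];
}

- (void)movieFinishedCallback:(NSNotification *)aNotification
{
    MPMoviePlayerController *player = [aNotification object];
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:player];
    [player.view removeFromSuperview];

    // Advance to the next file in the array, if requested and available.
    if (mPlayMultiple && mCurrentIndex + 1 < [arrayofvideo count]) {
        [self playFileAtIndex:mCurrentIndex + 1 playMultipleVideos:YES];
    }
}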

player.duration always shows zero in MPMoviePlayerController for video file

I am playing a small video in MPMoviePlayerController using this code:
MPMoviePlayerController *player = [[MPMoviePlayerController alloc]
initWithContentURL:[NSURL fileURLWithPath:videostr]];
where videostr is the path of the video file.
Now I need to get the length of that video file, and for that I am using this code:
length = player.duration;
But it always shows 0.000, even though the video plays fine. Everywhere I have checked by googling, the code to get the video duration is just player.duration.
I also tried some other code:
AVURLAsset *asset = [[[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:videostr]
                                             options:[NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil]] autorelease];
NSTimeInterval duration;
if (asset)
    duration = CMTimeGetSeconds(asset.duration);
NSLog(@">>>>>>>>>>>>>>>>>>>> %f", duration);
Even then it shows zero. Can anyone please help me?
Thanks in advance.
You cannot get a useful duration until the content is actually playable.
Register for load state changes:
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(MPMoviePlayerLoadStateDidChange:)
name:MPMoviePlayerLoadStateDidChangeNotification
object:nil];
Evaluate the state once being notified:
- (void)MPMoviePlayerLoadStateDidChange:(NSNotification *)notification
{
if ((player.loadState & MPMovieLoadStatePlaythroughOK) == MPMovieLoadStatePlaythroughOK)
{
NSLog(#"content play length is %g seconds", player.duration);
}
}
Swift
You can do it in Swift like this:
func getMediaDuration(url: NSURL!) -> Float64 {
    let asset = AVURLAsset(URL: url, options: nil)
    let duration: CMTime = asset.duration
    return CMTimeGetSeconds(duration)
}
Use MPMovieDurationAvailableNotification:
[[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(movieDurationAvailable:)
                                              name:MPMovieDurationAvailableNotification
                                            object:nil];
- (void)movieDurationAvailable:(NSNotification *)notification {
NSLog(#"content play length is %g seconds", moviePlayer.duration);
}
You can fetch the duration of a video file using the code below.
NSURL *videoURL = [NSURL fileURLWithPath:VideoFileURL];
AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
CMTime duration = sourceAsset.duration;
double durationInSeconds = (double)duration.value / duration.timescale;
The durationInSeconds variable then holds the exact duration in seconds.

Play video using AVPlayer

I am getting frame buffers one by one from a video file using AVAssetReader, doing some operation on each frame, and then saving the new frame to a temp file using AVAssetWriter. So now I have the temp file path where the new frames are being saved one by one.
Is there any way to play the video while frames are still being added to the temp file?
Here is the code to play the video from the temp path (where frames are continuously being added):
- (void)loadAssetFromFile {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[(mMediaReader.mCameraRecorder) tempVideoFilePath]] options:nil];
    NSString *tracksKey = @"tracks";
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
    ^{
        // Completion handler block.
        dispatch_async(dispatch_get_main_queue(),
        ^{
            NSError *error = nil;
            AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
            if (status == AVKeyValueStatusLoaded) {
                self.mPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
                [mPlayerItem addObserver:self forKeyPath:@"status"
                                 options:0 context:&ItemStatusContext];
                [[NSNotificationCenter defaultCenter] addObserver:self
                                                         selector:@selector(playerItemDidReachEnd:)
                                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                                           object:mPlayerItem];
                self.mPlayer = [AVPlayer playerWithPlayerItem:mPlayerItem];
                [mPlayerView setPlayer:mPlayer];
                [self play:nil];
            }
            else {
                // You should deal with the error appropriately.
                NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
            }
        });
    }];
}
- (IBAction)play:sender {
    [mPlayer play];
}
And the code inside the block never runs.
Dividing the video into multiple sub-videos worked for me.
Instead of saving the full video to one temp path, I divided the video into multiple sub-videos and then replaced the AVPlayerItem of the AVPlayer accordingly.
So now the functionality works just like video streaming. :)
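The swap is presumably done with AVPlayer's replaceCurrentItemWithPlayerItem:. A minimal sketch, assuming nextSegmentPath (a placeholder name) points at the next finished sub-video:
// When the writer has finished the next sub-video, swap it into the existing player.
NSURL *nextURL = [NSURL fileURLWithPath:nextSegmentPath];
AVPlayerItem *nextItem = [AVPlayerItem playerItemWithURL:nextURL];
[mPlayer replaceCurrentItemWithPlayerItem:nextItem];
[mPlayer play];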
You can also convert the CMSampleBuffer that the AVAssetReader returns into a CGImage and then a UIImage and display that in a UIImageView, to render the frames as they are pulled out of the original video file.
There is example code inside the AVFoundation Programming Guide that shows how to do this conversion.

iPhone: trying to play video from a file but only get a black screen

I'm trying to download an mp4 file from a server and then save it to a file. After it is saved, I try to play it. When the program runs, it pauses while the file is being downloaded, then the screen turns black when I create the MPMoviePlayerController object.
I did the following:
1. Traced through the code and viewed the NSData object to make sure the right amount of data was being loaded.
2. Added an if ([mFile fileExistsAtPath:filename] == true) check to make sure the file exists.
3. Checked all the return values for nil.
I'm out of ideas now!
// set up file name to save it under and play it
NSFileManager *mFile= [NSFileManager defaultManager];
NSString *filename=[ NSHomeDirectory() stringByAppendingPathComponent:@"ted10.dat"];
// load video
NSURL *movieUrl=[[NSURL alloc] initWithString:@"http://www.besttechsolutions.biz/projects/golfflix/ted.mp4"];
NSData *data = [NSData dataWithContentsOfURL:movieUrl];
[data writeToURL:movieUrl atomically:YES];
[mFile createFileAtPath:filename contents: data attributes:nil];
// makse sure file exist
if ( [mFile fileExistsAtPath:filename]==true )
{
// play it
NSURL *fileUrl=[ NSURL fileURLWithPath:filename];
mp=[[MPMoviePlayerController alloc] initWithContentURL: fileUrl];
[mp.view setFrame: self.view.bounds];
[self.view addSubview: mp.view];
if ([mp respondsToSelector:@selector(loadState)])
{
// Set movie player layout
[mp setControlStyle:MPMovieControlStyleFullscreen];
[mp setFullscreen:YES];
// May help to reduce latency
[mp prepareToPlay];
// Register that the load state changed (movie is ready)
}
else
{
}
[mp play];
}
Here's your fixed code:
- (IBAction) doMovie: (id) sender
{
// set up file name to save it under and play it
NSFileManager *mFile= [NSFileManager defaultManager];
NSString *filename=[ NSHomeDirectory() stringByAppendingPathComponent:@"ted10.mp4"];
// load video
NSURL *movieUrl=[[NSURL alloc] initWithString:@"http://www.besttechsolutions.biz/projects/golfflix/ted.mp4"];
NSError *error = NULL;
NSData *data = [NSData dataWithContentsOfURL:movieUrl];
if( [data writeToFile:filename options: NSDataWritingAtomic error: &error] != YES)
{
NSLog( #"error writing data to file %#", filename );
} else {
// make sure file exist
if ( [mFile fileExistsAtPath:filename]==true )
{
// play it
NSURL *fileUrl=[ NSURL fileURLWithPath:filename];
MPMoviePlayerController * mp=[[MPMoviePlayerController alloc] initWithContentURL: fileUrl];
[mp.view setFrame: self.view.bounds];
[self.view addSubview: mp.view];
// Set movie player layout
[mp setControlStyle:MPMovieControlStyleFullscreen];
[mp setFullscreen:YES];
// May help to reduce latency
[mp prepareToPlay];
[mp play];
}
}
}
1)
Most importantly, you were originally trying to write your movie back to the remote URL. I think what you really wanted to do was write to a local file (or cache).
2)
The file extension (.mp4) gives a hint to the movie player what kind of file is playing. ".dat" is too generic.
3)
Speaking of writing to the local file, don't put it in the home directory. Find the cache directory or somewhere else that's more appropriate.
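For instance, a small sketch of pointing the download at the Caches directory instead (same file name as in the answer above, just a different location):
// Build a path inside the app's Caches directory rather than the home directory.
NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filename = [cachesDir stringByAppendingPathComponent:@"ted10.mp4"];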