I am playing a small video with MPMoviePlayerController using this code:
MPMoviePlayerController *player = [[MPMoviePlayerController alloc]
initWithContentURL:[NSURL fileURLWithPath:videostr]];
where videostr is the path of the video file.
Now I need to get the length of that video, so I use this code:
length = player.duration;
But it always returns 0.000, even though the video plays fine.
Everything I found by googling says player.duration is the way to get the duration.
I also tried this code:
AVURLAsset *asset = [[[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:videostr]
options:[NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil]] autorelease];
NSTimeInterval duration = 0.0;
if (asset)
duration = CMTimeGetSeconds(asset.duration);
NSLog(@">>>>>>>>>>>>>>>>>>>> %f", duration);
It still shows zero. Can anyone please help me?
Thanks in advance.
You cannot get a useful duration until the content is actually playable.
Register for load state changes:
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(MPMoviePlayerLoadStateDidChange:)
name:MPMoviePlayerLoadStateDidChangeNotification
object:nil];
Evaluate the state once being notified:
- (void)MPMoviePlayerLoadStateDidChange:(NSNotification *)notification
{
if ((player.loadState & MPMovieLoadStatePlaythroughOK) == MPMovieLoadStatePlaythroughOK)
{
NSLog(@"content play length is %g seconds", player.duration);
}
}
SWIFT
You can do it in Swift like this:
func getMediaDuration(url: URL) -> Float64 {
    let asset = AVURLAsset(url: url)
    return CMTimeGetSeconds(asset.duration)
}
Use MPMovieDurationAvailableNotification:
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(movieDurationAvailable:)
name:MPMovieDurationAvailableNotification
object:nil];
- (void)movieDurationAvailable:(NSNotification *)notification {
NSLog(@"content play length is %g seconds", moviePlayer.duration);
}
You can fetch the duration from a video file using the code below.
NSURL *videoURL = [NSURL fileURLWithPath:VideoFileURL];
AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
CMTime duration = sourceAsset.duration;
double durationInSeconds = (double)duration.value / (double)duration.timescale;
The durationInSeconds variable then holds the duration in seconds. Note the casts: duration.value and duration.timescale are integers, so dividing them directly would truncate the fractional part (or just use CMTimeGetSeconds(duration)).
Related
I'm trying to play an ALAsset MOV file from the asset library. Does anyone use AVPlayerLayer?
If so, some hints would be great...
NSURL *myMovieURL = [NSURL URLWithString:_vUrl.absoluteString];
AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:myMovieURL options:nil];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:avAsset];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:playerItem];
self.player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
_playerlayer = (AVPlayerLayer *)[self layer];
[_playerlayer addObserver:self forKeyPath:@"readyForDisplay" options:0 context:nil];
[(AVPlayerLayer *)[self layer] setPlayer:_player];
[_player seekToTime:kCMTimeZero];
You can't get the actual file path from the AssetsLibrary because of sandboxing. However, you have various options to access/play the video file.
1) Query the URL of the asset using the url method of ALAssetRepresentation and pass it to an instance of MPMoviePlayerController to play the video. This URL starts with assets-library:// and is not a file-system URL, but MPMoviePlayerController knows how to handle such a URL.
2) Get the video contents using getBytes:fromOffset:length:error: of ALAssetRepresentation to save the video into your own app sandbox so you can play/edit/share it, or use getBytes:fromOffset:length:error: to stream the video contents.
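Option 1 can be sketched like this (assuming `asset` is the ALAsset returned by the library; `ALAssetRepresentation`, its `url` method, and `MPMoviePlayerController` are the real APIs, but the surrounding code is only illustrative):

```objc
// Option 1: play the asset directly via its assets-library:// URL.
ALAssetRepresentation *rep = [asset defaultRepresentation];
NSURL *assetURL = [rep url]; // not a file-system URL, but MPMoviePlayerController accepts it
MPMoviePlayerController *assetPlayer =
    [[MPMoviePlayerController alloc] initWithContentURL:assetURL];
assetPlayer.view.frame = self.view.bounds;
[self.view addSubview:assetPlayer.view];
[assetPlayer play];
```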
My objective is to put video files in a UIPickerView. I have added my video file names to an array and can successfully extract each file name and extension. Now, when the user taps a row of the picker, the selected video from the array should be passed to an MPMoviePlayerController object, which will play the movie. Please let me know how I can play the next selected file after one file finishes, i.e. in MPMoviePlayerPlaybackDidFinishNotification.
My code to play a single video file is below.
Just let me know how I can play another video from the array:
-(IBAction)play:(id)sender
{
for (i = 0; i < [arrayofvideo count]; i++) {
NSLog(@"%d", i);
NSString *fullFileName = [NSString stringWithFormat:@"%@", [arrayofvideo objectAtIndex:i]];
NSString *filename = [fullFileName stringByDeletingPathExtension];
NSLog(@"%@", filename);
NSString *extension = [fullFileName pathExtension];
NSLog(@"%@", extension);
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
pathForResource:@"filename"
ofType:@"extension"]];
movieplayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
movieplayer.movieSourceType = MPMovieSourceTypeFile;
movieplayer.controlStyle = MPMovieControlStyleDefault;
movieplayer.view.frame = CGRectMake(0, 0, 300, 400);
[self.view addSubview:movieplayer.view];
[[NSNotificationCenter defaultCenter]
addObserver:self
selector:@selector(movieFinishedCallback:)
name:MPMoviePlayerPlaybackDidFinishNotification
object:movieplayer];
//---play movie---
[movieplayer play];
}
}
-(void)movieFinishedCallback:(NSNotification*) aNotification
{
MPMoviePlayerController *player = [aNotification object];
[[NSNotificationCenter defaultCenter]
removeObserver:self
name:MPMoviePlayerPlaybackDidFinishNotification
object:player];
[player stop];
// self.view removeFromSuperView;
[self.view removeFromSuperview];
//call the play button again
// pls tell me what to write here to call play function
}
You need to restructure your code.
The code you posted builds a pathname from your array of video pathnames, and then uses the fixed literal strings @"filename" and @"extension" instead of the filename and extension variables. That's not right.
You should write a playFileAtIndex: method that finds the filename at a specific index and plays that file. I would suggest adding a BOOL parameter playMultipleVideos to the method. If the flag is true, the movieFinishedCallback would increment the index and start the next movie playing.
Then make your button IBAction method set an index ivar to the selected row and invoke the playFileAtIndex: method.
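A minimal sketch of that structure (the method name playFileAtIndex: and the ivars currentIndex and playMultipleVideos are hypothetical names for this example, not existing API):

```objc
- (void)playFileAtIndex:(NSUInteger)index {
    NSString *fullFileName = [arrayofvideo objectAtIndex:index];
    NSString *filename = [fullFileName stringByDeletingPathExtension];
    NSString *extension = [fullFileName pathExtension];
    // Use the variables here, not the literal strings @"filename"/@"extension".
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                                         pathForResource:filename
                                         ofType:extension]];
    movieplayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(movieFinishedCallback:)
                                                 name:MPMoviePlayerPlaybackDidFinishNotification
                                               object:movieplayer];
    [self.view addSubview:movieplayer.view];
    [movieplayer play];
}

- (void)movieFinishedCallback:(NSNotification *)notification {
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:[notification object]];
    currentIndex++;
    if (playMultipleVideos && currentIndex < [arrayofvideo count]) {
        [self playFileAtIndex:currentIndex]; // play the next file in the array
    }
}
```

The IBAction then only sets currentIndex from the picker's selected row and calls [self playFileAtIndex:currentIndex].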
I'm porting an app that works with AAC audio files to iOS 6, and I've found a strange behavior: when I try to get the duration of a (valid) AAC audio file, it always returns 0. On iOS 4 and iOS 5 it works fine.
Is there any bug in the AVAudioPlayer class that affects the duration property? I have read about some trouble with the currentTime property.
Here's the code:
NSURL* urlFichero = [NSURL fileURLWithPath:rutaFichero];
avaPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:urlFichero error:nil];
segundos = avaPlayer.duration;
NSLog(@"[ControladorFicheros] File: '%@' Duration: '%f'", nombreFichero, segundos);
[avaPlayer stop];
[avaPlayer release];
Thanks ;)
Finally, the problem is that in newer versions of the API, AVAudioPlayer appears to return the correct duration of a file only once it is ready to play. That is why my approach was wrong. The correct way to get the duration of a file (in seconds) without playing it is:
AVURLAsset *asset = [[[AVURLAsset alloc] initWithURL:anURI_ToResource
options:[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES],
AVURLAssetPreferPreciseDurationAndTimingKey,
nil]] autorelease];
NSTimeInterval durationInSeconds = 0.0;
if (asset)
durationInSeconds = CMTimeGetSeconds(asset.duration) ;
Swift
let asset = AVURLAsset(url: url, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
let durationInSeconds = CMTimeGetSeconds(asset.duration)
I noticed the same problem. My solution was to use MPMoviePlayerController instead:
MPMoviePlayerController *testPlayer = [[MPMoviePlayerController alloc] initWithContentURL:filePath];
[testPlayer prepareToPlay];
[testPlayer play];
I am getting frame buffers one by one from a video file using AVAssetReader, doing some operations on each frame, and then saving the new frames to a temp file using AVAssetWriter. Now I have the temp file path where the new frames are being saved one by one.
Is there any way to play the video while frames are still being added to the temp file?
Here is the code to play the video from the temp path (where frames are continuously being added):
- (void)loadAssetFromFile {
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[(mMediaReader.mCameraRecorder) tempVideoFilePath ]] options:nil];
NSString *tracksKey = @"tracks";
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
^{
// Completion handler block.
dispatch_async(dispatch_get_main_queue(),
^{
NSError *error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
if (status == AVKeyValueStatusLoaded) {
self.mPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
[mPlayerItem addObserver:self forKeyPath:@"status"
options:0 context:&ItemStatusContext];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:mPlayerItem];
self.mPlayer = [AVPlayer playerWithPlayerItem:mPlayerItem];
[mPlayerView setPlayer:mPlayer];
[self play:nil];
}
else {
// You should deal with the error appropriately.
NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
}
});
}];
}
- (IBAction)play:sender {
[mPlayer play];
}
And the code inside the block never runs.
Dividing the video into multiple sub-videos worked for me.
Instead of saving the full video to one temp path, I divided the video into multiple sub-videos and then replaced the AVPlayerItem of the AVPlayer accordingly.
So now the functionality works the same as video streaming. :)
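That item swap can be sketched like this (replaceCurrentItemWithPlayerItem: is the real AVPlayer API; the names playNextChunk, chunkDidFinish:, chunkPaths, and currentChunk are hypothetical, invented for this example):

```objc
// Play sub-videos (chunks) back to back by swapping the AVPlayerItem.
- (void)playNextChunk {
    if (self.currentChunk >= [self.chunkPaths count]) return; // no more chunks yet
    NSURL *chunkURL = [NSURL fileURLWithPath:[self.chunkPaths objectAtIndex:self.currentChunk]];
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:chunkURL];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(chunkDidFinish:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:item];
    [self.mPlayer replaceCurrentItemWithPlayerItem:item]; // swap in the next chunk
    [self.mPlayer play];
}

- (void)chunkDidFinish:(NSNotification *)notification {
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:AVPlayerItemDidPlayToEndTimeNotification
                                                  object:[notification object]];
    self.currentChunk++;
    [self playNextChunk]; // advance to the next sub-video, if one has been written
}
```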
You can also convert the CMSampleBuffer that the AVAssetReader returns into a CGImage and then a UIImage and display that in a UIImageView, to render the frames as they are pulled out of the original video file.
There is example code inside the AVFoundation Programming Guide that shows how to do this conversion.