I have an AVMutableComposition that I've associated with an AVPlayer.
It seems that if I add tracks to this composition, the AVPlayer doesn't play the added tracks unless I set the composition on the player again. Right now I'm doing this by replacing the current item with the same composition that is already set; after that, the player picks up the new tracks.
I haven't been able to figure out whether this is the correct way of doing it. It seems a bit strange to have to do it like that.
Any ideas?
Thanks
Basically, although it's a mutable composition, Apple recommends creating an immutable snapshot and building an AVPlayerItem from it for playback purposes. It's a bit annoying, but you can work around it: when you want to refresh the video, just sample the current time, create another AVPlayerItem, and seek it to the right time.
Here's a snippet from Apple's AVMutableComposition documentation:
AVMutableComposition is a mutable subclass of AVComposition you use
when you want to create a new composition from existing assets. You
can add and remove tracks, and you can add, remove, and scale time
ranges.
You can make an immutable snapshot of a mutable composition for
playback or inspection as follows:
AVMutableComposition *myMutableComposition =
    <#a mutable composition you want to inspect or play in its current state#>;
AVComposition *immutableSnapshotOfMyComposition = [myMutableComposition copy];

// Create a player to inspect and play the composition.
AVPlayerItem *playerItemForSnapshottedComposition =
    [[AVPlayerItem alloc] initWithAsset:immutableSnapshotOfMyComposition];
Further information: Full Document
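Putting that together, here's a minimal Swift sketch of the refresh approach, assuming player and mutableComposition already exist in your code (the function name is my own):

import AVFoundation

func refreshPlayer(_ player: AVPlayer, with mutableComposition: AVMutableComposition) {
    // Remember where playback currently is.
    let currentTime = player.currentTime()

    // Snapshot the mutable composition into an immutable AVComposition.
    let snapshot = mutableComposition.copy() as! AVComposition

    // Build a fresh player item from the snapshot and swap it in.
    player.replaceCurrentItem(with: AVPlayerItem(asset: snapshot))

    // Jump back to where we were so the swap is seamless.
    player.seek(to: currentTime, toleranceBefore: .zero, toleranceAfter: .zero)
}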
If you just removed a track from the mutable composition that is associated with the player item, you can call the seek method on the AVPlayer object with the current time and a completion handler to force the player to clear its loaded buffer.
// refresh player
player.seek(to: playerItem.currentTime()) { _ in }
I am playing a live audio stream using AVPlayer and AVPlayerItem and trying to determine the current bit rate of the stream. I searched the net and found this helpful thread:
Determening MPMovieController bit-rate
Inspired by the above thread, I tried to compute it using the following code:
NSArray *logEvents = playerItem.accessLog.events;
AVPlayerItemAccessLogEvent *event = (AVPlayerItemAccessLogEvent *)[logEvents lastObject];
double bitRate = event.observedBitrate;
But the variable bitRate is always zero when checked inside a timer.
In fact [logEvents count] is also always zero.
Could you please tell me what is wrong with this technique?
Thanks a lot.
In addition to Ooops's suggestion, it might be wise to register for AVPlayerItemNewAccessLogEntryNotification to check for the bitrate.
Since the access log array isn't KVO compliant, using the notification means you don't need a timer to check for updates, and you don't have to worry about waiting for the player item to be ready. If the events fire too frequently, you can simply ignore some.
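For example, here's a minimal Swift sketch of that approach; the function name is my own, and playerItem is assumed to be your AVPlayerItem:

import AVFoundation

// Observe new access log entries instead of polling with a timer.
func observeBitrate(of playerItem: AVPlayerItem) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewAccessLogEntry,
        object: playerItem,
        queue: .main
    ) { notification in
        guard let item = notification.object as? AVPlayerItem,
              let event = item.accessLog()?.events.last else { return }
        print("Observed bitrate: \(event.observedBitrate) bits/s")
    }
}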
Nothing's wrong with the method. Check whether your playerItem is actually loaded: the accessLog is nil until the player item has actually been accessed. Try to get the access log after your player reaches AVPlayerStatusReadyToPlay and you'll get the log.
When streaming over slow connectivity, the AVPlayer may switch to the lowest bit-rate variant in the HTTP Live Streaming playlist.
Is there a way to identify this transition?
I've tried observing the AVPlayerItem "tracks" property via KVO to see when it contains only audio, but in most cases the tracks property doesn't change even though the player has switched to the audio-only stream.
I found out that the AVPlayerItem tracks property was not dependable on the simulator but somewhat more dependable on the actual device (with a ~5 seconds deviation).
Whenever the tracks property changes (you can find out when via KVO), you should traverse the tracks and check whether any track has its mediaType set to AVMediaTypeVideo.
If there are none, you can conclude that you are in an audio-only state.
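A minimal Swift sketch of that check, assuming playerItem is the AVPlayerItem whose tracks you're observing (the function name is my own):

import AVFoundation

// Returns true when no current track carries video, i.e. the player
// has fallen back to an audio-only variant.
func isAudioOnly(_ playerItem: AVPlayerItem) -> Bool {
    return !playerItem.tracks.contains { $0.assetTrack?.mediaType == .video }
}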
Here is the question:
Can a view on the iPad handle two different MPMoviePlayerViews at the same time?
I have read that the movie player is a singleton and that I have to wire up the observers by hand.
I am trying to put multiple movie players in the same view.
Is it possible?
Does anyone have an idea?
MPMoviePlayerController won't let you play back two videos simultaneously. Sorry! While you can create two MPMoviePlayerController objects, if you try to play two streams back at the same time it will just break. This is mentioned in the documentation for that class, buried away somewhere.
I have multiple AVAssets, and I create an individual AVMutableCompositionTrack for each. I then create an AVMutableComposition, add each AVMutableCompositionTrack to it, create an AVAssetExportSession initialized with the AVMutableComposition, and run the exporter. This allows me to create a single audio file made up of many overlapping audio sources.
I can trim and delay each source audio file by setting the parameters when I call insertTimeRange on each AVMutableCompositionTrack. What I can't figure out is how to fade in and out of each individual track. I can do a master fade on the export session using setVolumeRampFromStartVolume via AVMutableAudioMixInputParameters, and I know how to do fades on AVPlayers using the same method, but I don't think AVMutableAudioMixInputParameters can be used on an AVMutableCompositionTrack, right?
So how can I add a fade to an AVMutableCompositionTrack?
Thanks!
AVMutableAudioMixInputParameters actually can be used with AVMutableCompositionTracks; I use them that way. The mix just isn't stored within the composition. Instead, you'll need to set the audioMix property of any AVPlayerItem or AVAssetExportSession you use.
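For instance, here's a minimal Swift sketch that fades one composition track in over its first second; compositionTrack and exportSession are assumed to exist already, and the function name is my own:

import AVFoundation

func applyFadeIn(to compositionTrack: AVMutableCompositionTrack,
                 on exportSession: AVAssetExportSession) {
    // Ramp this track's volume from silent to full over one second.
    let params = AVMutableAudioMixInputParameters(track: compositionTrack)
    params.setVolumeRamp(fromStartVolume: 0, toEndVolume: 1,
                         timeRange: CMTimeRange(start: .zero,
                                                duration: CMTime(seconds: 1, preferredTimescale: 600)))

    let audioMix = AVMutableAudioMix()
    audioMix.inputParameters = [params]

    // The mix is not stored in the composition; attach it to the
    // export session (or to an AVPlayerItem for playback).
    exportSession.audioMix = audioMix
}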
I have successfully composed an AVMutableComposition with multiple video clips and can view and export it, and I would like to be able to transition between the clips with a cross-fade, so I want to use AVMutableVideoComposition. I can't find any examples of how to even arrange and play a couple of AVAsset videos in succession. Does anyone have an example of how to add tracks to an AVMutableVideoComposition with the equivalent of AVMutableComposition's insertTimeRange, or of how to set up a cross-fade?
// Append the asset's video at the end of the composition.
[self.composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.avAsset.duration)
                          ofAsset:asset.avAsset
                           atTime:self.composition.duration
                            error:nil];
I found an example called AVEditDemo from Apple's WWDC 2010 Sample Code.
https://developer.apple.com/library/ios/samplecode/AVCustomEdit/Introduction/Intro.html
There is a lot of detail in the sample, but I'll summarize: you need to use both AVMutableComposition and AVMutableVideoComposition. Add tracks individually to the AVMutableComposition instead of using the simpler insertTimeRange, as that lets you set overlapping time ranges on the tracks. Each track also needs an AVMutableVideoCompositionLayerInstruction in the AVMutableVideoComposition, with an opacity ramp for the cross-fade. Finally, to play back in an AVPlayer, you need to create an AVPlayerItem from the AVMutableComposition and assign the AVMutableVideoComposition to it.
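To make that concrete, here's a minimal Swift sketch of a two-clip cross-fade along those lines; assetA, assetB, the function name, and details like the frame rate and render size are my own assumptions, not part of the sample:

import AVFoundation

func makeCrossFade(assetA: AVAsset, assetB: AVAsset, fade: CMTime) throws
    -> (AVMutableComposition, AVMutableVideoComposition) {
    let composition = AVMutableComposition()
    let trackA = composition.addMutableTrack(withMediaType: .video,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)!
    let trackB = composition.addMutableTrack(withMediaType: .video,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)!
    let videoA = assetA.tracks(withMediaType: .video)[0]
    let videoB = assetB.tracks(withMediaType: .video)[0]

    // Overlap the clips: B starts `fade` before A ends.
    let overlapStart = assetA.duration - fade
    try trackA.insertTimeRange(CMTimeRange(start: .zero, duration: assetA.duration),
                               of: videoA, at: .zero)
    try trackB.insertTimeRange(CMTimeRange(start: .zero, duration: assetB.duration),
                               of: videoB, at: overlapStart)

    // Instructions must cover the whole timeline without gaps:
    // A alone, then the cross-fade, then B alone.
    let aAlone = AVMutableVideoCompositionInstruction()
    aAlone.timeRange = CMTimeRange(start: .zero, duration: overlapStart)
    aAlone.layerInstructions = [AVMutableVideoCompositionLayerInstruction(assetTrack: trackA)]

    let crossFade = AVMutableVideoCompositionInstruction()
    crossFade.timeRange = CMTimeRange(start: overlapStart, duration: fade)
    let fadingA = AVMutableVideoCompositionLayerInstruction(assetTrack: trackA)
    fadingA.setOpacityRamp(fromStartOpacity: 1, toEndOpacity: 0, timeRange: crossFade.timeRange)
    crossFade.layerInstructions = [fadingA,
                                   AVMutableVideoCompositionLayerInstruction(assetTrack: trackB)]

    let bAlone = AVMutableVideoCompositionInstruction()
    bAlone.timeRange = CMTimeRange(start: assetA.duration, duration: assetB.duration - fade)
    bAlone.layerInstructions = [AVMutableVideoCompositionLayerInstruction(assetTrack: trackB)]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.instructions = [aAlone, crossFade, bAlone]
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.renderSize = videoA.naturalSize
    return (composition, videoComposition)
}

For playback, create an AVPlayerItem from the returned composition and set the returned video composition as its videoComposition property; for export, set the same property on the AVAssetExportSession.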
It seems like each level deeper you go in the API, in this case from MPMoviePlayer with an asset, to AVPlayer with an AVComposition, and finally to an AVVideoComposition, the amount of necessary code grows exponentially.