How to get notification for audio streaming status from AVPlayer? - iphone

I am using AVPlayer to stream live HTTP audio (not AVAudioPlayer, which does not support live HTTP streaming). The question is: how do I get the status of the current playback? For example:
Tap Play Button -> [Loading] -> [Playing]
Tap Pause Button -> [Paused]
I need to show a spinner while loading, a pause button while playing, and a play button while paused. I know I can observe the 'status' and 'rate' properties of AVPlayer:
rate:
the current rate of playback. 0.0 means “stopped”, 1.0 means “play at the natural rate of the current item”.
status:
Indicates whether the player can be used for playback.
AVPlayerStatusUnknown,
AVPlayerStatusReadyToPlay,
AVPlayerStatusFailed
So there is no way to indicate that the audio is "loading", and even after the status changes to AVPlayerStatusReadyToPlay it still takes some time before the audio actually starts playing (maybe because it is a live stream).
Anyway, how do I get the correct status of the current playback? I know there is AudioStreamer from Matt, but it does not support HTTP live audio.
Thanks very much!

I used
[self.mPlayerItem addObserver:self
                   forKeyPath:kStatusKey
                      options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                      context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];
to monitor the status key ("status"). Then I created the player:
[self setPlayer:[AVPlayer playerWithPlayerItem:self.mPlayerItem]];
And in the observeValueForKeyPath callback:
if (context == AVPlayerDemoPlaybackViewControllerStatusObservationContext)
{
    AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
    switch (status)
    {
        /* Indicates that the status of the player is not yet known because
           it has not tried to load new media resources for playback */
        case AVPlayerStatusUnknown:
        {
            [lblvalidation setText:@"Loading..."];
            NSLog(@"AVPlayerStatusUnknown");
        }
        break;

        case AVPlayerStatusReadyToPlay:
        {
            /* Once the AVPlayerItem becomes ready to play, i.e.
               [playerItem status] == AVPlayerItemStatusReadyToPlay,
               its duration can be fetched from the item. */
            NSLog(@"AVPlayerStatusReadyToPlay");
            [self.player play];
            [lblvalidation setText:@"Playing..."];
        }
        break;

        case AVPlayerStatusFailed:
        {
            [lblvalidation setText:@"Error..."];
            NSLog(@"AVPlayerStatusFailed");
        }
        break;
    }
}
This works for me... I hope it helps you.

Updated for Swift 2:
private var AVPlayerDemoPlaybackViewControllerStatusObservationContext = 0
Add observer:
player.currentItem!.addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions.New, context: &AVPlayerDemoPlaybackViewControllerStatusObservationContext)
Observer:
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if context == &AVPlayerDemoPlaybackViewControllerStatusObservationContext {
        if let change = change as? [String: Int] {
            let status = change[NSKeyValueChangeNewKey]!
            switch status {
            case AVPlayerStatus.Unknown.rawValue:
                print("The status of the player is not yet known because it has not tried to load new media resources for playback")
            case AVPlayerStatus.ReadyToPlay.rawValue:
                self.playButtonPressed(playButton)
                print("The player is Ready to Play")
            case AVPlayerStatus.Failed.rawValue:
                print("The player failed to load the video")
            default:
                print("Other status")
            }
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}
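Note that neither status nor rate gives you a distinct "loading/buffering" state, which is what the question is really after. On iOS 10 and later, AVPlayer also exposes timeControlStatus, which does. A minimal sketch in current Swift (keep the returned observation alive in a stored property so it isn't deallocated):
var timeControlObserver: NSKeyValueObservation?

timeControlObserver = player.observe(\.timeControlStatus, options: [.initial, .new]) { player, _ in
    switch player.timeControlStatus {
    case .waitingToPlayAtSpecifiedRate:
        // buffering / stalled: show the spinner
        break
    case .playing:
        // show the pause button
        break
    case .paused:
        // show the play button
        break
    @unknown default:
        break
    }
}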

Related

Delegate which tells whether another sound is playing in Swift AVAudio?

I am recording voice using AVAudioEngine to convert speech to text, but I need to stop the recording if the user's phone suddenly starts playing music or ringing. Is there an AVAudio delegate that tells me when any other audio is playing?
Assuming your AVAudioSession's category is playback (the default one), you can subscribe to and receive notifications of such interruptions:
func setupNotifications() {
    let nc = NotificationCenter.default
    nc.addObserver(self,
                   selector: #selector(handleInterruption),
                   name: AVAudioSession.interruptionNotification,
                   object: AVAudioSession.sharedInstance())
}

@objc func handleInterruption(notification: Notification) {
    guard let userInfo = notification.userInfo,
          let typeValue = userInfo[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: typeValue) else {
        return
    }
    // Switch over the interruption type.
    switch type {
    case .began:
        // An interruption began. Update the UI as necessary.
        break
    case .ended:
        // An interruption ended. Resume playback, if appropriate.
        guard let optionsValue = userInfo[AVAudioSessionInterruptionOptionKey] as? UInt else { return }
        let options = AVAudioSession.InterruptionOptions(rawValue: optionsValue)
        if options.contains(.shouldResume) {
            // An interruption ended. Resume playback.
        } else {
            // An interruption ended. Don't resume playback.
        }
    default: ()
    }
}
From the relevant documentation.
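There is no delegate specifically for "another app started playing audio", but as a complementary check you can ask the session directly before starting the recorder. A small sketch (isOtherAudioPlaying is a real AVAudioSession property; startSpeechRecording stands in for the question's own recording code):
import AVFoundation

func startSpeechRecordingIfNothingElseIsPlaying(startSpeechRecording: () -> Void) {
    if AVAudioSession.sharedInstance().isOtherAudioPlaying {
        return // another app is already playing audio; don't start recording
    }
    startSpeechRecording()
}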

ObservedObject class function not being called when protocol fires

While implementing the Google Interactive Media Ads (IMA) SDK protocols, my mediaPlayer/audioManager, which is an AVPlayer object, is not pausing during the adsManagerDidRequestContentPause delegate method. My mediaPlayer conforms to ObservableObject, which is where I think the problem is coming from, but I'm not 100% positive.
When I press the play button, I play the audio and request the ads from my adsManager class. The issue is that the preroll video plays but the audio from the content player plays over it. The content player's audio is supposed to pause while the preroll is playing and resume after it finishes. As you'll see in the code, AudioManager is also a singleton class.
Here's the code for when a user presses play.
@ObservedObject var manager = AudioManager.sharedInstance

func didTapPlayButton() {
    isPlaying.toggle()
    if isPlaying {
        audioManager.playLiveStream(with: pageInfo.tritonMount)
        adManager.requestAds()
    } else {
        audioManager.pause()
    }
}
And here's the adManager class with the Google IMA delegate methods. After setting breakpoints at each delegate method, I found that each call to audioManager is made successfully; however, the audio from the audioManager doesn't actually pause.
@ObservedObject var manager = AudioManager.sharedInstance
func setUpContentPlayer() {
contentPlayhead = IMAAVPlayerContentPlayhead(avPlayer: player)
// Create a player layer for the player.
playerLayer = AVPlayerLayer(player: player)
playerLayer!.frame = videoView.underlyingView.layer.bounds
videoView.underlyingView.layer.addSublayer(playerLayer!)
}
func setUpAdsLoader() {
adsLoader = IMAAdsLoader(settings: nil)
adsLoader!.delegate = self
}
func requestAds() {
// Create an ad display container for ad rendering.
let adDisplayContainer = IMAAdDisplayContainer(adContainer: videoView.underlyingView, companionSlots: nil)
// Create an ad request with our ad tag, display container, and optional user context.
let request = IMAAdsRequest(
adTagUrl: kLivePrerollVastTag,
adDisplayContainer: adDisplayContainer,
contentPlayhead: contentPlayhead,
userContext: nil)
adsLoader!.requestAds(with: request)
}
func adsLoader(_ loader: IMAAdsLoader!, adsLoadedWith adsLoadedData: IMAAdsLoadedData!) {
// Grab the instance of the IMAAdsManager and set ourselves as the delegate
adsManager = adsLoadedData.adsManager
adsManager!.delegate = self
// Create ads rendering settings and tell the SDK to use the in-app browser.
let adsRenderingSettings = IMAAdsRenderingSettings()
adsRenderingSettings.webOpenerPresentingController = viewController
// Initialize the ads manager.
adsManager!.initialize(with: adsRenderingSettings)
}
// Ads Manager received the ad request and is loading and starting the ad
func adsManager(_ adsManager: IMAAdsManager!, didReceive event: IMAAdEvent!) {
if (event.type == IMAAdEventType.LOADED) {
// When the SDK notifies us that ads have been loaded, play them.
adsManager.start()
}
}
//Ads manager failed to receive ad request
func adsLoader(_ loader: IMAAdsLoader!, failedWith adErrorData: IMAAdLoadingErrorData!) {
print("Error loading ads: \(String(describing: adErrorData.adError.message))")
audioManager.resume()
}
func adsManager(_ adsManager: IMAAdsManager!, didReceive error: IMAAdError!) {
// Something went wrong with the ads manager after ads were loaded. Log the
// error and play the content.
NSLog("AdsManager error: \(String(describing: error.message))")
audioManager.resume()
}
/* Ads manager received the request, initiated the AVPlayer, and is now
   requesting that the player be paused in order for the ads manager to play the preroll */
func adsManagerDidRequestContentPause(_ adsManager: IMAAdsManager!) {
// The SDK is going to play ads, so pause the content.
audioManager.pause()
}
/*Ads manager received request, played the preroll and is now
requesting avplayer to resume live stream*/
func adsManagerDidRequestContentResume(_ adsManager: IMAAdsManager!) {
// The SDK is done playing ads (at least for now), so resume the content.
audioManager.resume()
}
AudioManagerClass:
/// Upon setting this property any observers for 'currentItem' as well as any time observers will
/// be removed from the old value where applicable and added to the new value where applicable.
var player: AVPlayer? {
didSet {
oldValue?.removeObserver(self, forKeyPath: "currentItem", context: &AudioManager.ObserveAVPlayerCurrentItem)
oldValue?.removeObserver(self, forKeyPath: "rate", context: &AudioManager.ObserveAVPlayerRate)
guard let player = self.player else { return }
player.addObserver(self, forKeyPath: "currentItem", options: [.initial, .new], context: &AudioManager.ObserveAVPlayerCurrentItem)
player.addObserver(self, forKeyPath: "rate", options: [.initial, .new], context: &AudioManager.ObserveAVPlayerRate)
player.automaticallyWaitsToMinimizeStalling = false
if let tmpObs = timeObserver {
print("Already have timeobserver, removing it")
oldValue?.removeTimeObserver(tmpObs)
timeObserver = nil
}
timeObserver = player.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1,preferredTimescale: 1),
queue: nil,
using: timeObserverCallback) as AnyObject?
}
}

Playing one sound instance at a time

I have an application that constantly receives integer data from a Bluetooth sensor, and I made it so that if the integer is less than 50, it should play an MP3.
The problem is that the sensor checks and sends the integers very rapidly, which results in too many audio instances; basically, the MP3 file is being played many times at the same time. How can I make it finish the audio before starting again?
This is the main code:
var player: AVAudioPlayer?

if let unwrappedString = Reading {
    let optionalInt = Int(unwrappedString)
    if let upwrappedInt = optionalInt {
        if upwrappedInt < 50 {
            DispatchQueue.global(qos: .background).async {
                self.playSound()
            }
        }
    }
}
Sound function:
func playSound() {
    guard let url = Bundle.main.url(forResource: "beep1", withExtension: "mp3") else {
        print("url not found")
        return
    }
    do {
        /// this prepares the app to take over the device audio
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        try AVAudioSession.sharedInstance().setActive(true)
        /// change fileTypeHint according to the type of your audio file (you can omit this)
        player = try AVAudioPlayer(contentsOf: url, fileTypeHint: AVFileTypeMPEGLayer3)
        // no need to call prepareToPlay() because it happens automatically when calling play()
        player!.play()
    } catch let error as NSError {
        print("error: \(error.localizedDescription)")
    }
}
If the audio player is already playing (isPlaying), don't start playing!
https://developer.apple.com/reference/avfoundation/avaudioplayer/1390139-isplaying
I believe AVAudioPlayer has a delegate method to check if the audio has finished playing:
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    // ----------------------------------------------
    // set your custom boolean flag 'isPlayingAudio'
    // to false so you can play another audio again
    // ----------------------------------------------
}
...
- (void)monitorBluetoothNumber
{
    if (bluetoothNumber < 50 && !self.isPlayingAudio)
    {
        [self playMusic];
        self.isPlayingAudio = YES;
    }
}
You'll need to set up your audio player and set its delegate, obviously.
The code is Objective-C, but you can easily adapt it to Swift.
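As a Swift adaptation sketch of that idea, you can also lean on AVAudioPlayer's own isPlaying property instead of a delegate flag (the class and method names here are illustrative, not from the question):
import AVFoundation

final class BeepPlayer {
    private var player: AVAudioPlayer?

    // Call this with each value from the Bluetooth sensor.
    func handleReading(_ value: Int) {
        guard value < 50 else { return }
        if player?.isPlaying == true { return } // previous beep hasn't finished yet
        playBeep()
    }

    private func playBeep() {
        guard let url = Bundle.main.url(forResource: "beep1", withExtension: "mp3") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.play()
    }
}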

Is there a way to distinguish between a live stream and an on-demand file stream with AVPlayer?

I'm trying to create a more generic media controller for several types of streaming media and want to adapt the UI to the type of stream;
When it's an on-demand file stream (i.e. a single MP3 file that's being streamed), you should be able to seek forward and backward. Thus, the seek slider should be visible.
When it's a live stream, it isn't possible to seek forward and backward, and thus the seek slider should be hidden.
Is there any way to determine from the AVPlayer (or perhaps the AVPlayerItem or AVAsset) what the type of stream is?
The duration of live video is indefinite:
AVPlayer * player = ...;
const BOOL isLive = CMTIME_IS_INDEFINITE([player currentItem].duration);
You have to check the duration only once the AVPlayerItem's status is AVPlayerItemStatusReadyToPlay.
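The same check as a small Swift helper (a sketch; CMTime's isIndefinite property is the Swift counterpart of CMTIME_IS_INDEFINITE):
import AVFoundation

// Returns true for a live stream; only meaningful once the item is .readyToPlay.
func isLiveStream(_ player: AVPlayer) -> Bool {
    guard let item = player.currentItem, item.status == .readyToPlay else { return false }
    return item.duration.isIndefinite
}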
For those who are still looking for this feature,
the AVPlayerItem > AVPlayerItemAccessLogEvent > playbackType property might be helpful.
I have checked that the "VOD" and "LIVE" types are returned from it appropriately.
More detail here.
It appears that this is not possible directly.
However, one could check the duration of a live stream, which seems to be consistently above 33000 seconds. That value still fluctuates, though, and relying on it is undesirable, since it might cause unexpected behavior.
Solution
You can use this code to easily detect the playback type. Note that the block-based addObserver API returns a token, so keep it in a property and pass that token back to removeObserver later:
private var accessLogObserver: NSObjectProtocol?

accessLogObserver = NotificationCenter.default.addObserver(
    forName: NSNotification.Name.AVPlayerItemNewAccessLogEntry,
    object: nil,
    queue: OperationQueue.main) { [weak self] (notification) in
        guard let self = self else { return }
        guard let playerItem = notification.object as? AVPlayerItem,
              let lastEvent = playerItem.accessLog()?.events.last else {
            return
        }
        // Here you can read the type (LIVE | VOD | FILE, or unknown if it's nil):
        print("Playback Type: \(lastEvent.playbackType ?? "NA")")
}
Add the observer code to wherever you normally start listening.
Also, don't forget to remove the observer in deinit ;)
deinit {
    if let accessLogObserver = accessLogObserver {
        NotificationCenter.default.removeObserver(accessLogObserver)
    }
}
Hope this will help someone :)
player?.addPeriodicTimeObserver(forInterval: interval, queue: .main, using: { time in
    let playbackType = self.player?.currentItem?.accessLog()?.events.last?.playbackType
    print("Playback Type: \(playbackType ?? "NA")")
    if playbackType == StreamingType.Live.rawValue {
        // live stream
    }
    else if playbackType == StreamingType.Vod.rawValue {
        // video on demand
    }
})
The playback type can be LIVE, VOD, or FILE; if nil is returned, the playback type is unknown. StreamingType above is not an AVFoundation type; a possible definition is sketched just below. More detail here.
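A possible definition for that enum (an assumption, simply wrapping the strings that playbackType reports):
enum StreamingType: String {
    case Live = "LIVE"
    case Vod = "VOD"
    case File = "FILE"
}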

Knowing when AVPlayer object is ready to play

I'm trying to play an MP3 file that is passed to a UIView from a previous UIView (stored in an NSURL *fileURL variable).
I'm initializing an AVPlayer with:
player = [AVPlayer playerWithURL:fileURL];
NSLog(#"Player created:%d",player.status);
The NSLog prints Player created:0, which i figured means it is not ready to play yet.
When I click the play UIButton, the code I run is:
-(IBAction)playButtonClicked
{
    NSLog(@"Clicked Play. MP3:%@", [fileURL absoluteString]);
    if (([player status] == AVPlayerStatusReadyToPlay) && !isPlaying)
    // if (!isPlaying)
    {
        [player play];
        NSLog(@"Playing:%@ with %d", [fileURL absoluteString], player.status);
        isPlaying = YES;
    }
    else if (isPlaying)
    {
        [player pause];
        NSLog(@"Pausing:%@", [fileURL absoluteString]);
        isPlaying = NO;
    }
    else {
        NSLog(@"Error in player??");
    }
}
When I run this, I always get Error in player?? in the console.
If, however, I replace the if condition that checks whether the AVPlayer is ready to play with a simple if(!isPlaying)..., then the music plays the SECOND TIME I click the play UIButton.
The console log is:
Clicked Play. MP3:http://www.nimh.nih.gov/audio/neurogenesis.mp3
Playing:http://www.nimh.nih.gov/audio/neurogenesis.mp3 **with 0**
Clicked Play. MP3:http://www.nimh.nih.gov/audio/neurogenesis.mp3
Pausing:http://www.nimh.nih.gov/audio/neurogenesis.mp3
Clicked Play. MP3:http://www.nimh.nih.gov/audio/neurogenesis.mp3
2011-03-23 11:06:43.674 Podcasts[2050:207] Playing:http://www.nimh.nih.gov/audio/neurogenesis.mp3 **with 1**
I see that the SECOND TIME, player.status seems to hold 1, which I'm guessing is AVPlayerStatusReadyToPlay.
What can I do to make playback work properly the first time I click the play UIButton?
(i.e., how can I make sure the AVPlayer is not just created, but also ready to play?)
You are playing a remote file. It may take some time for the AVPlayer to buffer enough data and be ready to play the file (see AV Foundation Programming Guide)
But you don't seem to wait for the player to be ready before tapping the play button. What I would do is disable this button and enable it only when the player is ready.
Using KVO, it's possible to be notified of changes to the player status:
playButton.enabled = NO;
player = [AVPlayer playerWithURL:fileURL];
[player addObserver:self forKeyPath:@"status" options:0 context:nil];
This method will be called when the status changes:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (object == player && [keyPath isEqualToString:@"status"]) {
        if (player.status == AVPlayerStatusReadyToPlay) {
            playButton.enabled = YES;
        } else if (player.status == AVPlayerStatusFailed) {
            // something went wrong. player.error should contain some information
        }
    }
}
Swift Solution
var observer: NSKeyValueObservation?

func prepareToPlay() {
    let url = <#Asset URL#>
    // Create asset to be played
    let asset = AVAsset(url: url)
    let assetKeys = [
        "playable",
        "hasProtectedContent"
    ]
    // Create a new AVPlayerItem with the asset and an
    // array of asset keys to be automatically loaded
    let playerItem = AVPlayerItem(asset: asset,
                                  automaticallyLoadedAssetKeys: assetKeys)
    // Register as an observer of the player item's status property
    self.observer = playerItem.observe(\.status, options: [.new, .old], changeHandler: { (playerItem, change) in
        if playerItem.status == .readyToPlay {
            // Do your work here
        }
    })
    // Associate the player item with the player
    player = AVPlayer(playerItem: playerItem)
}
You can also invalidate the observer this way:
self.observer?.invalidate()
Important: You must keep the observer variable retained, otherwise it will be deallocated and the changeHandler will no longer be called. So don't define the observer as a local function variable; define it as an instance variable, as in the example above.
This key value observer syntax is new to Swift 4.
For more information, see here https://github.com/ole/whats-new-in-swift-4/blob/master/Whats-new-in-Swift-4.playground/Pages/Key%20paths.xcplaygroundpage/Contents.swift
I had a lot of trouble trying to figure out the status of an AVPlayer. The status property didn't always seem to be terribly helpful, and this led to endless frustration when I was trying to handle audio session interruptions. Sometimes the AVPlayer told me it was ready to play (with AVPlayerStatusReadyToPlay) when it didn't actually seem to be. I used Jilouc's KVO method, but it didn't work in all cases.
To supplement, when the status property wasn't being useful, I queried the amount of the stream that the AVPlayer had loaded by looking at the loadedTimeRanges property of the AVPlayer's currentItem (which is an AVPlayerItem).
It's all a little confusing, but here's what it looks like:
NSValue *val = [[[audioPlayer currentItem] loadedTimeRanges] objectAtIndex:0];
CMTimeRange timeRange;
[val getValue:&timeRange];
CMTime duration = timeRange.duration;
float timeLoaded = (float) duration.value / (float) duration.timescale;
if (0 == timeLoaded) {
    // AVPlayer not actually ready to play
} else {
    // AVPlayer is ready to play
}
private var playbackLikelyToKeepUpContext = 0
To register the observer:
avPlayer.addObserver(self, forKeyPath: "currentItem.playbackLikelyToKeepUp",
                     options: .new, context: &playbackLikelyToKeepUpContext)
To listen for changes:
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if context == &playbackLikelyToKeepUpContext {
        if avPlayer.currentItem!.isPlaybackLikelyToKeepUp {
            // loadingIndicatorView.stopAnimating() or something else
        } else {
            // loadingIndicatorView.startAnimating() or something else
        }
    }
}
To remove the observer:
deinit {
    avPlayer.removeObserver(self, forKeyPath: "currentItem.playbackLikelyToKeepUp")
}
The key point in this code is the instance property isPlaybackLikelyToKeepUp.
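The same idea with Swift 4 block-based KVO, so there is no string key path or manual context to manage (a sketch; keep the observation in a stored property so it stays alive):
var keepUpObserver: NSKeyValueObservation?

keepUpObserver = avPlayer.observe(\.currentItem?.isPlaybackLikelyToKeepUp,
                                  options: [.initial, .new]) { player, _ in
    if player.currentItem?.isPlaybackLikelyToKeepUp == true {
        // stop the loading indicator
    } else {
        // start the loading indicator
    }
}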
After researching a lot and trying many approaches, I've noticed that the status observer is usually not the best way to know when an AVPlayer object is really ready to play, because the object can be ready to play and yet not play immediately.
A better way to know this is with loadedTimeRanges.
To register the observer:
[playerClip addObserver:self forKeyPath:@"currentItem.loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
To listen for changes:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (object == playerClip && [keyPath isEqualToString:@"currentItem.loadedTimeRanges"]) {
        NSArray *timeRanges = (NSArray *)[change objectForKey:NSKeyValueChangeNewKey];
        if (timeRanges && [timeRanges count]) {
            CMTimeRange timerange = [[timeRanges objectAtIndex:0] CMTimeRangeValue];
            float currentBufferDuration = CMTimeGetSeconds(CMTimeAdd(timerange.start, timerange.duration));
            CMTime duration = playerClip.currentItem.asset.duration;
            float seconds = CMTimeGetSeconds(duration);

            // I think that 2 seconds is enough to know if you're ready or not
            if (currentBufferDuration > 2 || currentBufferDuration == seconds) {
                // Ready to play. Your logic here
            }
        } else {
            [[[UIAlertView alloc] initWithTitle:@"Alert!" message:@"Error trying to play the clip. Please try again" delegate:nil cancelButtonTitle:@"Ok" otherButtonTitles:nil, nil] show];
        }
    }
}
To remove the observer (dealloc, viewWillDisappear, or just before registering the observer again are good places to call this):
- (void)removeObserverForTimesRanges
{
    @try {
        [playerClip removeObserver:self forKeyPath:@"currentItem.loadedTimeRanges"];
    } @catch (id anException) {
        NSLog(@"exception removing observer == %@. It was removed previously or never added.", anException);
        // do nothing, obviously it wasn't attached because an exception was thrown
    }
}
Based on Tim Camber's answer, here is the Swift function I use:
private func isPlayerReady(_ player: AVPlayer?) -> Bool {
    guard let player = player else { return false }
    let ready = player.status == .readyToPlay
    let timeRange = player.currentItem?.loadedTimeRanges.first as? CMTimeRange
    guard let duration = timeRange?.duration else { return false } // Fail when loadedTimeRanges is empty
    let timeLoaded = Int(duration.value) / Int(duration.timescale) // value/timescale = seconds
    let loaded = timeLoaded > 0
    return ready && loaded
}
Or, as an extension:
extension AVPlayer {
    var ready: Bool {
        let timeRange = currentItem?.loadedTimeRanges.first as? CMTimeRange
        guard let duration = timeRange?.duration else { return false }
        let timeLoaded = Int(duration.value) / Int(duration.timescale) // value/timescale = seconds
        let loaded = timeLoaded > 0
        return status == .readyToPlay && loaded
    }
}
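A usage sketch for the extension (playButton and spinner are hypothetical UI outlets; call this from whatever status or loadedTimeRanges callback you already observe):
func refreshControls(for player: AVPlayer) {
    playButton.isEnabled = player.ready
    spinner.isHidden = player.ready
}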
I had issues with not getting any callbacks.
Turns out it depends on how you create the stream. In my case I used a playerItem to initialize, and thus I had to add the observer to the item instead.
For example:
- (void)setup
{
    ...
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
    ...
    // add callback
    [self.player.currentItem addObserver:self forKeyPath:@"status" options:0 context:nil];
}

// the callback method
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    NSLog(@"[VideoView] player status: %ld", (long)self.player.status);
    if (object == self.player.currentItem && [keyPath isEqualToString:@"status"])
    {
        if (self.player.currentItem.status == AVPlayerStatusReadyToPlay)
        {
            // do stuff
        }
    }
}

// cleanup or it will crash
- (void)dealloc
{
    [self.player.currentItem removeObserver:self forKeyPath:@"status"];
}
Swift 4:
var player: AVPlayer!

override func viewDidLoad() {
    super.viewDidLoad()

    NotificationCenter.default.addObserver(self,
                                           selector: #selector(playerItemDidReadyToPlay(notification:)),
                                           name: .AVPlayerItemNewAccessLogEntry,
                                           object: player?.currentItem)
}

@objc func playerItemDidReadyToPlay(notification: Notification) {
    if let _ = notification.object as? AVPlayerItem {
        // player is ready to play now!!
    }
}
Check the status of the player's currentItem:
if (player.currentItem.status == AVPlayerItemStatusReadyToPlay)
@JoshBernfeld's answer didn't work for me, and I'm not sure why. He observed playerItem.observe(\.status, whereas I had to observe player?.observe(\.currentItem?.status. They seem to be the same thing, the player item's status property.
var playerStatusObserver: NSKeyValueObservation?

player?.automaticallyWaitsToMinimizeStalling = false // starts faster
playerStatusObserver = player?.observe(\.currentItem?.status, options: [.new, .old]) { (player, change) in
    switch player.status {
    case .readyToPlay:
        // here is where it's ready to play, so start playback
        DispatchQueue.main.async { [weak self] in
            self?.player?.play()
        }
    case .failed, .unknown:
        print("Media Failed to Play")
    @unknown default:
        break
    }
}
When you are finished using the player, set playerStatusObserver = nil.